What is Phase Noise?

2 Answers

- everything RF

Aug 13, 2019

Phase noise is defined as the noise arising from rapid, short-term, random phase fluctuations in a signal. These random fluctuations are caused by time-domain instabilities known as phase jitter.

Phase noise is the noise spectrum seen spreading out on either side of a signal as a result of this phase jitter. In most radio receiver applications, phase noise is quoted as single sideband phase noise. The noise spreads out equally on both sides of the carrier, but only one side is measured, hence the name single sideband phase noise.

When specifying phase noise, three elements need to be stated:

1. Phase noise amplitude: It is expressed in dB relative to the carrier, normally denoted dBc, e.g. -70 dBc means 70 decibels below the carrier level.

2. The offset from the carrier: e.g. 1 kHz, 10 kHz, or 100 kHz.

3. Measurement bandwidth: Noise power is proportional to bandwidth, so the measurement bandwidth should always be stated. Typically a 1 Hz bandwidth is used. A short sketch of how these three elements combine follows this list.
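As a rough illustration of how these three elements combine, the sketch below (the function name and example power levels are hypothetical, not from a specific instrument) references a measured noise power to the carrier and normalizes it to a 1 Hz bandwidth:

```python
import math

def ssb_phase_noise_dbc_hz(noise_power_dbm, carrier_power_dbm, measurement_bw_hz):
    """Convert a noise power measured in a given bandwidth at some offset from
    the carrier into single sideband phase noise in dBc/Hz.

    The noise power is first referenced to the carrier level (dBc), then
    normalized to a 1 Hz bandwidth by subtracting 10*log10(BW).
    """
    dbc_in_bw = noise_power_dbm - carrier_power_dbm        # relative to carrier
    return dbc_in_bw - 10 * math.log10(measurement_bw_hz)  # normalize to 1 Hz

# Example: -40 dBm of noise in a 1 kHz bandwidth around a +10 dBm carrier
# -> (-40 - 10) - 10*log10(1000) = -80 dBc/Hz
print(ssb_phase_noise_dbc_hz(-40.0, 10.0, 1000.0))
```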

May 6, 2020

Phase noise is a specification that indicates short-term frequency stability in the frequency domain. It is expressed in dBc/Hz at a given offset frequency, for example -130 dBc/Hz @ 100 Hz offset, meaning that at a 100 Hz offset from the carrier the noise power in a 1 Hz bandwidth is 130 dB below the carrier. A more negative value (i.e. a larger absolute value) indicates better performance.

Most RF systems require an overall integrated phase noise specification to be met, as phase noise can corrupt both the upconverted and downconverted signal paths. For digital systems, the integrated phase noise can be converted and expressed as phase jitter (phase noise in the time domain). Note that phase noise only indicates short-term frequency stability; long-term frequency stability is covered by a separate specification, usually called aging.
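A minimal sketch of that conversion is shown below, assuming a hypothetical 100 MHz oscillator and a made-up single sideband phase noise profile. It integrates L(f) over the offset range with a crude trapezoidal rule on the linear power ratios (real tools typically interpolate on a log-frequency axis) and then converts the result to RMS jitter using jitter = sqrt(2 * 10^(A/10)) / (2*pi*f_carrier), where the factor of 2 accounts for both sidebands:

```python
import math

def integrated_phase_noise_dbc(offsets_hz, l_dbc_hz):
    """Integrate single sideband phase noise L(f), given as dBc/Hz values at
    increasing offset frequencies, using trapezoidal integration on the
    linear power ratios. Returns the integrated phase noise in dBc."""
    linear = [10 ** (l / 10.0) for l in l_dbc_hz]
    area = 0.0
    for i in range(len(offsets_hz) - 1):
        df = offsets_hz[i + 1] - offsets_hz[i]
        area += 0.5 * (linear[i] + linear[i + 1]) * df
    return 10 * math.log10(area)

def rms_jitter_seconds(integrated_dbc, carrier_hz):
    """Convert integrated SSB phase noise (dBc) into RMS phase jitter in
    seconds: sqrt(2 * 10^(A/10)) / (2*pi*f_carrier)."""
    return math.sqrt(2 * 10 ** (integrated_dbc / 10.0)) / (2 * math.pi * carrier_hz)

# Hypothetical 100 MHz oscillator profile, 1 kHz to 1 MHz offsets
offsets = [1e3, 1e4, 1e5, 1e6]
l_values = [-110.0, -125.0, -140.0, -150.0]   # dBc/Hz at each offset
ipn = integrated_phase_noise_dbc(offsets, l_values)
print(rms_jitter_seconds(ipn, 100e6))         # RMS jitter in seconds
```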

Contributed by Dynamic Engineers
