DATV Signal Definitions

Carrier-to-Noise Versus Signal-to-Noise

Abbreviations and Definitions

•Carrier-to-noise ratio: C/N ratio or CNR

•Signal-to-noise ratio: S/N ratio or SNR

•In the world of telecommunications, “SNR” and “CNR” are often used interchangeably

•For this discussion, the ‘noise’ in CNR and SNR is assumed to be additive white Gaussian noise (AWGN). “The signal-to-noise ratio expresses in decibels the amount by which a signal level exceeds its corresponding noise.” (Telecommunications Transmission Handbook, 2nd Ed., Roger L. Freeman, John Wiley & Sons, Inc., ©1981.) “In the most general case, SNR is expressed as the ratio of rms (root mean square) signal level, Srms, to the rms noise, Nrms (SNR = Srms/Nrms).” (Measuring Noise in Video Systems, Tektronix Application Note 25W-11148-0, ©1997.)

•In the world of cable, we generally use CNR and SNR to represent quite different measurement parameters: one in the RF domain and the other in the baseband domain.

•Technically speaking, when measuring CNR or SNR against real thermal noise, one actually measures (C+N)/N or (S+N)/N. This is not normally an issue unless carrier-to-noise or signal-to-noise ratios are very low, say, single-digit values (see the sketch after this list).
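
As an illustration of these definitions, here is a minimal Python sketch that converts an rms signal-to-noise ratio to decibels and shows the (S+N)/N correction; the numeric values are invented for the example.

```python
import math

def snr_db(s_rms, n_rms):
    # SNR in dB from rms signal and rms noise: SNR = Srms/Nrms
    return 20 * math.log10(s_rms / n_rms)

def true_ratio_db(measured_db):
    # Convert a measured (S+N)/N or (C+N)/N ratio in dB to the true S/N or C/N
    linear = 10 ** (measured_db / 10)        # dB -> linear power ratio
    return 10 * math.log10(linear - 1)       # subtract the noise contribution

print(round(snr_db(1.0, 0.01), 1))      # 1 V rms signal, 10 mV rms noise -> 40.0 dB
print(round(true_ratio_db(40.0), 2))    # ~40.0 dB: correction is negligible
print(round(true_ratio_db(6.0), 2))     # ~4.74 dB: at single-digit values it matters
```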

 Carrier-to-Noise Ratio

•CNR is generally accepted to be a pre-detection measurement, that is, one made at RF

•From the perspective of analog TV channels, CNR is the difference, in decibels, between the amplitude of a TV channel’s visual carrier and the rms amplitude of system noise in a specified bandwidth.

•According to the FCC’s cable rules in §76.609(e), system noise is the “total noise power present over a 4 MHz band centered within the cable television channel.” This definition is applicable only to National Television System Committee (NTSC) TV channel CNR measurements. Phase alternating line (PAL) TV channels use a slightly greater noise power bandwidth. (A bandwidth-conversion sketch follows this list.)

•The FCC doesn’t actually use the term CNR in the rules. §76.605(a)(7) states “The ratio of RF visual signal level to system noise shall…not be less than 43 decibels.”

•That’s in line with the generic definition of SNR, although most cable operators understand it to mean CNR.

•Good engineering practice targets end-of-line analog TV channel CNR in the 45 to 49 dB range.
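
Because a spectrum analyzer measures noise in its own resolution bandwidth rather than in 4 MHz, the noise reading must be scaled before computing CNR. The sketch below shows the basic 10·log10 bandwidth conversion; the 100 kHz resolution bandwidth and the signal levels are hypothetical, and analyzer-specific detector corrections are ignored.

```python
import math

def noise_in_4mhz_dbmv(noise_dbmv, measured_bw_hz, target_bw_hz=4e6):
    # Scale a flat (AWGN) noise reading to the 4 MHz noise power bandwidth
    return noise_dbmv + 10 * math.log10(target_bw_hz / measured_bw_hz)

carrier_dbmv = 20.0                              # visual carrier level (hypothetical)
noise_dbmv = noise_in_4mhz_dbmv(-40.0, 100e3)    # -40 dBmV in a 100 kHz RBW -> about -23.98 dBmV
print(round(carrier_dbmv - noise_dbmv, 1))       # CNR ~44.0 dB
```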

 Signal-to-Noise Ratio

•In cable industry vernacular, SNR is generally accepted to be a pre-modulation or post-detection measurement, that is, one made on a baseband signal such as video or audio.

•From the previously mentioned Tektronix Application Note: “In video applications, however, it is the effective power of the noise relative to the nominal luminance level that is the greater concern.”

•The Application Note goes on to define video SNR in dB as 20log(Lnominal/Nrms), where Lnominal has a value of 714 millivolts peak-to-peak (100 IRE) for NTSC or 700 mV p-p for PAL. These luminance values exclude sync.

•Translation: Baseband video SNR is the ratio of the peak-to-peak video signal, excluding sync, to the noise within that video signal. The noise is measured in a specified bandwidth, usually defined by a combination of low pass, high pass and weighting filters. Those filters limit the measured noise to a bandwidth that is roughly the same as the video signal, and may be used to remove certain low frequency noise from the measurement. Weighting filters are used to simulate the eye’s response to noise in the TV picture, and are based on standards such as RS-170A, RS-250B and NTC-7.
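
As a worked example of the Tektronix definition, the sketch below computes unweighted video SNR from the nominal luminance level and an assumed rms noise voltage; the 1 mV noise figure is invented, and weighting-filter effects are ignored.

```python
import math

L_NOMINAL_NTSC = 0.714   # volts p-p (100 IRE), sync excluded
L_NOMINAL_PAL  = 0.700   # volts p-p, sync excluded

def video_snr_db(l_nominal, n_rms):
    # Baseband video SNR in dB: 20*log10(Lnominal / Nrms)
    return 20 * math.log10(l_nominal / n_rms)

print(round(video_snr_db(L_NOMINAL_NTSC, 1e-3), 1))  # ~57.1 dB for NTSC
print(round(video_snr_db(L_NOMINAL_PAL, 1e-3), 1))   # ~56.9 dB for PAL
```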

 CNR vs. SNR Recap

•CNR is a pre-detection measurement performed on RF signals. It compares raw carrier power to raw noise power in the RF transport path only, say, a coaxial cable distribution network or a standalone device such as a converter or headend heterodyne processor. CNR is ideal for characterizing network impairments.

•SNR is a pre-modulation or post-detection measurement performed on baseband signals. It includes noise contributed by the original signal, the transmitter or modulator, the transport path, and the receiver and demodulator. SNR is ideal for characterizing end-to-end performance, that is, the overall signal quality seen by the end user.

Digitally Modulated Carrier CNR

•DOCSIS specifies a minimum 35 dB CNR for downstream digitally modulated carriers, and 25 dB for upstream digitally modulated carriers.

•Carrier amplitude is the digitally modulated carrier’s average power level.

•Noise power bandwidth is not 4 MHz!

•The correct noise power bandwidth is one that is equivalent to the digitally modulated carrier’s symbol rate. Symbol rate bandwidth is numerically equal to the symbol rate; for example, 1280 ksym/sec corresponds to 1.28 MHz (see the conversion sketch below).
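
For a measurement referenced to a 4 MHz noise bandwidth, the reading can be converted to the symbol-rate bandwidth with the usual 10·log10 ratio. The sketch below uses the 256-QAM downstream symbol rate from the table that follows; the 36 dB starting value is hypothetical.

```python
import math

def bandwidth_correction_db(symbol_rate_sps, reference_bw_hz=4e6):
    # dB difference between noise in the symbol-rate bandwidth and noise in 4 MHz
    return 10 * math.log10(symbol_rate_sps / reference_bw_hz)

delta = bandwidth_correction_db(5.360537e6)      # 256-QAM downstream, 5.36 MHz noise bandwidth
print(round(delta, 2))                           # ~1.27 dB more noise than in 4 MHz

cnr_referenced_to_4mhz = 36.0                    # hypothetical reading
print(round(cnr_referenced_to_4mhz - delta, 1))  # ~34.7 dB CNR in the correct bandwidth
```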

 CNR Noise Power Bandwidth

Downstream

 Channel Bandwidth Symbol Rate Modulation Format Noise Power Bandwidth
 6 MHz 5.056941 Msym/sec 64-QAM 5.06 MHz
 6 MHz 5.360537 Msym/sec 256-QAM 5.36 MHz
 8 MHz 6.952 Msym/sec 64- or 256-QAM 6.95 MHz

Upstream

 Channel Bandwidth Symbol Rate Modulation Format Noise Power Bandwidth
 200 kHz 160 ksym/sec QPSK or 16-QAM 0.16 MHz
 400 kHz 320 ksym/sec QPSK or 16-QAM 0.32 MHz
 800 kHz 640 ksym/sec QPSK or 16-QAM 0.64 MHz
 1.6 MHz 1280 ksym/sec QPSK or 16-QAM 1.28 MHz
 3.2 MHz 2560 ksym/sec QPSK or 16-QAM 2.56 MHz
 6.4 MHz 5.12 Msym/sec 64-QAM or 128-QAM TCM 5.12 MHz

Digitally Modulated Carrier CNR vs. BER

 Modulation format 1.0E-04 BER 1.0E-06 BER 1.0E-08 BER 1.0E-10 BER 1.0E-12 BER
 ASK & FSK 7 dB 9 dB 10 dB 11 dB 12 dB
 BPSK 9 dB 11 dB 12 dB 13 dB 14 dB
 QPSK 12 dB 14 dB 15 dB 16 dB 17 dB
 16-QAM 19 dB 21 dB 22 dB 23 dB 24 dB
 32-QAM 21 dB 23 dB 24 dB 25 dB 26 dB
 64-QAM 25 dB 27 dB 28 dB 29 dB 30 dB
 256-QAM 32 dB 34 dB 35 dB 36 dB 37 dB

•If one assumes that SNR is a baseband measurement, there is no easy way to measure baseband data SNR

•A better parameter is modulation error ratio (MER) or error vector magnitude (EVM). MER and EVM take into account the combined effects of CNR; transmitter, upconverter, or CMTS phase noise; impairments such as second and third order distortions; group delay; in-channel frequency response problems (amplitude tilt or ripple); and microreflections.

Why Measure MER?

•Somewhat analogous to in-channel signal-to-noise ratio (“MER” and “SNR” are often used interchangeably)

•Direct measure of modulation quality

•Direct linkage to BER

•Can be used in conjunction with adjacent channel power to estimate linear distortions

•Good metric for end-to-end health of a network, but provides little insight about the type of impairment

Modulation Error Ratio: Modulation Quality
Modulation error = Transmitted symbol – Target symbol
Source: Hewlett-Packard

Modulation Error Ratio
MER = 10log(average symbol power/average error power)
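
The sketch below implements this definition in Python for symbols expressed as complex I/Q values; the small QPSK-style data set is invented for illustration.

```python
import math

def mer_db(target_symbols, received_symbols):
    # MER = 10*log10(average symbol power / average error power)
    n = len(target_symbols)
    symbol_power = sum(abs(t) ** 2 for t in target_symbols) / n
    error_power = sum(abs(r - t) ** 2
                      for r, t in zip(received_symbols, target_symbols)) / n
    return 10 * math.log10(symbol_power / error_power)

targets  = [1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]    # ideal QPSK-like states
received = [1.02 + 0.97j, 0.99 - 1.03j, -1.01 + 1.02j, -0.98 - 0.99j]
print(round(mer_db(targets, received), 1))       # ~33.8 dB for these example values
```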

•Minimum recommended downstream MER (includes 3 to 4 dB of headroom for reliable operation): 64-QAM, 27 dB; 256-QAM, 31 dB

•Typical headend and node downstream MER should be 34 to 36 dB or greater

•QPSK typically requires a minimum MER of 10~13 dB, depending on CMTS make/model

•16-QAM typically requires a minimum MER of 17~20 dB, depending on CMTS make/model

•In both cases, noise appears random in the constellation display, while CW interference produces a “donut” shape

Why Measure EVM?
•Many engineers are familiar with EVM

•More comfort with a linear measurement

•Links directly with the constellation display

•Linear relationship between EVM and constellation “cloud” size

•No direct link between EVM and BER

Error Vector Magnitude: Modulation Quality
EVM: The magnitude of the vector drawn between the ideal (reference) state position and the measured state position.

Error Vector Magnitude
EVM = (RMS error magnitude/maximum symbol magnitude) x 100%

By convention, EVM is reported as a percentage of peak signal level, usually defined by the constellation’s corner states
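
The sketch below expresses this convention in Python, with the peak taken from the constellation's corner states; the 16-QAM-style values are invented for illustration.

```python
import math

def evm_percent(target_symbols, received_symbols):
    # EVM = (rms error magnitude / maximum symbol magnitude) * 100%
    n = len(target_symbols)
    rms_error = math.sqrt(sum(abs(r - t) ** 2
                              for r, t in zip(received_symbols, target_symbols)) / n)
    peak = max(abs(t) for t in target_symbols)   # corner-state magnitude
    return 100 * rms_error / peak

targets  = [3 + 3j, 3 - 3j, 1 + 1j, -1 - 3j]     # 16-QAM-style states, including corners
received = [3.05 + 2.95j, 2.96 - 3.02j, 1.03 + 0.98j, -1.02 - 3.04j]
print(round(evm_percent(targets, received), 2))  # ~1.2% of peak
```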

Error Vector Magnitude
•Some QAM analyzers measure downstream EVM

•Upstream EVM usually must be measured with a vector signal analyzer

 What About CMTS Upstream SNR?

•Broadcom 3137 burst demodulator chip used in most CMTSs can provide upstream SNR estimate

•CMTS SNR estimate is NOT the same thing as CNR that one would measure with a spectrum analyzer or similar test equipment

•The SNR estimate is more like MER or EVM. Group delay, microreflections, poor in-channel frequency response, and similar impairments will degrade the SNR estimate even though CNR appears fine

•Good tool for tracking average long-term trends

•Because of differences in the algorithms for CMTS-reported SNR versus a spectrum analyzer CNR measurement, there can be differences between the two values—especially if the CNR is extremely low or high. However, for the range of 15~25 dB CNR, where AWGN is the primary noise impairment, the two measurements should agree to within less than 2 dB.

•Impulse noise and certain other fast transient impairments generally will not show up in CMTS SNR estimate

•CMTS-reported SNR is a post-detection measurement. Anything that impairs proper demodulation (collisions, high burst/impulse noise, improper modem operation) or deficiencies in the data base (changes during the measurement process, blending of combined node performance) can skew reported SNR values.

•Other factors that may degrade CMTS-reported SNR, even when CNR is good, include improper modulation profiles, bad timing errors, and poor headend combiner/splitter isolation

•CMTS-reported SNR will always be less than, or at best equal to, CNR; it should never be better than CNR

Summary
•CNR is a measure of pre-detection signal quality
•CNR is ideal for characterizing performance of the cable network
•SNR, MER and EVM are measures of post-detection signal quality
•SNR is a useful metric with which to quantify a baseband signal—say, the video quality seen on a TV set
•SNR, MER and EVM—along with bit error rate—are ideal for characterizing the health of a digital signal