I have always disliked the term "Minimum Discernible Signal" (MDS), because it really is a misnomer. Obviously different people can discern a signal at different levels, and what does it mean to "discern" a signal anyway?

The technical definition of MDS is simply the effective noise level in the receiver. It is typically measured by injecting a low-level CW carrier. When the signal-plus-noise reading is 3 dB above (twice the power of) the noise alone, the signal level equals the noise level, and that signal level is the MDS.
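As a quick sanity check on that arithmetic (a minimal Python sketch, not part of any formal test procedure), equal signal and noise powers give exactly the 3 dB rise:

```python
import math

def snr_rise_db(signal_w, noise_w):
    """Rise in the power reading when the signal is added to the noise, in dB."""
    return 10 * math.log10((signal_w + noise_w) / noise_w)

# When signal power equals noise power, (S+N)/N = 2, i.e. about 3.01 dB.
print(round(snr_rise_db(1.0, 1.0), 2))  # 3.01
```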

A receiver's MDS is a strong function of bandwidth. If you increase the bandwidth by 10x, the MDS goes up (gets worse) by 10 dB. The standard bandwidth for receiver testing is 500 Hz.

Since an SSB receiver is nothing more than a frequency translator from radio to audio frequencies, you can measure the signal-to-noise ratio at the audio output. However, theoretically you need a true RMS voltmeter to do an accurate measurement. A typical multimeter reads the average voltage of the rectified AC waveform, which is a little different from the RMS value. The error is small, however. You can correct for it by adjusting the signal generator for a 3.2 dB increase (1.445 voltage ratio) instead of 3.0 dB.
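The 1.445 figure is just the 3.2 dB step expressed as a voltage ratio (10^(dB/20)); a quick sketch of the conversion:

```python
def db_to_voltage_ratio(db):
    """Convert a power change in dB to the equivalent voltage ratio."""
    return 10 ** (db / 20)

print(round(db_to_voltage_ratio(3.0), 3))  # 1.413 (true-RMS meter)
print(round(db_to_voltage_ratio(3.2), 3))  # 1.445 (average-responding meter)
```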

A bigger source of possible error is the automatic gain control in the receiver. Disable it if possible to make sure it doesn't affect the measurement.

Recent editions of the ARRL Handbook (2012 or later) have a thorough explanation of how to measure MDS, in Section 25.5 "Receiver Measurements."

Alan N1AL (author of Chapter 25)


On 05/04/2014 05:34 AM, Brian Alsop wrote:
It was time to drag out the test gear and give my older K3 a going over.

One thing of interest was Minimum Detectable Signal.

There is a WIKI definition for it but the equation doesn't help me a bit.

I just thought I'd attach a calibrated signal generator and keep
reducing the level (for a 500 Hz bandwidth) until I couldn't hear it.
That doesn't seem subjective since the signal generator is a constant
tone and not information to decode.

Then I tried looking at Spectrogram and defining a minimum S/N ratio
which would define MDS.  The MDS values derived this way were much lower
for a 6 dB S/N ratio.  In fact, I couldn't hear the signal at this S/N
ratio!  The integration time constant used for averaging clearly was
helping with detection.  So what time constant would be appropriate for
normal CW?

How indeed is MDS measured quantitatively?

73 de Brian/K3KO



______________________________________________________________
Elecraft mailing list
Home: http://mailman.qth.net/mailman/listinfo/elecraft
Help: http://mailman.qth.net/mmfaq.htm
Post: mailto:[email protected]

This list hosted by: http://www.qsl.net
Please help support this email list: http://www.qsl.net/donate.html
Message delivered to [email protected]

