> Regardless of the fact that in 1934 (as was indicated to me in
> off-reflector email) we used to not use the meter for the S report, at
> some time (the 1970s, when proper calibration and standardization came
> about) we were able to shift that OLD, antiquated 1934 definition over
> to a STANDARDIZED S meter reading as part of the RST.
There may be a "standard" some people believe in, but it is a "paper standard" that never took hold. Drake used 5 dB per S unit as a goal, ICOM about the same. Collins targeted around 3 dB per S unit. Most receivers are around 1 dB or so per S unit down near S1, and very few prior to digital processing were ever remotely linear over the S range. My FT-1000MP Mark V, sitting in front of me now, reads 2 S units per 6 dB at S8, and the very same 6 dB pad drops it from S5 to S0 (it has that scale point, even though there is no such thing). I've never measured the K3, for many reasons.

S meters historically have been very poor: absolute signal level at a receiver is not an indication of field strength in volts per meter, volts per meter is not a constant indication of S/N ratio or even of how "loud" a signal is, and so on. This whole thing is an exercise similar to arguing how to measure plate milliamps using a #47 light bulb.

How would Elecraft or anyone else measure the meaningless S units of an S3 signal when the RF gain is set so the DSP only sees an S5 signal at the lowest signal sensitivity? Why work to know what isn't even important, and what is never useful?

73 Tom

______________________________________________________________
Elecraft mailing list
Home: http://mailman.qth.net/mailman/listinfo/elecraft
Help: http://mailman.qth.net/mmfaq.htm
Post: mailto:[email protected]

This list hosted by: http://www.qsl.net
Please help support this email list: http://www.qsl.net/donate.html
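[For anyone following along: the nonlinearity Tom describes can be seen with simple arithmetic. This is a hypothetical worked sketch of the step-attenuator measurement, using only the numbers quoted for the FT-1000MP Mark V; the 6 dB/S-unit figure is the "paper standard" under discussion, not a measured property of any radio.]

```python
def db_per_s_unit(db_change, s_unit_change):
    """Average dB per S unit implied by inserting a known attenuator
    and watching how far the meter moves."""
    return db_change / s_unit_change

# At S8: a 6 dB pad moves the meter 2 S units.
high_end = db_per_s_unit(6, 2)   # 3.0 dB per S unit

# Near the bottom of the scale: the same 6 dB pad drops S5 to S0,
# i.e. 5 S units of meter travel for the same 6 dB.
low_end = db_per_s_unit(6, 5)    # 1.2 dB per S unit

print(high_end, low_end)
```

Same pad, same receiver, and the implied "size" of an S unit changes by a factor of 2.5 depending on where you are on the scale, which is the point: the meter is nowhere near the 6 dB per S unit paper standard at either end.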

