The distortion introduced by ADCs is intrinsically very different from 
that introduced by typical analog circuitry. ADCs have a transfer 
function which is highly non-linear on the small scale (partly because 
of the quantisation steps intrinsic to the ADC process, but mainly 
because of the dynamic non-linearity and the structure in the integral 
non-linearity). Normal analog circuitry, on the other hand, has a very 
smooth transfer function on the small scale and a smooth transition into 
saturation (an exception here would be the cross-over distortion of 
class-AB or class-B amplifiers). Put another way, if we tried to 
describe the transfer function by a polynomial, then an ADC would 
require a polynomial of order up to about the number of distinct levels 
in the ADC (i.e. for a 16-bit ADC we would need a polynomial of order 
roughly 65,000), whereas a good analog circuit used below saturation 
would only need a polynomial of order less than ten.
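To make this concrete, here is a small NumPy sketch (the 4-bit quantiser 
and the degree-9 fit are just illustrative choices, not a model of any 
real ADC): a low-order polynomial reproduces a smooth, mildly 
compressive analog-style transfer function essentially exactly, but it 
cannot follow the staircase of even a 4-bit quantiser, where the fit 
error stays stuck near half an LSB.

```python
import numpy as np

lsb = 2.0 / 16                        # 4-bit quantiser over [-1, 1)
x = np.linspace(-1.0, 1.0, 4000, endpoint=False)

# Ideal quantiser staircase: 16 distinct output levels
stair = (np.floor(x / lsb) + 0.5) * lsb
n_levels = len(np.unique(stair))      # 16

# A "good analog" transfer function: smooth, mildly compressive cubic
smooth = x - 0.1 * x**3

def max_fit_error(y, order=9):
    """Largest residual after a least-squares polynomial fit of given order."""
    coeffs = np.polyfit(x, y, order)
    return np.max(np.abs(y - np.polyval(coeffs, x)))

resid_smooth = max_fit_error(smooth)  # tiny: the cubic is captured exactly
resid_stair = max_fit_error(stair)    # near lsb/2: the steps can't be followed
```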

This fundamental difference is the reason why the distortion products of 
an ADC stay roughly constant in absolute power level (i.e. in dB 
relative to full scale, or dBFS) as the signal level is increased, as 
you noted in your two-tone tests. This can often be seen in the data 
sheets, for example p. 20 of the TI AFE8406 dual digital receiver, or 
p. 20 of the Analog Devices AD9446 ADC. So agreed, TOIP has little use 
for ADCs, where the transfer function is dominated by higher-order 
distortion.

In contrast, for an analog circuit the absolute level of the major 
distortion products increases *faster* than the signal level itself as 
the level is raised, and hence the concept of TOIP does make sense for 
this type of circuit.
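For reference, the usual TOIP arithmetic looks like this (the +30 dBm 
intercept is a hypothetical number for illustration only): on the 
classic model the third-order products rise 3 dB for every 1 dB of 
input, which is exactly the behaviour the ADC spurs fail to show.

```python
def imd3_dbm(p_in_dbm, toip_dbm):
    """Third-order two-tone IMD level (per tone) on the classic 3:1 model."""
    return 3.0 * p_in_dbm - 2.0 * toip_dbm

toip = 30.0                      # hypothetical amplifier, +30 dBm TOIP
low = imd3_dbm(-10.0, toip)      # -90 dBm
high = imd3_dbm(0.0, toip)       # -60 dBm: 10 dB more input, 30 dB more IMD
```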

One way of effectively "smoothing out" the transfer function, and so 
reducing the level of the distortion products, is to add a dither 
signal. The effect is to randomise the errors that occur for a given 
input signal level. Without dither, for a certain input voltage the ADC 
output will be in error relative to the ideal infinite-resolution case, 
and this error will always be the same. However, if we add a changing 
noise-like signal, then successive samples of the ADC output will have 
different, uncorrelated errors. The fact that they are uncorrelated 
means they will no longer appear in a spectrum as a distinct spur but as 
noise smeared out over the whole bandwidth, i.e. it will add a small 
amount to the noise floor (if the dither signal is itself a wideband 
noise signal then it too will raise the noise floor by an amount 
dependent on its relative level). I think the 16 MHz weak tone you added 
in your investigation is acting as a dither signal. In this case, since 
the dither is not a random noise signal, the power from the distortion 
products will not be spread evenly over the whole bandwidth but will 
appear as a number of weak spurs. Still, the effect will be to reduce 
the third-order IMD level.
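A quick way to see the dither effect numerically is the NumPy sketch 
below (the 6-bit quantiser, coherent tone placement and +/-1 LSB 
triangular dither are illustrative assumptions, not a model of the 
SDR-IQ): without dither the largest quantisation spur sits well above 
the floor, while with dither it drops, traded for a slightly higher but 
featureless noise floor.

```python
import numpy as np

rng = np.random.default_rng(0)
N, bits, k = 4096, 6, 127            # FFT size, ADC resolution, tone bin
lsb = 2.0 / 2**bits
x = 0.9 * np.sin(2 * np.pi * k * np.arange(N) / N)   # coherently sampled tone

def quantise(v):
    """Ideal mid-tread quantiser, clipped to the input range."""
    q = np.round(v / lsb) * lsb
    return np.clip(q, -1.0, 1.0 - lsb)

def max_spur_dbc(y):
    """Largest spectral line relative to the carrier, carrier and DC excluded."""
    s = np.abs(np.fft.rfft(y))
    carrier = s[k]
    s[0] = s[k] = 0.0
    return 20.0 * np.log10(s.max() / carrier)

spur_plain = max_spur_dbc(quantise(x))
tpdf = (rng.random(N) + rng.random(N) - 1.0) * lsb   # triangular dither
spur_dither = max_spur_dbc(quantise(x + tpdf))
# spur_dither comes out well below spur_plain: the fixed, correlated
# quantisation errors have been smeared out into broadband noise
```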

There is a good article by Walt Kester on the Analog Devices website 
which explains why dither can improve the SFDR of an ADC:
http://www.analog.com/library/analogdialogue/archives/40-02/adc_noise.html

If the application we are interested in is SDR for shortwave reception 
using an ADC followed by a digital down-converter (i.e. not the 
IQ-mixer-plus-audio-ADC approach), then my guess is that dither is 
completely unnecessary, since the input signal will intrinsically 
contain a lot of noise and uncorrelated weak signals spread all over the 
input frequency range, and these will effectively provide the desired 
dither function. In fact, from this point of view it may even be 
desirable not to restrict the ADC input bandwidth by filtering too much, 
to ensure there is enough noise to dither out the ADC's small-scale 
non-linearities! For the direct-conversion approach, the performance of 
a (24-bit) audio ADC may be good enough to do without dither at all.

Nick, G4JNX.

Andy Talbot wrote:
> I've placed the test description and results of some SDR-IQ linearity
> tests (in .PDF format) on my website given below.  Look in  the SDR
> projects section.
>
> www.scrbg.org/g4jnt/SDRProjects.htm
>
> If one thing comes out of this test - it shows that the concept of
> using Third Order Intercept Point (TOIP) as a measure of performance
> for wideband digital receivers  would appear to be completely flawed.
>
> For example, adding in a third tone F3, at a lower level while
> monitoring the 2.F1 - F2 Intermod product, causes that product to drop
> in amplitude dramatically.   And why does the IMP stay reasonably
> constant (in level, not dBc)  as the input level of the two tones is
> changed.
>
> Explanations please !
> We're well into new territory here.
>
> Andy  G4JNT
> www.scrbg.org/g4jnt
>   
