Hi Rafe.
rafeb wrote:
> >The only reason I can see that a greater number of bits would help is that
> >when you are at the extremities of the CCD's range, more bits should help
> >resolve meaningful data from noise, or by reducing the size of the steps,
> >reduce the loss of image information which lies between the steps at a lower
> >bit depth.
>
> Not quite. There's no point going for extra bits,
> without a corresponding decrease in overall system
> noise. If the noise is equal to one LSB at 8 bits,
> then it's 2 LSBs at 9 bits, 4 LSBs at 10 bits, etc.
Ditto, not quite.
You're assuming that all the noise is generated in the CCD and analogue stages.
In fact, bit dither in the A/D conversion can make a significant contribution to
the overall noise figure.
Techniques like correlated double sampling can significantly reduce the
contribution of analogue noise, whereas the digital noise, or uncertainty, is
always going to be in the region of plus or minus 1 LSB. Each extra bit takes the
digital noise down by about 6 dB.
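For anyone who wants to check that figure, here's a quick Python sketch (my own arithmetic, not anything from the scanner specs): one LSB as a fraction of full scale, expressed in dB, drops by roughly 6.02 dB per extra bit, because 20*log10(1/2) is about -6.02.

```python
import math

# Each extra bit halves the quantisation step, and halving an
# amplitude is a change of 20*log10(1/2) ~ -6.02 dB.
for bits in (8, 10, 12, 14):
    step = 1.0 / 2**bits                  # LSB as a fraction of full scale
    level_db = 20 * math.log10(step)
    print(f"{bits:2d} bits: 1 LSB = 1/{2**bits:5d} = {level_db:7.1f} dB re full scale")
```

Going from 8 to 14 bits is six extra bits, hence the roughly 36 dB difference between those two rows.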
A smaller voltage step between bit levels also means that the analogue noise
isn't excessively amplified by causing spurious digital bit changes.
For instance, noise that causes an oscillation between bit levels in an 8 bit
converter will always look as if it has an amplitude of 1/256th of the maximum
voltage input. The same noise in a 14 bit converter *might* look 64 times
smaller.
It will certainly look closer to its real value.
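You can see this effect with a toy simulation (again, my own sketch, with made-up numbers): a DC level sitting halfway between two 8-bit codes, plus analogue noise much smaller than an 8-bit LSB. The 8-bit converter makes the noise look like half an 8-bit LSB; the 14-bit converter reports something much closer to its true size.

```python
import random
import statistics

random.seed(1)

def quantize(x, bits):
    """Quantise a 0..1 signal to the given bit depth and scale back to 0..1."""
    levels = 2**bits
    return round(x * (levels - 1)) / (levels - 1)

# DC signal exactly between 8-bit codes 100 and 101, plus analogue noise
# whose RMS is far below one 8-bit LSB (but close to one 14-bit LSB).
dc = 100.5 / 255
noise_rms = 0.00005
samples = [dc + random.gauss(0, noise_rms) for _ in range(20000)]

for bits in (8, 14):
    out = [quantize(s, bits) for s in samples]
    print(f"{bits:2d} bits: apparent noise RMS = {statistics.pstdev(out):.6f}")
```

At 8 bits the output just oscillates between the two adjacent codes, so the apparent noise is pinned at about half an LSB of a 256-level scale, tens of times larger than the real analogue noise; at 14 bits the measured figure is close to the true RMS.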
> If you do the math, you'll find that using a 14-bit
> A/D on most CCD scanners is kind of silly; in such
> cases, one LSB generally equates to about 10-50
> microvolts of signal.
How do you work out this figure?
I make it more like 170 microvolts, since most CCDs have a saturation voltage in
the region of 2.8 volts.
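Showing my working (taking the 2.8 volt saturation figure above as given):

```python
# One LSB in microvolts, assuming a CCD saturation voltage of ~2.8 V.
v_sat = 2.8
for bits in (12, 14):
    lsb_uv = v_sat / 2**bits * 1e6
    print(f"{bits} bits: 1 LSB = {lsb_uv:.0f} microvolts")
```

That gives roughly 680 microvolts per LSB at 12 bits and 170 at 14, which is where my figure comes from.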
Anyway, microvolt signal levels are no big deal as long as the source impedance
is kept low.
I'm not saying that 14 bit A/Ds can be used to their full advantage by any means,
but their use isn't entirely wasted. The range of the signal from a CCD amounts
to about 12 bits, so the last useable 12 dB (0.6D) causes a change of 16 levels,
not just 4, as would be the case with a 12 bit A/D.
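To spell out that last bit of arithmetic (under my assumption that the darkest usable signal sits at full-scale/4096, i.e. a 12-bit range): the last 12 dB is a factor of 4 in voltage, so it spans signals between full-scale/4096 and full-scale/1024.

```python
# Codes available in the last 12 dB (~0.6 D) of a 12-bit signal range,
# assuming the darkest usable signal is full-scale/4096.
for bits in (12, 14):
    full_scale = 2**bits
    lo = full_scale // 4096    # darkest usable signal, in LSBs
    hi = full_scale // 1024    # 12 dB (4x) brighter, in LSBs
    print(f"{bits} bits: that span runs from level {lo} up to level {hi}")
```

With a 12-bit A/D that darkest span runs from level 1 up to level 4; with a 14-bit A/D it runs from 4 up to 16.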
The extra bits also give room for the scanner hardware to take advantage of any
improvement in sensor technology that may come along, without a major re-build. A
bit of 'future proofing' by the circuit designers.
I think I might detect the signs of such a change in the air.
Sony have recently cut their range of linear CCD sensors quite drastically,
perhaps to make room for something better?
And nearly all the new scanners are coming out with 14 bit A/D converters.
Maybe I'm just putting 2 and 2 together and making 5.
Regards, Pete.