At 11:19 PM 1/12/01 +0000, Pete wrote:
>I'm not saying that 14 bit A/Ds can be used to their full advantage by any
>means, but their use isn't entirely wasted. The range of the signal from a
>CCD amounts to about 12 bits, so the last useable 12 dB (0.6D) causes a
>change of 16 levels, not just 4, as would be the case with a 12 bit A/D.
>The extra bits also give room for the scanner hardware to take advantage
>of any improvement in sensor technology that may come along, without a
>major re-build. A bit of 'future proofing' by the circuit designers.
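For what it's worth, a quick back-of-envelope check of the numbers above (my
reconstruction, not from the mail itself): a 0.6D density step is about a 4x
change in light, i.e. roughly 12 dB if you treat the CCD output as a voltage,
and you can count how many converter codes that darkest band spans.

```python
import math

# Darkest usable 0.6D band of a 12-bit-range CCD signal, counted in
# ADC codes for a 12-bit vs. a 14-bit converter.
density_step = 0.6
ratio = 10 ** density_step            # ~3.98x change in light
db = 20 * math.log10(ratio)           # ~12 dB, voltage convention

signal_bits = 12                      # usable CCD dynamic range

def codes_in_darkest_band(adc_bits):
    full_scale = 2 ** adc_bits
    floor = full_scale / 2 ** signal_bits   # smallest usable level, in codes
    top = floor * ratio                     # one 0.6D step above it
    return floor, top

print(codes_in_darkest_band(12))   # ~(1, 4):  only ~4 codes in the band
print(codes_in_darkest_band(14))   # ~(4, 16): ~16 codes to work with
```

Which matches Pete's "16 levels, not just 4."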
My primary peeve is that the number of bits in the A/D
converter is used as a selling point by the manufacturers,
and folks feel smug about the fact that they're sending
42-bit images to Photoshop, without really thinking things
through. (Without considering, for example, that the
four or five least-significant bits may be pure noise.)
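You can see this with a toy simulation (my numbers, not a measurement):
quantize a signal whose analog noise is a couple of 12-bit LSBs at 12, 14,
and 16 bits, and the total error barely changes.

```python
import random, math

random.seed(1)

FS = 1.0                        # full scale, arbitrary units
noise_rms = 8 * FS / 2 ** 14    # 8 fourteen-bit LSBs = 2 twelve-bit LSBs

def quantize(x, bits):
    step = FS / 2 ** bits
    return round(x / step) * step

def rms_error(bits, n=20000):
    # RMS error of the digitized value relative to the clean signal
    err = 0.0
    for _ in range(n):
        clean = random.uniform(0, FS)
        noisy = clean + random.gauss(0, noise_rms)
        err += (quantize(noisy, bits) - clean) ** 2
    return math.sqrt(err / n)

for bits in (12, 14, 16):
    print(bits, rms_error(bits))
# The 14- and 16-bit results barely improve on 12 bits: the error
# is dominated by the analog noise, not by quantization.
```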
The extra bits generally don't hurt anything, aside
from consuming bandwidth, power, memory, CPU cycles,
etc. But they don't necessarily help much either, if the
noise is well in excess of 1 LSB. Of course, it may be
possible to trade bandwidth for lower noise, and the
extra bits may help with that.
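The bandwidth-for-noise trade is basically oversampling: average N reads of
the same pixel and uncorrelated noise drops by roughly sqrt(N), at which
point the averaged value has sub-LSB detail that the extra bits can actually
hold. A toy sketch (my model, not anything a scanner is known to do):

```python
import random, math

random.seed(2)

noise_rms = 1.0   # analog noise, in units of one ADC LSB

def averaged_error(n_reads, trials=20000):
    # RMS error of the average of n_reads quantized noisy reads
    err = 0.0
    for _ in range(trials):
        true = 100.37   # arbitrary analog level, in LSBs
        reads = [round(true + random.gauss(0, noise_rms))
                 for _ in range(n_reads)]
        err += (sum(reads) / n_reads - true) ** 2
    return math.sqrt(err / trials)

for n in (1, 4, 16):
    print(n, averaged_error(n))
# Error falls roughly as 1/sqrt(n): ~1.0, ~0.5, ~0.26 LSB.
```

Note the noise itself acts as dither here, which is why the average can
resolve the .37 fraction at all.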
I suspect the manufacturers are using 14-bit converters these
days mostly because they've gotten very inexpensive.
And "14 bits" looks good on the side of the box and
in the specs.
I hope you're right about pending improvements in sensor
technology. That would be nice. Though I wonder if
the bulk of research now isn't in area sensors (for
digicams) rather than the linear sensors used in film
scanners. The digicam market is much larger, I think.
PS: My figure of 10-50 uV for a 14-bit step was based on
a recent measurement from a cheap flatbed scanner at work.
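(For anyone checking that figure, the arithmetic is just step size times
2^14; the implied full-scale voltages below are my inference, not a spec.)

```python
# What full-scale voltage does a 10-50 uV step imply for 14 bits?
levels = 2 ** 14   # 16384 steps
for step_uV in (10, 50):
    print(step_uV, "uV/step ->", step_uV * levels / 1e6, "V full scale")
# 10 uV -> ~0.16 V; 50 uV -> ~0.82 V
```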
rafe b.