"Raimo Korhonen" <[EMAIL PROTECTED]> wrote:

>Maybe this could be avoided by the use of some kind of optics in front of the chip?

The legendary Bill Peifer addressed this many moons ago:

"Peifer, William [OCDUS]" <[EMAIL PROTECTED]> wrote:

>All this talk about "analog" vs. "digital" lenses has got me wondering a
>bit.  I'm curious where this whole idea of CCD sensors requiring (or
>preferring) perpendicular rays originated.  I'm pretty convinced that it
>must have originated because somewhere along the line, something got taken
>out of context, and a fundamentally incorrect idea grew from there.  From
>the standpoint of the underlying physics, Tom is absolutely right -- the
>purpose of a lens is to bring an image to critical focus at the focal plane,
>and the nature of the sensor (film, CCD, CMOS, or other) isn't particularly
>relevant.  After all, if all the light rays strike the sensor
>perpendicularly, then they are necessarily parallel and thus cannot form an
>image at the focal plane!
>
>I suspect that this perpendicular-ray story -- dare I say "legend"? -- may
>have originated from a misinterpretation of the characteristic behavior of
>CCD sensors.  We all know that in single-chip color CCD sensors, some of the
>pixels are sensitive to red, others to green, and still others to blue.  For
>the case of color cameras with single CCD sensors, color sensitivity is
>imparted to a particular pixel by incorporating a microscopic optic -- a
>lenslet and filter -- in front of that pixel, which I believe is
>accomplished as part of the manufacturing process for the sensor chip.  I
>can imagine that the numerical aperture of this microscopic optic may not be
>terribly large, and it might very well constrain the field of view of its
>corresponding pixel.  Maybe someone that knows more about chip fab can
>comment on this.  Anyway, although each individual pixel may very well be
>"looking" through an optic with small numerical aperture, it's only
>"looking" a very short distance (microns?  tenths of microns?) to the
>illuminated spot on the focal plane directly in front of it.  In fact, this
>is precisely what you want.  If each pixel had a more "wide-angle" view, it
>would not only register the intensity of light directly in front of it, but
>it would also register the intensity of light from immediately adjacent
>pixels (perhaps pixels intended to sense a different color), resulting in a
>spatially and chromatically degraded image.  The characteristics of the
>macroscopic, "analog" lens mounted onto the front of the camera -- focal
>length, f-number, etc. -- aren't particularly relevant, except that a faster
>"analog" lens will make each pixel-size spot of light at the focal plane
>correspondingly brighter.
>
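Bill's point about a faster lens can be made concrete with a little arithmetic: at the focal plane, the f-number sets how steep the cone of converging rays is at each pixel. A rough thin-lens sketch (the function name and the f-stops chosen are just illustrative):

```python
import math

def cone_half_angle_deg(f_number: float) -> float:
    """Half-angle of the marginal-ray cone converging on the focal
    plane, from the thin-lens geometry tan(theta) = 1 / (2 * N)."""
    return math.degrees(math.atan(1.0 / (2.0 * f_number)))

for n in (1.4, 2.8, 5.6, 11):
    print(f"f/{n}: marginal rays arrive ~{cone_half_angle_deg(n):.1f} deg off normal")
```

So even at f/1.4 the marginal rays are only about 20 degrees off perpendicular, and stopping down narrows the cone further -- which is why the microlens acceptance angle is rarely a problem in practice.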
>Jaume's original question about spectral characteristics of particular
>lenses and lens coatings is interesting as well.  The general strategy in
>designing the ~lens~ is, among other things, to reduce chromatic aberration;
>that is, to get red, green, and blue rays from a single object point to
>focus at a single point on the same focal plane.  I think lens ~coatings~
>are generally optimized to match the response of the human eye, rather than
>the film emulsion.  (Likewise, most film emulsions -- excluding infrared, of
>course -- are designed to match the human eye.)  I believe that the general
>strategy in designing antireflection coatings (like SMC) is to minimize the
>reflective loss of green light, since green is the color our eyes are most
>sensitive to.  This doesn't mean that the coated lens passes primarily green
>light; rather, it means that for the roughly 4% of light that would otherwise
>be lost at each air-glass interface of an uncoated lens element, the lens
>designers try to "rescue" the green component by applying a green-optimized
>antireflection coating.  CCDs are more sensitive to the red end of the
>spectrum than the human eye.  You might imagine that in order to maximize
>the signal level at the focal plane of the CCD, a lens designer might
>consider using antireflection coatings optimized for passing red light.
>However, this would yield an image with what we would perceive as a highly
>perturbed color balance.  In fact, for consumer imaging applications,
>designers use filters that ~decrease~ the intensity of far red and near
>infrared light impinging on the sensor.  Thus, I can't imagine that consumer
>digital camera designers would go to the expense of new lens designs, or
>bodies specific for old vs. new lenses.  (Although that would certainly be
>an interesting marketing gimmick....)
>
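For anyone who wants to play with the quarter-wave idea, here's a rough normal-incidence model of a single-layer antireflection coating, using the standard thin-film characteristic-matrix result. The indices are generic textbook values for MgF2 on crown glass, not anything specific to SMC:

```python
import cmath
import math

def single_layer_reflectance(wavelength_nm, n_film=1.38, n_glass=1.52,
                             design_nm=550.0):
    """Normal-incidence reflectance of a single quarter-wave coating
    (e.g. MgF2, n ~ 1.38, on crown glass, n ~ 1.52), with the layer's
    optical thickness tuned to one quarter of design_nm."""
    n0 = 1.0                                              # air
    # phase thickness: exactly pi/2 at the design wavelength
    delta = (math.pi / 2.0) * (design_nm / wavelength_nm)
    # characteristic matrix of the layer
    m11 = cmath.cos(delta)
    m12 = 1j * cmath.sin(delta) / n_film
    m21 = 1j * n_film * cmath.sin(delta)
    m22 = cmath.cos(delta)
    # combine with the substrate to get the effective admittance
    b = m11 + m12 * n_glass
    c = m21 + m22 * n_glass
    r = (n0 * b - c) / (n0 * b + c)
    return abs(r) ** 2

uncoated = ((1.52 - 1.0) / (1.52 + 1.0)) ** 2
print(f"uncoated glass surface: {uncoated:.2%}")
for wl in (450, 550, 650):
    print(f"coated, {wl} nm: {single_layer_reflectance(wl):.2%}")
```

The reflectance dips to its minimum at the 550 nm design wavelength (green, where the eye is most sensitive) and creeps back up toward the blue and red ends -- exactly the behavior Bill describes.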
>Just as a final aside, I'll mention a pet peeve of mine.  It seems that in
>many discussions, we refer to film-based and CCD-based imaging as "analog"
>and "digital".  This is really an artificial distinction.  CCDs, after all,
>~are~ analog sensors, and the readout electronics for CCDs are analog
>circuits.  The only thing that makes "digital" cameras digital is the way
>the analog signal array is stored after being read off the CCD sensor.  A
>minor point, but a pet peeve nonetheless.
>
>Bill Peifer
>Rochester, NY
>-
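Bill's closing point can even be put in code: everything upstream of the ADC is analog, and the "digital" part is just a mapping of the readout voltage onto discrete code levels. A toy 12-bit model (not any real camera's pipeline):

```python
def quantize(voltage: float, full_scale: float = 1.0, bits: int = 12) -> int:
    """Model the one genuinely 'digital' step in a digital camera:
    an ADC mapping the CCD's analog output voltage onto 2**bits
    discrete code values, clamped to the converter's input range."""
    levels = 2 ** bits
    clamped = min(max(voltage, 0.0), full_scale)
    return round(clamped / full_scale * (levels - 1))

# A mid-scale analog voltage becomes one of 4096 discrete codes.
print(quantize(0.5))
```

Everything before that call -- charge accumulation, charge transfer, the output amplifier -- is as analog as film.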

-- 
Mark Roberts
www.robertstech.com
-
This message is from the Pentax-Discuss Mail List.  To unsubscribe,
go to http://www.pdml.net and follow the directions. Don't forget to
visit the Pentax Users' Gallery at http://pug.komkon.org .
