So where do RGB sensors fit into this?  Chips read each color independently.

If one sensor is R, the next G, and the next B, doesn't that mean the distance 
between red sensors is at least twice their diameter?  So at most you sample the 
red light on only a fraction of the chip (in the usual Bayer layout it works out 
to a quarter of the sites for red, a quarter for blue, and half for green), 
unless you are one of those Foveon chips, which stack all three colors at every 
site.  This means there are dead spots for sure.
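To put a number on those dead spots, here is a small sketch (assuming the standard RGGB Bayer layout, which is my assumption about the chips being discussed) that counts how many photosites in a patch of sensor carry each color filter:

```python
def bayer_color(row, col):
    """Color filter at a photosite in an RGGB Bayer pattern."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

size = 8  # an 8x8 patch of the sensor
counts = {"R": 0, "G": 0, "B": 0}
for r in range(size):
    for c in range(size):
        counts[bayer_color(r, c)] += 1

total = size * size
for color in "RGB":
    print(color, counts[color] / total)
# R and B each land on 1/4 of the sites, G on 1/2
```

So for red specifically, three quarters of the chip really is "blind" and has to be filled in by interpolation.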

I'd bet that there is no grain in digital pictures because the grain is the pixel.  
Either it is the same one of 64,000 colors as the next pixel (interpolated from 
the RGB sensors), or it is not, because it is the edge of another color/shape.
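That interpolation step can be sketched very simply. This is plain bilinear demosaicing along one sensor row, which is an assumption on my part; real cameras use more elaborate methods, but the idea is the same: a site that didn't sample red borrows it from its red neighbors.

```python
def interpolate_red_at_green(left_red, right_red):
    """A green site between two red sites borrows their average red value."""
    return (left_red + right_red) / 2

# One sensor row of an RGGB pattern: R G R G R (red known only at R sites)
raw_row = [200, None, 210, None, 220]
red_row = list(raw_row)
for i in range(1, len(red_row) - 1, 2):
    red_row[i] = interpolate_red_at_green(red_row[i - 1], red_row[i + 1])
print(red_row)  # [200, 205.0, 210, 215.0, 220]
```

Neighboring pixels end up sharing values unless a real edge intervenes, which is why the result looks grainless.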

The part that gets me most here is experiences I have had with digital scans of 
slides.  I remember trying to send a picture of a sport fishing boat leaving the foggy 
harbor at sunrise to the PUG.  I had a devil of a time getting the banding out of the 
sky.  The light and color changed so slowly that you could see the changes from one of 
the 12 steps of red to the next.  It made for 3 or 4 diagonal bands across the sky in 
the distance.  What I needed here was a scan (or digital camera) with more bits per 
color, or shades per color.  The digital approximation was not so hot.  
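The banding effect is easy to reproduce on paper. Here is a sketch (the numbers are illustrative, chosen to match the "12 steps of red" above): a slow, smooth ramp quantized to too few levels collapses into a handful of visible bands, while more bits per color make the steps too small to see.

```python
def quantize(value, levels):
    """Map a 0.0-1.0 value onto the nearest of `levels` discrete steps."""
    step = int(value * (levels - 1) + 0.5)
    return step / (levels - 1)

width = 1000  # pixels across the "sky"
ramp = [i / (width - 1) for i in range(width)]  # smooth sunrise gradient

coarse = [quantize(v, 12) for v in ramp]    # ~12 shades, like the scan
fine = [quantize(v, 4096) for v in ramp]    # a 12-bit-per-channel scan

print(len(set(coarse)))  # 12 distinct values: visible bands
print(len(set(fine)))    # 1000 distinct values: every pixel differs
```

With only 12 representable shades, every pixel in the sky snaps to one of 12 values and the boundaries between them show up as bands; with 4096 levels the quantization is finer than the gradient itself.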

I've got to think that this will always be a problem to some extent.

Regards,  Bob S.

In a message dated 1/27/2003 3:06:40 PM Eastern Standard Time, [EMAIL PROTECTED] writes:

> No, not really.  The huge majority of area on a CCD is sensitive to
> incoming light, even more so with modern CMOS.  Just because the diode
> junction is sometimes small doesn't mean it doesn't "see" everything
> thru the microlens.
