On Wed, Dec 17, 2003 at 10:11:19AM -0800, Bruce Dayton wrote:
> One thing you are not factoring in to this issue is the output side.
> When the output is digital, you have the same basic problem. Each
> "pixel" is only one color.
Just because, at the output side, printers need to dither dots on paper
to create colors, does not mean that any penalties you paid at the input
side are irrelevant. It's like saying "printing enlargements magnifies
grain anyway, so there's less benefit than you'd think in using
finer-grained film over coarse-grained film".
Dithering means the printer has to make compromises to try to reach the
colour you want at a point. If you add uncertainty about *which* colour it
should try to reach into the mix (like with an interpolated Bayer-matrix
CCD), things get worse.
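To make the "compromise" concrete, here is a toy 1-D error-diffusion
dither in Python (a Floyd-Steinberg-style scheme reduced to one
dimension). It is only an illustration of the principle, not any real
printer's algorithm:

```python
# Toy 1-D error-diffusion dither: each input value in [0, 1] is
# snapped to the nearest printable level (here just 0 or 1, i.e.
# dot or no dot), and the leftover error is pushed onto the next
# sample so the *average* still comes out right.

def dither_1d(samples):
    out = []
    err = 0.0
    for s in samples:
        want = s + err                       # colour we're trying to reach
        dot = 1.0 if want >= 0.5 else 0.0    # nearest printable level
        err = want - dot                     # compromise carried forward
        out.append(dot)
    return out

# A flat 50%-grey patch becomes an alternating dot pattern whose
# average matches the input:
print(dither_1d([0.5] * 10))   # [1.0, 0.0, 1.0, 0.0, ...]
```

Note that the dither can only average out toward whatever value it is
handed: if that value is itself a demosaicing guess that is off by, say,
0.1, the printed patch faithfully averages to the wrong colour. The
output-side compromise is stacked on top of an input-side error that was
already there.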
> I believe digital mini-labs do this. So in fact, the color doesn't
> have to be faked as much as it has to be patterned. The downside to
> this is that certain "patterns" (especially man-made) could come out
> looking wrong. The natural random nature of film grain tends to hide
> this rather than accentuate it.
Film, with "pixels" of random sizes and shapes, distributed randomly,
even layered over and behind each other, has a huge advantage over
digital here.
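The pattern-interference point can be sketched in a few lines of Python.
This is a hypothetical 1-D illustration (not any real mini-lab's
pipeline): an ordered dither uses a fixed repeating threshold pattern,
so a man-made subject with its own regular period locks into a rigid
repeating artifact, while a random threshold, loosely analogous to
film's random grain, breaks the alignment:

```python
import random

# A tiny fixed, repeating threshold pattern (ordered dithering):
THRESHOLDS = [0.125, 0.625, 0.375, 0.875]

def ordered_dither(samples):
    # Compare each sample against the tiled threshold pattern.
    return [1 if s >= THRESHOLDS[i % len(THRESHOLDS)] else 0
            for i, s in enumerate(samples)]

def random_dither(samples, seed=0):
    # Random thresholds: loosely analogous to film's random grain.
    rng = random.Random(seed)
    return [1 if s >= rng.random() else 0 for s in samples]

# A man-made subject with its own regular period (say, woven fabric):
fabric = [0.4, 0.6] * 16

print(ordered_dither(fabric))   # rigid repeating artifact
print(random_dither(fabric))    # irregular, grain-like output
```

With the ordered dither, the output is perfectly periodic, and that
regularity is exactly the kind of visible "pattern" artifact being
described; the random version scatters the dots irregularly instead.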
I don't think the naively religious digital crowd realizes this. Sometimes
I hear of someone who scans a negative, sees what they take to be the
first film grain, and concludes that any higher-resolution scan of the
film is therefore fruitless. They don't seem to realize there are still
smaller grains to be resolved behind the big ones they saw first.
--
,_
/_) /| /
/ i e t e r / |/ a g e l