The sensor has a Bayer grid in front of it, RGGB in most cases.  The
luminance at any pixel has ALREADY been selectively assigned a designation
as either red, green or blue, by physical means and not in software.  The
pixel cannot carry information in any colour channel but the one of the
filter above it.  It only subsequently becomes a three-channel RGB site
because the other two channels are filled in from surrounding pixel data.

Any field of any colour and any luminance will demonstrate the same concept
as my white field example, except in those cases the data will look like:
XXX,0,0 - 0,XXX,0 - 0,0,XXX - 0,XXX,0 - and so on. (XXX = any number you
like between 0 and 255 on an 8-bit scale)
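The pattern above can be sketched in a few lines. This is a minimal
illustration, assuming an RGGB layout with red at the top-left corner
(layouts vary by camera), not any particular converter's code:

```python
# Sketch of a uniform field seen through an RGGB Bayer mosaic.
# Each photosite records one channel only; the other two stay zero
# until demosaicing fills them in. (Illustrative 8-bit values.)

def mosaic_sample(row, col, value):
    """The (R, G, B) triple a single photosite contributes for a
    uniform field of the given value, assuming an RGGB layout."""
    if row % 2 == 0 and col % 2 == 0:      # red site
        return (value, 0, 0)
    if row % 2 == 1 and col % 2 == 1:      # blue site
        return (0, 0, value)
    return (0, value, 0)                   # green sites (two per 2x2 tile)

# First row of a white (255) field: R, G, R, G ...
print([mosaic_sample(0, c, 255) for c in range(4)])
# Second row: G, B, G, B ...
print([mosaic_sample(1, c, 255) for c in range(4)])
```

The first print gives (255,0,0) - (0,255,0) - (255,0,0) - (0,255,0),
which is exactly the kind of sequence quoted below.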

You don't need no stinkin' "properly done algorithms" to view no "spatial
resolution intensity map".  IMO any algorithm applied to the RAW data will
alter it into something that isn't RAW data any longer.  What you need is a
RAW converter that will provide the unmosaiced greyscale image.  Until last
night I had one of those, called SharpRaw (uninstalled in favour of s7RAW,
which better suits my camera).  As I said in an earlier post, this type of
image looks nothing like an ordinary greyscale image: it has a strong
latticed appearance, because the very different luminance levels of the R,
G and B information are displayed as separate but enmeshed single-colour
images, rather than as a blended array of pixels each bearing the full RGB
data.
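That lattice is easy to see numerically. A hypothetical scene colour of
(R, G, B) = (200, 120, 40) is assumed below purely for illustration; the
point is that neighbouring photosites under different filters record very
different raw values:

```python
# Why an unmosaiced greyscale rendering looks latticed: for anything
# but a neutral scene, the raw value at each photosite depends on which
# filter sits above it, so neighbours alternate sharply.
# Hypothetical scene colour (R, G, B) = (200, 120, 40).

scene = {'R': 200, 'G': 120, 'B': 40}
rggb = [['R', 'G'], ['G', 'B']]  # one 2x2 tile of an RGGB mosaic

for row in range(2):
    print([scene[rggb[row][c % 2]] for c in range(4)])
# Row 0 alternates 200, 120, ... and row 1 alternates 120, 40, ... :
# a strong checkerboard, not a smooth grey.
```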

Godfrey writes:
> Translating the monochromatic
> value recorded by the sensor into a luminance and chrominance value
> through the bayer matrix, transforming to LAB color space and then
> discarding the chrominance value will provide the spatial resolution
> intensity map (luminance) that the chip's pixel sites suggest, if the
> algorithms are done properly. Properly done algorithms include
> calculation for the effect of the filters on the intensity map.

Huh?  That's interpolation followed by conversion to greyscale!  Besides
which, the image data after conversion from RAW is an RGB file.  It has no
separate luminance and chrominance channels; that information is implicit
in the RGB data, which encodes all of those values at each pixel
simultaneously.  To get to an RGB file the RAW data was interpolated.  When
Photoshop or any other editor provides data channels like HSL or CMYK or
LAB or God knows what else, it only gets them by deriving them from the
RGB, which was itself derived from interpolated data.
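The "derived from the RGB" step can be sketched too. The Rec. 601 weights
below are just one common luma formula; a given editor may use Rec. 709
weights or LAB's L* instead, so treat this as an illustration, not any
particular program's method:

```python
# Luminance is computed *from* the interpolated RGB triple, not read
# off the sensor directly. Rec. 601 weights shown as one common choice.

def luma_601(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

# A neutral pixel after interpolation keeps its value:
print(luma_601(255, 255, 255))      # approx. 255
# A saturated red pixel's luminance is a derived, weighted value:
print(luma_601(255, 0, 0))          # approx. 76
```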

IOW none of the data in a Bayer-array-sourced image file is pure
information; it is a construct synthesised from three enmeshed grids, of
which the red and blue grids are three quarters empty while the green grid
is half empty.  They only fill up with data when they are demosaiced, i.e.
interpolated, during RAW conversion.
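Counting the samples per channel in an RGGB tile shows those fractions
directly (red and blue grids three quarters empty, green half empty before
demosaicing) - a small sketch, not taken from any converter:

```python
# Per-channel sample counts in an RGGB mosaic: before demosaicing,
# red and blue cover 1/4 of the photosites each and green covers 1/2;
# the remaining values at each site are synthesised from neighbours.

from collections import Counter

rggb = [['R', 'G'], ['G', 'B']]  # repeating 2x2 tile
side = 8                         # any even sensor size will do
counts = Counter(rggb[r % 2][c % 2] for r in range(side) for c in range(side))

total = side * side
for ch in 'RGB':
    print(ch, counts[ch] / total)   # R 0.25, G 0.5, B 0.25
```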

regards,
Anthony Farr 

> -----Original Message-----
> From: Godfrey DiGiorgi [mailto:[EMAIL PROTECTED]
> 
> >> A white field from the bayer converted with no interpolation would
> >> look
> >> like: 255,0,0 - 0,255,0 - 0,0,255 - 0,255,0 - and so on.
> >> After interpolation it looks like:
> >> 255,255,255 - 255,255,255 -255,255,255 - and so on.
> 
> A white field will look like that, and a black field will look like
> the converse. Those are the only two points in the spectrum where you
> don't need the rest of the surrounding pixels to demonstrate how the
> individual pixels' value is rendered. Translating the monochromatic
> value recorded by the sensor into a luminance and chrominance value
> through the bayer matrix, transforming to LAB color space and then
> discarding the chrominance value will provide the spatial resolution
> intensity map (luminance) that the chip's pixel sites suggest, if the
> algorithms are done properly. Properly done algorithms include
> calculation for the effect of the filters on the intensity map.
> 
> Of course this is not simple to visualize.
> 
> But to get back to the actual thing being discussed, all you have to
> do is look at the pictures taken with 35mm film and an *ist D/DS,
> using the same lens or with lenses to provide the same field of view,
> and you can see the difference in DoF. And it will follow the
> calculations roughly posed in this thread. So to say that it is
> "impossible to compare" them, implying to calculate them, is hogwash.
> 
> Godfrey

