> Gamma 1.0 means almost all values of 0..ffff are used to encode the light
> zone. The dark part gets squeezed into relatively few codes. And what happens
> if these critical codes are holding noise? This is another reason why I
> always recommend avoiding gamma 1.0 and using 2.2...2.4 if possible.

Marti,

typical consumer scanners have (nearly) linear CCD sensors and a linear ADC,
so internally they capture the data at gamma 1.0 anyway, typically with
12..16-bit ADC resolution. Thereafter, either the scanner applies gamma
internally and returns gamma-corrected data, or it returns the linear
(gamma 1.0) data directly at 16bpp (which usually exceeds the ADC resolution
anyway) and the input table of the profile's LUT performs the gamma encoding.
So in principle I really don't see any difference between these two cases
(except if 8-bit encoding were used for the linear data).
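To make that parenthetical concrete, here is a small sketch (my own
illustration, not from lcms) counting how many 8-bit code values land in the
darkest part of the tone scale, say relative luminance <= 0.01, for linear
versus gamma-2.2 encoding. The threshold and bit depth are arbitrary choices
for the demonstration:

```python
def codes_in_shadows(gamma, bits=8, threshold=0.01):
    """Count code values whose decoded linear luminance is <= threshold."""
    levels = 2 ** bits
    count = 0
    for code in range(levels):
        encoded = code / (levels - 1)
        luminance = encoded ** gamma  # decode back to linear light
        if luminance <= threshold:
            count += 1
    return count

# Linear (gamma 1.0) 8-bit encoding leaves only a handful of codes for
# the shadows, while gamma 2.2 devotes roughly ten times as many to them.
print(codes_in_shadows(gamma=1.0))  # -> 3
print(codes_in_shadows(gamma=2.2))  # -> 32
```

With 16-bit linear data the shadows still get hundreds of codes, which is why
the "scanner applies gamma" and "profile LUT applies gamma" paths end up
equivalent in practice; only at 8 bits does linear encoding starve the dark
end.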

Best Regards,
Gerhard




