> Date: Tue, 20 Dec 2011 07:04:20 +1100
> From: [email protected]
> To: [email protected]
> Subject: Re: [Gimp-developer] Luminosity in LAB does not agree with Wikipedia or Matlab
>
> How about just implementing the conversion correctly? To convert from RGB
> to L*a*b*, one has to convert from RGB to XYZ, then XYZ to L*a*b*. A typical
> working RGB space (like sRGB, AdobeRGB etc.) has a gamma encoding. So RGB to
> XYZ removes the gamma encoding (since XYZ is linear light), and then L*a*b*
> uses a 1/3 power which models perceptual linearity. Of course, to display an
> L*a*b* value you have to convert it to XYZ and then to the display's RGB
> space. [And yes, there are many possible avenues for optimising the
> performance of such conversions.]
> 
> Graeme Gill.
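
For reference, a minimal sketch of the pipeline Graeme describes, in Python.
It assumes the sRGB encoding curve, the standard sRGB-to-XYZ matrix, and the
D65 reference white; the function names here are just for illustration, and a
real implementation would go through proper colour management (e.g. lcms)
rather than hard-coding these:

def srgb_to_linear(v):
    """Remove the sRGB gamma encoding; the result is linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_rgb_to_xyz(r, g, b):
    """Linear sRGB -> XYZ (D65), using the standard sRGB matrix."""
    x = 0.4124564 * r + 0.3575761 * g + 0.1804375 * b
    y = 0.2126729 * r + 0.7151522 * g + 0.0721750 * b
    z = 0.0193339 * r + 0.1191920 * g + 0.9503041 * b
    return x, y, z

def xyz_to_lab(x, y, z, white=(0.95047, 1.0, 1.08883)):
    """XYZ -> L*a*b*: the CIE 1/3 power, with the standard linear
    segment near black (epsilon = 216/24389, kappa = 24389/27)."""
    def f(t):
        return t ** (1.0 / 3.0) if t > 216.0 / 24389.0 \
            else ((24389.0 / 27.0) * t + 16.0) / 116.0
    fx, fy, fz = (f(c / w) for c, w in zip((x, y, z), white))
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)

def srgb_to_lab(r, g, b):
    """Gamma-encoded sRGB in [0, 1] -> L*a*b*."""
    rgb_lin = [srgb_to_linear(c) for c in (r, g, b)]
    return xyz_to_lab(*linear_rgb_to_xyz(*rgb_lin))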


If an image's raw RGB pixel values already model perceptual linearity (when
viewed on a standard RGB monitor), and L*a*b* values also model perceptual
linearity, it should follow that converting a linear greyscale gradient from
one to the other yields an approximately linear ramp in the L* value (much as
an RGB -> YCC conversion gives a linear luma ramp), even though the
decomposition must account for the gamma encoding in the intermediate steps.
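
As a quick sanity check of that expectation, reusing the srgb_to_lab()
sketch above (so the same sRGB/D65 assumptions apply):

for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    L, a, b = srgb_to_lab(v, v, v)   # a* and b* are 0 for neutral greys
    print(f"sRGB {v:4.2f} -> L* {L:5.1f}")

# L* lands at roughly 0, 27, 53, 77, 100: approximately (though not
# exactly) linear, since the sRGB encoding and the L* curve have similar
# power-law shapes.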

And this is not what you get when running a LAB decomp in GIMP.

So . . . yeah, I'm totally in agreement here.

-- Stratadrake
[email protected]
--------------------
_______________________________________________
gimp-developer-list mailing list
[email protected]
http://mail.gnome.org/mailman/listinfo/gimp-developer-list
