OK, I am calibrating a camera and a light meter, shooting a grey card (a good one that I know is 18%). I take that file and read it into anything that reads R3D, and in every case but Nuke, grey comes in at 0.18xxx in float and 127.xxx or 128.xxx in 8-bit. Huh. In Nuke it's 0.21. OK, weird.

So I make a file in Photoshop in sRGB (IEC 61966-2.1) and set middle grey to 50% (because it's gamma-corrected sRGB). I take that TIFF into Nuke: 0.21 AGAIN. WHAT? OK, set the Read node to raw. MORE WHAT: now it reads 0.5, as I would expect.

OK, I studied color a little, and this sounds like a colorspace conversion matrix problem. Yup: if you use LAB 50%, then all is great. But none of my files are LAB, and in RGB they are fine. So it seems Nuke's conversion is going through LAB, remapping RGB 50% to LAB 50%, and then not accounting for that.

But also, if I use a Gamma node set to 0.402, I can get 0.18 to land exactly in the middle of a gradient. This in turn doesn't distribute the values linearly through the gradient either: at a 10% translation I get only a 7% illumination change from Dmax, while from Dmin a 20% translation results in a 10% change. That's as expected when using a gamma node on raw sRGB, since sRGB is a power curve; it's just that Nuke's power curve seems a bit off. Is there a reason for this?
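For reference, here is the arithmetic I have been checking my numbers against. It's just a minimal sketch assuming the Read node applies the exact piecewise IEC 61966-2-1 sRGB curve (a 2.4 power with an offset, plus a linear toe near black) rather than a pure 2.2 power; the helper names are mine, not Nuke API calls:

    import math

    def srgb_to_linear(v):
        # Inverse sRGB encoding per IEC 61966-2-1: linear segment near
        # black, then ((v + 0.055) / 1.055) ** 2.4. NOT a pure 2.2 gamma.
        if v <= 0.04045:
            return v / 12.92
        return ((v + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(v):
        # Forward sRGB encoding, the inverse of the above.
        if v <= 0.0031308:
            return v * 12.92
        return 1.055 * v ** (1.0 / 2.4) - 0.055

    print(srgb_to_linear(0.5))             # ~0.214 -> the 0.21 I see in Nuke
    print(linear_to_srgb(0.18))            # ~0.461 -> where 18% grey sits on the sRGB curve
    print(math.log(0.5) / math.log(0.18))  # ~0.404 -> pure power mapping 0.18 to 0.5

If that is what Nuke is doing, a 50% code value would land at roughly 0.214 linear, 18% grey would encode to roughly 46% rather than 50%, and that ~0.404 pure power is right next to the 0.402 I found by hand. But I'd still like confirmation of which curve the sRGB LUT actually uses.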
Randy S. Little
http://www.rslittle.com
