I'll preface this by saying it might be a really stupid question.

I understand the curves this produces, like the Rec.709 curve. What I struggle 
to understand is how incoming footage gets mapped along that curve.

For simplicity's sake, say I have 8-bit 0-255 Rec.709 (I know it usually isn't).

Two cameras: one has an 8-stop range and one has a 12-stop range, same scene 
shot side by side. Assuming the shadow blacks were the same, the 8-stop camera 
would clip its highlights first.

But in both cases the camera would produce a full 0-255 range of data.
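
To put rough numbers on what I mean (using the hypothetical stop counts above), in Python:

# Rough illustration, assuming an 8-stop vs 12-stop camera with the same black level.
# One photographic stop = a factor of 2 in scene light.
black = 1.0                      # arbitrary scene-light level for the deepest shadow
white_8stop  = black * 2 ** 8    # brightest level the 8-stop camera holds (256x black)
white_12stop = black * 2 ** 12   # brightest level the 12-stop camera holds (4096x black)

# The two cameras see very different highlight levels, yet each one squeezes
# its own black..white range into the same 0-255 code values.
print(white_8stop, white_12stop)   # 256.0 4096.0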

Therefore, bringing them into Nuke and converting Rec.709 to linear would result 
in these two files being stretched to 0-1 in float, yet the whites of both files 
(1.0) would represent different amounts of real-world light, and any comping or 
exposure changes would produce inaccurate results.
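
Just to show what I think the conversion is doing, here is a minimal sketch of the 
Rec.709-to-linear step (the inverse of the BT.709 camera curve; I'm assuming Nuke's 
built-in rec709 colourspace does something equivalent, even if the exact 
implementation differs):

def rec709_to_linear(code_8bit):
    """Invert the BT.709 transfer curve for an 8-bit code value."""
    v = code_8bit / 255.0                        # normalise 0-255 to 0-1
    if v < 0.081:                                # linear toe segment
        return v / 4.5
    return ((v + 0.099) / 1.099) ** (1 / 0.45)   # power-law segment

# Both files' clipped whites sit at code 255, so both decode to linear 1.0 --
# the curve itself knows nothing about how much real scene light that white was.
print(rec709_to_linear(255))   # 1.0 for the 8-stop file
print(rec709_to_linear(255))   # 1.0 for the 12-stop file too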

So while I understand that the purpose of a LUT is to reverse the curve that a 
capture device has applied, what I struggle to understand is how these LUTs give 
you linear-light values that have any real-world relationship to each other, 
because surely that is the whole point of a linear workflow?

I don't see how to convert Rec.709 to linear without having some kind of absolute 
value for the source exposure range... 
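
By "absolute value" I mean something like a per-file exposure offset relative to a 
common reference. In linear light that is just a multiply (one stop = a factor of 
2), but the number itself can't come from the Rec.709 curve; hypothetical figures 
again:

stops_between_cameras = 4              # the 12-stop camera clips 4 stops later (hypothetical)
gain = 2.0 ** stops_between_cameras    # 4 stops = 16x in linear light

linear_white_8stop  = 1.0              # both whites decode to 1.0 after the curve...
linear_white_12stop = 1.0 * gain       # ...but one "should" sit 16x higher
print(linear_white_8stop, linear_white_12stop)   # 1.0 16.0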

help!

cheers
paul


