Hi Andy,

One more *shortish* note on converting to a baseline set of primaries on
ingest. We've learned from experience that converting everything ingested
to monitor display primaries (709/sRGB) -- though it lets plates appear
"in the ballpark" without a 3D LUT -- is generally bad for preserving
data: going from a wider (camera) set of primaries to a narrower (display)
set pushes some saturated values negative, where they get clamped.
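To make the failure concrete, here's a quick numpy sketch (the XYZ -> Rec.709 matrix is the standard D65 one, but the sample XYZ value is made up -- just a saturated cyan chosen to sit outside the 709 gamut):

```python
import numpy as np

# Standard XYZ -> Rec.709 (D65) matrix.
XYZ_TO_709 = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

# Hypothetical saturated cyan, outside the 709 gamut.
saturated_cyan_xyz = np.array([0.2, 0.4, 0.5])

rgb709 = XYZ_TO_709 @ saturated_cyan_xyz
clamped = np.clip(rgb709, 0.0, None)  # what a clamp-on-ingest does

print(rgb709)   # the red channel comes out negative
print(clamped)  # after clamping, the original chroma is unrecoverable
```

Once that negative red is clamped to zero, no later transform can recover the original chromaticity -- which is why we'd rather keep a wider working gamut (or float EXRs with unclamped negatives) through the pipeline.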



On Wed, Jun 5, 2013 at 11:27 PM, Andy Jones <[email protected]> wrote:

> Thanks for the replies, guys!
>
> On Wed, Jun 5, 2013 at 9:07 PM, Ben Dickson <[email protected]> wrote:
>
>> On 06/06/13 12:16, Elias Ericsson Rydberg wrote:
>> > Could you possibly replace the read/write node with a gizmo?
>>
>> Sadly, wrapping the Read node in a gizmo is not really practical, as
>> there are knobs dynamically created for different formats.
>>
>> Potentially you could make your own subclass of the Read node, but this
>> seemed like more effort than it's worth..
>
>
> Yup, I did in fact do exactly that with the gizmo, but the result is
> pretty unsatisfying.  It's particularly bad with a write node, since the
> knobs are created and destroyed when the format changes.  Also, to really
> complete the picture, you need all the right callbacks and logic, which
> then means putting code somewhere to go along with the gizmo.  I've been
> contemplating subclassing Read/Write, and it seems like a lot of work as
> you say (both in terms of implementation, and deployment/support).  That's
> really what prompted the question.
>
>> On 06/06/13 11:31, Andy Jones wrote:
>> > Yes, one thought I had was just to avoid the issue by insisting that
>> > all media coming into the pipeline be pre-transcoded as linear exr.
>> > Not the worst option, but it's more of a workaround than a solution.
>> > It's a sad day when Nuke needs a workaround for dealing with color.
>>
>> This isn't necessarily a workaround.. but could be the basis of a nice,
>> robust pipeline.
>>
>> For example, we process all received scans before working on them.. Part
>> of this processing is linearisation, but it may also involve lots of other
>> useful things, such as:
>>
>> * applying "neutral grades" for consistency through a sequence
>> * creating different outputs for specific applications (linear EXRs,
>> half-res undistorted JPEGs for animating in Maya, etc. etc.)
>>
>> One benefit of this is that compers don't need to worry about how to
>> linearise a random DPX.
>>
>> Also since we default the Read to 'raw', if someone reads in a JPEG or
>> something, they have to consider how to linearise it (instead of expecting
>> the Read's behaviour to be correct, which it rarely is)
>>
>>
> I'd say we're actually on the same page about an all-exr linear pipeline
> being a good thing, and it's actually on my roadmap already.  The reason I
> call it a workaround is that it's still avoiding the need for the
> functionality I'd like to have, albeit in a mostly favorable way.  It does
> also force me over the additional hurdle of getting linear EXRs to convert
> colorspaces reliably inside Flame, which is often the end of our pipeline.
>  Not impossible, but I was hoping to tackle Nuke's color management first
> and keep things flexible.
>
> _______________________________________________
> Nuke-dev mailing list
> [email protected], http://forums.thefoundry.co.uk/
> http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-dev
>
>

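P.S. Re Ben's "neutral grades" point: for anyone curious what that step can look like, here's a minimal sketch of a CDL-style slope/offset/power correction applied per channel in scene linear. The grade values are made up (in practice they'd come from the DI or a per-shot grade file), and the mirrored-power handling of negatives is just one reasonable choice, not a standard:

```python
import numpy as np

def apply_cdl(rgb, slope, offset, power):
    """CDL-style grade: out = (in * slope + offset) ** power, per channel.
    Uses a mirrored power so unclamped negative scene-linear values survive."""
    sop = np.asarray(rgb, dtype=float) * slope + offset
    return np.sign(sop) * np.abs(sop) ** power

# Hypothetical neutral grade for one shot in a sequence:
slope  = np.array([1.05, 1.00, 0.95])
offset = np.array([0.00, 0.01, 0.00])
power  = np.array([1.00, 1.00, 1.00])

graded = apply_cdl([0.18, 0.18, 0.18], slope, offset, power)
print(graded)
```

Running something like this per shot on ingest is what keeps a sequence consistent before the EXRs ever reach a comper.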

-- 
Blake Sloan
Software, Color
Digital Domain
