Seems like this feature is a perfect candidate to be implemented directly on 
the GPU, since it already has the display buffer, and absolute accuracy isn’t 
paramount.

-Nathan



From: Jonathan Egstad 
Sent: Sunday, October 14, 2012 9:50 AM
To: Nuke user discussion 
Cc: [email protected] 
Subject: Re: [Nuke-users] Normalize Viewer

Hi Frank,

Not to be a devil's advocate or anything...but calculating the min/max of an image 
means sampling the entire image before a single pixel can be drawn in the 
Viewer.  Needless to say, this will destroy Nuke's update speed.

As long as that's understood as a side-effect of this feature, then soldier on.

-jonathan

Sent from my iPhone

On Oct 13, 2012, at 6:22 PM, Frank Rueter <[email protected]> wrote:


  None of those solutions actually produces what we're after, though (some of 
your solutions seem to invert the input).

  We need something that compresses the input to a 0-1 range by offsetting 
and scaling based on the image's min and max values (so the resulting range is 
0-1). You can totally do this with a Grade or Expression node and a bit of tcl 
or python (or the CurveTool if you want to pre-compute), but that's not 
efficient.
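
  For anyone following along, the offset-and-scale operation described above 
boils down to something like this sketch (plain Python on a flat list of pixel 
values; the function name and the flat-image guard are my own additions, not 
from the thread):

```python
def normalize(pixels):
    """Remap values to 0-1: offset by the image min, scale by (max - min).

    This mirrors what a Grade/Expression node driven by precomputed
    min/max values would do, e.g. an Expression of (r - lo) / (hi - lo).
    """
    lo = min(pixels)
    hi = max(pixels)
    if hi == lo:
        # Flat image: every value is the same, so avoid dividing by zero.
        return [0.0 for _ in pixels]
    scale = 1.0 / (hi - lo)
    return [(p - lo) * scale for p in pixels]

# Example: values outside 0-1 get compressed into that range.
print(normalize([-0.5, 0.2, 1.8, 3.0]))
```

  The catch Jonathan raises applies here too: `min` and `max` have to touch 
every pixel before the first output value exists.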

  I reckon this should be a feature built into the viewer for ease-of-use and 
speed.
_______________________________________________
Nuke-users mailing list
[email protected], http://forums.thefoundry.co.uk/
http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-users
