This isn't too surprising. Your original deep data has a discrete color 
for each deep sample wherever objects overlap, but your flattened color 
data has no "samples" to speak of. So when you apply your DeepRecolor, 
Nuke essentially has to take each flat pixel and spread its value 
across all the deep samples at that pixel's coordinates, with no 
knowledge of what the original sample colors were.
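
To make that concrete, here's a minimal Python sketch of the kind of 
redistribution a DeepRecolor-style operation has to perform at each 
pixel. This is not The Foundry's actual implementation; the function 
name and argument shapes are just illustrative:

def recolor_deep_pixel(flat_rgb, flat_alpha, sample_alphas):
    # Unpremultiply the flat comp color. This single averaged color
    # is all the information we have; the distinct per-sample colors
    # of the original render are gone.
    if flat_alpha > 0.0:
        unpremult = tuple(c / flat_alpha for c in flat_rgb)
    else:
        unpremult = (0.0, 0.0, 0.0)
    # Give every sample the same unpremultiplied color, premultiplied
    # back by that sample's own alpha. Flattening these samples with a
    # front-to-back "over" reproduces flat_rgb exactly only where
    # flat_alpha equals the deep image's combined alpha,
    # 1 - prod(1 - a). Wherever the comped alpha has drifted from
    # that (filtered edges, reformats), the mismatch shows up as a
    # halo.
    return [tuple(c * a for c in unpremult) + (a,)
            for a in sample_alphas]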

Also, keep in mind that deep data isn't a silver bullet for depth-based 
defocusing. Even deep images have blind spots: nothing is stored behind 
an opaque surface (unless you use tricks like telling the renderer that 
every object is semi-transparent), so past a certain blur radius your 
defocus will start to break down around the edges of overlapping 
objects. I'm guessing your artifacts are the result of a combination 
of these two limitations.
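
As a rough illustration of that blind spot (again just a sketch, not 
any particular renderer's behavior):

def transmission_behind(sample_alphas):
    # Fraction of light that makes it past all the given samples.
    # Renderers typically emit no deep samples where this reaches
    # zero, so behind a fully opaque sample there is simply no data
    # for a defocus to reveal. Rendering everything slightly
    # transparent keeps it above zero, at the cost of extra samples.
    t = 1.0
    for a in sample_alphas:
        t *= (1.0 - a)
    return t

# transmission_behind([1.0]) == 0.0: the background behind an opaque
# foreground edge was never rendered, so a wide blur that needs to
# look past that edge has nothing to pull from.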

-Nathan

On 2/9/2017 12:36 AM, Gabor L. Toth wrote:
> Hi,
>
> we are testing Peregrine's Bokeh in our pipeline. I would like to use 
> the deep input for perfect defocus. When I connect the original deep 
> read to the node's deep input, it works perfectly indeed. But when I 
> do a DeepRecolor, putting the comped rgba data into the deep branch, 
> it produces a little halo around the sharp area, and other subtle 
> imperfections. The edges of the rgba look the same as the original 
> deep input; no blur or reformat happens there. Even if I just 
> DeepRecolor with the flat beauty rgba, without any color corrections 
> or other stuff, I get these halos. Does anyone have an idea what the 
> difference could be, or how to solve it?
>
> Thanks
> Gabor

_______________________________________________
Nuke-users mailing list
Nuke-users@support.thefoundry.co.uk, http://forums.thefoundry.co.uk/
http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-users
