Sounds like you're sorted, but I found this note on the basic method I
used. So you could use a flat bezier mask as you say, promote it to deep,
then use it for the deep holdout.

"The deep image is unpremultiplied by its original deep opacity alpha and
premultiplied by the new mask. Redundant deep samples are killed if they
are a zero alpha value. This is achieved by moving those deep samples to
behind the camera and then deep cropping them."
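
Per sample, that note amounts to roughly this (a Python sketch for
illustration, not the actual deep expression; the sample layout and names
are made up):

    # Unpremult by the original deep alpha, premult by the new mask value,
    # then kill dead samples by pushing them behind camera and cropping.
    def remask_samples(samples, mask_value):
        out = []
        for s in samples:                  # s: {'rgba': (r, g, b, a), 'z': z}
            r, g, b, a = s['rgba']
            if a > 0:
                r, g, b = r / a, g / a, b / a          # unpremult
            new_a = a * mask_value                     # re-weight by the mask
            out.append({'rgba': (r * new_a, g * new_a, b * new_a, new_a),
                        'z': s['z'] if new_a > 0 else -1.0})  # behind camera
        return [s for s in out if s['z'] >= 0.0]       # the "deep crop"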

You would do the same for the inverse mask, then deep merge the two, and
you should get back your original image; but you can insert a deep colour
correct upstream of that deep merge to get a typical keymix effect.
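
As a node-graph sketch in Nuke Python (the node classes exist, but the
knob names, node names and the holdout input ordering here are from
memory, so double-check them in your version):

    import nuke

    deep     = nuke.toNode('DeepRead1')     # hypothetical node names
    mask     = nuke.toNode('DeepMask1')     # flat bezier mask, promoted to deep
    inv_mask = nuke.toNode('DeepMaskInv1')  # same mask, inverted before promotion

    # The correction only survives inside the mask, so it sits upstream
    # of one of the two holdouts.
    graded = nuke.nodes.DeepColorCorrect(inputs=[deep])

    keep_graded = nuke.nodes.DeepMerge(inputs=[graded, inv_mask],
                                       operation='holdout')
    keep_plate  = nuke.nodes.DeepMerge(inputs=[deep, mask],
                                       operation='holdout')

    # Combining the halves rebuilds the image, graded only inside the mask.
    result = nuke.nodes.DeepMerge(inputs=[keep_graded, keep_plate])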


On 17 June 2014 17:38, Frank Rueter|OHUfx <fr...@ohufx.com> wrote:

>  Thanks Michael!
> I think I got reasonably close with my expression hackiness yesterday, and
> with a little help from somebody else we got even closer (basically by
> doing the soft part of the mask in flat image space).
> Hopefully we are good to go now.
>
>
>
> On 18/06/14 3:25 am, Michael Garrett wrote:
>
>> If all the samples are at the "same depth" in both A and B, easy, the
>> output samples are a simple mix between each sample of A and B. This is
>> the case for, say, keymixing in a DeepColorCorrect (where only the
>> existing sample values will be changed)
>
>
>  I managed to get a deep volumetric keymix working using the basic
> scenario Ben is describing, with deep holdouts - one with the mask and the
> other with the inverse mask - then deep merging them together. Typically I
> used this when creating a deep Pmatte and then using that as a deep keymix
> mask for a deep colorcorrect/grade.
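>
> For what it's worth, the Pmatte itself is just a per-sample falloff
> around a point in world space, roughly like this (pure Python for
> illustration; it assumes the renderer writes world position into a P
> channel on every sample):
>
>     import math
>
>     # Spherical Pmatte: 1.0 at the centre, linear falloff to 0.0 at radius.
>     def pmatte(pos, centre, radius):
>         d = math.sqrt(sum((p - c) ** 2 for p, c in zip(pos, centre)))
>         return max(0.0, 1.0 - d / radius)
>
>     # e.g. weight = pmatte(sample_P, centre=(0.0, 1.0, 0.0), radius=5.0)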
>
>  And yes, it was extremely useful. I'll have to revisit the specifics
> though because, like Ben says, I think there was some additional work
> required to get rid of fringing issues, which (since I have yet to write a
> plug-in) was achieved with deep expressions (hello hackiness).
>
>  This was specific to Mantra. I have yet to extensively use the deep
> output from other renderers, but I believe Mantra's deep output has its
> quirks and what I did may not translate exactly to another renderer. Also,
> we were working a fair bit with full deep RGB output, which can make things
> a lot cleaner, although we did fall back to deep opacity/recolor in some
> cases as deadlines approached and still got an acceptable result.
>
>  Cheers,
> Michael
>
>
> On 17 June 2014 02:09, Ben Dickson <ben.dick...@rsp.com.au> wrote:
>
>> It's more difficult than it initially seems..
>>
>> The obvious thing is to use the DeepMerge set to holdout to punch a hole
>> in your A input, invert the matte and punch the inverse hole in the B
>> input.. but when you merge these you get the dark fringing where your
>> matte is semi-transparent, which is the same problem normally solved (in
>> flat compositing) by adding the two images together, or using a
>> disjoint-over
>>
>> ..but, you inherently cannot do that with deep samples - when the deep
>> image is flattened, all the samples for a pixel are over'd.
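>>
>> (Flattening is essentially this, per pixel - a sketch, with each sample
>> a premultiplied (r, g, b, a) tuple and the list sorted front-to-back:)
>>
>>     def flatten(samples):
>>         r_out = g_out = b_out = a_out = 0.0
>>         for r, g, b, a in samples:
>>             # standard over: each sample is attenuated by the alpha
>>             # already accumulated in front of it
>>             r_out += (1.0 - a_out) * r
>>             g_out += (1.0 - a_out) * g
>>             b_out += (1.0 - a_out) * b
>>             a_out += (1.0 - a_out) * a
>>         return r_out, g_out, b_out, a_out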
>>
>>
>> There is a DeepKeyMix gizmo on Nukepedia, but it is very destructive -
>> it flattens the image with DeepToImage, applies a regular KeyMix and
>> then uses the DeepRecolour.. which is probably okay if you are only
>> rendering deep-opacity, but bad if you are rendering deep-RGB.
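>>
>> (In node terms the gizmo amounts to roughly this - the node classes are
>> real, but the node names are made up and the input ordering is from
>> memory:)
>>
>>     import nuke
>>
>>     deep_a = nuke.toNode('DeepRead1')    # hypothetical node names
>>     deep_b = nuke.toNode('DeepRead2')
>>     matte  = nuke.toNode('Roto1')
>>
>>     flat_a = nuke.nodes.DeepToImage(inputs=[deep_a])   # deep data lost here
>>     flat_b = nuke.nodes.DeepToImage(inputs=[deep_b])
>>     mixed  = nuke.nodes.KeyMix(inputs=[flat_a, flat_b, matte])
>>     # re-spread the flat result back onto deep_a's samples
>>     out    = nuke.nodes.DeepRecolor(inputs=[deep_a, mixed])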
>>
>>
>> I had a rough idea of how to write a plugin to mix between two deep
>> images, but haven't got around to implementing it.. so.. there might be
>> some other fundamental flaw in the approach, but..
>>
>> For two inputs "A" and "B":
>>
>> If all the samples are at the "same depth" in both A and B, easy, the
>> output samples are a simple mix between each sample of A and B. This is
>> the case for, say, keymixing in a DeepColorCorrect (where only the
>> existing sample values will be changed)
>>
>> If the samples are not aligned, things are more complicated (e.g. mixing
>> two separate renders). For each sample you need to make a corresponding
>> sample at the same depth in the other image, by interpolating between
>> the nearest two samples.
>>
>>
>> In other words, if you have two images like this:
>>
>> A samples: empty empty red  empty black
>> B samples: empty empty blue blue  blue
>>
>> 1) For the first two empty samples, nothing is done
>> 2) For the A:red and B:blue sample pair, the output sample is a simple mix
>> 3) For the A:empty and B:blue sample pair,
>>    insert a sample in A which is a mix between the "red" and "black"
>>    samples (interpolated by depth), then mix between that and B's blue
>>    sample - see the sketch below
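>>
>> Roughly, in plain Python (samples as (z, value) pairs sorted by depth,
>> with 'value' standing in for premultiplied rgba - real samples have
>> zfront/zback and friends, so this is the shape of the idea, not a
>> plug-in):
>>
>>     def sample_at(samples, z):
>>         """Value of a sample list at depth z, interpolating between the
>>         two nearest neighbours where no sample exists."""
>>         if not samples:
>>             return 0.0
>>         if z <= samples[0][0]:          # clamp at the ends
>>             return samples[0][1]        # (one possible choice)
>>         if z >= samples[-1][0]:
>>             return samples[-1][1]
>>         for (z0, v0), (z1, v1) in zip(samples, samples[1:]):
>>             if z0 <= z <= z1:
>>                 t = (z - z0) / (z1 - z0)
>>                 return v0 + t * (v1 - v0)   # lerp between neighbours
>>
>>     def deep_keymix(a, b, mix):
>>         """mix=0 -> all A, mix=1 -> all B (e.g. per-pixel matte value)."""
>>         out = []
>>         for z in sorted({z for z, _ in a} | {z for z, _ in b}):
>>             va, vb = sample_at(a, z), sample_at(b, z)
>>             out.append((z, va + mix * (vb - va)))
>>         return out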
>>
>> I think the case which would cause artefacts is when your samples have
>> large distance-gaps between them: like keymixing between a foreground
>> tree and the sky - the process of creating the new samples will create
>> "tree" coloured samples at the "sky" depth and vice-versa
>>
>> - Ben
>>
>> On 17/06/14 12:40, Frank Rueter|OHUfx wrote:
>> > Hi peeps,
>> >
>> > I'm just trying to figure out how to merge two deep images based on a
>> > deep mask channel, without getting fringing.
>> > Been playing with DeepExpression but don't know if I can reference
>> > samples in there (the documentation is rather sparse to say the least).
>> > Basically I need a true, volumetric DeepKeyMix.
>> >
>> > Any ideas?
>> >
>> > Cheers,
>> > frank
>> >
>> --
>> ben dickson
>> 2D TD | ben.dick...@rsp.com.au
>> rising sun pictures | www.rsp.com.au
_______________________________________________
Nuke-users mailing list
Nuke-users@support.thefoundry.co.uk, http://forums.thefoundry.co.uk/
http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-users
