I've got a quick example that takes the world position output of the
ScanlineRender node and converts it to Nuke-format zdepth using standard
Nuke nodes. There are plugins on Nukepedia, such as C44Matrix and another
written by Jonathan Egstad, that do this more elegantly.


Camera2 {
 translate {2.857524157 6.229105473 9.965373039}
 rotate {-31.000002 15.99999646 0}
 name Camera1
 selected true
 xpos -259
 ypos -120
}
CheckerBoard2 {
 inputs 0
 name CheckerBoard1
 selected true
 xpos 4
 ypos -161
}
set N17e077d0 [stack 0]
Sphere {
 translate {0 1 0}
 name Sphere1
 selected true
 xpos 55
 ypos -67
}
push $N17e077d0
Card2 {
 orientation ZX
 image_aspect false
 uniform_scale 10
 control_points {3 3 3 6

1 {-0.5 0 -0.5} 0 {0.1666666865 0 0} 0 {0 0 0} 0 {0 0 0.1666666865} 0 {0 0 0} 0 {0 0 0}
1 {0 0 -0.5} 0 {0.1666666716 0 0} 0 {-0.1666666716 0 0} 0 {0 0 0.1666666865} 0 {0 0 0} 0 {0.5 0 0}
1 {0.5 0 -0.5} 0 {0 0 0} 0 {-0.1666666865 0 0} 0 {0 0 0.1666666865} 0 {0 0 0} 0 {1 0 0}
1 {-0.5 0 0} 0 {0.1666666865 0 0} 0 {0 0 0} 0 {0 0 0.1666666716} 0 {0 0 -0.1666666716} 0 {0 0.5 0}
1 {0 0 0} 0 {0.1666666716 0 0} 0 {-0.1666666716 0 0} 0 {0 0 0.1666666716} 0 {0 0 -0.1666666716} 0 {0.5 0.5 0}
1 {0.5 0 0} 0 {0 0 0} 0 {-0.1666666865 0 0} 0 {0 0 0.1666666716} 0 {0 0 -0.1666666716} 0 {1 0.5 0}
1 {-0.5 0 0.5} 0 {0.1666666865 0 0} 0 {0 0 0} 0 {0 0 0} 0 {0 0 -0.1666666865} 0 {0 1 0}
1 {0 0 0.5} 0 {0.1666666716 0 0} 0 {-0.1666666716 0 0} 0 {0 0 0} 0 {0 0 -0.1666666865} 0 {0.5 1 0}
1 {0.5 0 0.5} 0 {0 0 0} 0 {-0.1666666865 0 0} 0 {0 0 0} 0 {0 0 -0.1666666865} 0 {1 1 0} }
 name Card1
 selected true
 xpos -72
 ypos -69
}
Scene {
 inputs 2
 name Scene1
 selected true
 xpos -9
 ypos -27
}
push 0
add_layer {P P.red P.green P.blue P.alpha}
ScanlineRender {
 inputs 3
 motion_vectors_type distance
 output_shader_vectors true
 P_channel P
 name ScanlineRender1
 selected true
 xpos -19
 ypos 76
}
set N17e06a10 [stack 0]
Add {
 value {{-Camera1.world_matrix.3} {-Camera1.world_matrix.7} {-Camera1.world_matrix.11} {curve}}
 name Add1
 selected true
 xpos -19
 ypos 100
}
ColorMatrix {
 channels P
 matrix {
     {{Camera1.world_matrix.0} {Camera1.world_matrix.1} {Camera1.world_matrix.2}}
     {{Camera1.world_matrix.4} {Camera1.world_matrix.5} {Camera1.world_matrix.6}}
     {{Camera1.world_matrix.8} {Camera1.world_matrix.9} {Camera1.world_matrix.10}}
   }
 invert true
 name ColorMatrix1
 selected true
 xpos -19
 ypos 147
}
Multiply {
 channels {-P.red -P.green P.blue -P.alpha}
 value -1
 name Multiply1
 selected true
 xpos -19
 ypos 183
}
Shuffle {
 in P
 red blue
 green blue
 name Shuffle2
 selected true
 xpos -19
 ypos 219
}
Expression {
 channel0 rgb
 expr0 1/r
 name Expression1
 label "Nuke format zdepth"
 selected true
 xpos -19
 ypos 243
}
set N17e09350 [stack 0]
push $N17e06a10
Dot {
 name Dot1
 selected true
 xpos 247
 ypos 79
}
Shuffle {
 in depth
 name Shuffle1
 selected true
 xpos 213
 ypos 241
}
set N17e070f0 [stack 0]
Merge2 {
 inputs 2
 operation difference
 name Difference
 selected true
 xpos 118
 ypos 291
}
push $N17e070f0
push $N17e09350
Viewer {
 inputs 2
 frame 44
 input_process false
 name Viewer1
 selected true
 xpos 118
 ypos 344
}



On 14 May 2013 03:40, Simon Björk <[email protected]> wrote:

> By pixel perfect I meant compared to a rendered Z. I'm having really high
> values in my P pass (>100000), so it might be round-off errors. I've
> tried with both 16- and 32-bit renders, though.
>
> I've found the easiest way is to do a coordinate system transform from
> world space to camera space then just take the Z channel. Note that,
> depending on the shutter angle of the render vs the imported camera
> keyframes, you may get a minor offset compared to rendering "in camera".
>
> Interesting. Do you have an example setup to share?
>
> Cheers!
>
>
> 2013/5/13 Michael Garrett <[email protected]>
>
>> Yes, it's worth taking note of that, and it's caught me out in the past.
>> If you take the vector from the camera position to any position coordinate
>> in the scene you're getting the true distance, whereas a Zdepth image is
>> just the z component of that vector.
>>
>> I've found the easiest way is to do a coordinate system transform from
>> world space to camera space then just take the Z channel. Note that,
>> depending on the shutter angle of the render vs the imported camera
>> keyframes, you may get a minor offset compared to rendering "in camera".
>>
>> Michael
>>
>>
>> On 13 May 2013 09:41, Eetu Martola <[email protected]> wrote:
>>
>>> Most renderers write Z as the orthogonal distance to the (extended)
>>> camera plane, instead of the actual distance between the pixel position
>>> and the camera's center point. Is the difference you're seeing greater
>>> toward the image edges?
>>>
>>> eetu.
>>>
>>>
>>>
>>> On 13.5.2013 16:36, Steve Newbold wrote:
>>>
>>>> Pixel perfect compared to what?  A depth pass from the same render as
>>>> the world position pass?
>>>>
>>>> How far off are your values?  Can it be put down to rounding of the
>>>> camera position from your 3D package, or rounding of the pass values?  Is
>>>> your depth pass 32-bit float?  Another thing to check is premultiplication
>>>> of the position pass from the renderer.
>>>>
>>>> Steve
>>>>
>>>>
>>>> On 13/05/13 14:26, Simon Björk wrote:
>>>>
>>>>> Hi all,
>>>>>
>>>>> I'm looking for a way to accurately convert a point position pass
>>>>> (world space) to a depth pass (camera space). I have a camera
>>>>> available.
>>>>>
>>>>> The following expression gets me close but not pixel perfect:
>>>>>
>>>>> set cut_paste_input [stack 0]
>>>>> version 7.0 v4
>>>>> push $cut_paste_input
>>>>> Expression {
>>>>>  temp_name0 cx
>>>>>  temp_expr0 shot_camera.translate.x-r
>>>>>  temp_name1 cy
>>>>>  temp_expr1 shot_camera.translate.y-g
>>>>>  temp_name2 cz
>>>>>  temp_expr2 shot_camera.translate.z-b
>>>>>  temp_name3 dist
>>>>>  temp_expr3 sqrt(cx*cx+cy*cy+cz*cz)
>>>>>  channel0 rgba
>>>>>  expr0 dist
>>>>>  channel1 none
>>>>>  name Expression7
>>>>>  label "convert p world to z-depth"
>>>>>  selected true
>>>>>  xpos 266
>>>>>  ypos -50
>>>>> }
>>>>>
>>>>>
>>>>>
>>>>> _______________________________________________
>>>>> Nuke-users mailing list
>>>>> [email protected], http://forums.thefoundry.co.uk/
>>>>> http://support.thefoundry.co.uk/cgi-bin/mailman/listinfo/nuke-users
>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>>
>
>
>
