If your depth pass is reliably in the same units as your camera, you can
directly calculate the distance from the camera to your axis with an
expression on the focus plane knob, e.g.:
sqrt(pow(Axis1.translate.x-Camera1.translate.x,2) +
pow(Axis1.translate.y-Camera1.translate.y,2) +
pow(Axis1.translate.z-Camera1.translate.z,2))
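The same math in plain Python, handy for sanity-checking the knob expression outside Nuke (a minimal sketch; the Axis1/Camera1 node names come from the expression above, and `focus_distance` is just an illustrative helper):

```python
import math

def focus_distance(axis_xyz, cam_xyz):
    """Euclidean distance from the camera to the axis, mirroring
    the focus-plane knob expression above."""
    return math.sqrt(sum((a - c) ** 2 for a, c in zip(axis_xyz, cam_xyz)))

# Camera at the origin, axis at (3, 4, 12) -> distance 13
print(focus_distance((3.0, 4.0, 12.0), (0.0, 0.0, 0.0)))
```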
you could try using exiftool in an afterFrameRender callback:
http://www.sno.phy.queensu.ca/~phil/exiftool/
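A rough sketch of what that callback could look like. The per-frame path lookup and the IPTC field values are assumptions about your Write node setup; the exiftool invocation uses its standard `-IPTC:Tag=value` / `-overwrite_original` syntax:

```python
import subprocess

def exiftool_cmd(image_path, iptc_fields):
    """Build an exiftool command line that writes IPTC fields.
    iptc_fields maps IPTC tag names (e.g. "Credit") to values."""
    args = ["exiftool", "-overwrite_original"]
    args += ["-IPTC:%s=%s" % (tag, value)
             for tag, value in sorted(iptc_fields.items())]
    args.append(image_path)
    return args

def write_iptc_after_frame():
    # Runs once per rendered frame inside Nuke; the field values
    # here are placeholders.
    import nuke  # only available inside Nuke
    frame_path = nuke.thisNode()["file"].evaluate()
    subprocess.check_call(exiftool_cmd(frame_path, {"Credit": "My Studio"}))

# Register from inside Nuke (e.g. in menu.py):
# nuke.addAfterFrameRender(write_iptc_after_frame)
```

Shelling out once per frame is slow on long renders; see the afterRender variant later in the thread for batching the whole sequence into one exiftool call.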
jrab
On 03/31/2014 09:47 PM, Richard Bobo wrote:
Hi all,
Has anyone had to write IPTC metadata in their PNG output
Hi everyone,
I just wanted to share the news that we’ve published MochaImport+ for NUKE
today.
It simplifies the workflow to get tracking data from Imagineer Systems mocha to
NUKE. You can, for example, directly import the tracking data into NUKE’s nodes
Tracker, Roto, RotoPaint, GridWarp and
What’s your target app - PS or Bridge or non-adobe ?
If it’s adobe then I would try finding some way to kluge it via automation /
droplet.
A quick bingle seemed to imply that IPTC support in the PNG spec is limited,
which might explain why PS uses its own category header in an attempt to
I just came back from a shoot where I did just that - insist on 2k
compression. We were meant to shoot 3k raw but since production
failed to secure a T-Link, 2k prores was the next best thing I
could ask for. I told them 4:2:2 would be unacceptable for vfx,
We had a problem on a shoot a few years ago. Was 4:4:4 afaik on an Arri but at
the time they were recommending 800 ASA.
As it was a night scene the DOP underlit. Now because of the gamma curve,
especially as it was set to gain detail in blacks, the noise levels were awful.
Worst keying I've
Andrew,
Thanks for the help. After a lot of fruitless research trying to make PNG files
display metadata in Photoshop (target app by client request), I have moved to
TIFF.
With some more work today, I now have a Python afterRender function in Nuke
that adds the metadata fields to the
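For reference, an afterRender callback of that general shape, run once when the whole render finishes rather than per frame (a sketch under assumptions: a printf-style `%04d` path on the Write node, exiftool on the PATH, and placeholder IPTC values):

```python
import subprocess

def rendered_frames(file_pattern, first, last):
    """Expand a printf-style Write-node path (e.g. "out.%04d.tif")
    into the list of rendered frame paths."""
    return [file_pattern % f for f in range(first, last + 1)]

def tag_tiffs_after_render():
    # afterRender fires once after the full frame range is written,
    # so all frames can be tagged in a single exiftool call.
    import nuke  # only available inside Nuke
    node = nuke.thisNode()
    paths = rendered_frames(node["file"].value(),
                            int(node.firstFrame()), int(node.lastFrame()))
    subprocess.check_call(
        ["exiftool", "-overwrite_original", "-IPTC:Credit=My Studio"] + paths)

# Register from inside Nuke:
# nuke.addAfterRender(tag_tiffs_after_render)
```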
Yeah, we shot everything on 800, which seems to be every DOP's
standard flavor.
I also had to fight with the gaffer to give me a little bit of light
on his rather orange "green screen". He kept insisting that, because
it was perfectly on exposure, "it will sing - trust
Well, in Nuke chances are that you may need FrameHolds or other
retime nodes that require multiple frames to be read from the input
clip at the same time. That is usually when quicktime files fall
apart and frame sequences can't be beaten for efficiency and
stability.