I used to reset all viewers to None when a script was prepared for sending to
the farm to work around this bug.
When switching to Ocio that was no problem anymore as the show specific config
is always set through the environment variable.
So what do you define in your menu.py, Howard?
Nathan
The latest version of the convolve node has gpu support.
On 20.03.2017 at 01:20, jon parker wrote:
Greetings Nuke users,
I'm just wondering if there are any faster, more robust FFT tools
available for Nuke besides the (hidden) built-in nodes?
The built-ins do the job, but they are pretty
The button is executed in the context of the gizmo, so allNodes() returns the
nodes found inside it.
You have to switch to the root context first:
with nuke.root().begin():
...
Kind regards,
Michael
Michael Hodges wrote:
>I’ve got a callback that affects a certain
Hi,
What do you mean, you didn't find any node? What about the OCIOCDLTransform
and the OCIOFileTransform? They both work perfectly for me!
Kind regards,
Michael
Howard Jones wrote:
>Hi
>
>We have a CDL which we would like to get into Nuke.
>So far we can’t find a node
Yes, and even if you have the additional bg samples behind your
defocused foreground, Bokeh is not using them.
On 09.02.2017 at 19:04, Nathan Rusch wrote:
This isn't too surprising. Your original deep data has discrete color
samples for each deep sample in places where objects overlap, but
>I tried it already, but it stays the same with the halo.
>Any other tips?
>
>Thanks
>Gabor
>
>
>On Thu, Feb 9, 2017 at 2:34 PM, Michael Habenicht <m...@tinitron.de>
>wrote:
Hi,
do you have the 'target input alpha' checkbox on the DeepRecolor node checked?
Turning this option on might solve your problem.
Kind regards,
Michael
"Gabor L. Toth" wrote:
>Hi,
>
>we are testing peregrine's bokeh in our pipeline. I would like to use
>deep
>input for
For a filmic dissolve you should convert to log space, not video/gamma-corrected
space.
Cheers!
Adrian Baltowski wrote:
>Hej
>
>Just gamma-correct things you want to dissolve: pump up gamma (with
>"Gamma" node for instance) before Dissolve - on both inputs- and invert
Hello Daniel,
you are missing the quotes: it has to be setValue("nuke.message('...')")
Otherwise the message is shown right away and the return value (None) of this
call is set instead. That is where you get the error, because the value has to
be a string.
Best regards,
Michael
On November 1, 2016 11:02:22 AM CET,
Sure, just blur it before unpremultiplying and put the original over it afterwards.
That is the simplest and easiest way!
Best regards,
Michael
On September 20, 2016 9:48:58 PM CEST, Gary Jaeger wrote:
>Anybody have a good way of extending the edge pixels of a premulted
>image?
If you have a knob called mix which contains the amount of one of the cameras,
the expression should be this:
Camera4.world_matrix*mix+Camera5.world_matrix*(1-mix)
With this you should be able to mix between the two camera paths.
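The expression above is a per-element linear blend of the two 4x4 world matrices. As a plain-Python illustration (outside Nuke, with made-up identity/translated matrices; not the actual Nuke expression engine):

```python
# Per-element linear interpolation of two flat 4x4 matrices, mirroring
# Camera4.world_matrix*mix + Camera5.world_matrix*(1-mix).
def lerp_matrix(m_a, m_b, mix):
    """Blend two flat 16-element matrices: mix=1 gives m_a, mix=0 gives m_b."""
    return [a * mix + b * (1.0 - mix) for a, b in zip(m_a, m_b)]

# Hypothetical example matrices: identity vs. identity translated 5 units in x.
identity = [1, 0, 0, 0,
            0, 1, 0, 0,
            0, 0, 1, 0,
            0, 0, 0, 1]
translated = [1, 0, 0, 5,
              0, 1, 0, 0,
              0, 0, 1, 0,
              0, 0, 0, 1]

halfway = lerp_matrix(identity, translated, 0.5)  # x-translation becomes 2.5
```

Note that blending matrices element-wise is fine for similar camera paths, but it does not cleanly interpolate rotations; for wildly different orientations you would want to decompose and slerp instead.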
Best regards,
Michael
On September 20, 2016 10:21:04 PM CEST,
As far as I know, it is unfortunately not possible at all to get it without
Shotgun ...
Michael Garrett wrote:
>Sorry for the OT, but has anyone else had problems with actually buying
>RV?
>It seems maybe the issue is something up with tickets in their support
>system not
On Fri, May 6, 2016 at 2:51 PM, Michael Habenicht <m...@tinitron.de> wrote:
The reference space in the aces 1.01 OCIO config is ACEScg so you first have to
convert to that. The compositing log space is ACEScc, which means the
LogConvert is converting from ACEScg to ACEScc. I don't know whether it is
going to change the result. Also be aware that you might get
Hi,
you should put a lin2log conversion node before creating your LUT and before
applying it. Usually LUTs convert from log space to their destination because
in that case there are no values above one. If it is not possible to convert
your source to log before applying the LUT you have to live
Hi,
I don't think this is possible because you can only reference metadata which is
bound to your current shot. How would you know which shot object has the same
timing on another track?
One solution could be to add a tag to the shot that holds the information you're
after as metadata. But I am
hiero.ui.getTimelineEditor(hiero.ui.activeSequence()).selection()
theodor groeneboom wrote:
>Hiya list!
>
>Is there a nice and easy way to collect the currently selected track
>items in a sequence in NukeStudio ?
>
>like the good ol n = nuke.selectedNodes() ?
>
>I
Well, it is how OCIO works and how the nuke-default config is implemented. It is
all defined through 1D lookup tables, and that means it is defined only in a
limited range, in this case -0.125 to 1.125. So when you convert back from
linear it gets clamped to this range.
The colorspace node on the
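The clamping described above can be sketched in plain Python: a simplified 1D LUT lookup over a fixed domain (an illustration of the principle, not Nuke's or OCIO's actual implementation):

```python
# Simplified sketch of a 1D LUT defined only on [-0.125, 1.125]:
# inputs outside the table's domain are clamped to its ends, which is
# why extreme linear values lose range on the round trip.
LUT_MIN, LUT_MAX = -0.125, 1.125

def lut_lookup(x, table):
    """Clamp x into the LUT domain, then linearly interpolate the table."""
    x = max(LUT_MIN, min(LUT_MAX, x))
    t = (x - LUT_MIN) / (LUT_MAX - LUT_MIN) * (len(table) - 1)
    i = min(int(t), len(table) - 2)
    frac = t - i
    return table[i] * (1 - frac) + table[i + 1] * frac

# Identity table over the domain, purely for illustration.
table = [LUT_MIN + (LUT_MAX - LUT_MIN) * i / 15.0 for i in range(16)]

lut_lookup(4.0, table)  # anything above 1.125 maps to 1.125
```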
Hi,
I might be overlooking something here, but I would just load in the alembic,
select the vertex at the position you are after and snap a nuke axis
there. If you need it to follow an animation use this script from
nukepedia: http://www.nukepedia.com/python/3d/animatedsnap3d
If you have only the
Hi John,
Do you have J-Ops on Nuke 7 but not on Nuke 8? Reading nodes from recursive
folders is not a feature out of the box.
J-Ops is the first tool implementing this that comes to my mind, and it is not
available for Nuke 8. Although the Python tools should still work, as it's only
the plugins which
Hi,
You also have to specify the node class: Viewer.viewerProcess
That way you overwrite the default that the OCIO setup uses, which is the
default display and default view.
Best regards,
Michael
Neil Rögnvaldr Scholes n...@uvfilms.co.uk wrote:
Hi
I know how to tell Nuke to default to using
I was also wondering why this would be better?! Just press Ctrl when clicking
on it to get the old one!
On February 26, 2014 8:13:11 PM CET, Feli fe...@earthlink.net wrote:
I just got a look at the new color controls in Nuke8 and have a few
questions:
- I would like to make them bigger and
Did you try to write out an alembic? I guess that should work!
Best regards,
Michael
Ron Ganbar ron...@gmail.com wrote:
Hi there,
is it possible to create a particle system in NukeX and then transfer
it,
as it is, to Nuke? For example, but writing the scene as an FBX or
something?
Ron Ganbar
Set it to stabilize instead of matchmove or the other way round depending on
what you want to achieve.
Best regards,
Michael
--
DI (FH) Michael Habenicht
Digital Film Compositor TD
http://www.tinitron.de
m...@tinitron.de
Hi Sean,
use the depth map of your particle render to displace your various 2D
noise maps in one eye horizontally to make it work in stereo.
Best regards,
Michael
On 11.12.2012 at 18:13, Sean Falcon wrote:
Hi All,
I have a simple particle system of dust floating in the air that I've created
The good thing about stereo in Nuke is that you have both eyes in one stream
and therefore every node works on both eyes at the same time.
To use this you have to join your views with the JoinView node.
But I saw on your screenshots that you have the name of your view in the
layername which you
J-ops includes a plugin for this!
Vincent Langer vincent.lan...@filmakademie.de wrote:
hi there nuke-list,
i was wondering if it is possible to merge different exposures of an
image or an image sequence into one hdr file like in photoshop or ptgui
or other hdr tools?
cheers,
vincent
Hi,
did you set the metadata knob in the write node to all metadata?
Best regards,
Michael
Hi,
just use exit instead of close. I don't know of any other way in any software.
When you close a file or project the software resets itself to defaults.
Best regards,
Michael
Happyrender nuke-users-re...@thefoundry.co.uk wrote:
Hello!
I believe this is my first post here, so hello
On 10 June 2012 16:11, Michael Habenicht m...@tinitron.de wrote:
Hi Ron,
as it is only one transform you can calculate it with the matrix
of the transform node. I wrapped it in a NoOp
Connect the transform node to the input and set the pos knob to the
position you want to transform through the transform node.
Best regards,
Michael
Hi Thomas,
you are right the pworld pass is already the first part. We have the
screen space and the corresponding world position. But to be able to
calculate the disparity you need the screen space position for this
particular point viewed through the second camera. It is possible to
Hi,
that is what expressions are for. You can add one to the tile_color knob which
sets the color according to the mix value.
The callbacks are mostly to catch things the user did.
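For reference, tile_color is a packed 0xRRGGBBAA integer; building one from float RGB values can be sketched in plain Python (outside Nuke, as an illustration of the packing, not the expression itself):

```python
# Pack float RGBA in [0, 1] into Nuke's 0xRRGGBBAA tile_color integer.
def pack_tile_color(r, g, b, a=1.0):
    """Quantize each channel to a byte and shift it into place."""
    def to_byte(v):
        return max(0, min(255, int(round(v * 255))))
    return (to_byte(r) << 24) | (to_byte(g) << 16) | (to_byte(b) << 8) | to_byte(a)

pack_tile_color(1.0, 0.0, 0.0)  # -> 0xFF0000FF (solid red)
```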
Best regards,
Michael
I also had problems masking or stenciling all layers of a stream, so I created
this handy gizmo:
http://www.nukepedia.com/gizmos/tnt_maskall/
Best regards,
Michael
write node for each eye. The OneView node
is not necessary as long as you do not have any write node in the script that
is set to more than one eye, at least in my experience.
Best regards,
Michael
Use the hidden node PositionToPoints:
press x, make sure tcl is selected, and type PositionToPoints.
Best regards,
Michael
- Original Message -
From: jorxs...@gmail.com
To: Nuke-users@support.thefoundry.co.uk
Date: 21.09.2011 04:39:09
Subject: [Nuke-users] renderable point clouds?
hey
In general it's easy: just evaluate the file knob and set the result as the new value:
filex = node['file'].evaluate()
node['file'].setValue(filex)
The problem is that the frame number also gets evaluated, so you have to find a
way to bring back %04d or whatever frame padding you are using.
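One way to bring the padding back is a plain string substitution. A hedged sketch (assuming you know the current frame and that its zero-padded form appears only once in the filename; `restore_padding` is a hypothetical helper, not a Nuke API):

```python
import re

def restore_padding(evaluated_path, frame, padding=4):
    """Replace the evaluated frame number in a path with printf-style padding.

    E.g. '/shots/comp_v01.0042.exr' with frame=42 -> '/shots/comp_v01.%04d.exr'.
    Assumes the zero-padded frame appears exactly once in the path.
    """
    token = str(frame).zfill(padding)
    return re.sub(re.escape(token), '%%0%dd' % padding, evaluated_path, count=1)

restore_padding('/shots/comp_v01.0042.exr', 42)  # -> '/shots/comp_v01.%04d.exr'
```

If the version number or another field could collide with the frame token, anchoring the pattern to the extension (e.g. matching `\.(\d+)\.exr$`) would be safer.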
it as plugin for shake: Transform Coordinates
http://www.pixelmania.se/index.asp?page=shake/index.asp
Best regards,
Michael
.value.left.g
Best regards,
Michael
the
expression node?
Or is there a better way?
Best regards,
Michael
to use the Expression node maybe?
To get a colour through python you'd have to use the node.sample()
On Jul 26, 2011, at 11:04 PM, Michael Habenicht wrote:
Hello everybody,
how can I pass red, green and blue values to a python function in the
expression node? I tried [python myfunc
I think this gizmo is doing what you are looking for:
http://www.nukepedia.com/gizmos/tnt_line/
Best regards,
Michael
value or delete the whole line.
Best regards,
Michael