This is 'a' use case.  There are many others.

I didn't use the slow approach you describe below.  I resorted to an 
approximation method: a reverse lookup to find the nearest NURBS samples 
surrounding the location on the surface (via a custom binary search on the 
NurbsSamples collection), then a barycentric-like computation between those 
samples to derive the UV coordinate in uniform parameterized space.  The 
process is run in reverse to apply the mapped location to the other surface.  
This works relatively quickly in a scripted operator, but the pitfall is that 
Softimage occasionally returns NaN or undefined when querying the NurbsSamples 
collection - usually when querying a sample that resides on a boundary edge of 
the surface, though it's inconsistent.  I have to implement significant error 
trapping to prevent the operator from crashing Softimage.
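For illustration, the lookup step could be sketched roughly like this in plain Python (not Softimage code - the `samples` list and `uniform_u_at` helper are hypothetical stand-ins for querying the NurbsSamples collection, and the NaN guard mirrors the error trapping described above):

```python
import math
from bisect import bisect_left

# Hypothetical stand-in for one row of NURBS samples: (uniform_u, distance)
# pairs sorted by distance along the surface row.  Real code would query
# the NurbsSamples collection instead of a literal list.
samples = [(0.0, 0.0), (0.25, 1.3), (0.5, 2.1), (0.75, 3.6), (1.0, 5.0)]

def uniform_u_at(distance):
    """Binary-search for the pair of samples bracketing `distance`, then
    interpolate between them to estimate the uniform-parameterized U."""
    positions = [p for _, p in samples]
    i = bisect_left(positions, distance)
    i = max(1, min(i, len(samples) - 1))        # clamp to a valid bracket
    (u0, p0), (u1, p1) = samples[i - 1], samples[i]
    # Guard against NaN samples (e.g. on a boundary edge) before computing.
    if any(math.isnan(v) for v in (u0, p0, u1, p1)):
        raise ValueError("NaN sample - trap this rather than crash")
    t = (distance - p0) / (p1 - p0)             # fraction between brackets
    return u0 + t * (u1 - u0)
```

A full surface version would do this in both U and V, which is where the barycentric-like blend between neighbouring samples comes in.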

Anyway, I cannot use the approximation technique in ICE because ICE has no 
way to query NurbsSamples like that.  Even if it could, I would have to 
implement my own binary search and interpolation methods as well.  That is 
where the bloat and performance degradation come from when this is applied to 
a production use case.  It is also a driving reason to write a custom ICE node, 
as I could cut out that middleman of bloat and cut to the chase.

The main reason for pursuing ICE is to improve durability of our content.  
Scripted and compiled operators have the pitfall of self-deleting when an input 
is missing.  In the test case, if one of the nulls goes missing, the operator 
is deleted automatically and any content relying on that operator is now 
broken.  Since the operator may contain metadata specific to its application, 
it may not be possible to reconstruct the effect after the fact.  This scenario 
is very common on scene load or model import when the inputs are referenced 
models and they have been modified externally.  If such a problem arises using 
an ICE node, it merely complains, turns red, and waits for the user to resolve 
the situation.  The system is still intact, which gives the artist the 
opportunity to put things right - often with a face palm followed by getting 
the latest version of the missing asset from source control.  That is 
preferable to an artist marching into my office asking me to run diagnostics 
on his broken scene, only for me to dig back through previous versions of the 
scene to recognize the problem and determine which input is missing.


Matt




From: [email protected] 
[mailto:[email protected]] On Behalf Of Raffaele Fragapane
Sent: Wednesday, May 15, 2013 4:24 PM
To: [email protected]
Subject: Re: custom ICENode - questions and request for example source code

Matt, is the test case you outlined also your use case?
Reparametrization, even outside of ICE, is non-trivial: if you want 
equidistant spacing you are basically facing a minimization problem, which is 
where I assume you went with the forward-walking technique (repeat with a 
bouncing or decreasing increment until the lowest possible U and V value is 
found that returns a distance within tolerance of the discrete interval).
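That bouncing, decreasing-increment walk could be sketched in plain Python roughly like this (a minimal one-dimensional sketch, not ICE code; `arc_len` is a hypothetical function mapping parameter u to distance along the curve):

```python
def walk_to_distance(arc_len, target, tol=1e-6, max_iter=50):
    """Forward-walk parameter u with a bouncing, halving increment until
    arc_len(u) is within `tol` of `target`.  Assumes arc_len is
    monotonically increasing over u in [0, 1]."""
    u, step = 0.0, 0.5
    for _ in range(max_iter):
        err = arc_len(u) - target
        if abs(err) <= tol:
            break
        u += -step if err > 0 else step    # bounce back on overshoot
        u = min(max(u, 0.0), 1.0)
        step *= 0.5                        # decrease the increment each pass
    return u
```

The `max_iter` cap is what keeps this bounded; left unbounded (a while/repeat per point, as below) the cost explodes.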
I tried that, and it was prohibitively expensive, as it involves whiles and 
repeats that degrade the graph's threading and inflate memory use enormously.
What I found out, to my surprise, the first time I tackled the problem at its 
lowest dimensionality is that using a ton of get-closest-location nodes and a 
single repeat (and then ridding myself of even that in favour of starting from 
a set of samples run through a fixed, hard-wired number of iterations) had 
practically no cost compared to that, and threaded more efficiently across all 
cores at all times.
Get closest location on its own will of course return data you want to filter, 
especially in areas with considerable discontinuity (a high rate of change in 
the first-order derivative), but nothing that filtering by a ruleset wouldn't 
deal with excellently (exclude the precedent location > filter in range > 
filter by lowest U or V to avoid skipping the entire discontinuity, then a 
further get-closest, resized and filtered again).
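As a rough illustration of that ruleset, here is a minimal Python sketch (not ICE code - `candidates`, `prev_u`, and `max_dist` are hypothetical stand-ins for the array data an ICE graph would filter):

```python
def pick_next(candidates, prev_u, max_dist):
    """Apply the ruleset to (u, distance) hits from a get-closest-style
    query: exclude the precedent location, keep hits in distance range,
    then take the lowest U so a discontinuity isn't skipped wholesale."""
    pool = [(u, d) for u, d in candidates
            if u != prev_u and d <= max_dist]
    if not pool:
        return None                          # nothing survives the filter
    return min(pool, key=lambda c: c[0])[0]  # filter by lowest U
```

In ICE this would be expressed as array filter/select nodes rather than list comprehensions, but the filtering order is the same.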
If you literally are limited to cases with only a few control vertices and you 
can guarantee the discontinuity isn't too brutal (i.e. the first-order 
derivative between subsequent nodes doesn't change by more than 90 degrees 
minus iota), the problem is a great deal simpler than if you have many knots 
and the domain of the surface has practically no boundaries other than those 
of the function.  That's why I was asking about the case.
Playing with the arrays for filtering in a safe and fast way was also key, and 
that is counter-intuitive compared to how you would deal with arrays in 
traditional programming, especially performance wise, but possible (again, 
Stephen and Julian's blogs have many gems).
I would also consider using a very dense poly or point-cloud conversion of the 
nurbs plane with data samples from the surface, if this is a one-off tool, over 
using the surface itself, but that might or might not be possible.
I still don't know what your performance target is. If it's dozens of frames 
per second, or 60hz across multiple setups, I'd say you're better off dropping 
this like a dead rat and instantly exploring other avenues.
If it's a conforming tool used in a session with clear entry and exit points, 
then the average 15-20hz that is perceived as still smooth when operating a 
tool is more achievable.
Lastly, you always have the option of dealing with the parametrization in your 
own OP and writing a transform per discrete element to use in ICE for the rest 
from there, which is probably the sane thing to do if you have dense surfaces 
and the problem has an unbound domain. ICE just isn't well suited to dealing 
with a lot of fringe case handling to scale performance (it does best when 
dealing with the same operation, no matter how big, run many times as widely as 
possible instead of at variable depth), whereas in an OP that kind of 
optimization always works well.
