Re: OT: Oculus rift + LeapMotion + Softimage?

2015-06-13 Thread Tenshi .
...better: animate with your eyes!

https://www.kickstarter.com/projects/fove/fove-the-worlds-first-eye-tracking-virtual-reality?lang=es

On Fri, Jun 12, 2015 at 5:03 PM, Rob Chapman  wrote:

> I have all of the items above on my desk. I'd recommend one of these
> though: https://www.google.com/get/cardboard/get-cardboard/ if you
> don't have an Oculus. Currently it's the same resolution display, and
> no wires out of the back of your head either. The plastic Colorcross
> with the headband strap is the best so far.
>
> Personally I think your proposition would be a bit clunky, based on my
> interactions with the Leap Motion apps so far. Perhaps an open hand and
> a closed hand could mimic select and move, then you repose the
> character, open your hand, and it stays..? For sure, though, more
> intelligence is needed to connect the gesture you are trying to
> communicate to the action you are trying to call: it's pretty
> hit-and-miss at recognizing hand shapes and orientation after a very
> short while of use, and so far it seems limited to one-dimensional
> state toggles, one at a time. Also, the working volume is only about a
> 1/2-metre square over your keyboard, and your hands get tired flapping
> around up there after a while.
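>
> Something like the sketch below is what I mean, using the Leap Python
> SDK (untested, and assuming the v2 SDK's grab_strength attribute; the
> wide 0.1/0.9 thresholds are just to debounce the flaky recognition):
>
> import Leap
>
> class GrabToggle(Leap.Listener):
>     """One-dimensional state toggle: fist = grab, open hand = release."""
>     def __init__(self):
>         super(GrabToggle, self).__init__()
>         self.grabbing = False
>
>     def on_frame(self, controller):
>         frame = controller.frame()
>         if frame.hands.is_empty:
>             return
>         hand = frame.hands[0]
>         # grab_strength runs from 0.0 (flat hand) to 1.0 (closed fist)
>         if not self.grabbing and hand.grab_strength > 0.9:
>             self.grabbing = True    # closed fist: pick up / start moving
>         elif self.grabbing and hand.grab_strength < 0.1:
>             self.grabbing = False   # open hand: let go, the pose stays
>
> controller = Leap.Controller()
> listener = GrabToggle()
> controller.add_listener(listener)   # keep a reference or it gets GC'd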
>
> But this should all be doable/hackable with Fabric, as they have the
> connections all coded up already, no? I think it's definitely some
> kind of future, but it's very early days on the interaction side of
> things, IMHO.
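>
> Or even without Fabric, the plumbing is small enough to sketch straight
> from Softimage's Python script editor. Untested, 'hand_target' is just
> a placeholder null, and it assumes the Leap SDK is on the Python path:
>
> import Leap
>
> leap = Leap.Controller()
>
> def update_target():
>     """Copy the first hand's palm position onto the null, once per call."""
>     frame = leap.frame()
>     if frame.hands.is_empty:
>         return
>     pos = frame.hands[0].palm_position   # millimetres, Leap coordinates
>     scale = 0.1                          # crude mm -> scene-unit mapping
>     # Application is the global Softimage object in the script editor;
>     # hands hover ~200 mm above the device, and Leap's z points at you
>     Application.SetValue("hand_target.kine.local.posx", pos.x * scale)
>     Application.SetValue("hand_target.kine.local.posy", (pos.y - 200.0) * scale)
>     Application.SetValue("hand_target.kine.local.posz", -pos.z * scale)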
>
> PS. Touch..? You would have to be controlling an object inside SI's 3D
> space with the Leap Motion. Yes, that's possible, but wouldn't you also
> need collision detection on top for this kind of interaction to work?
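>
> A cheap stand-in for full collision detection might be a plain distance
> threshold between fingertip and object; a rough, untested sketch
> ('cube' is just a placeholder name):
>
> import math
> import Leap
>
> TOUCH_RADIUS = 1.0   # scene units; tune to taste
> leap = Leap.Controller()
>
> def is_touching():
>     """Treat 'index fingertip within a small radius of the object' as touch."""
>     frame = leap.frame()
>     if frame.hands.is_empty:
>         return False
>     finger = frame.hands[0].fingers.finger_type(Leap.Finger.TYPE_INDEX)[0]
>     tip = finger.tip_position   # millimetres, Leap coordinates
>     # same crude mm -> scene-unit mapping as the sketch above
>     tx, ty, tz = tip.x * 0.1, (tip.y - 200.0) * 0.1, -tip.z * 0.1
>     ox = Application.GetValue("cube.kine.global.posx")
>     oy = Application.GetValue("cube.kine.global.posy")
>     oz = Application.GetValue("cube.kine.global.posz")
>     return math.sqrt((tx - ox) ** 2 + (ty - oy) ** 2 + (tz - oz) ** 2) < TOUCH_RADIUS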
>
> On 12 June 2015 at 22:22, Pierre Schiller wrote:
> > Hi, this is kind of an off-topic thing, but has anyone on the list
> > already jumped on the VR wagon?
> > I'm checking out some Leap Motion demos, and honestly, if we could
> > hook it into Softimage (rigging, for example, as in animation), I'd
> > see a dream come true.
> >
> > The basic idea? Posing your character in real 3D space. Goodbye
> > blocking by hand. Hello blocking (animation) interactively!!
> >
> > An inspiration article:
> >
> > https://www.linkedin.com/pulse/experiential-vr-activation-us-air-force-dan-ferguson
> >
> > Thoughts?
> >
> > PS: Touch (select), roll hand (rolls a joint), interface on swipe,
> > save pose. Rinse and repeat. :)
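> >
> > For the roll part I picture something like this (just a sketch,
> > untested; 'joint' and the axis choice are placeholders):
> >
> > import math
> > import Leap
> >
> > leap = Leap.Controller()
> >
> > def roll_joint():
> >     """Map palm roll (radians) onto a joint's local Z rotation (degrees)."""
> >     frame = leap.frame()
> >     if frame.hands.is_empty:
> >         return
> >     roll = frame.hands[0].palm_normal.roll   # radians
> >     Application.SetValue("joint.kine.local.rotz", math.degrees(roll))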
> >
> > --
> > Portfolio 2013
> > Cinema & TV production
> > Video Reel
>
>


Re: Paul Smith's Fuzz for animation

2015-06-13 Thread Ognjen Vukovic
If I recall correctly, you have to cache the groom node, where the fur setup
is, and unplug it from the ICE tree; that solved the jitter for us. But that
was quite a while ago, so I might be wrong about it. I don't have Soft here
at the moment, so I can't really recall the name of the node, but I think it
was called Fuzz Groom or something similar. Either way, it's the first node
tree that opens up when you enter the ICE tree for Fuzz.

On Fri, Jun 12, 2015 at 12:53 PM, Morten Bartholdy 
wrote:

> Paul got back to me: caching the start frame and reading it back in
> per frame should solve it. I'm not exactly sure how to do the per-frame
> reading, though.
>
>
>
> Morten
>
>
>
>
> On 12 June 2015 at 12:44, David Barosin wrote:
>
> I'm not familiar with Paul's setup, but he's a very clever and talented
> guy. If the jitter is just a few occasional pops, I'd check whether any
> nodes are using a point reference frame (ICE attribute) instead of a
> poly reference frame (or a point normal instead of a poly normal). When
> sticking geo/particles to other surfaces, I've found that the point
> reference frame is an interpolated result, which can make things jitter
> on a deformation. The poly reference frame (or poly normal) locks to
> each polygon. I like the look of the point interpolation, but when it
> causes problems I switch to poly.
>
> Beyond that, emitting from deforming geo can also cause problems. I
> would suggest using a static object to do all the emitting/grooming and
> transferring that over to the animated mesh. I'm not sure how you're
> doing it now.
>