It's not an uncommon problem in rendering in general.
When we do things like wheels and propellers we normally render wedges at
various settings, at different incident angles, speeds and so on, to make
sure the blur looks correct.
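A wedge in this sense is just a brute-force sweep of the parameters you want to eyeball. A minimal sketch (the parameter names and values here are made up for illustration, not any renderer's API):

```python
from itertools import product

# Hypothetical wedge setup: one test render per combination.
rpms = [120, 240, 480, 960]            # rotation speeds to try
shutter_angles = [90, 180, 270, 360]   # shutter angle in degrees
incident_angles = [0, 30, 60]          # camera-to-axis angle in degrees

wedge = [
    {"rpm": r, "shutter": s, "incidence": a}
    for r, s, a in product(rpms, shutter_angles, incident_angles)
]
# 4 * 4 * 3 = 48 frames to render and review side by side
```

Each dict would drive one test frame; you then pick whichever combination reads best, regardless of whether the numbers are physically plausible.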

In these cases the physical accuracy of the motion is absolutely irrelevant
(i.e. the wheel is theoretically slipping on the ground all the time, or the
propeller is at a fraction of the RPM required for the plane to move at
that speed); the quality of the blur is everything when it comes to selling
those things.

The B52 in Sucker Punch took a full two days of testing, rendering
wedges, and an operator who would modulate the RPM based on aesthetic
choices to produce usable frames.

Another classic issue is emulating the forward + rewind + stable blur seen
in footage of car wheels accelerating past a camera. We had to implement
controls for that so people could animate parameters based on the actual
aesthetics of the blur and motion they wanted, instead of animating
rotations and shooting in the dark.
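That forward/backward/frozen look is temporal aliasing: the camera samples the wheel at a fixed frame rate, so the apparent rotation per frame is the true rotation folded into the symmetry interval of the spokes. A rough sketch of the arithmetic (function name and defaults are my own, for illustration):

```python
def apparent_rotation_deg(rpm, fps=24.0, spokes=5):
    """Apparent per-frame rotation of a wheel with rotationally
    symmetric spokes, sampled by a camera at `fps`.

    Returns degrees per frame in (-period/2, period/2]: positive means
    the wheel appears to spin forward, negative means it appears to
    spin backward, near zero means it looks frozen."""
    period = 360.0 / spokes               # symmetry interval of the spokes
    true_step = rpm / 60.0 / fps * 360.0  # true degrees swept per frame
    apparent = true_step % period         # fold into one symmetry interval
    if apparent > period / 2:
        apparent -= period                # past half: reads as backward spin
    return apparent
```

With 5 spokes at 24 fps, 288 RPM lands exactly on the symmetry interval (frozen), slightly faster reads as slow forward spin, slightly slower as backward spin, which is why animating the true rotation directly is such a shot in the dark.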

This was the case with PRMan and Mantra, and it was run with all kinds of
data: from straight point caches with a stupid amount of subframes, to
straight transforms, to deformation with additional data coming from a
monitored transform.

Not that it's not an issue, or that it shouldn't be looked at; I'm just
saying that if you want to do spinning objects with the cinematic blur
people are used to, you'll have to bite the bullet and NOT go for temporal
or physical accuracy.

Even shooting these things for real on a set often requires tests and
wedges for the rigs to be timed and controlled so the DoP gets what he
wants.

On Thu, Jul 19, 2012 at 9:44 AM, Jack Kao <[email protected]> wrote:

> I wonder if it's something to do with Mental Ray specific?
>



-- 
Our users will know fear and cower before our software! Ship it! Ship it
and let them flee like the dogs they are!
