Matt thank you so much.

I'll give this a try soon!

On Tue, Mar 24, 2015 at 2:50 PM, Matt Lind <[email protected]> wrote:
> You just need to pull the right vectors from mental ray and do the
> computation yourself, which isn't hard.
>
> You need 3 pieces of data to perform the computation:
>    - intersection point (state->pnt)
>    - light position
>    - camera position (state->org)
>
> The intersection point is the point on the surface being shaded.  Mental ray
> refers to this as state->pnt (or state->point).  You can access this value
> using a state shader, but it will have to be used in a rendertree applied as
> a material or texture on a surface, since state->point doesn't exist for
> some other types of shaders.
>
> The viewing vector is formed by subtracting the intersection point
> from the camera position.  Mental ray calls this state->dir (viewing
> direction), which you might be able to obtain using one of the state
> shaders; if you do, you'll need to negate it, because state->dir points from
> camera to surface.  You need the opposite direction so the vector points
> from surface to camera, matching the orientation of the other vector you'll
> be computing.
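> To make the orientation concrete, here's a minimal Python sketch of that
> subtraction.  The names camera_pos and surface_pnt stand in for state->org
> and state->pnt; they're illustrative, not mental ray API:

```python
# Sketch of the surface-to-camera view vector described above.
# camera_pos and surface_pnt are placeholder values standing in for
# state->org and state->pnt; they are not part of the mental ray API.

def sub(a, b):
    """Component-wise vector subtraction: a - b."""
    return tuple(x - y for x, y in zip(a, b))

camera_pos  = (0.0, 0.0, 10.0)   # stands in for state->org
surface_pnt = (0.0, 0.0, 0.0)    # stands in for state->pnt

# Vector from surface to camera (note the operand order):
view_vec = sub(camera_pos, surface_pnt)

# Equivalently, negate state->dir, which points from camera to surface:
state_dir = sub(surface_pnt, camera_pos)
view_vec_alt = tuple(-c for c in state_dir)

assert view_vec == view_vec_alt
```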
>
> The light-to-surface vector is formed by subtracting the intersection point
> from the light position.  In code this is trivial, but in the rendertree
> it's a bit cumbersome due to the lack of nodes that isolate information from
> a single light.  Therefore, you'll have to plug your light positions into
> the tree explicitly: one node to represent the position of each light you
> want to consider for the effect.
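> The same subtraction gives the light vector.  A small sketch with made-up
> positions (in the rendertree each light position would come from its own
> node):

```python
# Sketch of the surface-to-light vector.  The positions are arbitrary
# illustrative values, not anything pulled from a real scene.

def sub(a, b):
    """Component-wise vector subtraction: a - b."""
    return tuple(x - y for x, y in zip(a, b))

light_pos   = (5.0, 5.0, 5.0)
surface_pnt = (1.0, 0.0, 0.0)

# Vector from surface toward the light, matching the surface-to-camera
# orientation of the view vector:
light_vec = sub(light_pos, surface_pnt)
print(light_vec)  # (4.0, 5.0, 5.0)
```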
>
> When you have both vectors, normalize them to unit vectors.  Next, take
> their dot product.  The result is the cosine of the angle between the two
> vectors.  This is the ratio you seek to drive the retro-reflective
> intensity.  However, you will likely need to clamp or rescale this value to
> fit a range you choose for artistic reasons.  The cosine covers the full
> [0...180] degree range between the two vectors, meaning you'd get a gradual
> increase/decrease of intensity as you step through the entire range.
> However, as Peter pointed out, you don't want that.  You want the intense
> response to be isolated to a very narrow angular range, when the two vectors
> are very closely aligned, indicating the viewing angle is very similar to
> the reflection angle.  So take the cosine, pump it into a rescale node, and
> remap a narrow input window near 1.0 to your full output range (e.g., from
> [0.95...1.0] to [0...1], clamping values below the window to zero).  If you
> don't use a rescale node, you can do the simple arithmetic yourself with a
> node or two.
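> Putting the normalize/dot/rescale steps together, a Python sketch (the 0.95
> window edge is an arbitrary artistic choice, not a prescribed value):

```python
import math

# Sketch of the full computation: normalize both vectors, take the dot
# product, then remap a narrow window near 1.0 to [0...1].  The window
# edge (0.95) is an illustrative artistic choice.

def normalize(v):
    """Scale v to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    """Dot product of two vectors."""
    return sum(x * y for x, y in zip(a, b))

def retro_intensity(view_vec, light_vec, lo=0.95, hi=1.0):
    """Cosine of the angle between the vectors, remapped so only
    near-aligned vectors (cosine in [lo...hi]) produce output."""
    c = dot(normalize(view_vec), normalize(light_vec))
    # Remap [lo...hi] to [0...1], clamping everything below lo to 0.
    return max(0.0, min(1.0, (c - lo) / (hi - lo)))

# Vectors nearly aligned -> strong response (close to 1):
print(retro_intensity((0.0, 0.0, 1.0), (0.0, 0.01, 1.0)))
# Vectors 45 degrees apart -> clamped to 0:
print(retro_intensity((0.0, 0.0, 1.0), (0.0, 1.0, 1.0)))
```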
>
> One speed bump you may run into is getting all the positions into the
> same coordinate space before you do the computations.  Mental ray defines
> values in 'internal space', which changes depending on the context in which
> the shader is evaluated.  Usually it means world space, but in the case of
> state->pnt it might mean object space.  If you run into this problem, insert
> a coordinate conversion node between the position vector and its output
> targets.
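> For reference, such a conversion is just a 4x4 matrix transform of the
> point, which is what the conversion node does for you.  A conceptual sketch
> (the matrix here is an arbitrary translation, purely for illustration):

```python
# Conceptual sketch of a coordinate conversion: transforming a point by
# a 4x4 object-to-world matrix.  The matrix below is an arbitrary
# translation chosen for illustration, not data from a real scene.

def transform_point(m, p):
    """Apply a row-major 4x4 matrix m to a 3D point p (w assumed 1)."""
    x, y, z = p
    return tuple(m[r][0] * x + m[r][1] * y + m[r][2] * z + m[r][3]
                 for r in range(3))

object_to_world = (
    (1.0, 0.0, 0.0, 2.0),   # translate +2 in x
    (0.0, 1.0, 0.0, 0.0),
    (0.0, 0.0, 1.0, 0.0),
    (0.0, 0.0, 0.0, 1.0),
)

print(transform_point(object_to_world, (1.0, 1.0, 1.0)))  # (3.0, 1.0, 1.0)
```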
>
> Matt
>
>
>
>
>
> Date: Tue, 24 Mar 2015 11:16:59 -0500
> From: Patrick Neese <[email protected]>
> Subject: Retro reflective Materials?
> To: [email protected]
>
>
> I'm looking for something like cat eyes or road signs, where the
> reflection of the light source is affected by the angle between the light,
> the surface, and the camera.  It should also take into account the light
> color.  If a cat turns its eyes away, the intensity/reflection changes,
> and/or if the camera shifts off axis from the surface or the light
> shifts off axis, the intensity/reflection also changes.
>
> So, I've experimented with Incidence nodes and some crazy
> mixing... but it isn't exactly what I would like.
>
> Does anyone know of a way to get a proper retro-reflective material?
> Did I miss something in some of the shaders or a shader itself? One of
> the BRDF settings?
>
> Thank you for your time.
>
> Patrick
>
