As stated, I'm using the same tangent space and tangents vertex color property 
as ultimapper.  The only variable is the internal computation that converts the 
normal from the source's (hi res mesh) tangent space to the target's (low res 
mesh) tangent space.  When both meshes are untransformed at the world origin, I 
get the same result as ultimapper.  When one or both meshes are transformed, a 
discrepancy arises that I cannot completely explain.  What I'm noticing is that 
my code interprets the tangents property as if the tangents are encoded in the 
object space of the target mesh.  Ultimapper appears to interpret the same 
tangent information in world space, or in the object space of the source 
object.  I'm 
not sure which, but that's the difference.  I *think* my results are more 
correct than ultimapper's, but I want to make sure I'm understanding what the 
expected result is.  That's why I ask if anybody has experienced weirdness with 
ultimapper.  If everybody agrees ultimapper is correct, then my result is 
wrong, and vice versa.
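
The internal computation in question can be sketched as a change of basis (a minimal Python sketch with hand-picked orthonormal frames; none of this is Softimage API): lift the hi-res normal out of the source tangent space into world space, then project it into the target's tangent frame.

```python
# Minimal sketch of the conversion step. The [T, B, N] frames below are
# hypothetical, hand-picked, and assumed orthonormal; plain Python only.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def from_tangent(T, B, N, n_ts):
    # tangent space -> world: weight the basis vectors by the components
    return [T[i] * n_ts[0] + B[i] * n_ts[1] + N[i] * n_ts[2] for i in range(3)]

def to_tangent(T, B, N, n_w):
    # world -> tangent space: project onto the orthonormal basis
    return [dot(n_w, T), dot(n_w, B), dot(n_w, N)]

# Source frame: a point on the sphere's equator facing +X.
src = ([0.0, 0.0, -1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0])   # T, B, N
# Target frame: a cube face whose normal is +Z.
tgt = ([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0])

n_world = from_tangent(*src, [0.0, 0.0, 1.0])   # hi-res normal, straight out
n_target_ts = to_tangent(*tgt, n_world)
print(n_target_ts)   # -> [1.0, 0.0, 0.0]: it lies along the target's tangent
```

The whole disagreement comes down to which space the stored T and B are assumed to live in before this projection happens.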

As for your issues, you can turn off the smoothing of tangents if that doesn't 
work for you.  Just open the TangentsOp2 operator on the tangents vertex color 
property and set the smoothing angle to 0, or turn it off completely using the 
drop-down menu.  Ultimapper tends to re-apply the tangents operator every time 
the 
PPG is opened, so you may want to uncheck the 'keep tangent operator' parameter 
in the advanced tab to prevent that from recurring.


Included below is a script that generates a scene for illustration purposes.  
It contains a sphere representing the hi-res source mesh, with a spherical 
texture projection used as the basis for generating the tangents.  This means all the 
tangents are parallel to the global XZ plane and point counterclockwise around 
the sphere when viewed from above.  The bitangents point up in global Y along 
the contour of the sphere.  The normal, of course, points out perpendicular to 
the surface.
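
As a sanity check, that frame can be reconstructed in code like this (my own reconstruction of the geometry described above, not the TangentOp2 algorithm):

```python
# Sketch: the tangent frame described for the sphere, rebuilt from a
# surface point. Right-handed, Y-up; this is illustrative only.
import math

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def normalize(v):
    l = math.sqrt(sum(x * x for x in v))
    return [x / l for x in v]

def sphere_frame(p):
    n = normalize(p)                          # normal: radially outward
    t = normalize(cross([0.0, 1.0, 0.0], n))  # tangent: parallel to global XZ
    b = cross(n, t)                           # bitangent: up along the contour
    return t, b, n

t, b, n = sphere_frame([1.0, 0.0, 0.0])   # a point on the equator
print(b)   # b points in global +Y
```

At any equator point the bitangent comes out as global +Y, matching the description; the tangent's winding direction depends on the handedness convention assumed here.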

The cube is the low res mesh to receive the result.  It has a cubic projection 
with tangents applied in a very similar fashion to the sphere: the tangents 
point in the +X direction and continue counterclockwise around the cube when 
viewed from above.  The bitangents all point in global +Y.  Again, the normals 
point out perpendicular to the surface of each side.  The only reason 
the cube is heavily subdivided is so I can use scripted code to prototype the 
tool as scripts cannot write image files.  I had to write to a vertex color 
property instead.

1) Open Ultimapper on the cube, make sure the path points to a valid location, 
then click the 'regenerate maps' button.
2) Open the cube's rendertree and replace no_Icon.pic with the normal map 
generated in step 1.

You should get a normal map in tangent space which shows the same image on all 
six faces of the cube: a rainbow-colored circle with the top of the circle 
shaded green, the bottom right shaded red, and the bottom left shaded darker 
blue.  The green part of the circle should always be closest to the bitangent 
(up), and the red part of the circle towards the tangent (right).

3) Select the cube and rotate it +45 degrees on its local Z axis, then 
regenerate the tangent space normal map.

NOTE: Since the tangentsOp2 operator is frozen, the tangents vertex color 
property will not be updated when the cube is rotated.  This means all tangents 
and bitangents are described in the local space of the cube and will retain 
their orientation relative to each polygon, exactly as in step 1.

4) Regenerate the normal map by clicking the 'regenerate maps' button in the 
ultimapper property.

Notice ultimapper has re-encoded the normal map.  The circle drawn on the front 
face of the cube has been rotated -45 degrees so that its bitangent points in 
global +Y (green part of the circle) and its tangent points in global +X (red 
part of the circle).  By comparison, my code generates the same result as in 
step 1, indicating it respects the local nature of the tangent and bitangent 
information and orients the circles to the faces of the cube.
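
The difference between the two interpretations can be sketched like this (hypothetical variable names; a 45-degree rotation about Z stands in for the cube's transform):

```python
# Sketch: the same stored tangent under the two possible interpretations.
import math

def rot_z(deg):
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def mat_vec(m, v):
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

t_stored = [1.0, 0.0, 0.0]   # tangent as frozen in the vertex color property

# Interpretation A (my code): stored tangents live in the cube's local
# space, so the world-space tangent follows the cube's rotation.
t_local_interp = mat_vec(rot_z(45.0), t_stored)

# Interpretation B (apparently ultimapper): stored values are taken as
# world-space directions, so the cube's transform has no effect on them.
t_world_interp = t_stored

print(t_local_interp)   # ~[0.707, 0.707, 0.0]
print(t_world_interp)   # [1.0, 0.0, 0.0]
```

Under interpretation B the baked pattern gets counter-rotated so its tangent stays in global +X, which matches the -45 degree spin seen in the map.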

The question is: who is correct?



-------------------- start of script --------------------

// source (sphere)
CreatePrim("Sphere", "MeshSurface", null, null);
ApplyShader("$XSI_DSPRESETS\\Shaders\\Material\\Lambert.Preset", "", null, "", siLetLocalMaterialsOverlap);
SetValue("sphere.polymsh.geom.subdivu", 500, null);
SetValue("sphere.polymsh.geom.subdivv", 250, null);
CreateProjection("sphere", siTxtSpherical, null, null, "Texture_Projection", null, null, null);
CreateVertexColorSupport("ColorAtVertices", "Tangents", "sphere", null);
ChangeVertexColorDatatype("sphere.polymsh.cls.Texture_Coordinates_AUTO.Tangents", 1, null);
ApplyOp("TangentOp2_cpp", "sphere.polymsh.cls.Texture_Coordinates_AUTO.Tangents;sphere.polymsh.cls.Texture_Coordinates_AUTO.Texture_Projection", siUnspecified, siPersistentOperation, null, 2);
FreezeObj(null, null, null);

// target (cube)
CreatePrim("Cube", "MeshSurface", null, null);
ApplyShader("$XSI_DSPRESETS\\Shaders\\Material\\Lambert.Preset", "", null, "", siLetLocalMaterialsOverlap);
SetValue("cube.polymsh.geom.subdivu", 100, null);
SetValue("cube.polymsh.geom.subdivv", 100, null);
SetValue("cube.polymsh.geom.subdivbase", 100, null);
CreateProjection("cube", siTxtCubic, siTxtDefaultCubic, null, "Texture_Projection", null, null, null);

SetValue("Texture_Support.Texture Support.mxfacesu", 0.333, null);
SetValue("Texture_Support.Texture Support.mxfacesv", 0.333, null);
SetValue("Texture_Support.Texture Support.pxfacesu", 0.333, null);
SetValue("Texture_Support.Texture Support.pxfacesv", 0.333, null);
SetValue("Texture_Support.Texture Support.myfacesu", 0.333, null);
SetValue("Texture_Support.Texture Support.myfacesv", 0.333, null);
SetValue("Texture_Support.Texture Support.pyfacesu", 0.333, null);
SetValue("Texture_Support.Texture Support.pyfacesv", 0.333, null);
SetValue("Texture_Support.Texture Support.mzfacesu", 0.333, null);
SetValue("Texture_Support.Texture Support.mzfacesv", 0.333, null);
SetValue("Texture_Support.Texture Support.pzfacesu", 0.333, null);
SetValue("Texture_Support.Texture Support.pzfacesv", 0.333, null);

SetValue("Texture_Support.Texture Support.mxfacetv", 0.666, null);
SetValue("Texture_Support.Texture Support.pxfacetv", 0.666, null);
SetValue("Texture_Support.Texture Support.myfacetv", 0.333, null);
SetValue("Texture_Support.Texture Support.pyfacetv", 0.333, null);
SetValue("Texture_Support.Texture Support.pxfacetu", 0.666, null);
SetValue("Texture_Support.Texture Support.pxfacetu", 0.333, null);
SetValue("Texture_Support.Texture Support.pyfacetu", 0.333, null);
SetValue("Texture_Support.Texture Support.pzfacetu", 0.333, null);

CreateVertexColorSupport("ColorAtVertices", "Tangents", "cube", null);
ChangeVertexColorDatatype("cube.polymsh.cls.Texture_Coordinates_AUTO.Tangents", 1, null);
ApplyOp("TangentOp2_cpp", "cube.polymsh.cls.Texture_Coordinates_AUTO.Tangents;cube.polymsh.cls.Texture_Coordinates_AUTO.Texture_Projection", siUnspecified, siPersistentOperation, null, 2);

AddProp("Ultimapper", "cube", siDefaultPropagation, null, null);
SetValue("preferences.Interaction.autoinspect", false, null);
SetValue("cube.Ultimapper.Texcoord", "Texture_Projection", null);
SetValue("cube.Ultimapper.Tangent", "Tangents", null);
ApplyOp("TangentOp2_cpp", "cube.polymsh.cls.Texture_Coordinates_AUTO.Tangents;cube.polymsh.cls.Texture_Coordinates_AUTO.Texture_Projection", siUnspecified, siPersistentOperation, null, 2);
SetValue("preferences.Interaction.autoinspect", true, null);
SetValue("cube.Ultimapper.Group", "sphere", null);
SetValue("cube.Ultimapper.Resolution", 1024, null);
SetValue("cube.Ultimapper.SuperSampling", 5, null);
FindUltimapperDistance("cube.Ultimapper");
SetValue("cube.Ultimapper.KeepTanOp", false, null);
DeleteObj("cube.polymsh.cls.Texture_Coordinates_AUTO.Tangents.TangentOp2");

FreezeObj("cube", null, null);

-------------------- end of script ------------------------

Matt


From: [email protected] 
[mailto:[email protected]] On Behalf Of Szabolcs Matefy
Sent: Thursday, January 02, 2014 10:52 PM
To: [email protected]
Subject: RE: ultimapper issues - tangent space normal maps

Hey Matt,

Your result might be different because of the tangent space calculation. I 
suppose the normal map calculation might be done in object space, with 
Ultimapper then converting it into tangent space. Ultimapper could be quite 
good, but it lacks a very important feature, the cage. So we finally dropped it 
in favor of xNormal.

You might check a few things (I'm not a programmer, so I may be wrong). Check 
the transforms. In my experience, transforms affect how vertex normals are 
calculated. A certain distance from the origin might result in imprecision, and 
the farther the object is from the origin, the bigger this imprecision is.

There are discrepancies, for sure, because these tools take different 
approaches to deriving tangent space. For example, Softimage uses the vertex 
color to store the tangents, and the binormal is calculated from this. But if 
your smoothing on the geo and on the tangent space property differs, you won't 
get any usable normal map. For example, the smoothing on tangents made 
Ultimapper quite useless for us, so I wrote an exporter for xNormal, and since 
then we have had no issues at all. As our technical chief explained, a normal 
map is correct only if the normal baking and the display use the same tangent 
calculation. He wrote a tangent space calculator for xNormal that uses the same 
algorithm CryEngine uses. So, if your game engine approaches tangent space 
differently than Softimage, you won't get a good result.
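
The "binormal calculated from the tangents" step he mentions is, under one common convention (I don't know Softimage's exact sign convention, so treat this as an assumption), just a cross product:

```python
# Sketch: deriving the binormal/bitangent from the stored normal and
# tangent. The handedness sign convention here is assumed, not Softimage's
# documented behavior.
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def binormal(N, T, handedness=1.0):
    # B = s * (N x T); the sign s is often carried alongside the stored
    # tangent to handle mirrored UVs
    return [handedness * c for c in cross(N, T)]

print(binormal([0.0, 0.0, 1.0], [1.0, 0.0, 0.0]))   # -> [0.0, 1.0, 0.0]
```

This is why a mismatch in how two tools derive the binormal (or smooth the tangents first) produces incompatible maps even from identical tangent data.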

I think the whole game pipeline should be redesigned in Softimage...

From: [email protected] 
[mailto:[email protected]] On Behalf Of Matt Lind
Sent: Friday, January 03, 2014 5:17 AM
To: [email protected]
Subject: ultimapper issues - tangent space normal maps

I am writing a modified ultimapper to convert tangent space normal maps from 
one mesh to another.  The tool is needed because our tangent space normal maps 
are not encoded in the standard way and softimage's tools cannot be modified to 
support our proprietary tangent space.  For prototyping I'm using the softimage 
tangent space and tangents property to do the transfer so I can check my math 
against ultimapper.  Once I get a 1:1 match, I'll modify the logistics to 
support our proprietary stuff.

So far when the hi and low res meshes are untransformed I get a 1:1 match with 
ultimapper, but when I transform one or both meshes a wide discrepancy appears 
between my result and the softimage ultimapper result.  The softimage result 
tends to be significantly brighter on the red and green channels, mostly on the 
green.  In some cases, the colors are not even close to the same.  The odd part 
is when I trace through the process step by step to debug, my numbers look 
correct both visually and mathematically.  I'm in a weird situation in that I 
do not know whose result is more correct, mine or Softimage's.

Some of our artists have mentioned there have been some discrepancies compared 
to other commercial normal mapping tools (beyond flipping the Y axis).  Has 
anybody had issues getting correct results from ultimapper when transferring 
tangent space normal maps between meshes?
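
For reference, the "flipping the Y axis" difference between tools amounts to nothing more than inverting the green channel of the encoded map (a generic sketch):

```python
# Sketch: converting between the two common normal map green-channel
# conventions (bitangent up vs. bitangent down) on an encoded RGB texel.
def flip_y(rgb):
    r, g, b = rgb
    return [r, 1.0 - g, b]

print(flip_y([0.5, 0.75, 1.0]))   # -> [0.5, 0.25, 1.0]
```

Discrepancies beyond that, like the ones the artists report, point at a genuinely different tangent basis rather than a channel convention.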


Matt

