1. I mostly do characters for games. I used to do animation too, but not
lately, and I've only done one project that involved this Maya-Softimage
workflow with animation.
We had to port character animations made in Softimage to Maya, and baking
the transforms and exporting to FBX was good enough. Now that I remember,
the Maya rig wasn't "compatible" with the Softimage one, so I had to write
a few things to constrain the Maya rig to the FBX skeleton, bake onto the
Maya rig, and delete the original FBX data.
Doing that manually for all the characters would have been a nightmare.
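
From memory, that transfer step looked roughly like this (just a sketch;
the joint/control names and the mapping between the two skeletons are made
up for the example):

import maya.cmds as cmds

# Hypothetical mapping from imported FBX joints to the Maya rig controls.
fbx_to_rig = {
    'fbx_arm_L': 'rig_arm_L_ctrl',
    'fbx_leg_L': 'rig_leg_L_ctrl',
}

# Constrain each rig control to its FBX counterpart.
constraints = [cmds.parentConstraint(fbx, rig)[0]
               for fbx, rig in fbx_to_rig.items()]

# Bake the constrained motion onto the rig over the playback range.
start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)
cmds.bakeResults(list(fbx_to_rig.values()), time=(start, end), simulation=True,
                 attribute=['tx', 'ty', 'tz', 'rx', 'ry', 'rz'])

# Remove the constraints and the original FBX skeleton.
cmds.delete(constraints)
cmds.delete(list(fbx_to_rig.keys()))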

2. The Softimage data had RGBA. I'm not sure how Maya handles vertex color,
because I only found how to get a vertex color through a vertex, not through
a sample, so it may be giving me the averaged RGBA values.
I just tried again with an FBX that has its vertex colors rotated.
I selected 1 vertex and got the RGBA values: 0.37, 0.0, 0.0, 1.0.
The Alpha value was already 1.0, but setting Alpha to 1.0 again still fixes it.
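
For reference, this is the kind of thing I do to check and fix it from
script (the mesh name is just an example; querying a .vtxFace component
instead of a .vtx should give the per-sample value rather than the vertex
average):

import maya.cmds as cmds

# Query what came through the FBX on one vertex.
rgb = cmds.polyColorPerVertex('character_mesh.vtx[0]', query=True, colorRGB=True)
alpha = cmds.polyColorPerVertex('character_mesh.vtx[0]', query=True, alpha=True)
print(rgb, alpha)

# Force alpha to 1.0 on every vertex; this is the step that un-rotates the colors.
cmds.polyColorPerVertex('character_mesh.vtx[*]', alpha=1.0)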

3. I meant UVs as in the samples inside the UV editor, not the Texture
Projection. And by merge I meant something like heal. Maya has no "tearing
mode", only cut and merge.

In Maya you need at least 1 UV set (Texture Projection) called "map1", and
this will be the default UV set that any texture in any material uses. So if
you rename your Texture Projection in Softimage to "map1" before exporting,
it will be the primary UV set by default. If you didn't, Maya will create a
map1 anyway, all your objects will end up with 1 extra UV set, and it will
be a little troublesome to fix. So be sure you have a map1.
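
If you forget and end up with the extra set, a cleanup along these lines has
worked for me (the imported set name "Texture_Projection" is just an example):

import maya.cmds as cmds

for mesh in cmds.ls(type='mesh'):
    uv_sets = cmds.polyUVSet(mesh, query=True, allUVSets=True) or []
    if 'Texture_Projection' in uv_sets:
        # Copy the imported UVs into map1, then drop the extra set.
        cmds.polyCopyUV(mesh, uvSetNameInput='Texture_Projection', uvSetName='map1')
        cmds.polyUVSet(mesh, delete=True, uvSet='Texture_Projection')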

If you want to choose which UV set a texture uses, you can do that in Maya
too, through the UV linking editor. So you can, for example, have multiple
objects using the same texture (and material) with the UV set "map1", while
only one of them has multiple UV sets and is set to use "map2".
As always in Maya, this option is nowhere near the texture node or the
object node, but in a totally separate "Relationship Editor" in the Windows
menu. Traditional Maya workflow.
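
It can also be done from script with uvLink, which I find handier than the
Relationship Editor (node names here are hypothetical):

import maya.cmds as cmds

# Point the file texture at "map2" (uvSet[1]) on just this one object.
cmds.uvLink(uvSet='special_objectShape.uvSet[1].uvSetName', texture='diffuse_file')

# List which UV set plugs the texture is currently linked to.
print(cmds.uvLink(query=True, texture='diffuse_file'))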

4. Maya doesn't have a Neutral Pose. Joints have something similar called
Joint Orient, where the joint can hold an angle while its rotation values
are something different. You can Freeze a joint to set its rotations to 0,
and by doing that the old rotation values get merged into the orientation
values. But you can't do the opposite without coding, I mean set the
orientation to 0 and get your rotation values back. Why would you need to do
that? No idea, but I've been asked to do it a few times.
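
When I've had to, it was basically this (a sketch that only handles the
current pose, and assumes the joints have no rotateAxis offsets):

import maya.cmds as cmds

def orient_to_rotation(joint):
    # Remember the joint's current world-space orientation.
    world_rot = cmds.xform(joint, query=True, worldSpace=True, rotation=True)
    # Zero the joint orientation...
    cmds.setAttr(joint + '.jointOrient', 0, 0, 0, type='double3')
    # ...and restore the same pose through the rotate channels.
    cmds.xform(joint, worldSpace=True, rotation=world_rot)

for j in cmds.ls(selection=True, type='joint'):
    orient_to_rotation(j)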

Regular transform nodes don't have this orientation attribute; they would
need an additional transform node as a parent to get this kind of double
coordinate.

If you export a Softimage object with a Neutral Pose to FBX, it will be
exported as 2 nodes. If it was an envelope deformer (bone, null) with a
Neutral Pose, you will get 2 joints in Maya.

5. I haven't done much rigging in Maya other than a few simple characters,
so it's been more trial and error than real research. I can't remember
exactly, but experimenting with a few nodes in the Node Editor covered my
needs. I haven't done anything complicated, though.

6. In one Softimage project we had Maya designers doing the animation, so it
was kind of the opposite. The bones (nulls) in Softimage had ICE expressions
driving the elbows, shoulders, knees, etc. We exported to Maya through FBX
and recreated the ICE expressions as simple Maya expressions. It worked
fine, and the Maya designers could see the deformations just as they were in
Softimage.
When sending the data back to Softimage, we only took the Maya meshes and
their weights and used them with the Softimage bones.
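
The Maya-side expressions were nothing fancy, on the order of this (joint
names made up):

import maya.cmds as cmds

# A typical helper-bone case: drive a forearm twist joint from half of the
# wrist's X rotation, like the original ICE setup did.
cmds.expression(name='forearm_twist_exp',
                string='forearm_twist_L.rotateX = wrist_L.rotateX * 0.5;')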

7. Since we do mainly characters, the workflow stays similar within a
project. For the first character it can take me a day to adapt my scripted
workflow to the new project, and a few more days to tweak it and make it
more efficient. After that, the conversion is a matter of minutes, unless
you hit some unexpected error or mistake.

In the project I was talking about in point 6, it was just a few minutes to
execute a few scripts. The main script consisted of cleaning the Softimage
data, renaming to map1 and those kinds of things, creating a Maya project
folder, copying textures, exporting to FBX, opening it in Maya batch,
running a MEL script to clean it up, fixing materials, texture paths and
everything else that can be automated, and saving as .ma (I don't know if it
can be done in Python instead of MEL, but I couldn't make it work back
then). For Maya to Softimage it's exactly the same but in reverse, using
xsibatch. Nothing too complicated, but it saved a lot of time.
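
To give an idea of the Maya batch end of it, the skeleton of that cleanup
pass was something like the following, written here in Python for
illustration even though my version was MEL (paths, the FBX name, and the
exact cleanup steps are just placeholders):

import maya.standalone
maya.standalone.initialize()
import maya.cmds as cmds

# Import the FBX that the Softimage script exported.
cmds.loadPlugin('fbxmaya')
cmds.file('D:/project/export/chara01.fbx', i=True, type='FBX')

# Typical cleanup: remove namespaces and unlock normals on every mesh.
for ns in cmds.namespaceInfo(listOnlyNamespaces=True):
    if ns not in ('UI', 'shared'):
        cmds.namespace(removeNamespace=ns, mergeNamespaceWithRoot=True)
for mesh in cmds.ls(type='mesh'):
    cmds.polyNormalPerVertex(mesh + '.vtx[*]', unFreezeNormal=True)

# Save the result as .ma.
cmds.file(rename='D:/project/scenes/chara01.ma')
cmds.file(save=True, type='mayaAscii')
maya.standalone.uninitialize()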

After that conversion there isn't really much to do but check that
everything is OK and tweak a little.

And a few scripts to export only weights or point positions (for blend
shapes) to be read back in Maya.
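
The point-position part is really just a dump and load of matching vertex
indices; a minimal sketch of the Maya side, assuming identical vertex order
and an example file path:

import json
import maya.cmds as cmds

def dump_points(mesh, path):
    # Write every vertex position (object space) to a JSON file.
    count = cmds.polyEvaluate(mesh, vertex=True)
    points = [cmds.pointPosition('%s.vtx[%d]' % (mesh, i), local=True)
              for i in range(count)]
    with open(path, 'w') as f:
        json.dump(points, f)

def load_points(mesh, path):
    # Push the stored positions back onto a mesh with the same vertex count.
    with open(path) as f:
        points = json.load(f)
    for i, (x, y, z) in enumerate(points):
        cmds.xform('%s.vtx[%d]' % (mesh, i), objectSpace=True, translation=(x, y, z))

dump_points('chara01_shapeA', 'D:/project/export/shapeA.json')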

Martin



On Wed, Oct 11, 2017 at 12:27 AM, Matt Lind <[email protected]> wrote:

> thanks, Martin.
>
> A few follow up questions:
>
> 1. What kind of animation are you baking?  (e.g. besides transforms, what
> parameters are supported?)
>
> 2.  Vertex colors - I think Maya expects RGBA values.  If you send only
> RGB,
> then that would explain the 'rotation' of values as the Red value of the
> 2nd
> polygon node will map to the alpha channel of the first polygon node (face
> vertex) Maya is expecting.  And then ripples to all the latter nodes as
> well.
>
> 3.  UVs - what do you mean by 'merge' UVs?  You mean like 'heal' as it's
> known in Softimage?  or do you mean put all the UVs into a single texture
> projection?  Also, how do you handle the case of multiple objects sharing
> the same material, but with different UVs on their respective geometries?
> Example: 3 objects share the same material and phong shader, but the image
> node driving the ambient and diffuse ports of the phong shader references a
> different set of UV coordinates (texture projection) on each of the 3
> objects....because that's the only way to do it.  How do you replicate that
> setup in Maya?  Same question for materials applied to clusters which are
> shared across multiple objects.
>
> 4. Neutral Pose - what is the technical equivalent of a neutral pose in
> Maya?  Are the neutral pose issues specific to .fbx?
>
> 5.  Constraints.  What constraints do you need most (besides position,
> scale, orientation, direction, and pose)?  After rebuilding your
> constraints
> in Maya, how do you handle cases where multiple constraints applied to the
> same object affect the same attributes?  Example:  in Softimage, applying 2
> position constraints on the same object results in 2 position constraint
> operators being applied to the object.  In Maya, applying 2 point
> constraints results in one point constraint operator with 2 inputs being
> blended.  When more constraints are applied to the same object, additional
> nodes, such as a 'pairBlend' node, may be inserted to resolve the
> conflicts.
> How do you control/organize the logic of how the constraints are applied
> (e.g. what is Maya's logic for determining whether to insert a pairBlend
> node vs. plugging another object into the input of the constraint
> operator?)
>
> 6. Expressions.  How important are expressions to your needs?  What
> features
> of Softimage expressions do you need most?
>
>
> 7.  How much time does it take you to finish your work after it is imported
> into Maya from Softimage?  Minutes? Hours? Days?
>
> Matt
>
>
>
>
> Date: Mon, 9 Oct 2017 13:30:34 +0900
> From: Martin Yara <[email protected]>
> Subject: Re: Porting to Maya
> To: "Official Softimage Users Mailing List.
>
> I've been using that workflow for a few years. Softimage to Maya, Maya to
> Softimage. Mainly for character modeling and animation, shape animation.
>
> I scripted most of it. Softimage Script -> FBX -> Maya batch with a Maya
> Script. Works pretty fine, but it has to be a little customized depending
> on the project. It can be done in one click, and the part that takes the
> most time is the FBX conversion.
>
> 1. Modeling (including bones, weights, everything except the rig), Animation
> (baked and using 2 compatible rigs in Softimage and Maya).
>
> 2 and 3.
> Lots of things, and I'm sure a lot of them you already know, but just in
> case:
>
> - Vertex Color. Usually if you match the FBX version to the Maya version it
> will export fine. The problem is Softimage only has FBX 2015, so exporting
> to 2016 didn't work very well sometimes.
> We were doing a Maya 2016 project and exporting to FBX caused the Vertex
> Color to be "rotated" like the old FBX UV problem. Weirdly enough, if I clean
> the mesh by export / importing to OBJ, copy weights from my old mesh and
> other things before exporting to FBX, it usually works fine. But even more
> weird, importing this bugged FBX into Maya, and setting the Alpha Channel
> to 1.0 fixed it. Yeah, I don't know why.
>
> - UVs. You have to rename at least your main Tex.Projection to map1 before
> exporting or it will get messy inside Maya. And merge all UVs in Maya once
> imported, because all your UVs will be separated. Selecting All UVs and
> merge them with a very low threshold value works fine.
>
> - Materials. Depending on the Maya version and how complicated your
> Materials are you will have to rebuild them. And obviously fix the texture
> paths. Delete Scene Material.
>
> - Delete Neutral Pose in Softimage before exporting or you will have an
> extra locator or bone.
>
> - Unlock Normals. When you import into Maya, the normals will be locked,
> and if you don't unlock them before doing anything in your mesh, your
> normals will get messed up pretty quickly.
>
> - Remove Namespaces in Maya.
>
> - Just in case, check that the weights are normalized. I don't know if that
> is normal, but I had a few problems with this, so I normalize every time I
> import into Maya.
>
> - Vertex numbers stay the same. So if you are using shape animation with
> different objects, it is easily exportable with a custom script: just write
> the point positions and load them in Maya without having to use FBX every
> time. I did it with JSON and OpenMaya.
> The same with weights if necessary. I used Comet, and Alan Fregtman's
> Softimage version. The Comet script is old MEL, so it could be faster with
> JSON and OpenMaya, but it works. I haven't looked into how to do it with
> vertex colors or UVs.
>
> - Animation is pretty straightforward as long as the objects have the same
> name and don't have a neutral pose. Obviously constraints, expressions or
> any deformer other than envelope won't work, so you'll have to rebuild them
> in Maya. I guess that could be scripted too.
>
> - Internal triangle edges orientation change, so be sure to triangulate
> before exporting if needed.
>
> - Maya has different default frame rate values, so match frame rate, start
> and end frame before importing if needed.
>
>
> >From Maya to Softimage:
> - Once imported into Softimage, rebuild the Envelope or any weight
> operation won't work as expected. Something gets a different order when
> imported (deformers, points, I'm not sure), and I haven't looked into it
> either.
>
> - Symmetry Template will fail if you don't rebuild your envelope. And any
> script that depends on the Symmetry Template may crash Softimage if you
> don't do it.
>
> - Again just in case, I normalize the weights.
>
> - Be sure to export with Smoothing Groups to be able to import hard edges
> in Softimage.
>
> I think that's pretty much all I do.
> Hope it helps.
>
> Martin
>
>