Hi all, I am interested in the idea "Improve blender for dealing with motion capture data". I'm a graduate student doing research on motion synthesis based on motion capture data, so I think this project is quite related to my research. I once did a small project on motion blending using BVH motion data.
Following is my understanding of the features mentioned in the idea:

Curve simplification for mo-cap data: we can use a smaller set of keyframes to represent the original mo-cap data, and users can then edit the motion through these keyframes. The problem here is how to sample the keyframes so that they fit the original motion data.

Turning repetitive motion into loops: a simple example is generating a walking motion of any length from a single walking cycle. The problem here is to find the repetitive pattern in the mo-cap data.

I'm not sure whether I have understood these correctly. Any suggestions or information is welcome.

Thanks

--
Yumei Wang
School of Computing, National University of Singapore

_______________________________________________
Bf-committers mailing list
[email protected]
http://lists.blender.org/mailman/listinfo/bf-committers
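As an illustration of the keyframe-sampling idea above, here is a rough sketch of how I would approach it for a single animation channel, using the Ramer-Douglas-Peucker algorithm (this is my own example code, not anything from the Blender codebase; the function name and the `tol` parameter are my own choices):

```python
def simplify_curve(times, values, tol):
    """Ramer-Douglas-Peucker on one channel: keep a minimal set of
    keyframes whose linear interpolation stays within `tol` of the
    original samples. `times` must be strictly increasing."""
    def rdp(i, j, keep):
        t0, v0 = times[i], values[i]
        t1, v1 = times[j], values[j]
        worst, worst_k = 0.0, None
        # Find the sample that deviates most from the chord i -> j.
        for k in range(i + 1, j):
            u = (times[k] - t0) / (t1 - t0)
            d = abs(values[k] - (v0 + u * (v1 - v0)))
            if d > worst:
                worst, worst_k = d, k
        # If the worst deviation exceeds the tolerance, keep that
        # sample as a keyframe and recurse on both halves.
        if worst > tol:
            rdp(i, worst_k, keep)
            keep.add(worst_k)
            rdp(worst_k, j, keep)

    keep = {0, len(times) - 1}
    rdp(0, len(times) - 1, keep)
    return sorted(keep)
```

A straight-line channel collapses to just its two endpoints, while an oscillating channel keeps every extremum; `tol` directly trades keyframe count against fidelity, which seems like a natural thing to expose to the user.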
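For the loop-detection part, one simple baseline I have in mind is to score each candidate period by the average pose distance between frames that are one period apart, and take the best-scoring period as the loop length. A minimal sketch (again my own example; the flat per-frame channel representation and the function name are assumptions, not existing Blender API):

```python
def find_loop(frames, min_period=1):
    """Return (period, error): the candidate period whose average
    per-channel distance between frame k and frame k+period is
    smallest. Each frame is a flat list of channel values."""
    n = len(frames)
    best_p, best_err = None, float("inf")
    for p in range(min_period, n // 2 + 1):
        err = 0.0
        for k in range(n - p):
            # L1 distance between the two poses one period apart.
            err += sum(abs(a - b) for a, b in zip(frames[k], frames[k + p]))
        err /= (n - p)
        if err < best_err:
            best_err, best_p = err, p
    return best_p, best_err
```

On real mo-cap data one would probably want to normalize channels and allow a small blending window at the loop seam, but this already shows the core idea: a truly repetitive motion produces a near-zero error at its period.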
