Thanks Matt,

I found the mocap stuff yesterday, but unfortunately the server doesn't
appear to work on Mac OS X. We may be able to work around this.

Fortunately I also found the
maya.utils.executeInMainThreadWithResult() function, so I should be
able to set up an ad-hoc server to pick up my
camera data. I've got a server/client working in python/C++ now, so I just
need to reverse the roles and run the python code inside of Maya instead of
as a standalone process.
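For what it's worth, the reversed arrangement can be sketched in plain Python with no Maya dependencies (the names here are illustrative, not from any real project): a background thread accepts the camera connection and hands each newline-delimited record to a callback. Inside Maya that callback would wrap its scene edits in maya.utils.executeInMainThreadWithResult(); here it's an ordinary callable so the sketch runs standalone.

```python
import socket
import threading

# Listen on an ephemeral port so the sketch never collides with a real
# service; the camera client would use a fixed, agreed-upon port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def serve(sock, handle_record):
    """Accept one client and pass each newline-delimited record on.

    Inside Maya, handle_record would wrap its scene edits in
    maya.utils.executeInMainThreadWithResult() so they run on the main
    thread; here it is a plain callable so the sketch runs standalone.
    """
    conn, _ = sock.accept()
    buf = b""
    while True:
        chunk = conn.recv(4096)
        if not chunk:  # client closed the connection
            break
        buf += chunk
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            handle_record(line.decode("ascii"))
    conn.close()
    sock.close()

# Usage: start the server on a background thread, then connect as the
# camera would and send a single record.
records = []
t = threading.Thread(target=serve, args=(server, records.append))
t.daemon = True
t.start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"01:02:03:04 1.0 2.0 3.0 4.0\n")
client.close()
t.join(timeout=5)
```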

Assuming I can get this working, what would be the steps in creating an
animation path from the timecode data?

We will be receiving data in the form hh:mm:ss:ff <position> <tilt> <pan>
<zoom>
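As a first step toward the animation path, each record's timecode can be mapped to a frame number and keyed from there. This is a sketch under stated assumptions: non-drop-frame timecode, a 25 fps scene rate (adjust to the camera's actual rate), and a generic value list since the exact layout of <position> isn't pinned down yet.

```python
def timecode_to_frame(tc, fps=25):
    """Map an hh:mm:ss:ff timecode to an absolute frame number.

    Assumes non-drop-frame timecode and that the Maya scene frame rate
    matches fps (25 here, as an example).
    """
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff

def parse_record(line):
    """Split 'hh:mm:ss:ff <position> <tilt> <pan> <zoom>' into
    (frame, [values]). The number of value fields is left open, since
    <position> may be a single token or an x/y/z triple."""
    fields = line.split()
    return timecode_to_frame(fields[0]), [float(f) for f in fields[1:]]

frame, values = parse_record("01:00:00:05 1.5 10.0 -20.0 35.0")
# Inside Maya, frame would then drive keyframes on the camera, e.g.:
#   cmds.setKeyframe("shotCamera", attribute="rotateX",
#                    time=frame, value=tilt)
# ("shotCamera" is a hypothetical node name for illustration.)
```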

I also managed to get the viewCallbackTest installed, but when I invoke it,
Maya crashes silently. I will hopefully get more time to dig into this
today.

Since it appears that Maya is mostly built from MEL scripts on top of a
massive library of DLLs, we may be able to pick and choose which components
we need.

BTW, I had a look at the pymel docs. Amazing and wonderful work.

Cheers,
Steve

On Fri, Feb 13, 2009 at 9:13 AM, Matthew Chapman <[email protected]> wrote:

> Steve,
>
> I was once working on a project similar to this, but dealing with real-time
> point cloud data. When we wrote our tool set we originally planned on using
> Maya but ended up using MotionBuilder because it had more provisions for
> dealing with streaming mocap data. I could see it working with a callback
> mechanism like that, but I think it might get hairy dealing with DG/time
> updates. I would check out the mocap section in the devkit. They have an
> example of a mocap server that reads system time and updates a rotation
> based on seconds, minutes, and hours. You might be able to reuse that code
> to read from your stream and send it into Maya as a device using the
> defineDataServer command. I have never worked with a mocap-stream-to-Maya
> implementation, so all this info is part speculation.
>
> Matt
>
>
> On Wed, Feb 11, 2009 at 8:48 PM, Steve Wart <[email protected]> wrote:
>
>> Sorry to reply to my own note, but I've had a look at the viewCallbackTest
>> example plugin
>> http://me.autodesk.jp/wam/maya/docs/Maya2009/API/view_callback_test_8cpp-example.html
>> and it seems I should be able to do everything I need to do with the Maya
>> API.
>>
>> I built the example but when I install the plugin I get the error "Error:
>> does not match the current architecture" - presumably this is because I'm
>> using the 30-day trial version (at least according to this discussion thread
>>
>> https://collada.org/public_forum/viewtopic.php?f=12&t=1022&sid=51f6f06403a8a8b76f177593e8254145&start=30
>> )
>>
>> Can anyone confirm that this is in fact the case?
>>
>> Steve
>>
>>
>> On Wed, Feb 11, 2009 at 5:49 PM, Steve Wart <[email protected]> wrote:
>>
>>> I'm very new to Maya, but I've been reassured that what we have been
>>> asked to do is possible. Hopefully someone here can help point us in the
>>> right direction.
>>>
>>> We have an external camera which is going to be sending location, pan,
>>> tilt and zoom data over the network. We would like to use this data to
>>> manipulate a camera in Maya in real-time and render the results on the
>>> screen. This will be used to preview a video signal combined with the
>>> rendered image.
>>>
>>> There is no need to save the rendered results to a file; in fact, if that
>>> would slow us down, we would prefer to avoid it altogether. We are thinking
>>> it might be better to send the video signal directly to tape,
>>> although we do need to save the animation keyframe data so we can do a final
>>> render for post-production.
>>>
>>> We have some simple test harnesses in Python and C++ that can read the
>>> location and timecode data from the network, and I've managed to move the
>>> camera around on the screen using another script. One problem I'm having is
>>> that the script seems to be running in the same thread as the GUI,
>>> so the GUI hangs until the script has finished processing all of its commands.
>>>
>>> Is it possible to have a script running in the background that
>>> continually updates the camera attributes, progresses an animation according
>>> to the timecode data in my script, and renders the resulting image in
>>> real-time? It doesn't need to be a high-quality render. I've searched
>>> through heaps of tutorials, but there seems to be little about driving
>>> animation in real-time.
>>>
>>> Also, I am hoping it would be possible to texture a video frame from the
>>> camera within each animation frame, I suppose in an orthographic projection
>>> in front of the rendered model. I can get the frame data from the video
>>> signal, but is it possible with the C++ API to interact with the frame
>>> buffer in Maya during the render?
>>>
>>> Hopefully my description isn't too confusing. If you have any suggestions
>>> as to a high-level approach (or warnings about specific technical pitfalls),
>>> I'd love to hear them.
>>>
>>> Thanks,
>>> Steve
>>>
>>
>>
>

--~--~---------~--~----~------------~-------~--~----~
Yours,
Maya-Python Club Team.
-~----------~----~----~----~------~----~------~--~---