Sorry to reply to my own note, but I've had a look at the viewCallbackTest
example plugin
(http://me.autodesk.jp/wam/maya/docs/Maya2009/API/view_callback_test_8cpp-example.html)
and it seems I should be able to do everything I need to do with the Maya
API.
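For reference, the heart of that example boils down to registering a
post-render callback on a model panel. Here's a minimal Python sketch of the
same idea (the panel name "modelPanel4" is an assumption, and the
registration itself only works inside a running Maya session; outside Maya
only the counter logic runs):

```python
import collections

# Tally of viewport refreshes per panel; stands in for real per-frame work.
render_stats = collections.Counter()

def on_post_render(panel_name, client_data):
    # Invoked after each refresh of the given 3D view; this is where
    # per-frame work (e.g. compositing the video frame) would go.
    render_stats[panel_name] += 1

def install_callback(panel="modelPanel4"):
    # Requires a running Maya session.  Returns a callback id that should
    # later be removed with OpenMaya.MMessage.removeCallback.
    import maya.OpenMayaUI as omui
    return omui.MUiMessage.add3dViewPostRenderMsgCallback(
        panel, on_post_render)
```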

I built the example, but when I load the plugin I get the error "Error:
does not match the current architecture". Presumably this is because I'm
using the 30-day trial version (at least according to this discussion
thread:
https://collada.org/public_forum/viewtopic.php?f=12&t=1022&sid=51f6f06403a8a8b76f177593e8254145&start=30).

Can anyone confirm that this is in fact the case?

Steve

On Wed, Feb 11, 2009 at 5:49 PM, Steve Wart <[email protected]> wrote:

> I'm very new to Maya, but I've been reassured that what we have been asked
> to do is possible. Hopefully someone here can help point us in the right
> direction.
>
> We have an external camera that will be sending location, pan, tilt and
> zoom data over the network. We would like to use this data to manipulate a
> camera in Maya in real time and render the results on the screen. This
> will be used to preview a video signal combined with the rendered image.
>
> There is no need to save the rendered results to a file; in fact, if that
> would slow us down, we would prefer to avoid it altogether. It might even
> be better to send the video signal directly to tape, although we need to
> save the animation keyframe data so we can do a final render for
> post-production.
>
> We have some simple test harnesses in Python and C++ that can read the
> location and timecode data from the network, and I've managed to move the
> camera around on the screen using another script. One problem I'm having
> is that the script seems to run in the same thread as the GUI, so the GUI
> hangs until the script has finished processing all of its commands.
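One common pattern for that blocking problem is to do the network reads on a
worker thread and hand parsed packets to the main thread through a queue,
which the GUI thread then drains on an idle/timer event instead of blocking.
A rough sketch, assuming a "timecode,x,y,z,pan,tilt,zoom" line format (the
packet format and field order are my invention):

```python
import queue
import threading

packet_queue = queue.Queue()

def parse_packet(line):
    # "01:00:00:12,1.0,2.0,3.0,10.0,-5.0,50.0" -> (timecode, [floats])
    tc, *values = line.strip().split(",")
    return tc, [float(v) for v in values]

def reader(lines):
    # Stand-in for a socket read loop running on a worker thread:
    # parse each packet and hand it off without touching the UI.
    for line in lines:
        packet_queue.put(parse_packet(line))

def drain():
    # Called periodically from the main thread: keep only the most recent
    # packet and drop stale ones, so the camera never lags the feed.
    latest = None
    while not packet_queue.empty():
        latest = packet_queue.get_nowait()
    return latest
```

Inside Maya, the drain step could be driven from an idle-event scriptJob (or
`maya.utils.executeDeferred`) rather than a busy loop, so camera attribute
edits happen on the main thread.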
>
> Is it possible to have a script running in the background that continually
> updates the camera attributes, progresses an animation according to the
> timecode data in my script, and renders the resulting image in real time?
> It doesn't need to be a high-quality render. I've searched through heaps
> of tutorials, but there seems to be little about driving animation in real
> time.
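Progressing the animation from timecode largely comes down to converting each
incoming timecode into a frame number and setting the current time on every
update. A minimal sketch (non-drop-frame timecode and a fixed frame rate are
assumed):

```python
def timecode_to_frame(tc, fps=25):
    # "HH:MM:SS:FF" -> absolute frame count at the given frame rate.
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff
```

Inside Maya the result could then be applied with something like
`maya.cmds.currentTime(frame, edit=True)`, which also evaluates any keyed
animation at that frame.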
>
> Also, I'm hoping it will be possible to texture each animation frame with
> a video frame from the camera, presumably as an orthographic projection in
> front of the rendered model. I can get the frame data from the video
> signal, but is it possible with the C++ API to interact with Maya's frame
> buffer during the render?
>
> Hopefully my description isn't too confusing. If you have any suggestions
> as to a high-level approach (or warnings about specific technical pitfalls),
> I'd love to hear them.
>
> Thanks,
> Steve
>

Yours,
Maya-Python Club Team.