Hi Karan,

I'm trying to create a simple live video DJ application. Basically, it takes 4 inputs from 4 cameras and lets me switch between them (nothing special). I have the Quartz Composer part done, but now I need to put it in an application. Having never really worked with PBuffers or live cameras (I have messed around with QCRenderer), I need some help.

1) I'm using quartz composer to render the video to be displayed at 1024x768, with each of the 4 inputs showing below it at 256x192. My idea is to render this all in a viewport of 1024x960, and then crop it and put each section into a window, scaling the large video to also put a preview onto the controller screen. Is this the best way to do this, and how do I go about the cropping and embedding?

You can indeed render it all in a PBuffer and then show subregions of the associated texture in an NSOpenGLView, after making sure that all contexts are shared. You can also have one composition + QCView per window; this should be as fast or faster, and is easier to do.


2) I know that uncompressed video takes up a lot of space over FireWire. For 4 cameras that send video at an unknown size (I think they'll be analog, so they'll come in through component cables), what do I need to do to make sure that I can get all the streams without delay and at a resolution of at least 1024x768? Is there any hardware I can use so that I don't have to buy an extra FireWire card for my computer? Or will this not be a problem?

If you have 4 FireWire cameras each sending 1024x768 at 30 fps, that will indeed be a lot of data to process. It won't fit on a FW400 bus, and maybe not even on a single FW800 bus. From memory you can fit three 640x480 streams at 30 fps on a FW400 bus; there is a graph on pointgreyresearch.com that you might want to check out. You'd also need FW800 cameras, as a single FW400 peripheral on a FW800 bus will make the whole bus run at 400 Mb/s.
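A quick back-of-the-envelope calculation makes the point. This sketch assumes uncompressed YUV 4:2:2 video (2 bytes per pixel); real FireWire isochronous throughput is somewhat below the nominal 400/800 Mbit/s figures, so these are optimistic bounds:

```python
# Rough FireWire bandwidth check for uncompressed video streams.
# Assumption: YUV 4:2:2 at 2 bytes/pixel; nominal FW400 = 400 Mbit/s,
# FW800 = 800 Mbit/s (actual usable isochronous bandwidth is lower).

def stream_mbits(width, height, fps, bytes_per_pixel=2):
    """Raw bandwidth of one uncompressed video stream, in Mbit/s."""
    return width * height * bytes_per_pixel * fps * 8 / 1_000_000

one_camera = stream_mbits(1024, 768, 30)   # ~377 Mbit/s per camera
four_cameras = 4 * one_camera              # ~1510 Mbit/s total

print(f"per camera:   {one_camera:.0f} Mbit/s")
print(f"four cameras: {four_cameras:.0f} Mbit/s")
print("fits on one FW400 bus?", four_cameras < 400)
print("fits on one FW800 bus?", four_cameras < 800)
```

So a single 1024x768 camera nearly saturates FW400 by itself, and four of them exceed even FW800 severalfold, which is why multiple buses (extra cards) are unavoidable for uncompressed streams at that resolution.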

Anyway, I don't see how you'll do without an extra card.

But all that is really more a question for the QuickTime or FireWire people.


3) I want to have some way of creating a table in the application that you can put data in, that will be sent to the quartz composition for use whilst rendering. I need it to be able to reorder the rows, but other than that I just need to be able to send all the data in multidimensional arrays. I know of the Chart application from the examples, but how do I enable reordering?


The Chart sample reads XML data. Either you reorder the XML data in your app and pass it to QC, or you do it from within QC by changing the indexes given to the Structure Index Member patch inside the iterator (0->1, 1->2 and 2->0, for instance).
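As a sketch of the first approach, reordering on the application side amounts to applying a permutation to the rows before handing the structure to QC. The row contents and field names here are made up for illustration:

```python
# Reorder table rows on the app side before passing them to QC.
# The permutation 0->1, 1->2, 2->0 mirrors the index remapping that
# the Structure Index Member patch would otherwise perform inside
# the iterator. (Row data below is hypothetical.)

rows = [
    {"name": "camera A", "gain": 1.0},
    {"name": "camera B", "gain": 0.8},
    {"name": "camera C", "gain": 1.2},
]

# permutation[i] = index of the original row that should land in slot i
permutation = [1, 2, 0]
reordered = [rows[i] for i in permutation]

for row in reordered:
    print(row["name"])
```

Whether you remap indexes inside QC or reorder the data before it reaches the composition, the iterator sees the same sequence; doing it in the app keeps the composition simpler.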

4) The Video Input patch in Quartz Composer has that wonderful input selection tool, but how do I get that in my application? Once again, I know there's an example of it, but I can't make heads or tails of how it works. If someone can give me a prod in the right direction on this, I can probably take it from there.

You can't change patch settings programmatically since they can't be published, so you must either save the inputs in the composition, or feed the frames yourself (as you might have seen in some of the examples) and do the input selection in your own code.

Hope this helps,
Kevin


---------
Kevin Quennesson
Quartz Composer Team            Apple Inc.

_______________________________________________
Do not post admin requests to the list. They will be ignored.
Quartzcomposer-dev mailing list      ([email protected])
Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/quartzcomposer-dev/archive%40mail-archive.com
