I've been looking a little more into this project I have in mind. Broadly, I 
want to take an input movie, process it (mostly by adding overlays to the 
video, but potentially with any other processing QC can do), and write the 
result to a new movie file. The audio should pass through unaltered.

It seems increasingly evident that I'm going to have to write my own app to 
host the composition. I imagine a simple app that lets me choose a 
composition and an input movie and specify an output movie filename; it 
would then process the input movie as described above.

Can someone tell me, in high-level terms, how I would do this? A QCRenderer 
seems obviously necessary. I saw a sample app that exports a TIFF file for 
each frame; perhaps I can start with that and create an H.264 movie instead 
(would I use Core Video for that?). I suppose the best approach is to process 
the input movie synchronously, controlling the time base myself, but I'm not 
sure how to do that; the end result I want is to process each frame of the 
input movie separately.
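
To make the question concrete, here's the kind of loop I'm picturing, as a 
rough, untested Swift sketch built on AVFoundation's 
AVAssetReader/AVAssetWriter (which may or may not be the right framework for 
this; the imported Swift names for the QCRenderer calls can vary by SDK, and 
"inputImage" is just a stand-in for whatever image input the composition 
actually publishes):

    import AVFoundation
    import Quartz  // QCRenderer, QCComposition

    func processMovie(input: URL, compositionPath: String, output: URL) throws {
        let asset = AVURLAsset(url: input)
        guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }
        let size = videoTrack.naturalSize

        // Reader: decode the source frames as BGRA pixel buffers.
        let reader = try AVAssetReader(asset: asset)
        let frames = AVAssetReaderTrackOutput(
            track: videoTrack,
            outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                             kCVPixelFormatType_32BGRA])
        reader.add(frames)

        // Writer: encode the rendered frames as H.264.
        let writer = try AVAssetWriter(outputURL: output, fileType: .mov)
        let videoInput = AVAssetWriterInput(
            mediaType: .video,
            outputSettings: [AVVideoCodecKey: AVVideoCodecType.h264,
                             AVVideoWidthKey: Int(size.width),
                             AVVideoHeightKey: Int(size.height)])
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: videoInput, sourcePixelBufferAttributes: nil)
        writer.add(videoInput)

        // Offscreen QC renderer at the movie's pixel size.
        guard let composition = QCComposition(file: compositionPath),
              let renderer = QCRenderer(offScreenWith: size,
                                        colorSpace: CGColorSpaceCreateDeviceRGB(),
                                        composition: composition) else { return }

        reader.startReading()
        writer.startWriting()
        writer.startSession(atSourceTime: .zero)

        // Synchronous, frame-by-frame loop: the time base is under my
        // control because I render at each source frame's own timestamp.
        while let sample = frames.copyNextSampleBuffer() {
            guard let pixels = CMSampleBufferGetImageBuffer(sample) else { continue }
            let pts = CMSampleBufferGetPresentationTimeStamp(sample)

            // Hand the source frame to the composition's published image
            // input ("inputImage" is hypothetical -- use your own key).
            renderer.setValue(pixels, forInputKey: "inputImage")
            renderer.render(atTime: CMTimeGetSeconds(pts), arguments: nil)

            // Pull the rendered frame back out and give it to the encoder.
            if let rendered = renderer.createSnapshotImage(of: "CVPixelBuffer") {
                while !videoInput.isReadyForMoreMediaData { usleep(1000) }
                adaptor.append(rendered as! CVPixelBuffer, withPresentationTime: pts)
            }
        }

        videoInput.markAsFinished()
        // (audio pass-through would happen here, before finishing -- see below)
        writer.finishWriting { }
    }

If that's roughly right, pulling frames with copyNextSampleBuffer() is what 
gives me the synchronous, one-frame-at-a-time control I'm after, since each 
render happens at the frame's own presentation timestamp rather than against 
a real-time clock.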

I also suppose my little host app will have to extract the audio from the 
input movie file separately and add it to the output movie file.
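
For the audio, something like this is what I have in mind, continuing the 
sketch above (again untested). As far as I can tell, nil outputSettings on 
both the reader output and the writer input means the compressed samples are 
copied straight across with no decode/re-encode; the writer input would be 
created with nil outputSettings and added to the writer before 
startWriting(), alongside the video input:

    import AVFoundation

    // Copy the audio track unaltered. `audioInput` is assumed to be an
    // AVAssetWriterInput(mediaType: .audio, outputSettings: nil) that was
    // added to the writer before startWriting() was called.
    func copyAudio(from asset: AVAsset, into audioInput: AVAssetWriterInput) throws {
        guard let track = asset.tracks(withMediaType: .audio).first else { return }

        // A dedicated reader keeps this loop independent of the video reader.
        let reader = try AVAssetReader(asset: asset)

        // nil outputSettings: samples come out in their stored, compressed form.
        let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
        reader.add(output)
        reader.startReading()

        while let sample = output.copyNextSampleBuffer() {
            while !audioInput.isReadyForMoreMediaData { usleep(1000) }
            audioInput.append(sample)
        }
        audioInput.markAsFinished()
    }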

Any suggestions on how to proceed would be much appreciated. If there's some 
obvious and simple way to do this that I'm unaware of, please let me know.

Thanks!

-- 
Rick
