Hello!

I'm looking to merge several audio and video sources into one. The sources
need to start at different time offsets. My guess is to use the avfilter
library (libavfilter). Would that do the trick? Is that the right direction?

If libavfilter is the right direction, I would greatly appreciate it if
someone could provide some pseudocode showing how it could be done. I'm
having a hard time finding documentation or examples on using filters.

I have looked at the filtering_video.c example, but I'm not 100% sure how
to set everything up.

How do I feed the various audio and video frames to the different input
buffer filters? I'm also not sure how the offsets are set or handled, or
whether libavfilter takes care of that side of things at all.

Thanks for any help or direction about this.
- Jona



--
View this message in context: 
http://libav-users.943685.n4.nabble.com/Merging-various-audio-and-video-sources-into-one-using-avfilter-lib-tp4657898.html
Sent from the libav-users mailing list archive at Nabble.com.
_______________________________________________
Libav-user mailing list
[email protected]
http://ffmpeg.org/mailman/listinfo/libav-user