Re: [FFmpeg-user] Question about video compositing
On Mon, Jan 9, 2017, at 05:40 PM, Alex Speller wrote:

> Ah, thanks a lot for the suggestion, but I should have been clearer that I need to do this in an automated fashion for arbitrary sets of videos, so it has to be command-line (or a library, I guess) so that I can integrate it into an automated pipeline in my app.

melt might be a more appropriate tool for this than ffmpeg. It's used by Kdenlive.

___
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
Re: [FFmpeg-user] Question about video compositing
There is probably a way to do it directly with ffmpeg on the command line, but as I'm not an expert on that, I won't confuse you with guesses.

I have figured out how to generate raw video frames with a C++ program and have ffmpeg convert them into a video. My ffmpeg command looks like this:

./shm | ffmpeg -f rawvideo -pixel_format bgra -video_size 1024x768 -framerate 30 -i - -vcodec huffyuv overlay.avi

The ./shm program is my C++ program. If you generate PNGs from your mpeg videos, you can read those in and generate the composite video. I use Qt a lot, and it would make the command-line program easier (it handles reading PNGs, compositing them, etc.).

Josh

> Ah, thanks a lot for the suggestion, but I should have been clearer that I need to do this in an automated fashion for arbitrary sets of videos, so it has to be command-line (or a library, I guess) so that I can integrate it into an automated pipeline in my app.
>
> Thanks, Alex
>
> On Tue, Jan 10, 2017 at 3:37 AM Steve Boyer wrote:
>> [...]
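The frames-to-PNGs round trip Josh describes can be sketched end to end with ffmpeg alone. This is an illustrative sketch, not Josh's actual pipeline: it generates a synthetic input, uses a no-op copy where the real per-frame compositing would happen, and muxes the original audio back in at the end:

```shell
mkdir -p frames out_frames

# Synthetic 1-second test input with audio (stand-in for a real clip).
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=30 \
       -f lavfi -i sine=frequency=440:duration=1 \
       -c:v libx264 -pix_fmt yuv420p -c:a aac in.mp4

# 1) Extract the video stream to numbered PNG frames.
ffmpeg -y -i in.mp4 frames/%05d.png

# 2) Composite the frames here (ImageMagick, Qt, ...). As a placeholder,
#    reuse them unchanged.
cp frames/*.png out_frames/

# 3) Re-encode the frames and mux the original audio back in.
ffmpeg -y -framerate 30 -i out_frames/%05d.png -i in.mp4 \
       -map 0:v -map 1:a -c:v libx264 -pix_fmt yuv420p -shortest out.mp4
```

The main costs of this route are disk I/O and an extra decode/encode cycle, but it makes the compositing step arbitrarily programmable.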
Steve
Re: [FFmpeg-user] Question about video compositing
Ah, thanks a lot for the suggestion, but I should have been clearer that I need to do this in an automated fashion for arbitrary sets of videos, so it has to be command-line (or a library, I guess) so that I can integrate it into an automated pipeline in my app.

Thanks,
Alex

On Tue, Jan 10, 2017 at 3:37 AM Steve Boyer wrote:
> [...]
Re: [FFmpeg-user] Question about video compositing
> Any suggestions on if either of these approaches is better, or any alternatives? Thanks!

Hi! I've done something similar to this, but I ended up using a non-linear video editor. Specifically, I used Kdenlive. It can do keyframe animation, so combine that with fade-ins/fades from black (audio/video filters) as well as fade-outs/fades to black, and you can combine multiple tracks into a single output video, with animations/fades when a stream ends.

The downsides: Kdenlive, despite being the best video editor on Linux (IMHO), is a little buggy; rendering the output file is entirely CPU-based; for stability it is recommended to use a single thread; you will have to put things together and time them manually; and you need a GUI to do it all.

I'd be happy to help with suggestions if you go this route, but I understand if you want to go a different way (and I'd be interested if anyone has other suggestions for how this can be accomplished via FFmpeg or CLI tools).

Steve
[FFmpeg-user] Question about video compositing
I have a question about video compositing. I've included the text of the question below, but I've also put it in a gist for easier-to-read formatting here: https://gist.github.com/alexspeller/aefdd5a6d7100d28d0bbc4838527f797

I have multiple mp4 video files and I want to composite them into a single video. Each stream is an mp4 video. They are of different lengths, and each file also has audio. The tricky thing is, I want the layout to change depending on how many streams are currently visible. As a concrete example, say I have 3 video files:

| File  | Duration | Start | End |
|-------|----------|-------|-----|
| a.mp4 | 30s      | 0s    | 30s |
| b.mp4 | 10s      | 10s   | 20s |
| c.mp4 | 15s      | 15s   | 30s |

So at t=0 seconds, I want the video to look like this:

```
+-----------------+
|                 |
|                 |
|      a.mp4      |
|                 |
|                 |
+-----------------+
```

At t=10s, I want the video to look like this:

```
+--------+--------+
|        |        |
|        |        |
| a.mp4  | b.mp4  |
|        |        |
|        |        |
+--------+--------+
```

At t=15s, I want the video to look like this:

```
+--------+--------+
|        | b.mp4  |
|        |        |
| a.mp4  +--------+
|        | c.mp4  |
|        |        |
+--------+--------+
```

And at t=20s until the end, I want the video to look like this:

```
+--------+--------+
|        |        |
|        |        |
| a.mp4  | c.mp4  |
|        |        |
|        |        |
+--------+--------+
```

Ideally there would be some animated transitions between the states, but that's not essential.

I have found two possible approaches that might work, but I'm not sure which is best. The first is using [filters](https://trac.ffmpeg.org/wiki/Create%20a%20mosaic%20out%20of%20several%20input%20videos) to achieve the result, but I'm not sure if it will cope well with (a) the changing layouts and (b) keeping the audio free of artefacts when the layout changes. The other approach I thought of would be exporting all frames to images, building new frames with ImageMagick, and then layering the new frames on top of the audio like in [this blog post](https://broadcasterproject.wordpress.com/2010/05/18/how-to-layerremix-videos-with-free-command-line-tools/).
Any suggestions on if either of these approaches is better, or any alternatives? Thanks!
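For reference, the filter approach from the question can be sketched in a single ffmpeg invocation: delay each input with setpts, then switch overlays on and off with timeline `enable` expressions. This is an illustrative sketch, not a tested answer from the thread: it builds synthetic, silent stand-ins for the three inputs, and it keeps c.mp4 in the lower-right quadrant throughout rather than expanding it at t=20s:

```shell
# Synthetic, silent stand-ins for the three inputs (30 s, 10 s, 15 s).
ffmpeg -y -f lavfi -i testsrc=duration=30:size=640x480:rate=25 a.mp4
ffmpeg -y -f lavfi -i testsrc2=duration=10:size=640x480:rate=25 b.mp4
ffmpeg -y -f lavfi -i testsrc2=duration=15:size=640x480:rate=25 c.mp4

# a fills the left half for the whole 30 s; b and c are shifted to start
# at t=10 and t=15 and overlaid on the right half, each drawn only while
# its enable expression is true.
ffmpeg -y -i a.mp4 -i b.mp4 -i c.mp4 -filter_complex "\
color=c=black:s=1280x480:r=25:d=30[base];\
[0:v]scale=640:480[a];\
[1:v]scale=640:240,setpts=PTS+10/TB[b];\
[2:v]scale=640:240,setpts=PTS+15/TB[c];\
[base][a]overlay=x=0:y=0[v1];\
[v1][b]overlay=x=640:y=0:enable='between(t,10,20)'[v2];\
[v2][c]overlay=x=640:y=240:enable='between(t,15,30)'" \
-c:v libx264 -pix_fmt yuv420p out.mp4
```

Real inputs with audio could be mixed in the same graph with adelay plus amix. The true layout change at t=20s (c growing to fill the right half) is the hard part: scale cannot vary over time, so the usual workaround is to render each layout segment separately and join them with the concat filter or demuxer.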