On Wed, Aug 17, 2011 at 3:40 PM, Louis Simons <[email protected]> wrote:
> I've been working on an open source video playout and real-time titling
> package similar to the features of Newtek's Video Toaster. I've been
> testing out pieces of implementation with Gstreamer as the backend and
> have been finding that it is very fragile when it comes to combining
> realtime (I've been using video4linux) and prerecorded sources (files).
> I've got a case of the grass is always greener and was hoping to give
> MLT a try, since looking at the documentation, it seems like a much
> simpler interface, without the background complexities of first
> mastering glib. On top of that, if I understand the build process
> right, MLT treats higher-level language bindings like first class
> citizens, and I would've loved to work with Gstreamer in Ruby if only
> the bindings were up to date.

Not exactly first-class, since some C features are not fully available in
Ruby, and the bindings are generated, so they did not get the full
"ruby-way" treatment. You will be OK as long as you do not need to get
the raw audio and video data into Ruby. Of course, if/when you ever need
to, you can improve the binding in that area. At least the bindings are
always up to date when you enable them in your build, but typically only
the Python binding is available in distro packages because OpenShot
needs it.

> From reading the mailing list archives, I found a reference to February
> 2011 saying that live sources were pretty badly broken. I was wondering
> if that is still the case?

That was really in reference to libavdevice live inputs, and that is no
longer true. Project sponsors funded fixing that, and now video4linux2,
ALSA, and network streams work well. There is also DeckLink SDI and HDMI
input. DV/FireWire works via pipe input, but HDV is flaky.

> Also, from the wiki fundamentals of MLT page, I couldn't find much
> information on using live sources with transitions or mixing live and
> file producers. Is mixing live and file producers outside the scope of
> MLT?

Some info about live sources is in the FAQ. For avformat-based sources,
the syntax follows the FFmpeg docs fairly closely; for DeckLink, you use
"decklink:". You can mix file and live sources very well. The watermark
filter is simpler to use than multiple tracks and a composite transition:

  melt noise: -filter watermark:demo/watermark1.png
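For example, something along these lines should work to put a file-based
overlay on a live capture (an untested sketch off the top of my head; the
device path is just a placeholder, and I am assuming the FFmpeg-style
"video4linux2:" input name is accepted by the avformat producer as
mentioned above):

  melt video4linux2:/dev/video0 -filter watermark:demo/watermark1.png

If you really want the overlay on its own track, the multitrack form with
an explicit composite transition is roughly:

  melt video4linux2:/dev/video0 \
    -track demo/watermark1.png \
    -transition composite a_track=0 b_track=1

plus whatever composite geometry you need to position and scale the
overlay.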
> I'm going to be digging into MLT to learn more, but would greatly
> appreciate if someone could warn me I'm barking up the wrong tree with
> the framework.

It is not the wrong tree at all, but prepare to do a lot of digging, as
there are not a lot of people here to help answer questions, and I am
already stretched thin.

--
+-DRD-+
