Ok, thanks for the feedback then; I will continue developing my stuff as-is
in that case.
Seablade
On Saturday, April 20, 2013, Dan Dennedy wrote:
> Yeah, quite simply, MLT is not well suited to your needs. It
> converts all inputs to uncompressed (if not already). It does not
> support the notions of "-vcodec copy" or lossless editing of
> compressed inputs ("edit without re-encoding by splicing on a
> keyframe"). It is overkill for something like capturing a stream to
> disk and relaying it over the network. Sorry, it sounds like you need
> something more custom against libav* and the Decklink API.
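>
> (For illustration, a minimal sketch of that kind of stream-copy cut done
> outside MLT, with ffmpeg called from Python; the filenames and timestamps
> are placeholders, and the cut lands on the nearest keyframe rather than an
> exact frame:)
>
>   import subprocess
>
>   # ffmpeg stream copy ("-vcodec copy"): no decode, no re-encode
>   subprocess.check_call([
>       "ffmpeg",
>       "-ss", "00:10:00",    # seek snaps to a keyframe when stream-copying
>       "-i", "service.mp4",  # placeholder input
>       "-t", "00:30:00",     # keep roughly 30 minutes
>       "-c", "copy",         # equivalent to -vcodec copy -acodec copy
>       "cut.mp4",            # placeholder output
>   ])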
>
> On Fri, Apr 19, 2013 at 6:00 PM, Thomas Vecchione <seabla...@gmail.com>
> wrote:
> >
> >
> >
> > On Fri, Apr 19, 2013 at 8:09 PM, Dan Dennedy <d...@dennedy.org> wrote:
> >>
> >>
> >> Does it? MLT is not really intended to be used for capturing an IP
> >> stream and then relaying it.
> >>
> >
> > Maybe I am assuming some functionality that isn't there, but since there
> > is, I believe I read, a wrapper around ffmpeg functionality, I assumed that
> > as long as the functionality existed in ffmpeg it could conceivably be used
> > like this.
> >
> >>
> >>
> >> It depends; that's hard to answer, but doubtful that a novice can do it.
> >>
> >
> > Ok, that gives me something to go off of; it means I probably couldn't get
> > it done too quickly, at least, which gives me a near-term plan for this
> > software.
> >
> >>
> >> > Are producers following a defined API that might be documented
> >> > somewhere outside of code, or are they all hardcoded into the framework
> >> > and not following an API?
> >>
> >> The API is documented under API Reference in the Documentation section
> >> of the web site or just look at the source code of existing producers.
> >>
> >
> > Thank you, I was looking at the code of the Decklink producer; I will add
> > in the doxygen documentation as well, which may clear up some questions I
> > had.
> >
> >>
> >> > I think I would rather switch my software to using mlt-framework as a
> >> > backend since it handles what I want and would be much more flexible,
> >> > but I would certainly have to be able to add in support for the ATEM
> >> > hardware encoders we are using in order to do this. Part of the issue
> >> > however may be
> >>
> >> Often hardware encoders output an IP stream. Software does not need to
> >> specifically handle an ATEM hardware encoder unless perhaps it is a PCI
> >> card-based encoder.
> >
> >
> > Correct, or rather in this case it is an IP stream from a USB device via
> > Blackmagic's proprietary server software, which is what the hold up on
> Linux
> > is I believe.
> >
> >>
> >>
> >> > that there doesn't seem to be any support for them on Linux, only OS X
> >> > (where I am developing) and Windows.
> >> >
> >> > Alternatively, is it possible to route an MPEG-TS encapsulated stream
> >> > with h.264 and AAC into MLT, since I already have that code written in
> >> > Python
> >>
> >> Yes, figure out how to play your stream with ffplay or avplay and then
> >> try giving the same URL to melt. Is it simply TS/UDP?
> >>
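> >> (A minimal sketch of that test, assuming a plain TS/UDP stream; the URL
> >> is a placeholder and melt needs its avformat module for this to work:)
> >>
> >>   import subprocess
> >>
> >>   URL = "udp://239.0.0.1:1234"  # placeholder multicast address/port
> >>
> >>   # 1) confirm ffplay can decode the stream as-is
> >>   subprocess.check_call(["ffplay", URL])
> >>
> >>   # 2) then hand the very same URL to melt as a producer
> >>   # (or "avformat:" + URL if the loader does not pick it up by itself)
> >>   subprocess.check_call(["melt", URL])
> >>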
> >
> > Yes it is. The only thing that has caused me a hang-up thus far, and it
> > could be my lack of knowledge of socket programming, is that in Python, for
> > some reason, I haven't been able to create an identical second socket to
> > the same destination. I need to be able to send command strings through the
> > same socket, as well as send a string to identify the socket as a receiver
> > for data and have the data streamed back over it. Currently I am simply
> > filtering out known strings to the console for feedback, writing everything
> > else to disk, and feeding that to ffmpeg with a -re flag.
> >
> > Reading that statement I realize it may not be clear, so let me try to
> > explain the process for getting data from the encoder (obviously simplified
> > slightly):
> >
> > 1. Create a socket to the server/driver software
> > 2. Send command strings to set up the encoder with the appropriate settings
> > and to start the encoder
> > 3. Ideally create a new socket to receive data on, but right now I am using
> > the same socket
> > 4. Send a command string on this new socket telling it that we are a
> > receiver for the data, at which point it will begin streaming the data over
> > the socket to us
> > 5. Use the first socket to send command strings to stop or otherwise modify
> > the encoder as needed.
> >
> > So really I should be using two different sockets, but I haven't been able
> > to get two identical sockets created in Python for some reason. I am new to
> > socket programming, though, so I don't count out my inexperience on this
> > topic as the reason; in fact it almost certainly is the reason :) But even
> > using two sockets, I have to send a command string before the data will be
> > sent to the receiving socket, and it has to be sent out via the receiving
> > socket, which has proven difficult to manage so that I could feed the
> > output in realtime to ffplay/vlc/whatever. On the other hand, if I use the
> > same socket and simply filter the data coming in, it seems to work well
> > enough, and I have the data to do whatever I need to with it.
> >
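> > (To make that concrete, a rough Python sketch of the two-socket
> > arrangement I am describing; the host, port, and command strings are
> > placeholders, since the real ones belong to Blackmagic's protocol:)
> >
> >   import socket
> >
> >   HOST, PORT = "127.0.0.1", 9993       # placeholders for the server software
> >   SETUP_CMD    = b"<setup command>\n"  # placeholder command strings
> >   START_CMD    = b"<start command>\n"
> >   RECEIVER_CMD = b"<receiver announce>\n"
> >   STOP_CMD     = b"<stop command>\n"
> >
> >   # Socket 1: control channel - configure and start the encoder (steps 1-2)
> >   control = socket.create_connection((HOST, PORT))
> >   control.sendall(SETUP_CMD)
> >   control.sendall(START_CMD)
> >
> >   # Socket 2: data channel - announce ourselves as a receiver (steps 3-4)
> >   data = socket.create_connection((HOST, PORT))
> >   data.sendall(RECEIVER_CMD)
> >
> >   # Read the TS bytes; write to disk, or pipe them into ffmpeg/melt instead
> >   with open("capture.ts", "wb") as out:
> >       while True:
> >           chunk = data.recv(65536)
> >           if not chunk:
> >               break
> >           out.write(chunk)
> >
> >   # Socket 1 stays free for stop/modify commands while data flows (step 5)
> >   control.sendall(STOP_CMD)
> >   control.close()
> >   data.close()
> >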
> >>
> >> > and just use mlt for consumers (Network stream and disk in my case)?
> >> > Then a
> >>
> >> Do you realize the avformat consumer that does this needs to do
> >> encoding (unless you are writing uncompressed to disk, which is
> >> technically not encoding)?
> >
> >
> > Was not aware of that. So there is no way to tell the avformat consumer to
> > copy the existing encoding? (i.e. ffmpeg's -c:v copy or -vcodec copy
> > options?)
> >
> >>
> >>
> >> It will help if you explain at a higher level what you want to do
> >> instead of perhaps what you think you want to do. :-) Does your ATEM
> >> have HDMI and SDI outputs?
> >
> >
> > In this case, no. There are two pieces of hardware that qualify for what I
> > am doing: the Blackmagic Design H.264 Pro Recorder (which is what I have)
> > and the ATEM TV Studio (which I do not have, but which also acts as a video
> > mixer with HDMI and SDI output). The H.264 Pro Recorder that I have is
> > merely an encoder; it does not have any video outputs whatsoever. What I am
> > trying to do is take the network stream that is put out by the server
> > software installed as part of the 'driver' for this, and do two things with
> > it.
> >
> > One is to write a copy to disk, as-is at least while it is being written,
> > but then ideally re-encapsulated into a .mp4 container upon completion of
> > the writing. This does require some re-ordering of how the AAC data is
> > muxed, according to what little I know of it, as I do need to pass a
> > specific flag to ffmpeg to do the conversion (-absf adtstoasc), but that is
> > the extent of it; otherwise I end up with a nice clean .mp4 that can be
> > played anywhere I have thus far tested (though I am primarily concerned
> > with VLC for this purpose).
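> >
> > (A minimal sketch of that re-encapsulation step as run from Python; the
> > filenames are placeholders, and it assumes the bitstream filter ffmpeg
> > names aac_adtstoasc, which newer builds select with -bsf:a rather than
> > -absf:)
> >
> >   import subprocess
> >
> >   SRC = "capture.ts"    # placeholder: the as-is copy written to disk
> >   DST = "service.mp4"   # placeholder: the clean container for playback
> >
> >   subprocess.check_call([
> >       "ffmpeg",
> >       "-i", SRC,
> >       "-c", "copy",               # keep the h.264/AAC streams untouched
> >       "-bsf:a", "aac_adtstoasc",  # repack ADTS AAC headers for mp4
> >       DST,
> >   ])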
> >
> > The second step is to take the stream and stream it out to a Wowza server
> > online via RTSP live. I do this for bandwidth reasons: I have two other
> > campuses looking to connect to this stream (and we are looking to grow
> > further), and I don't have the bandwidth at our broadcasting campus to
> > maintain multiple copies of the stream. So they connect to the Wowza server
> > to watch the stream, and I can record on the Wowza server for yet another
> > copy of the file and a backup. And in the hopefully not too distant future
> > we will be moving to online streaming to the public as well, but getting
> > the rest of these goals complete needs to happen first.
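> >
> > (A rough sketch of that relay leg, assuming ffmpeg's RTSP muxer can
> > publish straight to the Wowza application; the host, port, and stream
> > name are placeholders:)
> >
> >   import subprocess
> >
> >   SRC = "udp://239.0.0.1:1234"                         # placeholder source
> >   DST = "rtsp://wowza.example.com:1935/live/myStream"  # placeholder target
> >
> >   subprocess.check_call([
> >       "ffmpeg",
> >       "-i", SRC,
> >       "-c", "copy",   # pass the hardware encoder's h.264/AAC through
> >       "-f", "rtsp",
> >       DST,
> >   ])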
> >
> > The third goal is to provide a confidence monitor via dual 8" monitors I
> > have rackmounted above the two Mac minis this is being used on (redundancy
> > of recordings and stream; this is essentially a mission-critical thing for
> > us, it has to work every time :). For right now I am just outputting the
> > entire desktop via HDMI, but the future goal is to utilize a BMD
> > mini-monitor or Decklink to output direct to SDI and use the desktop for
> > control only, thus an interest in the decklink capabilities of the
> > mlt-framework.
> >
> > The fourth goal would be to do a basic cut-on-a-keyframe edit, to be able
> > to trim a file without re-encoding for delivery to these campuses, in as
> > time-efficient a manner as possible. They usually only want the 'message'
> > portion of the file, which is about 30 minutes of a 1-hour-or-more file.
> > And because of time crunches, the smaller I make it the better, and
> > re-encoding it is out of the question, thus the use of a hardware encoder
> > for realtime encoding in this case.
> >
> > In the distant future, the ability to use our Decklink capture cards and
> > do realtime encoding on the computer would be nice as well, but it is not
> > highest on my priority list at this time. But this is another reason the
> > decklink producer ability of the mlt-framework was interesting to me.
> >
> > So for the most part, what I believe I would need is:
> >
> > 1. An ability to utilize avformat to restream the output to a network
> > destination without re-encoding
> > 2. An ability to write the stream to disk without re-encoding (this can be
> > outside of MLT honestly, especially if I don't treat the ATEM as an MLT
> > producer)
> > 3. An ability to take the stream and output to a confidence monitor window
> > on OS X; the sdl consumer seems like it should work for this. Audio is not
> > an absolute necessity at this time, though it would be nice in the future.
> >
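> > (A rough sketch of how items 1 and 2 might be handled outside MLT with a
> > single ffmpeg process writing two stream-copy outputs; the addresses and
> > filenames are placeholders:)
> >
> >   import subprocess
> >
> >   SRC     = "udp://239.0.0.1:1234"  # placeholder: stream from the encoder
> >   ARCHIVE = "capture.ts"            # item 2: as-is copy on disk
> >   RELAY   = "udp://10.0.0.2:5004"   # item 1: placeholder network target
> >
> >   subprocess.check_call([
> >       "ffmpeg", "-i", SRC,
> >       "-c", "copy", ARCHIVE,                # output 1: disk, no re-encode
> >       "-c", "copy", "-f", "mpegts", RELAY,  # output 2: restream, no re-encode
> >   ])
> >
> > (For item 3, a separate melt or ffplay process pointed at the relayed URL,
> > with melt's sdl consumer, could serve as the confidence window.)
> >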
> > What I would be looking for in the future:
> > 4. The ability to take a file (producer) and re-encapsulate it, with
> > minimal if any encoding, into a .mp4 container
> > 5. The ability to provide a confidence monitor via a BMD device over SDI
> > 6. The ability to do an edit without re-encoding by splicing on a keyframe
> > (perfect edits not needed for this particular task, just something close)
> > 7. The possible ability to use a standard Decklink as a producer and
> > provide realtime encoding on the computer via MLT before sending the output
> > to the above destinations.
> >
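> > (And for item 5, assuming MLT is built with its decklink module and a
> > device sits at index 0, something along these lines might drive the SDI
> > monitor; the source URL is a placeholder:)
> >
> >   import subprocess
> >
> >   SRC = "udp://239.0.0.1:1234"  # placeholder: file or relayed stream
> >
> >   # hand the program to the first Decklink/Mini Monitor output over SDI
> >   subprocess.check_call(["melt", SRC, "-consumer", "decklink:0"])
> >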
> > Thank you very much for your time!
> >
> > Seablade
>
>
>
> --
> +-DRD-+
>