Hi JS,

I'm just back online after a trip so I'm just starting to catch up.  Given
there is a lot left for us to learn about the ffmpeg API and the issues
surrounding video access, I think we'll need to subscribe to the ffmpeg
mailing list and see if it will provide some good advice.

Robert.
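
P.S. As a starting point for the rtsp path discussed below, I'd sketch it
roughly like this, using the ffmpeg calls from the current tree
(av_find_input_format(), av_open_input_file() and friends). This is
untested on my side and the exact signatures vary between ffmpeg
versions, so treat it as a sketch rather than working code:

```c
/* Sketch: force the RTSP demuxer by passing "rtsp" to
 * av_find_input_format() before opening the URL, instead of letting
 * ffmpeg probe the URL as if it were a local file.
 * Function names follow the ffmpeg API of this era
 * (av_open_input_file etc.); newer versions rename these. */
#include <libavformat/avformat.h>
#include <stdio.h>

int open_rtsp_stream(const char *url)
{
    AVFormatContext *ctx = NULL;
    AVInputFormat *fmt;

    av_register_all();                  /* register demuxers/protocols */

    fmt = av_find_input_format("rtsp"); /* force the RTSP input format */
    if (!fmt) {
        fprintf(stderr, "rtsp demuxer not built into this ffmpeg\n");
        return -1;
    }

    if (av_open_input_file(&ctx, url, fmt, 0, NULL) != 0) {
        fprintf(stderr, "could not open %s\n", url);
        return -1;
    }

    /* Read stream info so the codec inside the container
       (e.g. MPEG-4 video) gets identified. */
    if (av_find_stream_info(ctx) < 0) {
        fprintf(stderr, "could not find stream info\n");
        av_close_input_file(ctx);
        return -1;
    }

    av_close_input_file(ctx);
    return 0;
}
```

Whether this is enough for J-S's cameras, or whether the codec really
does need to be specified separately as he suspects, is exactly the kind
of question for the ffmpeg list.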

On Tue, Apr 14, 2009 at 3:25 PM, Jean-Sébastien Guay <
[email protected]> wrote:

> Hi Robert,
>
>  I have merged your error code changes and the additional formats to
>> ReaderWriterFFmpeg.cpp, but haven't merged the http & rtsp as I'd like to be
>> able to test them directly myself and would like to investigate the
>> appropriate ffmpeg way to handle this type of input source.
>>
>
> Go ahead, you just need to set up your streaming source to serve the stream
> using these protocols. By the way, I did mention that it doesn't work 100%
> even for me right now; I've just hit a roadblock and would like someone else
> to see if they can get further, so I'm not surprised you didn't merge those
> changes.
>
>  What makes me more cautious about the code path is that my own /dev/ code
>> is hardwired to a low resolution for the stereo webcam I've been
>> experimenting with, and I plan to remove this code and replace it with more
>> generic code down the line; you've just bolted extra code onto this
>> experimental code path.  I'd prefer to see a device code path kept
>> separate from a network streaming path.  I also tested watching movies over
>> http using the original ffmpeg code; it had problems but it worked, so I'm
>> curious as to why you needed to use this other route.
>>
>
> It's not a file, it's a stream. You're right, for http those changes were
> not necessary so it was the addition of formats to ReaderWriterFFmpeg.cpp
> that did it. But even then, it still stops after 15-20 seconds for me, as I
> said.
>
> But for rtsp, we need to open the stream using that format (pass "rtsp" to
> av_find_input_format()), and even then it doesn't work for me. I'm thinking
> we need to pass an additional format to specify that it's MPEG-4 inside the
> container, but I really don't know enough about ffmpeg to figure out how.
>
> In any case, I think we'll need to have different paths for a *file* served
> through http versus a *stream* served through http, but once again I don't
> know enough about these things. Perhaps I'm wrong and the same code path can
> handle both.
>
>  Could you provide example files for http and rtsp so that we can all test
>> against the same sources and observe the same issue?
>>
>
> I cannot publicly serve video streams from our cameras, no. These are
> streams, not files, so you really need to have a camera that can serve using
> those protocols... (unless someone can set up a stream that loops over a
> given file and serves that, but once again I'm in way over my head here.)
>
> J-S
> --
> ______________________________________________________
> Jean-Sebastien Guay    [email protected]
>                               http://www.cm-labs.com/
>                        http://whitestar02.webhop.org/
> _______________________________________________
> osg-submissions mailing list
> [email protected]
>
> http://lists.openscenegraph.org/listinfo.cgi/osg-submissions-openscenegraph.org
>