Hello,

I'm working on a project where we want to stream video from a client and
display it in real time on a server. Using ffmpeg we have been able to encode a
file and send it to the server (ffserver); we then use ffplay to play the file
stored on the server, roughly as sketched below. But this seems to work only in
the following way:


- The file is stored on the server

- The saved file is played back after storage, from the beginning
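
For reference, our current setup is roughly along the following lines; the host
name, port, feed/stream names and file names here are just placeholders rather
than the exact values we use:

    # ffserver is started with a config containing something like:
    #   HTTPPort 8090
    #   <Feed feed1.ffm>
    #       File /tmp/feed1.ffm
    #       FileMaxSize 200M
    #   </Feed>
    #   <Stream test.mpg>
    #       Feed feed1.ffm
    #       Format mpeg
    #   </Stream>
    ffserver -f /etc/ffserver.conf

    # on the client: encode the file and push it to the ffserver feed
    ffmpeg -i input.avi http://server:8090/feed1.ffm

    # on the server: play the resulting stream
    ffplay http://server:8090/test.mpg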

What we would like to achieve is to play the file as it is being sent, in "real
time". The file would then not need to be stored at all (perhaps just
buffered), or, even if it is stored, playback should start from the frame that
is currently being sent.

To do that I'd like to configure things to use RTSP, but I have so far been
unsuccessful. Could someone please send me an example of the command lines and
perhaps the config file for this?
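
In case it helps to see where I am, the kind of setup I have been attempting
for RTSP is sketched below; the port, stream name, codec and size values are
only guesses on my part, so please correct whatever is wrong:

    # additions to ffserver.conf (guessed):
    RTSPPort 5454

    <Stream rtsp_test>
        Feed feed1.ffm
        Format rtp
        VideoCodec mpeg4
        VideoFrameRate 25
        VideoSize 352x288
        NoAudio
    </Stream>

    # on the playback side (guessed):
    ffplay rtsp://server:5454/rtsp_test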

Thanks.