Hello to all. In my application I need to stream several video streams (video only, no audio), and my idea is to use HTTP or RTSP. Since many modern browsers support video in pages, I thought of using WebM over HTTP. But I don't know how to do this.
I know ffserver does it, but it's a very complete and complex app, with a lot of features and ~5k lines of code in a single file :-) And it still has a lot of "goto" calls :-( I saw that ffserver instantiates an HTTP server which is accessible from a browser, so it does everything I need. But before writing the streaming server in my app, I need to understand how ffserver does it. So my question is: what is the "principle" of HTTP streaming in ffserver/ffmpeg/libav*?
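To make it concrete, here is a rough, untested sketch of what I imagine doing with libavformat alone, assuming the http protocol's "listen" option can act as a minimal single-client server and that the WebM muxer can write to it. The function name serve_webm and the hard-coded stream parameters are just placeholders for my own code; I am not sure this is how ffserver actually does it, which is exactly what I would like to understand:

/* Sketch: mux already-encoded VP8 packets into WebM and serve them to
 * one HTTP client via libavformat's http "listen" mode (assumption). */
#include <libavformat/avformat.h>
#include <libavutil/dict.h>

int serve_webm(const char *url /* e.g. "http://0.0.0.0:8080" */)
{
    AVFormatContext *oc = NULL;
    AVDictionary *opts = NULL;
    int ret;

    /* WebM output context bound to the HTTP URL. */
    ret = avformat_alloc_output_context2(&oc, NULL, "webm", url);
    if (ret < 0)
        return ret;

    /* One video stream; real codec parameters would come from my encoder. */
    AVStream *st = avformat_new_stream(oc, NULL);
    if (!st) { ret = AVERROR(ENOMEM); goto end; }
    st->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
    st->codecpar->codec_id   = AV_CODEC_ID_VP8;
    st->codecpar->width      = 640;   /* placeholder */
    st->codecpar->height     = 480;   /* placeholder */
    st->time_base            = (AVRational){1, 1000};

    /* "listen" should make the http protocol wait for an incoming client
     * instead of connecting out (my assumption about how this works). */
    av_dict_set(&opts, "listen", "1", 0);
    ret = avio_open2(&oc->pb, url, AVIO_FLAG_WRITE, NULL, &opts);
    av_dict_free(&opts);
    if (ret < 0)
        goto end;

    ret = avformat_write_header(oc, NULL);
    if (ret < 0)
        goto end;

    /* ... main loop: call av_interleaved_write_frame(oc, pkt) for each
     *     encoded packet coming from my encoder ... */

    av_write_trailer(oc);

end:
    if (oc && oc->pb)
        avio_closep(&oc->pb);
    avformat_free_context(oc);
    return ret;
}

Is this roughly the right direction, or does ffserver do something fundamentally different (its own socket handling, feeds, etc.)?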
