Hi.

I am new to C++ and FFmpeg and want to write an application that streams
video from a camera (at least a webcam) to a server.

I've seen the StreamingGuide <https://trac.ffmpeg.org/wiki/StreamingGuide> and
want to know how to implement it.
I think the basic flow is like this; please correct me if I'm wrong:

   1. Get the input device's AVInputFormat from libavdevice
   2. Open that input with avformat_open_input
   3. Find its streams and their codecs
   4. Get a decoder with avcodec_find_decoder
   5. Decode it somehow
   6. Encode it for streaming
   7. Write the data with a muxer
   8. Send the muxed data to the server

I can picture how to implement the first half of this list, but not the
second.
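To make the question concrete, here is how I currently picture the whole
chain in code. It is only a rough sketch, not something I claim is correct:
it assumes a recent FFmpeg (5.x or newer, where the AVInputFormat/AVCodec
pointers are const), a v4l2 webcam at /dev/video0, libx264 as the encoder,
FLV over RTMP to a placeholder URL, a fixed ~30 fps timestamp counter, and
almost no error handling or cleanup.

extern "C" {
#include <libavdevice/avdevice.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>
#include <libavutil/opt.h>
}

int main() {
    avdevice_register_all();

    // 1-2. Open the camera through libavdevice ("v4l2"//dev/video0 are assumptions).
    const AVInputFormat *in_fmt = av_find_input_format("v4l2");
    AVFormatContext *in_ctx = nullptr;
    if (avformat_open_input(&in_ctx, "/dev/video0", in_fmt, nullptr) < 0) return 1;
    avformat_find_stream_info(in_ctx, nullptr);

    // 3-4. Locate the video stream and open its decoder.
    const AVCodec *dec = nullptr;
    int vstream = av_find_best_stream(in_ctx, AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);
    if (vstream < 0) return 1;
    AVCodecContext *dec_ctx = avcodec_alloc_context3(dec);
    avcodec_parameters_to_context(dec_ctx, in_ctx->streams[vstream]->codecpar);
    if (avcodec_open2(dec_ctx, dec, nullptr) < 0) return 1;

    // 6. Encoder for the outgoing stream (libx264 is just one choice).
    const AVCodec *enc = avcodec_find_encoder_by_name("libx264");
    AVCodecContext *enc_ctx = avcodec_alloc_context3(enc);
    enc_ctx->width     = dec_ctx->width;
    enc_ctx->height    = dec_ctx->height;
    enc_ctx->pix_fmt   = AV_PIX_FMT_YUV420P;
    enc_ctx->time_base = AVRational{1, 30};            // assume ~30 fps
    av_opt_set(enc_ctx->priv_data, "preset", "veryfast", 0);
    av_opt_set(enc_ctx->priv_data, "tune", "zerolatency", 0);

    // 7-8. Muxer that writes straight to the server (FLV over RTMP, placeholder URL).
    const char *url = "rtmp://example.com/live/stream";
    AVFormatContext *out_ctx = nullptr;
    avformat_alloc_output_context2(&out_ctx, nullptr, "flv", url);
    if (out_ctx->oformat->flags & AVFMT_GLOBALHEADER)
        enc_ctx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
    if (avcodec_open2(enc_ctx, enc, nullptr) < 0) return 1;
    AVStream *out_st = avformat_new_stream(out_ctx, nullptr);
    avcodec_parameters_from_context(out_st->codecpar, enc_ctx);
    out_st->time_base = enc_ctx->time_base;
    if (avio_open(&out_ctx->pb, url, AVIO_FLAG_WRITE) < 0) return 1;
    if (avformat_write_header(out_ctx, nullptr) < 0) return 1;

    // Webcams usually deliver YUYV or MJPEG; the encoder wants YUV420P.
    SwsContext *sws = sws_getContext(dec_ctx->width, dec_ctx->height, dec_ctx->pix_fmt,
                                     enc_ctx->width, enc_ctx->height, enc_ctx->pix_fmt,
                                     SWS_BILINEAR, nullptr, nullptr, nullptr);

    AVPacket *in_pkt = av_packet_alloc(), *out_pkt = av_packet_alloc();
    AVFrame *raw = av_frame_alloc(), *yuv = av_frame_alloc();
    yuv->format = enc_ctx->pix_fmt;
    yuv->width  = enc_ctx->width;
    yuv->height = enc_ctx->height;
    av_frame_get_buffer(yuv, 0);
    int64_t pts = 0;

    // 5-8. Read, decode, convert, encode, and hand every packet to the muxer.
    while (av_read_frame(in_ctx, in_pkt) >= 0) {
        if (in_pkt->stream_index == vstream && avcodec_send_packet(dec_ctx, in_pkt) >= 0) {
            while (avcodec_receive_frame(dec_ctx, raw) >= 0) {
                av_frame_make_writable(yuv);
                sws_scale(sws, raw->data, raw->linesize, 0, dec_ctx->height,
                          yuv->data, yuv->linesize);
                yuv->pts = pts++;
                if (avcodec_send_frame(enc_ctx, yuv) >= 0) {
                    while (avcodec_receive_packet(enc_ctx, out_pkt) >= 0) {
                        av_packet_rescale_ts(out_pkt, enc_ctx->time_base, out_st->time_base);
                        out_pkt->stream_index = out_st->index;
                        av_interleaved_write_frame(out_ctx, out_pkt);
                    }
                }
            }
        }
        av_packet_unref(in_pkt);
    }
    av_write_trailer(out_ctx);
    // (flushing the codecs and freeing all contexts omitted for brevity)
    return 0;
}

I would expect it to build with something like
g++ stream.cpp $(pkg-config --cflags --libs libavdevice libavformat libavcodec libswscale libavutil),
but I have not verified the exact library list.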

The two questions I have are:

   1. Do I understand the streaming flow correctly? What nuances must I
   consider? Which modules/functions should I look into to implement it?
   2. How can I show a preview of the stream in a GUI using, for example, Qt
   Quick? Is the input device locked by one of the processes (either FFmpeg
   or Qt)? If it is, should I copy the decoded frames for the GUI to show to
   the user, or just pass references to them? (I sketched what I have in mind
   right after this list.)
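For question 2, the approach I have in mind (and would like checked) is to
open the camera in a single process, decode once, and give the preview its
own copy of each decoded frame: av_frame_clone for a cheap reference-counted
copy, and libswscale to turn it into a QImage that Qt Quick could display
through something like a QQuickImageProvider. The function names below
(frameToImage, makePreview) are just made up for illustration:

extern "C" {
#include <libavutil/frame.h>
#include <libavutil/pixfmt.h>
#include <libswscale/swscale.h>
}
#include <QImage>

// Deep-copy one decoded frame into a QImage the GUI can own and display.
QImage frameToImage(const AVFrame *frame) {
    SwsContext *sws = sws_getContext(frame->width, frame->height,
                                     static_cast<AVPixelFormat>(frame->format),
                                     frame->width, frame->height, AV_PIX_FMT_RGBA,
                                     SWS_BILINEAR, nullptr, nullptr, nullptr);
    QImage img(frame->width, frame->height, QImage::Format_RGBA8888);
    uint8_t *dst[4]       = { img.bits(), nullptr, nullptr, nullptr };
    int      dst_stride[4] = { static_cast<int>(img.bytesPerLine()), 0, 0, 0 };
    sws_scale(sws, frame->data, frame->linesize, 0, frame->height, dst, dst_stride);
    sws_freeContext(sws);
    return img;                           // QImage owns its pixels from here on
}

// Called from the decode loop after avcodec_receive_frame() succeeds.
QImage makePreview(const AVFrame *raw) {
    AVFrame *copy = av_frame_clone(raw);  // cheap reference-counted copy; this is
                                          // what I would hand to a GUI thread
    QImage img = frameToImage(copy);      // real pixel copy into GUI-owned memory
    av_frame_free(&copy);                 // drop the extra reference
    return img;
}

Since the QImage allocates and owns its own pixel buffer, the decode/encode
loop would never have to wait for the GUI, and the AVFrame can be reused
immediately. Is that the right idea, or is there a cheaper way?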

Thanks in advance!

Kind regards,
Timur Guseynov