On date Friday 2010-02-05 10:28:14 -0300, Ivo Calado encoded:
> Hi fellows,
>         I'm a newbie in development of applications using libav. So, I read
> some tutorials such as http://dranger.com/ffmpeg/ffmpeg.html and
> http://www.inb.uni-luebeck.de/~boehme/using_libavcodec.html. However, I
> still have some doubts that I want to ask you.
> I'm going to create a videoconference application using the ccRTP API
> (http://www.gnu.org/software/ccrtp/ this API is a requirement) as the RTP
> library and ffmpeg as the streaming library.

No need to use an external library for RTP, libavformat already
supports that.
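As a rough illustration of that point, the RTP muxer can be selected like any other output format. This is only a skeleton under the assumption of the FFmpeg API of that era (av_guess_format, av_set_parameters, url_fopen); the URL and the missing stream/encoder setup are placeholders you have to fill in:

```c
/* Sketch only: select libavformat's RTP muxer instead of an external
 * RTP stack. Error handling is minimal and no stream is added yet. */
#include <stdio.h>
#include <libavformat/avformat.h>

int open_rtp_output(AVFormatContext **out, const char *url)
{
    AVOutputFormat *ofmt;
    AVFormatContext *oc;

    av_register_all();

    ofmt = av_guess_format("rtp", NULL, NULL);   /* RTP output format */
    if (!ofmt)
        return -1;

    oc = avformat_alloc_context();
    if (!oc)
        return -1;
    oc->oformat = ofmt;
    /* e.g. url = "rtp://10.0.0.2:5004" (placeholder address) */
    snprintf(oc->filename, sizeof(oc->filename), "%s", url);

    /* add an AVStream with your codec parameters here, then: */
    if (av_set_parameters(oc, NULL) < 0)
        return -1;
    if (url_fopen(&oc->pb, oc->filename, URL_WRONLY) < 0)
        return -1;

    av_write_header(oc);   /* sends the first RTP setup data */
    *out = oc;
    return 0;
}
```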

> In all examples about
> encoding/decoding using ffmpeg, the data source is a file. I haven't seen
> any example about recording data from a webcam.  I suppose it is
> possible as the command-line application ffmpeg supports this, but I
> didn't find any documentation about how to work in that way.

The great thing about the FFmpeg API is that all input formats and
input devices are managed uniformly: you simply specify the name of
the resource to open (e.g. "/dev/video0") and the name of the input
format/input device (e.g. "v4l") in av_open_input_file(), and the
library will figure out the rest.
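To make that concrete, here is a minimal capture sketch. It assumes libavdevice is built in, a V4L2 webcam at /dev/video0, and the av_open_input_file()-era API mentioned above; the 640x480 size and the frame count are arbitrary:

```c
/* Sketch: open a webcam with the same demuxing API used for files.
 * Assumes libavdevice and a V4L2 device at /dev/video0. */
#include <stdio.h>
#include <string.h>
#include <libavformat/avformat.h>
#include <libavdevice/avdevice.h>

int main(void)
{
    AVFormatContext *fmt_ctx = NULL;
    AVFormatParameters params;
    AVInputFormat *ifmt;
    AVPacket pkt;
    int i;

    av_register_all();
    avdevice_register_all();   /* registers input devices such as v4l/v4l2 */

    memset(&params, 0, sizeof(params));
    params.width  = 640;       /* requested capture size (arbitrary) */
    params.height = 480;

    ifmt = av_find_input_format("video4linux2");
    if (!ifmt) {
        fprintf(stderr, "v4l2 input device not found\n");
        return 1;
    }

    /* same call you would use for a file, but with a device name */
    if (av_open_input_file(&fmt_ctx, "/dev/video0", ifmt, 0, &params) != 0) {
        fprintf(stderr, "could not open /dev/video0\n");
        return 1;
    }

    if (av_find_stream_info(fmt_ctx) < 0)
        return 1;

    /* read a handful of captured frames */
    for (i = 0; i < 25 && av_read_frame(fmt_ctx, &pkt) >= 0; i++) {
        /* pkt.data holds one captured frame; decode/encode it here */
        av_free_packet(&pkt);
    }

    av_close_input_file(fmt_ctx);
    return 0;
}
```

From there the packets feed into the usual decode/encode path shown in the tutorials, so the file-based examples carry over unchanged.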

> Sorry if my question is simple, but I'm very much a newbie in ffmpeg.
> 
> Thanks in advance,
> Cheers,

Regards.
_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user