Hi All, I use ffmpeg + ffserver to run a streaming server. The video and audio source is a USB webcam.
When I test the stream with VLC, it receives both the video data and the audio data. But when I use ffserver to output one RTP stream for audio and one RTP stream for video, the two streams are not in sync.

While studying the code of ffmpeg.c and ffserver.c, I see a lot of code that calls av_rescale_q() on pkt.pts, pkt.dts and pkt.duration. Can somebody explain that in more detail? Where can I find documentation that describes it?
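My current understanding, which may be wrong, is that av_rescale_q() just converts a timestamp from one AVRational time base to another, and that packets must be rescaled from the input stream's time base to the output stream's time base before muxing. A minimal sketch of that idea (my own illustration, not copied from ffmpeg.c; the helper name rescale_packet is hypothetical):

#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>

/* Convert a packet's timestamps from the time base of the stream it was
 * demuxed from to the time base of the stream it will be muxed into.
 * av_rescale_q(a, bq, cq) computes a * bq / cq with rounding, i.e. it
 * re-expresses the value a (given in time base bq) in time base cq. */
static void rescale_packet(AVPacket *pkt, AVStream *in_st, AVStream *out_st)
{
    if (pkt->pts != AV_NOPTS_VALUE)
        pkt->pts = av_rescale_q(pkt->pts, in_st->time_base, out_st->time_base);
    if (pkt->dts != AV_NOPTS_VALUE)
        pkt->dts = av_rescale_q(pkt->dts, in_st->time_base, out_st->time_base);
    if (pkt->duration > 0)
        pkt->duration = av_rescale_q(pkt->duration, in_st->time_base,
                                     out_st->time_base);
}

Is that roughly what the rescaling code in ffmpeg.c/ffserver.c is doing, and could a mistake in these conversions explain the audio/video drift I see between the two RTP streams?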
