On Mar 27, 2013, at 9:25 PM, Clément Bœsch <[email protected]> wrote:

> You realize FFmpeg 0.6 is 3 years old right? 

I know it very well -- but with FFmpeg documentation / examples, new doesn't 
necessarily mean more helpful. I've spent more hours than I'd like to count 
scouring the Internet for anything that could shed light on various aspects of 
getting my use case built. That 3-year-old example is one of the only video / 
audio encoding examples I've found that even address pts / dts. Take a look 
at the current decoding_encoding.c in the samples you refer to. It doesn't 
address this at all. In fact, the example itself is of very limited usefulness, 
as it doesn't really address what would likely be a real-world use case. 

For starters, video and audio encoding and output are completely separate, 
stand-alone examples, rather than having both audio and video simultaneously in 
play. The audio data is completely contrived -- it is bogus sound generated 
internally in the app, not drawn from an external capture source, file, or 
stream. Neither the video nor the audio encoding even has to deal with pts, other than 
this one line in the video encoding: 

frame->pts = i;

which is a fabricated scenario built on a hard-coded index, rather than pulling 
decoding and presentation times and durations from a foreign source and scaling 
them properly to the time_base in question. That omission, in turn, plays into the 
audio / video time sync issue I mentioned. 
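To make concrete what I mean (a rough sketch, not code from decoding_encoding.c; 
the in_stream / enc_ctx / out_stream names are my own), pulling from a real source 
means rescaling the decoded frame's pts from the demuxer stream's time_base to the 
encoder's time_base before encoding, and then rescaling the resulting packet to the 
output stream's time_base before muxing: 

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libavutil/mathematics.h>

/* Sketch only: names below are my assumptions, not FFmpeg example code. */

static void rescale_for_encode(AVFrame *frame,
                               AVStream *in_stream,
                               AVCodecContext *enc_ctx)
{
    /* the decoded frame's pts is in the demuxer stream's time_base;
     * the encoder expects its own time_base */
    frame->pts = av_rescale_q(frame->pts,
                              in_stream->time_base,
                              enc_ctx->time_base);
}

static void rescale_for_mux(AVPacket *pkt,
                            AVCodecContext *enc_ctx,
                            AVStream *out_stream)
{
    /* the encoded packet's timestamps are in the encoder's time_base;
     * the muxer expects the output stream's time_base */
    pkt->pts      = av_rescale_q(pkt->pts,      enc_ctx->time_base, out_stream->time_base);
    pkt->dts      = av_rescale_q(pkt->dts,      enc_ctx->time_base, out_stream->time_base);
    pkt->duration = av_rescale_q(pkt->duration, enc_ctx->time_base, out_stream->time_base);
    pkt->stream_index = out_stream->index;
}

That is the kind of thing I'd expect the example to show, rather than a bare counter. 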

> If the examples and doxy documentation are not enough, you should have a
> look at ffmpeg*.c ffplay*.c files in the source root directory.

I'll take a look. As a point of constructive encouragement, the documentation 
and examples could really use some improvement so that they line up with 
common use cases out there. Granted, video / audio is a complicated domain, but 
the API is way, way harder and more time-consuming to use and figure out than it 
could be. I was actually pretty floored when I didn't find a whole host of 
examples for the very use case I've been struggling with -- I would have 
expected "I'd like to stream my webcam to Flash" to have historically been a 
pretty common need, especially on OS X, given there's virtually nothing in the 
way of outbound network or media protocols in the Cocoa API. That's actually 
one other reason I've posted my code on GitHub, in hopes of saving someone some 
time down the road in getting something built. 

But I digress... back to the task at hand: getting video and audio sync'd up. 
Thanks for the pointer, Clément, I'll take a look at those. 
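
For what it's worth, here's the direction I'm planning to try for the sync itself, 
just a sketch of the interleaving decision, with video_st / audio_st and the running 
pts counters being hypothetical names of mine rather than anything from the examples: 

#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>

/* Hypothetical helper: given the next pts each stream would emit,
 * expressed in that stream's own time_base, decide whether video or
 * audio should be encoded and written next so the two advance together. */
static int video_goes_next(int64_t next_video_pts, AVStream *video_st,
                           int64_t next_audio_pts, AVStream *audio_st)
{
    /* av_compare_ts() compares timestamps across different time_bases;
     * <= 0 means the video timestamp is not ahead of the audio one. */
    return av_compare_ts(next_video_pts, video_st->time_base,
                         next_audio_pts, audio_st->time_base) <= 0;
}

If ffmpeg.c / ffplay.c suggest a better approach, I'll go with that instead. 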

Brad

