On Mar 30, 2013, at 05:55, Kalileo wrote:

> What I think you are still missing is the fact that audio packets have a 
> fixed length. For every audio packet you can calculate how long it is (using 
> audio sample rate, channel count, size of the (decoded) data). So an audio 
> packet contains audio for exactly x ms. 
> 
> Video does not have that info built in that strongly. You can show the 
> image and it is correct whether displayed for 1 ms or 100 ms. To decide 

What's confusing here is that for your audio claim to hold true, one needs at 
least three pieces of information in addition to the buffer's byte length (which 
is a given, after decoding): sample rate, sample format (short, int, float, ...) 
and channel count. For a video frame, the minimum requirement is to know width 
and height in order to map a buffer to an image, but there is no reason a packet 
could not include additional information. After all, even if duration is a 
necessary piece of information for audio, one could argue that the same holds 
for image data in a video context too.
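
To make the arithmetic concrete, here is a minimal sketch (not from the thread; 
the helper name and the interleaved-PCM assumption are mine) of how those three 
parameters turn a decoded byte count into a duration:

#include <libavutil/samplefmt.h>

/* Sketch only: duration in ms of a decoded, interleaved audio buffer,
 * derived from the three extra parameters mentioned above. */
static double audio_buffer_duration_ms(int byte_length, int sample_rate,
                                       int channels, enum AVSampleFormat fmt)
{
    int bytes_per_sample = av_get_bytes_per_sample(fmt); /* e.g. 2 for S16 */
    int nb_samples = byte_length / (channels * bytes_per_sample);
    return 1000.0 * nb_samples / sample_rate;
}

/* e.g. 4096 bytes of stereo S16 at 44100 Hz -> 1024 samples -> ~23.2 ms */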

> Quoth Brad:
> Well, here's the rub -- thanks to QTKit, and the QTSampleBuffer it delivers 
> for both video and audio, I don't have to calculate pts, dts, or duration -- 
> those time values are already delivered with the data buffer, along with its 
> associated time scale, so converting to time_base units is merely a simple 
> math problem.

Converting with simple math isn't calculating? :)
Seriously, why not post the information provided by QTKit, and how you convert 
it? It seems it would be quite easy to confuse QT's time scale with FFmpeg's 
time_base units.
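
For what it's worth, the "simple math" usually reduces to av_rescale_q(); here 
is a sketch, assuming the QT side hands you a (time value, time scale) pair 
(the function name qt_time_to_pts and the parameter names are mine, not Brad's 
code):

#include <stdint.h>
#include <libavutil/mathematics.h>

/* qt_value ticks on a clock running at qt_timescale ticks per second,
 * rescaled into the output stream's time_base. av_rescale_q()
 * cross-multiplies with 64-bit intermediates, so precision is kept. */
static int64_t qt_time_to_pts(int64_t qt_value, int qt_timescale,
                              AVRational stream_time_base)
{
    AVRational qt_time_base = { 1, qt_timescale };
    return av_rescale_q(qt_value, qt_time_base, stream_time_base);
}

/* e.g. 600 ticks at timescale 600 = 1 s -> 90000 in a 1/90000 time_base */

Posting the raw QT values next to the pts/dts you actually write would make it 
obvious whether the two scales got mixed up somewhere.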

R.
