Hi,

> In the attached program, we can see that the DTS sequence obtained
> differs depending on whether we decode the frames or not. Sometimes
> there is no DTS (AV_NOPTS_VALUE), but this doesn't happen if we do
> not decode the frames.
>
> So, is there a bug in my attached program or does ffmpeg act
> strangely?

After several hours of debugging, I found out why it acts like this.

As I suspected, the packet fields returned by `av_read_frame' can
depend on the state of the codec context. The function
`compute_frame_duration' tests `st->codec->time_base', and this value
is apparently updated when decoding frames.

The subsequent PTS and DTS computation can then be influenced by this
duration.

This shows that we can retrieve different DTS values depending on
whether we decode the frames or not.
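For reference, here is a minimal sketch of the kind of loop that
exhibits this (this is an illustrative sketch, not the attached
program: the function name `dump_dts', the `decode_frames' flag and
the omission of all error checking are mine; it assumes the current
old-style `avcodec_decode_video' API):

```c
#include <stdio.h>
#include <ffmpeg/avformat.h>
#include <ffmpeg/avcodec.h>

/* Demux `filename' and print the DTS of each video packet.  The only
 * difference between the two runs is `decode_frames'; any divergence
 * in the printed DTS sequences reproduces the problem above. */
static void dump_dts(const char *filename, int video_index, int decode_frames)
{
    AVFormatContext *fmt = NULL;
    AVPacket pkt;
    AVFrame *frame = avcodec_alloc_frame();
    int got_picture;

    av_open_input_file(&fmt, filename, NULL, 0, NULL);
    av_find_stream_info(fmt);

    if (decode_frames) {
        AVCodecContext *ctx = fmt->streams[video_index]->codec;
        avcodec_open(ctx, avcodec_find_decoder(ctx->codec_id));
    }

    while (av_read_frame(fmt, &pkt) >= 0) {
        if (pkt.stream_index == video_index) {
            if (pkt.dts == AV_NOPTS_VALUE)
                printf("dts=NOPTS\n");
            else
                printf("dts=%lld\n", (long long)pkt.dts);
            /* Decoding mutates the codec context (time_base among
             * other fields), which feeds back into the DTS of the
             * packets returned afterwards. */
            if (decode_frames)
                avcodec_decode_video(fmt->streams[video_index]->codec,
                                     frame, &got_picture,
                                     pkt.data, pkt.size);
        }
        av_free_packet(&pkt);
    }
    av_close_input_file(fmt);
}
```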

Is this normal?

I won't say it is a bug for sure, since I think the WMV file [1] I'm
testing with is broken, but I'm still wondering whether this could be
the cause of some other strange behaviors. The codec `time_base' is
1/1000 if I do not decode the frames, then 1/12 once frames are
decoded.

  [1] http://fjolliton.free.fr/1140096044413.wmv

      Seems stream 1 codec frame rate differs from container frame rate: 1000.00 (1000/1) -> 12.50 (25/2)
      Input #0, asf, from '/home/fred/1140096044413.wmv':
        Duration: 00:01:39.4, start: 15.000000, bitrate: 189 kb/s
          Stream #0.0: Audio: 0x000a, 22050 Hz, mono, 20 kb/s
          Stream #0.1: Video: mpeg4, yuv420p, 320x240, 150 kb/s, 12.50 tb(r)

-- 
Frédéric Jolliton
_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user
