Hi all,

I'm using Live555 to receive an H264 stream from a camera. I am using libavcodec to decode the stream, and later OpenGL to render. All of that is working.

However, I render each frame as soon as I receive it, which causes some image stuttering, since I'm not using any timestamp info. Now I'm trying to retrieve the presentation timestamp (PTS) from the decoded AVFrame, but this value is not set (it is AV_NOPTS_VALUE).

From Live555's H.264 parsing I only get the NAL unit's presentation time, which I assign to the AVPacket before decoding (packet->pts).
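In case it helps, the conversion I mean is roughly the following sketch. It maps Live555's presentationTime (a wall-clock struct timeval) to ticks of a 90 kHz clock (the standard RTP clock rate for H.264), relative to the first frame; the function name and the choice to anchor on the first frame are just for illustration:

```cpp
#include <cstdint>
#include <sys/time.h>

// Convert a Live555 presentationTime (struct timeval) into a PTS in
// 90 kHz ticks, relative to the presentationTime of the first frame.
// The result would go into packet->pts, with the decoder/renderer
// treating the time base as {1, 90000}.
static int64_t pts_90khz(const timeval& now, const timeval& first)
{
    // Elapsed time in microseconds since the first frame.
    int64_t usec = static_cast<int64_t>(now.tv_sec - first.tv_sec) * 1000000
                 + (now.tv_usec - first.tv_usec);
    // 1 second = 1,000,000 us = 90,000 ticks of a 90 kHz clock.
    return usec * 90 / 1000;
}
```

I then use the resulting PTS to schedule rendering against a local clock rather than displaying immediately.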

I also do not have a proper AVCodecContext->time_base, nor does the camera provide the time_base (time_scale) info in the SPS/PPS (it is optional according to the standard).

How could I account for presentation timestamps in this scenario? I'm sure there's something I'm not using or am misusing, but I can't really spot it.

--

Saludos / Best regards,

*Sergio Basurco*
Coherent Synchro

_______________________________________________
libav-api mailing list
[email protected]
https://lists.libav.org/mailman/listinfo/libav-api
