Hi,
I'm using ffmpeg libraries to read from a UDP transport stream and decode the
data, but I've been asked to add a feature to handle signal loss and I'm hoping
I can get some help with this.
My initial workflow design would be something along these lines:
1. Add a timeout to the av_read_frame call via url_set_interrupt_cb.
2. On interrupt, check whether too much time has passed using gettimeofday timestamps.
3. If too much time has passed, return 1 from the callback, causing av_read_frame to fail.
4. If av_read_frame fails due to the interrupt, insert black frames according to
my current fps and the amount of time that has passed, and insert silent audio
samples for the same duration.
5. Continue in while loop, calling av_read_frame again, and starting the
process over again.
Pseudocode workflow:

while (running)
{
    int retVal = av_read_frame(m_formatInfo, &packet);
    if (retVal < 0 && wasTimeout)
    {
        //Insert blank frames for the gap
        //Insert silent audio for the gap
        continue;
    }
    //...normal decode path...
}
However, I ran into a couple of problems when I started to implement this:
1. When the interrupt function returns 1, av_read_frame DOES NOT return a
negative value.
2. Subsequent calls to av_read_frame (after the timeout occurred) immediately
return a negative value.
Does anyone know why this happens? Does anyone here have experience dealing
with signal loss? Does this design look plausible for handling it? Any help
is appreciated.
Thanks
_______________________________________________
Libav-user mailing list
[email protected]
http://ffmpeg.org/mailman/listinfo/libav-user