Hi. I will ask again, as nobody answered. I am muxing audio and video based on the muxing.c example, but in my case the audio comes from a file and is sometimes longer (in duration) than the video stream. Even though I check and compare the current timestamps of both streams, the final video ends up longer than planned by up to 20% (depending on audio type and sample rate). I expected that new audio samples would stop being added to the buffer once I stop writing interleaved audio frames, but that doesn't happen. How can I fix it? Do I need to set the total audio/video duration somewhere in the output stream or format context so that ffmpeg knows when to stop buffering audio frames?
Here you can see the full problem explanation, including my setup and the hack I currently use to minimize the total video length overflow: https://stackoverflow.com/questions/49342379/ffmpeg-multiplexing-streams-with-different-duration Thanks.
_______________________________________________ Libav-user mailing list Libav-user@ffmpeg.org http://ffmpeg.org/mailman/listinfo/libav-user