Re: [Libav-user] may I use ffmpeg API just for decoding

2018-03-26 Thread YIRAN LI
2018-03-24 21:13 GMT+11:00 YIRAN LI :

> Hi guys,
>
> I want to confirm one thing: may I write my own demuxer (for example, for
> AVI) and use the ffmpeg API only for video/audio decoding?
>
> Thanks
>

Hi guys,

I've found an example, video_decode_example() in avcodec.c (mine is an old
version, so this may have been moved elsewhere),

in which the codec context is initialized with avcodec_alloc_context3(), so
no AVFormatContext is needed.

That way I can feed packets from my own demuxer straight into the decode
calls.
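
For anyone searching later, a minimal decoder-only sketch using the current
send/receive API (AV_CODEC_ID_H264 and get_packet_from_my_demuxer() are just
placeholders for whatever your own demuxer actually provides; some codecs
will also need extradata or a parser set up first):

    #include <libavcodec/avcodec.h>

    /* Hypothetical helper: fills pkt->data / pkt->size with one compressed
     * access unit from your own demuxer; returns 0 on success. */
    int get_packet_from_my_demuxer(AVPacket *pkt);

    int decode_stream(void)
    {
        const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
        AVCodecContext *ctx  = avcodec_alloc_context3(codec);
        AVPacket *pkt        = av_packet_alloc();
        AVFrame  *frame      = av_frame_alloc();

        if (!codec || !ctx || !pkt || !frame ||
            avcodec_open2(ctx, codec, NULL) < 0)
            return -1;

        while (get_packet_from_my_demuxer(pkt) == 0) {
            if (avcodec_send_packet(ctx, pkt) < 0)
                break;
            while (avcodec_receive_frame(ctx, frame) == 0) {
                /* use frame->data / frame->linesize here */
            }
            av_packet_unref(pkt);
        }

        /* flush the decoder */
        avcodec_send_packet(ctx, NULL);
        while (avcodec_receive_frame(ctx, frame) == 0) { /* drain */ }

        av_frame_free(&frame);
        av_packet_free(&pkt);
        avcodec_free_context(&ctx);
        return 0;
    }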


Thanks


[Libav-user] Audio stream is longer than video when muxing

2018-03-26 Thread Michael IV
Hi. I will ask again, as nobody answered. I am muxing based on the muxing.c
example, but in my case the audio comes from a file which is sometimes
longer (in duration) than the video stream. For some reason, even though I
check and compare the current times of both streams, the final video ends up
longer than planned by up to 20% (depending on audio type and sample rate).
I was expecting that no new audio samples would be added to the buffer once
I stop writing interleaved audio frames, but that doesn't happen. How can I
fix it? Do I need to set the total audio or video duration somewhere in the
output stream or format context so that ffmpeg knows when to stop buffering
audio frames?
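
For context, my main loop follows the stream-selection logic from muxing.c;
roughly like the sketch below (oc, OutputStream, write_video_frame() and
write_audio_frame() are as declared in doc/examples/muxing.c, av_compare_ts()
comes from libavutil, and the extra branch that drops audio after the video
ends is only a sketch of the kind of check I mean, not the example's code):

    /* Fragment based on the muxing.c main loop: pick whichever stream is
     * behind, and stop accepting audio once the finished video's end has
     * been reached. */
    while (encode_video || encode_audio) {
        if (encode_video &&
            (!encode_audio ||
             av_compare_ts(video_st.next_pts, video_st.enc->time_base,
                           audio_st.next_pts, audio_st.enc->time_base) <= 0)) {
            encode_video = !write_video_frame(oc, &video_st);
        } else {
            /* video is done: refuse audio that would extend past it */
            if (!encode_video &&
                av_compare_ts(audio_st.next_pts, audio_st.enc->time_base,
                              video_st.next_pts, video_st.enc->time_base) > 0)
                encode_audio = 0;
            else
                encode_audio = !write_audio_frame(oc, &audio_st);
        }
    }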

Here you can see the full problem explanation, including my setup, and the
hack I currently use to minimize the total video length overflow:

https://stackoverflow.com/questions/49342379/ffmpeg-multiplexing-streams-with-different-duration

Thanks.