Hi there,

I'm trying to set up a partly multi-threaded video pipeline, but my result
images come out distorted from time to time.
It looks as if some lines of the video contain lines from the previous
frames, but I can't find the exact cause. Maybe
someone has experienced a similar effect or knows the answer to this
problem.

In general, what I do is the following:

Open the MPEG-2 video stream (av_open_input_file, avcodec_find_decoder,
avcodec_open, etc.), then

1. Read a frame from the stream (av_read_frame)
2. Decode frame (avcodec_decode_video)
3. Do color conversion (YUV --> RGB)
4. Display Image
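For reference, the sequential shape of those four steps can be sketched as below. This is only a stand-in: the types and functions (Packet, Frame, RgbImage, read_packet, decode_packet, convert_to_rgb, run_pipeline) are placeholders for the libav* and conversion-library calls, not real API:

```c
#include <string.h>

/* Placeholder types standing in for AVPacket / AVFrame / the RGB output. */
typedef struct { int serial; } Packet;
typedef struct { int serial; unsigned char yuv[8]; } Frame;
typedef struct { int serial; unsigned char rgb[8]; } RgbImage;

/* Step 1: stand-in for av_read_frame; returns 0 at end of stream. */
static int read_packet(int i, int total, Packet *pkt)
{
    if (i >= total)
        return 0;
    pkt->serial = i;
    return 1;
}

/* Step 2: stand-in for avcodec_decode_video. */
static void decode_packet(const Packet *pkt, Frame *frm)
{
    frm->serial = pkt->serial;
    memset(frm->yuv, (unsigned char)pkt->serial, sizeof frm->yuv);
}

/* Step 3: stand-in for the external YUV -> RGB conversion. */
static void convert_to_rgb(const Frame *frm, RgbImage *img)
{
    img->serial = frm->serial;
    memcpy(img->rgb, frm->yuv, sizeof img->rgb);
}

/* Runs steps 1-3 for `total` frames; returns the serial of the last
 * image that reached step 4 (display), or -1 if none. */
static int run_pipeline(int total)
{
    Packet pkt;
    Frame frm;
    RgbImage img;
    int last = -1;

    for (int i = 0; read_packet(i, total, &pkt); i++) {
        decode_packet(&pkt, &frm);
        convert_to_rgb(&frm, &img);
        last = img.serial;          /* step 4 would display img here */
    }
    return last;
}
```

In the threaded variant, each handoff between the stages above crosses a thread boundary.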

Steps 1 and 2 are done by libav*; step 3 is done by a different
library. Each step is one stage in my video
pipeline, where (1) delivers an AVPacket to (2), (2) delivers an AVFrame
to (3), and (3) delivers an RGB buffer
to (4). All these steps should run in parallel, so as soon as one frame
has been read and delivered, the next frame is
read while the current one is being decoded, and so on. Of course, each
packet and each frame gets its own
AVPacket / AVFrame buffer via "new AVPacket()" and "avcodec_alloc_frame"
respectively.
Unfortunately, this produces partly distorted frames as described above.
As long as I keep the libav* calls
in one thread (e.g. read, decode, copy the AVFrame to a different buffer,
deliver the result to the color conversion),
everything is OK.
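One plausible cause of exactly this symptom (assuming the old, non-refcounted decode API): avcodec_alloc_frame only allocates the AVFrame struct itself, and avcodec_decode_video fills its data pointers with references to the codec's internal buffers, which are reused for subsequent frames. A per-frame AVFrame struct therefore does not give each frame its own pixel data, and the decoder thread can overwrite pixels the color-conversion thread is still reading. The effect can be shown without libav at all; everything below is illustrative, not libav API:

```c
#include <string.h>

/* Illustrative stand-in for a decoder that reuses one internal buffer,
 * the way the old avcodec_decode_video reuses codec-owned frame buffers. */
typedef struct { unsigned char *data; } FakeFrame;

static unsigned char internal_buf[4];   /* decoder-owned, reused per call */

static void fake_decode(unsigned char pixel, FakeFrame *out)
{
    memset(internal_buf, pixel, sizeof internal_buf);
    out->data = internal_buf;           /* no copy: points into the decoder */
}

/* Hand the frame pointer to the next stage, then decode the next frame:
 * the first frame's pixels are silently overwritten. */
static unsigned char shallow_handoff(void)
{
    FakeFrame f1, f2;
    fake_decode(1, &f1);                /* frame 1 */
    fake_decode(2, &f2);                /* frame 2 reuses the same buffer */
    return f1.data[0];                  /* now reads frame 2's pixels */
}

/* Deep-copy the pixel data into a caller-owned buffer before decoding on. */
static unsigned char deep_copy_handoff(void)
{
    FakeFrame f1, f2;
    unsigned char own[4];
    fake_decode(1, &f1);
    memcpy(own, f1.data, sizeof own);   /* copy while the data is still valid */
    fake_decode(2, &f2);
    return own[0];                      /* still frame 1's pixels */
}
```

This would also match the observation above: copying the AVFrame's data to a different buffer inside the single libav* thread works because the copy happens before the decoder reuses its buffers.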
Any ideas? I also tried doing steps 1 and 2 in one stage and delivering
the "new AVFrame()" to the color conversion
component, but this still results in distorted image parts.

Any help is highly appreciated :-)

Thanks in advance,
  Alex
_______________________________________________
libav-user mailing list
[email protected]
https://lists.mplayerhq.hu/mailman/listinfo/libav-user
