It seems that before sending the data to GStreamer, you need to de-packetize it first. That is why demuxing with libav and decoding to an AVFrame works: the decode step de-packetizes the stream into complete frame data, which the GStreamer pipeline can then consume. In your failing case, the theoradec error ("no header sent yet") indicates the decoder only ever received data packets and never the Theora header packets it needs first.
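That said, if you would rather keep pushing compressed AVPackets into appsrc, theoradec needs the three Theora header packets before any data packet. FFmpeg stores those headers in the stream's `codecpar->extradata` using Xiph-style lacing (FFmpeg has a similar private helper, `avpriv_split_xiph_headers()`, for exactly this). Below is a minimal, self-contained sketch of that split; the function name `split_xiph_headers` is my own, not a library API, and I have not run it against your file:

```c
#include <stddef.h>
#include <stdint.h>

/* Split Xiph-style codec extradata (as FFmpeg produces for Theora/Vorbis)
 * into its three header packets.
 *
 * Assumed layout: byte 0 is the number of header packets minus one (2 for
 * Theora's 3 headers); then the sizes of the first two packets, each encoded
 * as a run of bytes that are summed until a byte < 255 is seen; the rest of
 * the buffer, after the two laced packets, is the third packet.
 *
 * Returns 0 on success (pkt[i]/pkt_size[i] filled in), -1 on malformed data. */
static int split_xiph_headers(const uint8_t *extradata, size_t size,
                              const uint8_t *pkt[3], size_t pkt_size[3])
{
    if (size < 3 || extradata[0] != 2)
        return -1;

    size_t pos = 1, total = 0;
    for (int i = 0; i < 2; i++) {
        size_t len = 0;
        uint8_t b;
        do {                          /* 255-terminated lacing: sum bytes */
            if (pos >= size)
                return -1;
            b = extradata[pos++];
            len += b;
        } while (b == 255);
        pkt_size[i] = len;
        total += len;
    }
    if (pos + total > size)           /* laced sizes must fit in the buffer */
        return -1;

    pkt[0] = extradata + pos;
    pkt[1] = pkt[0] + pkt_size[0];
    pkt[2] = pkt[1] + pkt_size[1];
    pkt_size[2] = size - pos - total; /* last packet is the remainder */
    return 0;
}
```

In a real program you would wrap each of the three resulting packets in a GstBuffer and push them through `gst_app_src_push_buffer()` before the first data packet (or place them in the caps' `streamheader` field), so theoradec sees the headers first.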
On Mon, 4 Jan 2021 at 17:45 fre deric <[email protected]> wrote:
> My goal is to demux video stream in libav and then decode it in GStreamer
> pipeline.
>
> My approach is to take AVPacket from the video stream in the first thread
> and send it to GStreamer pipeline in the second thread. Important parts of
> code are here:
>
> // -- THREAD 1 --
> // Take data from AVPacket
> img_data = (guchar *)packet.data;
> size = packet.size;
> // Create GStreamer buffer
> buffer = gst_buffer_new_allocate(NULL, size, NULL);
> gst_buffer_map(buffer, &map, GST_MAP_WRITE);
> memcpy((guchar *)map.data, img_data, gst_buffer_get_size(buffer));
> map.size = size;
> gst_buffer_unmap(buffer, &map);
> // Send the buffer to appsrc element in the pipeline.
> gstret = gst_app_src_push_buffer((GstAppSrc *)app_source, buffer);
>
> // -- THREAD 2 --
> // A video cap for appsrc element
> const gchar *video_caps = "video/x-theora, width=1920, height=1080,
> framerate=30/1";
> // GStreamer pipeline
> string = g_strdup_printf("appsrc name=testsource caps=\"%s\" ! theoradec !
> videoconvert ! autovideosink", video_caps);
>
> However, I am getting following error in GStreamer pipeline:
> "
> ERROR from element theoradec0: Could not decode stream.
> Debugging info: gsttheoradec.c(812): theora_handle_data_packet ():
> /GstPipeline:pipeline0/GstTheoraDec:theoradec0: no header sent yet
> "
>
> I also tested a version, when the AVPacket was decoded to AVFrame by libav
> and then sent to the gstreamer pipeline and it WORKS:
>
> -- THREAD 1 --
> // Decode.
> avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
> // Take data from AVFrame
> img_data = (guchar *)pFrame->data
> size = av_image_get_buffer_size(AV_PIX_FMT_BGR24, 1920, 1080, 1);
> // same way as before.
>
> -- THREAD 2 --
> const gchar *video_caps = "video/x-raw, format=BGR, width=1920,
> height=1080, framerate=30/1";
> string = g_strdup_printf("appsrc name=testsource caps=\"%s\" ! videoconvert
> ! autovideosink", video_caps);
>
> All this is tested on this video file:
> container: ogg
> codec: Theora
> dim: 1920x1080
> framerate: 30fps
>
> Why does sending the AVFrame to GStreamer pipeline work, but not the
> AVPacket?
>
> --
> Sent from: http://libav-users.943685.n4.nabble.com/
> _______________________________________________
> Libav-user mailing list
> [email protected]
> https://ffmpeg.org/mailman/listinfo/libav-user
>
> To unsubscribe, visit link above, or email
> [email protected] with subject "unsubscribe".
