Hi,
On my device, I receive video streams from different sources (local or remote). 
After decoding them, I show them on the display using OpenGL. So far, all 
decoding has been done in software: I received RGB frames from each source and 
uploaded them to dedicated textures for rendering. The sources are now decoded 
in hardware using gstreamer-vaapi. An example gst line is as follows:
gst-launch-1.0 filesrc location=/store/1.mp4 ! qtdemux ! vaapidecode ! vaapisink display=2

This works great. However, as you might imagine, vaapisink creates its own 
window and draws the decoded frames onto it. What I would like to do instead is 
feed the textures I created in my application into the vaapidecode or vaapisink 
element, so that the rendering can happen on my own canvas.
I have been digging into the vaapidecode and vaapisink sources to see where 
the textures are uploaded, but I couldn't spot the exact place to feed my 
texture info into. Could anyone help me? A function name, a line number, or 
any hint would be greatly appreciated.
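To illustrate what I am after, something along these lines is what I imagine, 
though I do not know whether it is the right approach (appsink here is just a 
guess on my part; I have not got this working):

```shell
# Sketch only: instead of vaapisink rendering into its own window, pull the
# decoded frames into my application and upload them to my existing textures.
# Whether appsink can receive hardware-decoded frames this way is exactly
# what I am unsure about.
gst-launch-1.0 filesrc location=/store/1.mp4 ! qtdemux ! vaapidecode ! \
    appsink name=mysink
```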
Thanks,
                                          
_______________________________________________
Libva mailing list
Libva@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/libva
