Hi,
If you want to render into your own textures, then vaapisink is not the option ;)
You have to consume GstVideoGLTextureUploadMeta in your renderer.
Either use upstream "glimagesink" or use clutter-gst.
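In application code, the meta-based flow could look roughly like the sketch below. This is a minimal illustration, not a complete renderer: the function name upload_frame and the texture id my_texture_id are made-up placeholders, the appsink is assumed to have negotiated caps like "video/x-raw(meta:GstVideoGLTextureUploadMeta), format=RGBA", and a GL context must be current on the calling thread.

```c
/* Sketch: pulling a VA-API decoded frame into an application-owned GL
 * texture via GstVideoGLTextureUploadMeta.  Assumes a pipeline such as
 *   filesrc ! qtdemux ! vaapidecode ! appsink
 * where the appsink caps advertise meta:GstVideoGLTextureUploadMeta. */
#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <gst/video/gstvideometa.h>

/* my_texture_id is a hypothetical texture the application created
 * earlier with glGenTextures(); the upload meta writes into it. */
static gboolean
upload_frame (GstAppSink *sink, guint my_texture_id)
{
  GstSample *sample = gst_app_sink_pull_sample (sink);
  if (sample == NULL)
    return FALSE;                       /* EOS or pipeline error */

  GstBuffer *buffer = gst_sample_get_buffer (sample);
  GstVideoGLTextureUploadMeta *meta =
      gst_buffer_get_video_gl_texture_upload_meta (buffer);

  gboolean ok = FALSE;
  if (meta != NULL) {
    /* Up to four texture ids (one per plane); RGBA needs only one. */
    guint ids[4] = { my_texture_id, 0, 0, 0 };
    ok = gst_video_gl_texture_upload_meta_upload (meta, ids);
  }

  gst_sample_unref (sample);
  return ok;
}
```

After a successful upload the application can bind my_texture_id and draw it into its own canvas as with any other texture; the decoded surface never needs to be copied back through system memory.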
On 29.09.2015 15:38, Dolevo Jay wrote:
Hi,
In my device, I receive video streams from different sources (local or
remote). After decoding them, I show them on the display using OpenGL.
So far, all decoding was done in software: I received RGB frames from
each source and uploaded them to textures for rendering. The sources
are now decoded in hardware using gstreamer-vaapi. An example pipeline
is: gst-launch-1.0 filesrc location=/store/1.mp4 ! qtdemux !
vaapidecode ! vaapisink display=2
This works great. However, as you might imagine, vaapisink creates its
own window and draws the decoded frames onto it. What I would like to
do is feed the textures created in my application to the vaapidecode
or vaapisink element so that the rendering happens in my canvas. I
have been digging into the vaapidecode and vaapisink elements to see
where the textures are uploaded, but couldn't spot the exact line to
feed my texture info into. Could anyone help me? A function name, a
line number, or any hint would be greatly appreciated.
Thanks,
_______________________________________________
Libva mailing list
Libva@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/libva
--
Thanks
Sree