Hi Cyril,

I pushed your patch to libva git. However, there is still a GPU-to-GPU copy inside vaCreateSurfaceGLX(). You can instead export the VA surface handle directly, create an EGL image from it, and use that image as a texture. You can refer to yamidecode in https://github.com/01org/libyami.git.

Thanks,
Haihao
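[For readers of the archive: a minimal sketch of the zero-copy path Haihao suggests. It assumes a circa-2016 libva with vaDeriveImage()/vaAcquireBufferHandle(), EGL_EXT_image_dma_buf_import and GL_OES_EGL_image available, and a single R8 plane (e.g. the luma plane of an NV12 surface) to keep it short; error handling is omitted, and the helper name import_va_surface is made up for illustration. This is not libyami's actual code.]

    /*
     * Export the decoded VASurface as a dma-buf and wrap it in an
     * EGLImage, instead of going through vaCreateSurfaceGLX()'s
     * internal copy.
     */
    #include <unistd.h>
    #include <va/va.h>
    #include <va/va_drmcommon.h>     /* VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME */
    #include <drm_fourcc.h>          /* libdrm: DRM_FORMAT_R8 */
    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    GLuint import_va_surface(VADisplay va_dpy, VASurfaceID surface,
                             EGLDisplay egl_dpy, int width, int height)
    {
        /* Extension entry points must be resolved at runtime. */
        PFNEGLCREATEIMAGEKHRPROC create_image =
            (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
        PFNGLEGLIMAGETARGETTEXTURE2DOESPROC image_to_texture =
            (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
                eglGetProcAddress("glEGLImageTargetTexture2DOES");

        /* Derive a VAImage from the surface, then export its backing
         * buffer as a DRM PRIME (dma-buf) file descriptor. */
        VAImage image;
        vaDeriveImage(va_dpy, surface, &image);

        VABufferInfo info = { .mem_type = VA_SURFACE_ATTRIB_MEM_TYPE_DRM_PRIME };
        vaAcquireBufferHandle(va_dpy, image.buf, &info);

        /* Wrap plane 0 in an EGLImage; the fourcc, offset and pitch must
         * match the actual layout reported in 'image'. */
        const EGLint attribs[] = {
            EGL_WIDTH,  width,
            EGL_HEIGHT, height,
            EGL_LINUX_DRM_FOURCC_EXT,      DRM_FORMAT_R8,
            EGL_DMA_BUF_PLANE0_FD_EXT,     (EGLint)info.handle,
            EGL_DMA_BUF_PLANE0_OFFSET_EXT, (EGLint)image.offsets[0],
            EGL_DMA_BUF_PLANE0_PITCH_EXT,  (EGLint)image.pitches[0],
            EGL_NONE
        };
        EGLImageKHR egl_image = create_image(egl_dpy, EGL_NO_CONTEXT,
                                             EGL_LINUX_DMA_BUF_EXT, NULL,
                                             attribs);

        /* Bind the EGLImage as the storage of a GL texture: no copy, the
         * texture samples the decoder's memory directly. */
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        image_to_texture(GL_TEXTURE_2D, (GLeglImageOES)egl_image);

        /* EGL holds its own reference to the dma-buf, so the fd and the
         * VA-side handles can be released right away. */
        close((int)info.handle);
        vaReleaseBufferHandle(va_dpy, image.buf);
        vaDestroyImage(va_dpy, image.image_id);
        return tex;
    }

[The point of going through EGL rather than GLX is that EGL can import a dma-buf directly, so the texture aliases the decoder's output instead of receiving the blit that vaCreateSurfaceGLX() performs internally.]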
Hi Sean,

I just created the bug. I'll keep an eye on it.

Thanks,
Cyril

On 3/1/2016 3:30 PM, Sean V Kelley wrote:

Hi Cyril,

Could you file a bug at freedesktop.org? You can attach your patch there too, along with details on the issue.

https://bugs.freedesktop.org/enter_bug.cgi?product=libva

Thanks,
Sean

On Tue, 2016-03-01 at 09:47 -0800, Cyril Drouet wrote:

Hello,

I successfully implemented hardware decoding with VAAPI via FFmpeg by copying the data back to CPU memory. However, when I tried to use the data directly on the GPU (instead of copying it back) by using VA/GLX to convert the decoded VASurfaces to OpenGL textures, I ran into some issues.

With the latest version of libva, vaCreateSurfaceGLX fails every time I set OpenGL to 3.1 or above; if I set it to 3.0, it doesn't fail and everything works correctly. I downloaded the libva sources: the call fails while checking for GL extensions because it uses glGetString(GL_EXTENSIONS), which is deprecated (and removed from 3.1+ core contexts). Is that something you can fix so that it works in the latest version? I have implemented a fix on my end (which I can send if you'd like), so it is not a big issue, but I'd rather use the official version.

Thanks,
Cyril
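[For readers of the archive: the fix Cyril alludes to generally looks like the sketch below. On a 3.1+ core context, glGetString(GL_EXTENSIONS) returns NULL and raises GL_INVALID_ENUM, so extensions have to be enumerated with glGetStringi() instead. This illustrates the idea only; it is not the patch attached to the bug, and it assumes a loader such as libepoxy provides the GL 3.x prototypes.]

    /*
     * Core-profile-safe GL extension check. glGetString(GL_EXTENSIONS)
     * was deprecated in OpenGL 3.0 and removed in 3.1 core contexts;
     * GL 3.0+ exposes GL_NUM_EXTENSIONS and glGetStringi() instead.
     */
    #include <string.h>
    #include <epoxy/gl.h>   /* assumed loader; resolves glGetStringi() */

    static int gl_has_extension(const char *name)
    {
        GLint i, n = 0;

        glGetIntegerv(GL_NUM_EXTENSIONS, &n);
        for (i = 0; i < n; i++) {
            const char *ext =
                (const char *)glGetStringi(GL_EXTENSIONS, (GLuint)i);
            if (ext && strcmp(ext, name) == 0)
                return 1;
        }

        /* Legacy (pre-3.0/compatibility) contexts leave n at 0: fall
         * back to the old space-separated string. The substring match
         * is kept simple here; a real patch should match whole tokens. */
        if (n == 0) {
            const char *all = (const char *)glGetString(GL_EXTENSIONS);
            return all && strstr(all, name) != NULL;
        }
        return 0;
    }

[libva's GLX setup would then call gl_has_extension() for each extension it needs, in place of running strstr() over the full glGetString(GL_EXTENSIONS) string.]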
