If that's the case, yes, I can confirm the NV driver does not support
rendering with remote X servers using EGL, with or without indirect GLX
support enabled in said server, and yes, EGLDevice will work just fine
in that situation for offscreen rendering if you're trying to use the
local GPU.
Thanks,
-James
On 3/13/23 18:27, Adam Jackson wrote:
12290 is indeed EGL_BAD_ACCESS, and it's pretty much impossible for
Mesa's eglInitialize to return that, so (if I had to guess) you have
nvidia's driver installed as well, and (extra guessing now) nvidia's EGL
only works with connections to the local machine and not over the
network. Mesa shouldn't have that problem because it would select
llvmpipe instead of a native driver in that scenario, I think.
If "render to png" really is what you're trying to accomplish you might
do better to use EGL_EXT_platform_device to get a direct connection to
the GPU without involving a display server.
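For reference, a minimal sketch of the device-platform path ajax describes. It assumes the EGL_EXT_device_enumeration and EGL_EXT_platform_device extensions are available (they are on the NVIDIA driver and recent Mesa); the entry points are loaded via eglGetProcAddress since they are not core EGL, and error handling is abbreviated.

```cpp
// Sketch: initialize EGL directly on a GPU device, with no X server.
// Assumes EGL_EXT_device_enumeration + EGL_EXT_platform_device.
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <cstdio>
#include <cstdlib>

int main() {
    // These extension functions must be loaded dynamically.
    auto queryDevices = reinterpret_cast<PFNEGLQUERYDEVICESEXTPROC>(
        eglGetProcAddress("eglQueryDevicesEXT"));
    auto getPlatformDisplay = reinterpret_cast<PFNEGLGETPLATFORMDISPLAYEXTPROC>(
        eglGetProcAddress("eglGetPlatformDisplayEXT"));
    if (!queryDevices || !getPlatformDisplay) {
        std::fprintf(stderr, "EGL device extensions unavailable\n");
        return EXIT_FAILURE;
    }

    // Enumerate GPUs directly, bypassing DISPLAY / any X11 connection.
    EGLDeviceEXT devices[8];
    EGLint numDevices = 0;
    if (queryDevices(8, devices, &numDevices) == EGL_FALSE || numDevices == 0)
        return EXIT_FAILURE;

    // Open a display on the first device instead of EGL_DEFAULT_DISPLAY.
    EGLDisplay dpy = getPlatformDisplay(EGL_PLATFORM_DEVICE_EXT,
                                        devices[0], nullptr);
    EGLint major = 0, minor = 0;
    if (dpy == EGL_NO_DISPLAY || eglInitialize(dpy, &major, &minor) == EGL_FALSE)
        return EXIT_FAILURE;

    std::printf("EGL %d.%d initialized on device 0 of %d\n",
                major, minor, numDevices);
    eglTerminate(dpy);
    return EXIT_SUCCESS;
}
```

From here a pbuffer surface (or EGL_KHR_surfaceless_context) gives an offscreen render target, exactly the "render to png" case above.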
- ajax
On Tue, Mar 7, 2023 at 8:17 PM Richard Haney <compsci2...@gmail.com> wrote:
Please help,
I have been going around and around with this problem but cannot
seem to make any headway. I hope that one of you OpenGL EGL experts
can help. :slight_smile:
I have created a program that uses OpenGL EGL (version 1.5) with
OpenGL 3 that successfully renders an offscreen triangle and saves
it to an image file (PNG) when I ssh *without* X11 forwarding on my
Linux (Ubuntu 22.04) machine.
However, when I try the same thing using ssh *with* X11 forwarding
enabled, I get the following EGL error when I call eglInitialize(…):
12290 (which I *think* is EGL_BAD_ACCESS).
This seems really weird and I hope it is something simple that I am
just not currently seeing.
I really like using OpenGL with EGL but need a way to remedy this
situation if possible. Is there a way for EGL to determine if X11
forwarding is being employed and to ignore it or some other solution?
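One crude workaround sketch for that last question, under the assumption that the driver picks a non-X11 default platform when DISPLAY is unset: since EGL itself has no notion of X11 forwarding, a purely offscreen program can simply clear DISPLAY before asking for the default display. Whether the resulting fallback is usable is driver-specific, so the device-platform approach in the replies is more robust; the helper name below is hypothetical.

```cpp
// Hypothetical helper: ignore a forwarded X server for offscreen work
// by clearing DISPLAY before eglGetDisplay. Driver-specific behavior.
#include <cstdlib>
#include <EGL/egl.h>

EGLDisplay get_offscreen_display() {
    if (std::getenv("DISPLAY") != nullptr)
        ::unsetenv("DISPLAY");  // POSIX; drop the X11 connection hint
    return eglGetDisplay(EGL_DEFAULT_DISPLAY);
}
```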
The snippet of relevant C++ code follows, with the area where the
error occurs marked:

#include <iostream>
#include <cstdlib>
#include <EGL/egl.h>
#define EGL_EGLEXT_PROTOTYPES
#define GL_GLEXT_PROTOTYPES
#include <EGL/eglext.h>
#include <GL/gl.h>
...
EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
if (display == EGL_NO_DISPLAY) {
    std::cerr << "Failed to get EGL display: " << eglGetError() << std::endl;
    exit(EXIT_FAILURE);
}
EGLint major;
EGLint minor;
if (eglInitialize(display, &major, &minor) == EGL_FALSE) {
    // ERROR 12290 is generated here
    std::cerr << "Failed to initialize EGL: " << eglGetError() << std::endl;
    exit(EXIT_FAILURE);
}
...
Any help would be greatly appreciated.