The __GLX_VENDOR_LIBRARY_NAME variable is enough to force libGLX.so to use a specific driver instead of whatever name the X server sends back. Whether the DRI_PRIME variable would be needed depends on the driver that you give it.

For EGL, when the app calls eglGetPlatformDisplay or eglGetDisplay(EGL_DEFAULT_DISPLAY), libglvnd will just try each driver in the order they're listed until it finds one that works. You could select between two drivers based on an environment variable like DRI_PRIME simply by having one driver or the other succeed.
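
For example, an app can see which vendor libglvnd ended up selecting just by
initializing the default display and querying the vendor string (a minimal
sketch, link with -lEGL):

#include <EGL/egl.h>
#include <stdio.h>

int main(void)
{
    /* libglvnd tries each installed vendor until one can provide a display. */
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    EGLint major, minor;

    if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, &major, &minor)) {
        fprintf(stderr, "no usable EGL vendor found\n");
        return 1;
    }

    printf("EGL %d.%d, vendor: %s\n", major, minor,
           eglQueryString(dpy, EGL_VENDOR));
    eglTerminate(dpy);
    return 0;
}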

Setting EGL_PLATFORM only affects a call to eglGetDisplay with a non-NULL native display, where it has to guess which platform to use. In a case where you know you're using GBM, it's better to just use eglGetPlatformDisplay.
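
For GBM that would look roughly like this (just a sketch: the render node path
is only an example, and this uses EGL 1.5's eglGetPlatformDisplay; older stacks
would use eglGetPlatformDisplayEXT instead):

#include <EGL/egl.h>
#include <EGL/eglext.h>   /* EGL_PLATFORM_GBM_KHR */
#include <gbm.h>
#include <fcntl.h>
#include <unistd.h>

EGLDisplay open_gbm_display(void)
{
    int fd = open("/dev/dri/renderD128", O_RDWR);   /* example node */
    if (fd < 0)
        return EGL_NO_DISPLAY;

    struct gbm_device *gbm = gbm_create_device(fd);
    if (!gbm) {
        close(fd);
        return EGL_NO_DISPLAY;
    }

    /* The platform is stated explicitly, so nothing has to guess. */
    EGLDisplay dpy = eglGetPlatformDisplay(EGL_PLATFORM_GBM_KHR, gbm, NULL);

    EGLint major, minor;
    if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, &major, &minor))
        return EGL_NO_DISPLAY;
    return dpy;
}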

-Kyle

On 12/27/2016 08:26 PM, Yu, Qiang wrote:
So it would be used like this, per application?
DRI_PRIME=1 __GLX_VENDOR_LIBRARY_NAME=xxx glxgears
DRI_PRIME=1 EGL_PLATFORM=xxx es2gears

Another problem: if two EGL vendors can both be used, how do I
select which one to use within a single application? For example, the xserver
might load two DDXes for two GPUs: the modesetting DDX uses Mesa EGL,
and the amdgpu DDX uses amdgpu-pro EGL (it can use Mesa too).
The interface is the same (both are initialized from a gbm fd).
Which one is the default?

Would this work? In the amdgpu DDX code, temporarily set EGL_PLATFORM=amdgpu-pro
during init and unset it when init is done.
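
Roughly like this, I mean (just a sketch; whether "amdgpu-pro" is a value the
EGL library actually honors for EGL_PLATFORM is exactly the open question):

#include <EGL/egl.h>
#include <gbm.h>
#include <stdlib.h>

EGLDisplay ddx_init_egl(struct gbm_device *gbm)
{
    /* Temporary, process-wide override just for this one call
     * (and not thread-safe, which may already be a problem). */
    setenv("EGL_PLATFORM", "amdgpu-pro", 1);
    EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType)gbm);
    unsetenv("EGL_PLATFORM");

    EGLint major, minor;
    if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, &major, &minor))
        return EGL_NO_DISPLAY;
    return dpy;
}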

Regards,
Qiang
________________________________________
From: Kyle Brenneman <kbrenne...@nvidia.com>
Sent: Wednesday, December 28, 2016 10:18:13 AM
To: Yu, Qiang; Adam Jackson; Emil Velikov; Michel Dänzer
Cc: ML xorg-devel
Subject: Re: Xorg glx module: GLVND, EGL, or ... ?

GLVND doesn't respond to DRI_PRIME (and probably shouldn't, since that's
very driver-specific), but it has an environment variable that you can
use to override which vendor library it selects.

That's entirely on the client side, so whatever driver you tell it to use
still needs to be able to talk to the server.

-Kyle

On 12/27/2016 07:06 PM, Yu, Qiang wrote:
Yes, Mesa can handle DRI_PRIME on its own. But my use case is:
1. PRIME GPU (iGPU) uses Mesa libGL
2. Secondary GPU (dGPU) uses a closed-source libGL

If this can be done, we can do dynamic GPU offload on hybrid GPU platforms;
currently we have to switch between GPUs statically (by changing xorg.conf).

With DRI2, the secondary GPU has a GPUScreen on the xserver side which could
be used to obtain the vendor info (although that isn't implemented). But with DRI3,
the client just does the offload when DRI_PRIME=1 is set, without informing the xserver.

The only method I can think of is a config file for GLVND which records the
secondary GPU's vendor to use when DRI_PRIME is set, like:
<pci id> <vendor>
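
For example (hypothetical path and entries, just to show the idea):

# /etc/glvnd/prime.conf (made-up path)
# <pci id>    <vendor>
8086:1912     mesa
1002:67df     amdgpu-pro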

What's your opinion?

Regards,
Qiang
________________________________________
From: Kyle Brenneman <kbrenne...@nvidia.com>
Sent: Wednesday, December 28, 2016 1:05:50 AM
To: Yu, Qiang; Adam Jackson; Emil Velikov; Michel Dänzer
Cc: ML xorg-devel
Subject: Re: Xorg glx module: GLVND, EGL, or ... ?

Is DRI_PRIME handled within Mesa?

If so, then no support from GLVND is needed. The GLVND libraries would
simply dispatch any function calls to Mesa, which in turn would handle
those calls the same way it would in a non-GLVND system.

-Kyle

On 12/23/2016 07:31 PM, Yu, Qiang wrote:
Hi guys,

Does GLVND support DRI_PRIME=1? If the secondary GPU uses a different
libGL than the primary GPU, how does GLVND get the vendor to use?

Regards,
Qiang
________________________________________
From: Adam Jackson <a...@redhat.com>
Sent: Saturday, December 17, 2016 6:02:18 AM
To: Emil Velikov; Michel Dänzer
Cc: Kyle Brenneman; Yu, Qiang; ML xorg-devel
Subject: Re: Xorg glx module: GLVND, EGL, or ... ?

On Thu, 2016-12-15 at 16:08 +0000, Emil Velikov wrote:

Example:
What would happen if one calls glXMakeCurrent, which internally goes down
to eglMakeCurrent? Are we going to clash, since (IIRC) one is not
allowed to do both on the same GL context?
No, for the same reason this already isn't a problem. If you
glXMakeCurrent an indirect context, the X server does not itself call
glXMakeCurrent. All it does is record the client's binding. Only when
we go to do actual indirect rendering (or mutate context state) does
libglx actually make that context "current". That context is a tuple of
the protocol state and a DRI driver context; it could just as easily be
an EGL context instead of DRI.
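
In rough pseudo-C, the flow is something like this (illustrative only; the
names are hypothetical, not the actual libglx code):

/* MakeCurrent from the client just records the binding; the backing
 * context (a DRI driver context today, an EGL context just as easily)
 * is only made current when indirect rendering or a state change
 * actually has to run. */
struct server_glx_context {
    int client_bound;     /* protocol-side record of the binding */
    void *backend_ctx;    /* DRI driver context, or an EGL context */
};

static void backend_make_current(void *ctx)
{
    (void)ctx;            /* stands in for the DRI or EGL MakeCurrent call */
}

static void handle_MakeCurrentRequest(struct server_glx_context *cx)
{
    cx->client_bound = 1; /* just record it; nothing is made current yet */
}

static void handle_RenderRequest(struct server_glx_context *cx)
{
    backend_make_current(cx->backend_ctx);   /* deferred until needed */
    /* ...decode and execute the indirect rendering commands... */
}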

- ajax

_______________________________________________
xorg-devel@lists.x.org: X.Org development
Archives: http://lists.x.org/archives/xorg-devel
Info: https://lists.x.org/mailman/listinfo/xorg-devel
