Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-23 Thread DRC
On 12/23/11 10:46 AM, Arthur Huillet wrote:
> As far as I know they still draw to visible windows. (I wonder why...
> is there any technical reason to do so?)
>
> So I guess the case you describe is still applicable, however I know
> for having experimented it that "vglrun pvserver" does work.

D

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-23 Thread Arthur Huillet
On 23.12.2011 16:59, DRC wrote:
> On 12/23/11 9:24 AM, Arthur Huillet wrote:
>> Doesn't "vglrun pvserver" work out of the box however?
>> Why is this technique needed, then?
>
> Back when we were working with it, which was years ago, the subrenderers
> were using windows, so we wanted to run eac

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-23 Thread Stefan Eilemann
On 23. Dec 2011, at 15:59, DRC wrote:
> I still am not understanding why you would need the app to behave
> differently if VirtualGL is present. If it's using Pbuffers/FBO's for
> the subrenderers, then where/why is the "extra readback" occurring?

My server is a 3-GPU system. When running with

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-23 Thread DRC
On 12/23/11 9:24 AM, Arthur Huillet wrote:
> Doesn't "vglrun pvserver" work out of the box however?
> Why is this technique needed, then?

Back when we were working with it, which was years ago, the subrenderers were using windows, so we wanted to run each subrenderer in VirtualGL to redirect the r

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-23 Thread Arthur Huillet
On 22.12.2011 17:51, DRC wrote:
> This is a common technique when using VirtualGL with Chromium and
> ParaView, although in those cases, the rendering is split up among
> multiple processes, so VGL_READBACK=0 is set in the launch scripts for
> the sub-renderers rather than being set in the main

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-23 Thread DRC
On 12/23/11 2:43 AM, Stefan Eilemann wrote:
>> Why not just have your application putenv("VGL_READBACK=0")? That
>> prevents VirtualGL from reading back and sending frames, if you have
>> other means of doing that. You could, for instance, wrap a
>> glXSwapBuffers() call around which you don't wa

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-23 Thread Stefan Eilemann
On 22. Dec 2011, at 17:51, DRC wrote:
>> Setup is a 3-GPU 'VGL-served' machine. With VGL present, I want:
>>
>> - two offscreen renderers using :0.1 and :0.2 contributing to:
>> - one on-screen renderer rendering and assembling :0.1 and :0.2 using the
>> forwarded DISPLAY (:10.0, redirected to

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-23 Thread Stefan Eilemann
Thanks, guys, for your detailed insight. I'll be going with the GLX_VENDOR route, since I have temporary display connections anyway to probe for available GPUs.

Cheers, Stefan.
-- http://www.eyescale.ch http://www.equalizergraphics.com http://www.linkedin.com/in/eilemann --

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread Nathan Kidd
On 11-12-22 01:24 PM, DRC wrote:
> On 12/22/11 11:26 AM, Nathan Kidd wrote:
>> Trivia: I don't know why one would do this, but the way VGL hooks
>> glXGetClientString, and _isremote is implemented, you technically could
>> get away with passing in a bogus DPY, (though the API docs don't
>> explicit

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread DRC
On 12/22/11 11:26 AM, Nathan Kidd wrote:
> Trivia: I don't know why one would do this, but the way VGL hooks
> glXGetClientString, and _isremote is implemented, you technically could
> get away with passing in a bogus DPY, (though the API docs don't
> explicitly say that Xlib won't fall back to

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread Nathan Kidd
On 11-12-22 11:54 AM, DRC wrote:
> On 12/22/11 9:56 AM, Nathan Kidd wrote:
>> On 11-12-22 09:36 AM, Stefan Eilemann wrote:
>>> I need to detect within a C++ program whether or not I'm running under
>>> VirtualGL, to enable certain optimizations. What is the recommended way to
>>> do this?
>>
>> W

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread DRC
On 12/22/11 10:01 AM, Stefan Eilemann wrote:
>> If you need to know of VGL's existence prior to that, you'd have to look
>> at the value of the LD_PRELOAD environment variable and see if it
>> contains "librrfaker.so". [...]
>
> That sounds like what I want - thanks!

Well, just be aware that it i

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread DRC
On 12/22/11 9:56 AM, Nathan Kidd wrote:
> On 11-12-22 09:36 AM, Stefan Eilemann wrote:
>> I need to detect within a C++ program whether or not I'm running under
>> VirtualGL, to enable certain optimizations. What is the recommended way to
>> do this?
>
> When VGL is running it overrides the GLX

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread Stefan Eilemann
Hi,

On 22. Dec 2011, at 16:52, DRC wrote:
> It's not documented, but you could leverage the autotest mechanism [...]

That wouldn't work, because I need this information to build my (multi-GPU) configuration.

> If you need to know of VGL's existence prior to that, you'd have to look
> at the va

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread Nathan Kidd
On 11-12-22 09:36 AM, Stefan Eilemann wrote:
> I need to detect within a C++ program whether or not I'm running under
> VirtualGL, to enable certain optimizations. What is the recommended way to do
> this?

When VGL is running it overrides the GLX vendor strings with "VirtualGL". Try glXGetClie

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread DRC
On 12/22/11 9:02 AM, Stefan Eilemann wrote:
> I need to get this from within a C++ program, i.e.:
>
> if( getenv( "VGL_RUNNING" ) != 0 )
> doVglSpecificStuff();
>
> Unless I am missing something, having an overlay logo doesn't help me.

It's not documented, but you could leverage the auto

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread Stefan Eilemann
On 22. Dec 2011, at 15:58, Shanon Loughton wrote:
> From one of my discussions with DRC:
>
> An easy way to verify whether VirtualGL is engaged is to set VGL_LOGO=1
> in the environment. This will display a "VGL" logo in the bottom right
> of the OpenGL rendering area if VGL is active.

I nee

Re: [VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread Shanon Loughton
From one of my discussions with DRC:

An easy way to verify whether VirtualGL is engaged is to set VGL_LOGO=1 in the environment. This will display a "VGL" logo in the bottom right of the OpenGL rendering area if VGL is active.

HTH
Shanon

On 22 December 2011 22:36, Stefan Eilemann wrote:
> He

[VirtualGL-Devel] Detecting when running under VirtualGL

2011-12-22 Thread Stefan Eilemann
Hello,

I need to detect within a C++ program whether or not I'm running under VirtualGL, to enable certain optimizations. What is the recommended way to do this?

Cheers, Stefan.
-- http://www.eyescale.ch http://www.equalizergraphics.com http://www.linkedin.com/in/eilemann --