In all of my recent testing with only a single X server, the 3D server == 2D
server with only VGL in between. The DISPLAY variable is set to :0.0 (and VGL
is rendering to :0.0), with only Unix sockets + loopback in use (for the VGL
SSH connection), and the crash still occurs.
Since the environment
On 11-03-02 05:14 PM, DRC wrote:
Bingo.
Apparently nVidia is trying to outsmart us yet again and is somehow
making a determination in the underlying GLX library that it needs to
send glXSwapIntervalSGI() over the wire rather than to the display on
which the GLX context has been established.
On 3/3/11 11:21 AM, Nathan Kidd wrote:
Another approach to dealing with this would be to remove
GLX_SGI_swap_control from the extension list. At least that way,
well-behaved apps would *know* they can't control the update speed.
(Although in wine's case it will simply warn and carry on, so the