I agree the 3D X server is always running.  That was why I was surprised
that when logging in remotely I don't get the 3D X server functionality.
I only used the -d option to hijack the 3D X server functionality from
another display.  What I can't figure out is why the 3D X server works
when you log in directly to the machine, but doesn't when you log in
remotely with TurboVNC.
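
In case it helps, this is roughly how I have been checking whether the
3D X server is reachable (assuming it is on :0):

$ xdpyinfo -display :0 >/dev/null 2>&1 && echo reachable || echo not reachable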


On Thu, Sep 4, 2014 at 12:17 PM, DRC <dcomman...@users.sourceforge.net>
wrote:

> On 9/4/14 10:11 AM, Ladislaus Dombrowski wrote:
> > I am running TurboVNC and VirtualGL (v2.3.3) on two Linux machines
> > running Ubuntu 10.04.  I start up a TurboVNC server on the remote
> > machine.  I connect to that server with a TurboVNC viewer.  When I run
> > 'vglrun glxgears' I get:
> >
> > x$ vglrun glxgears
> > Running synchronized to the vertical refresh.  The framerate should be
> > approximately the same as the monitor refresh rate.
> > [VGL] ERROR: OpenGL error 0x0502
> > [VGL] ERROR: in readpixels---
> > [VGL]    439: Could not read pixels
> >
> >
> > If I log in to the remote machine and then connect to that display
> > from the client machine with 'vglrun -d :x.0 glxgears', everything
> > runs fine.  As soon as I log out of the remote machine, things stop
> > working again.  I've been looking for a couple of weeks but can't
> > find where the X server is connected to the TurboVNC viewer's
> > DISPLAY.  Any ideas?
>
> Let's back up, because it seems like you may be misunderstanding how VGL
> is supposed to work.  The -d option to vglrun specifies the 3D X server.
> That is the X server on which the 3D rendering will take place.  It
> defaults to :0, because the idea of VirtualGL is to perform 3D rendering
> on the server and send only 2D images to the client or to the X proxy.
> You should never have to use the -d option to vglrun unless your server
> has multiple GPUs, each connected to a separate X display, and you are
> trying to load balance VirtualGL across the GPUs.  This is not a common
> situation.  For most people, the server will have one display-- :0.0--
> and you will run VirtualGL with its default options.  (This is all
> explained in the User's Guide, BTW.)
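>
> To make that concrete (illustration only-- the screen numbers and
> application names here are hypothetical), load balancing across two
> GPUs attached to screens :0.0 and :0.1 would look something like:
>
> $ vglrun -d :0.0 my3dapp &
> $ vglrun -d :0.1 another3dapp &
>
> Again, with a single GPU on :0, you never need any of this.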
>
> To reiterate-- the 3D X server should always be:
> (a) on the same machine as the 3D application you are running
> (b) hardware-accelerated (nVidia or ATI GPU recommended.  Anything else
> is not going to be well-tested.)
> (c) accessible at all times to members of the vglusers group (this can
> be configured with vglserver_config; see the check below.)
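>
> A quick way to verify (c) from a shell ('myuser' is just a placeholder):
>
> $ groups myuser                       # should include vglusers
> $ sudo usermod -a -G vglusers myuser  # add them if not (re-login needed)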
>
> The 2D X server is the X display where the rendered images will end up.
> If you are using TurboVNC, then this is the VNC server's display (:1
> or whatever.)  The 2D X server is specified in the DISPLAY environment
> variable, whereas the 3D X server is specified in the VGL_DISPLAY
> environment variable (or using the -d argument to vglrun.)
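>
> Putting those together, inside a TurboVNC session on (say) :1, this:
>
> $ vglrun glxgears
>
> is equivalent to this (display numbers assumed for the example):
>
> $ DISPLAY=:1 VGL_DISPLAY=:0 vglrun glxgears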
>
> As Nathan points out, the 3D X server should always be running, because
> that's the only way that VirtualGL can access the server's GPU.  Running
> vglserver_config sets up your display manager (gdm, kdm, xdm, etc.) so
> that VirtualGL can access the 3D X server when it is sitting at the
> login prompt (again, refer to the User's Guide.)
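>
> If you haven't run it yet, the invocation is something like the
> following (the path is the default install location; adjust if yours
> differs), followed by a restart of the display manager:
>
> $ sudo /opt/VirtualGL/bin/vglserver_config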