I tested RHEL 5 with an ATI FirePro V5700 prior to VGL 2.1.3, so it
should work.  See inline comments.


Peter Åstrand wrote:
> Hi, we are installing a VirtualGL system on RHEL with an ATI Radeon 4890
> card. A few issues came up when running vglserver_config:
> 
>> ... Modifying /etc/security/console.perms to disable automatic
>> permissions
>>    for DRI devices ...
> The script tries to modify /etc/security/console.perms, but on RHEL5,
> the dri permissions are actually set in
> /etc/security/console.perms.d/50-default.perms.

I don't think this matters, because the ATI device permissions are
really taken from the [DRI] section of the xorg.conf file, and the
nvidia device permissions are really taken from the modprobe file.
AFAIK, fiddling with console.perms is only needed to support older
systems.  I do see, however, how the output of vglserver_config can be a
bit confusing.  It should really avoid printing anything unless the
modification it's trying to do is applicable for your system.
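For reference, on an nvidia system the file that vglserver_config creates is
a one-line modprobe fragment along these lines (the UID/GID/mode values
depend on how you answered the script's prompts -- e.g. whether you restrict
access to the vglusers group -- so treat the values below as placeholders):

```
# /etc/modprobe.d/virtualgl -- illustrative values only
options nvidia NVreg_DeviceFileUID=0 NVreg_DeviceFileGID=0 NVreg_DeviceFileMode=0660
```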


>> ... Creating /etc/modprobe.d/virtualgl to set requested permissions for
>>    /dev/nvidia* ...
> 
> As I understand it, there is no corresponding file for ATI cards? We
> have a /dev/dri/card0.

ATI is a "pure" DRI driver, so the device permissions are set in
xorg.conf.  VirtualGL 2.1.3 and later should properly handle this.
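A sketch of the relevant xorg.conf section (Mode 0666 opens the DRI device
to all users; vglserver_config may set a more restrictive mode if you chose
to limit 3D access to the vglusers group):

```
Section "DRI"
    Mode 0666
EndSection
```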


>> ... Disabling XTEST extension in /etc/gdm/custom.conf .
> 
> Doesn't work on this system, but I've now figured out how to do it. This
> is what needs to be added to /etc/gdm/custom.conf:
> 
> [server-Standard]
> command=/usr/bin/Xorg -br -audit 0 -tst

Yeah, I've been aware of this problem for a while, but unfortunately,
there isn't an easy way for vglserver_config to work around it.  The
problem with adding a specific X server command line is that it's
different for every system, and vglserver_config won't know the correct
default args to add for a particular system.  Red Hat really needs
to package a new custom.conf file with a default X server command line
that can be uncommented and modified.

Note that VirtualGL should be able to properly disable XTEST if you are
using KDM.  This issue is specific to GDM.
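With KDM, the X server command line lives in a predictable place (kdmrc),
which is why vglserver_config can reliably append -tst there.  A sketch,
assuming a typical kdmrc location (the path varies by distro):

```
# /etc/kde/kdm/kdmrc (path varies by distro)
[X-:*-Core]
ServerArgsLocal=-nolisten tcp -tst
```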


>> ... Commenting out DisallowTCP line (if it exists) in
>>    /etc/gdm/custom.conf ...
> 
> This DisallowTCP is the default nowadays, so it's not sufficient just to
> search for this and comment it out. Instead, we need to add this to
> /etc/gdm/custom.conf:
> 
> [security]
> DisallowTCP=false

As far as I know, DisallowTCP=true has always been the default.  Prior
to v2.1.1, vglserver_config would explicitly set DisallowTCP=false to
allow TCP connections to :0.  This was necessary because VGL used "xhost
+localhost" to open up the display permissions if you weren't using the
vglusers group.  I then discovered that I could use "xhost +LOCAL:"
instead, so VGL v2.1.1 and later no longer requires TCP connections to
display :0.  Thus, the newer versions of vglserver_config look for the
DisallowTCP line and comment it out, in case that line was added by an
older version of vglserver_config.

To reiterate, turning off TCP connections (the default) is what you want.
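The difference between the two approaches, as shell commands (these must be
run from a session that already has access to display :0):

```shell
# Old approach (VGL < 2.1.1): "localhost" connects over TCP, so this
# required DisallowTCP=false in the display manager config
xhost +localhost

# New approach (VGL >= 2.1.1): grants access to all local (Unix-socket)
# connections, so no TCP listener is needed on :0
xhost +LOCAL:
```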


> Btw, another thing that might be good to add is:
> 
> AllowRoot=false
> 
> Normally, I hate it, but for the VirtualGL case it's probably a good idea.

I agree that it's a good idea, but it's too system specific to add to
vglserver_config.  I could mention it in the documentation, however.

In general, vglserver_config isn't very good at adding things to config
files.  It's more designed to modify things that are already there.


> With these changes, the configuration looks good. However, VirtualGL
> doesn't actually seem to work: With software-rendered OpenGL in Xvnc,
> glxspheres gives about 5 frames/sec. When running glxspheres through
> vglrun, the performance increases only to 11 frames/sec. The :0 X server
> is idle, except for a few system calls when vglrun is launched. Running:
> 
> DISPLAY=:0 glxspheres
> 
> ...gives a nice performance of ~ 1300 frames/sec. Any ideas?

If VirtualGL were using indirect rendering, it would complain about it
loudly, so that's apparently not the problem.  I'm wondering if VGL is
engaged at all.  Do you, for instance, get profiling output when you do
'vglrun +pr glxspheres'?

If the :0 X server is idle, that means (for whatever reason) the
commands aren't ever making it there.
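A few commands that can help narrow this down (run them inside the Xvnc
session; glxspheres ships with VirtualGL, glxinfo with the standard X
utilities):

```shell
vglrun +v glxspheres    # verbose mode: shows whether the faker library loads
vglrun +pr glxspheres   # profiling output appears only if VGL is engaged
DISPLAY=:0 glxinfo | grep -i "opengl renderer"   # confirm :0 uses the ATI driver
```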


_______________________________________________
VirtualGL-Users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/virtualgl-users
