This has nothing to do with VirtualGL.  If there are two X servers
running and you can successfully run GLX apps on both simultaneously,
then VGL will "just work" as long as you set VGL_DISPLAY properly.
However, getting two X servers running, with working hardware-accelerated
GLX on both at the same time, is the trick, and AFAIK the current 3D
drivers from NVIDIA, etc. don't allow it.  There may be other OS-level
restrictions that disallow it as well.  Then again, even though I'm the
developer of VGL, I'm not the foremost expert on using it in a
multi-card environment, so others on the list may be able to provide a
better answer.
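
For reference, once the 3D X server side is working, selecting a
particular card from the VGL side is just a matter of setting
VGL_DISPLAY or passing the equivalent -d switch to vglrun.  A minimal
sketch, assuming the two GPUs are attached as screens :0.0 and :0.1 of
a single X server (glxgears is just a stand-in for any GLX app):

    # render on the first card (screen 0 of display :0)
    vglrun -d :0.0 glxgears

    # render on the second card (screen 1 of display :0)
    vglrun -d :0.1 glxgears

    # equivalent, using the environment variable instead of the switch
    VGL_DISPLAY=:0.1 vglrun glxgears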

DRC


On 4/26/13 2:28 PM, Kevin Van Workum wrote:
> I know how to set up two graphics cards with a single X server with two
> screens (one screen for each card) and address them via VGL_DISPLAY. But
> I'm wondering why this is the only way to use two cards on a single host.
> For example, VGL doesn't work when there are two X servers with one screen
> each, each using a different card.
>
> Basically I'm trying to set up some sort of access control for each card
> so that only a single user can use one of the two cards. Using the DRI
> options "group" and "mode" works fine but affects both cards, since there
> is a single X server. Do you know of a way to accomplish this type of
> ACL in VGL? Or even whether it is possible?
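
The DRI "group" and "mode" options mentioned above are set in the
server-wide DRI section of xorg.conf, which is why they apply to every
card that the single X server drives rather than to just one of them.
A rough sketch, assuming a hypothetical "vglusers" group:

    Section "DRI"
        Group  "vglusers"
        Mode   0660
    EndSection

With two separate X servers, each could carry its own DRI section,
which is presumably the motivation for the two-server layout, but that
runs into the driver limitation described in the reply above.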
