I know how to set up two graphics cards under a single X server with two
screens (one screen per card) and address them via VGL_DISPLAY. But I'm
wondering why this is the only way to use two cards on a single host. For
example, VGL doesn't work when there are two X servers with one screen
each, each using a different card.
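
For context, the working single-server configuration looks roughly like
this (the BusID values and driver name are placeholders for illustration):

    Section "Device"
        Identifier "Card0"
        Driver     "nvidia"
        BusID      "PCI:3:0:0"
    EndSection

    Section "Device"
        Identifier "Card1"
        Driver     "nvidia"
        BusID      "PCI:4:0:0"
    EndSection

with a Screen section bound to each Device and both screens listed in the
ServerLayout, so the cards appear as :0.0 and :0.1. A user then selects a
card at run time with, e.g.:

    vglrun -d :0.1 glxgears

or equivalently by exporting VGL_DISPLAY=:0.1 before running vglrun.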
Basically, I'm trying to set up some sort of access control on each card so
that only a single user can use a given card. Using the DRI options
"Group" and "Mode" works fine, but it affects both cards since there is a
single X server. Do you know of a way to accomplish this type of ACL with
VGL, or whether it is even possible?
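
For reference, the access control I'm applying now looks something like
this in xorg.conf (the group name is just an example); it applies to the
whole server rather than to a single card:

    Section "DRI"
        Group "vglusers"
        Mode  0660
    EndSection

What I'd like is the equivalent of this section scoped to one
Device/Screen.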
--
Kevin Van Workum, PhD
Sabalcore Computing Inc.
"Where Data Becomes Discovery"
http://www.sabalcore.com
877-492-8027 ext. 11