I know how to set up two graphics cards with a single X server that has two
screens (one screen per card) and address them via VGL_DISPLAY. But
I'm wondering why this is the only way to use two cards on a single host. For
example, VGL doesn't work when there are two X servers with one screen each,
and
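
(For illustration, addressing the two cards in that known-working setup
comes down to pointing VGL_DISPLAY, or vglrun's -d option, at the right
screen. A minimal sketch, assuming the single X server is :0 with screen 0
on the first card and screen 1 on the second, and glxgears as a stand-in
test application; these names are assumptions, not taken from this thread:

    # assumed layout: screen :0.0 = first card, screen :0.1 = second card
    vglrun -d :0.1 glxgears           # render on the second card
    VGL_DISPLAY=:0.0 vglrun glxgears  # equivalent env-var form, first card

With two separate X servers :0 and :1 instead, the analogous setting would
be VGL_DISPLAY=:0 or VGL_DISPLAY=:1.)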
This has nothing to do with VirtualGL. If there are two X servers
running and you are able to successfully run GLX apps on both
simultaneously, then VGL will just work as long as you set VGL_DISPLAY
properly. However, getting two X servers running and being able to
successfully run GLX apps