On 6/19/12 12:42 AM, Jethro Beekman wrote:
> > You can't configure each graphics adapter as a separate screen on the
> > same X server?
>
> This seems to be possible, but for some reason I can't get my
> gnome-session running in that setup. I should try if this works with the
> nouveau drivers.

So the X server starts and displays properly across both screens, but no
Gnome?  Or does the X server not start at all?  If it's the latter, maybe
contact the bumblebee people, who might have some experience with
configuring Xorg on hybrid laptops.

Xinerama should work with completely different graphics adapters and
drivers.  However, I do remember there being some caveats about using it
with the nVidia proprietary drivers, like maybe some feature that had to
be disabled.  Google it.

The problem with nouveau is that you're not going to get full 3D
acceleration.  Like it or not, the nVidia proprietary drivers are still by
far the best thing to use with nVidia hardware.  In a perfect world, that
wouldn't be the case, but this isn't a perfect world.
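For reference, a two-device layout along these lines is roughly what I'd
expect the xorg.conf to look like.  This is just a sketch -- the BusIDs
are placeholders (check lspci for the real ones), and I haven't tested it
on hybrid hardware:

  Section "Device"
      Identifier "nvidia0"
      Driver     "nvidia"
      BusID      "PCI:1:0:0"   # placeholder; check lspci
  EndSection

  Section "Device"
      Identifier "intel0"
      Driver     "intel"
      BusID      "PCI:0:2:0"   # placeholder; check lspci
  EndSection

  Section "Screen"
      Identifier "Screen0"
      Device     "nvidia0"
  EndSection

  Section "Screen"
      Identifier "Screen1"
      Device     "intel0"
  EndSection

  Section "ServerLayout"
      Identifier "Layout0"
      Screen     0 "Screen0" 0 0
      Screen     1 "Screen1" RightOf "Screen0"
      Option     "Xinerama" "on"
  EndSection

Even if Xinerama itself keeps misbehaving, the same layout with the
Xinerama option turned off gives you two separate screens, and VGL can
bridge them:

  DISPLAY=:0.1 vglrun -d :0.0 glxgears

i.e. render on the nVidia screen, display on the Intel one.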
> > It doesn't matter whether both are hardware-accelerated.  Only the
> > nVidia one needs to be, then if you want to display OpenGL apps to the
> > Intel screen, you could just use VGL.
>
> I don't think that is possible because compiz requires OpenGL everywhere.

Why is compiz a requirement?  Also, I think compiz can be used without
OpenGL.  At least, the version on my machine can be configured not to use
it.

> > OK, but you have to remember that it's not just a matter of displaying
> > pixels.  You also have to handle all of the input events and such and
> > make sure those are properly translated between the two X servers.
>
> The input events are handled on the nvidia X server, so I don't need
> anything special for that.  It really just is handling pixels.

I don't think you're correct in asserting that, but I don't have time to
argue the point.

> > That's not something VirtualGL can inherently do, but VNC can.  If you
> > are truly stuck using two X servers and can't figure out a way to
> > Xinerama the two devices together as separate screens, then your best
> > approach is probably to just run a TurboVNC session big enough to span
> > both screens and then hack vncviewer so that it will only handle a
> > given region within the virtual desktop.  So, for instance, you might
> > have one full-screen vncviewer session on :0 handling 0,0 to 1919,1079
> > and one on :1 handling 1920,0 to 3839,1079.  Of course you would
> > always have to use VGL in this environment.  Whatever you end up doing
> > on your own is going to reinvent a lot of VirtualGL and TurboVNC's
> > wheels.
>
> Can you run a VNC X server and a regular X server on the same graphics
> card?  Does the VNC X server support the Xinerama client extension so
> that maximizing windows etc. works?

You don't run a VNC X server "on a graphics card."  It creates a virtual
framebuffer in main memory, which is why you have to use VirtualGL within
that environment to get 3D acceleration.

In the solution I'm proposing, you wouldn't need Xinerama.  You'd create a
very large VNC desktop, one that spans both monitors, and you'd use two
vncviewer instances to connect to it, one for each "real" X server.  It's
a heck of a lot easier to do that, using off-the-shelf software, than
trying to hack together some single-purpose screen capture system like
you're proposing.  However, I think your better bet is to figure out why
the Xinerama stuff isn't working.
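For completeness, the off-the-shelf part of the VNC approach would look
something like this (paths, display numbers, and geometry are
illustrative, and restricting each viewer to its half of the virtual
desktop is exactly the part you'd have to hack into vncviewer yourself):

  # Start a TurboVNC session big enough to span both 1920x1080 monitors
  /opt/TurboVNC/bin/vncserver -geometry 3840x1080

  # Run OpenGL apps inside that session through VirtualGL
  # (assuming the session came up on display :2)
  DISPLAY=:2 vglrun glxgears

  # One full-screen viewer per physical X server; a stock vncviewer
  # shows the whole desktop, so the per-viewer region clipping is the
  # hack described above
  DISPLAY=:0 vncviewer -fullscreen localhost:2
  DISPLAY=:1 vncviewer -fullscreen localhost:2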
> Luckily my real job doesn't have anything to do with graphics hardware
> on Linux, so I don't have to worry about that.  I also think you're
> wrong about the interest people would have if this would actually work
> smoothly.  In my opinion, every minute spent towards a more
> user-friendly Linux Desktop environment is a minute well spent.  Also,
> as Linus Torvalds pointed out last week, nVidia has been terribly
> unsupportive in this matter, so if the community doesn't do something,
> this may never work.

I don't follow.  You're talking about a very specific solution for a very
specific piece of hardware.  I don't envision this being very useful even
to the VirtualGL community at large, much less the Linux community at
large.

Anyway, I've suggested several viable approaches.  If you ultimately
choose a solution that does not involve VirtualGL/TurboVNC, then it's out
of scope for this list.