Roland Scheidegger wrote:
Bernhard Wymann wrote:

If you'd use glutInitDisplayString with depth>=16 you _should_ get a 24-bit depth buffer if one is available, at least in theory; if not, that would be a bug in glut or the glx visual matching code.
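For reference, such a request, plus a check of what the driver actually handed back, would look roughly like this (only a sketch, not code from the application; window size and title are arbitrary):

/* Sketch: request at least a 16-bit depth buffer via glutInitDisplayString
 * and report what was actually obtained. */
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    GLint depth_bits = 0;

    glutInit(&argc, argv);
    glutInitDisplayString("rgb double depth>=16");
    glutInitWindowSize(320, 240);          /* arbitrary size */
    glutCreateWindow("depth buffer check");

    glGetIntegerv(GL_DEPTH_BITS, &depth_bits);
    printf("got a %d bit depth buffer\n", (int) depth_bits);
    return 0;
}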

That might be true, but the code also runs on Windows and with proprietary Linux drivers, and at least my TNT and GeForce 2 default to a 16-bit depth buffer on Linux (with 16-bit color depth); the ATI Radeon Mobility and GeForce Go also end up with 16-bit depth buffers even at 32-bit color depth (on Windows).

Ah yes, I forgot everybody optimizes for frame rates nowadays instead of quality. IIRC there are driver settings available to change the "default" z-buffer depth, but those might not even be accessible from the driver control panel. Well, the driver behaviour is probably not outright illegal, so I guess you really need to specifically ask for a 24-bit z-buffer.
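With glutInitDisplayString the explicit request is just a different string; a sketch of that, with a fallback in case no matching visual exists (GLUT would otherwise abort with a fatal error in glutCreateWindow):

/* Sketch: ask for a 24-bit z-buffer explicitly, and fall back to >=16 if
 * GLUT reports that no matching visual/display mode is available. */
glutInitDisplayString("rgb double depth>=24");
if (!glutGet(GLUT_DISPLAY_MODE_POSSIBLE))
    glutInitDisplayString("rgb double depth>=16");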

I looked into this a bit. GLUT has a "verbose" mode where it will show the tests as it selects a visual. Set a breakpoint at findMatch, set verbose = 1, and then let it rip. You may also need to rebuild GLUT with the preprocessor macro TEST defined. I'd suggest trying that before changing drivers. It's either a problem in the driver or a problem in GLUT; I'm leaning towards GLUT, because GLUT just gets a list of *all* the visuals and does all the filtering itself. glXChooseVisual could have any number of bugs and it wouldn't affect GLUT.
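For comparison, something roughly like the sketch below (mine, with only minimal error handling) shows which visual glXChooseVisual itself picks for a >=24-bit depth buffer; that can be held against whatever findMatch settles on:

/* Sketch: ask GLX directly for a visual with a >=24-bit depth buffer,
 * bypassing GLUT's own filtering, and print what it chose. */
#include <GL/glx.h>
#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <stdio.h>

int main(void)
{
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 24, None };
    Display *dpy = XOpenDisplay(NULL);
    XVisualInfo *vi;
    int depth = 0;

    if (!dpy)
        return 1;

    vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (vi) {
        glXGetConfig(dpy, vi, GLX_DEPTH_SIZE, &depth);
        printf("glXChooseVisual picked visual 0x%lx with a %d bit depth buffer\n",
               (unsigned long) vi->visualid, depth);
        XFree(vi);
    } else {
        printf("no visual with a >=24 bit depth buffer found\n");
    }
    XCloseDisplay(dpy);
    return 0;
}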



