On Fri, 7 Mar 2003, Ian Romanick wrote:
I was looking at the source for the radeon driver and noticed that the depth buffer is always set to 32 bits if a 24 bpp color depth is selected. Is this a hardware limitation, or might it be possible to change it to 16 bits?
There are two ways to do this. One would be to add an option to the X server to select the depth-buffer size. Look in programs/Xserver/hw/xfree86/drivers/ati/radeon_driver.c for how options (like "AGPSize") are handled. Basically, you'd add an entry like
{ OPTION_DEPTH_SIZE, "DepthBits", OPTV_INTEGER, {0}, FALSE },
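To make that concrete, here is a rough sketch of the three pieces involved, assuming the standard XFree86 option machinery. The OPTION_DEPTH_SIZE token and the depthBits variable are placeholder names for this sketch, and the real RADEONOptions table has many more entries:

typedef enum {
    /* ... existing tokens like OPTION_AGP_SIZE ... */
    OPTION_DEPTH_SIZE
} RADEONOpts;

static const OptionInfoRec RADEONOptions[] = {
    /* ... existing entries ... */
    { OPTION_DEPTH_SIZE, "DepthBits", OPTV_INTEGER, {0}, FALSE },
    { -1, NULL, OPTV_NONE, {0}, FALSE }
};

/* In PreInit, after xf86ProcessOptions() has filled info->Options: */
int depthBits = 32;  /* the current hard-wired value at 24 bpp color */
if (xf86GetOptValInteger(info->Options, OPTION_DEPTH_SIZE, &depthBits)) {
    if (depthBits != 16 && depthBits != 32) {
        xf86DrvMsg(pScrn->scrnIndex, X_WARNING,
                   "Unsupported DepthBits %d, using 32\n", depthBits);
        depthBits = 32;
    }
}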
One of the following would be more consistent with option naming:
{ OPTION_DEPTH_SIZE, "DepthSize", OPTV_INTEGER, {0}, FALSE },
{ OPTION_DEPTH_BUFFER_SIZE, "DepthBufferSize", OPTV_INTEGER, {0}, FALSE },
Just a suggestion.
I had thought about that. The problem is that such a name seems to imply setting the size of the whole buffer, not the size of each element in the buffer. I got to thinking about it last night, and now I have a different idea. How about something like this?
{ OPTION_DEPTH_SIZE, "ZBufferFormat", OPTV_ANYSTR, {0}, FALSE },
On some cards, like Radeon and MGA, this could be the usual INT16, INT24, or INT32, OR it could be INT16W, FLOAT32, or other hardware-specific formats. On the flip side, given that the ability for apps to REALLY select the depth-buffer format they want, on a per-window basis, via fbconfig IS coming, I don't know if we want to open up that much flexibility. Dunno.
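As a rough illustration of how the string form might be parsed (the Z_* enum values and the zFormat variable are invented for this sketch; only the xf86GetOptValString, xf86NameCmp, and xf86DrvMsg calls are standard XFree86 helpers):

const char *fmt = xf86GetOptValString(info->Options, OPTION_DEPTH_SIZE);
if (fmt) {
    if      (!xf86NameCmp(fmt, "INT16"))  zFormat = Z_INT16;
    else if (!xf86NameCmp(fmt, "INT24"))  zFormat = Z_INT24;
    else if (!xf86NameCmp(fmt, "INT32"))  zFormat = Z_INT32;
    else if (!xf86NameCmp(fmt, "INT16W")) zFormat = Z_INT16W;  /* hw-specific */
    else
        xf86DrvMsg(pScrn->scrnIndex, X_WARNING,
                   "Unknown ZBufferFormat \"%s\", using default\n", fmt);
}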
