On Saturday 07 May 2005 12:41, Stefan Dösinger wrote:
> > > I switched to the Xorg radeon driver which has 16 bpp support (the
> > > 2nd column shows 16 now), and made sure that HL runs with 16 bpp,
> > > but the error still occurs.
> >
> > Yes, it doesn't work, because you are talking about the frame buffer
> > (named "Color buffer" in the traces) when you say 16 bpp. I was
> > talking about the depth buffer.
>
> Good, thanks for explaining this to me. I mixed up the two buffers.
> Well, HL doesn't offer any depth buffer setting. There's only one
> console command, "gl_zmax", which is supposed to set the maximum depth
> buffer size. The default is 4096, and changing this value has no effect
> on the error. (HL still tries to get a 32 bit depth buffer) :(
> I sort of fixed the problem for me by forcing the depth buffer to
> 24 bit in dlls/x11drv/opengl.c, but I understand that this is not a
> real solution. Is there any chance for a better fix? I have no way to
> fix this in the game or in the video driver.

I will see how we can have a better fix, but for now can you try the
attached patch?

> Stefan

Regards,
Raphael
Index: opengl.c
===================================================================
RCS file: /home/wine/wine/dlls/x11drv/opengl.c,v
retrieving revision 1.5
diff -u -r1.5 opengl.c
--- opengl.c 28 Apr 2005 18:29:12 -0000 1.5
+++ opengl.c 9 May 2005 00:33:11 -0000
@@ -203,7 +203,15 @@
if (ppfd->iPixelType == PFD_TYPE_RGBA) {
ADD2(GLX_RENDER_TYPE, GLX_RGBA_BIT);
ADD2(GLX_BUFFER_SIZE, ppfd->cColorBits);
- TEST_AND_ADD2(ppfd->cDepthBits, GLX_DEPTH_SIZE, ppfd->cDepthBits);
+ if (32 == ppfd->cDepthBits) {
+ /**
+ * For 32 bpp depth buffer requests, fall back to 24 bpp,
+ * as some drivers don't support 32 bpp depth buffers.
+ */
+ TEST_AND_ADD2(ppfd->cDepthBits, GLX_DEPTH_SIZE, 24);
+ } else {
+ TEST_AND_ADD2(ppfd->cDepthBits, GLX_DEPTH_SIZE, ppfd->cDepthBits);
+ }
TEST_AND_ADD2(ppfd->cAlphaBits, GLX_ALPHA_SIZE, ppfd->cAlphaBits);
}
TEST_AND_ADD2(ppfd->cStencilBits, GLX_STENCIL_SIZE, ppfd->cStencilBits);
