On Tue, 29 Jul 2003 16:01:22 -0700 Ian Romanick <[EMAIL PROTECTED]> wrote:
Felix Kühling wrote:
On Tue, 29 Jul 2003 13:58:58 -0700 Ian Romanick <[EMAIL PROTECTED]> wrote:
Felix Kühling wrote:
Do the __driRegisterExtensions functions in the drivers rely on being called during initialisation?
In fact, I believe it could be dangerous if __driRegisterExtensions were called later, as it could override extensions that were disabled in, e.g., CreateContext due to missing hardware support. Fortunately, __glXRegisterExtensions returns immediately on the second and later calls. Maybe it's just a matter of updating a few comments after all.
I'm inclined to believe that the comments in dri_glx.c are just wrong. __glXRegisterExtensions has to be called before a call to glXGetProcAddress. The app can query that string via glXQueryExtensionsString long before calling glXGetProcAddress. In fact, it may never call glXGetProcAddress. I'm sure glxinfo doesn't. :)
So this does influence which extensions are listed in the extension string, contradicting what Keith wrote? In that case I have one more question: how can this work with multi-head configurations, where you can have multiple different cards (different screens) on one display? Then each driver will add or re-add extensions. But they should never disable any extensions, right? You don't want drivers to disable each other's extensions, do you?
It influences the GLX extension string. To test this, I fired up gears under gdb. Basically, run it once from gdb, then set a break-point at __driRegisterExtensions, then run it again. That's the easiest way. Here's the back-trace from that second run:
#0 __driRegisterExtensions () at r200_screen.c:435
#1 0x400b959e in __glXRegisterExtensions () at dri_glx.c:464
#2 0x400b948f in driCreateDisplay (dpy=0x804b7a0, pdisp=0x804cd14) at dri_glx.c:394
#3 0x400a4f9f in __glXInitialize (dpy=0x804b7a0) at glxext.c:885
#4 0x400a1ae3 in glXChooseVisual (dpy=0x804b7a0, screen=0, attribList=0xbffff360)
at glxcmds.c:1265
#5 0x4003ed20 in getVisualInfoRGB () from /usr/lib/libglut.so.3
#6 0x4003edca in __glutDetermineVisual () from /usr/lib/libglut.so.3
#7 0x4003ef98 in __glutDetermineWindowVisual () from /usr/lib/libglut.so.3
#8 0x4003f037 in __glutCreateWindow () from /usr/lib/libglut.so.3
#9 0x4003f391 in glutCreateWindow () from /usr/lib/libglut.so.3
#10 0x0804a2d8 in main (argc=2, argv=0xbffff5b4) at gears.c:348
#11 0x42017589 in __libc_start_main () from /lib/i686/libc.so.6
You can see that __driRegisterExtensions gets called even if glXQueryExtensionsString and glXGetProcAddress are not called.
The problem is that the GLX extensions should be tracked per-screen, but
I see:

    C SPECIFICATION
        const char *glXQueryExtensionsString(Display *dpy, int screen);
I don't mean what the GLX specification says to do. I mean what our code actually implements. Internally there is a *single* *global* table & extension string. So it's not even tracking it per-display. It's worse than that. :(
they are instead tracked per-display. This doesn't "matter" right now because we don't support the configuration that you describe (at least not as far as I know!). Each card would be its own display.
Maybe these configs don't work for one reason or another, but the configuration framework was designed with this in mind and also the code in dri_glx.c handles the case of different drivers for different screens. I see two choices here, either glxextensions.c manages multiple screens itself or the four bitfields server/client_support/only are managed in __GLXscreenConfigsRec. In either case glXGetUsableExtensions would have to be told about the screen. A screen number in the first case or a __GLXscreenConfigsRec pointer in the second case.
Since glXGetUsableExtensions is only called from glXQueryExtensionsString (glxcmds.c, line 1416), that should be an easy change to make.
The bit-fields and next_bit would have to be copied to the __GLXscreenConfigsRec. We'd still want the global copies to track the initial state. We'd also need to add an ext_list to the __GLXscreenConfigsRec to track extensions added by calling __glXAddExtension.
Consequently __glXDisableExtension should never be called (or better, not even exist ;-), and the only way to disable an extension is to not enable it. Thus, if you don't want to enable the swap-interval extensions when the hardware can't support them (no IRQs), you have to know whether IRQs work at the time __driRegisterExtensions is called. Is that possible?
Now there's an interesting point. The bigger problem is that the driver might not have a chance to call __glXDisableExtension until *after* the app has called glXQueryExtensionsString. At that point the extension string cannot be changed. I'm not sure what the right answer is here.
Ok, so we have to know which extensions to enable before a driver is initialized in its createScreenFunc. Sounds tough :-/
Agreed. I think for the most part we can enable the extensions, but just have them fail when used. That's a little better than nothing.
_______________________________________________
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel