On Wed, 27 Mar 2002, Daryll Strauss wrote:

> On Wed, Mar 27, 2002 at 04:00:55PM -0500, Leif Delgass wrote:
> > On Wed, 27 Mar 2002, Alexander Stohr wrote:
> > > > So we'd use mach64Screen->cpp for the calculation instead of a
> > > > fixed 4 bytes/texel?  Then the comparison would be:
> > > >
> > > > if mach64Screen->texSize[0] >=
> > > >     2 * mach64Screen->cpp * 1024 * 1024, then MaxTextureLevels = 11
> > > > else if mach64Screen->texSize[0] >=
> > > >     2 * mach64Screen->cpp * 512 * 512, then MaxTextureLevels = 10
> > > > else MaxTextureLevels = 9 (256x256)
> > > >
> > > > This should apply to Rage128 and Radeon as well.  Am I missing
> > > > something here?
> > >
> > > Yes, if you have a Radeon with e.g. 128 MB then you might want to
> > > use even bigger textures or higher max levels, as long as the
> > > renderer can handle them.
> > >
> > > Some sort of iteration or loop design might turn out to be best.
> > > At least you can specify the lower and upper limits much more
> > > easily then.
> > >
> > > Regards, AlexS.
> >
> > Yes, for Radeon you can go with a larger texture (2048x2048 looks
> > like the max from the code, I don't have Radeon specs); I was
> > thinking in terms of mach64, which has a max. size of 1024x1024.
> > But do you see any problem with the basic idea in terms of using
> > the screen bit depth?  Also, the first '2' should probably be
> > MaxTextureUnits for cards that have more than two texture units
> > implemented, right?
>
> This is all really just a heuristic.  It works around a bigger
> problem: there's no good way to tell how many textures can fit on
> the board.  So, these rules favor something like Quake, where you
> want two textures (in order to multitexture) on the board at the
> same time, and therefore lie to the application when defining the
> maximum texture size.
>
> Unfortunately, this then breaks apps that require the use of bigger
> textures, like you saw in the planet screensaver.  You can argue
> that the planet screensaver should have made smaller textures, since
> the app can query the maximum texture size, but that only works if
> shrinking the texture map is acceptable for the app.
>
> If you use the correct hardware maximum texture size, then the
> problem for the app is to determine whether it can fit big textures
> in memory.  There's no good way to query, but an app can test it by
> trying at the maximum and then stepping down the texture size until
> it reaches one that's fast enough.  That's the correct general
> approach in any case.
>
> So, making it 3 for a Radeon because it has 3 texture units isn't
> any more correct.  In fact, I'd argue that breaking your driver so
> that it is not capable of doing what the hardware can do is really
> the wrong solution overall.  One of my apps needs 2k textures, and
> this heuristic makes the board non-functional for my app.  Luckily
> it's all open source, so I can change it!
>
> I have no objection to making changes that make specific apps (like
> Quake) run faster as long as they don't impact other programs.  This
> is a case where setting an environment variable like
> LIBGL_MAX_TEXTURE_SIZE might make sense, or just having a
> LIBGL_I_AM_RUNNING_QUAKE mode.  Then it could throw correctness out
> the window and go as fast as possible.  As long as it's only on when
> I ask for it, that's no problem.  If I haven't asked the driver to
> be broken, I want correct behavior that allows me to use all of the
> hardware.
>
> - |Daryll
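For reference, the heuristic being debated above can also be written as
the loop AlexS suggests rather than a fixed if/else ladder.  This is
only a sketch: the helper name and its exact arguments are assumptions
for illustration, while the texSize[0]/cpp fields and the "two resident
textures" factor come from the quoted mails, not from the actual
mach64 or Radeon code.

  /* Hypothetical helper, not real driver code: pick the largest
   * MaxTextureLevels such that 'texUnits' full-size textures still
   * fit in the texture heap at once. */
  static int
  driCalcMaxTextureLevels( int texHeapSize,   /* bytes in the texture heap      */
                           int cpp,           /* bytes per texel == screen cpp  */
                           int texUnits,      /* textures kept resident at once */
                           int hwMaxLevels )  /* 11 -> 1024x1024 on mach64      */
  {
     int levels;

     /* Walk down from the hardware limit until the working set fits. */
     for ( levels = hwMaxLevels; levels > 1; levels-- ) {
        int size = 1 << (levels - 1);         /* MaxTextureLevels 11 -> 1024 */

        if ( texUnits * cpp * size * size <= texHeapSize )
           break;
     }

     return levels;
  }

With cpp == 4 and two texture units this reproduces the 11/10/9 split
quoted above (heap >= 8 MB, >= 2 MB, anything smaller), and a Radeon
would simply pass a larger hwMaxLevels.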
I see what you mean.  I noticed that one of the new screensavers
(flipscreen3d) wanted to create a 1024x512 mipmap from a screen grab
and failed, even though it should work with a single texture.  If I set
MaxTextureLevels to 11, it works.  Perhaps it's better, as you say, to
just use the maximum number of levels supported by the card and provide
an env var to step it down for apps that try to use the max. size for
multitexturing.

I'm not really a big fan of having a proliferation of env vars, but I
guess it's OK for now.  It might be nice to have some sort of
configuration file or interface, like the 3D workstation drivers have,
where you could create application profiles with different settings.  I
suppose you could accomplish this by creating shell script wrappers for
your GL apps that export the appropriate env vars.

--
Leif Delgass
http://www.retinalburn.net
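For what it's worth, a minimal sketch of the env-var override discussed
above, with the "step it down only" semantics from the reply.  The
LIBGL_MAX_TEXTURE_SIZE name comes from Daryll's mail; the helper name
and the level arithmetic are assumptions for illustration, not existing
DRI code.

  #include <stdlib.h>

  /* Hypothetical helper: clamp the level count the heuristic/hardware
   * chose to whatever edge length the user requested, never raising it. */
  static int
  driClampTextureLevels( int maxLevels )
  {
     const char *env = getenv( "LIBGL_MAX_TEXTURE_SIZE" );

     if ( env != NULL && atoi( env ) > 0 ) {
        int wanted = atoi( env );      /* requested edge length, e.g. "512" */
        int levels = 1;

        /* Convert the edge length to a level count: MaxTextureLevels of n
         * allows textures up to 2^(n-1) on a side. */
        while ( (1 << (levels - 1)) < wanted && levels < maxLevels )
           levels++;

        /* Only ever step the limit down, per the suggestion above. */
        if ( levels < maxLevels )
           maxLevels = levels;
     }

     return maxLevels;
  }

The driver would run this over the MaxTextureLevels value it otherwise
derives from the hardware limit, and an application profile then
reduces to a one-line wrapper script that exports
LIBGL_MAX_TEXTURE_SIZE before launching the app.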