On Wed, Mar 27, 2002 at 03:17:56PM -0500, Leif Delgass wrote:
> In the code to set MaxTextureLevels in the Rage128 and Radeon drivers,
> 4 bytes/texel is assumed when calculating the max texture size. If we
> always convert to 2 bytes/texel for a 16bpp screen when choosing texture
> formats, shouldn't that be factored into the calculation? So we'd use
> mach64Screen->cpp for the calculation instead of a fixed 4 bytes/texel?
> Then the comparison would be:
>
> if mach64Screen->texSize[0] >= 2 * mach64Screen->cpp * 1024 * 1024,
>     then MaxTextureLevels = 11
> else if mach64Screen->texSize[0] >= 2 * mach64Screen->cpp * 512 * 512,
>     then MaxTextureLevels = 10
> else MaxTextureLevels = 9 (256x256)
>
> This should apply to Rage128 and Radeon as well. Am I missing something
> here?
It occurs to me that, for cards that support mipmapping (i.e., not mach64), even this test is wrong. If there is 1 texture unit, 16 bits/texel, and 2MB (i.e., 2*1024*1024 bytes) of texture memory, then a 1024x1024 texture and all of its mipmaps will most certainly not fit into texture memory. It would require (0x55555555 >> (32 - (2 * 11))) = 1398101 available texels.

-- 
Tell that to the Marines!

_______________________________________________
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel