On Fri, Dec 13, 2002 at 12:17:10PM -0500, Murray J. Root wrote:
> On Fri, Dec 13, 2002 at 05:56:23PM +0100, Bruno Thomsen wrote:
> > > Somehow, I cannot use 1600x1200 with a color depth of 24 as before. It
> > > says that the 32 megs of memory aren't enough. I had to lower the depth
> > > to 16 to be able to use that resolution.
> > > 
> > > It was working fine with the previous version.
> > > 
> > > GeForce2 Integrated GPU
> > 
> > Hey
> > 
> > If you only have 32 megs of memory, you can't run in 1600x1200 24-bit
> > color, and here is why:
> > memreq = (1600 bit x 1200 bit x 24 bit) / (1024 x 1024) = 44 Megabyte.
> > If you use 16 bit color:
> > memreq = (1600 bit x 1200 bit x 16 bit) / (1024 x 1024) = 30 Megabyte.
> > 
> 
> You have confused bytes and bits. Divide by 8.
> 

To follow up on my own reply:
  according to the NVidia docs, 24bpp mode actually stores 32 bits per
pixel; the high-order 8 bits are simply unused.
So the correct calculation is

1600 x 1200 pixels * 32 bpp / 8 bits_per_byte / (1024 x 1024 bytes_per_MB) ~= 7.3 MB
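
For anyone who wants to check the arithmetic, here is a small sketch in
Python (the function name and the 24->32 bpp padding assumption are my
own illustration of the calculation above, not anything from the driver):

  # Rough framebuffer size estimate. Assumes 24bpp is stored padded to
  # 32 bits per pixel, as described above; other depths use their size.
  def framebuffer_mb(width, height, bpp):
      stored_bpp = 32 if bpp == 24 else bpp       # pad 24bpp to 32 bits
      total_bytes = width * height * stored_bpp // 8
      return total_bytes / (1024 * 1024)          # bytes -> MB

  for depth in (16, 24):
      print("1600x1200 @ %2dbpp: %.1f MB" % (depth, framebuffer_mb(1600, 1200, depth)))

That prints roughly 3.7 MB for 16bpp and 7.3 MB for 24bpp, so a plain
framebuffer at either depth fits easily in 32 MB of video memory.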

-- 
Murray J. Root
------------------------------------------------
DISCLAIMER: http://www.goldmark.org/jeff/stupid-disclaimers/
------------------------------------------------
Mandrake on irc.freenode.net:
  #mandrake & #mandrake-linux = help for newbies 
  #mdk-cooker = Mandrake Cooker 

