On Monday, 01 August 2005 at 21:53 +0200, Arnt Karlsen wrote:

> > With Nvidia and X installed, I continue to search for a good answer:
> > after much experimentation,
> > I did not notice any change between 24 bpp and 32 bpp.
> 
> ...glxgears, FlightGear etc f/s?

Ouaf... glxgears isn't a representative benchmark; it doesn't give
a good performance analysis.
Playing with it, I could "demonstrate" that my old ATI 9200 and my other
old Nvidia 5200 are better than the NVIDIA 6600GT.

Assuming we use the Nvidia 7xxx driver (not the 6xxx):

FG shows the 6600GT is about 2.5x faster (32 bpp and 24 bpp seem to give the same performance).

Celestia shows from 3x to 4x faster, depending on the render choice
(probably at 32 bpp; my X server is permanently at 32 bpp).

> 
> > I am not an expert in graphics development; maybe the differences
> > depend on the GPU itself and its capability to handle both
> > depths.
> > The main question could be about the CPU:
> > how much CPU time is used, and are there any losses with one
> > depth or the other?
> > 
> > Can somebody give an answer?
> 
> ...pass; what I learned from my own research on GPUs before buying an
> ATI 9250 clone is that ATI cards are "native 24bpp" and "24bpp only", whereas
> Nvidia is "1x32bpp or 2x16bpp", suggesting "ATI would suck at 16bpp, doing
> less than 3x8bpp" and "at 32bpp would not be able to see or make any use of
> the top 8 bits."
> My understanding of Nvidia is "their cards should work better at 32bpp
> and 16bpp than at 24bpp, because 24bpp wastes half a 16bpp engine."
> 
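One part of the bpp question that can be put on paper is raw framebuffer
scan-out bandwidth at each depth. A minimal sketch (the resolution and
refresh rate below are my own illustrative assumptions, not numbers from
this thread):

```python
# Back-of-the-envelope framebuffer scan-out bandwidth at different pixel
# depths. WIDTH, HEIGHT and HZ are illustrative assumptions, not measured
# values from this thread.
WIDTH, HEIGHT, HZ = 1280, 1024, 60

def scanout_mb_per_s(bits_per_pixel):
    """MB/s needed just to scan the framebuffer out to the display."""
    bytes_per_frame = WIDTH * HEIGHT * bits_per_pixel // 8
    return bytes_per_frame * HZ / 1e6

for bpp in (16, 24, 32):
    print(f"{bpp:2d} bpp: {scanout_mb_per_s(bpp):7.1f} MB/s")
```

Note that bandwidth alone favours 24 bpp over 32 bpp; the "wastes half a
16bpp engine" argument is about alignment instead: a 24-bit pixel does not
fit a 16- or 32-bit datapath evenly, so hardware built around 16/32-bit
words handles it less efficiently even though it moves fewer bytes.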
> 
OK, I will try to analyse it.

-- 
Gerard


_______________________________________________
Flightgear-devel mailing list
Flightgear-devel@flightgear.org
http://mail.flightgear.org/mailman/listinfo/flightgear-devel
