As you know, the Nvidia 6800 was released for benchmarking and it blew
everything else away, coming in far faster than ATI's former
performance leader, the 9800 XT.


Now we're getting in benchmarks from cards based on ATI's next-gen
chip, the X800 series. And man, these babies are fast!
"The Radeon X800 series cards perform best in some of our most intensive
benchmarks based on newer games or requiring lots of pixel shading
power, including Far Cry, Painkiller, UT2004, and 3DMark03's Mother
Nature scene, especially at high resolutions with edge and texture
antialiasing enabled. The X800s also have superior edge antialiasing.
Their 6X multisampling mode reduces edge jaggies better than NVIDIA's
8xS mode, and the presence of temporal antialiasing only underscores
ATI's leadership here. With a single-slot cooler, one power connector,
and fairly reasonable power requirements, the Radeon X800 XT Platinum
Edition offers all its capability with less inconvenience than NVIDIA's
GeForce 6800 Ultra. What's more, ATI's X800 series will be in stores
first, with a more mature driver than NVIDIA currently has for the
GeForce 6800 line.


The folks at ATI have improved mightily on the R300 design with the
R420, successfully delivering the massive performance leap necessary to
keep pace with NVIDIA's new GPUs. The achievement of ATI's demo team
with the Ruby demo is a heckuva reminder that ATI knows what it's doing
with DirectX 9-class graphics, and a very strong argument that the
X800's new, longer shader instruction limits don't preclude much higher
quality graphics in real time than anything we've seen from game
developers yet.


However, NVIDIA's GeForce 6800 cards are no pushovers this time around.
The GeForce 6 cards are faster in OpenGL, in many older games, and in
Prince of Persia: The Sands of Time. ShaderMark 2.0 is very close, too,
proving that NVIDIA's new pixel shaders are very capable, even with a
distinct clock speed deficit. The GeForce 6800 GPUs have some natural
advantages, including support for Shader Model 3.0 with longer shader
programs, dynamic flow control, and FP16 framebuffer blending and
texture filtering. Down the road, these capabilities could prove useful
for creating advanced visual effects with the highest possible fidelity.

Right now, though, NVIDIA needs to concentrate on getting some basics
right. The NV40 is a novel chip architecture, and its drivers are very
much in the beta stages. We'd like to see better results in newer titles
like Far Cry, antialiasing blends that account for display gamma, and a
consistent means of banishing "brilinear" filtering optimizations.
Ideally, NVIDIA would make "brilinear" an option but not the default;
the GeForce 6800 series is too good and too fast to need this crutch.
It's possible NVIDIA will have worked out all of these problems by the
time GeForce 6800 cards arrive in stores.


At present, ATI appears to be slightly ahead of NVIDIA, but its
superiority isn't etched indelibly in silicon the way it was in the last
generation of GPUs. The GeForce 6800 is an extremely capable graphics
chip, and we don't know yet how good it may become. Whatever happens,
you can see why I said this generation of GPUs presents us with a choice
between better and best. These cards are all killer performers, and
having seen Far Cry running on them fluidly, I can actually see the
logic in parting with four or five hundred bucks in order to own one. "


http://www.anandtech.com/video/showdoc.html?i=2044
http://www.firingsquad.com/hardware/geforce_6800_ultra_extreme/page21.asp
http://www.tomshardware.com/graphic/20040504/index.html


These links should get anyone interested started on checking out the
performance.
I agree with the above reviewer that the ATI seems the best card for the
money. I can't understand this silliness from Nvidia: oversized coolers,
cards that need TWO power connectors from your power supply just to
work, and apparently quite a lot of fan noise on top of that. ATI is
getting comparable performance while taking up only one slot, with a
regular-sized heatsink and the need for only one molex connector.


Plus, the ATI cards are out now! :)


-Gel

