A few of our fellow web journalists tested NVIDIA's claims with their GeForce 2 Ultras, while a few others flatly stated that they had a GeForce 3 in hand but that benchmark numbers were not possible due to driver issues.
Driver issues and NVIDIA PR be damned, apparently: Digit-Life had not one but four GeForce 3s in their hands and the new 10.50 Detonator drivers primed to do battle. Their massive review, which weighs in at 1.88MB, is filled with a barrage of tests that stress the functionality and performance of many of the GeForce 3's newly touted features.
The most impressive of these are the patented Quincunx AA method and a 32-tap anisotropic filtering method; the most disappointing is the apparent continued use of DXT1 in 16-bit mode.
Many of the reviews claimed the GeForce 3 wouldn't seem all that different from the GeForce 2 Ultra in performance. After reading this article, I am not so sure that is a fair statement. NVIDIA's first attempt at reducing overdraw seems to have been effective: even at a lower clock rate, the GeForce 3 shows its muscle over the GeForce 2 Ultra. Is the performance difference so great that you should dump your Ultra for it? Only if you can recoup your money on the Ultra at auction or by selling it to a friend. If you are in the market for a new card, though, the performance difference, and most importantly the image quality difference, is simply impossible to ignore.
I just get this huge warm fuzzy feeling all over when I think about running Quake 3 at 800x600x32 with 32-tap anisotropic filtering and Quincunx AA.
I am not sure how many of you have tried out 4X FSAA and 8-tap anisotropic filtering on your GeForce/GTS/MX series cards, but the difference in quality is simply breathtaking. It can even make an old, dated engine like Half-Life take on a totally new and impressive visual quality.
I see the GeForce 3 as a real winner and a genuine step forward, if only for its promise of better image quality now and even greater quality in the future.