Dissecting GeForce FX

These vapors aren't so potent

WHO KNEW NVIDIA WOULD stick a Dustbuster to the side of its next-gen graphics card in order to cool the GPU to clock speeds higher than the Radeon 9700's? Honestly, I figured all the new technology, from a 0.13-micron fab process to DDR-II-type memory, would take care of things for NVIDIA on the performance front. But here we are after the product announcement, about four months from the product's projected availability date, and the GeForce FX reference design has an appendage à la Black & Decker.

Actually, I don't mind the OTES concept on a premium high-end card, even though it eats a PCI slot. But we found the Ti 4200 incarnation of this beast to be alarmingly loud. Let's hope NVIDIA's ultra-high-priced, high-end card doesn't sound like a Dustbuster.

However, the details of GeForce FX's chip architecture are surprisingly tame. We knew ATI had beaten NVIDIA to the punch, but most of us expected NVIDIA's counterpunch to be a little more potent. Now that the GeForce FX specs have hit the street, it's safe to say that ATI produced the exact same class of graphics technology over six months before NVIDIA. At the time I wrote my comparative preview of the Radeon 9700 and NV30-cum-GeForce FX, NVIDIA was being cagey about the NV30's exact specifications. They were claiming (under NDA, of course) that the NV30 would have 48GB/s memory bandwidth, but we now know the part has 16GB/s of memory bandwidth, plus a color compression engine that's most effective when used with antialiasing (where it might achieve a peak of 4:1 compression, but will probably deliver something less—hence the 48GB/s number). The Radeon 9700 Pro has 19.4GB/s of memory bandwidth, thanks to old-fashioned DDR memory and a double-wide, 256-bit memory bus.
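The arithmetic behind those bandwidth figures is straightforward: bytes per second is effective memory transfer rate times bus width. Here's a minimal sketch, assuming the widely reported memory clocks of the two cards (a 1,000MT/s effective DDR-II rate on the GFFX's 128-bit bus, and a 620MT/s effective DDR rate on the 9700 Pro's 256-bit bus; those clocks are my assumption, not figures from the text above):

```python
def bandwidth_gbs(effective_mts: float, bus_bits: int) -> float:
    """Raw memory bandwidth in GB/s from an effective transfer
    rate (in MT/s, DDR doubling already included) and bus width."""
    return effective_mts * 1e6 * (bus_bits / 8) / 1e9

# GeForce FX: 128-bit bus at an assumed 1,000MT/s effective DDR-II rate.
gffx_raw = bandwidth_gbs(1000, 128)   # 16.0 GB/s

# A roughly 3:1 average compression ratio yields NVIDIA's quoted figure;
# the 4:1 peak would imply 64GB/s, which nobody claims to sustain.
gffx_effective = gffx_raw * 3         # 48.0 GB/s

# Radeon 9700 Pro: plain DDR on a double-wide 256-bit bus, assumed
# 620MT/s effective -- in the neighborhood of the ~19.4GB/s quoted
# above, give or take rounding and the exact clock assumed.
r9700_raw = bandwidth_gbs(620, 256)
```

The point the numbers make is that the 9700 Pro's wider bus buys it more raw bandwidth than the GFFX's faster but narrower DDR-II path; NVIDIA's headline figure only catches up by assuming the compression engine is earning its keep.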

NVIDIA was also fuzzy, back then, about the exact number of texture units per pixel pipe in NV30. We now know the GeForce FX has eight pipes with one texture unit each, just like the Radeon 9700. So don't expect any massive performance advantages for the GeForce FX in current games. Only its higher clock speeds, afforded partly by the Black & Decker appendage, will give the GFFX a nominal fill rate higher than the Radeon 9700's.
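Fill rate works the same way: peak pixels per second is just core clock times pixel pipelines, and with one texture unit per pipe on both chips, texel rate matches pixel rate. A quick sketch, assuming the widely reported core clocks (500MHz for the GeForce FX, 325MHz for the Radeon 9700 Pro; these are my assumptions, not figures from the text above):

```python
def fill_rate_gpixels(core_mhz: float, pipes: int) -> float:
    """Peak pixel fill rate in Gpixels/s: core clock times pipelines."""
    return core_mhz * 1e6 * pipes / 1e9

# Assumed core clocks -- widely reported figures, not stated above.
gffx_fill = fill_rate_gpixels(500, 8)    # 4.0 Gpixels/s
r9700_fill = fill_rate_gpixels(325, 8)   # 2.6 Gpixels/s

# With identical pipe and texture-unit counts, the GFFX's nominal
# fill-rate edge comes entirely from its clock speed.
```

Which is the whole argument in two lines: same width, different clocks, and the clock advantage is exactly what the Dustbuster is there to sustain.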

I could go on. The GeForce FX features DirectX 9 compliance, floating-point color formats, adaptive anisotropic filtering, and an early Z routine to eliminate overdraw.

Just like the Radeon 9700.

The GeForce FX offers an antialiasing routine hitched to yet another marketing term; the new Intellisample replaces the outdated Accuview. Intellisample is gamma-corrected multisampling—just like the Radeon 9700. Only the color compression engine, which promises to conserve memory bandwidth, separates the GFFX from ATI's top chip. The thing is, the 9700's antialiasing is already very fast, and in true next-gen applications, GPU pixel processing power should become the big limitation, not memory bandwidth. (Update: Whoops, I forgot. The Radeon 9700 has color compression that it uses with antialiasing, as well.)

Nymph demo chick: hot

You get the point. Other than slightly higher clock speeds, the GeForce FX doesn't appear to offer any compelling advantages over the Radeon 9700. TSMC's 0.13-micron fab process has proven to be much more of a headache and a liability than anything else, and as a result, the GFFX's availability date still hangs in the air, at least four months out. The use of DDR-II-type memory instead of conventional DDR and wider memory paths offers pedestrian advantages like potentially lower board costs and simpler PCB layouts, but those benefits will be available to ATI, too, as its graphics cards make the transition to DDR-II memory.

Now, none of this is to say the GeForce FX doesn't have its appeal. For instance, the nymph chick in that NVIDIA demo is hot. Plus, any product as good as—or possibly even a little better than—the current Radeon 9700 Pro is one helluva spectacular graphics chip. The GeForce FX promises to be superior to the Radeon 9700 in extreme cases where loads of pixel shader ops need to be executed in a single pass for performance reasons (though these are definitely non-gaming scenarios we're talking about here). And NVIDIA's overall assets as a company, from always-solid drivers to good board manufacturing partners to developer relations initiatives like Cg, should propel the GeForce FX to success.

That success, when it comes, will be much needed. Only now are the true effects of NVIDIA's missed product cycle with NV30 coming into focus. NVIDIA is no longer the graphics technology leader, in title or, soon, in sales. NVIDIA has held on to its market share over the past few quarters, even with the Radeon 9700 on the scene, because its mainstream and low-end products were still very competitive. That won't be the case for much longer, as ATI pushes its R300 and R200 technology generations down into the mainstream and value segments, respectively. Already, the Radeon 9000 Pro is the best choice for under $100, and soon, the Radeon 9500 and 9500 Pro will fill store shelves, ready to bring floating-point pixels to all the good little Christmas shoppers. All NVIDIA has to counter with are warmed-over versions of the GeForce and GeForce3 graphics cores mated to AGP 8X interfaces. That is to say, NVIDIA is a full technology generation behind in the value and mainstream market segments. Word has it that the GeForce FX-derived NV31 and NV34 chips are just now entering tape-out at TSMC, and in all likelihood, those chips won't hit the market until a month or more after the first NV30-based cards arrive.

Of course, the GeForce FX may be pretty darned fast when it arrives. The benchmarks will tell that story. Also, of course, ATI may have a faster variant of the Radeon 9700 on store shelves before the GFFX arrives. But now that we've had a real whiff of the GeForce FX vapors, the reality is clear: this concoction isn't potent enough to freeze the market for four to six months. If you want a next-gen graphics card now, you might as well go pick up a Radeon 9500 or 9700 and start enjoying it right away.  TR

Tags: Graphics
