
NVIDIA's GeForce 6600 GT graphics processor


The green team uncorks a jug o' whup-ass
— 8:00 AM on September 7, 2004

WE'VE BEEN HEARING constantly, all summer long, about fancy new high-end PC hardware that's bat-out-of-hell fast but worth more than the GDP of Lithuania. New Pentium 4 Extreme Editions, Athlon 64 FXes, and especially 16-pipe graphics cards like the GeForce 6800 Ultra and Radeon X800 XT Platinum Edition have dominated the news and reviews. These expensive toys have tempted many of us, no doubt, but generally, a graphics card that costs 500 bucks is a bit of a reach for most folks. With Doom 3 on the loose and Half-Life 2 just over the horizon, it's enough to make a grown man cry, or to make a grown man's wife disown him for charging a mortgage payment's worth of hardware on the MasterCard.

We knew, though, when NVIDIA and ATI introduced their new 16-pipe high-end graphics chips that eventually the other shoe would have to drop. Eventually, the massive performance increases we saw in high-end graphics cards would cascade down the GPU makers' product lines, into their "mainstream" cards. Fortunately, folks, that time has come. NVIDIA's GeForce 6600 series is ready to debut, with eminently reasonable price tags from about $149 to $199. I've had the $199 card, the GeForce 6600 GT, in Damage Labs for a week now, and I'm nearly beside myself. Read on to find out why.

The GeForce 6600 GT
As you may recall from our preview of the GeForce 6600 line, the graphics chip that powers the 6600 series is code-named NV43. This chip is derived from the NV40 GPU found on GeForce 6800 cards, but to keep things cheap, NVIDIA has essentially chopped the number of functional units on the NV40 in half. NV43 has two of the pixel-pipeline "quads" found on NV40, for a total of eight pixel pipelines. Likewise, NV43 has three vertex shader units to the NV40's six, and NV43 has a 128-bit path to memory instead of the 256-bit interface used in NV40.
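
To put that halving in rough per-clock terms, here's a quick back-of-the-envelope sketch. It uses only the unit counts and bus widths mentioned above, so the numbers are theoretical peaks per clock cycle, not measured throughput.

# Rough per-clock comparison of NV43 vs. NV40, using only the
# unit counts and bus widths cited in the paragraph above.
nv40 = {"pixel_pipes": 16, "vertex_units": 6, "mem_bus_bits": 256}
nv43 = {"pixel_pipes": 8,  "vertex_units": 3, "mem_bus_bits": 128}

for name, chip in (("NV40", nv40), ("NV43", nv43)):
    pixels_per_clock = chip["pixel_pipes"]           # one pixel per pipe per clock, peak
    mem_bytes_per_clock = chip["mem_bus_bits"] // 8  # bus width converted to bytes per memory clock
    print(f"{name}: {pixels_per_clock} pixels/clk, "
          f"{chip['vertex_units']} vertex units, "
          f"{mem_bytes_per_clock} bytes/clk of memory traffic")

# NV40: 16 pixels/clk, 6 vertex units, 32 bytes/clk of memory traffic
# NV43:  8 pixels/clk, 3 vertex units, 16 bytes/clk of memory traffic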

So the GPU that powers the GeForce 6600 GT has half the oomph, by many measures, of the one in the GeForce 6800 Ultra. Don't let that put you off, though. The NV43 retains all the goodness of the NV40 architecture, including powerful pixel shaders that have propelled the GeForce 6800 Ultra into contention with the Radeon X800 XT PE, despite a marked clock speed deficit. The NV43 also inherits 32-bit floating point color precision, support for Shader Model 3.0, and a host of other features from NV40. (If all of this sounds like crazy talk to you, you may want to start by reading our review of the GeForce 6800 Ultra, which introduces some of these features.)

The NV43 doesn't share its big brother's penchant for sucking up power and putting out heat, though. The NV43 is fabbed at TSMC on a 110nm process that's more advanced than the 130nm process used for the NV40. As a result, the NV43 runs cool and fast, with only a minimalist cooler, like so:


The GeForce 6600 GT reference card

With only this puny little cooler and naked memory chips, the GeForce 6600 GT hits clock speeds that the GeForce FX 5800 Ultra required a Dustbuster to achieve: 500MHz for the GPU core and 1000MHz for the memory. At these speeds, this eight-pipe wonder has more pixel-pushing power than a GeForce FX 5950 Ultra or Radeon 9800 XT. That might be your first hint that the 6600 GT is something special.
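
A little napkin math shows why. The 6600 GT figures below come straight from the clocks above; the Radeon 9800 XT and GeForce FX 5950 Ultra entries are approximate stock specs from memory, so treat them as ballpark numbers rather than gospel.

# Theoretical peak fill rate and memory bandwidth.
# The 6600 GT line uses the clocks cited above; the other two
# cards' specs are approximate stock figures for comparison only.
cards = {
    #                        pipes, core MHz, bus bits, effective mem MHz
    "GeForce 6600 GT":       (8, 500, 128, 1000),
    "Radeon 9800 XT":        (8, 412, 256,  730),   # approximate
    "GeForce FX 5950 Ultra": (4, 475, 256,  950),   # approximate; a 4x2 pipe design
}

for name, (pipes, core_mhz, bus_bits, mem_mhz) in cards.items():
    fill_gpix = pipes * core_mhz / 1000              # gigapixels per second, peak
    bandwidth_gb = bus_bits / 8 * mem_mhz / 1000     # gigabytes per second, peak
    print(f"{name}: {fill_gpix:.1f} Gpixels/s fill, {bandwidth_gb:.1f} GB/s bandwidth")

# GeForce 6600 GT:       4.0 Gpixels/s fill, 16.0 GB/s bandwidth
# Radeon 9800 XT:        3.3 Gpixels/s fill, 23.4 GB/s bandwidth
# GeForce FX 5950 Ultra: 1.9 Gpixels/s fill, 30.4 GB/s bandwidth

Notice, too, that the 128-bit bus leaves the 6600 GT with quite a bit less memory bandwidth than those older 256-bit cards, something to keep in the back of your mind once antialiasing and high resolutions enter the picture.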

You'll want to notice, also, that the card pictured above is the PCI Express version of the 6600 GT. The NV43 GPU is NVIDIA's first native PCI Express design. The first GeForce 6600 GTs will be PCI-E versions, but AGP versions are coming. NVIDIA plans to use its "HSI" bridge chip to translate for the NV43 GPU on AGP cards. Before you go complaining that AGP is what you want, you may want to look closer. That little connector on the top edge of the card will allow a pair of GeForce 6600 GT cards to run in an SLI configuration on the right motherboard for double the rendering power, but SLI is a PCI Express-only affair.

Whether AGP or PCI-E, though, this puppy has loads of potential. Fortunately, for the first time in a long, long time, we're able to see exactly how well a new graphics card fulfills its potential, because we have a slew of next-generation games for testing. Let's see how it fares.