NVIDIA’s GeForce 6600 GT AGP graphics card

WHILE WE WERE extremely impressed with the GeForce 6600 GT when it was launched back in September, the card’s PCI Express X16 interface left us longing for an AGP flavor. It wasn’t that we didn’t like PCI Express, but the only commercially available PCI Express platform was saddled with the Pentium 4’s comparatively poor gaming performance. Since September, we’ve previewed Athlon 64 chipsets with PCI Express support from ATI, NVIDIA, and VIA, but none are available in production motherboards just yet. You’ll still need an LGA775 Pentium 4 or Celeron processor if you want to get in on PCI Express graphics.

Fortunately, you don’t need PCI Express to get in on the GeForce 6600 GT, at least not anymore. With a little help from its High-Speed Interconnect (HSI) bridge chip, NVIDIA has brought the GeForce 6600 GT to AGP. Was anything lost in the translation? Read on as we explore the GeForce 6600 GT AGP’s performance against its PCI Express counterpart and mid-range competition.

The GeForce 6600 GT AGP
Before we get into the GeForce 6600 GT AGP, I suggest you read through our review of the GeForce 6600 GT for PCI Express. The two cards are quite similar, so rather than repeat myself, I’ll concentrate on the key differences between them.


The NVIDIA GeForce 6600 GT AGP reference card

To get things started, let me point out some physical features that differentiate the 6600 GT AGP from its PCI Express counterpart. First, note that the GeForce 6600 GT AGP has an auxiliary power connector. No such connector is required on the PCI Express GeForce 6600 GT, which can draw up to 75W from a PCI Express X16 graphics slot. The PCI Express GeForce 6600 GT can also be teamed with a second card using NVIDIA’s SLI technology. However, since SLI requires PCI Express, you won’t find an SLI connector on the GeForce 6600 GT AGP. Motherboards aren’t available with dual AGP slots, anyway.

Removing the GeForce 6600 GT AGP’s heat sinks reveals even more about the card. The fact that there are two heat sinks to remove is also telling.


The GeForce 6600 GT GPU and HSI bridge chip

The GeForce 6600 GT AGP has two heat sinks because there are actually two NVIDIA chips on board. The first is the GeForce 6600 GT GPU, otherwise known as NV43. The second is NVIDIA’s High-Speed Interconnect (HSI) bridge chip, which translates between AGP and PCI Express. NVIDIA currently uses the HSI bridge to make its graphics chips with AGP interfaces work on PCI-E motherboards. The GeForce PCX line of cards uses the bridge chip in this way. The NV43, however, already has a built-in PCI Express interface, so for the AGP version of the GeForce 6600 GT, NVIDIA is turning the HSI chip around and using it to bridge between the PCI-E graphics chip and an AGP motherboard.

As far as pixel pipelines, shaders, and core clock speeds are concerned, the GeForce 6600 GT AGP is identical to the GeForce 6600 GT. The two differ when it comes to memory clock speeds, though. Both cards have a 128-bit memory bus and GDDR3 memory, but at 450MHz, the GeForce 6600 GT AGP’s memory is clocked 50MHz lower than the PCI Express version of the card. That 50MHz deficit translates to 1.6GB/sec less memory bandwidth for the GeForce 6600 GT AGP, which could make the card slightly slower than its PCI Express counterpart.
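That 1.6GB/sec figure falls straight out of the standard peak-bandwidth formula. Here’s a minimal Python sketch (the helper function and its name are ours, not anything from NVIDIA) that reproduces the numbers:

```python
def peak_bandwidth_gbps(mem_clock_mhz, bus_width_bits, transfers_per_clock=2):
    """Theoretical peak memory bandwidth in GB/s.

    GDDR3 is double data rate, so it moves data twice per clock;
    clocks here are actual (not DDR-effective) rates.
    """
    return mem_clock_mhz * transfers_per_clock * bus_width_bits / 8 / 1000

agp = peak_bandwidth_gbps(450, 128)   # GeForce 6600 GT AGP: 450MHz GDDR3 -> 14.4 GB/s
pcie = peak_bandwidth_gbps(500, 128)  # GeForce 6600 GT PCI-E: 500MHz GDDR3 -> 16.0 GB/s
deficit = pcie - agp                  # the 1.6 GB/s gap discussed above
```

The same formula with a 256-bit bus explains why older high-end cards like the Radeon 9800 Pro still hold a raw bandwidth edge despite slower memory clocks.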

 

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged. All graphics driver image quality settings were left at their defaults, with the exceptions that vertical refresh sync (vsync) was always disabled and geometry instancing was enabled on the Radeon cards.

Our test systems were configured like so (where two values are listed, the first belongs to the PCI Express platform and the second to the AGP platform):

Processor: Intel Pentium 4 Extreme Edition 3.4GHz
Front-side bus: 800MHz (200MHz quad-pumped)
Motherboard: Abit AG8 / Shuttle SB77
BIOS revision: 6.00PG / S00B
North bridge: Intel 915P / Intel 875P
South bridge: Intel ICH6R / Intel ICH5R
Chipset drivers: 6.0.1.1002
Memory size: 1GB (2 DIMMs)
Memory type: OCZ PC3200 EL Platinum Rev 2 DDR SDRAM at 400MHz
CAS latency: 2
Cycle time: 5
RAS to CAS delay: 2
RAS precharge: 2
Hard drive: Western Digital Raptor WD360GD 37GB
Audio: ICH6R/ALC658 / ICH5R/ALC658
Graphics: ATI Radeon X700 XT 128MB with CATALYST 4.11 drivers
ATI Radeon X700 Pro 256MB with CATALYST 4.11 drivers
NVIDIA GeForce 6600 GT 128MB with ForceWare 66.93 drivers
ATI Radeon 9800 Pro 256MB with CATALYST 4.11 drivers
NVIDIA GeForce 6800 128MB with ForceWare 66.93 drivers
NVIDIA GeForce 6600 GT AGP 128MB with ForceWare 66.93 drivers
OS: Microsoft Windows XP Professional with Service Pack 2

We’ll be comparing the GeForce 6600 GT’s performance with that of a number of mid-range AGP and PCI Express graphics cards. The 6600 GTs have 128MB of memory, but a few of the cards we’ll be testing them against come with 256MB of memory. It’s common practice for board vendors to differentiate their products by offering more memory, and while this never seems to improve performance on low-end cards, it just might do so for the latest crop of mid-range graphics chips and today’s more demanding games. All the cards we tested are available in roughly the same price range, give or take about $70, so the comparisons are reasonably fair regardless of memory size.

I should also note that our Radeon 9800 Pro 256MB is an underclocked Radeon 9800 XT. There are rumors swirling that newer Radeon 9800 Pro 256MB boards actually use Radeon 9800 XT chips running at lower clock speeds, although we’ve been unable to confirm this with ATI.

Be aware that there are inevitable platform differences when comparing AGP and PCI Express graphics cards. Differences between the motherboards and core logic chipsets used for each platform could have an impact on performance, even with all other system components being identical.

The test systems’ Windows desktops were set at 1280×1024 in 32-bit color at an 85Hz screen refresh rate.

We used the following versions of our test applications:

The tests and methods we employed are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Pixel filling power
Higher theoretical fill rate and memory bandwidth numbers don’t always guarantee better performance in real world applications, but they’re a good place to start. Here’s how the GeForce 6600 GT’s theoretical peaks stack up.

  Core clock (MHz) Pixel pipelines  Peak fill rate (Mpixels/s) Texture units per pixel pipeline Peak fill rate (Mtexels/s) Memory clock (MHz) Memory bus width (bits) Peak memory bandwidth (GB/s)
GeForce 6200 300 4 1200 1 1200 TBD 128 TBD
Radeon X300 325 4 1300 1 1300 400 128 6.4
Radeon X600 Pro 400 4 1600 1 1600 600 128 9.6
GeForce FX 5700 Ultra 475 4 1900 1 1900 900 128 14.4
Radeon 9600 XT 500 4 2000 1 2000 600 128 9.6
Radeon X600 XT 500 4 2000 1 2000 740 128 11.8
GeForce 6600 300 8* 1200 1 2400 TBD 128 TBD
Radeon 9800 Pro 380 8 3040 1 3040 680 256 21.8
Radeon 9800 Pro 256MB 380 8 3040 1 3040 700 256 22.4
GeForce FX 5900 XT 400 4 1600 2 3200 700 256 22.4
Radeon X700 400 8 3200 1 3200 600 128 9.6
Radeon 9800 XT 412 8 3296 1 3296 730 256 23.4
Radeon X700 Pro 420 8 3360 1 3360 864 128 13.8
GeForce FX 5900 Ultra 450 4 1800 2 3600 850 256 27.2
GeForce FX 5950 Ultra 475 4 1900 2 3800 950 256 30.4
Radeon X700 XT 475 8 3800 1 3800 1050 128 16.8
GeForce 6800  325 12 3900 1 3900 700 256 22.4
GeForce 6600 GT AGP 500 8* 2000 1 4000 900 128 14.4
GeForce 6600 GT 500 8* 2000 1 4000 1000 128 16.0
GeForce 6800 GT 350 16 5600 1 5600 1000 256 32.0
Radeon X800 Pro 475 12 5700 1 5700 900 256 28.8
GeForce 6800 Ultra 400 16 6400 1 6400 1100 256 35.2

* The GeForce 6600 series pairs eight pixel shader pipelines with only four ROPs, so the chip can texture eight pixels per clock but write only four, which is why its peak pixel fill rate is half what the pipeline count alone would suggest.

Although the GeForce 6600 GT’s unconventional rendering pipeline yields a comparatively low single-texturing fill rate, the card’s multi-texturing fill rate is monstrous. The GeForce 6600 GT AGP has a huge multi-texturing fill rate advantage over the Radeon 9800 Pro, which will be its most direct competitor as far as AGP cards are concerned. ATI has no plans to move the X700 series to AGP.

The GeForce 6600 GT AGP’s impressive multi-texturing fill rate flirts with those of higher-end cards like the GeForce 6800, but its 128-bit memory bus brings it back down to mid-range territory when we look at peak memory bandwidth. Thanks to lower memory clock speeds, the GeForce 6600 GT AGP has 1.6GB/sec less memory bandwidth than its PCI-E counterpart.

Since theoretical peaks don’t always determine real-world performance, let’s see how the 6600 GT AGP’s fill rates look in some synthetic tests.
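The peak figures in the table above can be sanity-checked with a couple of lines of arithmetic. This Python sketch (the helper and its parameter names are ours) reproduces a few representative rows:

```python
def fill_rates(core_mhz, pipelines, tmus_per_pipe=1, rops=None):
    """Return (peak Mpixels/s, peak Mtexels/s) for a GPU.

    Pixel fill rate is limited by how many pixels the chip can write
    per clock (its ROPs); texel fill rate by pipelines * texture units.
    If rops isn't given, assume one ROP per pixel pipeline.
    """
    if rops is None:
        rops = pipelines
    return core_mhz * rops, core_mhz * pipelines * tmus_per_pipe

fill_rates(500, 8, rops=4)           # GeForce 6600 GT: (2000, 4000)
fill_rates(380, 8)                   # Radeon 9800 Pro: (3040, 3040)
fill_rates(400, 4, tmus_per_pipe=2)  # GeForce FX 5900 XT: (1600, 3200)
```

The 6600 GT line makes the ROP limit obvious: its texel rate doubles its pixel rate, while conventional one-TMU, one-ROP-per-pipe designs like the Radeon 9800 Pro post identical figures in both columns.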

True to their theoretical peaks, the GeForce 6600 GTs lead the way when it comes to multi-texturing fill rate. 3DMark05’s single-texturing fill rate test looks like it might be memory bandwidth limited, at least with the 6600 GT AGP, which is clearly slower than its PCI-E counterpart.

Shader power
3DMark05 also has a trio of handy shader performance tests. The results of these tests don’t necessarily predict real-world gaming performance, but they may help us to explain any performance differences we see later. Like some games, 3DMark05’s shader tests have optimized code paths for shader models 2.0, 2.0b, and 3.0. I tested each card with its highest supported shader model: 3.0 for the 6800 and 6600 GTs, 2.0b for the X700s, and 2.0 for the 9800 Pro.

The 6600 GT AGP’s performance in the pixel shader test is quite strong, but the vertex shader tests yield less impressive results. The Radeons clearly have more vertex shader power than the GeForce 6600 GT. Somewhat surprisingly, the Radeon X700 cards beat out the GeForce 6800, as well.

Although the vertex shader units from ATI and NVIDIA aren’t entirely comparable in terms of performance, these results reflect the architectural differences between the Radeon X700 and GeForce 6600 GPUs. The 6600 has three of NVIDIA’s vertex units, while the Radeon X700 has six of ATI’s. Also, the higher-end GeForce 6800 cards have six vertex shader units, but the vanilla 6800 has only five, which explains its relatively unimpressive performance here, too.

 

Doom 3 – Delta Labs
We’ll kick off our gaming benchmarks with Doom 3. Our first Doom 3 test uses a gameplay demo we recorded inside the Delta Labs complex, and it represents the sorts of graphics loads you’ll find in most of the game’s single-player levels. We’ve tested with Doom 3’s High Quality mode, which turns on 8X anisotropic filtering by default.

The GeForce 6600 GT AGP’s performance in our Doom 3 Delta Labs demo is fantastic. The card averages over 60 frames per second at 1600×1200 and is nearly as fast as the more expensive GeForce 6800. The 6600 GT AGP loses some of its luster when we turn on 4X antialiasing, though. With 4X AA, the card is considerably slower than the GeForce 6800, but still much faster than the Radeon 9800 Pro.

 

Doom 3 – Heat Haze
This next demo was recorded in order to test a specific effect in Doom 3: that cool-looking “heat haze” effect that you see whenever a demon hurls a fireball at you. We figured this effect would be fairly shader intensive, so we wanted to test it separately from the rest of the game.

In our more demanding heat haze demo, the GeForce 6600 GT AGP does something a little odd: it produces higher frame rates than its PCI-E counterpart. There are a number of possible explanations for the performance difference, which is larger at lower resolutions, but I’ll hold off on guessing until we see a few more test results.

 

Counter-Strike Source: Video Stress Test
Counter-Strike: Source includes a video stress test that’s essentially the same test Valve used for benchmarking Half-Life 2 performance last year. The video stress test presents a sort of worst-case scenario for graphics in the Source engine, layering surfaces with multiple pixel shader effects on top of each other. If a card does well in the video stress test, you can probably expect it to run Half-Life 2 quite well.

By the time you read this, Valve should have released Half-Life 2 to the masses. It looks like the GeForce 6600 GT will handle the game quite well—better than our trio of Radeons, if the Video Stress Test is to be believed.

I can’t help but be a little puzzled by some of the performance numbers in the VST with 4X antialiasing and 8X anisotropic filtering. Our 128MB PCI Express cards, the GeForce 6600 GT and Radeon X700 XT, perform worse than one might expect given the showings of our GeForce 6600 GT AGP and Radeon X700 Pro. The Radeon X700 Pro’s advantage could be the result of its 256MB of memory, but the difference in 6600 GT performance makes me wonder if the PCI Express bus may also be playing a role.

 

Counter-Strike Source
Quite possibly the most popular multiplayer first-person shooter around, Counter-Strike has been ported to Valve’s new Source engine. Our Counter-Strike: Source test uses a demo of online gameplay on the cs_italy map.

The GeForce 6600 GT AGP is way ahead of the Radeons in Counter-Strike: Source, at least with antialiasing and anisotropic filtering disabled. With 4X AA and 8X aniso, the 6600 GT AGP’s performance is roughly on par with the GeForce 6800, Radeon 9800 Pro, and Radeon X700 Pro, all three of which are 256MB cards.

Again, the PCI Express GeForce 6600 GT and Radeon X700 XT underperform with antialiasing and aniso enabled. The fact that they’re both 128MB PCI Express cards caused Scott Wasson, our resident graphics guru, to muse that the PCI Express bus may have a greater graphics memory footprint than AGP. If that’s the case, 128MB PCI-E cards may be slower in applications that demand a lot of graphics memory.

 

Far Cry – Pier
We’re using the 1.3 version of Far Cry, which includes support for shader models 2.0, 2.0b, 3.0, and geometry instancing. To maximize each card’s potential, I tested the GeForce 6 series cards with shader model 3.0, the X700s with 2.0b, and the Radeon 9800 Pro with 2.0. Geometry instancing is already a part of the shader model 3.0 code path, but it needs to be enabled using a command line variable with the shader model 2.0 and 2.0b paths.

We used “Very high” for all in-game quality settings, with the exception of water, which was set to “Ultra high.” In other words, everything was maxed out.

The Pier level in Far Cry is an outdoor area with dense vegetation, and it makes good use of geometry instancing to populate the jungle with foliage.

The GeForce 6600 GT AGP looks quite comfortable in Far Cry’s dense jungles.

 

Far Cry – Volcano
Like our Doom 3 “heat haze” demo, the Volcano level in Far Cry includes lots of pixel shader warping and shimmering.

The 6600 GT AGP doesn’t skip a beat when we move inside Far Cry’s volcano. Here it manages to sneak to a small lead over the Radeon 9800 Pro.

For some reason, the X700 XT stutters through the volcano demo at 1024×768, but not at higher resolutions. I double- and triple-checked the results, running through the demo a total of nine times, but the X700 XT’s scores were consistently within a few frames per second of each other.

 

Far Cry – High Dynamic Range lighting
Far Cry 1.3 also adds support for OpenEXR high dynamic range (HDR) lighting effects. Here’s what HDR lighting looks like in action using the /r_hdrrendering 7 console variable:


Without HDR lighting


With HDR lighting

Purdy. Unfortunately, Far Cry’s HDR lighting effects only work on GeForce 6-series cards, so I wasn’t able to test performance on the Radeons. Maybe that’s not such a bad thing, because at least on the GeForce cards, the performance hit associated with HDR lighting is huge:

As lush as the HDR lighting is, the 6600 GT barely manages 30 frames per second at 1024×768. It looks great, but on the GT, I’d rather have higher resolutions with antialiasing and anisotropic filtering turned up.

 

Rome: Total War
To get a bit of a break from the first-person shooters, we’re testing this real-time strategy game with some very fancy graphics of its own. We used FRAPS to measure performance in this game during the first 150 seconds of the tutorial battle. This sequence is scripted and repeatable, and it switches between birds-eye views and close-ups of the armies. Of course, we turned up all the game’s eye candy to max before testing.

Rome: Total War doesn’t want to go any higher than about 45 frames per second on any of the cards. Of course, that’s just an average frame rate. What about performance across the length of the demo?

Performance is pretty consistent across all six cards, although the GeForce 6600 GT AGP does manage slightly higher frame rates in a few areas. The card doesn’t dip below 30 frames per second at any point, either.

 

Rome: Total War – continued

Antialiasing and anisotropic filtering give us some separation in Rome: Total War, and the GeForce 6600 GT AGP looks very good indeed.

The GeForce 6600 GT AGP shows no alarming performance spikes or drops across the length of the tutorial demo. All three GeForce cards clearly have an advantage through the first 50 or so seconds of the demo, which consists mainly of open panning shots.

 

Xpand Rally
The Xpand Rally single-player demo is a new addition to our graphics benchmark suite. To test this game, I used FRAPS to capture frame rates during the replay of a custom-recorded demo.

The GeForce 6600 GT AGP fares well in Xpand Rally and is faster than its PCI Express counterpart yet again. Oddly, it’s also faster than the GeForce 6800. Here’s how the cards perform across the length of the demo:

Apart from the GeForce 6800’s lower-than-expected performance, our second-by-second frame rate results don’t reveal anything alarming.

 

Xpand Rally – continued

Despite having a lower memory clock, the GeForce 6600 GT AGP continues to best its PCI Express counterpart. With 4X antialiasing and 8X aniso, the card is sitting pretty in Xpand Rally.

Second-by-second, the GeForce 6600 GT AGP’s performance is golden. The card doesn’t drop below 40 frames per second at 1024×768 with 4X AA and 8X AF, which is impressive indeed.

 

3DMark05
None of the drivers we’re using are approved by Futuremark for use with 3DMark05, largely because they’re too new. Keep that in mind as you look over the results.

Although its place in the pack varies slightly from test to test, the GeForce 6600 GT AGP turns in a strong performance in 3DMark05’s game tests.

Note that our 128MB PCI-E cards are having problems keeping up again. This time it’s in 3DMark05’s first game test.

 

Texture filtering

From no aniso to 8X, the GeForce 6600 GT AGP scales predictably. There’s almost no performance hit going from no anisotropic filtering to 8X, so it’s definitely worth turning up.

Edge antialiasing

Antialiasing isn’t as free as aniso, but at lower resolutions, the GeForce 6600 GT AGP handles it smoothly. As we scale from no antialiasing to 2X and 4X, the 6600 GT AGP keeps pace with its PCI-E counterpart.

 
Conclusions
Despite using slightly slower memory, the GeForce 6600 GT AGP easily measures up to the original PCI Express version. The GT doesn’t have much of a problem knocking off its main AGP competition, the Radeon 9800 Pro, either. This one’s simple. If you’re looking to spend about $200 on a new AGP graphics card, you want the GeForce 6600 GT AGP. Gaming frame rates are great, Doom 3 performance borders on intimidating, and all the little extras that NVIDIA packs into its ForceWare graphics drivers are gravy.

Although the GeForce 6600 GT AGP is impressive on its own merits, it’s an even safer bet because ATI seems content to challenge the card with last year’s Radeon 9800 Pro. The 9800 Pro does offer some advantages, including a 256-bit memory bus and a more conventional eight-pipe architecture, but those things don’t translate to better overall performance than the 6600 GT AGP. An AGP version of the Radeon X700 XT might stand a better chance, but ATI seems content to offer the 9800 Pro for AGP systems.

In the end, NVIDIA deserves credit not only for delivering a sweet $200 graphics card in the GeForce 6600 GT AGP, but also for having the foresight to develop the flexible HSI bridge chip that makes this card possible. 
