AFTER FIELDING a number of lackluster mid-range graphics chips based on its GeForce FX architecture, NVIDIA finally got its act together with the GeForce FX 5700 Ultra. When we reviewed the 5700 Ultra back in October, the card’s performance in DirectX 8-class games was at least as good as, and often much better than, that of ATI’s Radeon 9600 XT. Unfortunately, the 5700 Ultra stumbled in a number of DirectX 9-class games and synthetic benchmarks, casting doubt on how the card might handle next-generation DirectX titles.
Not content to bet the mid-range farm on the 5700 Ultra, NVIDIA has added yet another GeForce FX card to its lineup: the GeForce FX 5900 XT. The 5900 XT will share the same $200 price point as NVIDIA’s existing GeForce FX 5700 Ultra, and both will compete with ATI’s $200 Radeon 9600 XT. This new card’s “XT” moniker suggests NVIDIA wants to knock the Radeon 9600 XT off its pedestal, and NVIDIA has even whipped up an answer to ATI’s much-lauded Half-Life 2 bundle to sweeten the 5900 XT package.
Can its pseudo-eight-pipe graphics core help elevate the GeForce FX 5900 XT above the competition? There’s only one way to find out.
eVGA’s GeForce FX 5900 SE
Don’t let its name fool you. eVGA’s GeForce FX 5900 SE is very much a 5900 XT. NVIDIA’s partners are free to name cards as they please, and most will be using the GeForce FX 5900 XT name. All GeForce FX 5900 XT cards should share the same core and memory clock speeds, and similar board layouts and memory configurations. Here are the specs on eVGA’s GeForce FX 5900 SE:
| Specification | Detail |
| --- | --- |
| Peak pixel fill rate* | 1600 Mpixels/s |
| Texture units/pixel pipeline | 2 |
| Textures per clock | 8 |
| Peak texel fill rate | 3200 Mtexels/s |
| Memory type | BGA DDR2 SDRAM |
| Memory bus width | 256-bit |
| Peak memory bandwidth | 22.4GB/s |
| Ports | VGA, DVI, composite and S-Video outputs |
| Auxiliary power connector | 4-pin Molex |

*The NV35 graphics chip renders four conventional (color + Z) pixels per clock, but is capable of performing eight operations per clock for Z pixels, textures, and stencil and shader ops.
eVGA’s GeForce FX 5900 SE is a pretty plain-looking card, which is just fine by me. As attractive as colored boards with blinking lights can be, how often are you actually looking at your PC’s internals?
In a move that will no doubt delight owners of Shuttle’s small form factor XPC systems, the GeForce FX 5900 SE uses a single-slot cooler that gets along just fine with Shuttle’s nonstandard AGP slot layout. The 5900 SE’s cooling fan is nice and quiet, too.
eVGA uses large memory heat sinks to cool the GeForce FX 5900 SE’s memory chips. Incidentally, those chips are only found on one side of the board.
Like just about every other consumer graphics card on the planet, eVGA’s 5900 SE has VGA, DVI, and S-Video output ports. The card also comes with a DVI-to-VGA adapter and an S-Video cable.
In addition to a couple of cables, eVGA’s 5900 SE comes with a pretty stacked software bundle. For starters, the bundle includes full versions of NVDVD 2.0, America’s Army, and Ghost Recon, but that’s not all. NVIDIA has also announced an exclusive deal to bundle the recently released WWII shooter Call of Duty with its GeForce FX 5900 XT graphics cards. Just about every one of NVIDIA’s partners, including eVGA, will be getting in on the deal.
NVIDIA’s Call of Duty deal represents roughly $50 of value for anyone who was planning on picking up the game anyway. Better still, NVIDIA is providing its partners with full-version Call of Duty CDs rather than coupons or vouchers, so the game should come right in the box.
NVIDIA certainly isn’t the first graphics chip manufacturer to announce an exclusive, top-tier game bundle for its mid-range graphics cards; ATI announced its Radeon XT Half-Life 2 bundle back in September. Call of Duty might not have the name recognition of Half-Life 2, and it probably wasn’t nearly as eagerly anticipated as Valve’s upcoming sequel. But Call of Duty is available today, and there’s no telling when Radeon XT owners will be able to redeem their Half-Life 2 coupons. In the end, I’ll probably spend more time playing Half-Life 2 than Call of Duty, but there’s only so much amusement I can wring from a Half-Life 2 coupon. Making paper airplanes and origami is fun for the first five minutes, but the replay value is pretty weak. (In all fairness, ATI and Valve are offering an interim game pack for Radeon buyers that includes a total of six downloadable Valve titles from years past. They are older titles, though.)
Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.
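The aggregation is nothing fancy, just an arithmetic mean of the individual runs. A minimal sketch of the idea (the frame rates below are made-up illustration values, not our actual benchmark data):

```python
# A sketch of how results are aggregated: each test is run three times
# and the arithmetic mean of the runs is reported. The numbers here are
# hypothetical, not real benchmark output.

def average_runs(fps_runs):
    """Return the mean of a list of per-run frame rates."""
    return sum(fps_runs) / len(fps_runs)

runs = [61.2, 59.8, 60.4]            # three hypothetical runs of one test
print(round(average_runs(runs), 1))  # 60.5
```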
Our test system was configured like so:
| Component | Details |
| --- | --- |
| Processor | AMD Athlon 64 3200+ 2.0GHz |
| Front-side bus | HT 16-bit/600MHz downstream, HT 8-bit/600MHz upstream |
| Motherboard | Chaintech Zenith ZNF3-150 |
| Chipset | NVIDIA nForce3 Pro 150 |
| Chipset drivers | nForce 3.13 |
| Memory size | 512MB (1 DIMM) |
| Memory type | Corsair XMS3500 PC3200 DDR SDRAM (400MHz) |
| Graphics cards | GeForce FX 5900 SE 128MB, GeForce FX 5700 Ultra 128MB, Radeon 9600 XT 128MB |
| Graphics drivers | Detonator FX 52.16 (GeForce FX), CATALYST 3.9 (Radeon) |
| Storage | Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive |
| OS | Microsoft Windows XP Professional |
| OS updates | Service Pack 1, DirectX 9.0b |
Today we’ll be looking at the GeForce FX 5900 XT’s performance against its $200 competition: the GeForce FX 5700 Ultra and ATI’s Radeon 9600 XT. I’ve included results for the 5900 XT with both NVIDIA’s previous 52.16 drivers and the new 53.03s.
Though ATI’s Catalyst 3.9 drivers support Overdrive automatic overclocking for the Radeon 9600 XT, we ran the card with Overdrive disabled. Since Overdrive is controlled by the GPU core temperature, which is influenced by variable ambient system temperatures and the thermal characteristics of individual graphics cores, it’s hard to come up with reproducible scores.
It’s important to mention that, at press time, the NVIDIA Detonator 53.03 drivers we used for testing weren’t approved by Futuremark for use with 3DMark03; Futuremark has yet to evaluate the drivers in question. Keep in mind that future 3DMark03 patches may combat optimizations in unapproved drivers, which could impact performance in the benchmark.
The test system’s Windows desktop was set at 1024×768 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
We used the following versions of our test applications:
- Futuremark 3DMark03 Build 340
- Comanche 4 demo benchmark
- Quake III Arena v1.31 with trdemo1.dm_67
- Serious Sam SE v1.07 with Demo0003
- Unreal Tournament 2003 with trtest1.dem
- Splinter Cell v1.2 with TRKalinatekDemo.bin
- Tomb Raider: Angel of Darkness v49 patch
- Gun Metal benchmark v1.20
- ShaderMark 2.0
- rthdribl 1.2
- Halo 1.02
- Call of Duty with trdemo.dm_1
All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
Theoretical fill rate and memory bandwidth peaks don’t necessarily dictate real-world performance, but they’re a good place to start.
| | Core clock (MHz) | Pixel pipelines | Peak fill rate (Mpixels/s) | Texture units per pixel pipeline | Peak fill rate (Mtexels/s) | Memory clock (MHz) | Memory bus width (bits) | Peak memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Radeon 9600 Pro | 400 | 4 | 1600 | 1 | 1600 | 600 | 128 | 9.6 |
| Radeon 9600 XT | 500 | 4 | 2000 | 1 | 2000 | 600 | 128 | 9.6 |
| GeForce FX 5600 Ultra | 400 | 4 | 1600 | 1 | 1600 | 800 | 128 | 12.8 |
| GeForce FX 5700 Ultra | 475 | 4 | 1900 | 1 | 1900 | 906 | 128 | 14.4 |
| GeForce FX 5900 XT | 400 | 4 | 1600 | 2 | 3200 | 700 | 256 | 22.4 |
| GeForce FX 5900 | 450 | 4 | 1800 | 2 | 3600 | 850 | 256 | 27.2 |
With a core clock speed of 400MHz, the GeForce FX 5900 XT can’t quite match the 5700 Ultra’s or Radeon 9600 XT’s single-texturing fill rate. However, the 5900 XT’s multi-texturing fill rate blows away the competition, as does its peak memory bandwidth, which more than doubles what’s available with the Radeon 9600 XT. It’s also worth noting that the 5900 XT’s NV35 graphics chip can, under certain circumstances, look a lot like an eight-pipe chip.
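The peaks in the table come from simple arithmetic: pixel fill rate is the core clock times the number of pixel pipelines, texel fill rate multiplies that by texture units per pipeline, and memory bandwidth is the effective (DDR) memory clock times the bus width in bytes. A quick sketch of that arithmetic (the helper names are mine, purely for illustration):

```python
# Peak-throughput arithmetic behind the comparison table. Memory clocks
# are the effective DDR data rates quoted in the table.

def peak_fill_rate_mpixels(core_mhz, pipelines):
    """Peak pixel fill rate: core clock x pixel pipelines."""
    return core_mhz * pipelines

def peak_fill_rate_mtexels(core_mhz, pipelines, tmus_per_pipe):
    """Peak texel fill rate: pixel fill rate x texture units per pipeline."""
    return core_mhz * pipelines * tmus_per_pipe

def peak_bandwidth_gbps(mem_mhz_effective, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective clock x bus width in bytes."""
    return mem_mhz_effective * (bus_width_bits / 8) / 1000

# GeForce FX 5900 XT: 400MHz core, 4 pipes, 2 TMUs/pipe, 700MHz memory, 256-bit bus
print(peak_fill_rate_mpixels(400, 4))     # 1600 Mpixels/s
print(peak_fill_rate_mtexels(400, 4, 2))  # 3200 Mtexels/s
print(peak_bandwidth_gbps(700, 256))      # 22.4 GB/s
```

Run the same numbers for the Radeon 9600 XT (500MHz core, 4x1 pipes, 600MHz memory on a 128-bit bus) and you get the 2000 Mtexels/s and 9.6GB/s figures in the table.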
Of course, theoretical peaks don’t always hold up in the real world. How does the 5900 XT deliver on its fill rate potential?
Despite having a lower theoretical single-texturing fill rate peak than both the 5700 Ultra and Radeon 9600 XT, the GeForce FX 5900 XT blows them both away in 3DMark03’s single-texturing fill rate tests. The 5900 XT is way out ahead in our multi-texturing test, too, nearly doubling the performance of the 5700 Ultra.
Given the GeForce FX’s unconventional shader architecture, it’s not easy to come up with theoretical expectations for pixel or vertex shader performance, so let’s cut right to the graphs.
The GeForce FX 5900 XT leads the pack in both pixel and vertex shader performance, though pixel shader performance drops noticeably with the latest drivers. Those same drivers dramatically increase vertex shader performance. Win some, lose some, I guess.
ShaderMark 2.0 is brand new and includes some anti-cheat measures to prevent drivers from applying questionable optimizations. The Radeons run the benchmark with straight pixel shader 2.0 code, but I’ve included results for the GeForce FX cards with partial-precision and extended pixel shaders, as well.
Some of ShaderMark 2.0’s shaders won’t run on the GeForce FX 5900 XT, or indeed any other GeForce FX chip. The problem is apparently related to floating point texture formats, which the GeForce FX architecture supposedly supports in hardware. NVIDIA claims that adding floating point texture format support to its drivers hasn’t been a priority, and even the latest 53.03s don’t appear to support the feature.
It can’t quite catch the Radeon 9600 XT, but the GeForce FX 5900 XT is still pretty fast in ShaderMark 2.0. The 5900 XT is much faster than the 5700 Ultra, no doubt because of the former’s ability to perform up to eight shader ops per clock.
Quake III Arena
Call of Duty
Unreal Tournament 2003
The 5900 XT performs well in our first wave of first-person shooters, but the 5700 Ultra steals a rare win in Call of Duty with antialiasing and anisotropic filtering disabled. Despite better performance in Quake III with the latest 53.03 Detonators, the 5900 XT looks more comfortable with the older 52.16 drivers.
Gun Metal benchmark
The 5900 XT makes quick work of the competition in Comanche 4 and Gun Metal. The card’s more than twice as fast as a Radeon 9600 XT with 4X antialiasing and 8X aniso.
Serious Sam SE
NVIDIA cards have long performed well in Serious Sam SE, and the 5900 XT is no exception. The 5900 XT is way out ahead of both the 5700 Ultra and Radeon 9600 XT, with and without 4X antialiasing and 8X aniso.
The 9600 XT’s performance with 4X antialiasing and 8X aniso at 1600×1200 is wildly erratic through the first 20 seconds of our benchmark test and essentially unplayable at this resolution. This wasn’t the case with ATI’s previous 3.8 Catalyst drivers, so I suspect ATI has a bug on its hands.
The 5900 XT puts on another show in Splinter Cell, where it’s about 50% faster than the Radeon 9600 XT across all resolutions.
No doubt in part because of NVIDIA’s aggressive driver optimizations, the 5900 XT runs away with 3DMark03’s game tests. The latest version of 3DMark03 was incompatible with a number of NVIDIA’s driver optimizations, but the new 53.03s appear to have a whole new set of tricks up their sleeve. That said, I couldn’t see any difference in rendering quality between the new 53.03s, the old 52.16s, or even ATI’s latest drivers in 3DMark03’s game tests.
Tomb Raider: Angel of Darkness
Tomb Raider: Angel of Darkness gets its own special little intro here because its publisher, EIDOS Interactive, has released a statement claiming that the V49 patch, which includes a performance benchmark, was never intended for public release. The V49 benchmark apparently fails to correctly load Tomb Raider’s GeForce FX-optimized code path, so it’s more a reflection of how the GeForce FX cards perform with default DirectX 9 code than anything else. Running the normal Tomb Raider game executable, without the benchmark mode enabled, loads the correct GeForce FX code path and promises better performance.
We’ve used these extreme quality settings from Beyond3D to give the GeForce FX 5900 SE a thorough workout in this DirectX 9 game.
I had a lot of problems getting the 9600 XT to run the Tomb Raider test at anything other than 1024×768, which is really a shame. ATI’s previous Catalyst 3.8 drivers didn’t have a problem with higher resolution tests, but I couldn’t get the Catalyst 3.9s to work properly above 1024×768.
It’s a pity that the Radeon 9600 XT doesn’t have a full suite of scores here, because it could have been an exciting race. The 5900 XT is quite a bit faster than the 5700 Ultra in Tomb Raider, and could potentially be faster than the 9600 XT. All that with an unoptimized DirectX 9 code path, too.
The 5900 XT is 50% faster than the competition with antialiasing and anisotropic filtering disabled, but its lead with 4X antialiasing and 8X aniso is much smaller.
I used the “-use20” switch with the Halo benchmark to force the game to use version 2.0 pixel shaders.
In Halo’s benchmark timedemo, the 5900 XT is out in front by about 50% again.
Real-Time High-Dynamic Range Image-Based Lighting
To test the GeForce FX 5900 SE’s performance with high-dynamic-range lighting, we logged frame rates via FRAPS in this technology demo at its default settings. The demo uses high-precision texture formats and version 2.0 pixel shaders to produce high-dynamic-range lighting, depth of field, motion blur, and glare, among other effects.
None of our GeForce FX cards renders the “rthdribl” demo perfectly, possibly because of their lack of support for floating point texture formats, but the 5900 XT still performs well. The Radeon 9600 XT is faster at lower resolutions, but the 5900 XT pulls even when we hit 1024×768.
ATI’s SMOOTHVISION gamma-corrected antialiasing does a better job of camouflaging jagged edges than the GeForce FX’s antialiasing schemes, but the 5900 XT is faster at 2X and 4X AA than the Radeon 9600 XT.
The 5900 XT is at the head of the class in our anisotropic filtering tests, though like every other GeForce FX, the 5900 XT’s aniso maxes out at 8X.
In testing, I managed to crank eVGA’s GeForce FX 5900 SE up to core and memory clock speeds of 460MHz and 880MHz, respectively. Considering that the vanilla GeForce FX 5900 is clocked at 450/850, that’s not a bad little overclock for a $200 graphics card.
As always, it’s important to note that just because I was able to get my GeForce FX 5900 XT sample stable and artifact-free at 460/880 doesn’t mean that every 5900 XT will be capable of those speeds. Some cards may clock higher, and some may not overclock at all.
Our overclocked 5900 XT gets a nice performance boost from its higher core and memory clock speeds, especially with antialiasing and aniso enabled.
If you’ve been following along, you’ve seen the GeForce FX 5900 XT simply dominate its $200 competition. Though the card is a little behind the Radeon 9600 XT in ShaderMark 2.0 and the “rthdribl” high dynamic range lighting demo, the 5900 XT makes up for it in nearly every other test. The fact that the 5900 XT is occasionally 50% faster than the 9600 XT is a little shocking, but it’s certainly good news for gamers looking for great performance at an affordable price.
As good as the 5900 XT is, I feel for NVIDIA’s partners who are also trying to sell 5700 Ultra boards. 5700 Ultras are retailing for just under $190 online, which wasn’t a bad deal until the 5900 XT came along at roughly the same price with much better performance and a copy of Call of Duty in the box. eVGA tells me that its 5900 XT-based GeForce FX 5900 SE is ready to ship, leaving little reason to go with a 5700 Ultra unless those cards’ prices fall dramatically.
The GeForce FX 5900 XT is no doubt a very fast, very affordable graphics option for enthusiasts and gamers on a budget, and right now I’d recommend it over the Radeon 9600 XT. However, I have some concerns about NVIDIA’s lack of driver support for floating point texture formats, which could become a more important issue as more DirectX 9 titles come to market. Adding support for floating point texture formats apparently hasn’t been a priority for NVIDIA’s driver team, but it should be. Perhaps NVIDIA could take some time away from optimizing for 3DMark03 and dedicate more software engineers to floating point texture support.
Optimization jabs aside, NVIDIA has put together a sweet package with the GeForce FX 5900 XT and its Call of Duty bundle. And don’t let eVGA’s GeForce FX 5900 SE name fool you; this card is packing as much goodness as everyone else’s 5900 XT cards.