FOR NEARLY a year, ATI’s Radeons have owned the mid-range graphics performance crown. ATI’s dominance started with the Radeon 9500 Pro, continued with the Radeon 9600 Pro, and was most recently refreshed with the 9600 XT. NVIDIA took a stab at the mid-range with its GeForce FX 5600s, but not even a faster respin of the GeForce FX 5600 Ultra had enough punch to take on the Radeon 9600 line, especially in DirectX 9 applications.
Before the Radeon 9500 Pro came along, NVIDIA’s GeForce4 Ti 4200 was the most widely recommended graphics card for budget-conscious enthusiasts, so NVIDIA knows what it takes to lead the mid-range market. With memories of the GeForce4 Ti 4200’s glory no doubt in mind, NVIDIA is ready to put up a fight for the mid-range performance crown with the GeForce FX 5700 Ultra. The 5700 Ultra is powered by the new NV36 graphics chip and promises more shader power, higher clock speeds, and greater memory bandwidth than its predecessor. On paper, NVIDIA’s third shot at this generation’s mid-range crown looks pretty good, but does it have the charm to capture the hearts and minds of gamers and PC enthusiasts, or will it be strike three with the budget-conscious crowd? Read on as we unleash the FX 5700 Ultra on ATI’s Radeon 9600 cards to find out.
Introducing NV36
The GeForce FX 5700 line is based on NVIDIA’s new NV36 graphics chip, which is essentially a mid-range version of the high-end NV35 chip found in NVIDIA’s GeForce FX 5900 and 5900 Ultra. The NV36-powered GeForce FX 5700 Ultra will replace the mid-range FX 5600 Ultra at a suggested retail price of $199, and is expected on retail shelves starting this Sunday.
NV36 doesn’t represent a major redesign of NVIDIA’s current GeForce FX architecture, but the chip does have a number of interesting characteristics that are worth highlighting.
- 4×1-pipe design – Like NV31, NV36 has four pixel pipelines with a single texture unit per pipe. The chip behaves like a four-pipe design regardless of the kind of rendering being done, which makes it a little easier to understand than something like NV30, which can act like an eight-pipe chip under certain circumstances.
- 128-bit memory bus – NV36’s memory bus is 128 bits wide, just like NV31’s. However, NV36 supports DDR2 and GDDR3 memory types in addition to standard DDR SDRAM. Initially, GeForce FX 5700 Ultra cards will ship with DDR2 memory chips running at 450MHz, but board manufacturers may eventually exploit the chip’s compatibility with different memory types to produce budget cards with DDR SDRAM or more exotic offerings with GDDR3.
- ‘mo shader power – The GeForce FX 5600 Ultra’s shader performance never really cut it against ATI’s mid-range Radeons, so NVIDIA has beefed up shaders for NV36. The chip boasts three vertex units that conspire to deliver triple the vertex processing power of NV31. NVIDIA also re-architected its programmable pixel shader for NV36, though no specific pixel shader performance claims are being made.
- Chip fabrication by IBM – Unlike the rest of NVIDIA’s NV3x graphics chips, NV36 is being manufactured using IBM’s 0.13-micron fabrication process. NV36 is the first product to emerge from NVIDIA’s recently announced partnership with IBM, and NVIDIA is quite happy with how well things have worked out so far. NV36 isn’t built using low-k dielectrics like NV38 or ATI’s RV360 GPUs, but it’s still clocked at a speedy 475MHz on the GeForce FX 5700 Ultra.
What’s particularly impressive about NV36’s fabrication is the fact that NVIDIA was able to get its very first chip sample from IBM up and running Quake just 50 minutes after the chip entered NVIDIA’s testing lab. A testament to IBM’s mad fabrication skills, the first A01 spin of NV36 silicon is actually being used for retail versions of the chip.
Overall, NV36 doesn’t represent a radical departure from the GeForce FX architecture; the chip should share all the perks that go along with “cinematic computing,” but it will also inherit a number of quirky personality traits that have thus far had a negative impact on performance.
NVIDIA continues to reiterate that its entire GeForce FX line is sensitive to instruction ordering and pixel shader precision. Optimized code paths can help NV36 and the rest of the GeForce FX line realize their full potential, but NVIDIA’s new Detonator 50 driver also has a few tricks up its sleeve to improve performance. You can read all about the Detonator 50 drivers in our GeForce FX 5950 Ultra review.
NV36: NVIDIA’s first GPU fabbed by IBM
The specs
Perhaps to illustrate just how close the GeForce FX 5700 Ultra is to store shelves, NVIDIA sent out retail cards instead of standard reference review samples. The eVGA e-GeForce FX 5700 Ultra that showed up on my doorstep came in a full retail box, shrink-wrapped and everything. Let’s have a quick look at the card’s spec sheet.
| GPU | NVIDIA NV36 |
| --- | --- |
| Core clock | 475MHz |
| Pixel pipelines | 4 |
| Peak pixel fill rate | 1900 Mpixels/s |
| Texture units per pixel pipeline | 1 |
| Textures per clock | 4 |
| Peak texel fill rate | 1900 Mtexels/s |
| Memory clock | 906MHz* |
| Memory type | BGA DDR2 SDRAM |
| Memory bus width | 128-bit |
| Peak memory bandwidth | 14.5GB/s |
| Ports | VGA, DVI, composite and S-Video outputs |
| Auxiliary power connector | 4-pin Molex |
* NVIDIA’s reference spec calls for an effective memory clock of 900MHz, but our sample’s memory was running at 906MHz.
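If you like to check vendors’ math, the spec sheet’s peak numbers fall straight out of the clock speeds. Here’s a quick sketch of the arithmetic; the only wrinkles are that DDR2 moves data on both clock edges, and that graphics marketing counts a gigabyte as an even 10^9 bytes.

```python
# Theoretical peaks for the GeForce FX 5700 Ultra, from its clocks alone.

core_clock_mhz = 475            # NV36 core clock
pixel_pipes = 4                 # 4x1-pipe design: one texture unit per pipe
tex_units_per_pipe = 1

# Each pipe emits one pixel (and samples one texel) per clock.
pixel_fill = core_clock_mhz * pixel_pipes                    # 1900 Mpixels/s
texel_fill = pixel_fill * tex_units_per_pipe                 # 1900 Mtexels/s

# DDR2 transfers on both clock edges, so our sample's 453MHz memory chips
# yield an effective 906MHz data rate across the 128-bit (16-byte) bus.
effective_mem_mhz = 453 * 2
bandwidth_gbs = effective_mem_mhz * 1e6 * (128 // 8) / 1e9   # ~14.5 GB/s

print(f"{pixel_fill} Mpixels/s, {texel_fill} Mtexels/s, {bandwidth_gbs:.1f} GB/s")
```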
The GeForce FX 5700 Ultra is all about high clock speeds and fancy memory. Quite honestly, I didn’t expect cards with 450MHz DDR2 memory chips to hit $200 price points this soon, but I’m certainly not going to complain. Profit margins on GeForce FX 5700 Ultras may be slimmer than with other cards, but that’s a good thing for consumers looking for the most bang for their buck.
Here are a few nudies of the e-GeForce FX 5700 Ultra to drool over before we get started with the benchmarks.
The GeForce FX 5700 Ultra looks imposing, and it is
475MHz with a single-slot cooler that doesn’t sound like a Dustbuster. Imagine that!
BGA DDR2 memory chips bring in the bandwidth
(Insert incessant whining about the lack of dual DVI here)
Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.
Our test system was configured like so:
| System | |
| --- | --- |
| Processor | Athlon XP ‘Thoroughbred’ 2600+ 2.083GHz |
| Front-side bus | 333MHz (166MHz DDR) |
| Motherboard | DFI LANParty NFII Ultra |
| Chipset | NVIDIA nForce2 Ultra 400 |
| North bridge | nForce2 Ultra 400 SPP |
| South bridge | nForce2 MCP-T |
| Chipset drivers | NVIDIA 2.45 |
| Memory size | 512MB (2 DIMMs) |
| Memory type | Corsair XMS3200 PC2700 DDR SDRAM (333MHz) |
| Graphics cards | GeForce FX 5600 Ultra 128MB, GeForce FX 5700 Ultra 128MB, Radeon 9600 Pro 128MB, Radeon 9600 XT 128MB |
| Graphics drivers | Detonator FX 52.16 (NVIDIA), CATALYST 3.8 (ATI) |
| Storage | Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive |
| OS | Microsoft Windows XP Professional |
| OS updates | Service Pack 1, DirectX 9.0b |
Today we’ll be comparing the GeForce FX 5700 Ultra to the GeForce FX 5600 Ultra (rev 2), and to a couple of Radeon 9600 flavors from ATI. NVIDIA was able to get us a set of WHQL-certified Detonator FX 52.16 drivers for testing, giving us our first peek at Microsoft-approved Detonator 50 drivers. The Detonator FX 52.16 drivers should be available for public download today.
The test system’s Windows desktop was set at 1024×768 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
We used the following versions of our test applications:
- FutureMark 3DMark03 Build 330
- Codecreatures Benchmark Pro
- Comanche 4 demo benchmark
- Quake III Arena v1.31 with trdemo1.dm_67
- Wolfenstein: Enemy Territory with demo0000.dm_82
- Serious Sam SE v1.07 with Demo0003
- Unreal Tournament 2003 with trtest1.dem
- Splinter Cell v1.2 with TRKalinatekDemo.bin
- Tomb Raider: Angel of Darkness v49 patch
- Gun Metal benchmark v1.20
- ShaderMark 2.0
- rthdribl 1.2
- Halo 1.02
- AquaMark3
All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
Fill rate
Theoretical fill rate and memory bandwidth peaks don’t necessarily dictate real-world performance, but they’re a good place to start.
| | Core clock (MHz) | Pixel pipelines | Peak pixel fill rate (Mpixels/s) | Texture units per pixel pipeline | Peak texel fill rate (Mtexels/s) | Memory clock (MHz) | Memory bus width (bits) | Peak memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Radeon 9600 Pro | 400 | 4 | 1600 | 1 | 1600 | 600 | 128 | 9.6 |
| Radeon 9600 XT | 500 | 4 | 2000 | 1 | 2000 | 600 | 128 | 9.6 |
| GeForce FX 5600 Ultra | 400 | 4 | 1600 | 1 | 1600 | 800 | 128 | 12.8 |
| GeForce FX 5700 Ultra | 475 | 4 | 1900 | 1 | 1900 | 906 | 128 | 14.5 |
When it comes to theoretical peaks, the GeForce FX 5700 Ultra is simply a monster. The card’s 475MHz core clock yields pixel and texel fill rates that rival the Radeon 9600 XT’s, but the real story is the card’s memory bandwidth. With DDR2 memory chips running at an effective 906MHz, the 5700 Ultra offers a whopping 14.5GB/s of memory bandwidth, roughly 50% more than the Radeon 9600 XT. ATI has caught some flak for the Radeon 9600 XT’s relatively unimpressive memory bandwidth, but the card has thus far held its own quite nicely against the GeForce FX 5600 Ultra. Against the bandwidth-rich 5700 Ultra, the Radeon 9600 XT may not be so lucky.
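As a sanity check, the entire table above can be regenerated with the same two formulas. A quick sketch, which also confirms that the 5700 Ultra’s bandwidth edge over the 9600 XT works out to about 51%:

```python
# (core MHz, pixel pipes, effective memory MHz); all four cards use 128-bit buses.
cards = {
    "Radeon 9600 Pro":       (400, 4, 600),
    "Radeon 9600 XT":        (500, 4, 600),
    "GeForce FX 5600 Ultra": (400, 4, 800),
    "GeForce FX 5700 Ultra": (475, 4, 906),
}

for name, (core, pipes, mem) in cards.items():
    fill = core * pipes                # Mpixels/s (= Mtexels/s with one TMU/pipe)
    bandwidth = mem * 1e6 * 16 / 1e9   # GB/s across a 128-bit (16-byte) bus
    print(f"{name:22} {fill:4} Mpixels/s  {bandwidth:4.1f} GB/s")

print(f"5700 Ultra vs. 9600 XT bandwidth: {14.5 / 9.6:.2f}x")  # ~1.51x
```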
Of course, theoretical peaks are sometimes worth little more than the paper they’re printed on. Let’s see how the FX 5700 Ultra’s fill rate specs pan out in practice.
In 3DMark03’s fill rate tests, the GeForce FX 5700 Ultra delivers the best single-textured fill rate performance, but is stuck behind both Radeons when it comes to multitexturing.
Shaders
When NVIDIA claimed that NV36’s shader power was much improved, it wasn’t kidding. In 3DMark03’s pixel shader 2.0 test, the GeForce FX 5700 Ultra is 50% faster than the 5600 Ultra, but even then it can’t catch the Radeon 9600 Pro. Vertex shaders are another story, though. The 5700 Ultra bursts to the front of the pack in 3DMark03’s vertex shader test. Most striking is the fact that the 5700 Ultra delivers more than double the vertex shader performance of the 5600 Ultra.
ShaderMark 2.0
ShaderMark 2.0 is brand new and includes some anti-cheat measures to prevent drivers from applying questionable optimizations. The Radeons run the benchmark with straight pixel shader 2.0 code, but I’ve included results for the GeForce FX cards with partial-precision and extended pixel shaders, as well.
The GeForce FX 5700 Ultra shows significantly better pixel shader performance in ShaderMark than the 5600 Ultra; in many cases the 5700 Ultra doubles the frame rate of its predecessor. However, as much of an improvement as the 5700 Ultra is over the 5600 Ultra, the card still can’t catch either Radeon 9600 in ShaderMark. Because NVIDIA’s Detonator FX drivers have yet to expose NV36’s hardware support for floating point texture formats, the GeForce FX cards aren’t able to complete a number of ShaderMark’s tests. Unfortunately, NVIDIA hasn’t yet set a timetable for including floating point texture support in its GeForce FX drivers, and it’s unclear when the functionality will be exposed to applications.
Quake III Arena
Wolfenstein: Enemy Territory
Unreal Tournament 2003
The GeForce FX 5700 Ultra offers improved performance in a number of first-person shooters, but we’re not looking at a revolution here. With the GeForce FX 5700 Ultra and Radeon 9600 XT each winning three of six tests, this one’s a wash.
Comanche 4
Codecreatures Benchmark Pro
Gun Metal benchmark
The GeForce FX 5700 Ultra comes out ahead in Comanche, Codecreatures, and Gun Metal. The card’s performance with antialiasing enabled is especially impressive, as is its huge performance advantage over the GeForce FX 5600 Ultra in Gun Metal.
Serious Sam SE
Serious Sam SE has always performed well on NVIDIA hardware, and that doesn’t change with the GeForce FX 5700 Ultra. The 5700 Ultra even comes out ahead with antialiasing and aniso enabled.
Splinter Cell
The GeForce FX 5700 Ultra is the only card to really distance itself from the pack in Splinter Cell and offer consistently better frame rates across multiple resolutions.
3DMark03
The GeForce FX 5700 Ultra and Radeon 9600 XT split 3DMark03’s game tests, but the Radeon comes out ahead by a hair if we look at the overall score. Notice how much the FX 5700 Ultra improves on its predecessor’s performance in the pixel shader-packed Mother Nature test; NVIDIA’s managed to improve performance by almost 50%.
Tomb Raider: Angel of Darkness
Tomb Raider: Angel of Darkness gets its own special little intro here because its publisher, EIDOS Interactive, has released a statement claiming that the V49 patch, which includes a performance benchmark, was never intended for public release. Too late; the patch is already public. We’ve used the extreme quality settings from Beyond3D to give the GeForce FX 5700 Ultra a thorough workout in this DirectX 9 game.
The GeForce FX 5700 Ultra does a lot better in Tomb Raider than the 5600 Ultra, but the Radeons still have a healthy lead. What’s worse, both GeForce FX cards refuse to run with 4X antialiasing and 8X aniso at resolutions above 1024×768.
It’s important to note that Tomb Raider’s benchmark mode uses the default DirectX 9 code path rather than the NVIDIA-optimized path that’s used when the game is played normally. The scores you see above aren’t necessarily a reflection of Tomb Raider gameplay performance, but they do show how poorly the GeForce FX 5700 Ultra can perform when it’s forced to deal with an unoptimized DirectX 9 code path. Still, the 5700 Ultra’s performance is much improved over the 5600 Ultra’s, and NVIDIA’s latest drivers have significantly improved the performance of both cards.
AquaMark3
In AquaMark3, the GeForce FX 5700 Ultra performs well until antialiasing and aniso are enabled. With 4X antialiasing and 8X aniso, the card doesn’t perform particularly poorly, but it’s well behind the Radeon 9600 XT and even a hair behind the 9600 Pro.
Halo
I used the “-use20” switch with the Halo benchmark to force the game to use version 2.0 pixel shaders.
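For reference, the invocation looks something like the line below. The “-use20” switch is the one discussed above; “-timedemo”, which kicks off Halo’s built-in benchmark run, is quoted from memory, so check the game’s readme before relying on it.

```
halo.exe -timedemo -use20
```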
The GeForce FX 5700 Ultra is much faster than its predecessor in Halo and even has enough horsepower to best the Radeon 9600 XT. Halo’s benchmark timedemo only runs through cut scenes, so the frame rates you see aren’t necessarily a reflection of gameplay performance. However, Halo’s cut scenes are all done using the game engine, so the benchmark mode is a valid tool for graphics performance comparisons.
Real-Time High-Dynamic Range Image-Based Lighting
To test the GeForce FX 5700 Ultra’s performance with high-dynamic-range lighting, we logged frame rates via FRAPS in this technology demo at its default settings. The demo uses high-precision texture formats and version 2.0 pixel shaders to produce high-dynamic-range lighting, depth of field, motion blur, and glare, among other effects.
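Since FRAPS simply dumps frame rate samples to a log file, reducing a run to a single number is trivial. Here’s a minimal sketch, assuming a plain one-sample-per-line log; the exact file name and layout vary by FRAPS version, so treat this as illustrative.

```python
# Average a FRAPS frame rate log; assumes one fps sample per line.
def average_fps(path):
    with open(path) as log:
        samples = [float(line) for line in log if line.strip()]
    return sum(samples) / len(samples)

# Hypothetical log name from an rthdribl run at 1024x768.
print(f"{average_fps('rthdribl_1024x768_fps.txt'):.1f} fps average")
```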
The GeForce FX 5700 Ultra manages to double the 5600 Ultra’s performance in the rthdribl demo, but ATI’s Radeons still have a pretty significant lead, especially at lower resolutions. Unfortunately, the images generated by both GeForce FX cards in this demo have a few issues. Either 16-bit floating point precision really isn’t enough for high-dynamic-range lighting, or NVIDIA’s drivers need to expose the GeForce FX’s support for floating point texture formats in order for rthdribl to work properly.
Edge antialiasing
In Unreal Tournament 2003, the GeForce FX 5700 Ultra is really only playable with 4X antialiasing. There’s some weirdness going on with the card’s 2X antialiasing performance, which is dreadful on both the FX 5700 and 5600 Ultras. Unfortunately, NVIDIA’s antialiasing doesn’t look quite as good as ATI’s gamma-corrected SMOOTHVISION, either.
Texture antialiasing
In Serious Sam SE, the GeForce FX 5700 Ultra suffers roughly the same performance hit with each level of anisotropic filtering as its competition.
Overclocking
In testing, I was able to get my GeForce FX 5700 Ultra stable at core and memory clock speeds of 560 and 950MHz, respectively. Considering that the card’s cooler isn’t noticeably louder than what one may find on a GeForce FX 5600 Ultra or Radeon 9600 XT, I’m quite happy with an 85MHz core overclock.
Of course, just because I was able to get my sample stable and artifact-free at 560/950 doesn’t mean that every GeForce FX 5700 Ultra will be capable of those speeds. Then again, some cards may have the potential to hit even higher clock speeds. Overclocking is never guaranteed and can potentially damage hardware, so be careful.
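Here’s roughly what that overclock is worth in theoretical terms, using the same arithmetic as before:

```python
# Gains from our 560/950 overclock (stock: 475MHz core, 906MHz effective memory).
core_gain = (560 - 475) / 475   # ~17.9% more core clock, and thus fill rate
mem_gain = (950 - 906) / 906    # ~4.9% more memory bandwidth

print(f"core:   +{core_gain:.1%} -> {560 * 4} Mpixels/s peak fill rate")
print(f"memory: +{mem_gain:.1%} -> {950e6 * 16 / 1e9:.1f} GB/s peak bandwidth")
```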
Overclocking the GeForce FX 5700 Ultra gives the card a nice little performance boost, but remember that the Radeon 9600 XT appears to be quite comfortable with overclocking, too.
Conclusions
GeForce FX 5700 Ultra cards will be available in retail stores this Sunday and are expected to sell for around $199. This isn’t even close to a paper launch, folks. The card’s $200 price point puts it in direct competition with the Radeon 9600 XT, but 9600 XTs seem to be a little scarce at the moment.
Overall, the 5700 Ultra’s performance in DirectX 8-class games is impressive, and the card even handles DirectX 9 titles like Halo with aplomb. However, the 5700 Ultra’s performance with unoptimized code paths and synthetic pixel shader tests is a little discouraging looking forward. With more compiler tuning and driver tweaking, NVIDIA may improve the GeForce FX 5700 Ultra’s performance further, but the potential for future performance gains is hard to quantify and impossible to guarantee. At the very least, NVIDIA’s willingness to help developers optimize code paths should ensure that games and applications milk every last drop of performance from the GeForce FX architecture.
Since the GeForce FX 5700 Ultra beat out the Radeon 9600 XT in nearly every real-world game we tested today, it’s tempting to reinstate NVIDIA as the king of mid-range graphics. However, the Radeon 9600 XT is faster in Unreal Tournament 2003, which is a particularly important title considering how many future games will use the Unreal engine. It’s also worth noting that ATI’s mid-range Radeons offer superior gamma-corrected antialiasing and a coupon for Half-Life 2, both important considerations for enthusiasts and gamers alike.
So I’m not quite ready to crown NVIDIA as the new mid-range graphics king, but I still feel pretty good about recommending the GeForce FX 5700 Ultra over the Radeon 9600 XT for gamers who have $200 to spend and need a graphics card by the end of the weekend. The Radeon 9600 XT’s performance in synthetic DirectX 9-class benchmarks makes it feel like a safer bet, but NVIDIA’s new run-time compiler has me a little more excited about the GeForce FX’s performance potential in future titles. The fact that the GeForce FX 5700 Ultra rips through today’s games and uses some pretty swanky hardware for a $200 graphics card doesn’t hurt, either.
Of course, if you don’t need to pick up a new graphics card this weekend, I’d hold off just a little while longer. We’ll be comparing the rendering output and image quality of ATI and NVIDIA’s newest graphics chips with the latest drivers in an upcoming article; the benchmarks we saw today are just the beginning.