
NVIDIA’s GeForce FX 5200 Ultra GPU

WHEN NVIDIA first introduced the GeForce FX 5200, its performance was horrid. Sure the chip supported all sorts of DirectX 9 goodies, but feature compatibility with DirectX 9 doesn’t guarantee competitive or even competent performance in DirectX 9-class applications or older games. To cure the GeForce FX line’s performance woes, NVIDIA released the Detonator FX 44.03 drivers. Cheats and optimizations for 3DMark03 in the Detonator FX drivers created some controversy. However, the drivers did dramatically improve performance in many applications without degrading image quality or otherwise breaking any rules. And there was much rejoicing.

Thus, I had a renewed interest in NVIDIA’s GeForce FX 5200 Ultra. The Ultra bolsters the GeForce FX 5200’s core clock speed by 75MHz and its effective memory clock by 250MHz, which makes it look more competitive, on paper, with NVIDIA’s mid-range GeForce FX 5600 than with its sub-$70 sibling.

Today we’re looking at Inno3D’s Tornado GeForce FX 5200 Ultra. Does it deliver on the GeForce FX 5200 Ultra’s potential? Is it fast enough to challenge the GeForce FX 5600 or the competition from ATI? The answers might surprise you.

All you need to know about NV34
The GeForce FX 5200 Ultra uses NVIDIA’s NV34 graphics chip, which I described in my GeForce FX 5200 review back in April. NV34 occupies the low end of NVIDIA’s GeForce FX line, and it offers DirectX 9-class pixel and vertex shader programs. Like NV31, which is used in NVIDIA’s GeForce FX 5600 series, NV34 has four pixel pipelines, each capable of laying down a single texture per clock. However, NV34 differs from the rest of the NV3x line in a couple of key ways:

  • Manufacturing process – NVIDIA’s other NV3X graphics chips are built on a newer 0.13-micron process, but NV34 is built using older, more established 0.15-micron technology. The 0.15-micron process sets NV34’s core clock speed ceiling lower than a 0.13-micron chip could reach, and it consumes more power, but the mature process makes sense for relatively budget parts like the GeForce FX 5200 and 5200 Ultra.

  • No color or Z compression – Unlike the rest of the NV3X line, NV34 can’t do color or Z compression. The lack of color compression should hamper the chip’s performance primarily with antialiasing enabled, but the lack of Z compression will hurt across the board. Without advanced lossless compression schemes, NV34 doesn’t make as efficient use of the bandwidth it has available, which reduces the chip’s overall effective fill rate (or pixel-pushing power).

Those are the basics. You can find a more comprehensive analysis of the chip in my GeForce FX 5200 review and preview of NVIDIA’s NV31 and NV34 graphics chips.

Inno3D’s Tornado GeForce FX 5200 Ultra
For gamers on a budget, the GeForce FX 5200 Ultra should be a pretty appealing option. Like most other GeForce FX 5200 Ultra cards, Inno3D’s rendition appears to follow NVIDIA’s reference design, so the Tornado will have to rely on bundled extras and low prices in order to compete.

The card’s 128MB of memory is mounted on both sides of the board without any extra cooling. The Tornado GeForce FX 5200 Ultra uses Hynix BGA chips rated up to 350MHz (700MHz DDR).

Memory heat sinks have been used on a number of graphics cards from various manufacturers in the past few years, but quite honestly I’ve yet to see a graphics card whose memory cooling solution made a huge difference even in overclocking performance. Rear-mounted memory heat sinks can also create clearance problems with some motherboards, so Inno3D isn’t missing out by leaving the Tornado GeForce FX 5200 Ultra’s memory chips bare.

One factor that could aid in any overclocking endeavors is the Tornado’s auxiliary power connector, which helps feed the board with juice. The card uses a standard 4-pin MOLEX connector. Inno3D doesn’t, however, include a MOLEX splitter cable with the Tornado. Some users, especially owners of small-form-factor systems, may have trouble finding a spare power connector.

Like many other manufacturers, Inno3D is essentially re-badging NVIDIA’s reference cooler. Although the non-Ultra GeForce FX 5200 can get away with only passive cooling, the Ultra’s higher clock speeds necessitate active cooling. Fortunately, the Tornado’s fan is no louder than an average graphics card fan; processor or power supply fan noise easily drowns it out.

Like just about everyone else, Inno3D equips the Tornado with VGA, DVI, and S-Video outputs. A DVI-to-VGA adapter is also included in the box for those who want to take advantage of NVIDIA’s nView software with dual CRT monitors. For those looking to hook up the Tornado to a TV or home theater, Inno3D throws in an S-Video-to-composite video cable, but curiously no standalone S-Video cable. Included in the box but completely useless with the card is a VIVO adapter cable; the Tornado GeForce FX 5200 Ultra doesn’t support video input at all.

Inno3D includes a number of different software titles and games with the Tornado. In addition to the requisite driver CD and a disc full of game demos, there are also copies of 3DMark03 (standard version), Comanche 4, WinDVD 4, and WinDVD Creator. Honestly, I’d much rather see brand-new graphics cards bundled with recent games than with older titles. Graphics card manufacturers seem to have no problem getting the latest versions of software like WinDVD into the box, but I’ve yet to see anyone bundle a game that’s not at least a year old.

Now, don’t get me wrong—I’m a big fan of freebies in game bundles. However, I wonder if leaving out a copy of an older game like Comanche 4 might allow Inno3D to drop the price of the Tornado GeForce FX 5200 Ultra a little, or perhaps even to bundle in something more useful, like an S-Video cable.

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.

Our test system was configured like so:

System
Processor Athlon XP ‘Thoroughbred’ 2600+ 2.083GHz
Front-side bus 333MHz (166MHz DDR)
Motherboard Asus A7N8X
Chipset NVIDIA nForce2
North bridge nForce2 SPP
South bridge nForce2 MCP
Chipset drivers NVIDIA 2.03
Memory size 512MB (2 DIMMs)
Memory type Corsair XMS3200 PC2700 DDR SDRAM (333MHz)
Sound nForce2 APU
Graphics card GeForce FX 5200 Ultra 128MB
GeForce FX 5200 128MB
GeForce FX 5600 256MB
Radeon 9000 Pro 64MB
Radeon 9600 Pro 128MB
Graphics drivers NVIDIA Detonator FX 44.03 (GeForce cards)
ATI CATALYST 3.2 (Radeon cards)
Storage Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive
OS Microsoft Windows XP Professional
OS updates Service Pack 1, DirectX 9.0

Today I’ll be comparing the Tornado GeForce FX 5200 Ultra’s performance with the vanilla GeForce FX 5200, the GeForce FX 5600, and a couple of Radeons from ATI. Since the performance of Inno3D’s Tornado should accurately reflect the performance of all GeForce FX 5200 Ultra cards that run at NVIDIA’s prescribed core and memory clock speeds, I’ll be using Inno3D’s Tornado as a reference point for GeForce FX 5200 Ultras as a whole.

Recently, the issue of driver-based cheating and optimizing for specific tests has called into question the legitimacy of many of the test applications we use regularly here at TR. As always, we’ve done our best to ensure that we’re delivering clean, reliable benchmark numbers, and we’ve taken steps to circumvent known driver cheats in today’s testing. We’re also working on new graphics testing methods intended to sidestep common cheats; those methods will debut in a future article. Stay tuned.

The test system’s Windows desktop was set at 1024×768 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Fill rate
Raw fill rate isn’t the be-all and end-all of graphics performance, but it’s a good place to get things started. How do the GeForce FX 5200 Ultra’s paper specs stack up against the competition we’ve assembled today?

                                  GeForce FX  GeForce FX  Radeon    Radeon    GeForce FX
                                  5200        5600        9000 Pro  9600 Pro  5200 Ultra
Core clock (MHz)                  250         325         275       400       325
Pixel pipelines                   4           4           4         4         4
Peak fill rate (Mpixels/s)        1000        1300        1100      1600      1300
Texture units per pixel pipeline  1           1           1         1         1
Peak fill rate (Mtexels/s)        1000        1300        1100      1600      1300
Memory clock (MHz)                400         500         550       600       650
Memory bus width (bits)           128         128         128       128       128
Peak memory bandwidth (GB/s)      6.4         8.0         8.8       9.6       10.4
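For reference, here’s a quick sketch of how the derived columns in that table fall out of the raw specs. These are just the standard theoretical-peak formulas, not anything specific to NVIDIA’s or ATI’s hardware.

```python
# Theoretical peaks from raw specs; memory clocks here are effective (DDR) rates.

def peak_pixels(core_mhz, pixel_pipes):
    """Peak pixel fill rate in Mpixels/s: one pixel per pipeline per clock."""
    return core_mhz * pixel_pipes

def peak_texels(core_mhz, pixel_pipes, tex_units_per_pipe):
    """Peak texel fill rate in Mtexels/s."""
    return core_mhz * pixel_pipes * tex_units_per_pipe

def peak_bandwidth_gbps(mem_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s."""
    return mem_mhz * (bus_width_bits // 8) / 1000

# GeForce FX 5200 Ultra: 325MHz core, 4x1 pipeline config, 650MHz memory, 128-bit bus
print(peak_pixels(325, 4))                # 1300 Mpixels/s
print(peak_texels(325, 4, 1))             # 1300 Mtexels/s
print(peak_bandwidth_gbps(650, 128))      # 10.4 GB/s
# The Ultra's bandwidth advantage over the GeForce FX 5600 (500MHz memory):
print(round(peak_bandwidth_gbps(650, 128) - peak_bandwidth_gbps(500, 128), 1))  # 2.4 GB/s
```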

Because both the GeForce FX 5200 Ultra and GeForce FX 5600 share the same core clock speeds and basic 4-by-1 pipeline configuration, they have identical peak theoretical fill rates. However, NV31 and NV34 are different graphics chips; don’t expect the performance of these two cards to be identical because of these specs alone.

The GeForce FX 5200 Ultra may have the same single- and multi-texturing fill rate potential as NVIDIA’s GeForce FX 5600, but the Ultra’s packing 2.4GB/sec of extra memory bandwidth that should help it deliver on that theoretical fill rate promise. In fact, the GeForce FX 5200 Ultra has even more memory bandwidth available than ATI’s mid-range Radeon 9600 Pro, though the latter’s use of lossless color and Z-compression should enable it to make more efficient use of its bandwidth.

Is the GeForce FX 5200 Ultra’s raw real world fill rate as impressive as its theoretical peaks?

Sort of. It’s certainly delivering more single-texturing fill rate than the GeForce FX 5600, which is odd considering that NV31 and NV34 are both 4×1-pipe designs running at 325MHz. I’m inclined to blame the GeForce FX 5600’s lesser memory bandwidth for its poor performance, but the card’s multi-texturing fill rate is actually superior to the GeForce FX 5200 Ultra’s.

How do these real-world fill rates compare to those theoretical peaks we just looked at?

Not well. The big story here is how inefficient the GeForce FX line is with its theoretically available resources. None of the graphics cards we’re looking at today realizes anywhere near all of its single-texturing fill rate potential, but at least the Radeon cards are making efficient use of resources when it comes to multi-texturing. Unfortunately, the GeForce FX 5200 Ultra isn’t even realizing 75% of its multi-texturing fill rate potential in this synthetic test; that doesn’t bode well for the card’s performance.
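To put that inefficiency in perspective, here’s a rough back-of-the-envelope sketch of the framebuffer traffic NV34 would generate trying to sustain its peak fill rate without compression. The per-pixel costs are my own simplifying assumptions (32-bit color and 32-bit Z, one Z read, one Z write, and one color write per pixel, with texture fetches and caching ignored), not figures from NVIDIA.

```python
# Rough estimate of uncompressed framebuffer traffic at the GeForce FX 5200 Ultra's
# peak fill rate. Assumes 32-bit color and 32-bit Z with one Z read, one Z write,
# and one color write per pixel; texture fetches and caching are ignored entirely.

bytes_per_pixel = 4 + 4 + 4                    # Z read + Z write + color write
peak_fill_mpixels = 325 * 4                    # 1300 Mpixels/s (325MHz core, 4 pipelines)
available_gbps = 650 * 128 / 8 / 1000          # 10.4 GB/s (650MHz memory, 128-bit bus)

needed_gbps = peak_fill_mpixels * bytes_per_pixel / 1000
print(needed_gbps, available_gbps)             # ~15.6 GB/s needed vs. 10.4 GB/s available
```

Even under these simplified assumptions, the raw framebuffer traffic alone outstrips the card’s memory bandwidth, which is why lossless color and Z compression matter and why NV34 leaves so much of its fill rate potential unrealized.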

Occlusion detection
NVIDIA hasn’t been terribly forthcoming regarding what (if any) advanced occlusion detection algorithms NV34 uses to reduce overdraw. At the very least, the chip’s lack of lossless color and Z compression will be a bit of a handicap in any fill rate-bound scenario.

The GeForce FX 5200 Ultra essentially ties the GeForce FX 5600 without any anisotropic filtering or antialiasing enabled, but the Ultra falls slightly behind with 8X anisotropic filtering and 4X antialiasing.

Pixel shaders
Since NVIDIA isn’t revealing any concrete details about the number or relative strength of NV34’s pixel shaders, trying to predict the GeForce FX 5200 Ultra’s performance is difficult. Instead of speculating about “levels of parallelism” within the GeForce FX programmable shader, let’s check out some actual test results.

The GeForce FX 5200 Ultra’s clock speed advantage over the vanilla GeForce FX 5200 yields better performance in 3DMark2001 SE’s pixel shader tests, but all the GeForce FX cards are still well behind the curve in the advanced pixel shader test.

The GeForce FX 5200 Ultra cozies up with the GeForce FX 5600 in NVIDIA’s ChameleonMark benchmark, but both cards are dominated by the Radeon 9600 Pro.

In 3DMark03’s pixel shader 2.0 test, the GeForce FX 5200 Ultra is still nipping at the GeForce FX 5600’s heels, but all the GeForce FX cards are embarrassed by ATI’s Radeon 9600 Pro. Since it’s only a DirectX 8 part, the Radeon 9000 Pro can’t complete this test.

Vertex shaders

FutureMark’s vertex shader tests show the GeForce FX 5200 Ultra right behind the GeForce FX 5600 again. In fact, the GeForce FX 5200 Ultra actually pulls out ahead of its mid-range sibling at higher resolutions in 3DMark2001 SE’s vertex shader test. Even then, the GeForce FX 5200 Ultra is way behind the Radeon 9000 Pro.

In 3DMark2001 SE’s transform and lighting tests, which run as vertex shader programs on the cards we’re looking at today, the GeForce FX 5200 Ultra trails the GeForce FX 5600 a little more than it did in more targeted vertex shader tests. At least the GeForce FX 5200 Ultra is able to pull out ahead of the Radeon 9000 Pro this time around, especially in the one-light, high-polygon-count test.

Games
Synthetic feature tests are great and all, but how does the GeForce FX 5200 Ultra perform in real games?

Quake III Arena

In Quake III Arena, the GeForce FX 5200 Ultra is actually faster than the GeForce FX 5600 until we turn on anisotropic filtering and antialiasing. Without color and Z-compression, the GeForce FX 5200 Ultra looks like it’s getting a little bandwidth-deprived when anisotropic filtering and antialiasing are turned up. However, when those extra image quality features are disabled, the card’s better real world single-texturing fill rate may be enabling higher frame rates.

Jedi Knight II

In Jedi Knight II, the trend continues; the GeForce FX 5200 Ultra is faster than the GeForce FX 5600 without anisotropic filtering or antialiasing enabled, but slower with. The GeForce FX 5200 Ultra’s higher clock speeds give the card a nice performance boost over the vanilla GeForce FX 5200 and a clear advantage over the Radeon 9000 Pro.

Comanche 4

The trend continues in Comanche 4; it looks like those uninterested in anisotropic filtering or antialiasing may want to opt for a GeForce FX 5200 Ultra rather than a GeForce FX 5600.

Codecreatures Benchmark Pro

Codecreatures produces consistent results for the GeForce FX 5200 Ultra and GeForce FX 5600, which swap places depending on whether or not aniso and AA are enabled.

Unreal Tournament 2003

The GeForce FX 5200 Ultra isn’t quite able to pull out ahead of the GeForce FX 5600 in Unreal Tournament 2003 with aniso and antialiasing disabled, though the performance difference between the two cards is quite slight when high detail settings are used. To me, that’s almost counter-intuitive; I would have expected the less expensive GeForce FX 5200 Ultra to fare comparatively better at lower detail settings, not higher ones.

The GeForce FX cards are nicely wedged between the two Radeons in this test, illustrating the huge performance discrepancy between the different architectures used in ATI’s low-end and mid-range graphics cards.

Serious Sam SE
We used Serious Sam SE’s “Extreme Quality” image quality add-on, which maximizes the anisotropic filtering level of each graphics card, for our testing. In this test, the GeForce FX cards are doing 8X anisotropic filtering while the Radeon 9600 Pro is at 16X. The Radeon 9000 Pro is also doing 16X anisotropic filtering, but it’s falling back to bilinear rather than trilinear filtering.

Again, the GeForce FX 5200 Ultra is faster than the GeForce FX 5600 with extra image quality options enabled. How do things look over the length of the benchmark?

The GeForce FX 5200 Ultra suffers the same stuttering at the start of the benchmark demo as the vanilla GeForce FX 5200. Thanks to our lovely graphs, NVIDIA is aware of the issue and has pledged to fix it in the next Detonator FX driver release. Apparently, the problem occurs only with NV34, NV17, and NV18-based graphics products, which suggests the NV34 might have a GeForce2-class core with a programmable shader tacked on.

With 8X aniso and 4X AA, the GeForce FX 5200 Ultra’s performance drops, and it settles in further behind the GeForce FX 5600. How does the card’s performance across the benchmark demo look?

Consistent. When we turn on anisotropic filtering and antialiasing, the GeForce FX 5200 Ultra’s stuttering problem gets even worse, and the card falls behind the GeForce FX 5600 across the board.

Splinter Cell

In Splinter Cell, the GeForce FX 5200 Ultra is faster than the GeForce FX 5600 yet again.

Looking at it second by second, the GeForce FX 5200 Ultra manages to stay far enough ahead of the GeForce FX 5600 to achieve a higher average frame rate.

The GeForce FX 5200 Ultra takes a rare win over the GeForce FX 5600 with 8X anisotropic filtering and 4X antialiasing. What’s going on when we look at the length of the benchmark demo?

Even with anisotropic filtering and antialiasing enabled, the GeForce FX 5200 Ultra wins this round overall. Perhaps the NV34 shares more with the Xbox’s graphics chip than it does with NV31? Splinter Cell was, of course, originally developed for the Xbox and may be better tuned for that platform.

3DMark2001 SE

In 3DMark2001 SE, the GeForce FX 5200 Ultra and GeForce FX 5600 virtually tie. Let’s look at the individual game tests.

In all but the “Nature” test, the GeForce FX 5200 Ultra is right up there with the GeForce FX 5600, and the 5200 Ultra is faster than the Radeon 9000 Pro throughout.

3DMark03
Recently, NVIDIA was caught “optimizing” anisotropic filtering levels in 3DMark03. Those optimizations affect even the latest 330 build of 3DMark03, but since they only come into play when anisotropic filtering is used, they won’t affect the results of our testing.

In a freshly patched 3DMark03 build 330, the GeForce FX 5200 Ultra trails the GeForce FX 5600. What about the individual game tests?

The GeForce FX 5200 Ultra and GeForce FX 5600 essentially tie in the first and last game test, but the GeForce FX 5600 has a distinct advantage in the middle two. The relatively poor performance of the entire GeForce FX line in the Mother Nature test is especially disappointing.

3DMark03 image quality
Because frame rates aren’t everything, let’s take a quick peek at the GeForce FX 5200 Ultra’s image quality in 3DMark03. There’s more going on here than you might expect.

Below are cropped JPEGs of frame 1799 of 3DMark03’s “Mother Nature” test. Click on the images for a full-size PNG of each screen shot.


DirectX 9’s reference rasterizer


NVIDIA’s GeForce FX 5600


NVIDIA’s GeForce FX 5200 Ultra

There are subtle differences between the image produced by Microsoft’s DirectX 9 reference rasterizer and those produced by the GeForce FX cards, but the real story here is the difference in image quality between the GeForce FX 5600 and GeForce FX 5200 Ultra. Check out the differences in the water between the two pictures; something is definitely wrong with how the GeForce FX 5200 Ultra is rendering the scene.

SPECviewperf

In SPECviewperf, the GeForce FX 5200 Ultra nearly equals or bests the performance of the GeForce FX 5600 across the board. In all but the ugs test, where the Radeon 9600 Pro dominates, the GeForce FX 5200 Ultra is at or near the head of the class.

Antialiasing
We’ve been looking at the GeForce FX 5200 Ultra’s performance with and without 8X anisotropic filtering and 4X antialiasing throughout this review. Next, let’s break down these antialiasing modes and take a closer look at the GeForce FX 5200 Ultra’s performance across each.

Edge antialiasing

The GeForce FX 5200 Ultra’s antialiasing disadvantage is distinct in Unreal Tournament 2003. Across all the antialiasing modes it supports, the GeForce FX 5200 Ultra is slower than the GeForce FX 5600. The GeForce FX 5200 Ultra is, however, a much better performer with AA than the Radeon 9000 Pro or the vanilla GeForce FX 5200.

For image quality purposes, the different antialiasing modes offered by the GeForce FX 5200 Ultra are identical to those offered by the GeForce FX 5200. You can see how the GeForce FX 5200’s antialiasing image quality shots stack up here.

Texture antialiasing

When we specifically target anisotropic filtering performance, the GeForce FX 5200 Ultra actually pulls out ahead of the GeForce FX 5600, suggesting that the latter’s performance advantage with 8X anisotropic filtering and 4X antialiasing is primarily, if not exclusively, a result of the 5600’s superior antialiasing abilities.

Overclocking
Because some of us like to wring as much performance from our hardware as possible, I did a little overclocking with the Tornado GeForce FX 5200 Ultra. In testing, I was able to run the card at a stable core clock speed of 390MHz with an artifact-free memory clock of 703MHz. Remember that overclocking is never guaranteed; just because my sample card was able to achieve a 390/703 core/memory clock speed doesn’t mean that every Tornado GeForce FX 5200 Ultra will hit those speeds. I would, however, suspect that most cards will be able to hold a 700MHz memory clock, since the memory chips are rated for operation at that speed.
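For a sense of scale, here’s that overclock expressed as a percentage of the card’s stock 325MHz core and 650MHz (effective) memory clocks; as noted above, these figures apply only to our particular sample.

```python
# Overclocking headroom on our sample Tornado GeForce FX 5200 Ultra
stock_core, stock_mem = 325, 650   # MHz (memory is the effective DDR rate)
oc_core, oc_mem = 390, 703         # MHz, the highest stable, artifact-free clocks we reached

print(f"Core:   +{(oc_core / stock_core - 1) * 100:.0f}%")    # +20%
print(f"Memory: +{(oc_mem / stock_mem - 1) * 100:.0f}%")      # +8%
```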

Unfortunately, although overclocking the Tornado GeForce FX 5200 Ultra was easy, the higher core and memory clock speeds don’t produce significantly better performance in Unreal Tournament 2003. Even with 8X anisotropic filtering and 4X antialiasing, our overclocking efforts only buy a couple of extra frames per second.

Conclusions
Inno3D’s Tornado GeForce FX 5200 Ultra isn’t yet available online, but GeForce FX 5200 Ultra cards from other manufacturers are going for as low as $132 on Pricewatch, a good $11 cheaper than the least expensive GeForce FX 5600 and $30 cheaper than ATI’s Radeon 9600 Pro. If Inno3D is to be competitive with other GeForce FX 5200 Ultra manufacturers, its card will have to come in at around $130.

Honestly, though, I’m a little confused as to where the GeForce FX 5200 Ultra is going to fit into the market. At $130, it’s really leaning towards the high end of the middle ground between sub-$70 vanilla GeForce FX 5200s and our reigning mid-range graphics champion, the Radeon 9600 Pro, at over $160. For anyone who’s serious about gaming, the Radeon 9600 Pro is definitely worth the extra scratch, especially since it should be better-equipped for next-generation applications.

For bargain hunters less concerned with gaming performance, the GeForce FX 5200 Ultra looks like it could be a pretty good deal, in part because it’s often a better performer, without antialiasing enabled, than the GeForce FX 5600. Antialiasing fans would be far better off with a GeForce FX 5600. And at least the GeForce FX 5600 can render 3DMark03’s “Mother Nature” water correctly; I have a sneaking suspicion that NVIDIA may never enable higher precision floating point pixel shaders in the budget GeForce FX 5200, if it even can.

However, it’s hard to ignore the price/performance ratio of the vanilla GeForce FX 5200, which costs almost half as much as the Ultra. Casual gamers will probably be satiated by the performance of GeForce FX 5200, and I just don’t see enough value behind the Ultra’s $65 higher price tag. The Tornado GeForce FX 5200 Ultra isn’t yet on the market, but if Inno3D expects it to compete with its own $67 (on Pricewatch) Tornado GeForce FX 5200, the Ultra’s price is going to have to drop.

In the end, the GeForce FX 5200 Ultra doesn’t have the image quality, antialiasing prowess, or overall performance to challenge the more expensive Radeon 9600 Pro for serious gamers. Nor does it have a low enough price to challenge the GeForce FX 5200 for casual gamers on a budget. It’s impressive to see the Ultra competitive with the GeForce FX 5600 in many instances, and DirectX 9 capabilities are nice to have. Still, I think consumers are going to be better off spending more on a Radeon 9600 Pro or much less on a vanilla GeForce FX 5200.