NVIDIA’s GeForce FX 5600 Ultra GPU

Manufacturer Chaintech
Model Apogee FX71 GeForce FX 5600 Ultra
Price (estimated) US$215
Availability Now

FOR THE BETTER part of a year, NVIDIA’s GeForce4 Ti 4200 owned the mid-range graphics scene. But that was last year, and NVIDIA’s graphics cards have fallen from grace since. The Ti 4200’s price/performance crown was eventually usurped by ATI’s Radeon 9500 Pro, which yielded to the Radeon 9600 Pro.

NVIDIA took a stab at reclaiming the mid-range graphics title with the NV31 chip and its corresponding GeForce FX 5600 line of graphics cards, but neither the vanilla nor the Ultra version of the 5600 was fast enough to knock ATI off the throne. So NVIDIA went back to the drawing board and came up with a new GeForce FX 5600 Ultra that uses more exotic chips running at higher clock speeds. The GeForce FX 5600 Ultra’s NV31 roots haven’t changed, but this new Ultra comes with a core clocked at 400MHz and memory running at an effective 800MHz.

And it doesn’t require a Dustbuster.

Can NVIDIA’s revised GeForce FX 5600 Ultra knock off ATI’s Radeon 9600 Pro? Does either fight dirty with questionable driver “optimizations”? We’ve wrangled Chaintech’s GeForce FX 5600 Ultra-powered Apogee FX71 to find out.

NV31 respun
The new GeForce FX 5600 Ultra’s NV31 graphics chip is virtually identical to what was found in the old Ultra, and also in the vanilla GeForce FX 5600. The chip is a four-by-one-pipe design that claims full DirectX 9 support, complete with version 2.0 pixel and vertex shaders. NVIDIA has been cagey about the internal shader structure of its entire GeForce FX line, so I can’t tell you how many vertex or pixel shaders the chip effectively has. NV31 apparently has “more parallelism” in its programmable shaders than NV34, which powers the budget GeForce FX 5200 line, and “less parallelism” than NV30 and NV35, which power the GeForce FX 5800 and 5900 lines, respectively.

I could go on about NV31’s capabilities, but instead, I’ll point you to my preview and subsequent review of the GeForce FX 5600. I’d rather talk about what makes this new Ultra different from its predecessors.

To meet the new GeForce FX 5600 Ultra’s 400MHz core clock speed demands, NVIDIA fabbed up a batch of NV31 graphics chips using the same flip-chip BGA packaging it uses for its high-end NV30 and NV35 graphics chips.

NV31 in its sexy, shiny new packaging

Apart from its new package and higher 400MHz core clock speed, NV31 hasn’t really changed.

The specs
Before I bust out some pictures of Chaintech’s Apogee FX71 GeForce FX 5600 Ultra, here’s a look at the card’s spec sheet:

GPU NVIDIA NV31
Core clock 400MHz
Pixel pipelines 4
Peak pixel fill rate 1600 Mpixels/s
Texture units/pixel pipeline 1
Textures per clock 4
Peak texel fill rate 1600 Mtexels/s
Memory clock 800MHz
Memory type BGA DDR SDRAM
Memory bus width 128-bit
Peak memory bandwidth 12.8GB/s
Ports VGA, DVI, composite and S-Video outputs
Composite, S-Video inputs
Auxiliary power connector Four-pin Molex

The Apogee FX71’s spec sheet is pretty much what you’d expect from a mid-range graphics card. Like much of its mid-range competition, the card sports 128MB of memory. Chaintech’s card does break from the pack by offering video-in/video-out (VIVO) support, though.

Apart from its VIVO support, nothing on the Apogee FX71’s spec sheet differentiates it from the slew of rebadged reference designs currently on the market. To see what really makes Chaintech’s GeForce FX 5600 Ultra unique, we have to take a look at the card itself.

This ain’t your momma’s reference card

 

Chaintech’s Apogee FX71 GeForce FX 5600 Ultra
On the surface, Chaintech has definitely outdone itself to produce a unique aesthetic for the Apogee FX71. I don’t know why, but the card’s faux-gold finish has me picturing it in some sort of chromed-out, low-rider case mod. The Apogee FX71 would look pretty sweet with a set of shiny wire wheels, don’t you think?

The Apogee FX71 has a heat spreader on the back of the card to help cool its rear-mounted memory chips. Since the rear heat spreader is only a thin sheet of metal, it shouldn’t create any clearance problems with motherboard DIMM tabs or larger north bridge heat sinks.

Unfortunately, the card does create some PCI slot clearance problems; its cooling enclosure can partially block the first PCI slot on some motherboards. The Apogee FX71’s cooler isn’t as wide as what’s found on the GeForce FX 5800 or 5900 Ultra, but it’s just wide enough to be a nuisance.

Combined with its longish frame, the Apogee FX71’s wider cooler could create problems for small form factor systems, but the card’s cooling system relies on that bulky shroud.

Popping open the Apogee FX71’s plastic-and-metal casing reveals a sea of heat sink fins cooled by a single “gas turbine” fan. The fan is situated such that the shroud is needed to properly direct air flow over the GPU heat sink. A far cry from the GeForce FX 5800 Ultra’s Dustbuster, the Apogee FX71’s cooler isn’t noticeably louder than what can be found cooling other GeForce FX 5600 cards or ATI’s Radeon 9600 Pro.

On the front of the card, memory chips are cooled by much larger memory heat sinks than their rear-mounted counterparts. I’ve seen this done on a number of graphics cards, and I have to wonder if front-mounted chips really do get hotter than those mounted on the opposite side of the card. Of course, the size and shape of memory heat sinks probably has more to do with looking cool than with being cool.

Gently prying off the Apogee FX71’s heat sink reveals just the right amount of thermal compound smudged over the graphics chip. Because it allows the heat sink to be easily removed and replaced without damaging the contact area, I prefer seeing graphics card manufacturers use a thin layer of thermal compound over a one-time TIM pad that’s a pain to scrape off and replace.

The Apogee FX71 uses Hynix HY5DU283222AF-25 memory chips rated for operation at 400MHz. Somewhat surprisingly, the board only features 128MB of memory. Mid-range and even budget graphics cards with 256MB of memory seem to be all the rage these days, despite the fact that the extra memory seems to have little impact on performance at all but the highest resolutions with antialiasing and anisotropic filtering cranked all the way up.

Chaintech uses Philips’ SAA7114H video decoder to power the input side of the card’s VIVO capabilities. The Apogee FX71 doesn’t have a TV tuner chip, though.

Like the rest of the GeForce FX Ultra family, the Apogee FX71 has an auxiliary power connector. With a core clock speed of 400MHz and memory running at an effective 800MHz, it’s no surprise that the Apogee FX71 needs a little extra juice.

Single DVI and VGA outputs grace the Apogee FX71’s port cluster. Sadly, dual DVI hasn’t taken off for high-end graphics cards yet, let alone for mid-range offerings like the Apogee FX71. The card’s video input and output ports are handled by a VIVO dongle, which is pictured below.

 

The extras
The bountiful bundles of Chaintech’s Zenith motherboard line have given the company quite a reputation when it comes to extras, and the Apogee FX71 doesn’t disappoint.

The card comes with a full complement of cables and a DVI-to-VGA adapter, just like it should.

For its next trick, Chaintech includes a fan brush and monitor cleaning ball with the Apogee FX71. At first, I actually thought these extras were a little silly, but I’ve been using them too frequently over the past few weeks to joke about them now. The monitor cleaning ball does a great job of keeping dust off the many screens that line my benchmarking sweatshop, and the fan brush has proven quite effective for dusting heat sink fins and fans on graphics cards, motherboard north bridge chips, and CPUs. Props to Chaintech for taking a chance on a couple of odd, but ultimately useful, extras.

Technically, the Apogee FX71’s blue LEDs aren’t bundled extras, but this seemed like the right place to talk about them. The card gives off a brilliant glow that will no doubt make case modding enthusiasts drool. Those without case windows won’t notice that the LEDs are there, which is really a shame. As much as I hate to admit to geeking out over a couple of LEDs, the Apogee FX71’s glow has grown on me. I’m not quite ready to rig up all my systems with cold cathodes, glow-in-the-dark accessories, and case windows, but I have to admit the Apogee FX71 has me thinking about it.

Rounding out the Apogee FX71’s suite of extra goodies is a modest software bundle that includes InterVideo’s WinCinema and WinProducer software, and a full version of MDK2. The bundle includes a CD of dated game demos, too, but nothing recent enough to raise any eyebrows. Honestly, I can’t understand why new graphics cards like the Apogee FX71 continue to come with games that are a couple of years old. MSI has started bundling much newer games with its GeForce FX cards, including Battlefield 1942, Command & Conquer Generals, and Unreal II, which makes the Apogee FX71’s dated game bundle look all the more lacking.

 

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.

Our test system was configured like so:

  System
Processor Athlon XP ‘Thoroughbred’ 2600+ 2.083GHz
Front-side bus 333MHz (166MHz DDR)
Motherboard DFI LANParty NFII Ultra
Chipset NVIDIA nForce2 Ultra 400
North bridge nForce2 Ultra 400 SPP
South bridge nForce2 MCP-T
Chipset drivers NVIDIA 2.45
Memory size 512MB (2 DIMMs)
Memory type Corsair XMS3200 PC2700 DDR SDRAM (333MHz)
Graphics card GeForce FX 5600 Ultra 128MB
GeForce FX 5600 256MB
Radeon 9600 Pro 128MB
Graphics drivers Detonator FX 45.23 (NVIDIA) / CATALYST 3.7 (ATI)
Storage Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive
OS Microsoft Windows XP Professional
OS updates Service Pack 1, DirectX 9.0b

I’ve rounded up ATI’s mid-range Radeon 9600 Pro and NVIDIA’s non-Ultra GeForce FX 5600 for comparison today. With 256MB of memory, the GeForce FX 5600 I’m using has a bit of an advantage over the other cards. However, 256MB of memory is really only needed with very high antialiasing levels and at high resolutions; under those conditions, the GeForce FX 5600 GPU will become a bottleneck before the card has a chance to use its extra memory.

Those who follow the graphics performance game closely will notice that I’m not using the “release 50” Detonator 51.75 drivers that have been making the rounds online. As it does with every major driver release, NVIDIA claims that the Detonator 50s will offer nothing short of a performance revolution, but we prefer to use graphics drivers that are available to the general public in our reviews. I actually have a press copy of the 51.75s from NVIDIA, but they’re quite clearly marked as beta drivers and may not be representative of what’s eventually available to consumers.

The test system’s Windows desktop was set at 1024×768 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Fill rate
In a rare act of disclosure, NVIDIA has confirmed that the GeForce FX 5600 Ultra’s NV31 graphics chip is a 4×1-pipe design. Given the chip’s clock speeds, pipeline count, and 128-bit memory bus, we can easily determine its theoretical peak single- and multi-texturing fill rates and its available memory bandwidth. Here’s how NVIDIA’s new Ultra compares with the competition.

  Core clock (MHz) Pixel pipelines  Peak fill rate (Mpixels/s) Texture units per pixel pipeline Peak fill rate (Mtexels/s) Memory clock (MHz) Memory bus width (bits) Peak memory bandwidth (GB/s)
GeForce FX 5600 325 4 1300 1 1300 500 128 8.0
Radeon 9600 Pro 400 4 1600 1 1600 600 128 9.6
GeForce FX 5600 Ultra 400 4 1600 1 1600 800 128 12.8

The new GeForce FX 5600 Ultra offers the same peak theoretical single- and multi-texturing fill rates as ATI’s Radeon 9600 Pro, and it has 3.2GB/s of extra memory bandwidth over ATI’s mid-range Radeon. Of course, these are just theoretical peaks. Real-world fill rate is what matters.
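Those peaks are easy to sanity-check yourself: pixels per clock times core clock for fill rate, and bus width in bytes times effective memory clock for bandwidth. Here’s a minimal Python sketch of the arithmetic; the inputs are the published specs, and the helper names are my own:

    # Back-of-the-envelope math behind the table above. The GB/s figure
    # uses the decimal convention (1 GB = 1000 MB) that vendors use.

    def peak_fill_mpixels(core_mhz, pipelines):
        # One pixel per pipeline per clock
        return core_mhz * pipelines

    def peak_fill_mtexels(core_mhz, pipelines, tex_units_per_pipe):
        # One texel per texture unit per clock
        return core_mhz * pipelines * tex_units_per_pipe

    def peak_bandwidth_gbs(effective_mem_mhz, bus_width_bits):
        # Bytes per transfer times millions of transfers per second
        return effective_mem_mhz * (bus_width_bits / 8) / 1000

    def realization_pct(measured, theoretical):
        # How much of a theoretical peak a benchmark actually achieves
        return 100.0 * measured / theoretical

    # GeForce FX 5600 Ultra: 400MHz core, 4x1 pipes, 800MHz effective
    # memory clock, 128-bit bus
    print(peak_fill_mpixels(400, 4))       # 1600 Mpixels/s
    print(peak_fill_mtexels(400, 4, 1))    # 1600 Mtexels/s
    print(peak_bandwidth_gbs(800, 128))    # 12.8 GB/s

The same functions reproduce the GeForce FX 5600 and Radeon 9600 Pro rows of the table, and realization_pct is the figure I’ll be referring to when comparing measured fill rates against these peaks.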

The GeForce FX 5600 Ultra offers superior single-texturing fill rates in 3DMark03, but its multi-texturing performance isn’t so hot. How do these real-world fill rates compare to the theoretical peaks we were just looking at?

None of the cards comes close to realizing its theoretical single-texturing fill rate, but the GeForce FX 5600 Ultra leads the pack there. However, the card can’t match the Radeon 9600 Pro, which realizes nearly 97% of its theoretical multi-texturing fill rate. Since the GeForce FX 5600 Ultra has all that extra memory bandwidth, the bottleneck probably isn’t the memory interface.

Occlusion detection
The GeForce FX 5600 Ultra uses lossless color and Z-compression to make the most of its available memory bandwidth. The card also employs occlusion detection algorithms to help reduce or eliminate overdraw.
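NVIDIA hasn’t detailed exactly how NV31’s occlusion logic works, but the idea behind overdraw reduction is straightforward: test each pixel’s depth before shading it, so fragments hidden behind already-drawn geometry never consume fill rate. Here’s a toy Python sketch of the principle; it illustrates the concept only, not NV31’s actual hardware:

    # Toy model of occlusion-based overdraw reduction: a depth test
    # rejects hidden pixels before any (expensive) shading happens.
    import math

    WIDTH, HEIGHT = 4, 4
    depth_buffer = [[math.inf] * WIDTH for _ in range(HEIGHT)]
    shaded = rejected = 0

    def draw_quad(z):
        # Draw a screen-filling quad at depth z (smaller z = closer)
        global shaded, rejected
        for y in range(HEIGHT):
            for x in range(WIDTH):
                if z < depth_buffer[y][x]:  # early depth test
                    depth_buffer[y][x] = z
                    shaded += 1             # pixel shading actually runs
                else:
                    rejected += 1           # overdraw avoided

    draw_quad(1.0)  # near quad drawn first...
    draw_quad(2.0)  # ...so the far quad is entirely occluded
    print(shaded, rejected)  # 16 shaded, 16 rejected

Rendering front-to-back, as in this sketch, is the best case; VillageMark is designed to stress exactly this kind of hidden-surface removal.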

In the overdraw-laden VillageMark test, the GeForce FX 5600 Ultra doesn’t quite have what it takes to best the Radeon 9600 Pro.

 

Pixel shaders
Rather than trying to untangle the mess of “levels of parallelism” within the NV31’s programmable shader unit to predict pixel shader performance, I’ll let the numbers do the talking.

Ouch. In NVIDIA’s own ChameleonMark pixel shader test, the GeForce FX 5600 Ultra gets spanked by the Radeon 9600 Pro. The fact that the GeForce FX 5600 Ultra is dominated by ATI in one of NVIDIA’s own benchmarks has to be embarrassing.

In 3DMark03’s pixel shader 2.0 test, the GeForce FX 5600 Ultra is again beaten by the Radeon 9600 Pro. The Ultra does offer a performance boost over the vanilla GeForce FX 5600, but neither is within striking distance of ATI’s mid-range Radeon.

Vertex shaders
Like its pixel shaders, the GeForce FX’s vertex shaders are all about “levels of parallelism” within the chip’s programmable shader.

And, like we saw in our pixel shader tests, the GeForce FX 5600 Ultra is smacked around by the Radeon 9600 Pro in 3DMark03’s vertex shader test. Perhaps the GeForce FX 5600 Ultra needs more of that parallelism NVIDIA keeps talking about.

Real-Time High-Dynamic Range Image-Based Lighting
To test the GeForce FX 5600 Ultra’s performance with high-dynamic-range lighting, we used FRAPS to log frame rates in this technology demo with its default settings. The demo uses high-precision texture formats and version 2.0 pixel shaders to produce high-dynamic-range lighting, depth of field, motion blur, and glare, among other effects.
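The basic recipe for high-dynamic-range rendering is to compute lighting in values that can exceed 1.0, then compress the result into the displayable 0-1 range with a tone mapping operator. The Reinhard operator below is a common textbook example of that final step, though I can’t say whether this particular demo uses it:

    # HDR in a nutshell: scene luminance may exceed 1.0, so it must be
    # "tone mapped" into the displayable range. Reinhard's operator is
    # one common choice (illustrative only; the demo's exact math is
    # not documented).

    def reinhard(luminance):
        # Maps [0, inf) smoothly into [0, 1); bright values saturate
        # gracefully instead of clipping to white
        return luminance / (1.0 + luminance)

    for lum in (0.5, 1.0, 4.0, 16.0):
        print(lum, round(reinhard(lum), 3))
    # 0.5 -> 0.333, 1.0 -> 0.5, 4.0 -> 0.8, 16.0 -> 0.941

The high-precision texture formats come in because intermediate values like these would be crushed by traditional 8-bit-per-channel render targets.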

The GeForce FX 5600 Ultra really struggles here; it’s well behind the Radeon 9600 Pro. Valve has already demoed high-dynamic-range lighting in Half-Life 2, making the GeForce FX 5600 Ultra’s poor performance in this test particularly discouraging.

 

Quake III Arena

Jedi Knight II

Wolfenstein: Enemy Territory

Results from our three Quake III engine-based games are mixed, but the GeForce FX 5600 Ultra generally performs well. Overall, the card is faster than the Radeon 9600 Pro in all but Quake III Arena with 4X antialiasing and 8X anisotropic filtering. The 5600 Ultra turns in particularly impressive performances in Jedi Knight II and Wolfenstein: Enemy Territory.

 

Unreal Tournament 2003

Comanche 4

Codecreatures Benchmark Pro

In our next batch of game tests, the GeForce FX 5600 Ultra doesn’t look quite as hot. The card only manages to best the Radeon 9600 Pro in Codecreatures with antialiasing and aniso disabled, but is at least close in most of the tests.

 

Gun Metal benchmark

X2 rolling demo

In the DirectX 9 Gun Metal benchmark, which NVIDIA pimps on its own web site, the GeForce FX 5600 fares well against ATI’s mid-range Radeon. However, X2’s DirectX 9 “rolling demo” shows the Radeon 9600 Pro out ahead again. What’s particularly interesting about the X2 demo is that its developers relied primarily on vertex shaders rather than pixel shaders to achieve the game’s special effects.

Tomb Raider: Angel of Darkness
Tomb Raider: Angel of Darkness gets its own special little intro here because its publisher, EIDOS Interactive, has released a statement claiming that the latest patch, which includes a performance benchmark, was never intended for public release. Too late; the patch is already public. We’ve used these extreme quality settings from Beyond3D to give the GeForce FX 5600 Ultra a thorough workout in this DirectX 9 game, and the results speak for themselves.

Ouch. The only thing more painful than the sting NVIDIA must feel from this kind of performance is the amount of time it takes the GeForce FX cards to run through this game’s “paris3” demo. Seriously, it delayed the completion of this review by a couple of days.

According to EIDOS, the patch we’re using is “not a basis for valid benchmarking comparisons,” so you can consider the above results invalid, if you’d like.

 

Serious Sam SE

In Serious Sam SE, the GeForce FX 5600 Ultra just edges the Radeon 9600 Pro with antialiasing and aniso disabled, and pulls a little further ahead with 4X antialiasing and 8X aniso.

 

Splinter Cell

In Splinter Cell, the GeForce FX 5600 Ultra and Radeon 9600 Pro are neck and neck. The GeForce FX 5600 Ultra squeaks ahead at higher resolutions, but it’s a close race.

 

3DMark03

In 3DMark03, which NVIDIA maintains should never be used for graphics card benchmarking, the GeForce FX 5600 Ultra fares surprisingly well against the Radeon 9600 Pro. How does the card do in the individual game tests?

Quite well, until we get to the pixel shader-packed Mother Nature test, where the Radeon 9600 Pro opens up a healthy lead.

 

3DMark03 image quality
Because frame rates aren’t everything, let’s take a quick peek at the GeForce FX 5600 Ultra’s image quality in 3DMark03.

Below are cropped JPEGs of frame 1799 of 3DMark03’s “Mother Nature” test. Click on the images for a full-size PNG of each screen shot.


DirectX 9’s reference rasterizer


NVIDIA’s GeForce FX 5600 Ultra


ATI’s Radeon 9600 Pro

At least in 3DMark03, the GeForce FX 5600 Ultra’s image quality is on par with the Radeon 9600 Pro. Unfortunately, it’s been suggested that NVIDIA drivers can detect screen capture attempts and render higher quality frames for output, so I have to take these screenshots with a grain of salt. When playing in real-time, Mother Nature looks great on the GeForce FX 5600 Ultra, but given recent events, I don’t feel completely safe trusting NVIDIA’s drivers.

 

SPECviewperf

In SPEC’s viewperf suite of workstation graphics tests, the GeForce FX 5600 Ultra is faster than the Radeon 9600 Pro in all but the Unigraphics (ugs) test.

 

Edge antialiasing

The GeForce FX 5600 Ultra’s antialiasing performance trails the Radeon 9600 Pro through most resolutions, but the 5600 Ultra has the 9600 Pro within its reach at 1600×1200. Let’s have a look at how well the GeForce FX 5600 Ultra’s antialiasing levels get rid of pesky jagged edges.

 

Edge antialiasing quality – GeForce FX 5600 Ultra


GeForce FX 5600 Ultra: No antialiasing


GeForce FX 5600 Ultra: 2X antialiasing


GeForce FX 5600 Ultra: 4X antialiasing


GeForce FX 5600 Ultra: 6XS antialiasing


GeForce FX 5600 Ultra: 8X antialiasing

With higher antialiasing levels, the GeForce FX 5600 Ultra looks progressively better, but we’ll have to check out the Radeon 9600 Pro’s antialiasing image quality before passing judgment.

 

Edge antialiasing quality – Radeon 9600 Pro


Radeon 9600 Pro: No antialiasing


Radeon 9600 Pro: 2X antialiasing


Radeon 9600 Pro: 4X antialiasing


Radeon 9600 Pro: 6X antialiasing

It only goes up to 6X, but the Radeon 9600 Pro’s gamma-corrected antialiasing looks better, level for level, than what’s available on the GeForce FX 5600 Ultra. In fact, the Radeon 9600 Pro at 6X looks better than the GeForce FX 5600 Ultra at 8X.
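The idea behind gamma-corrected antialiasing is that sample blending has to account for the display’s nonlinear response; averaging intensity values without compensating for gamma makes blended edge pixels look too dark. A rough Python sketch of the effect (the 2.2 gamma is my assumption for illustration; ATI hasn’t published its exact method):

    # Why gamma correction matters when blending edge samples: a 50%
    # blend of stored values does not look like 50% brightness on a
    # display with a nonlinear (gamma) response. Gamma of 2.2 is an
    # assumed display gamma for illustration.
    GAMMA = 2.2

    def naive_blend(samples):
        # Average the stored values directly, ignoring display gamma
        return sum(samples) / len(samples)

    def gamma_corrected_blend(samples):
        # Convert to linear light, average, then re-encode for display
        linear = [s ** GAMMA for s in samples]
        return (sum(linear) / len(linear)) ** (1.0 / GAMMA)

    # An edge pixel half-covered by white (1.0) over black (0.0):
    print(naive_blend([1.0, 0.0]))            # 0.5  -- displays too dark
    print(gamma_corrected_blend([1.0, 0.0]))  # ~0.73 -- perceived mid-gray

That perceptual difference is most visible on near-horizontal and near-vertical edges, which is exactly where the Radeon’s edge gradients look smoother in these shots.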

 

Texture antialiasing

The GeForce FX 5600 Ultra’s anisotropic filtering performance is much improved over the vanilla GeForce FX 5600; it’s quick enough to best the Radeon 9600 Pro in Serious Sam SE. Let’s take a look at some texture filtering screenshots in Quake III Arena and Unreal Tournament 2003 to see anisotropic filtering in action.

 

Texture antialiasing quality – GeForce FX 5600 Ultra


GeForce FX 5600 Ultra: Standard trilinear + bilinear filtering


GeForce FX 5600 Ultra: Trilinear + 8X anisotropic filtering


GeForce FX 5600 Ultra: Standard trilinear + bilinear filtering


GeForce FX 5600 Ultra: Trilinear + 8X anisotropic filtering

The GeForce FX 5600 Ultra’s 8X anisotropic filtering has a marked impact on image quality in both games. Notice how much sharper the textures look off in the distance.

 

Texture antialiasing quality – Radeon 9600 Pro


Radeon 9600 Pro: Standard trilinear + bilinear filtering


Radeon 9600 Pro: Trilinear + 8X anisotropic filtering


Radeon 9600 Pro: Standard trilinear + bilinear filtering


Radeon 9600 Pro: Trilinear + 8X anisotropic filtering

The GeForce FX 5600 Ultra looks just as good with 8X aniso as the Radeon 9600 Pro.

For kicks, let’s get all psychedelic and break out some multi-colored mip map levels.

 

Mip map levels – GeForce FX 5600 Ultra


GeForce FX 5600 Ultra: Standard trilinear + bilinear filtering


GeForce FX 5600 Ultra: Trilinear + 8X anisotropic filtering


GeForce FX 5600 Ultra: Standard trilinear + bilinear filtering


GeForce FX 5600 Ultra: Trilinear + 8X anisotropic filtering

The GeForce FX 5600 Ultra is correctly applying trilinear and anisotropic filtering in Quake III Arena, but it looks like the card is using a less aggressive form of trilinear filtering in Unreal Tournament 2003. The mip map transitions look much harsher in Unreal Tournament than in Quake III Arena, but the “light” version of trilinear filtering that NVIDIA is using actually does a pretty good job of hiding mip map boundaries during actual gameplay.

As good as NVIDIA’s lighter trilinear filtering routine looks in Unreal Tournament 2003, it’s an application-specific optimization of questionable integrity. The above shots were taken with trilinear filtering enabled in Unreal Tournament 2003’s graphics settings control panel using the Detonator FX driver’s “application” image quality settings. Under the “application” setting, the driver should let Unreal Tournament 2003 decide what kind of trilinear filtering to use. Instead, NVIDIA’s driver is stepping in to force a lower quality form of trilinear filtering.

Since Unreal Tournament 2003’s mip map transitions look fine with even the lighter trilinear filtering, I’m not going to get too worked up. However, I would like to see NVIDIA’s “application” image quality settings provide true trilinear filtering when it’s requested by an application.
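For reference, full trilinear filtering blends the two nearest mip levels across the entire transition between them, while a “light” variant narrows that blend band so most texels get cheaper bilinear lookups. A conceptual Python sketch follows; the 0.2 band width is an invented parameter, since NVIDIA hasn’t published what its driver actually does:

    # Blend weight between mip level N and N+1 as a function of the
    # fractional LOD. Full trilinear ramps across the whole transition;
    # "light" trilinear stays bilinear except near the boundary.
    # The band width of 0.2 is a made-up illustrative value.

    def trilinear_weight(lod_fraction):
        # Full trilinear: linear ramp across the entire transition
        return lod_fraction

    def light_trilinear_weight(lod_fraction, band=0.2):
        # Mostly bilinear (weight pinned at 0 or 1), with a short
        # ramp centered on the mip transition
        lo, hi = 0.5 - band / 2, 0.5 + band / 2
        if lod_fraction <= lo:
            return 0.0
        if lod_fraction >= hi:
            return 1.0
        return (lod_fraction - lo) / band

    for f in (0.1, 0.45, 0.5, 0.55, 0.9):
        print(f, trilinear_weight(f), round(light_trilinear_weight(f), 2))

The narrower the band, the more texels get plain bilinear lookups, which is where both the performance savings and the harsher transitions in the colored mip map shots come from.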

 

Mip map levels – Radeon 9600 Pro


Radeon 9600 Pro: Standard trilinear + bilinear filtering


Radeon 9600 Pro: Trilinear + 8X anisotropic filtering


Radeon 9600 Pro: Standard trilinear + bilinear filtering


Radeon 9600 Pro: Trilinear + 8X anisotropic filtering

The Radeon 9600 Pro is doing trilinear and anisotropic filtering properly in Quake III Arena, but it’s not doing any trilinear filtering in Unreal Tournament 2003 with 8X aniso.

Is this a driver bug? An optimization? A cheat? I can’t say for sure. I can, however, confirm that this issue occurs in more than just Unreal Tournament 2003. Serious Sam SE also shows a lack of trilinear filtering with 8X aniso when it’s rendered in Direct3D mode:


Radeon 9600 Pro: Standard trilinear + bilinear filtering


Radeon 9600 Pro: Trilinear + 8X anisotropic filtering

With OpenGL, trilinear and 8X anisotropic filtering work properly on the 9600 Pro in Serious Sam SE, but not with Direct3D. Since Unreal Tournament 2003 also uses Direct3D, this issue appears to be API-specific rather than application-specific, which means it could easily be a driver bug. However, it could also be a cleverly disguised attempt to inflate benchmark scores. Given the extra scrutiny reviewers have been applying to image quality of late, intentionally cheating here would seem an incredibly stupid move for ATI.

Keep in mind that we tested both Unreal Tournament 2003 and Serious Sam SE with anisotropic and trilinear filtering enabled. Our scores are indicative of performance with the current Catalyst 3.7 drivers, but may not reflect the Radeon 9600 Pro’s performance if and when this trilinear filtering issue is resolved.

 

Overclocking
In testing, I was able to get Chaintech’s GeForce FX 5600 Ultra card stable and artifact-free with core and memory clock speeds of 450 and 880MHz, respectively. All in all, that’s an impressive overclock, especially since the Apogee FX71’s cooling solution is no louder than a conventional cooler.

Of course, just because I was able to get my sample up to 450/880 doesn’t mean every Apogee FX71 can go that high. Some cards may not overclock at all, and some may be capable of higher clock speeds.

With bumped core and memory clock speeds, the GeForce FX 5600 Ultra fares better in Unreal Tournament 2003, but it still can’t catch the Radeon 9600 Pro.

 

Conclusions
This review really looked at two things: the GeForce FX 5600 Ultra GPU, and Chaintech’s Apogee FX71 graphics card. My conclusions regarding the GeForce FX 5600 Ultra GPU apply to all cards based on the chip, including the Apogee FX71, so I’ll deal with those first.

Overall, the GeForce FX 5600 Ultra exceeded my performance expectations in DirectX 8-class applications, but it’s pokey when pixel and vertex shaders are active, especially DirectX 9 pixel shaders. The 5600 Ultra’s performance in our high-dynamic-range lighting test isn’t encouraging, nor is its performance in Tomb Raider: AOD. EIDOS has stated that Tomb Raider isn’t a valid benchmark, but that’s bunk. The game is out and playable, and the patch for the game was publicly available; it’s not as if Tomb Raider is some sort of synthetic test with no real-world roots.

In DirectX 9 applications, the GeForce FX 5600 Ultra has a lot of ground to make up if it wants to run with the Radeon 9600 Pro. Too much ground, I think. There’s no doubt in my mind that NVIDIA’s next public driver release will improve performance in applications like Tomb Raider and Valve’s upcoming Half-Life 2, but there’s only so much a driver can do with what looks like a woefully underpowered shader unit.

So, while the GeForce FX 5600 Ultra turns in a good performance in the older games we’ve been testing for years, the outlook is bleak for next-generation titles. Unless NVIDIA can pull off a driver miracle with the next round of Detonators, the Radeon 9600 Pro is a better mid-range graphics card for tomorrow’s games and more than a capable performer with current titles.

For a moment, let’s ignore my apprehension regarding the GeForce FX 5600 Ultra GPU itself. Doing so makes the Apogee FX71 look like a pretty sweet product. The card’s flashy aesthetic may not jibe with everyone’s fashion sense, but its quiet cooling system, VIVO support, and surprisingly useful bundle make it a unique offering in a sea of boring reference cards. At $215 online, the Apogee FX71 is in the same price range as other VIVO-equipped GeForce FX 5600 Ultra-based graphics cards. However, Radeon 9600 Pros are available for around $140 online—a tough reality to ignore for anyone who isn’t an NVIDIA fanboy.

Finding a Radeon to compete with the Apogee FX71’s VIVO support is a little trickier, but it can be done. The All-in-Wonder 9600 Pro can be pre-ordered for about $260 and comes with a TV Tuner, RF Remote Wonder, and a slew of multimedia software in addition to a full suite of VIVO options. To me, that’s worth the extra $45.

In the end, my enthusiasm for Chaintech’s Apogee FX71 is curbed only by the card’s use of NVIDIA’s GeForce FX 5600 Ultra GPU. The card is no doubt a capable and competitive performer in many older games, but Radeon 9600 Pro-based graphics cards are cheaper, faster, and look like more promising performers for next-generation games. 

Comments closed
    • Anonymous
    • 17 years ago

    You’d think that nVidia would have done some research on ATI’s Radeon 9600 Pro card before *cough* “improving” their GeForce FX 5600 into the Ultra card.
    Guess not, judging by some of the test results. You’d think they’d done their homework. What the hell were they doing during that time? I guess they were busy playing games with the Radeon 9600 Pro card.

    • Anonymous
    • 17 years ago

    All of the cards rather suck on the higher-end tests. Low double-digit scores beating high single-digit frames-per-second scores? Obviously all of the cards are out of their league in such tests/applications.

    • llreye
    • 17 years ago

    Regarding VIVO:
    If the Apogee FX71 is 215$ online, that is a poor price to compete with ATI’s “VIVO”.
    mwave will have (no stock yet) the AIW 9600 with faster-than-9600-Pro memory for 228.45$.
    So not an extra 45$ but rather ~15$ (including the tax diff).

      • Dissonance
      • 17 years ago

      At press time, Pricewatch listed the lowest AIW 9600 Pre-order at $260.

        • llreye
        • 17 years ago

        Crazy what a few hours can make.

    • atidriverssuck
    • 17 years ago

    how often do you find something as ugly as that Geforce card *and* with no substance inside as well? Not often. Oh dear.

    • Pete
    • 17 years ago

    Good review, Diss. Thanks.

    Edit: I’m curious why the Wolf:ET scores are so disparate. Digit-Life shows the 9600 and 5600 series about even in RtCW; are the two games that different? Perhaps you used a demo that unusually favored nV over ATi? IME w/ET on my 9100, Oasis is the easiest on my card. I’d be curious how the cards fared outdoors on Radar or at the Allied spawn point at the Beach.

    Considering how much I play ET, those 9600P AA+AF scores are mighty disappointing, especially compared to the 5600U.

    • Anonymous
    • 17 years ago

    bollox bollox

    • lyc
    • 17 years ago

    it seems that nvidia may have swallowed too much of that 3dfx “dying tendency” when they picked up their engineers…

    seriously, nvidia is becoming just like 3dfx. their cards are overpriced and undercompetitive, and they’re just going insane with the card dimensions, cooling solution…

    well, i’m really happy with my radeon 9700 pro. btw, i’m interested to know how THAT card performs on the new games, which are only being tested with the 9800 pro!

    anyway, good job tr!

    • Anonymous
    • 17 years ago

    From page 23 of the review:
    “Of course, just because I was able to get my sample up to 450/880 doesn’t mean every Apogee FX71 can go that high. Some cards may not overclock at all, and some may be capable of higher clock speeds.”

    On Chaintech’s website they say the card can go up to 450 core/850 memory under their Apogee series Exclusive Guarantees. Not bad considering that they’d have to have full confidence ALL FX71’s would have that capability, where most manufacturers wouldn’t guarantee any specific overclock at all.

    http://www.chaintech.com.tw/tw/eng/news_show0728.asp

    • Anonymous
    • 17 years ago

    Nvidia is simply going to continue to be a 2nd class citizen until their next gen chip comes out next spring.

    • Anonymous
    • 17 years ago

    When you set aniso filtering in the driver control panel for ati cards, only the first texture stage is trilinearly filtered, the rest are bilinearly filtered. This applies to all the games you force aniso filtering on. However, for games where the game sets aniso filtering levels, ati does aniso as requested by the game, provided you have switched off aniso in the driver control panel, iirc. However this should apply to all games, not just direct3d games. I think ut2k3 and serious sam both provide a way to turn aniso filtering on in the game.

      • droopy1592
      • 17 years ago

      uh, oh. You sure about that?

        • Anonymous
        • 17 years ago

        I’m pretty sure.

        From a B3D article:
        http://www.beyond3d.com/reviews/ati/9800_256/index.php?p=13

        “...the “Performance” mode is limited to Bilinear Anisotropic Filtering, while the “Quality” mode has Trilinear Filtering. However, the Quality mode only applies Trilinear filtering to the base layer textures (those at texture stage 0), the other texture layers (stage 1 and above) only have Bilinear Filtering enabled, so that with multi-texturing and forced Anisotropic Filtering enabled there will be a Trilinear and Bilinear blends.”

        and whoops, in ut2k3 you have to edit an .ini to get aniso:

        “Although UT2003 doesn’t have any in-game control options for Anisotropic filtering, it can be enabled for the application by changing the the ut2003.ini file in the [UT2003 Install Directory]\system\ directory, as we have done here to select 16x AF.”

        and this is what ati says about their driver control panel settings:

        “ATI states that the control panel filtering options are only there for legacy applications, and the Trilinear/Bilinear blend was selected as a good compromise between image quality and performance - in most cases its would not be evident that there was any layers of Bilinear filtering, although there are the few odd cases, such as UT2003, that this can be noticed. ATI advocate that new applications should have Anisotropic Filtering controls and that any filtering requested by the application will always have whatever filtering that the application requests on all layers. ATI are investigating the possibility of adding more IQ control options to the control panel though.”

      • Dissonance
      • 17 years ago

      So why does Quake III Arena show proper trilinear filtering?

        • droopy1592
        • 17 years ago

        So what about post 24? Can you investigate this Diss?

        • Anonymous
        • 17 years ago

        I dunno.

        Maybe the scene doesn’t use much multitexturing, so you don’t see bilinear filtering?

        I guess a comparison in both ut2k3 and serious sam with quality aniso forced and using application setting to enable aniso would answer the question.

      • Dissonance
      • 17 years ago

      Another thing to note…. Serious Sam SE does trilinear properly when rendering in OpenGL mode, but not in Direct3D…. that’s with forced trilinear in the driver and identical settings otherwise.

    • Anonymous
    • 17 years ago

    well done TR and even better for including TR AoD in the review…

    Thank you for keeping an open mind! We want to know what is coming next…

    • Alex
    • 17 years ago

    First off, nice review Geoff, very nice indeed.

    Second, unless you are an all out Nvidia fan boy, you would have to think twice about draining that kind of $$ into a card that performs “ok.”

    • --k
    • 17 years ago

    Geoff good review. Did the FX5600 AA screenshots need to be gamma-corrected?

      • Dissonance
      • 17 years ago

      Nope, they’re straight up shots

    • droopy1592
    • 17 years ago

    whassup with the lack of trilinear filtering on the Radeon 9600 in 8x aniso mode? Oh lordy, just when we thought we were safe.

    Still, what’s up with Nvidia trying to render pretty frames while detecting screen capture attempts?


    • Anonymous
    • 17 years ago

    Liked the review Mr. Gasior! I considered this card before, but reneged after looking at the cooling enclosure that looked like it was protruding out a bit too much. On the driver side, what’s stopping you from posting benchmarks using the ‘press copy’ 51.75’s? Inquiring minds would like to see its DX9 benchmarks! they’re beta, but still….

      • R2P2
      • 17 years ago

      Yeah. Refusing to use beta drivers, while going right ahead and using a patch that its publisher says should not be used, seems a little inconsistent.

        • Anonymous
        • 17 years ago

        I guess you missed the fact that Nvidia just turned around and told people NOT to use the 51.75 beta drivers because they were incomplete, after people started finding all kinds of image quality anomalies in them (e.g. in AquaMark 3). Apparently, they were only intended for use with the Half-Life 2 benchmarks…

        Nvidia is starting to remind me of Mr. Short-Term Memory.

          • R2P2
          • 17 years ago

          Eidos said NOT to use the Tomb Raider patch. Why listen to one company and not the other?

          • Patrickr
          • 17 years ago

          Is there an actual quotation of this? I’m not saying I don’t believe you. I just wanted to read the quote.

            • WaltC
            • 17 years ago

            Here’s the quote:

            “It has come to the attention of Eidos that an unreleased patch for Tomb Raider: AOD has unfortunately been used as the basis for videocard benchmarking. While Eidos and Core appreciate the need for modern benchmarking software that utilizes the advanced shading capabilities of modern graphics hardware, Tomb Raider: AOD Patch 49 was never intended for public release and is not a basis for valid benchmarking comparisons. Core and Eidos believe that Tomb Raider: AOD performs exceptionally well on NVIDIA hardware.”

            Notice it does not say anything about pulling the patch. I was able to download the patch from Eidos UK without a problem yesterday after this information was published on GD.

            I have only seen this statement published on GD, however, and have been unable to locate it anywhere else on the ‘net or on Eidos’ sites (except as reprinted by various sites from GD.) However, the wording of it does very much sound like a legitimate PR statement to me. Still, the fact that it doesn’t seem to be a public statement which the public can read anywhere, except as reprinted by websites which don’t post links to the statement on an Eidos site, makes me wonder why Eidos would bother with such a statement in the first place–considering it apparently is not viewable by the public in an official form as an official Eidos position on anything.

            Of course, pulling the patch would be stupid as it contains many bug fixes for all 3d cards aside from the game’s benchmarking mode. I would imagine Core is pretty pissed, IF this is an actual Eidos statement, as they worked hard to include this benchmark in their software.

            If it is an actual Eidos statement, however, I have no doubt you’ll find nVidia’s influence behind it–they’ll oppose anything they think portrays their cards in an unflattering light; whether it’s true or not is of no consequence to them.

    • Rakhmaninov3
    • 17 years ago

    Good thorough review! Looks like the FX got ground into the floor with the heel of a dirty boot.

      • Honor
      • 17 years ago

      Now I wouldn’t say that. The 5600 ultra comes out on top in some tests and stays close in even more.

      I’d call the 5600 ultra the overall loser, but not “ground in the floor”.

      Unless you count the fact that the somewhat better 9600 pro is 70 bucks cheaper (albeit minus VI) or the 128meg 9700 (non-pro) is 3 bucks more.
      **all prices according to lowest price on pricewatch
      Note I only searched for this specific Chaintech card. Other new-ultra cards may be cheaper, but I doubt by much.

    • Anonymous
    • 17 years ago

    I mentioned before how I hadn’t seen a TR report anywhere on the front page in awhile… now there’s three, and a couple more just scrolled off. It’s good to see, especially since I realize this isn’t anyone’s “day job.”

      • Anonymous
      • 17 years ago

      ack, there’s a bug in here… first time I clicked “post anonymous,” it appeared nothing happened. Please delete the dupe.

    • Anonymous
    • 17 years ago

    Wow… you guys must really be backed up. Or we’ve been too demanding on reviews lately…

    IMHO, Nvidia is just overclocking everything to make up ground. And as we can see, clock speed isn’t the only thing that matters.

    • Anonymous
    • 17 years ago

    holy sh$t, a review on this card?? LOL omg.. this is like 3 or 4 months late..

    thx for the review though.. going to get a Radeon 9600pro.

      • Dissonance
      • 17 years ago

      NVIDIA announced the new Ultra a while ago, but it took forever for cards based on the new chips to actually come out. The Chaintech card we looked at has only been out for a couple of weeks.

      • Autonomous Gerbil
      • 17 years ago

      Diss, he’s thinking this was the old version of the Ultra, not the updated one.
