FOR THE BETTER part of a year, NVIDIA’s GeForce4 Ti 4200 owned the mid-range graphics scene. But that was last year, and NVIDIA’s graphics cards have fallen from grace since. The Ti 4200’s price/performance crown was eventually usurped by ATI’s Radeon 9500 Pro, which yielded to the Radeon 9600 Pro.
NVIDIA took a stab at reclaiming the mid-range graphics title with the NV31 chip and its corresponding GeForce FX 5600 line of graphics cards, but neither the vanilla nor the Ultra version of the 5600 was fast enough to knock ATI off the throne. So NVIDIA went back to the drawing board and came up with a new GeForce FX 5600 Ultra using more exotic chips running at higher clock speeds. The card's NV31 roots haven't changed, but this new Ultra does come with a core clocked at 400MHz and memory running at an effective 800MHz.
And it doesn’t require a Dustbuster.
Can NVIDIA’s revised GeForce FX 5600 Ultra knock off ATI’s Radeon 9600 Pro? Does either fight dirty with questionable driver “optimizations”? We’ve wrangled Chaintech’s GeForce FX 5600 Ultra-powered Apogee FX71 to find out.
The new GeForce FX 5600 Ultra’s NV31 graphics chip is virtually identical to what was found in the old Ultra, and also in the vanilla GeForce FX 5600. The chip is a four-by-one-pipe design that claims full DirectX 9 support complete with pixel and vertex shaders 2.0. NVIDIA has been cagey about the internal shader structure of its entire GeForce FX line, so I can’t tell you how many vertex or pixel shaders the chip effectively has. NV31 apparently has “more parallelism” in its programmable shaders than NV34, which powers the budget GeForce FX 5200 line, and “less parallelism” than NV30 and NV35, which power the GeForce FX 5800 and 5900 lines, respectively.
I could go on about NV31's capabilities, but instead I'll point you to my preview and subsequent review of the GeForce FX 5600. I'd rather talk about what makes this new Ultra different from its predecessors.
To meet the new GeForce FX 5600 Ultra’s 400MHz core clock speed demands, NVIDIA fabbed up a batch of NV31 graphics chips using the same flip-chip BGA packaging they use with high-end NV30 and NV35 graphics chips.
NV31 in its sexy, shiny new packaging

Apart from its new package and higher 400MHz core clock speed, NV31 hasn't really changed.
Before I bust out some pictures of Chaintech’s Apogee FX71 GeForce FX 5600 Ultra, here’s a look at the card’s spec sheet:
|Peak pixel fill rate||1600 Mpixels/s|
|Texture units/pixel pipeline||1|
|Textures per clock||4|
|Peak texel fill rate||1600 Mtexels/s|
|Memory type||BGA DDR SDRAM|
|Memory bus width||128-bit|
|Peak memory bandwidth||12.8GB/s|
|Ports||VGA, DVI, composite and S-Video outputs
Composite, S-Video inputs|
|Auxiliary power connector||Four-pin Molex|
The Apogee FX71's spec sheet is pretty much what you'd expect from a mid-range graphics card. Like much of its mid-range competition, the card sports 128MB of memory. Chaintech's card does break from the pack by offering video-in/video-out (VIVO) support, though.
Apart from the card’s VIVO support, nothing jumps off the Apogee FX71’s spec sheet that differentiates it from the slew of rebadged reference designs currently on the market. To see what really makes Chaintech’s GeForce FX 5600 Ultra unique, we have to take a look at the card.
This ain’t your momma’s reference card
Chaintech’s Apogee FX71 GeForce FX 5600 Ultra
On the surface, Chaintech has definitely outdone itself to produce a unique aesthetic for the Apogee FX71. I don’t know why, but the card’s faux-gold finish has me picturing it in some sort of chromed-out, low-rider case mod. The Apogee FX71 would look pretty sweet with a set of shiny wire wheels, don’t you think?
The Apogee FX71 has a heat spreader on the back of the card to help cool its rear-mounted memory chips. Since the rear heat spreader is only a thin sheet of metal, it shouldn’t create any clearance problems with motherboard DIMM tabs or larger north bridge heat sinks.
Unfortunately, the card does create some PCI slot clearance problems; its cooling enclosure can partially block the first PCI slot on some motherboards. The Apogee FX71’s cooler isn’t as wide as what’s found on the GeForce FX 5800 or 5900 Ultra, but it’s just wide enough to be a nuisance.
Combined with its longish frame, the Apogee FX71’s wider cooler could create problems for small form factor systems, but the card’s cooling system relies on that bulky shroud.
Popping open the Apogee FX71’s plastic-and-metal casing reveals a sea of heat sink fins cooled by a single “gas turbine” fan. The fan is situated such that the shroud is needed to properly direct air flow over the GPU heat sink. A far cry from the GeForce FX 5800 Ultra’s Dustbuster, the Apogee FX71’s cooler isn’t noticeably louder than what can be found cooling other GeForce FX 5600 cards or ATI’s Radeon 9600 Pro.
On the front of the card, the memory chips get much larger heat sinks than their rear-mounted counterparts. I've seen this done on a number of graphics cards, and I have to wonder if front-mounted chips really do get hotter than those mounted on the opposite side of the card. Of course, the size and shape of memory heat sinks probably has more to do with looking cool than with being cool.
Gently prying off the Apogee FX71’s heat sink reveals just the right amount of thermal compound smudged over the graphics chip. Because it allows the heat sink to be easily removed and replaced without damaging the contact area, I prefer seeing graphics card manufacturers use a thin layer of thermal compound over a one-time TIM pad that’s a pain to scrape off and replace.
The Apogee FX71 uses Hynix HY5DU283222AF-25 memory chips rated for operation at 400MHz. Somewhat surprisingly, the board only features 128MB of memory. Mid-range and even budget graphics cards with 256MB of memory seem to be all the rage these days, despite the fact that the extra memory seems to have little impact on performance at all but the highest resolutions with antialiasing and anisotropic filtering cranked all the way up.
Chaintech uses Philips’ SAA7114H video decoder to power the input side of the card’s VIVO capabilities. The Apogee FX71 doesn’t have a TV tuner chip, though.
Like the rest of the GeForce FX Ultra family, the Apogee FX71 has an auxiliary power connector. With a core clock speed of 400MHz and memory running at an effective 800MHz, it’s no surprise that the Apogee FX71 needs a little extra juice.
Single DVI and VGA outputs grace the Apogee FX71’s port cluster. Sadly, dual DVI hasn’t taken off for high-end graphics cards yet, let alone for mid-range offerings like the Apogee FX71. The card’s video input and output ports are handled by a VIVO dongle, which is pictured below.
The bountiful bundles of Chaintech’s Zenith motherboard line have given the company quite a reputation when it comes to extras, and the Apogee FX71 doesn’t disappoint.
The card comes with a full complement of cables and a DVI-to-VGA adapter, just like it should.
For its next trick, Chaintech includes a fan brush and monitor cleaning ball with the Apogee FX71. At first, I actually thought these extras were a little silly, but I’ve been using them too frequently over the past few weeks to joke about them now. The monitor cleaning ball does a great job of keeping dust off the many screens that line my benchmarking sweatshop, and the fan brush has proven quite effective for dusting heat sinks fins and fans on graphics cards, motherboard north bridge chips, and CPUs. Props to Chaintech for taking a chance on a couple of odd, but ultimately useful, extras.
Technically, the Apogee FX71’s blue LEDs aren’t bundled extras, but this seemed like the right place to talk about them. The card gives off a brilliant glow that will no doubt make case modding enthusiasts drool. Those without case windows won’t notice that the LEDs are there, which is really a shame. As much as I hate to admit to geeking out over a couple of LEDs, the Apogee FX71’s glow has grown on me. I’m not quite ready to rig up all my systems with cold cathodes, glow-in-the-dark accessories, and case windows, but I have to admit the Apogee FX71 has me thinking about it.
Rounding out the Apogee FX71’s suite of extra goodies is a modest software bundle that includes InterVideo’s WinCinema and WinProducer software, and a full version of MDK2. The bundle includes a CD of dated game demos, too, but nothing recent enough to raise any eyebrows. Honestly, I can’t understand why new graphics cards like the Apogee FX71 continue to come with games that are a couple of years old. MSI has started bundling much newer games with its GeForce FX cards, including Battlefield 1942, Command & Conquer Generals, and Unreal II, which makes the Apogee FX71’s dated game bundle look all the more lacking.
Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.
Our test system was configured like so:
|Processor||Athlon XP ‘Thoroughbred’ 2600+ 2.083GHz|
|Front-side bus||333MHz (166MHz DDR)|
|Motherboard||DFI LANParty NFII Ultra|
|Chipset||NVIDIA nForce2 Ultra 400|
|North bridge||nForce2 Ultra 400 SPP|
|South bridge||nForce2 MCP-T|
|Chipset drivers||NVIDIA 2.45|
|Memory size||512MB (2 DIMMs)|
|Memory type||Corsair XMS3200 PC2700 DDR SDRAM (333MHz)|
|Graphics card||GeForce FX 5600 Ultra 128MB
GeForce FX 5600 256MB
Radeon 9600 Pro 128MB|
|Graphics driver||Detonator FX 45.23||CATALYST 3.7|
|Storage||Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive|
|OS||Microsoft Windows XP Professional|
|OS updates||Service Pack 1, DirectX 9.0b|
I’ve rounded up ATI’s mid-range Radeon 9600 Pro and NVIDIA’s non-Ultra GeForce FX 5600 for comparison today. With 256MB of memory, the GeForce FX 5600 I’m using has a bit of an advantage over the other cards. However, 256MB of memory is really only needed with very high antialiasing levels and at high resolutions; under those conditions, the GeForce FX 5600 GPU will become a bottleneck before the card has a chance to use its extra memory.
Those who follow the graphics performance game closely will notice that I'm not using the "release 50" Detonator 51.75 drivers that have been making the rounds online. As with every major driver release, NVIDIA claims that the Detonator 50s will offer nothing short of a performance revolution, but we prefer to use graphics drivers in our reviews that are available to the general public. I actually have a press copy of the 51.75s from NVIDIA, but they're quite clearly marked as beta drivers and may not be representative of what's eventually available to consumers.
The test system’s Windows desktop was set at 1024×768 in 32-bit color at a 75Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
We used the following versions of our test applications:
- Futuremark 3DMark 2001 SE Build 330
- Futuremark 3DMark03 Build 330
- Codecreatures Benchmark Pro
- VillageMark v1.17
- Comanche 4 demo benchmark
- Quake III Arena v1.31 with trdemo1.dm_67
- Wolfenstein: Enemy Territory with demo0000.dm_82
- Serious Sam SE v1.07 with Demo0003
- Unreal Tournament 2003 with trtest1.dem
- Splinter Cell v1.2 with TRKalinatekDemo.bin
- Jedi Knight II: Jedi Outcast with trtest1.dm_15
- SPECviewperf v7.1
- Tomb Raider: Angel of Darkness v49 patch
- X2 rolling demo
- Gun Metal benchmark v1.20
All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
In a rare act of disclosure, NVIDIA has confirmed that the GeForce FX 5600 Ultra’s NV31 graphics chip is a 4×1-pipe design. Since the chip has a 128-bit memory bus, we can easily determine its theoretical peak single- and multi-texturing fill rates, and its available memory bandwidth. Here’s how NVIDIA’s new Ultra compares with the competition.
|Core clock (MHz)||Pixel pipelines||Peak fill rate (Mpixels/s)||Texture units per pixel pipeline||Peak fill rate (Mtexels/s)||Memory clock (MHz)||Memory bus width (bits)||Peak memory bandwidth (GB/s)|
|GeForce FX 5600||325||4||1300||1||1300||500||128||8.0|
|Radeon 9600 Pro||400||4||1600||1||1600||600||128||9.6|
|GeForce FX 5600 Ultra||400||4||1600||1||1600||800||128||12.8|
The new GeForce FX 5600 Ultra offers the same peak theoretical single- and multi-texturing fill rates as ATI's Radeon 9600 Pro, and it has 3.2GB/sec of extra memory bandwidth over ATI's mid-range Radeon. Of course, these are just theoretical peaks. Real-world fill rate is what matters.
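For the curious, the peak figures in the table above fall straight out of the clock speeds, pipeline counts, and bus widths. Here's a quick sketch of the arithmetic (the function names are mine, not anything official):

```python
# Theoretical peaks for the cards in the table above.
# Pixel fill rate = core clock (MHz) x pixel pipelines, in Mpixels/s.
# With one texture unit per pipe, the texel rate matches the pixel rate.
# Memory bandwidth = effective memory clock (MHz) x bus width in bytes, in GB/s.

def peak_fill_mpixels(core_mhz, pipelines):
    return core_mhz * pipelines  # Mpixels/s

def peak_bandwidth_gbps(mem_mhz_effective, bus_bits):
    return mem_mhz_effective * (bus_bits / 8) / 1000  # GB/s

# GeForce FX 5600 Ultra: 400MHz core, 4 pipes, 800MHz effective DDR, 128-bit bus
print(peak_fill_mpixels(400, 4))       # 1600 Mpixels/s
print(peak_bandwidth_gbps(800, 128))   # 12.8 GB/s

# Radeon 9600 Pro: 400MHz core, 4 pipes, 600MHz effective DDR, 128-bit bus
print(peak_bandwidth_gbps(600, 128))   # 9.6 GB/s
```

The 3.2GB/s bandwidth gap quoted above is simply the difference between those last two results.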
The GeForce FX 5600 Ultra offers superior single-texturing fill rates in 3DMark03, but its multi-texturing performance isn’t so hot. How do these real-world fill rates compare to the theoretical peaks we were just looking at?
None of the cards comes close to realizing its theoretical single-texturing fill rate, but the GeForce FX 5600 Ultra leads the pack. However, the card can't match the Radeon 9600 Pro, which realizes nearly 97% of its theoretical multi-texturing fill rate. Since the GeForce FX 5600 Ultra has all that extra memory bandwidth, the bottleneck probably isn't the memory interface.
The GeForce FX 5600 Ultra uses lossless color and Z-compression to maximize its available resources. The card also employs occlusion detection algorithms to help reduce or eliminate overdraw.
In the overdraw-laden VillageMark test, the GeForce FX 5600 Ultra doesn’t quite have what it takes to best the Radeon 9600 Pro.
Rather than trying to untangle the mess of “levels of parallelism” within the NV31’s programmable shader unit to predict pixel shader performance, I’ll let the numbers do the talking.
Ouch. In NVIDIA’s own ChameleonMark pixel shader test, the GeForce FX 5600 Ultra gets spanked by the Radeon 9600 Pro. The fact that the GeForce FX 5600 Ultra is dominated by ATI in one of NVIDIA’s own benchmarks has to be embarrassing.
In 3DMark03’s pixel shader 2.0 test, the GeForce FX 5600 Ultra is again beaten by the Radeon 9600 Pro. The Ultra does offer a performance boost over the vanilla GeForce FX 5600, but neither is within striking distance of ATI’s mid-range Radeon.
Like its pixel shaders, the GeForce FX’s vertex shaders are all about “levels of parallelism” within the chip’s programmable shader.
And, like we saw in our pixel shader tests, the GeForce FX 5600 Ultra is smacked around by the Radeon 9600 Pro in 3DMark03's vertex shader test. Perhaps the GeForce FX 5600 Ultra needs more of that parallelism NVIDIA keeps talking about.
Real-Time High-Dynamic Range Image-Based Lighting
To test the GeForce FX 5600 Ultra’s performance with high-dynamic-range lighting, we used FRAPS to log frame rates in this technology demo with its default settings. The demo uses high-precision texture formats and version 2.0 pixel shaders to produce high-dynamic-range lighting, depth of field, motion blur, and glare, among other effects.
The GeForce FX 5600 Ultra really struggles here; it’s well behind the Radeon 9600 Pro. Valve has already demoed high-dynamic-range lighting in Half-Life 2, making the GeForce FX 5600 Ultra’s poor performance in this test particularly discouraging.
Quake III Arena
Jedi Knight II
Wolfenstein: Enemy Territory
Results from our three Quake III engine-based games are mixed, but the GeForce FX 5600 Ultra generally performs well. Overall, the card is faster than the Radeon 9600 Pro in all but Quake III Arena with 4X antialiasing and 8X anisotropic filtering. The 5600 Ultra turns in particularly impressive performances in Jedi Knight II and Wolfenstein: Enemy Territory.
Unreal Tournament 2003
Codecreatures Benchmark Pro
In our next batch of game tests, the GeForce FX 5600 Ultra doesn’t look quite as hot. The card only manages to best the Radeon 9600 Pro in Codecreatures with antialiasing and aniso disabled, but is at least close in most of the tests.
Gun Metal benchmark
X2 rolling demo
In the DirectX 9 Gun Metal benchmark, which NVIDIA pimps on its own web site, the GeForce FX 5600 Ultra fares well against ATI's mid-range Radeon. However, X2's DirectX 9 "rolling demo" shows the Radeon 9600 Pro out ahead again. What's particularly interesting about the X2 demo is that its developers relied primarily on vertex shaders rather than pixel shaders to achieve the game's special effects.
Tomb Raider: Angel of Darkness
Tomb Raider: Angel of Darkness gets its own special little intro here because its publisher, EIDOS Interactive, has released a statement claiming that the latest patch, which includes a performance benchmark, was never intended for public release. Too late: the patch is already public. We've used these extreme quality settings from Beyond3D to give the GeForce FX 5600 Ultra a thorough workout in this DirectX 9 game, and the results speak for themselves.
Ouch. The only thing more painful than the sting NVIDIA must feel from this kind of performance is the amount of time it takes the GeForce FX cards to run through this game’s “paris3” demo. Seriously, it delayed the completion of this review by a couple of days.
According to EIDOS, the patch we’re using is “not a basis for valid benchmarking comparisons,” so you can consider the above results invalid, if you’d like.
Serious Sam SE
In Serious Sam SE, the GeForce FX 5600 Ultra just edges the Radeon 9600 Pro with antialiasing and aniso disabled, and pulls a little further ahead with 4X antialiasing and 8X aniso.
In Splinter Cell, the GeForce FX 5600 Ultra and Radeon 9600 Pro are neck and neck. The GeForce FX 5600 Ultra squeaks ahead at higher resolutions, but it’s a close race.
In 3DMark03, which NVIDIA maintains should never be used for graphics card benchmarking, the GeForce FX 5600 Ultra fares surprisingly well against the Radeon 9600 Pro. How does the card do in the individual game tests?
Quite well, until we get to the pixel shader-packed Mother Nature test. In the Mother Nature test, the Radeon 9600 Pro has a healthy lead.
3DMark03 image quality
Because frame rates aren’t everything, let’s take a quick peek at the GeForce FX 5600 Ultra’s image quality in 3DMark03.
Below are cropped JPEGs of frame 1799 of 3DMark03’s “Mother Nature” test. Click on the images for a full-size PNG of each screen shot.
At least in 3DMark03, the GeForce FX 5600 Ultra’s image quality is on par with the Radeon 9600 Pro. Unfortunately, it’s been suggested that NVIDIA drivers can detect screen capture attempts and render higher quality frames for output, so I have to take these screenshots with a grain of salt. When playing in real-time, Mother Nature looks great on the GeForce FX 5600 Ultra, but given recent events, I don’t feel completely safe trusting NVIDIA’s drivers.
In SPEC's viewperf suite of workstation graphics tests, the GeForce FX 5600 Ultra is faster than the Radeon 9600 Pro in all but the Unigraphics (ugs) test.
The GeForce FX 5600 Ultra’s antialiasing performance trails the Radeon 9600 Pro through most resolutions, but the 5600 Ultra has the 9600 Pro within its reach at 1600×1200. Let’s have a look at how well the GeForce FX 5600 Ultra’s antialiasing levels get rid of pesky jagged edges.
Edge antialiasing quality – GeForce FX 5600 Ultra
With higher antialiasing levels, the GeForce FX 5600 Ultra looks progressively better, but we’ll have to check out the Radeon 9600 Pro’s antialiasing image quality before passing judgment.
Edge antialiasing quality – Radeon 9600 Pro
It only goes up to 6X, but the Radeon 9600 Pro’s gamma-corrected antialiasing looks better than what’s available on the GeForce FX 5600 Ultra. Not only do the Radeon 9600 Pro’s antialiasing modes look better, level-for-level, than what’s available on the GeForce FX 5600 Ultra, the Radeon 9600 Pro at 6X looks better than the GeForce FX 5600 Ultra at 8X.
The GeForce FX 5600 Ultra's anisotropic filtering performance is much improved over the vanilla GeForce FX 5600. It's quick enough to best the Radeon 9600 Pro in Serious Sam SE. Let's take a look at some texture filtering screenshots in Quake III Arena and Unreal Tournament 2003 to see anisotropic filtering in action.
Texture antialiasing quality – GeForce FX 5600 Ultra
The GeForce FX 5600 Ultra’s 8X anisotropic filtering has a marked impact on image quality in both games. Notice how much sharper the textures look off in the distance.
Texture antialiasing quality – Radeon 9600 Pro
The GeForce FX 5600 Ultra looks just as good with 8X aniso as the Radeon 9600 Pro.
For kicks, let’s get all psychedelic and break out some multi-colored mip map levels.
Mip map levels – GeForce FX 5600 Ultra
The GeForce FX 5600 Ultra is correctly applying trilinear and anisotropic filtering in Quake III Arena, but it looks like the card is using a less aggressive form of trilinear filtering in Unreal Tournament 2003. The mip map transitions look much harsher in Unreal Tournament than in Quake III Arena, but the "light" version of trilinear filtering that NVIDIA is using actually does a pretty good job of hiding mip-map boundaries during actual gameplay.
As good as NVIDIA’s lighter trilinear filtering routine looks in Unreal Tournament 2003, it’s an application-specific optimization of questionable integrity. The above shots were taken with trilinear filtering enabled in Unreal Tournament 2003’s graphics settings control panel using the Detonator FX driver’s “application” image quality settings. Under the “application” setting, the driver should let Unreal Tournament 2003 decide what kind of trilinear filtering to use. Instead, NVIDIA’s driver is stepping in to force a lower quality form of trilinear filtering.
Since Unreal Tournament 2003’s mip map transitions look fine with even the lighter trilinear filtering, I’m not going to get too worked up. However, I would like to see NVIDIA’s “application” image quality settings provide true trilinear filtering when it’s requested by an application.
Mip map levels – Radeon 9600 Pro
The Radeon 9600 Pro is doing trilinear and anisotropic filtering properly in Quake III Arena, but it’s not doing any trilinear filtering in Unreal Tournament 2003 with 8X aniso.
Is this a driver bug? An optimization? A cheat? I can’t say for sure. I can, however, confirm that this issue occurs in more than just Unreal Tournament 2003. Serious Sam SE also shows a lack of trilinear filtering with 8X aniso when it’s rendered in Direct3D mode:
With OpenGL, trilinear and 8X anisotropic filtering work properly on the 9600 Pro in Serious Sam SE, but not with Direct3D. Since Unreal Tournament 2003 also uses Direct3D, this issue appears to be API-specific rather than application-specific, which means it could easily be a driver-related bug. However, it could also be a cleverly disguised attempt to inflate benchmark scores. Given the extra scrutiny reviewers have been giving image quality of late, it would seem an incredibly stupid move for ATI to try to intentionally cheat on image quality.
Keep in mind that we tested both Unreal Tournament 2003 and Serious Sam SE with anisotropic and trilinear filtering enabled. Our scores are indicative of performance with the current Catalyst 3.7 drivers, but may not reflect the Radeon 9600 Pro’s performance if and when this trilinear filtering issue is resolved.
In testing, I was able to get Chaintech’s GeForce FX 5600 Ultra card stable and artifact-free with core and memory clock speeds of 450 and 880MHz, respectively. All in all, that’s an impressive overclock, especially since the Apogee FX71’s cooling solution is no louder than a conventional cooler.
Of course, just because I was able to get my sample up to 450/880 doesn’t mean every Apogee FX71 can go that high. Some cards may not overclock at all, and some may be capable of higher clock speeds.
With bumped core and memory clock speeds, the GeForce FX 5600 Ultra fares better in Unreal Tournament 2003, but it still can’t catch the Radeon 9600 Pro.
This review really looked at two things: the GeForce FX 5600 Ultra GPU, and Chaintech’s Apogee FX71 graphics card. My conclusions regarding the GeForce FX 5600 Ultra GPU apply to all cards based on the chip, including the Apogee FX71, so I’ll deal with those first.
Overall, the GeForce FX 5600 Ultra exceeded my performance expectations in DirectX 8-class applications, but it's pokey when pixel and vertex shaders are active, especially DirectX 9 pixel shaders. The 5600 Ultra's performance in our high-dynamic-range lighting test isn't encouraging, nor is its performance in Tomb Raider: AOD. EIDOS has stated that Tomb Raider isn't a valid benchmark, but that's bunk. The game is out and playable, and the patch for the game was publicly available; it's not like Tomb Raider is some sort of synthetic test with no real-world roots.
In DirectX 9 applications, the GeForce FX 5600 Ultra has a lot of ground to make up if it wants to run with the Radeon 9600 Pro. Too much ground, I think. There's no doubt in my mind that NVIDIA's next public driver release will improve performance in applications like Tomb Raider and Valve's upcoming Half-Life 2, but there's only so much a driver can do with what looks like a woefully underpowered shader unit.
So, while the GeForce FX 5600 Ultra turns in a good performance in the older games we’ve been testing for years, the outlook is bleak for next-generation titles. Unless NVIDIA can pull off a driver miracle with the next round of Detonators, the Radeon 9600 Pro is a better mid-range graphics card for tomorrow’s games and more than a capable performer with current titles.
For a moment, let's ignore my apprehension regarding the GeForce FX 5600 Ultra GPU itself. Doing so makes the Apogee FX71 look like a pretty sweet product. The card's flashy aesthetic may not jibe with everyone's fashion sense, but its quiet cooling system, VIVO support, and surprisingly useful bundle make it a unique offering in a sea of boring reference cards. At $215 online, the Apogee FX71 is in the same price range as other VIVO-equipped GeForce FX 5600 Ultra-based graphics cards. However, Radeon 9600 Pros are available for around $140 online, a tough reality to ignore for anyone who isn't an NVIDIA fanboy.
Finding a Radeon to compete with the Apogee FX71’s VIVO support is a little trickier, but it can be done. The All-in-Wonder 9600 Pro can be pre-ordered for about $260 and comes with a TV Tuner, RF Remote Wonder, and a slew of multimedia software in addition to a full suite of VIVO options. To me, that’s worth the extra $45.
In the end, my enthusiasm for Chaintech’s Apogee FX71 is curbed only by the card’s use of NVIDIA’s GeForce FX 5600 Ultra GPU. The card is no doubt a capable and competitive performer in many older games, but Radeon 9600 Pro-based graphics cards are cheaper, faster, and look like more promising performers for next-generation games.