Today we’ll be looking at NV31 as implemented in BFG Technologies’ Asylum GeForce FX 5600 256MB card. As its name implies, the Asylum GeForce FX 5600 256MB packs 256MB of memory, but the card also has a few other tricks up its sleeve.
The dirt on NV31
Before we consider BFG Technologies’ implementation of the GeForce FX 5600, it’s worth taking a moment to go over some of the key capabilities of NVIDIA’s NV31 graphics chip. I’ll just be highlighting NV31’s more important features here, but a more detailed analysis of NV31’s feature set and how it compares with NV30 and NV34 can be found in my preview of the GeForce FX 5600.
- Pixel and vertex shader versions 2.0: The key selling point of the NV31 graphics chip, and indeed of NVIDIA’s entire GeForce FX line, is support for DirectX 9’s pixel and vertex shader versions 2.0. In fact, NVIDIA takes things a few steps further by supporting longer pixel shader programs than the pixel shader 2.0 spec calls for. Like NV30 and NV34, NV31 supports 64- and 128-bit floating-point precision in its pixel shaders. Of course, programs using 64-bit datatypes run faster than those using 128-bit datatypes. NVIDIA claims developers can be more efficient by declaring variables with only the precision they need, mixing 64-bit and 128-bit processing as required. ATI, by comparison, splits the difference and offers only 96-bit floating-point precision in the R300-series chips’ pixel shaders, although the rest of the graphics pipeline offers a range of datatypes, including 64-bit and 128-bit floating-point formats. Both companies’ compromises sacrifice some precision for performance; which is the better choice depends on real-world performance and image quality.
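The precision trade-off is easy to demonstrate in software. The sketch below simulates accumulating many small color contributions at roughly half (16-bit) and single (32-bit) float precision; it’s an illustration of why lower-precision datatypes are faster but riskier, not a model of NV31’s actual pipeline:

```python
import math

def round_to_bits(x, bits):
    # Round x to a float with `bits` bits of mantissa precision,
    # mimicking lower-precision storage (11 bits ~ half, 24 bits ~ single).
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)          # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** bits
    return math.ldexp(round(m * scale) / scale, e)

# Accumulate 1,000 small contributions of 0.001 each (true sum: 1.0).
half_sum = 0.0
single_sum = 0.0
for _ in range(1000):
    half_sum = round_to_bits(half_sum + round_to_bits(0.001, 11), 11)
    single_sum = round_to_bits(single_sum + round_to_bits(0.001, 24), 24)

print(half_sum)    # drifts noticeably below 1.0
print(single_sum)  # stays very close to 1.0
```

Long shader programs that accumulate many lighting terms can hit exactly this kind of drift, which is why the choice of datatype is left to the developer.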
- Clearly defined pipelines: NVIDIA has shrouded much of NV31’s internal architecture in mystery, but they have revealed that NV31 has four pixel pipelines, each of which has a single texturing unit. Unlike NV30, which can apparently lay down either four or eight pixels per clock depending on what kind of pixels are being rendered, NV31 lays down four pixels per clock across the board.
- Obfuscated shader structure: NVIDIA spells out NV31’s 4×1-pipe architecture quite clearly, but NV31’s shader structure is a closely guarded secret. ATI clearly defines how many pixel and vertex shaders are present in its R3x0 line of graphics chips, but NVIDIA keeps referring to the relative strength of the GeForce FX’s pixel and vertex shaders in terms of the level of “parallelism” in the chip’s programmable shader. NV30 has more parallelism than NV31, which has more parallelism than NV34, but NVIDIA isn’t quantifying anything beyond that.
- 0.13-micron manufacturing process: Like NV30, NV31 is manufactured using 0.13-micron process technology by the good folks at TSMC. Since NV31 runs at only 325MHz on the GeForce FX 5600, it doesn’t need the GeForce FX 5800 Ultra’s Dustbuster to keep cool. GeForce FX 5600 cards don’t necessarily need to draw juice from an auxiliary power source, either.
One interesting but slightly obscure feature of NV31 is its support for clock throttling in 2D applications. The same “Coolbits” registry hack that reveals NVIDIA’s overclocking tab in its Detonator drivers also lets users set the “3D” and “2D” core clock frequencies of the GeForce FX 5600. Lowering the core’s clock frequency should make the GeForce FX 5600 run a little cooler, which should help ambient case temperatures. Unfortunately, the GeForce FX 5600’s cooling fan speed doesn’t seem to throttle down when the card’s core clock speed decreases, which means noise levels are consistent regardless of whether a user is running in “2D” or “3D” mode.
Now that we know what’s going on with NV31, let’s check out BFG Technologies’ take on the GeForce FX 5600.
The GeForce FX 5600 should be a popular mid-range graphics product among NVIDIA’s graphics partners, so consumers will likely see a lot of different incarnations of the card on store shelves. Since all GeForce FX 5600 cards will share the same graphics chip and general feature list, it will be up to third-party board manufacturers to come up with attractive features to woo potential buyers to their brands. How does BFG Technologies differentiate its Asylum GeForce FX 5600 256MB? Let’s have a peek.
At first glance, the Asylum GeForce FX 5600 256MB looks like just about every other graphics card on the market. The card is dressed up on a blue board with silver trim, which isn’t terribly daring or wild these days, but should match the myriad of blue components currently on the market.
As its name implies, the Asylum GeForce FX 5600 256MB sports 256MB of memory, but it’s hard to tell by just looking at the card. The Asylum GeForce FX 5600 256MB’s memory chips are spread over both sides of the card, but with only eight memory chips in total, the board looks deceptively pedestrian.
A closer examination of the Asylum GeForce FX 5600 256MB’s TSOP memory chips reveals 32MB chips that are rated for operation at speeds as high as 250MHz (500MHz DDR). Since the card’s default memory clock speed is 500MHz DDR, there’s no “free” overclocking headroom that would keep the memory chips running within their specifications.
The TSOP memory chips used on the Asylum GeForce FX 5600 256MB tend to run hotter than newer and more expensive BGA memory chips, but given the card’s opulent memory spec, it’s easy to see why BFG Technologies opted for cheaper chips this time around.
Unlike the vast majority of graphics cards that get by with only video output support, the Asylum GeForce FX 5600 256MB supports video input via Philips’ SAA7114H PAL/NTSC/SECAM video decoder chip. The Asylum GeForce FX 5600 256MB is essentially a VIVO (video in, video out) product, but it doesn’t go quite as far as NVIDIA’s Personal Cinema or ATI’s All-in-Wonder when it comes to extra multimedia features.
The high-end GeForce FX 5800 Ultra was blasted for its loud Dustbuster cooling apparatus, but the Asylum GeForce FX 5600 256MB uses a much more conventional heat sink/fan combo to keep the GPU cool. In fact, the Asylum GeForce FX 5600 256MB’s cooler is almost identical to the cooler on Sapphire’s Radeon 9500 Pro. That’s a good thing, since the cooler is easy to pop off and about as quiet as they come. BFG Technologies did a good job using just the right amount of thermal compound between the heat sink and GPU, too.
A standard array of output ports graces the Asylum GeForce FX 5600 256MB, and BFG Technologies also throws in a VIVO dongle and DVI-to-VGA adapter. No video cables were included in the box, but the VIVO dongle does have composite and S-Video input and output ports.
Bundle-wise, there’s not much to talk about. A copy of Ulead’s Video Studio 6 SE is included with the card, which complements its VIVO capabilities nicely, and there’s also a copy of NVIDIA’s own NVDVD software in the box. As far as DVD playback software goes, NVDVD is pretty good, but it’s not going to be vastly superior to PowerDVD or WinDVD for most users.
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.
Our test system was configured like so:
| Component | Configuration |
| --- | --- |
| Processor | Athlon XP ‘Thoroughbred’ 2600+ 2.083GHz |
| Front-side bus | 333MHz (166MHz DDR) |
| North bridge | nForce2 SPP |
| South bridge | nForce2 MCP |
| Chipset drivers | NVIDIA 2.03 |
| Memory size | 512MB (2 DIMMs) |
| Memory type | Corsair XMS3200 PC2700 DDR SDRAM (333MHz) |
| Graphics cards | GeForce4 Ti 4200 8X 128MB, GeForce FX 5200 128MB, GeForce FX 5600 256MB (Detonator 43.45); Radeon 9500 Pro 128MB, Radeon 9600 Pro 128MB (CATALYST 3.2) |
| Storage | Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive |
| OS | Microsoft Windows XP Professional |
| OS updates | Service Pack 1, DirectX 9.0 |
Today we’ll be comparing the Asylum GeForce FX 5600 256MB’s performance with a couple of competitors from ATI and NVIDIA. Since there are already a couple of manufacturers making 256MB versions of the GeForce FX 5600, I’ll be discussing the GPU’s performance in more general terms, though BFG Technologies’ card was used throughout testing.
The GeForce FX 5600 should compete primarily with ATI’s Radeon 9600 Pro, so we should pay special attention to those results. It will also be interesting to see how the GeForce FX 5600 stacks up against its budget sibling, the GeForce FX 5200, and NVIDIA’s previous mid-range gaming card, the GeForce4 Ti 4200 8X.
In order to keep a level playing field, image quality-wise, I tested the NVIDIA cards with the “Application” image quality setting. The Radeon cards were tested using ATI’s “Quality” image quality option, which produces visuals roughly equivalent to NVIDIA’s “Application” setting.
The test system’s Windows desktop was set at 1024×768 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
We used the following versions of our test applications:
- FutureMark 3DMark 2001 SE Build 330
- FutureMark 3DMark03 Build 320
- Codecreatures Benchmark Pro
- Comanche 4 demo benchmark
- Quake III Arena v1.31
- Serious Sam SE v1.07
- VillageMark v1.17
- Unreal Tournament 2003
- Splinter Cell v1.2
All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
Peak theoretical fill rates and memory bandwidth don’t necessarily dictate a graphics card’s real-world performance, but they do a good job of setting some initial expectations. How does the GeForce FX 5600 look against the competition we’ve rounded up today?
| Card | Core clock (MHz) | Pixel pipelines | Peak fill rate (Mpixels/s) | Texture units per pixel pipeline | Peak fill rate (Mtexels/s) | Memory clock (MHz) | Memory bus width (bits) | Peak memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GeForce FX 5200 | 250 | 4 | 1000 | 1 | 1000 | 400 | 128 | 6.4 |
| GeForce FX 5600 | 325 | 4 | 1300 | 1 | 1300 | 500 | 128 | 8.0 |
| GeForce4 Ti 4200 8X | 250 | 4 | 1000 | 2 | 2000 | 512 | 128 | 8.2 |
| Radeon 9500 Pro | 275 | 8 | 2200 | 1 | 2200 | 540 | 128 | 8.6 |
| Radeon 9600 Pro | 400 | 4 | 1600 | 1 | 1600 | 600 | 128 | 9.6 |
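The peak numbers above fall out of simple arithmetic: pixel fill rate is core clock times pipelines, texel fill rate additionally multiplies by texture units per pipe, and memory bandwidth is the effective memory clock times the bus width in bytes. A quick sketch using figures from the table:

```python
def peak_fill_rate_mpixels(core_mhz, pipelines):
    # Pixels laid down per clock, times clock speed.
    return core_mhz * pipelines

def peak_fill_rate_mtexels(core_mhz, pipelines, tmus_per_pipe):
    # Each texture unit can apply one texel per pixel per clock.
    return core_mhz * pipelines * tmus_per_pipe

def peak_bandwidth_gbps(effective_mem_mhz, bus_width_bits):
    # Effective (DDR) clock times bus width in bytes, expressed in GB/s.
    return effective_mem_mhz * (bus_width_bits / 8) / 1000

# GeForce FX 5600: 325MHz core, 4x1 pipes, 500MHz DDR memory, 128-bit bus
print(peak_fill_rate_mpixels(325, 4))     # 1300 Mpixels/s
print(peak_fill_rate_mtexels(325, 4, 1))  # 1300 Mtexels/s
print(peak_bandwidth_gbps(500, 128))      # 8.0 GB/s
```

The GeForce4 Ti 4200 8X’s multi-texturing advantage comes straight from its second texture unit per pipe: 250MHz × 4 pipes × 2 TMUs = 2000 Mtexels/s.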
Competitive, sort of. The GeForce FX 5600’s single- and multi-texturing fill rates aren’t too far behind the Radeon 9600 Pro’s, and neither is the card’s available memory bandwidth. That the GeForce FX 5600’s multi-texturing fill rate is less than that of the GeForce4 Ti 4200 8X is a bit of a concern, though, especially since the GeForce FX 5600 doesn’t offer more memory bandwidth than the card it’s replacing in NVIDIA’s lineup.
The GeForce FX 5600 may not have as much peak memory bandwidth to play with as the GeForce4 Ti 4200 8X, but more advanced color and Z-compression techniques should help the GeForce FX 5600 make more effective use of what’s available. Of course, the Radeon 9600 Pro also has all sorts of color and Z-compression routines, so the GeForce FX 5600 will still be an underdog.
But enough with the theoretical peaks; what kind of fill rates can the GeForce FX 5600 muster in the real world?
Mediocre ones, I’m afraid. The GeForce FX 5600 pumps out more pixels and textures per second than the GeForce FX 5200, but that’s about it. The GeForce4 Ti 4200 8X and both Radeons are easily out ahead of the GeForce FX 5600, especially in 3DMark2001 SE’s multi-texturing fill rate test.
If the GeForce FX 5600’s fill rates aren’t all that impressive when compared with the competition, how do they look when compared with the card’s peak theoretical capabilities?
None of the cards we’re looking at today realize their entire peak theoretical single-texturing fill rate, so the GeForce FX 5600 doesn’t look too bad. However, when we look at multi-texturing performance, the GeForce FX 5600 and 5200 are the only two cards to exhibit real-world fill rates that are significantly lower than their theoretical peaks. The GeForce FX 5600 realizes only 84% of its peak theoretical multi-texturing fill rate, which looks particularly bad when the GeForce4 Ti 4200 8X and both Radeons realize nearly all of their multi-texturing fill rate potential.
Of course, these are just synthetic tests. The real proof will come at high resolutions in actual games.
Like ATI’s Radeon 9500 and 9600 Pro, the GeForce FX 5600 uses lossless algorithms to compress color and Z data and make more effective use of available memory bandwidth. ATI’s R3x0-derived graphics cards also use “Early Z” occlusion detection to all but eliminate overdraw, letting the chip dedicate all of its resources to rendering pixels that will actually be seen. We believe NVIDIA is using a similar “Early Z” technology to help eliminate overdraw with the GeForce FX, but we don’t know many specifics about it.
The GeForce FX 5600’s performance in VillageMark isn’t impressive at all. The card only just stays in front of the budget GeForce FX 5200, which puts it well behind the Radeons. Even with anisotropic filtering and antialiasing enabled, the GeForce FX 5600 only manages to pull ahead of the GeForce4 Ti 4200 8X.
The GeForce FX 5600 supposedly has more parallelism within its pixel shaders than the GeForce FX 5200. Let’s see how that theory works out in practice.
The GeForce FX 5600’s performance in 3DMark2001 SE’s pixel shader test is respectable, but it’s really hurting in the advanced pixel shader test, where the card is barely faster than a GeForce FX 5200.
Things don’t get much better in NVIDIA’s own ChameleonMark. The GeForce FX 5600 is faster than the sub-$100 GeForce FX 5200, but it can’t outrun the previous-gen GeForce4 Ti 4200 8X.
In 3DMark03’s pixel shader 2.0 test, which is our only true DirectX 9-class pixel shader test, the GeForce FX 5600 sits between the Radeons and the GeForce FX 5200. Unfortunately, the GeForce FX 5600 still looks like a rather poor performer with next-generation shaders. (The GeForce4 Ti 4200 8X doesn’t support pixel shaders 2.0, so it’s riding the pine for this test.)
Like with its pixel shaders, the GeForce FX 5600’s vertex shaders are all about levels of parallelism. How do they perform?
Not too well. The GeForce FX 5600’s scores are simply too close to the GeForce FX 5200 to justify the former’s price premium. In 3DMark2001 SE and 3DMark03, the GeForce FX 5600 is outclassed by the Radeons, which offer more than twice as many frames per second overall.
3DMark2001 SE’s transform and lighting tests are running as vertex shader programs on the GeForce FX 5600, and it’s not pretty. The GeForce FX 5600 is consistently faster than the GeForce FX 5200, but not by nearly enough, especially with the Radeons so far out ahead.
The GeForce FX 5600 hasn’t looked impressive in synthetic feature tests, but what about games people actually play?
Quake III Arena
The GeForce FX 5600 performs reasonably well in Quake III Arena, but really doesn’t stretch its legs until anisotropic filtering and antialiasing are enabled. Normally, 30 frames per second at 1600×1200 with 8X anisotropic filtering and 4X antialiasing would be impressive, but with the Radeons running at over 50 frames per second with the same settings, the GeForce FX 5600’s performance looks a little underwhelming.
Jedi Knight II
In Jedi Knight II, the GeForce FX 5600 is faster than the GeForce4 Ti 4200 8X with 8X anisotropic filtering and 4X antialiasing, but it never quite catches the Radeons.
The GeForce FX 5600 is way off the pace in Comanche with anisotropic filtering and antialiasing disabled. With 8X anisotropic filtering and 4X antialiasing, it’s just able to pull even with the GeForce4 Ti 4200 8X, but the Radeons are still way out ahead.
Codecreatures Benchmark Pro
The GeForce FX 5600’s performance in Codecreatures isn’t impressive until anisotropic filtering and antialiasing are enabled. Even then, it’s not faster than the Radeons, or even the GeForce4 Ti 4200 8X.
In our low- and high-detail Unreal Tournament 2003 tests, the GeForce FX 5600 performs more like a budget GeForce FX 5200 than it does a replacement for the GeForce4 Ti 4200 8X.
With anisotropic filtering and antialiasing enabled, the GeForce FX 5600 pulls ahead of the GeForce4 Ti 4200 8X, but is still handily beaten by the Radeons.
We used Serious Sam SE’s “Extreme quality” add-on for our tests, which automatically uses the highest level of anisotropic filtering available on each card. For these tests, the Radeons are using 16X anisotropic filtering, while the GeForce cards are using 8X anisotropic filtering.
When we look at average frame rates in Serious Sam SE, the GeForce FX 5600’s performance actually looks decent; it’s just behind the Radeon 9500 Pro. How do frame rates look over the length of the demo?
Pretty good. The GeForce FX 5600 doesn’t suffer the same stuttering at the start of the benchmark as the GeForce FX 5200, and it doesn’t exhibit any unusual peaks or dips in performance that could potentially ruin gameplay.
Let’s crank up the antialiasing options and see what happens.
With 4X antialiasing and 8X anisotropic filtering, the GeForce FX 5600 distances itself from the GeForce FX 5200 and pulls ahead of the GeForce4 Ti 4200 8X, but it’s still behind the Radeons.
Over the length of the benchmark demo, the GeForce FX 5600’s performance is closest to the GeForce4 Ti 4200 8X. Fortunately, the GeForce FX 5200’s issues at the start of the demo don’t plague the GeForce FX 5600.
Splinter Cell is a new addition to our benchmark suite, and I can’t think of a better way to break it in.
With antialiasing and anisotropic filtering disabled, the GeForce FX 5600 is barely faster than the GeForce FX 5200, and well off the pace set by ATI’s mid-range Radeons.
Since Splinter Cell dumps frame rate information for the entire benchmark demo into an Excel file, we can bust out some nifty “frame rate over time” graphs.
Performance is all over the map in Splinter Cell, even for the Radeons. The GeForce FX 5600’s performance isn’t quite as erratic as some of the other cards, but that’s mostly because its peak frame rates are much lower than the competition. Unfortunately, the troughs are quite a bit lower, too.
With anisotropic filtering and antialiasing enabled, the GeForce FX 5600’s performance scales well. For some reason, the Radeons don’t agree with Splinter Cell at 1600×1200 with anisotropic filtering and antialiasing enabled.
The GeForce FX 5600 is more competitive with anisotropic filtering and antialiasing enabled. However, since frame rates are so low overall, I wouldn’t recommend playing the game on any of these cards with both anisotropic filtering and antialiasing enabled.
NVIDIA hasn’t expressed concerns about the viability of 3DMark2001 SE, but maybe it should. Here the GeForce FX 5600 is way off the pace.
The GeForce FX 5600 is a comparatively poor performer in each of 3DMark2001 SE’s game tests, even when compared with the GeForce4 Ti 4200 8X it’s supposed to replace. Even in the shader-laden Nature test, the GeForce FX 5600 doesn’t have what it takes to pump out competitive frame rates.
The GeForce FX 5600’s standing improves in the DirectX 9-fortified 3DMark03, as the card clearly outpaces the older GeForce4 Ti 4200 8X.
The GeForce FX 5600’s performance in 3DMark03’s individual game tests is competitive with the GeForce4 Ti 4200 8X, but the Radeons are consistently out in front. In the DirectX 9 “Mother Nature” test, the GeForce FX 5600 is only half as fast as the Radeon 9500 Pro.
Because there’s more to a graphics card than performance, I’ve yanked a couple of 3DMark03 screen shots to examine the GeForce FX 5600’s image quality with anisotropic filtering and antialiasing disabled. Below are screenshots from the GeForce FX 5600, Radeon 9600 Pro, and DirectX 9’s reference software renderer. You can click on the images to get full-size versions of each screenshot.
The differences are subtle, but it’s clear that the GeForce FX 5600’s screen shot isn’t as close to the reference renderer as the Radeon 9600 Pro’s. The GeForce FX 5600’s dull, decidedly less “cinematic” sky is the dead giveaway.
SPECviewperf is really a workstation benchmark, but I’ve included it here for variety. After all, not all consumer graphics cards end up playing only games.
Comparatively, the GeForce FX 5600 performs much better in SPECviewperf than it does in our gaming tests.
Next, we’ll isolate the GeForce FX 5600’s performance in different antialiasing and anisotropic filtering modes. We’ve already had a glimpse of the card’s performance with 4X antialiasing and 8X anisotropic filtering in our game tests, but this section will provide a more thorough look at the card’s performance with these image quality features. Since the GeForce FX 5600 doesn’t support 6X antialiasing using the same algorithm it uses at other sample sizes, I’ve included results for its 6XS antialiasing mode.
At some resolutions, the GeForce FX 5600 is capable of 16X antialiasing, which is impressive in itself. What’s even more impressive is how the card’s performance trails off only slightly at higher antialiasing levels. That’s probably the NV31’s color compression engine at work. Color compression, naturally, becomes more effective at higher sampling rates.
Unfortunately, only at 1600×1200 does the GeForce FX 5600 come within striking distance of the nearest Radeon. The Radeons only support antialiasing up to 6X, but their performance up to that point is better than the GeForce FX 5600.
How good does the GeForce FX 5600’s 16X antialiasing actually look? First let’s check out the competition.
Looks good, no? Let’s quickly examine why.
Like the other members of ATI’s R3x0-derived line, the Radeon 9600 Pro gamma corrects its blends of sampled pixel fragments, so the output looks correct on PC monitors. PC monitors relate pixel brightness to signaling voltage nonlinearly (roughly as a power function), but most graphics cards blend antialiasing samples as if the relationship were linear. Gamma correcting the blended color values compensates for the monitor’s response, so the brightness steps along smoothed edges appear linear, as they should, on screen.
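The effect can be sketched in a few lines, assuming an idealized display gamma of 2.2 (the exact response varies by monitor, and this illustrates the principle rather than ATI’s hardware path):

```python
GAMMA = 2.2  # assumed idealized monitor response

def naive_blend(samples):
    # Average gamma-encoded sample values directly, no correction.
    return sum(samples) / len(samples)

def gamma_correct_blend(samples):
    # Decode samples to linear light, average, then re-encode
    # so the result displays at the perceptually correct brightness.
    linear = [s ** GAMMA for s in samples]
    avg = sum(linear) / len(linear)
    return avg ** (1.0 / GAMMA)

# An edge pixel half-covered by a white polygon over a black background:
samples = [1.0, 1.0, 0.0, 0.0]
print(naive_blend(samples))                     # 0.5 -- displays too dark
print(round(gamma_correct_blend(samples), 3))   # ~0.73 -- displays at half brightness
```

Without the correction, a half-covered edge pixel displays at well under half brightness, which is why uncorrected antialiased edges tend to look lumpy and dark.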
Also, in 2X and 4X AA modes, the Radeon 9600 Pro uses a rotated grid sampling pattern. With 6X antialiasing, the Radeon 9600 Pro uses a more complex, quasi-random sampling pattern to fool the human eye’s pattern recognition prowess and produce impressively clean edges.
Even with 16X antialiasing, the smoothed edges produced by the GeForce FX 5600’s multisampled antialiasing don’t look significantly better than 6X antialiasing on the Radeon 9600 Pro. Why? Partly because the GeForce FX 5600 isn’t gamma correcting any of its fragment blend operations, so the AA pixel values don’t look as true as they could. Even with larger sample sizes, the GeForce FX 5600 can’t overcome the image quality deficit versus the Radeon 9600 Pro.
Also, the NV31 chip is using a more uniform (and thus easier for the eye to detect) ordered grid sampling pattern. The GeForce FX 5600 does have 4XS and 6XS antialiasing modes that combine multisampling with supersampling to provide a little texture antialiasing. These “XS” modes also use a rotated grid sampling pattern. However, as you can see from the 6XS antialiasing screenshot, the results aren’t anything to write home about. Of course, sample patterns will probably have more impact in motion than in a still shot.
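A rough sketch of why grid orientation matters. The offsets below are illustrative stand-ins, not either company’s actual sample positions; the point is that rotating the grid spreads samples across more distinct rows and columns:

```python
# 4X ordered-grid sample offsets within a pixel (uniform 2x2 lattice).
ordered = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# 4X rotated-grid offsets (the same lattice, rotated within the pixel).
rotated = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def distinct_columns(samples):
    # Near-vertical edges are graded by how many distinct horizontal
    # sample positions a pattern offers: more positions, more coverage
    # steps, smoother-looking edges.
    return len({x for x, _ in samples})

print(distinct_columns(ordered))  # 2 -- only two coverage steps
print(distinct_columns(rotated))  # 4 -- four steps per edge gradient
```

Near-horizontal and near-vertical edges are where aliasing is most visible, so the rotated grid’s extra coverage steps along each axis pay off exactly where the eye is most likely to notice.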
To measure texture antialiasing, I used Serious Sam SE with various texture filtering settings.
The GeForce FX 5600’s performance closely mirrors that of the GeForce FX 5200 and is only a few frames per second faster than the budget DirectX 9 card. The Radeons walk away with this one.
All of our testing was done with the GeForce FX 5600’s “Application” image quality setting. Here’s why:
As we’ve seen with the GeForce FX 5200 and 5800 chips, the “Application” quality setting produces sharper textures than NVIDIA’s “Performance” and “Quality” options.
For comparison, here’s the Radeon 9600 Pro with ATI’s “Performance” and “Quality” 8X anisotropic filtering.
ATI’s “Quality” setting yields sharp textures comparable to NVIDIA’s “Application” image quality setting.
Now let’s see exactly what NVIDIA is doing to texture filtering with these various quality slider settings. I’ve used Q3A’s “r_colormiplevels 1” command to expose the various mip-maps in use and the transitions between them.
The mip-map transitions of the GeForce FX 5600’s “Performance” and “Quality” image quality settings aren’t as smooth as the mip-map transitions of the “Application” image quality setting.
As you can see, the mip-map transitions of ATI’s “Quality” image quality setting are just as smooth as NVIDIA’s “Application” setting.
First, let’s look at NVIDIA’s GeForce FX 5600 in general, then we’ll address BFG Tech’s particular implementation of it.
As a mid-range graphics product, the GeForce FX 5600 has a lot going for it. For as low as $169 online (for the 128MB version), the GeForce FX 5600 brings a DirectX 9 feature set to mainstream gamers and mid-range markets. On its own, the GeForce FX 5600 is a solid product that’s even capable of running the “Ultra” version of NVIDIA’s sexy Dawn demo at what looks like about 25 frames per second. That’s definitely better than the slideshow NVIDIA’s budget GeForce FX 5200 produces with Dawn Ultra, and it gives the GeForce FX 5600 more legitimate DirectX 9 functionality than its low-end cousin.
The GeForce FX 5600, however, does not exist in a vacuum. ATI’s Radeon 9500 Pro is still available online for as low as $160 and its replacement, the Radeon 9600 Pro, is just now becoming available for under $195. The Radeon 9500 Pro likely won’t be available for long, but with the impressive Radeon 9600 Pro taking its place, the GeForce FX 5600 faces stiff competition regardless.
Performance-wise, the Radeon 9500 and 9600 Pro lay down a beating on the GeForce FX 5600, even when the latter is equipped with 256MB of memory. As far as features go, there are few practical ways in which the Radeons are really lacking. The Radeon 9500 and 9600 Pro only support pixel shader programs up to 64 instructions in length (the limit of the pixel shader 2.0 specification), and they only offer 96-bit pixel shader precision rather than the 128-bit precision available to the GeForce FX 5600. However, those “onlys” have very little practical value. After all, it will be some time before games and applications start taking advantage of pixel and vertex shader versions 2.0 with any regularity, and the 3DMark03 image quality tests suggest that the GeForce FX 5600 doesn’t have the horsepower to take full advantage of the extra pixel shader precision it has available.
Really, about the only area in which the GeForce FX 5600 has a practical advantage over ATI’s mid-range Radeons is multimonitor software. NVIDIA’s nView software is far superior to ATI’s Hydravision, which could make the GeForce FX 5600 more appealing to multimonitor fans. Otherwise, I can’t recommend the GeForce FX 5600 with a clear conscience; ATI’s mid-range Radeons are faster, priced in the same class, and have at least comparable, if not better, image quality. Those only concerned with multimonitor software can always opt for a much cheaper GeForce FX 5200, which also supports nView.
The fact that BFG Technologies’ Asylum GeForce FX 5600 256MB is based around the GeForce FX 5600 puts the card at a disadvantage right off the bat. Simply adding extra memory and VIVO ports to the card doesn’t make the GeForce FX 5600 GPU competitive with the Radeon 9500 or 9600 Pro. At $249 on Pricewatch, the Asylum GeForce FX 5600 256MB is $50 more than the Radeon 9600 Pro and nearly $80 more than the cheapest 128MB GeForce FX 5600. I’m all for the VIVO port, which is worth a bit of a premium, but the extra memory probably isn’t, at least not for most of today’s games. Although 256MB of memory appears to be useful for antialiasing at high resolutions with high-detail textures, the GeForce FX 5600 doesn’t have the horsepower to produce playable frame rates in those situations.
With competition like the Radeon 9500 and 9600 Pro, there’s little reason for gamers, enthusiasts, or even casual consumers to buy a graphics card based on the GeForce FX 5600 today. Newer, more mature drivers from NVIDIA could improve the GeForce FX 5600’s performance, but there are never any guarantees in that department. Considering that GeForce FX 5600 cards rigged with 128MB of memory are available for around $169 online, the Asylum GeForce FX 5600 256MB isn’t even an especially attractive offering among GeForce FX 5600-based graphics cards (unless you have a particularly memory-intensive graphics application in mind for it). If the price on this Asylum card were to drop by a fair amount, the card’s extra RAM and VIVO capabilities could make it more appealing than GeForce FX 5600 cards from other manufacturers. At least for now, though, any GeForce FX 5600 will be comparatively slow next to the mid-range Radeon cards.