NVIDIA’s GeForce FX 5600 GPU

THUS FAR, the performance of NVIDIA’s GeForce FX line has left something to be desired. The GeForce FX 5800 Ultra is a noisy alternative to ATI’s Radeon 9800 Pro, and the GeForce FX 5200 is a slower performer in real-world games and applications than the Radeon 9000 Pro. For enthusiasts, however, the GeForce FX 5600 may be the most interesting card in the GeForce FX line. Powered by NVIDIA’s NV31 graphics chip, the GeForce FX 5600 is a mid-range graphics card aimed at ATI’s Radeon 9500 and 9600 lines. Like other members of the GeForce FX family, the GeForce FX 5600 is dressed up with enough DirectX 9 goodies to give users a “cinematic” experience, although features alone don’t guarantee performance.

Today we’ll be looking at NV31 as implemented in BFG Technologies’ Asylum GeForce FX 5600 256MB card. As its name implies, the Asylum GeForce FX 5600 256MB packs 256MB of memory, but the card also has a few other tricks up its sleeve.

The dirt on NV31
Before we consider BFG Technologies’ implementation of the GeForce FX 5600, it’s worth taking a moment to go over some of the key capabilities of NVIDIA’s NV31 graphics chip. I’ll just be highlighting NV31’s more important features here, but a more detailed analysis of NV31’s feature set and how it compares with NV30 and NV34 can be found in my preview of the GeForce FX 5600.

  • Pixel and vertex shader versions 2.0 — The key selling point of the NV31 graphics chip, and indeed of NVIDIA’s entire GeForce FX line, is support for DirectX 9’s pixel and vertex shader versions 2.0. In fact, NVIDIA takes things a few steps further by supporting longer pixel shader programs than the pixel shader 2.0 spec calls for. Like NV30 and NV34, NV31 supports 64- and 128-bit floating-point precision in its pixel shaders. Of course, programs using 64-bit datatypes run faster than those using 128-bit datatypes. NVIDIA claims developers can be more efficient by declaring variables with only as much precision as they actually need, mixing 64-bit and 128-bit processing as required (a quick numeric sketch of this tradeoff follows the list below). ATI, by comparison, splits the difference and offers only 96-bit floating-point precision in the R300-series chips’ pixel shaders, although the rest of the graphics pipeline offers a range of datatypes, including 64-bit and 128-bit floating-point formats. Both companies’ compromises sacrifice some precision for performance; which is the better choice depends on real-world performance and image quality.

  • Clearly defined pipelines — NVIDIA has shrouded much of NV31’s internal architecture in mystery, but they have revealed that NV31 has four pixel pipelines, each of which has a single texturing unit. Unlike NV30, which can apparently lay down either four or eight pixels per clock depending on what kind of pixels are being rendered, NV31 lays down four pixels per clock across the board.

  • Obfuscated shader structure — NVIDIA spells out NV31’s 4×1-pipe architecture quite clearly, but NV31’s shader structure is a closely guarded secret. ATI clearly defines how many pixel and vertex shaders are present in its R3x0 line of graphics chips, but NVIDIA keeps referring to the relative strength of the GeForce FX’s pixel and vertex shaders in terms of the level of “parallelism” in the chip’s programmable shader. NV30 has more parallelism than NV31, which has more parallelism than NV34, but NVIDIA isn’t quantifying anything beyond that.

  • 0.13-micron manufacturing process — Like NV30, NV31 is manufactured using 0.13-micron process technology by the good folks at TSMC. Since NV31 runs at only 325MHz on the GeForce FX 5600, it doesn’t need the GeForce FX 5800 Ultra’s Dustbuster to keep cool. GeForce FX 5600 cards don’t necessarily need to draw juice from an auxiliary power source, either.
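
As promised in the shader bullet above, here is a small CPU-side sketch of the precision tradeoff NVIDIA is describing. NV31’s 64-bit mode stores a four-component color as four FP16 values and its 128-bit mode as four FP32 values; NumPy’s half- and single-precision types below merely stand in for those per-component formats, so treat this as an illustration of the rounding behavior, not as shader code.

```python
import numpy as np

# CPU-side stand-in for the per-component pixel shader formats: NV31's 64-bit
# mode stores a 4-component color as 4 x FP16, its 128-bit mode as 4 x FP32.
# This only illustrates the rounding-error difference between the two formats.
def accumulate(dtype, steps=1000):
    total = dtype(0)
    step = dtype(0.001)
    for _ in range(steps):
        total = dtype(total + step)   # round the running sum to the target format
    return float(total)

print("FP16 (half) result:", accumulate(np.float16))   # visibly off from 1.0
print("FP32 (full) result:", accumulate(np.float32))   # agrees with 1.0 to ~7 digits
```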

One interesting but slightly obscure feature of NV31 is its support for clock throttling in 2D applications. The same “Coolbits” registry hack that reveals NVIDIA’s overclocking tab in its Detonator drivers also lets users set the “3D” and “2D” core clock frequencies of the GeForce FX 5600. Lowering the core’s clock frequency should make the GeForce FX 5600 run a little cooler, which should help ambient case temperatures. Unfortunately, the GeForce FX 5600’s cooling fan speed doesn’t seem to throttle down when the card’s core clock speed decreases, which means noise levels are consistent regardless of whether a user is running in “2D” or “3D” mode.
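
For the curious, here is a minimal, Windows-only sketch of the “Coolbits” registry tweak mentioned above. The key path and value are the ones commonly circulated for Detonator drivers of this era and may differ between driver versions, so treat them as assumptions, back up your registry first, and run the script from an account with administrator rights.

```python
# Hedged sketch of the "Coolbits" registry hack for NVIDIA's Detonator drivers.
# The key path and the value 3 are the commonly cited ones for this era's
# drivers and are assumptions here; they may vary by driver version.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    # A value of 3 is what most guides cite for exposing the clock frequency
    # (overclocking) tab in the Detonator control panel.
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

print("Coolbits set; reopen the NVIDIA display properties to see the clock tab.")
```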


NV31 in all its glory

Now that we know what’s going on with NV31, let’s check out BFG Technologies’ take on the GeForce FX 5600.

BFG Technologies’ Asylum GeForce FX 5600 256MB
The GeForce FX 5600 should be a popular mid-range graphics product among NVIDIA’s graphics partners, so consumers will likely see a lot of different incarnations of the card on store shelves. Since all GeForce FX 5600 cards will share the same graphics chip and general feature list, it will be up to third-party board manufacturers to come up with attractive features to woo potential buyers to their brands. How does BFG Technologies differentiate its Asylum GeForce FX 5600 256MB? Let’s have a peek.

At first glance, the Asylum GeForce FX 5600 256MB looks like just about every other graphics card on the market. The card is dressed up on a blue board with silver trim, which isn’t terribly daring or wild these days, but should match the myriad of blue components currently on the market.

As its name implies, the Asylum GeForce FX 5600 256MB sports 256MB of memory, but it’s hard to tell by just looking at the card. The Asylum GeForce FX 5600 256MB’s memory chips are spread over both sides of the card, but with only eight memory chips in total, the board looks deceptively pedestrian.

A closer examination of the Asylum GeForce FX 5600 256MB’s TSOP memory chips reveals 32MB chips that are rated for operation at speeds as high as 250MHz (500MHz DDR). Since the card’s default memory clock speed is 500MHz DDR, there’s no “free” overclocking headroom that would keep the memory chips running within their specifications.

The TSOP memory chips used on the Asylum GeForce FX 5600 256MB tend to run hotter than newer and more expensive BGA memory chips, but given the card’s opulent memory spec, it’s easy to see why BFG Technologies opted for cheaper chips this time around.

Unlike the vast majority of graphics cards that get by with only video output support, the Asylum GeForce FX 5600 256MB supports video input via Philips’ SAA7114H PAL/NTSC/SECAM video decoder chip. The Asylum GeForce FX 5600 256MB is essentially a VIVO (video in, video out) product, but it doesn’t go quite as far as NVIDIA’s Personal Cinema or ATI’s All-in-Wonder when it comes to extra multimedia features.

The high-end GeForce FX 5800 Ultra was blasted for its loud Dustbuster cooling apparatus, but the Asylum GeForce FX 5600 256MB uses a much more conventional heat sink/fan combo to keep the GPU cool. In fact, the Asylum GeForce FX 5600 256MB’s cooler is almost identical to the cooler on Sapphire’s Radeon 9500 Pro. That’s a good thing, since the cooler is easy to pop off and about as quiet as they come. BFG Technologies did a good job using just the right amount of thermal compound between the heat sink and GPU, too.

A standard array of output ports graces the Asylum GeForce FX 5600 256MB, and BFG Technologies also throws in a VIVO dongle and DVI-to-VGA adapter. No video cables were included in the box, but the VIVO dongle does have composite and S-Video input and output ports.

Bundle-wise, there’s not much to talk about. A copy of Ulead’s Video Studio 6 SE is included with the card, which complements its VIVO capabilities nicely, and there’s also a copy of NVIDIA’s own NVDVD software in the box. As far as DVD playback software goes, NVDVD is pretty good, but it’s not going to be vastly superior to PowerDVD or WinDVD for most users.

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run three times, and the results were averaged.

Our test system was configured like so:

System
Processor Athlon XP ‘Thoroughbred’ 2600+ 2.083GHz
Front-side bus 333MHz (166MHz DDR)
Motherboard Asus A7N8X
Chipset NVIDIA nForce2
North bridge nForce2 SPP
South bridge nForce2 MCP
Chipset drivers NVIDIA 2.03
Memory size 512MB (2 DIMMs)
Memory type Corsair XMS3200 PC2700 DDR SDRAM (333MHz)
Sound nForce2 APU
Graphics card GeForce4 Ti 4200 8X 128MB
GeForce FX 5200 128MB
GeForce FX 5600 256MB
Radeon 9500 Pro 128MB
Radeon 9600 Pro 128MB
Graphics drivers NVIDIA Detonator 43.45 (GeForce cards), ATI CATALYST 3.2 (Radeon cards)
Storage Maxtor DiamondMax Plus D740X 7200RPM ATA/100 hard drive
OS Microsoft Windows XP Professional
OS updates Service Pack 1, DirectX 9.0

Today we’ll be comparing the Asylum GeForce FX 5600 256MB’s performance with a couple of competitors from ATI and NVIDIA. Since there are already a couple of manufacturers making 256MB versions of the GeForce FX 5600, I’ll be discussing the GPU’s performance in more general terms, though BFG Technologies’ card was used throughout testing.

The GeForce FX 5600 should compete primarily with ATI’s Radeon 9600 Pro, so we should pay special attention to those results. It will also be interesting to see how the GeForce FX 5600 stacks up against its budget sibling, the GeForce FX 5200, and NVIDIA’s previous mid-range gaming card, the GeForce4 Ti 4200 8X.

In order to keep a level playing field, image quality-wise, I tested the NVIDIA cards with the “Application” image quality setting. The Radeon cards were tested using ATI’s “Quality” image quality option, which produces visuals roughly equivalent to NVIDIA’s “Application” setting.

The test system’s Windows desktop was set at 1024×768 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Synthetic tests

Fill rate
Peak theoretical fill rates and memory bandwidth don’t necessarily dictate a graphics card’s real-world performance, but they do a good job of setting some initial expectations. How does the GeForce FX 5600 look against the competition we’ve rounded up today?

Core clock (MHz) Pixel pipelines Peak fill rate (Mpixels/s) Texture units per pixel pipeline Peak fill rate (Mtexels/s) Memory clock (MHz) Memory bus width (bits) Peak memory bandwidth (GB/s)
GeForce FX 5200 250 4 1000 1 1000 400 128 6.4
GeForce FX 5600 325 4 1300 1 1300 500 128 8.0
GeForce4 Ti 4200 8X 250 4 1000 2 2000 512 128 8.2
Radeon 9500 Pro 275 8 2200 1 2200 540 128 8.6
Radeon 9600 Pro 400 4 1600 1 1600 600 128 9.6
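
For reference, here is the arithmetic behind those peak numbers in a quick sketch (the memory clocks in the table are already the effective, doubled DDR rates):

```python
# Peak-rate arithmetic behind the table above.  Pixel fill rate scales with
# core clock and pixel pipelines; texel rate adds texture units per pipe;
# memory bandwidth is the effective (DDR) memory clock times the bus width
# in bytes.
def peak_rates(core_mhz, pipes, tex_per_pipe, mem_mhz_effective, bus_bits):
    pixel_rate = core_mhz * pipes                              # Mpixels/s
    texel_rate = pixel_rate * tex_per_pipe                     # Mtexels/s
    bandwidth = mem_mhz_effective * (bus_bits // 8) / 1000.0   # GB/s
    return pixel_rate, texel_rate, bandwidth

# GeForce FX 5600: 325MHz core, 4x1 pipes, 500MHz DDR memory, 128-bit bus
print(peak_rates(325, 4, 1, 500, 128))   # (1300, 1300, 8.0)
# Radeon 9600 Pro: 400MHz core, 4x1 pipes, 600MHz DDR memory, 128-bit bus
print(peak_rates(400, 4, 1, 600, 128))   # (1600, 1600, 9.6)
```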

Competitive—sort of. The GeForce FX 5600’s single- and multi-texturing fill rates aren’t too far behind the Radeon 9600 Pro, and neither is the card’s available memory bandwidth. That the GeForce FX 5600’s multi-texturing fill rate is less than that of the GeForce4 Ti 4200 8X is a bit of a concern, though, especially since the GeForce FX 5600 doesn’t offer more memory bandwidth than the card it’s replacing in NVIDIA’s lineup.

The GeForce FX 5600 may not have as much peak memory bandwidth to play with as the GeForce4 Ti 4200 8X, but more advanced color and Z-compression techniques should help the GeForce FX 5600 make more effective use of what’s available. Of course, the Radeon 9600 Pro also has all sorts of color and Z-compression routines, so the GeForce FX 5600 will still be an underdog.

But enough with the theoretical peaks; what kind of fill rates can the GeForce FX 5600 muster in the real world?

Mediocre ones, I’m afraid. The GeForce FX 5600 pumps out more pixels and textures per second than the GeForce FX 5200, but that’s about it. The GeForce4 Ti 4200 8X and both Radeons are easily out ahead of the GeForce FX 5600, especially in 3DMark2001 SE’s multi-texturing fill rate test.

If the GeForce FX 5600’s fill rates aren’t all that impressive when compared with the competition, how do they look when compared with the card’s peak theoretical capabilities?

None of the cards we’re looking at today realize their entire peak theoretical single-texturing fill rate, so the GeForce FX 5600 doesn’t look too bad. However, when we look at multi-texturing performance, the GeForce FX 5600 and 5200 are the only two cards to exhibit real-world fill rates that are significantly lower than their theoretical peaks. The GeForce FX 5600 realizes only 84% of its peak theoretical multi-texturing fill rate, which looks particularly bad when the GeForce4 Ti 4200 8X and both Radeons realize nearly all of their multi-texturing fill rate potential.

Of course, these are just synthetic tests. The real proof will come at high resolutions in actual games.

Occlusion detection
Like ATI’s Radeon 9500 and 9600 Pro, the GeForce FX 5600 uses lossless algorithms to compress color and Z data and make more effective use of available memory bandwidth. ATI’s R3x0-derived graphics cards also use “Early Z” occlusion detection to all but eliminate overdraw, letting the chip dedicate all of its resources to rendering pixels that will actually be seen. We believe NVIDIA is using a similar “Early Z” technology to help eliminate overdraw with the GeForce FX, but we don’t know many specifics about it.

The GeForce FX 5600’s performance in VillageMark isn’t impressive at all. The card only just stays in front of the budget GeForce FX 5200, which puts it well behind the Radeons. Even with anisotropic filtering and antialiasing enabled, the GeForce FX 5600 only manages to pull ahead of the GeForce4 Ti 4200 8X.

Pixel shaders
The GeForce FX 5600 supposedly has more parallelism within its pixel shaders than the GeForce FX 5200. Let’s see how that theory works out in practice.

The GeForce FX 5600’s performance in 3DMark2001 SE’s pixel shader test is respectable, but it’s really hurting in the advanced pixel shader test, where the card is barely faster than a GeForce FX 5200.

Things don’t get much better in NVIDIA’s own ChameleonMark. The GeForce FX 5600 is faster than the sub-$100 GeForce FX 5200, but it can’t outrun the previous-gen GeForce4 Ti 4200 8X.

In 3DMark03’s pixel shader 2.0 test, which is our only true DirectX 9-class pixel shader test, the GeForce FX 5600 sits between the Radeons and the GeForce FX 5200. Unfortunately, the GeForce FX 5600 still looks like a rather poor performer with next-generation shaders. (The GeForce4 Ti 4200 8X doesn’t support pixel shaders 2.0, so it’s riding the pine for this test.)

Vertex shaders
As with its pixel shaders, the GeForce FX 5600’s vertex shaders are all about levels of parallelism. How do they perform?

Not too well. The GeForce FX 5600’s scores are simply too close to the GeForce FX 5200’s to justify the former’s price premium. In 3DMark2001 SE and 3DMark03, the GeForce FX 5600 is outclassed by the Radeons, which offer more than twice as many frames per second overall.

3DMark2001 SE’s transform and lighting tests are running as vertex shader programs on the GeForce FX 5600, and it’s not pretty. The GeForce FX 5600 is consistently faster than the GeForce FX 5200, but not by nearly enough, especially with the Radeons so far out ahead.

Games
The GeForce FX 5600 hasn’t looked impressive in synthetic feature tests, but what about games people actually play?

Quake III Arena

The GeForce FX 5600 performs reasonably well in Quake III Arena, but really doesn’t stretch its legs until anisotropic filtering and antialiasing are enabled. Normally, 30 frames per second at 1600×1200 with 8X anisotropic filtering and 4X antialiasing would be impressive, but with the Radeons running at over 50 frames per second with the same settings, the GeForce FX 5600’s performance looks a little underwhelming.

Jedi Knight II

In Jedi Knight II, the GeForce FX 5600 is faster than the GeForce4 Ti 4200 8X with 8X anisotropic filtering and 4X antialiasing, but it never quite catches the Radeons.

Comanche 4

The GeForce FX 5600 is way off the pace in Comanche with anisotropic filtering and antialiasing disabled. With 8X anisotropic filtering and 4X antialiasing, it’s just able to pull even with the GeForce4 Ti 4200 8X, but the Radeons are still way out ahead.

Codecreatures Benchmark Pro

The GeForce FX 5600’s performance in Codecreatures isn’t impressive until anisotropic filtering and antialiasing are enabled. Even then, it’s not faster than the Radeons, or even the GeForce4 Ti 4200 8X.

Unreal Tournament 2003

In our low- and high-detail Unreal Tournament 2003 tests, the GeForce FX 5600 performs more like a budget GeForce FX 5200 than it does a replacement for the GeForce4 Ti 4200 8X.

With anisotropic filtering and antialiasing enabled, the GeForce FX 5600 pulls ahead of the GeForce4 Ti 4200 8X, but is still handily beaten by the Radeons.

Serious Sam SE
We used Serious Sam SE’s “Extreme quality” add-on for our tests, which automatically uses the highest level of anisotropic filtering available on each card. For these tests, the Radeons are using 16X anisotropic filtering, while the GeForce cards are using 8X anisotropic filtering.

When we look at average frame rates in Serious Sam SE, the GeForce FX 5600’s performance actually looks decent; it’s just behind the Radeon 9500 Pro. How do frame rates look over the length of the demo?

Pretty good. The GeForce FX 5600 doesn’t suffer the same stuttering at the start of the benchmark as the GeForce FX 5200, and it doesn’t exhibit any unusual peaks or dips in performance that could potentially ruin gameplay.

Let’s crank up the antialiasing options and see what happens.

With 4X antialiasing and 8X anisotropic filtering, the GeForce FX 5600 distances itself from the GeForce FX 5200 and pulls ahead of the GeForce4 Ti 4200 8X, but it’s still behind the Radeons.

Over the length of the benchmark demo, the GeForce FX 5600’s performance is closest to the GeForce4 Ti 4200 8X. Fortunately, the GeForce FX 5200’s issues at the start of the demo don’t plague the GeForce FX 5600.

Splinter Cell
Splinter Cell is a new addition to our benchmark suite, and I can’t think of a better way to break it in.

With antialiasing and anisotropic filtering disabled, the GeForce FX 5600 is barely faster than the GeForce FX 5200, and well off the pace set by ATI’s mid-range Radeons.

Since Splinter Cell dumps frame rate information for the entire benchmark demo into an Excel file, we can bust out some nifty “frame rate over time” graphs.
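
For anyone who wants to build similar graphs, here is a minimal sketch of the processing involved, assuming the per-frame data has been exported to a CSV file; the file name and column names below are hypothetical stand-ins for whatever the benchmark actually writes out.

```python
# Minimal sketch of turning a per-frame dump into a frame-rate-over-time graph.
# The file name and column layout are hypothetical; adjust them to match the
# benchmark's actual output.
import csv
import matplotlib.pyplot as plt

times, fps = [], []
with open("splintercell_frames.csv", newline="") as f:
    for row in csv.DictReader(f):
        t = float(row["time_s"])              # elapsed time at this frame
        dt = float(row["frame_ms"]) / 1000.0  # duration of this frame
        times.append(t)
        fps.append(1.0 / dt if dt > 0 else 0.0)

plt.plot(times, fps)
plt.xlabel("Time (s)")
plt.ylabel("Frames per second")
plt.title("Splinter Cell: frame rate over time")
plt.show()
```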

Performance is all over the map in Splinter Cell, even for the Radeons. The GeForce FX 5600’s performance isn’t quite as erratic as some of the other cards, but that’s mostly because its peak frame rates are much lower than the competition. Unfortunately, the troughs are quite a bit lower, too.

With anisotropic filtering and antialiasing enabled, the GeForce FX 5600’s performance scales well. For some reason, the Radeons don’t agree with Splinter Cell at 1600×1200 with anisotropic filtering and antialiasing enabled.

The GeForce FX 5600 is more competitive with anisotropic filtering and antialiasing enabled. However, since frame rates are so low overall, I wouldn’t recommend playing the game on any of these cards with both anisotropic filtering and antialiasing enabled.

3DMark2001 SE

NVIDIA hasn’t expressed concerns about the viability of 3DMark2001 SE, but maybe it should. Here the GeForce FX 5600 is way off the pace.

The GeForce FX 5600 is a comparatively poor performer in each of 3DMark2001 SE’s game tests, even when compared with the GeForce4 Ti 4200 8X it’s supposed to replace. Even in the shader-laden Nature test, the GeForce FX 5600 doesn’t have what it takes to pump out competitive frame rates.

3DMark03

The GeForce FX 5600’s standing improves in the DirectX 9-fortified 3DMark03, as the card clearly outpaces the older GeForce4 Ti 4200 8X.

The GeForce FX 5600’s performance in 3DMark03’s individual game tests is competitive with the GeForce4 Ti 4200 8X, but the Radeons are consistently out in front. In the DirectX 9 “Mother Nature” test, the GeForce FX 5600 is only half as fast as the Radeon 9500 Pro.

3DMark03 image quality
Because there’s more to a graphics card than performance, I’ve yanked a couple of 3DMark03 screen shots to examine the GeForce FX 5600’s image quality with anisotropic filtering and antialiasing disabled. Below are screenshots from the GeForce FX 5600, Radeon 9600 Pro, and DirectX 9’s reference software renderer. You can click on the images to get full-size versions of each screenshot.


DirectX 9’s software renderer


ATI’s Radeon 9600 Pro


NVIDIA’s GeForce FX 5600

The differences are subtle, but it’s clear that the GeForce FX 5600’s screen shot isn’t as close to the reference renderer as the Radeon 9600 Pro’s. The GeForce FX 5600’s dull sky, which looks decidedly less “cinematic,” is the dead giveaway.

SPECviewperf
SPECviewperf is really a workstation benchmark, but I’ve included it here for variety. After all, not all consumer graphics cards end up playing only games.

Comparatively, the GeForce FX 5600 performs much better in SPECviewperf than it does in our gaming tests.

Antialiasing
Next, we’ll isolate the GeForce FX 5600’s performance in different antialiasing and anisotropic filtering modes. We’ve already had a glimpse of the card’s performance with 4X antialiasing and 8X anisotropic filtering in our game tests, but this section will provide a more thorough look at the card’s performance with these image quality features. Since the GeForce FX 5600 doesn’t support 6X antialiasing using the same algorithm it uses at other sample sizes, I’ve included results for its 6XS antialiasing mode.

Edge antialiasing

At some resolutions, the GeForce FX 5600 is capable of 16X antialiasing, which is impressive in itself. What’s even more impressive is how the card’s performance trails off only slightly at higher antialiasing levels. That’s probably the NV31’s color compression engine at work. Color compression, naturally, becomes more effective at higher sampling rates.

Unfortunately, only at 1600×1200 does the GeForce FX 5600 come within striking distance of the nearest Radeon. The Radeons only support antialiasing up to 6X, but their performance up to that point is better than the GeForce FX 5600’s.

Antialiasing quality: Radeon 9600 Pro
How good does the GeForce FX 5600’s 16X antialiasing actually look? First let’s check out the competition.


Radeon 9600 Pro: No antialiasing


Radeon 9600 Pro: 2x antialiasing


Radeon 9600 Pro: 4x antialiasing


Radeon 9600 Pro: 6x antialiasing

Looks good, no? Let’s quickly examine why.

Like the other members of ATI’s R3x0-derived line, the Radeon 9600 Pro gamma corrects its blends of sampled pixel fragments, so the output looks correct on PC monitors. A PC monitor doesn’t relate signal voltage to pixel brightness linearly; brightness follows a power (gamma) curve. Most graphics cards blend their antialiasing samples as if that relationship were linear, so blended edge pixels typically display darker than the intended halfway shades. By gamma correcting the blended color values to compensate for the monitor’s response curve, the Radeon produces edge gradients whose brightness steps actually look even on screen.
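
Here’s a quick numeric sketch of why that matters, assuming a typical display gamma of about 2.2: averaging two edge samples in the monitor’s nonlinear space produces a value that displays darker than the intended halfway shade, while blending in linear light and converting back does not.

```python
# Blending two antialiasing samples -- one white, one black -- naively versus
# with gamma correction (display gamma assumed to be roughly 2.2).
GAMMA = 2.2

def naive_blend(a, b):
    return (a + b) / 2                        # average in the display's nonlinear space

def gamma_correct_blend(a, b):
    lin = (a ** GAMMA + b ** GAMMA) / 2       # convert to linear light, then average
    return lin ** (1 / GAMMA)                 # convert back for the display

print(naive_blend(1.0, 0.0))          # 0.5   -> displays much darker than half brightness
print(gamma_correct_blend(1.0, 0.0))  # ~0.73 -> displays as roughly half brightness
```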

Also, in 2X and 4X AA modes, the Radeon 9600 Pro uses a rotated grid sampling pattern. With 6X antialiasing, the Radeon 9600 Pro uses a more complex, quasi-random sampling pattern to fool the human eye’s pattern recognition prowess and produce impressively clean edges.

Antialiasing quality: GeForce FX 5600


GeForce FX 5600: No antialiasing


GeForce FX 5600: 2x antialiasing


GeForce FX 5600: 4x antialiasing


GeForce FX 5600: 6xS antialiasing


GeForce FX 5600: 8x antialiasing


GeForce FX 5600: 16x antialiasing

Even with 16X antialiasing, the smoothed edges produced by the GeForce FX 5600’s multisampled antialiasing don’t look significantly better than 6X antialiasing on the Radeon 9600 Pro. Why? Partly because the GeForce FX 5600 isn’t gamma correcting any of its fragment blend operations, so the AA pixel values don’t look as true as they could. Even with larger sample sizes, the GeForce FX 5600 can’t overcome the image quality deficit versus the Radeon 9600 Pro.

Also, the NV31 chip is using a more uniform (and thus easier for the eye to detect) ordered grid sampling pattern. The GeForce FX 5600 does have 4XS and 6XS antialiasing modes that combine multisampling with supersampling to provide a little texture antialiasing. These “XS” modes also use a rotated grid sampling pattern. However, as you can see from the 6XS antialiasing screenshot, the results aren’t anything to write home about. Of course, sample patterns will probably have more impact in motion than in a still shot.
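
To make the ordered-grid versus rotated-grid distinction concrete, here are illustrative 4X sample positions within a single pixel. The exact offsets these chips use aren’t public, so the numbers below are textbook examples rather than NV31’s or R300’s actual patterns.

```python
# Illustrative 4x sample positions inside a single pixel (0..1 on each axis).
# These are textbook example offsets, not the actual NV31 or R300 patterns.
ordered_grid = [(0.25, 0.25), (0.75, 0.25),
                (0.25, 0.75), (0.75, 0.75)]     # rows and columns line up

rotated_grid = [(0.375, 0.125), (0.875, 0.375),
                (0.125, 0.625), (0.625, 0.875)]  # every sample has a unique X and Y

# On a near-vertical edge, the ordered grid's samples collapse onto only two
# distinct X positions (two coverage steps); the rotated grid's four distinct
# X positions yield four steps, i.e. smoother-looking edges.
print(sorted({x for x, _ in ordered_grid}))   # [0.25, 0.75]
print(sorted({x for x, _ in rotated_grid}))   # [0.125, 0.375, 0.625, 0.875]
```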

Texture antialiasing
To measure texture antialiasing, I used Serious Sam SE with various texture filtering settings.

The GeForce FX 5600’s performance closely mirrors that of the GeForce FX 5200 and is only a few frames per second faster than the budget DirectX 9 card. The Radeons walk away with this one.

All of our testing was done with the GeForce FX 5600’s “Application” image quality setting. Here’s why:


GeForce FX 5600: Standard trilinear + bilinear filtering


GeForce FX 5600: “Performance” 8X anisotropic filtering


GeForce FX 5600: “Quality” 8X anisotropic filtering


GeForce FX 5600: “Application” 8X anisotropic filtering

As we’ve seen with the GeForce FX 5200 and 5800 chips, the “Application” quality setting produces sharper textures than NVIDIA’s “Performance” and “Quality” options.

Anisotropic filtering quality: Radeon 9600 Pro
For comparison, here’s the Radeon 9600 Pro with ATI’s “Performance” and “Quality” 8X anisotropic filtering.


Radeon 9600 Pro: Standard trilinear + bilinear filtering


Radeon 9600 Pro: “Performance” 8X anisotropic filtering


Radeon 9600 Pro: “Quality” 8X anisotropic filtering

ATI’s “Quality” setting yields sharp textures comparable to NVIDIA’s “Application” image quality setting.

Texture filtering and mip map levels: GeForce FX 5600
Now let’s see exactly what NVIDIA is doing to texture filtering with these various quality slider settings. I’ve used Q3A’s “r_colormiplevels 1” command to expose the various mip-maps in use and the transitions between them.


GeForce FX 5600: bilinear + trilinear filtering


GeForce FX 5600: “Performance” 8X anisotropic filtering


GeForce FX 5600: “Quality” 8X anisotropic filtering


GeForce FX 5600: “Application” 8X anisotropic filtering

The mip-map transitions of the GeForce FX 5600’s “Performance” and “Quality” image quality settings aren’t as smooth as the mip-map transitions of the “Application” image quality setting.

Texture filtering and mip map levels: Radeon 9600 Pro


Radeon 9600 Pro: bilinear + trilinear filtering


Radeon 9600 Pro: “Performance” 8X anisotropic filtering


Radeon 9600 Pro: “Quality” 8X anisotropic filtering

As you can see, the mip-map transitions of ATI’s “Quality” image quality setting are just as smooth as NVIDIA’s “Application” setting.

Conclusions
First, let’s look at NVIDIA’s GeForce FX 5600 in general, then we’ll address BFG Tech’s particular implementation of it.

As a mid-range graphics product, the GeForce FX 5600 has a lot going for it. For as low as $169 online (for the 128MB version), the GeForce FX 5600 is capable of bringing a DirectX 9 feature set to mainstream gamers and mid-range markets. On its own, the GeForce FX 5600 is a solid product that’s even capable of running the “Ultra” version of NVIDIA’s sexy Dawn demo at what looks like about 25 frames per second. That’s definitely better than the slideshow that NVIDIA’s budget GeForce FX 5200 produces with Dawn Ultra, and it gives the GeForce FX 5600 a more legitimate claim to DirectX 9 functionality than its low-end cousin.

The GeForce FX 5600, however, does not exist in a vacuum. ATI’s Radeon 9500 Pro is still available online for as low as $160 and its replacement, the Radeon 9600 Pro, is just now becoming available for under $195. The Radeon 9500 Pro likely won’t be available for long, but with the impressive Radeon 9600 Pro taking its place, the GeForce FX 5600 faces stiff competition regardless.

Performance-wise, the Radeon 9500 and 9600 Pro lay down a beating on the GeForce FX 5600, even when the latter is equipped with 256MB of memory. As far as features go, the Radeons are lacking in few practical ways. The Radeon 9500 and 9600 Pro only support pixel shader programs up to 64 instructions in length (the limit of the pixel shader 2.0 specification), and they only offer 96-bit pixel shader precision rather than the 128-bit precision available to the GeForce FX 5600. However, those “onlys” have very little practical value. After all, it will be some time before games and applications start taking advantage of pixel and vertex shader versions 2.0 with any regularity, and the 3DMark03 image quality tests suggest that the GeForce FX 5600 doesn’t have the horsepower to take full advantage of the extra pixel shader precision it has available.

Really, about the only area where the GeForce FX 5600 has a practical advantage over ATI’s mid-range Radeons is in multimonitor software. NVIDIA’s nView software is far superior to ATI’s Hydravision, which could make the GeForce FX 5600 more appealing to multimonitor fans. Otherwise, I can’t recommend the GeForce FX 5600 with a clear conscience; ATI’s mid-range Radeons are faster, priced in the same class, and have at least comparable—if not better—image quality. Those only concerned with multimonitor software can always opt for a much cheaper GeForce FX 5200, which also supports nView.

The fact that BFG Technologies’ Asylum GeForce FX 5600 256MB is based around the GeForce FX 5600 puts the card at a disadvantage right off the bat. Simply adding extra memory and VIVO ports to the card doesn’t make the GeForce FX 5600 GPU competitive with the Radeon 9500 or 9600 Pro. At $249 on Pricewatch, the Asylum GeForce FX 5600 256MB is $50 more than the Radeon 9600 Pro and nearly $80 more than the cheapest 128MB GeForce FX 5600. I’m all for the VIVO port, which is worth a bit of a premium, but the extra memory probably isn’t—not for most of today’s games. Although 256MB of memory appears to be useful for antialiasing at high resolutions with high detail textures, the GeForce FX 5600 doesn’t have the horsepower to produce playable frame rates in those situations.

With competition like the Radeon 9500 and 9600 Pro, there’s little reason for gamers, enthusiasts, or even casual consumers to buy a graphics card based on the GeForce FX 5600 today. Newer, more mature drivers from NVIDIA could improve the GeForce FX 5600’s performance, but there are never any guarantees in that department. Considering that GeForce FX 5600 cards rigged with 128MB of memory are available for around $169 online, the Asylum GeForce FX 5600 256MB isn’t even an especially attractive offering among GeForce FX 5600-based graphics cards (unless you have a particularly memory-intensive graphics application in mind for it). If the price on this Asylum card were to drop by a fair amount, the card’s extra RAM and VIVO capabilities could make it more appealing than GeForce FX 5600 cards from other manufacturers. At least for now, though, any GeForce FX 5600 will be comparatively slow versus the mid-range Radeon cards.

Comments closed
    • lufulu
    • 14 years ago

    I have a BFG 256MB Geforce FX 5600 with new drivers. I have had the GPU overclocked to 400MHz and the RAM overclocked to 585MHz for over 2 years (that’s well over what you said it could do). It is August 2005 and I have not come across a game that I cannot run on high detail at 1280X1024 (except Doom3, I have to drop it to 1024X768). Even at high resolutions, I get 30 to 50 FPS on the latest games (Doom3, Half Life 2, GTA: San Andreas, etc…). I think the newer drivers made a big difference too. My point is, with a little overclocking and a couple extra fans, this card smoked the other cards in your tests. Considering it is 256MB, and it can play Doom3 at High detail (compared to medium detail on a 128MB card), it is an excellent buy.

    PS: Your test was biased. You should’ve put the 128MB Ultra 5600 (with faster RAM than the 256MB) against the 128MB Pro 9600 featured in your test. This would have definitely evened the playing field, and the benchmarks!!!!!!!!!!!!

    • Anonymous
    • 16 years ago

    I am astonished by your report and by the way that you completely slagged the geforce fx 5600 256mb.

    Surely that 256MB of RAM accounts for something.

    I think you guys are completely biased towards ATI graphics cards, and you talk about the geforce fx 5600 256mb as if it can’t even play the latest games. I am sure it can, because I have even seen a geforce fx 5200 (equipped with only 128MB) smoothly play Halo and the likes of modern day games.

    Despite your (unfair) report I will proceed in purchasing a geforce fx 5600 256mb.

      • Anonymous
      • 16 years ago

      ur an extremely lanky boy

    • Anonymous
    • 16 years ago

    I have run both the FX 5600 256MB card from PNY and a BFG Ti4200 64MB 4X AGP against each other, and I was shocked: the BFG 4200 was kicking the 8X FX 5600 to the curb. I was getting an average of 15% better performance with the BFG 4200. That is just sad. FX 5600 lost out in my book….. I hope nVidia gets the FX 5900 right….

    • Anonymous
    • 16 years ago

    Your review is just side-splitting, it’s so funny; even my old GF4 MX440 beat that FX5200 in 3dmark2001se using an old athlon xp 1700. Oh hang on, even my FX5200 beat your scores using my old athlon xp 1700 with a rubbish SIS745 motherboard and the 43.51 drivers. 🙂

    1. Are you aware of the well-known problems with the 43.45 drivers?

    2. Why didn’t you use the 43.51 WHQL-certified drivers, which don’t have the problems the 43.45 drivers have, for a fair review? At least then the ATI and the Nvidia video cards could all be using WHQL drivers.

    3. Do you think it’s fair to compare cheaper non-Pro video cards to more expensive Pro video cards, and if so, why didn’t you compare the Ultra versions of the Geforce FX line to non-Pro versions of the ATI products instead of the other way around?

      • Dissonance
      • 16 years ago

      I’m quite aware that the 43.45 drivers are the only official Detonators that NVIDIA supports for the GeForce FX. When NVIDIA releases new, officially supported drivers, we’ll start using them. The ATI and NVIDIA cards are both using the latest publicly available, officially supported drivers in this review, which is fair.

      As for comparing the non-Ultra FX 5200 with a “Pro” Radeon, surely you jest. The 64MB Radeon 9000 Pro is available for as low as $75 online. Considering that the GeForce FX 5200 we used featured 128MB of memory and retails for closer to $100, it’s surely a fair comparison. Just because the marketing droids say that an “Ultra” should go up against a “Pro” doesn’t mean that the price of each card is actually comparable in the real world.

        • Anonymous
        • 16 years ago

        Good answer, the 43.45 situation is just stupid. Nvidia know the 43.45 drivers have major problems which the 43.51 drivers fix but they don’t have them on their website. IMO they are probably waiting for the Detonator FX (which I’ve heard are very fast 100% Geforce FX optimised drivers, about time too!) drivers to be fully tested before releasing any new drivers.

          • Anonymous
          • 16 years ago

          I’ve just finished doing some more testing and I can’t get my xfx geforce fx5200 to perform as low as yours did even though your using a much more powerful computer.

          (all 1024x768x32bit scores)

          my 3dmark2001se score=6552 your score=5040
          my 3dmark03 score=1346 your score=1190

          Let’s compare technical specs:-

          my cpu=athlon xp 1700 yours=athlon xp 2600
          my motherboard=asus sis745 yours=asus nforce2
          my memory=512mb pc2100 yours=512mb pc2700
          my vid card=128mb fx5200 yours=128mb fx5200
          my drivers=v43.51 (quality) yours=v43.45 (application)

          Do you honestly think my system should be beating yours because I don’t?

          The 43.45 drivers, when used in application mode, result in a huge performance hit because of a bug, which makes any fair comparison impossible.

          At least try out the 43.51 drivers; you can get them from http://download.guru3d.com/. You will see that the quality setting with the 43.51 drivers has the same image quality as the application setting with the 43.45 drivers, but without the huge performance hit that is making the geforce fx cards look so bad in your review.

            • Dissonance
            • 16 years ago

            When NVIDIA releases a new set of officially supported Detonator drivers, we’ll start using them for testing. If there’s a bug in NVIDIA’s current drivers, it’s their responsibility to resolve the issue with a fully supported driver, not some leaked, unsupported beta that may or may not break compatibility or performance in other areas.

            • Anonymous
            • 16 years ago

            Nvidia has just released the official v44.03 drivers on their website, are you going to retest the fx5600 and fx5200 in this article with the 44.03 drivers? I personally think you should because the results you got with the 43.45 drivers are bogus and are worthless.

        • Ardrid
        • 16 years ago

        I can understand the comparison based on price, but I still would’ve liked to see the 5200 Ultra go up against the 9000 Pro. I’m also wondering how the 9200 Pro will be priced, since that’s the obvious target of the 5200 Ultra. And I’m pretty sure the 9000 is going to be phased out much like the 9500, although I could be wrong on that.

          • Dissonance
          • 16 years ago

          When I have a GeForce FX 5200 Ultra and Radeon 92xx, you’ll see the results included. I can only work with what I have.

      • YeuEmMaiMai
      • 16 years ago

      funny that my 3dmark 2k3 score is 3980 with my 9500 Pro…. Quake III flies UT 2K3 flies

      using a “crappy” SiS k7S6A SiS 745 chipset with Athlon XP at 143Mhz FSB ehehe

      GOTTA LOVE THE 9500 PRO as it really hauls BUTT

    • muyuubyou
    • 16 years ago

    Guys you’re exaggerating this a lot. ATI was under Nvidia until quite recently, and it looked like ATI was dissapearing. Now it’s the other way around.

    A couple of months in the 2nd place isn’t going to send a company to the morgue. Not even in the 3rd place. Matrox’s shares look OK. The world is big enough for 4 or 5 companies.

    They’ll eventually be competitive again. Globalization has gone nuts, but there’s still hope.

    • Anonymous
    • 16 years ago

    WTF has happened to Nvidia? I have a hard time believing that throughout the entire FX R&D cycle no one realized their new baby was crap. Didn’t they at least compare it to their previous generation?

    I’ve always had a place for Nvidia – even before the Riva’s, but it’s hard as hell to envision purchasing anything in this line. It’s this whole FX stigma… entirely negative (the dying days of 3dfx come to mind.)

    • Captain Ned
    • 16 years ago

    My MSI Ti4200/128 O/C’d to 305/577 is looking like my smartest purchase when I built my box.

    • Anonymous
    • 16 years ago

    q[

      • Buub
      • 16 years ago

      Right, fanboi, ATI is better at everything than nVidia. nVidia can’t get a single thing better than ATI. Thanks for the info…

      I prefer balanced objective arguments, k thx.

      • Anonymous
      • 16 years ago

      I admit to being an Nvidia fan, although I also acknowledge that nv3x is a bit of a colossal failure, at least so far, and that R300 is quite the opposite.

      Ok, so they’re not completely dead angles, just almost dead. When you ask for 16x and get 2x at a certain angle, that’s a weak angle.

      I have a good understanding of what anisotropic filtering is, what it’s for, and how it works.
      Anisotropy can occur whenever the texture coords are stretched nonuniformly relative to the pixels, which is independent of what angle it’s displayed at. 16 maximum is nice, but really it’s worst case that counts most, and nvidia wins here with no dead angles, and no weak angles.

      Point stands about the 5600 ultra/128MB and its 700MHz RAM. I expect it will usually be faster than the 9600 pro in the high AA modes (although R3x0 has better AA quality).

    • Anonymous
    • 16 years ago

    q[

    • EasyRhino
    • 16 years ago

    1) Looking at some of the synthetic vertex and pixel shader tests, it’s interesting to note that in many (but not all) cases, the 5600 beats the 5200 by more than 30%, which is the clock difference. So there might be something to that “more parallelism” nonsense.

    2) I really think the 5600 isn’t placed well in the card hierarchy. It just isn’t fast enough over the 5200. I blame the 4×1 pipes. Maybe 4×2 and a slow clock would work better. Or maybe there’s the slightest hope for a 5600 Ultra. Course, the card is also slow, which doesn’t help. 🙂

      • EasyRhino
      • 16 years ago

      Oh, and good writeup Diss.

      The Sphincter cell detained framerates were a little hard to read though, mainly due to line overlap. Mayhaps if some of the data points were removed from the graph it would be easier?

    • crazybus
    • 16 years ago

    Guess what? My 8500LE is faster than this card. That is just sad.

    • Anonymous
    • 16 years ago

    #54
    The 9200/9200 Pro is faster than or as fast as the 9000/9000 Pro…the cost cutting was already done in going from the 8500->9000. Its introduction won’t change the performance picture much, and apparently in the opposite direction than you indicate (the 9200 Pro is /[

    • Anonymous
    • 16 years ago

    Well, by the time all GeForce FX versions (5200, 5600, 5800) are widely available, they will compete with the new Radeons (9200, 9600, 9800) – of which, Radeon 9200 and 9600 perform worse than the older model Radeon 9000 and 9500.

    Therefore what I would like to see is a comparison between:
    – Fanless video card (sub $100): GeForce FX 5200 vs. Radeon 9200.
    – Midrange video card (sub $200): GeForce FX 5600 ultra vs. Radeon 9600 pro
    – High end video card: GeForce FX 5800 ultra vs. Radeon 9800 pro

    Although the 5200 and 5600 performance seems lacking now, they will look just fine once the Radeon 9000 and 9500 are replaced by the 9200 and 9600. You see, the name of the game right now is how to cut costs as much as possible while still producing a reasonable video card for the given price range. Reasonable does not mean it has to top the performance of last year’s model, or an earlier model this year for that matter.

    • Anonymous
    • 16 years ago

    You got posted on Warp2Search… It’s not as good as slashdot but it should help you with your hits.

    • absinthexl
    • 16 years ago

    Wow. My 9500 Pro cost about the same as the 128MB version.

    No looking back…

    • Anonymous
    • 16 years ago

    Why are you comparing a 5600 non-ultra with a 9600/9500 pro? It would have been wise to do it with the 9500/9600 non-pro.

    Good article otherwise.

      • Dissonance
      • 16 years ago

      It would be inappropriate to compare graphics cards based only on their name, so it doesn’t really matter if we’re comparing Pros with non-Ultras, or non-Pros with Ultras. If Pro Radeons are available for the same price as non-Ultra GeForce FXs, then the comparison is legitimate. Currently, the Radeon 9500 Pro is cheaper than the least expensive GeForce FX 5600 non-Ultra. The Radeon 9600 Pro is also much cheaper than the BFG card we reviewed.

    • crazybus
    • 16 years ago

    Hey Dissonance, a got a question for you. Do you bench UT2003 with HardOCP’s utility or with a custom ini?

      • Dissonance
      • 16 years ago

      I use the OCP’s benching utility, but it’s hacked up a bit to do things slightly differently in terms of which maps run and what resolutions are used.

    • tinokun
    • 16 years ago

    How come Tech Report always compares ATI’s ‘Pro’ cards to NV’s non-Ultra cards? First 5200 non-Ultra vs 9200 Pro and now 5600 non-Ultra vs 9600 Pro. It would be more useful to see all cards compared; Pro and non-Pro vs Ultra and non-Ultra.

      • crazybus
      • 16 years ago

      9200 where? did I miss something?

      • Dissonance
      • 16 years ago

      It would be inappropriate to compare graphics cards based only on their name, so it doesn’t really matter if we’re comparing Pros with non-Ultras, or non-Pros with Ultras. If Pro Radeons are available for the same price as non-Ultra GeForce FXs, then the comparison is legitimate. Currently, the Radeon 9500 Pro is cheaper than the least expensive GeForce FX 5600 non-Ultra. The Radeon 9600 Pro is also much cheaper than the BFG card we reviewed.

        • absurdity
        • 16 years ago

        I can’t imagine Diss has /[

        • tinokun
        • 16 years ago

        #40, fair enough. However, I would still like to see the performance of a regular 9600, for example, since I can never find any Pro cards where I live 😛

    • FroBozz_Inc
    • 16 years ago

    Geez, I feel a little bit better about my Ti4200/256M card after reading that.
    I was planning on upgrading it this summer, but I think I’ll do the motherboard/CPU first, and then the video card later in the year, once the Doom3 ass-kicking card is apparent.

    Seems to me that if a card owns the D3 engine, it’ll probably have the ballz to run anything else pretty damn good for a while.

    It will be interesting to see how the 128M versions run w/ the faster memory.

      • Coldfirex
      • 16 years ago

      wth made a ti4200 with 256mb of ram?

        • Freon
        • 16 years ago

        I think Gainward did.

          • Pete
          • 16 years ago

          I’ve never heard of a 256MB 4200. (You’d think a card like that would make news, too.) I doubt it exists.

            • FroBozz_Inc
            • 16 years ago

            my bad, it’s a 128Meg card ๐Ÿ˜‰

    • Anonymous
    • 16 years ago

    This article makes the fx5600 look pretty bad.

    Keep in mind, though, that:

    1. This is the 256MB version, and has much slower ram clock than the 128MB version (500MHz instead of 700MHz).

    2. With the fx running in application mode, it gives better aniso quality than the radeon (no dead angles). Hence the low performance.

    I would prefer if the tests where aniso is enabled were run in both application and quality mode.

      • Dissonance
      • 16 years ago

      The GeForce FX 5600 *[

    • Anonymous
    • 16 years ago

    Plain and simple, Nvidia has dropped the ball with the GeForce FX line. The 5200 series might as well be called a GeForce3 Ti200 with DirectX 9 support, and I don’t see the 5600 as having too much of a lifespan from a gaming perspective. The plain 5800 is probably the best bet in the series since the 5800 Ultra is really loud and in short supply. You also have to take Nvidia’s crappy drivers into consideration. A sad commentary considering their technology used to really rock and their drivers used to be totally solid.

    • Zenith
    • 16 years ago

    “As a mid-range graphics product, the GeForce FX 5600 has a lot going for it.”

    I’m a NV fan boy, and I don’t even like this thing! Crap, I don’t see it as having ANYTHING going for it. I’ll give you I like the NV drivers better, but that’s no excuse for being FAR too slow for FAR too high a price. I think they should call the 5600 and 5200 MXs anyway and drop $50 off the price. THEN I might consider it. Maybe.

    • meanfriend
    • 16 years ago

    Good review and ignore the gerbils who cant read a lengthy article before wandering off. (hint: TR has an index on every page so you can jump right to the conclusion if that suits you.)

    A few comments

    The article states that the 5600 128MB will be available for $169? I bought an MSI Ti4200 128MB VIVO last September for about $175, and that is every bit as competitive as this replacement. Long gone are the days when the next product cycle brought undeniable performance gains across the board. The GF4Ti stomped on the GF3, but now the replacement GFFX is about the same?? ATI is also guilty of the same thing: the midrange 9500Pro stomped on the 8500, but now the replacement 9600Pro is comparable…

    Secondly, I find it kind of strange to point out the 5600 beats the 4200 with full AA/AF when the framerates are so damn low in those modes. Really, either of those cards with full AA/AF + any new game just cannot maintain a smooth framerate (just look at the UT2K3 results). Upcoming games are going to be even more demanding so to point out the performance improvement but not mention where it sits on the absolute performance scale is lacking IMHO

    Which brings me to my third point. How on earth does DX9 support lend ANY legitimacy to any product (as stated in the conclusions)? The GF4 is a DX8 part, the 5X00 a DX9 part. I have yet to see any game where DX9 support gives the 5600 any edge at all. To suggest that a 5600 is more ‘future proof’ than a Ti4200 when the performances are currently ~comparable is overstating the matter. By the time a game comes out where the 5600 will have a clear advantage over the 4200 as a result of differing DX support, both those cards will perform like crap and be worth about as much…

      • droopy1592
      • 16 years ago

      It’s still more future proof. If a game requires DX9, there is your answer, regardless of the framerate. One card will run it, the other won’t.

        • muyuubyou
        • 16 years ago

        Yeah DX9 will look very nice with models configured down to 30 polygons 😀 LOL

          • droopy1592
          • 16 years ago

          technically speaking, yeah!

        • meanfriend
        • 16 years ago

        Droopy, of course you are technically right, but I was referring to real world usage of this product. The point of consumer level reviews are with regard to practical applications (with some synthetic benchmarks thrown in to illustrate certain points) so you have to take that into consideration when talking about ‘future-proofing’.

        My original point was that considering DX9 support in a low/mid range product is overstating the importance of future feature sets in this current card. Just looking at the modest Splinter Cell performance, by the time a game *requires* DX9 (and therefore wont run on a GF4) the 5600 will have been long crippled by limitations in memory bandwidth, clock speed, etc and DX9 support will have gained you nothing.

        If someone asked my why a card X might be a better purchase over a Ti4200, I definitely would not offer DX9 as any sort of practical justification unless X has a large enough performance delta to perhaps make DX9 relevant (which the 5600 does not).

        And given that on Pricewatch, a Ti4200 128MB goes for $53 less than a 5600, I’d even call DX9 a red herring used by savvy marketers to prop up their products. And we all know how TR regulars *love* savvy marketing 😉

          • droopy1592
          • 16 years ago

          Personally, I would get a 9500 pro anyway.

            • Neo69
            • 16 years ago

            you can’t really blame ATI there – it’s not really a new generation and they probably weren’t making any money on those 9500’s (which were based on a full 107million transistor R300 core) – the 9600 should be much cheaper to produce.

            although that still’s doesn’t change the fact that something is wrong with their naming scheme if a card with a higher number is slower than the one with the lower one. they should’ve named it 9400, even if it is a newer card (the 9600 ist newer than the 9700 too)

            • droopy1592
            • 16 years ago

            But still, spending the cash, I would get a 9500 pro over a 5600 or a 4200

      • Yahoolian
      • 16 years ago

      9600pro overclocks really well.

      • Buub
      • 16 years ago

      q[

        • Anonymous
        • 16 years ago

        But *[

    • Dually
    • 16 years ago

    Suck, suck, suck….

    I purchased one of these yesterday, just for kicks-see if the FX is all that…

    I think I’ll pack it up and exchange it for a 9700…

    cripes!

      • muyuubyou
      • 16 years ago

      Check dawn naked and take some screen shots before 😀

        • droopy1592
        • 16 years ago

        That’s the only good part about the card. And the fact you don’t need a blower to clean your driveway.

        • Dually
        • 16 years ago

        Nekkid! Where? 😉

        I think this’ll probably be the shortest I’ve ever kept a video card, and I haven’t even installed it. I just wanted something faster than my 4200 that would render SOF2 more correctly than my 9500 Pro did. *sigh* I don’t know what to do!

    • Anonymous
    • 16 years ago

    No insult to the detailed reviews, but 23 pages for a single mid-range graphics card. I couldn’t make it through the whole thing…

    • Anonymous
    • 16 years ago

    Heh of all the cards I saw id prolly pick the 4200;/

    I wasn’t shocked at the results of the NV30 and kin. Nvidia themselves said they weren’t going for performance this gen so much as features. Like with the TNT, the GF1, and the GF3, Nvidia tends to do this same thing over and over. They come out with a starved/underwhelming core with some extra features.. and then vastly improve it.

    I think part of the reason ATI hasn’t managed to profit from their clear victory in performance and price is that everyone has become conditioned to wait and see what the improved versions of cores look like, and always suspects that the first run of a core will suck.

      • Anonymous
      • 16 years ago

      I don’t know about that. Performance? They touted the 5800 Ultra which is their next gen card as the 9700pro killer. So it WOULD have to have performance behind it…..It failed.

    • Hockster
    • 16 years ago

    This card is just disappointing…

    • liquidsquid
    • 16 years ago

    This just confirms that nVidia must have let the “expensive” talent go, and they went… to ATI. Seriously however, ATI management must be jumping for joy at how well their R&D has paid off, when their complete line of cards puts the hurt on nVidia’s line, and seemingly with more class.

    I’m still holding out for BitBoys, so I will stay with my Hercules CGA card for now. I can see 120 characters across my amber screen at once! jk. I am getting the itch for the ATI 9800 Pro.

    • fr0d
    • 16 years ago

    It’s amazing how much hot air is generated by a pointless product cycle. This range of cards from both companies are too fast for the current crop of games and will not be quick enough to satisfy at Doom III.

    The smart money is in the bank for the winning flagship card post- Doom III’s release…..

      • droopy1592
      • 16 years ago

      No pun intended, right?

      • muyuubyou
      • 16 years ago

      Yeah but some people need cards TODAY 🙂. For instance, people upgrading their computer for some other reason (like memory going south, for instance: SDRAM is now more expensive than DDR so buying it is a waste).

      I wouldn’t recommend this to anyone anyway 😉

      • absurdity
      • 16 years ago

      Hmm… can you expand on the too fast for now, and not fast enough for the future? Getting under 20FPS in Splinter Cell with the settings turned up doesn’t seem too fast to me… and a highly unoptimized Doom III alpha was run on a pre-production Radeon 9700 Pro.

    • Khujo
    • 16 years ago

    dual display doesn’t seem like a good reason to purchase this card, i’ve had both nvidia and ati dual cards and i find Windows XP dual monitor support perfectly adequate to run my two monitors and tv at the same time. I’m currently using a 9700 Pro and an All in Wonder VE. I don’t know what nvidia was thinking with the nv3x core but they have dropped the ball big time, slower performance then year and 2 year old cards is not a successful product launch…

      • Yahoolian
      • 16 years ago

      Not all have/use win XP.

    • droopy1592
    • 16 years ago

    It’s funny how ArtX cleans things up for ATi and 3dFX screws things up for nVidia.

    • dannyai
    • 16 years ago

    hey umm
    im pretty sure i’ve seen the screen shots BEFORE????
    did u’s copy or u’s got copied?

    • NeXus 6
    • 16 years ago

    Damn. My GeForce 3 Ti500 even beats it on the 3DMark2001 SE score! I’ll wait for NV35 or NV40, depending on when I decide to upgrade.

    • Anonymous
    • 16 years ago

    Come on, NVidia. How come the 5600 cannot even beat the aging Ti4200? How can you regain leadership when you cannot even top last year midrange model?

      • muyuubyou
      • 16 years ago

        This is quite a Ti4200 + DX9 … and cheaper at launch. I think this does have a market: mid-range buyers who hate ATI’s drivers and want better 3D than Matrox offers…

        • Anonymous
        • 16 years ago

        Only problem is ATI’s MID-RANGE cards are faster and provide better visual quality with stable drivers at the same price. Only Nvidia fan-boys with a budget will buy these cards based on the fx chipset.

    • Anonymous
    • 16 years ago

    On this page 15 (http://www.tech-report.com/reviews/2003q2/geforcefx-5600/index.x?pg=15), the 5200 leads the 5600 in the drv test. Since I didn’t notice a comment on this apparent anomaly, was it just overlooked, or was a mistake made and the results swapped?

    • andurin
    • 16 years ago

    Wow, that was a little reading, Sativa. Anyhoo, I’ll be upgrading sound and HD before vid card, but I was kinda hoping this class from NV would be a better performer. I do have dual monitors, which is why this card was in my eye. Since it will be a while before I switch, hopefully they can get something better out.

    • sativa
    • 16 years ago

    23 pages?? lol it isn’t detailed enough! 😉

    • zurich
    • 16 years ago

    Oh where oh where did my NVIDIA go wrong? 🙁

    The irony of the ‘FX’ tag, crediting the 3dfx engineers for their work on the design, is such bitter irony ..

    • Anonymous
    • 16 years ago

    Ugh… man NV is really taking a beating… good things around the corner for big N according to inq… lets hope so, cause this is getting ugly…

    • wesley96
    • 16 years ago

    Nice review. It just confirms my belief that the offspring cores of NV30 are just not up to the task, though. Better luck on the next core, I guess.
