
NVIDIA's GeForce FX 5600 GPU


BFG Tech serves up DX9 down the middle
— 2:07 AM on May 7, 2003

THUS FAR, the performance of NVIDIA's GeForce FX line has left something to be desired. The GeForce FX 5800 Ultra is a noisy alternative to ATI's Radeon 9800 Pro, and the GeForce FX 5200 is a slower performer in real-world games and applications than the Radeon 9000 Pro. For enthusiasts, however, the GeForce FX 5600 may be the most interesting card in the GeForce FX line. Powered by NVIDIA's NV31 graphics chip, the GeForce FX 5600 is a mid-range graphics card aimed at ATI's Radeon 9500 and 9600 lines. Like other members of the GeForce FX family, the GeForce FX 5600 is dressed up with enough DirectX 9 goodies to give users a "cinematic" experience, although features alone don't guarantee performance.

Today we'll be looking at NV31 as implemented in BFG Technologies' Asylum GeForce FX 5600 256MB card. As its name implies, the Asylum GeForce FX 5600 256MB packs 256MB of memory, but the card also has a few other tricks up its sleeve.

The dirt on NV31
Before we consider BFG Technologies' implementation of the GeForce FX 5600, it's worth taking a moment to go over some of the key capabilities of NVIDIA's NV31 graphics chip. I'll just be highlighting NV31's more important features here, but a more detailed analysis of NV31's feature set and how it compares with NV30 and NV34 can be found in my preview of the GeForce FX 5600.

  • Pixel and vertex shader versions 2.0 — The key selling point of the NV31 graphics chip, and indeed of NVIDIA's entire GeForce FX line, is support for DirectX 9's pixel and vertex shader versions 2.0. In fact, NVIDIA takes things a few steps further by supporting longer pixel shader programs than the pixel shader 2.0 spec calls for.

    Like NV30 and NV34, NV31 supports 64- and 128-bit floating-point precision in its pixel shaders. Of course, programs using 64-bit datatypes run faster than those using 128-bit datatypes. NVIDIA claims developers can work more efficiently by declaring variables with only the precision they need, mixing 64-bit and 128-bit processing as required. ATI, by comparison, splits the difference and offers only 96-bit floating-point precision in the R300-series chips' pixel shaders, although the rest of the graphics pipeline offers a range of datatypes, including 64-bit and 128-bit floating-point formats. Both companies' compromises sacrifice some precision for performance; which is the better choice depends on real-world performance and image quality.

  • Clearly defined pipelines — NVIDIA has shrouded much of NV31's internal architecture in mystery, but they have revealed that NV31 has four pixel pipelines, each of which has a single texturing unit. Unlike NV30, which can apparently lay down either four or eight pixels per clock depending on what kind of pixels are being rendered, NV31 lays down four pixels per clock across the board.

  • Obfuscated shader structure — NVIDIA spells out NV31's 4x1-pipe architecture quite clearly, but NV31's shader structure is a closely guarded secret. ATI clearly defines how many pixel and vertex shaders are present in its R3x0 line of graphics chips, but NVIDIA keeps referring to the relative strength of the GeForce FX's pixel and vertex shaders in terms of the level of "parallelism" in the chip's programmable shader. NV30 has more parallelism than NV31, which has more parallelism than NV34, but NVIDIA isn't quantifying anything beyond that.

  • 0.13-micron manufacturing process — Like NV30, NV31 is manufactured using 0.13-micron process technology by the good folks at TSMC. Since NV31 runs at only 325MHz on the GeForce FX 5600, it doesn't need the GeForce FX 5800 Ultra's Dustbuster to keep cool. GeForce FX 5600 cards don't necessarily need to draw juice from an auxiliary power source, either.
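
Incidentally, the 64-bit and 128-bit pixel shader modes mentioned above work out to four 16-bit or four 32-bit floating-point channels per pixel. As a rough illustration of what that precision gap means, here's a sketch using Python's standard struct module to round a value to each per-channel format (this is just an IEEE 754 demonstration, not anything NVIDIA ships):

```python
import struct

def round_fp16(x):
    # Round a Python float to IEEE 754 half precision (1 sign,
    # 5 exponent, 10 mantissa bits) -- the per-channel format of
    # the 64-bit pixel shader mode.
    return struct.unpack('e', struct.pack('e', x))[0]

def round_fp32(x):
    # Round to IEEE 754 single precision -- the per-channel format
    # of the 128-bit mode.
    return struct.unpack('f', struct.pack('f', x))[0]

x = 0.1  # a typical fractional color or texture-coordinate value
print(round_fp16(x))  # -> 0.0999755859375
print(round_fp32(x))  # -> 0.10000000149011612
```

The half-precision result is off by roughly one part in 4,000, while the single-precision result is off by about one part in 67 million. One rounding error of that size is invisible on screen; the question is how quickly errors accumulate across a long shader program.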

One interesting but slightly obscure feature of NV31 is its support for clock throttling in 2D applications. The same "Coolbits" registry hack that reveals NVIDIA's overclocking tab in its Detonator drivers also lets users set the "3D" and "2D" core clock frequencies of the GeForce FX 5600. Lowering the core's clock frequency should make the GeForce FX 5600 run a little cooler, which should help ambient case temperatures. Unfortunately, the GeForce FX 5600's cooling fan speed doesn't seem to throttle down when the card's core clock speed decreases, which means noise levels are consistent regardless of whether a user is running in "2D" or "3D" mode.
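
For the curious, the "Coolbits" hack is usually applied as a registry merge file. The sketch below uses the key path and value commonly cited for Detonator drivers of this era; double-check it against your driver version before merging anything into the registry:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```

After merging the file and reopening the display properties, the driver's clock frequency tab exposes the separate "2D" and "3D" core clock settings described above.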


NV31 in all its glory

Now that we know what's going on with NV31, let's check out BFG Technologies' take on the GeForce FX 5600.