A peek at the GeForce FX 5200 and 5600


Rolling out DirectX 9 for everyone
— 12:02 PM on March 6, 2003

TODAY NVIDIA UNVEILS its long-awaited NV31 and NV34 graphics chips, which will bring the GeForce FX name down to price points that most of us can afford in what will hopefully be enough volume to meet demand. Though the GeForce FX 5800 Ultra has become everyone's favorite whipping boy over the past few months, the vast majority of the market doesn't buy flagship products like the 5800 Ultra. Instead, they wait for new technologies to trickle down to affordable prices in more mainstream products.

For more than a year, the GeForce4 MX and GeForce4 Ti 4200 chips have occupied the mainstream and performance segments of NVIDIA's product catalog, but they're being put out to pasture in favor of new additions to the GeForce FX line. In 2002, the GeForce4 Ti 4200 received glowing accolades from reviewers and held the PC graphics price/performance crown for the vast majority of the year; many of us will be sad to see it taken out behind the barn. Its replacement, NVIDIA's new GeForce FX 5600 Ultra, has a tough act to follow.

The GeForce4 MX, on the other hand, wasn't favored by many, and its lack of hardware pixel and vertex shaders isn't likely to be missed by anyone. The GeForce4 MX's replacements, the NV34-powered GeForce FX 5200 and 5200 Ultra, have rather small shoes to fill, so it should be easy for them to impress.

What are the capabilities and limitations of NVIDIA's new NV31 and NV34 graphics chips and the new GeForce FX cards they'll be rolling out on? Do these new products share enough technology with NVIDIA's high-end NV30-based products to be worthy of the GeForce FX name, or is NVIDIA still keeping the mainstream a generation behind? Read on to find out.

Cinematic computing across the line
In a bold move that lays to rest NVIDIA's much-criticized "MX" philosophy of introducing new low-end graphics chips a generation behind the rest of its lineup, NVIDIA's new NV31 and NV34 chips both support Microsoft's latest DirectX 9 spec and even offer a little extra functionality beyond DirectX 9's official requirements. Here's a quick rundown of the features shared by NV30, NV31, and NV34.

  • Vertex shader 2.0+ - NV30's support for vertex shader 2.0+ carries over to NV31 and NV34, with all the bells and whistles included. Vertex shader 2.0+ offers some extra functionality that plain vertex shader 2.0 lacks, making NV3x's vertex shaders a little more flexible.

  • Pixel shader 2.0+ - NV31 and NV34 also inherit all the features and functionality of NV30's pixel shader 2.0+ support, which handles more complex pixel shader programs than even Microsoft requires for DirectX 9. In total, NV31 and NV34 support pixel shader programs up to 1024 instructions long. Most of ATI's R300-derived GPUs support pixel shader 2.0, whose maximum program length is only 64 instructions, though ATI's latest Radeon 9800 and 9800 Pro use an "F-buffer" to support shader programs with a theoretically "infinite" number of instructions. At least for now, ATI's "F-buffer" will only be available on high-end graphics cards, which means NVIDIA still has the edge on mainstream cards. (The caps-query sketch after this list shows how an application can read these limits at runtime.)

    Will NV3x's support for pixel shader programs more complex than DirectX 9 requires go unused? Maybe not. In this .plan update, id Software programmer John Carmack acknowledges that he's already hit the R300's limits:

    For developers doing forward looking work, there is a different tradeoff -- the NV30 runs fragment programs much slower, but it has a huge maximum instruction count. I have bumped into program limits on the R300 already.
    Games that don't venture beyond the DirectX 9 spec won't make use of NV3x's support for longer pixel shader programs, but some developers will probably take advantage of extra-long shader programs where they're available.

  • Pixel shader precision - Like NV30, NV31 and NV34 support a maximum internal floating-point precision of 128 bits within their pixel shaders. NV3x can also scale down its pixel shader precision to 16 bits of floating-point color per channel, or 64 bits overall, to yield better performance in situations where 128 bits of internal precision is just too slow.

    Unfortunately, it's hard to compare the NV3x's pixel shader precision directly with the R300's. The R300 supports only one level of floating-point pixel shader precision, which, at 96 bits, falls between NV3x's 64- and 128-bit modes. Based on the results of early reviews, it looks like NV30's performance with 128-bit pixel shader precision is a little slow, but the chip can sacrifice precision to improve performance. (The half-float sketch below this list illustrates how much rounding the 16-bit-per-channel mode introduces.)
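
To make those shader-model numbers a bit more concrete, here's a minimal sketch of how a Direct3D 9 application can query a card's shader support at runtime. This isn't from NVIDIA's materials; it just reads the standard DirectX 9 caps structures, and the values reported will vary from card to card and driver to driver.

    /* Minimal Direct3D 9 caps query (compile as C, link with d3d9.lib). */
    #include <stdio.h>
    #include <windows.h>
    #include <d3d9.h>

    int main(void)
    {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        D3DCAPS9 caps;

        if (d3d == NULL)
            return 1;

        if (SUCCEEDED(IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                               D3DDEVTYPE_HAL, &caps))) {
            /* Shader versions are packed with the major version in
               bits 8-15 and the minor version in bits 0-7. */
            printf("Vertex shader version: %u.%u\n",
                   (unsigned)((caps.VertexShaderVersion >> 8) & 0xFF),
                   (unsigned)(caps.VertexShaderVersion & 0xFF));
            printf("Pixel shader version:  %u.%u\n",
                   (unsigned)((caps.PixelShaderVersion >> 8) & 0xFF),
                   (unsigned)(caps.PixelShaderVersion & 0xFF));

            /* How many instruction slots a 2.0-class pixel shader may
               use on this card; the DirectX 9 baseline guarantees
               only 96 (64 arithmetic plus 32 texture). */
            printf("PS instruction slots:  %lu\n",
                   (unsigned long)caps.PS20Caps.NumInstructionSlots);
        }

        IDirect3D9_Release(d3d);
        return 0;
    }

A DirectX 8-class part like the GeForce4 Ti reports 1.x shader versions here, which is exactly the gap the new chips close.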
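
And here's a rough illustration, again mine rather than NVIDIA's, of what dropping to 16 bits per channel means. A 16-bit half-float keeps one sign bit, five exponent bits, and ten mantissa bits, so the sketch below rounds a 32-bit color value to the nearest half-representable value by discarding the 13 extra mantissa bits. It handles only normal-range values, which is all a color channel in [0, 1] requires.

    /* Quantize a 32-bit float to 16-bit half-float precision (s10e5).
       Normal-range values only; denormals, infinities, and NaNs are
       deliberately ignored for this illustration. */
    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    static float round_to_half(float f)
    {
        uint32_t bits;
        memcpy(&bits, &f, sizeof bits);

        /* A half keeps 10 of the float's 23 mantissa bits; round to
           nearest (ties to even) on the 13 bits being discarded. */
        bits = (bits + 0x0FFF + ((bits >> 13) & 1)) & ~0x1FFFu;

        memcpy(&f, &bits, sizeof f);
        return f;
    }

    int main(void)
    {
        float c = 0.7231f;   /* a color value fp16 can't hold exactly */
        printf("fp32 channel: %.7f\n", c);
        printf("fp16 channel: %.7f\n", round_to_half(c));
        return 0;
    }

Ten mantissa bits still resolve roughly three decimal places per channel, which helps explain why the 64-bit mode is often an acceptable trade for final color output, even though the error can accumulate over a long shader program.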

The above features are all key components of NVIDIA's CineFX engine, which means that NV31 and NV34 are both prepared for the "Dawn of cinematic computing" that NVIDIA has been pushing since its NV30 launch. This catchphrase, of course, refers to really, really pretty visual effects that should be easy for developers to create using the additional flexibility offered by NV3x's support for complex shader programs and high color modes. That's the theory, anyway. NVIDIA will need to equip a large chunk of the market with CineFX-capable products before developers start targeting the technology, but bringing CineFX to the masses is what NV31 and NV34 are all about.