
NVIDIA's GeForce FX 5800 Ultra GPU


What's under the Dustbuster?
— 12:00 AM on April 7, 2003

ARE YOU ready? The era of cinematic graphics is upon us.

Err, well, it will be soon.

Or, uhm, maybe not really, but we can show you how it might have looked.

You see, we have our hot little hands on NVIDIA's NV30 chip, better known as the GeForce FX 5800 Ultra, which was intended to usher in a new era of movie-quality graphics when it debuted sometime last year. Instead, it's months into 2003, and GeForce FX cards still aren't widely available.

Let's just say NVIDIA's had a wee problem with its vaunted execution. That fact hurts even more because NVIDIA's rival, ATI, hasn't. The Radeon 9700 debuted last fall to deservedly positive reviews. The 9700 lived up to the high expectations we'd set for this new generation of DirectX 9-class graphics chips, delivering amazing new shiny objects and lots of rippling glowy things onscreen. Err, delivering true floating-point color formats and amazing amounts of real-time graphics processing power.

But the story of the GeForce FX 5800 Ultra isn't just the story of a late semiconductor. No, it's also the story of an impressively "overclocked" piece of silicon with a massive, Dustbuster-like appendage strapped to its side. Most intriguingly, it's the story of a product that may never reach store shelves in any kind of volume, because all signs point to its premature demise.

That said, the GeForce FX 5800 Ultra is still interesting as heck. I've managed to finagle one for review, so buckle up, and we'll see what this puppy can do.


NVIDIA's tech demo heralds the "Dawn" of cinematic graphics

What's all this talk about cinematic rendering?
Let's start right off with the graphics egghead stuff, so we can fit the GeForce FX into the proper context.

The GeForce FX 5800 Ultra card is powered by NVIDIA's new GPU, widely known during its development by its code name, NV30. The NV30 is NVIDIA's first entry in the generation of graphics chips capable of the whole range of features anticipated by the specifications for Microsoft's DirectX 9 software layer.

The single most important advance in this new generation of graphics chips is a rich, new range of datatypes available to represent graphics data—especially pixels and textures. This capability is the crux of NVIDIA's marketing push for "cinematic graphics." By representing graphics data with more precision, chips like NV30 and ATI's R300 series can render images in real time (or something close to it) that are nearly as compelling as those produced by movie studios. Now, the GeForce FX may not be ready to replace banks and banks of high-powered computers running in professional render farms just yet, but you might be surprised at how close it comes.

I've written more about this new generation of graphics chips and what it means for the world right here. Go read up if you want to understand how such things are possible.

Along with richer datatypes, this new generation of GPUs offers more general programmability, which makes them much more powerful computational devices. Contemporary GPUs have two primary computational units: vertex shaders and pixel shaders. Vertex shaders handle the manipulation and lighting (shading) of sets of coordinates in 3D space. Vertex shader programs can govern the movements of models and objects, creating realistic bouncing and flexing motions as a CG dinosaur tromps across a scene. Pixel shaders, on the other hand, apply lighting and shading algorithms to sets of pixels. Pixel shader programs can produce sophisticated per-pixel lighting and generate mind-bending effects like reflections, refractions, and bump mapping. DX9 pixel and vertex shaders incorporate more CPU-like provisions for program execution, including new vector and scalar instructions and basic flow control for subroutines.
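To make that a little more concrete, here's a minimal sketch of the sort of per-pixel lighting program a DX9-class pixel shader can run, written in HLSL-style syntax. The texture samplers and light parameters are placeholder names of my own invention, not anything pulled from NVIDIA's or ATI's drivers.

    // Sketch of a per-pixel diffuse lighting pass with a tangent-space
    // normal map. Sampler and parameter names are hypothetical.
    sampler2D diffuseMap;   // base color texture
    sampler2D normalMap;    // tangent-space normals, packed into 0..1

    float3 lightDir;        // light direction in tangent space, set by the app
    float3 lightColor;

    float4 main(float2 uv : TEXCOORD0) : COLOR
    {
        // Fetch the surface color and unpack the stored normal.
        float4 base   = tex2D(diffuseMap, uv);
        float3 normal = normalize(tex2D(normalMap, uv).rgb * 2.0 - 1.0);

        // Per-pixel diffuse term, computed in floating point rather than
        // the 8-bit integer math older chips were stuck with.
        float diffuse = saturate(dot(normal, normalize(lightDir)));

        return float4(base.rgb * lightColor * diffuse, base.a);
    }

Every pixel on a bump-mapped surface runs through a little program like this one, which is exactly the kind of work older, fixed-function hardware could only approximate.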

Of course, DirectX 9-class pixel shaders include support for floating-point datatypes, as well.

By offering more flexibility and power to manipulate data, vertex and pixel shaders amplify the graphics power of a GPU. Some fairly recent theoretical insights in graphics have shown us that traditional, less-programmable graphics chips could render just about any scene, given rich datatypes and enough rendering passes. This realization has led to the development of high-level shading languages like Microsoft's HLSL and NVIDIA's Cg. These shading languages can compile complex graphics operations into multiple rendering passes using DirectX or OpenGL instructions. Programmable hardware shaders can produce equivalent results in fewer rendering passes, bringing the horizon for real-time cinematic rendering closer.
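To give you a feel for what those languages buy you, here's a rough sketch, again in HLSL-flavored code, of a Fresnel-weighted reflection-and-refraction blend, the kind of water-and-glass effect that used to demand several rendering passes. On a DX9-class pixel shader it boils down to a single pass. The cube map name and the constants are arbitrary choices of mine, not anything dictated by the hardware.

    // Sketch of a one-pass reflection/refraction blend of the sort a
    // shading language can compile for DX9-class pixel shaders. The
    // cube map and constants are hypothetical.
    samplerCUBE envMap;     // environment cube map

    float4 main(float3 normal : TEXCOORD0, float3 view : TEXCOORD1) : COLOR
    {
        float3 n = normalize(normal);
        float3 v = normalize(view);

        // Reflected and (crudely) refracted environment lookups.
        float3 reflColor = texCUBE(envMap, reflect(-v, n)).rgb;
        float3 refrColor = texCUBE(envMap, refract(-v, n, 0.9)).rgb;

        // Fresnel-style term: more reflection at grazing angles.
        float fresnel = pow(1.0 - saturate(dot(n, v)), 3.0);

        return float4(lerp(refrColor, reflColor, fresnel), 1.0);
    }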

So that is, in a nutshell, where all this fancy-pants talk about cinematic rendering comes from. Built on a 0.13-micron manufacturing process with over 125 million transistors, NVIDIA's GeForce FX is a powerful graphics rendering engine capable of doing things of which previous graphics chips could only dream. The leap from GeForce4 to GeForce FX is as large as any we've seen since custom chips started driving 3D graphics.

The big, green, shiny fly in the ointment, however, is ATI's Radeon R300 series of chips, which brought similar graphics capabilities to the mass market last September. In the following pages, we'll review the capabilities of NVIDIA's GeForce FX with an eye toward its competition. We'll break it down into key areas like pixel-pushing power, vertex and pixel shaders, texture and edge antialiasing, and performance in real-world games to see how the GFFX stacks up.


Cinematic graphics: Lubed up and a bit chilly


ATI counters the nymph with a chimp. Canny.