
NVIDIA's GeForce FX 5700 Ultra GPU


Third time's the charm?
— 8:00 AM on October 23, 2003

FOR NEARLY a year, ATI's Radeons have owned the mid-range graphics performance crown. ATI's dominance started with the Radeon 9500 Pro, continued with the Radeon 9600 Pro, and was most recently refreshed with the 9600 XT. NVIDIA took a stab at the mid-range with its GeForce FX 5600s, but not even a faster respin of the GeForce FX 5600 Ultra had enough punch to take on the Radeon 9600 line, especially in DirectX 9 applications.

Before the Radeon 9500 Pro came along, NVIDIA's GeForce4 Ti 4200 was the most widely recommended graphics card for budget-conscious enthusiasts, so NVIDIA knows what it takes to lead the mid-range market. With memories of the GeForce4 Ti 4200's glory no doubt in mind, NVIDIA is ready to put up a fight for the mid-range performance crown with the GeForce FX 5700 Ultra. The 5700 Ultra is powered by the new NV36 graphics chip and promises more shader power, higher clock speeds, and greater memory bandwidth than its predecessor. On paper, NVIDIA's third shot at this generation's mid-range graphics crown looks pretty good, but does it have the charm to capture the hearts and minds of budget-conscious gamers and PC enthusiasts, or will it be strike three? Read on as we unleash the FX 5700 Ultra on ATI's Radeon 9600 cards to find out.

Introducing NV36
The GeForce FX 5700 line is based on NVIDIA's new NV36 graphics chip, which is essentially a mid-range version of the high-end NV35 chip found in NVIDIA's GeForce FX 5900 and 5900 Ultra. The NV36-powered GeForce FX 5700 Ultra will replace the mid-range FX 5600 Ultra at a suggested retail price of $199, and is expected on retail shelves starting this Sunday.

NV36 doesn't represent a major redesign of NVIDIA's current GeForce FX architecture, but the chip does have a number of interesting characteristics that are worth highlighting.

  • 4x1-pipe design - Like NV31, NV36 has four pixel pipelines with a single texture unit per pipe. The chip behaves like a four-pipe design regardless of the kind of rendering being done, which makes it a little easier to understand than something like NV30, which can act like an eight-pipe chip under certain circumstances.

  • 128-bit memory bus - NV36's memory bus is 128 bits wide, just like NV31's. However, NV36 supports DDR2 and GDDR3 memory types in addition to standard DDR SDRAM. Initially, GeForce FX 5700 Ultra cards will ship with DDR2 memory chips running at 450MHz, but board manufacturers may eventually exploit the chip's compatibility with different memory types to produce budget cards with DDR SDRAM or more exotic offerings with GDDR3. (For a quick look at what that memory config means for bandwidth, see the sketch after this list.)

  • 'mo shader power - The GeForce FX 5600 Ultra's shader performance never really cut it against ATI's mid-range Radeons, so NVIDIA has beefed up shaders for NV36. The chip boasts three vertex units that conspire to deliver triple the vertex processing power of NV31. NVIDIA also re-architected its programmable pixel shader for NV36, though no specific pixel shader performance claims are being made.

  • Chip fabrication by IBM - Unlike the rest of NVIDIA's NV3x graphics chips, NV36 is being manufactured using IBM's 0.13-micron fabrication process. NV36 is the first product to emerge from NVIDIA's recently announced partnership with IBM, and NVIDIA is quite happy with how well things have worked out so far. Like NV38, NV36 isn't built using the low-k dielectrics found in ATI's RV360 GPU, but it's still clocked at a speedy 475MHz on the GeForce FX 5700 Ultra.

    What's particularly impressive about NV36's fabrication is the fact that NVIDIA was able to get its very first chip sample from IBM up and running Quake just 50 minutes after the chip entered NVIDIA's testing lab. A testament to IBM's mad fabrication skills, the first A01 spin of NV36 silicon is actually being used for retail versions of the chip.
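Since memory bandwidth figures get thrown around a lot in this segment, here's a quick back-of-the-envelope sketch of where they come from. This is just standard DDR arithmetic, not an official NVIDIA formula: DDR-style memory transfers data on both clock edges, so 450MHz DDR2 runs at an effective 900MHz, and a 128-bit bus moves 16 bytes per transfer.

    # Back-of-the-envelope peak memory bandwidth (standard DDR math,
    # not an official NVIDIA formula)
    def peak_bandwidth_gbs(effective_clock_mhz, bus_width_bits):
        """Peak bandwidth in GB/s, counting 1GB as 10^9 bytes."""
        bytes_per_transfer = bus_width_bits / 8  # 128-bit bus -> 16 bytes
        return effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

    # 450MHz DDR2 transfers on both clock edges -> 900MHz effective
    print(peak_bandwidth_gbs(900, 128))  # 14.4 GB/s

Nudge the effective clock up a hair, as our review sample does, and you get the 14.5GB/s figure quoted in the spec sheet below.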

Overall, NV36 doesn't represent a radical departure from the GeForce FX architecture; the chip should share all the perks that go along with "cinematic computing," but it will also inherit a number of quirky personality traits that have thus far had a negative impact on performance.

NVIDIA continues to stress that its entire GeForce FX line is sensitive to instruction ordering and pixel shader precision. Optimized code paths can help NV36 and the rest of the GeForce FX line realize their full potential, but NVIDIA's new Detonator 50 driver also has a few tricks up its sleeve to improve performance. You can read all about the Detonator 50 drivers in our GeForce FX 5950 Ultra review.


NV36: NVIDIA's first GPU fabbed by IBM

The specs
Perhaps to illustrate just how close the GeForce FX 5700 Ultra is to store shelves, NVIDIA sent out retail cards instead of standard reference review samples. The eVGA e-GeForce FX 5700 Ultra that showed up on my doorstep came in a full retail box, shrink-wrapped and everything. Let's have a quick look at the card's spec sheet.

GPU: NVIDIA NV36
Core clock: 475MHz
Pixel pipelines: 4
Peak pixel fill rate: 1900 Mpixels/s
Texture units per pixel pipeline: 1
Textures per clock: 4
Peak texel fill rate: 1900 Mtexels/s
Memory clock: 906MHz*
Memory type: BGA DDR2 SDRAM
Memory bus width: 128-bit
Peak memory bandwidth: 14.5GB/s
Ports: VGA, DVI, composite and S-Video outputs
Auxiliary power connector: 4-pin Molex

* NVIDIA's reference spec calls for an effective memory clock of 900MHz, but our sample's memory was running at 906MHz.
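The fill rate numbers fall straight out of the core clock and pipeline count, so it's easy to check that the spec sheet adds up. A minimal sanity check, using the same back-of-the-envelope math as before:

    # Sanity-checking the spec sheet (standard fill-rate math)
    core_clock_mhz = 475
    pipelines = 4
    texture_units_per_pipe = 1

    pixel_fill = core_clock_mhz * pipelines            # 1900 Mpixels/s
    texel_fill = pixel_fill * texture_units_per_pipe   # 1900 Mtexels/s

    # Our sample's effective 906MHz memory clock on a 128-bit bus
    bandwidth_gbs = 906e6 * (128 / 8) / 1e9            # ~14.5 GB/s
    print(pixel_fill, texel_fill, round(bandwidth_gbs, 1))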

The GeForce FX 5700 Ultra is all about high clock speeds and fancy memory. Quite honestly, I didn't expect cards with 450MHz DDR2 memory chips to hit $200 price points this soon, but I'm certainly not going to complain. Profit margins on GeForce FX 5700 Ultras may be slimmer than with other cards, but that's a good thing for consumers looking for the most bang for their buck.

Here are a few nudies of the e-GeForce FX 5700 Ultra to drool over before we get started with the benchmarks.


The GeForce FX 5700 Ultra looks imposing, and it is


475MHz with a single-slot cooler that doesn't sound like a Dustbuster.
Imagine that!


BGA DDR2 memory chips bring in the bandwidth


(Insert incessant whining about the lack of dual DVI here)