Based on everything we know, we can only conclude that the CS: Source video stress test is essentially a Half-Life 2 benchmark that’s available to the public right now. Naturally, that piques our curiosity, especially since last time around, the ATI cards were absolutely trouncing the NVIDIA cards in HL2 benchmarks. There was only one thing to do: we rounded up thirteen different DirectX 9-class video cards for a Source engine benchmarking bonanza.
Has ATI maintained its monstrous lead in Half-Life 2 performance over NVIDIA, or have the events of the past year allowed NVIDIA to catch up? Read on to find out.
The Source engine video stress test
The Source engine video stress test included with the CS: Source beta isn’t a real in-game benchmark. It doesn’t use a real Half-Life 2 level, doesn’t test game physics, and doesn’t play back sound. It is, however, a pretty darned good video card benchmark, because it incorporates a whole range of pixel shader effects, sometimes layering them on top of one another, to produce lots of eye candy. The video stress test is also, lo and behold, very much a stress test; it seems to throw a series of worst-case scenarios at the graphics card to see how it fares. In other words, if a graphics card can make it through the video stress test without choking, I’d expect it to hold up its end of the bargain in Half-Life 2, as well.
To give you some idea what the stress test does, let’s have a look at a few screenshots. The first one is from the opening stage of the stress test, where multiple translucency effects are layered on top of one another. Note, also, the reflective and refractive water below. This scene is packed with DX9 pixel shader effects.
Next up is a room illuminated by a fire effect. On the pedestal in the middle of the room, you can kind of see a translucent player character, though it is tough to pick out in this shot. Also note the walls, which are covered with very detailed bump or normal maps. The low resolution of this screen shot doesn’t do it justice; the textures still look exquisitely detailed at 1600×1200.
Finally, we have a room with a series of virtual TV sets displaying images from the previous test room, courtesy of portal rendering (or, uhm, render to texture). Again, the floor is covered with water, and hovering above the water is a thick, blue volumetric fog.
All of these scenes rendered perfectly on all of the video cards we tested, with a couple of minor exceptions that I’ll describe shortly. Overall, the Source engine’s visuals are of much higher quality than those of current games, and, much as we saw in DOOM 3, they don’t seem to vary widely from card to card. We have refrained from providing extensive screenshot comparisons between cards because of some limitations in the CS: Source beta, but to the naked eye, there’s little difference between ATI and NVIDIA in terms of image quality. Let’s talk about the differences we were able to spot…
A few quirks in the CS: Source beta
Now, about those rendering problems. First, no matter which card we tried, we’d see pixel shader corruption problems and skewed benchmark results if we didn’t exit the game and restart it after each video mode change. This problem was simple to work around, of course, but it’s something to note.
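Since the workaround is purely mechanical, it lends itself to scripting. Below is a minimal sketch of how one might automate it, launching a fresh game process for each video mode instead of switching resolutions in-game. This is hypothetical, not our actual test harness: the install path is assumed, though -w, -h, and -novid are standard Source engine launch options.

```python
# Hypothetical benchmark harness (a sketch, not our actual procedure).
# It sidesteps the mode-change bug by exiting and relaunching the game
# for every resolution rather than changing modes in-game.
import subprocess

GAME = r"C:\Steam\SteamApps\counter-strike source\hl2.exe"  # assumed path
RESOLUTIONS = [(1024, 768), (1280, 1024), (1600, 1200)]

for width, height in RESOLUTIONS:
    # One process per video mode; -novid skips the intro movies.
    subprocess.run([
        GAME, "-game", "cstrike", "-novid",
        "-w", str(width), "-h", str(height),
    ], check=True)
```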
Second, the GeForce FX cards hit a few bumps in the road in this beta version of CS: Source. The Source engine auto-detects recommended settings for one’s video card, and on any card in the GeForce FX line, it practically insists on turning off water reflectivity. As a result, we’ve benchmarked the GeForce FX line without water reflectivity, and we’ve put an asterisk next to the name of each FX card in our results, to remind you of that fact. There’s no great visual difference between the water’s look on the FX line and on other cards, but if the menu settings mean anything, the FX cards are doing less work.
The GeForce FX line also won’t do 4X antialiasing in this CS: Source beta. Instead, you get this message:
I’m not sure what the problem is here, but Valve has evidently classified it as a known bug in this beta version of the engine. I’m curious to find out whether this bug has anything to do with the centroid sampling problems on GeForce FX hardware. Whatever the case, we weren’t able to benchmark the GeForce FX cards with antialiasing enabled.
DX8 versus DX9 illustrated
We should briefly address the issue of DirectX 8 versus DirectX 9, because Valve originally said last year that it might have to drop back to its DirectX 8 rendering path in order for GeForce FX cards to perform acceptably in Half-Life 2. We have no confirmation from Valve yet about what rendering path the CS: Source beta is using on GeForce FX cards, but I thought I should show you the difference between a DirectX 8-class card, a GeForce FX, and ATI’s very latest DX9 card. The image output differences between DX8 and DX9 cards are pretty subtle. Have a look at the screenshots below, and you’ll see an example where a difference is apparent.
The DX8-class GeForce4 Ti 4200 manages to render the glow effects reasonably well, but it has less internal color precision than the Radeon X800. You can see some harsher color transitions and some greenish banding at the edge of the light halos in the Ti 4200 screenshot. That is pretty much the sum of the difference between DX8 and DX9 rendering in the CS: Source beta: minor differences in color precision.
Interestingly enough, the GeForce FX 5950 Ultra renders the scene with enough precision that the banding apparent on the GeForce4 Ti 4200 is banished. The same holds true for the GeForce FX 5700 Ultra. Looks to me like the GeForce FX cards are using a DX9 rendering path of some sort.
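If you want to experiment with rendering paths yourself, the Source engine accepts a -dxlevel launch option that forces a particular DirectX level. Here’s a quick sketch along the lines of the earlier snippet; the install path is still an assumption, and whether this beta honors every level is an open question.

```python
# Sketch: force the Source engine's DirectX rendering path per run, so the
# DX8 and DX9 output of the same card can be compared side by side.
import subprocess

GAME = r"C:\Steam\SteamApps\counter-strike source\hl2.exe"  # assumed path

for label, dxlevel in [("DX8 path", "81"), ("DX9 path", "90")]:
    print("Running the stress test with the " + label)
    subprocess.run([GAME, "-game", "cstrike", "-dxlevel", dxlevel], check=True)
```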
Our testing methods
The dialog below shows the settings we used in testing, with the exception of the water detail problem noted above. For the benchmarks done with 4X antialiasing and 8X anisotropic filtering, we used this in-game settings tool to change AA and aniso modes.
Both the ATI and NVIDIA cards were left at their driver default settings for image quality, with the exception that we turned off vertical refresh sync on all cards.
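For reference, the Source engine exposes these settings as console variables, which can be passed on the command line with a leading plus sign. The snippet below shows how the 4X AA, 8X aniso, and vsync-off settings might be expressed that way and appended to the launch list in the harness sketched earlier; the exact cvar names in this beta are an assumption on my part.

```python
# Source console variables for the image quality settings, expressed as
# command-line arguments. Append these to the launch list in the harness
# above. The cvar names are an assumption for this beta.
AA_ANISO_ARGS = [
    "+mat_antialias", "4",   # 4X multisample antialiasing
    "+mat_forceaniso", "8",  # 8X anisotropic filtering
    "+mat_vsync", "0",       # vertical refresh sync off for all tests
]
```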
Our test system was configured like so:
Processor: Athlon 64 3800+ 2.4GHz
System bus: HT 16-bit/800MHz downstream, HT 16-bit/800MHz upstream
North bridge: K8T800 Pro
Chipset drivers: 4-in-1 v4.51
Memory size: 1GB (2 DIMMs)
Memory type: Kingston HyperX DDR SDRAM at 400MHz
RAS to CAS delay: 2
Hard drive: Seagate Barracuda V ATA/100 120GB
Graphics:
Radeon 9600 XT 128MB AGP
Radeon 9800 Pro 128MB AGP
Radeon 9800 XT 256MB AGP
Radeon X800 Pro 256MB AGP
Radeon X800 XT 256MB AGP
GeForce FX 5700 Ultra 128MB AGP
GeForce FX 5800 Ultra 128MB AGP
GeForce FX 5900 128MB AGP
GeForce FX 5950 Ultra 256MB AGP
GeForce 6800 128MB AGP
GeForce 6800 GT 256MB AGP
GeForce 6800 Ultra 256MB AGP
GeForce 6800 Ultra “Overclocked” 256MB AGP
OS: Microsoft Windows XP Professional
OS updates: Service Pack 2 RC2, DirectX 9.0c
We used NVIDIA’s ForceWare 61.77 drivers for all of the GeForce cards, and we used ATI’s CATALYST 4.8 drivers with all the Radeon cards.
The test systems’ Windows desktop was set at 1152×864 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
If you have questions about our methods, hit our forums to talk with us about them.
We have bar graphs and line graphs, because each type is useful in its own way. Note that I’ve had to split the line-graph results across two graphs, because we had too many results for a single graph. To keep things sane, I’ve put the newer, higher-end cards on one graph and the older and mid-range cards on another.
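For the curious, the split is nothing fancier than plotting the two groups of cards as separate figures. Here’s a sketch of the idea in Python with matplotlib; the FPS values are zeroed-out placeholders, not our measured results.

```python
# Illustrative sketch of the two-graph split. The FPS numbers are
# placeholders (zeros), not measured results; see the graphs for real data.
import matplotlib.pyplot as plt

resolutions = [1024, 1280, 1600]  # shorthand for 1024x768, 1280x1024, 1600x1200
high_end = {"Radeon X800 XT": [0, 0, 0], "GeForce 6800 Ultra": [0, 0, 0]}
mid_range = {"Radeon 9600 XT": [0, 0, 0], "GeForce FX 5700 Ultra": [0, 0, 0]}

for title, group in [("High-end cards", high_end), ("Mid-range cards", mid_range)]:
    plt.figure()
    for card, fps in group.items():
        plt.plot(resolutions, fps, marker="o", label=card)
    plt.title(title)
    plt.xlabel("Resolution")
    plt.ylabel("Frames per second")
    plt.legend()
plt.show()
```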
4X antialiasing plus 8X anisotropic filtering
The video stress test in the CS: Source beta gives us a very different set of results than what we saw in the early Half-Life 2 benchmarks from almost a year ago. The NVIDIA cards are performing much better than they were before, especially relative to the Radeons. We’re not seeing the kind of “class busting” disparities between benchmark results here that we saw recently in DOOM 3, where one company’s $299 card outran the other company’s $399 card. Instead, what we have is rough parity. Of course, the GeForce 6 series of cards is much more potent in DirectX 9 than the GeForce FX line was. Still, the change in the FX cards’ relative performance is something of a surprise. Let’s break it down by class.
Among the $499 “image products,” the Radeon X800 XT Platinum Edition outdoes both flavors of GeForce 6800 Ultra, the “regular” 400MHz version and the 450MHz “overclocked in the box” model. The X800 XT PE’s advantage is most pronounced with antialiasing and anisotropic filtering enabled. For instance, with 4X AA and 8X aniso at 1280×1024 resolution, the Radeon hits 87 frames per second, while the GeForce 6800 Ultra OC averages 80 FPS and the Ultra 74 FPS. This isn’t exactly dominance, but ATI is clearly on top.
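For perspective, a couple of lines of Python turn those frame rates into relative leads:

```python
# Arithmetic on the 1280x1024 numbers quoted above (4X AA, 8X aniso).
radeon_fps = 87
for name, fps in [("6800 Ultra OC", 80), ("6800 Ultra", 74)]:
    lead = 100.0 * (radeon_fps - fps) / fps
    print(f"X800 XT PE leads the {name} by {lead:.2f}%")
# Prints leads of 8.75% and 17.57%, respectively.
```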
Down at $399, though, it’s a different story. The GeForce 6800 GT slightly but surely outperforms the Radeon X800 Pro without aniso and AA. With 4X AA and 8X aniso, the two cards are virtually tied across all four resolutions we tested.
At $299, we approach the sorts of graphics cards that many folks might actually consider buying. Here, the aging Radeon 9800 XT faces off against the brand-new GeForce 6800, and the NVIDIA card has the edge in the majority of our tests. Only in the most brutal conditions, at 1600×1200 with AA and aniso enabled, does the Radeon prevail.
Jump down to the $199-ish range, and the field gets a little crowded, with various flavors of Radeons and GeForce FXs vying for attention. I’d pick the battle of the Radeon 9600 XT versus the GeForce FX 5700 Ultra as the most interesting comparison here. The FX card isn’t doing reflective water and can’t run with antialiasing in this CS: Source beta version, but otherwise, the two cards pump out frames at nearly the same rate.
NVIDIA has started phasing them out now, but there are still lots of GeForce FX 5900-series cards out there on the market, like the FX 5900 and FX 5950 Ultra cards we tested. Amazingly enough, these cards perform nearly as well as their ATI-based counterparts in the CS: Source beta, with the obvious caveats about water reflections and antialiasing. Among the vintage cards, I had hoped to include a Radeon 9700 Pro in our tests to face off against the GeForce FX 5800 Ultra, but our ancient Radeon 9700 Pro card (a very early review unit) proved incompatible with our test system. The 5800 Ultra deafened me a little, but it turned in some decent benchmark scores, only six frames per second behind the Radeon 9800 Pro at 1280×1024.
Looks to me like Valve and NVIDIA have been working together to improve performance on GeForce FX GPUs, with impressive results. If these numbers are any indication, GeForce FX owners ought to be able to play Half-Life 2 with few compromises. I’m curious to see whether the optimizations for FX cards in Half-Life 2 are robust enough to survive incremental shader modifications and code updates to the game. They may still be rather fragile, as some of NVIDIA’s other optimizations for FX cards have proven.