
Half-Life 2 benchmark scores revealed


A virtual tour of City 17—or is it a slideshow?
— 12:00 AM on September 12, 2003

AT ATI'S SHADER DAYS event today, we were given the opportunity to obtain benchmark scores using Valve's new Half-Life 2 performance test, and we have those numbers to share with you now. Before we do so, I should preface things with a few caveats.

The benchmark setups
Normally, when you see benchmarks on TR with fancy graphs and the like, we have conducted the testing ourselves using test systems we've built and installed personally. That's not the case with these numbers, which were produced in a very controlled environment. ATI and Valve collaborated on the system setups, and my role in the process was reduced to specifying verbally which script should be run. A very protective ATI employee kept my hands off the keyboard and kicked off the scripts for me, so not even in that basic way can I claim these are "my" benchmark numbers.

That said, I did have a chance to inspect a few rudimentary aspects of the system setups, and the Valve folks were present alongside the ATI folks to make sure everything was done according to Valve's liking. I believe these numbers will serve as a useful preview of what we will see ourselves when Valve's Half-Life 2 benchmark is released.

Now, a few words about the system setups themselves. First, we did not have a chance to install NVIDIA's 50.xx drivers, even though NVIDIA supplied members of the press with these drivers and urged us to use them for any testing. No doubt these drivers contain application-specific optimizations for Half-Life 2; such optimizations seem to be necessary on a per-application basis in order for NV3x chips to perform competitively in DirectX 9 applications. However, those optimizations are, as you probably know, quite controversial. We will have to try the 50.xx drivers in Half-Life 2 when the whole kit and caboodle is available freely to the public. As it is, the systems were configured with the latest public release drivers from NVIDIA and ATI.

Next, the four test systems were initially intended to be equipped with four different graphics cards: a GeForce FX 5600 Ultra 128MB, a Radeon 9600 Pro 128MB, a GeForce FX 5900 Ultra 256MB, and a Radeon 9800 Pro 128MB. As it turned out, though, the FX 5600 was a GeForce FX 5600 (non-Ultra), which has substantially lower core and memory clock speeds than the Ultra version; that's an important item to note for those of you keeping score at home.

Also, all the systems were Pentium 4 2.8GHz boxes with 1GB of memory, but the system configs were not identical. The two mid-range systems were based on MSI motherboards and SiS chipsets (the 655, I believe) with DDR400 memory and 533MHz front-side buses. These systems did not have Hyper-Threading. The high-end rigs were Dell boxes with 800MHz front-side buses and Hyper-Threading. As a result, the scores for the mid-range FX 5600 and 9600 Pro cards may not compare perfectly to the scores for the high-end cards. I have chosen to present all the scores together because I believe the tests were largely bound by the graphics cards, not the host systems. Still, keep that configuration difference in mind.

The benchmark itself
The code we were using for the benchmark was, as I understand it, up to date with Valve's latest internal builds. However, the levels used in the test were not. The benchmark was based on older levels, like Valve's E3 demo levels, which lack high-dynamic-range (HDR) lighting and the other new DirectX 9-based effects that will find their way into the final game, or into updates delivered via Valve's Steam content distribution system not long after the game's release.

In order to understand why HDR lighting is such a big deal, you really should download the movie Valve has released showing off HDR and other new DX9 effects in action. Light behaves more naturally under the DX9-based HDR lighting model, with very high intensities causing lights to "glow" realistically.


High-dynamic-range lighting in action

Obviously, HDR lighting is one of the big benefits of DirectX 9, so I'm a little perplexed about its absence from the benchmark. Then again, I hear tell the Valve guys may be busy finishing up a game or something.
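
For the curious, here is a minimal sketch in Python of the general idea behind HDR rendering. This is my own illustration of the standard technique, not Valve's code: the scene is computed at intensities well above 1.0, the over-bright portion is extracted for the "glow" (bloom) pass, and a tone-mapping operator (the common Reinhard curve is assumed here) compresses the result into displayable range.

import numpy as np

def tone_map_reinhard(hdr):
    # Compress unbounded HDR intensities into [0, 1) for display.
    return hdr / (1.0 + hdr)

def extract_bloom(hdr, threshold=1.0):
    # Keep only the over-bright part of the image; a real pipeline would
    # blur this and add it back to produce the "glow" around light sources.
    return np.maximum(hdr - threshold, 0.0)

# A toy "scene": a dim wall (0.2), a brightly lit surface (0.9), and a
# light source (8.0), stored at their true intensities.
scene = np.array([0.2, 0.9, 8.0])

# A low-dynamic-range renderer simply clamps, so the lit surface and the
# light source both saturate near 1.0 and become hard to tell apart.
print("clamped LDR:", np.clip(scene, 0.0, 1.0))     # [0.2  0.9  1. ]

# HDR keeps the true intensities, so tone mapping preserves the difference
# and the bloom pass knows exactly where the glow belongs.
print("tone-mapped:", tone_map_reinhard(scene))     # ~[0.17 0.47 0.89]
print("bloom mask: ", extract_bloom(scene))         # [0. 0. 7.]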

As it was, we were able to run the benchmark using several different code paths, including full DirectX 9, a DirectX 9 path written specifically for GeForce FX hardware, and a DirectX 8.1 code path. (We elected not to run the DX8.0 code path, although it was available, because of time limits.) The FX code path uses partial-precision hints to trigger 16-bit-per-color-channel datatypes and takes some other performance-enhancing shortcuts to accommodate the FX hardware's limitations, like using textures to normalize vectors. It's my understanding that the NV3x-specific code path does not use FX12 integer pixel shaders.
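
To make those shortcuts a bit more concrete, here is a rough Python/NumPy illustration of the two techniques mentioned, again my own sketch rather than anything from Valve's shaders. Partial precision amounts to carrying out pixel shader arithmetic in 16-bit floats instead of 32-bit ones, and "using textures to normalize vectors" replaces a per-pixel square root with a lookup into a table of precomputed unit vectors (a normalization cube map on real hardware; a coarse quantization stands in for the texture fetch here).

import numpy as np

v = np.array([0.31, -0.74, 0.59])  # an arbitrary, unnormalized vector

# Full precision: normalize in 32-bit floats, as the generic DX9 path would.
full = (v / np.linalg.norm(v)).astype(np.float32)

# Partial precision: the same math carried out in 16-bit floats, which the
# NV3x chips run faster; each operation introduces a little rounding error.
v16 = v.astype(np.float16)
partial = v16 / np.sqrt(np.sum(v16 * v16))

# Texture-based normalization: quantize the direction and "look up" a
# precomputed unit vector instead of computing a square root per pixel.
def lut_normalize(vec, steps=32):
    q = np.round(vec / np.abs(vec).max() * steps)
    return q / np.linalg.norm(q)

print("FP32:", full)
print("FP16:", partial.astype(np.float32))
print("LUT :", lut_normalize(v))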

Notably, Valve hasn't yet settled on the default rendering paths for use in the final game, especially for the FX 5900. That card may use the NV3x-specific code path. The GeForce FX 5200 Ultra seems likely to use DirectX 8.0, while the FX 5600 will probably use DirectX 8.1.
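
Since those defaults were still in flux at the time of testing, the following is purely hypothetical, but it sketches how a game engine might map detected hardware to a default rendering path along the lines described above.

# Hypothetical mapping from detected GPU to default rendering path, using
# the tentative assignments above; none of this was final at press time.
DEFAULT_PATHS = {
    "GeForce FX 5200 Ultra": "dx8.0",
    "GeForce FX 5600":       "dx8.1",
    "GeForce FX 5900 Ultra": "nv3x",   # the NV3x-specific mixed DX9 path
    "Radeon 9600 Pro":       "dx9",
    "Radeon 9800 Pro":       "dx9",
}

def pick_render_path(gpu_name):
    # Fall back to DX8.1 for unrecognized hardware (an arbitrary choice here).
    return DEFAULT_PATHS.get(gpu_name, "dx8.1")

print(pick_render_path("GeForce FX 5900 Ultra"))  # nv3x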

To my weary, bloodshot eye, there was very little visible difference in the benchmark between the DX8.1, "mixed" FX, and full DX9 code paths. I think some shader effects looked a little nicer with DX9, though I didn't really have a chance to study the visuals too intently as the tests ran. I'd have to play with the game in order to say with confidence what the visual impact of the different code paths really is. Then again, the final game will apparently have those gorgeous HDR lighting effects and the like, which are easy to spot.

I elected not to test performance with antialiasing enabled, because the Valve folks told me that performance in this benchmark with AA enabled would not be representative of AA performance in the final game.

With all that out of the way, let's move on to the scores themselves.