3DMark03 optimizations benchmarked

Wondering whether Futuremark's performance claims about 3DMark03 optimizations and cheats (as detailed in this paper) are accurate? I decided to test it for myself, so I fired up my Athlon XP 3200+ test rig and gave it a go. I used both the 320 and 330 builds of 3DMark03 with a pair of video cards: a Gainward GeForce FX 5800 Ultra with Detonator FX 44.03 drivers, and an ATI Radeon 9800 Pro 256MB with Catalyst 3.4 drivers.

You'll recall that build 320 was the version of 3DMark for which NVIDIA and ATI optimized their drivers. (ATI has owned up to it, but NVIDIA hasn't so far as I'm aware.) Build 330 of 3DMark03 was written expressly to foil benchmark-specific optimizations and cheats.

Here are my results:

As you know, 3DMark03's overall score is derived from its four game tests. You can see my results for Game 1, Game 2, Game 3, and Game 4.
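For reference, the overall score is a weighted sum of the average frame rates from the four game tests. The sketch below shows the arithmetic; the weights are the ones I recall from Futuremark's 3DMark03 whitepaper, and the frame rates are made-up numbers purely to illustrate, so treat both as assumptions rather than gospel.

```python
# Sketch of how 3DMark03 derives its overall score from the four game tests.
# The per-test weights are as I recall them from Futuremark's whitepaper;
# treat them as illustrative, not canonical.
GAME_TEST_WEIGHTS = {
    "GT1 (Wings of Fury)": 7.3,
    "GT2 (Battle of Proxycon)": 37.0,
    "GT3 (Troll's Lair)": 47.1,
    "GT4 (Mother Nature)": 38.7,
}

def overall_score(fps_by_test):
    """Weighted sum of average frames per second across the game tests."""
    return sum(GAME_TEST_WEIGHTS[test] * fps for test, fps in fps_by_test.items())

# Hypothetical frame rates, just to show the arithmetic:
example_fps = {
    "GT1 (Wings of Fury)": 150.0,
    "GT2 (Battle of Proxycon)": 30.0,
    "GT3 (Troll's Lair)": 25.0,
    "GT4 (Mother Nature)": 20.0,
}
print(overall_score(example_fps))
```

Because GT4 carries a large weight, any driver trickery in the Mother Nature test moves the overall score disproportionately, which is worth keeping in mind when reading the numbers above.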

As you can see, ATI's optimizations—which they claim didn't change image output—barely affect the overall score. NVIDIA's, however, make a substantial difference. The most striking difference between builds 320 and 330 is in the Game 4 test, where performance drops dramatically once the cheats and optimizations are disabled. This test, the "Mother Nature" scene, makes the most extensive use of DirectX 9 and pixel shader 2.0.

Those are the numbers. I don't have time here to dig into all of the related issues, but Dave from Beyond3D sent me a note about a couple of things you should check out on his site. First, Dave has captured images from ATI and NVIDIA cards in 3DMark03 to show the image quality differences between builds 320 and 330. The NVIDIA drivers clearly have more impact on image quality. Next, to help you sort out what that fact means, have a look at Unreal guru Tim Sweeney's take on cheating versus optimization. The basic principle he outlines seems like a good guide in this case.
