Intel graphics drivers employ questionable 3DMark Vantage optimizations

3DMurk all over again?
— 11:02 AM on October 12, 2009

In the early days of GPUs, application-specific performance optimizations in graphics drivers were viewed by many as cheating. Accusations were hurled with regularity, and in some cases, there was real cheating going on. Some optimizations surreptitiously degraded image quality in order to boost performance, which obviously isn't kosher. Optimizations that don't affect an application's image quality are harder to condemn, though, especially if you're talking about games. If a driver can offer users smoother gameplay without any ill effects, why shouldn't it be allowed?

The situation gets more complicated when one considers optimizations that specifically target benchmarks. Synthetic tests don't have user experiences to improve, just arbitrary scores to inflate. Yet the higher scores achieved through benchmark-specific optimizations could influence a PC maker's choice of graphics solution or help determine the pricing of a graphics card.

Futuremark's popular 3DMark benchmark has been the target of several questionable optimizations over the years. Given that history, it's not surprising that the company has strict guidelines for the graphics drivers it approves for use with 3DMark Vantage. These guidelines, which can be viewed here (PDF), explicitly forbid optimizations that specifically target the 3DMark Vantage executable. Here's an excerpt:

With the exception of configuring the correct rendering mode on multi-GPU systems, it is prohibited for the driver to detect the launch of 3DMark Vantage executable and to alter, replace or override any quality parameters or parts of the benchmark workload based on the detection. Optimizations in the driver that utilize empirical data of 3DMark Vantage workloads are prohibited.

No ambiguity there, then: Vantage-specific optimizations aren't allowed.

Intel may not be playing fair, though. We recently learned that AMD has notified Futuremark that Intel's Graphics Media Accelerator drivers for Windows 7 incorporate performance optimizations that specifically target the benchmark, so we decided to investigate.

We tested 3DMark Vantage 1.0.1 with these drivers on a system built around a G41 Express-based Gigabyte GA-G41M-ES2H motherboard, a Core 2 Duo E6300, 4GB of DDR2-800 memory, and a Raptor WD1500ADFD hard drive, running the Windows 7 x64 release-to-manufacturing build.

We first ran the benchmark normally. Then, we renamed the 3DMark executable from "3DMarkVantage.exe" to "3DMarkVintage.exe". And—wouldn't you know it?—there was a substantial performance difference between the two.
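The rename test boils down to copying the benchmark under a decoy name so a driver's executable-name check no longer matches, then comparing scores. Here's a minimal sketch of that procedure; the install path, the `run_rename_test` helper, and the score values in the comments are our own illustrations, not part of Vantage or Intel's driver package:

```python
import shutil
import subprocess
from pathlib import Path

def percent_change(baseline: float, detected: float) -> float:
    """Percentage change from the renamed-executable (baseline) score
    to the score with the real executable name."""
    return (detected - baseline) / baseline * 100.0

def run_rename_test(install_dir: Path) -> None:
    """Copy the benchmark under a decoy name, then launch both copies.

    If the driver keys its optimizations off the executable name, the
    two runs will produce meaningfully different scores even though the
    workload is byte-for-byte identical.
    """
    original = install_dir / "3DMarkVantage.exe"  # the name the driver looks for
    decoy = install_dir / "3DMarkVintage.exe"     # one letter changed
    shutil.copy2(original, decoy)
    for exe in (original, decoy):
        subprocess.run([str(exe)], check=True)    # scores are read off the app's UI

# Hypothetical scores for illustration: a renamed-run GPU score of 100
# against a detected-run score of 146 is a 46% jump.
print(percent_change(100.0, 146.0))
```

The `percent_change` helper is also how the 37% overall and -10% CPU deltas quoted above would be computed from the two runs' scores.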

Our system's overall score climbs by 37% when the graphics driver knows it's running Vantage. That's not all. Check out the CPU and GPU components of the overall score:

The GPU score jumps by a whopping 46% thanks to Intel's apparent Vantage optimization. At the same time, the CPU score falls by nearly 10%. Curious.

Next, we ran a perfmon log of CPU utilization during each of 3DMark's CPU and GPU component tests. Vantage takes its sweet time loading each test, so our start and end times aren't perfectly aligned for each run. However, the pattern is pretty obvious.
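Perfmon (or its command-line cousin `typeperf`) writes counter samples to a CSV whose first column is a timestamp and whose subsequent columns are counter values. A small parser like the one below can average the `% Processor Time` column for a run; the log contents here are a synthetic three-sample example, not our actual data:

```python
import csv
import io

def average_cpu_utilization(log_text: str) -> float:
    """Average the '% Processor Time' column of a perfmon/typeperf CSV log.

    The first row holds counter paths; each following row is one sample,
    with the timestamp in column 0 and the CPU counter in column 1.
    """
    reader = csv.reader(io.StringIO(log_text))
    next(reader)  # skip the counter-path header row
    samples = [float(row[1]) for row in reader if row and row[1]]
    return sum(samples) / len(samples)

# Hypothetical log excerpt for illustration only.
sample_log = (
    '"(PDH-CSV 4.0)","\\\\PC\\Processor(_Total)\\% Processor Time"\n'
    '"10/12/2009 11:02:00","92.5"\n'
    '"10/12/2009 11:02:01","88.0"\n'
    '"10/12/2009 11:02:02","95.3"\n'
)
print(round(average_cpu_utilization(sample_log), 1))  # → 91.9
```

Comparing per-test averages like this for the default and renamed executables is what exposes the pattern described below.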

In the GPU tests, the system's CPU utilization is much higher with the default executable than with the "3DMarkVintage" executable. There isn't much difference in CPU utilization in the CPU tests, though.