What's really going on
Intel appears to be offloading some of the work associated with the GPU tests onto the CPU in order to improve 3DMark scores. When asked for comment, Intel replied with the following:
We have engineered intelligence into our 4 series graphics driver such that when a workload saturates graphics engine with pixel and vertex processing, the CPU can assist with DX10 geometry processing to enhance overall performance. 3DMarkVantage is one of those workloads, as are Call of Juarez, Crysis, Lost Planet: Extreme Conditions, and Company of Heroes. We have used similar techniques with DX9 in previous products and drivers. The benefit to users is optimized performance based on best use of the hardware available in the system. Our driver is currently in the certification process with Futuremark and we fully expect it will pass their certification as did our previous DX9 drivers.
This CPU-assisted vertex processing doesn't appear to affect Vantage's image quality. However, Intel is definitely detecting 3DMark Vantage and changing the behavior of its drivers in order to improve performance, which would appear to be a direct contravention of Futuremark's guidelines.
At present, Intel's 18.104.22.1682 graphics drivers don't appear on Futuremark's approved driver list for 3DMark Vantage. None of the company's Windows 7 drivers do. The 22.214.171.1244 Windows Vista x64 drivers that are on the approved list don't appear to include the optimization in question, because there's no change in performance if you rename the Vantage executable when using those drivers.
Violating Futuremark's driver optimization guidelines is one thing, but Intel also claims it's offloading vertex processing to enhance performance in games. Indeed, the very same INF file that singles out 3DMarkVantage.exe also names other executables:
HKR,, ~3DMarkVantage.exe, %REG_DWORD%, 2
HKR,, ~3DMarkVantageCmd.exe, %REG_DWORD%, 2
HKR,, ~CoJ_DX10.exe, %REG_DWORD%, 2
HKR,, ~Crysis.exe, %REG_DWORD%, 2
HKR,, ~RelicCoH.exe, %REG_DWORD%, 2
HKR,, ~UAWEA.exe, %REG_DWORD%, 2
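The mechanism behind these entries is plain name matching against per-application registry values. As an illustration only (this is our sketch, not Intel's driver code), a few lines of Python can pull the targeted executable names out of INF entries like the ones above:

```python
import re

# INF registry entries as they appear in Intel's driver package
INF_LINES = """\
HKR,, ~3DMarkVantage.exe, %REG_DWORD%, 2
HKR,, ~3DMarkVantageCmd.exe, %REG_DWORD%, 2
HKR,, ~CoJ_DX10.exe, %REG_DWORD%, 2
HKR,, ~Crysis.exe, %REG_DWORD%, 2
HKR,, ~RelicCoH.exe, %REG_DWORD%, 2
HKR,, ~UAWEA.exe, %REG_DWORD%, 2
"""

def targeted_executables(inf_text):
    """Return the executable names flagged in ~-prefixed HKR entries."""
    return re.findall(r"~([\w.]+\.exe)", inf_text)

print(targeted_executables(INF_LINES))
# ['3DMarkVantage.exe', '3DMarkVantageCmd.exe', 'CoJ_DX10.exe',
#  'Crysis.exe', 'RelicCoH.exe', 'UAWEA.exe']
```

Each entry keys a DWORD value to an application name, which is what makes the behavior app-specific rather than a general-purpose heuristic.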
One of the games on the detection list, Crysis Warhead, should have no problem saturating an integrated graphics chipset, to say the least. We tested it with the executable under its original name, Crysis.exe, and then renamed to Crisis.exe (note the spelling) to evade detection, using FRAPS to collect real-world frame rate data with the game running at 800x600 and minimum detail levels.
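The renaming trick works because the detection appears to be a straight string match on the process name. A hypothetical sketch of that kind of check (our illustration, assuming a simple whitelist drawn from the INF entries, not Intel's actual code):

```python
# Hypothetical whitelist built from the executables named in the driver INF.
OFFLOAD_WHITELIST = {
    "3dmarkvantage.exe", "3dmarkvantagecmd.exe", "coj_dx10.exe",
    "crysis.exe", "reliccoh.exe", "uawea.exe",
}

def vertex_offload_enabled(process_name):
    """Compare case-insensitively, as Windows filenames are."""
    return process_name.lower() in OFFLOAD_WHITELIST

print(vertex_offload_enabled("Crysis.exe"))  # True: the optimization kicks in
print(vertex_offload_enabled("Crisis.exe"))  # False: one letter dodges it
```

A one-character rename is enough to fall off the whitelist, which is exactly why renaming executables is a standard test for application-specific driver behavior.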
Intel's software-based vertex processing scheme improves in-game frame rates by nearly 50% when Crysis.exe is detected, at least in the first level of the game we used for testing. However, even 15 FPS is a long way from what we'd consider a playable frame rate. The game doesn't exactly look like Crysis Warhead when running at such low detail levels, either.
Our Warhead results do prove that Intel's optimization can improve performance in actual games, though only in this game and perhaps the handful of others identified in the driver INF file.
How do Intel's driver optimizations affect the competitive landscape? To find out, we assembled an AMD 785G-based system that's pretty comparable to the G41 rig we used for testing: Athlon II X2 250 processor, Gigabyte GA-MA785GPMT-UD2H motherboard, the same Raptor hard drive, and 4GB of DDR3 memory running at 800MHz with looser timings than the Intel system. We even disabled the board's dedicated sideport graphics memory, forcing the GPU to share system RAM like the G41.
With the Futuremark-approved Catalyst 9.9 drivers, the AMD 785G-based system scored 2161 in 3DMark Vantage, nearly the same as the 2132 3DMarks the G41 gets when it's playing by the rules, but well below the 2931 the G41 posts with optimizations enabled. (Renaming the Vantage executable on the AMD system had no notable effect on benchmark scores.) The app-specific optimization gives the G41 a definitive lead in 3DMark Vantage.
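For perspective, the size of the gap those scores imply is easy to work out from the numbers above:

```python
# 3DMark Vantage scores from our testing
g41_stock     = 2132  # G41 with the Vantage executable renamed (no optimization)
g41_optimized = 2931  # G41 with the app-specific optimization active
amd_785g      = 2161  # AMD 785G with Futuremark-approved Catalyst 9.9 drivers

gain          = (g41_optimized - g41_stock) / g41_stock * 100
lead_over_amd = (g41_optimized - amd_785g) / amd_785g * 100

print(f"G41 optimization gain: {gain:.0f}%")          # 37%
print(f"G41 lead over 785G:    {lead_over_amd:.0f}%")  # 36%
```

In other words, the detection-driven optimization alone turns a statistical tie with the 785G into a roughly one-third lead.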
Here's the tricky part: the very same 785G system managed 30 frames per second in Crysis Warhead, which is twice the frame rate of the G41 with all its vertex offloading mojo in action. The G41's new-found dominance in 3DMark doesn't translate to superior gaming performance, even in this game targeted by the same optimization.
All of which brings us back to the perils of using 3DMark Vantage as a substitute or proxy for testing real games. Those perils are well established by now. PC makers and others in positions of influence would do well to re-train their focus on real applications, especially for testing integrated graphics solutions, which have no need of advanced graphics workloads based on the latest version of DirectX to push their limits. 3DMark's traditionally purported role as a predictor of future game workloads makes little sense in the context of these modest IGPs.
We're curious to see what Futuremark will make of Intel's Windows 7 graphics drivers. As far as we can tell, the latest GMA drivers are in violation of its rules. We've asked Futuremark for comment on this issue, but the company has yet to reply.