As you know if you have been following graphics hardware over the past six months or so, there is a rich thicket of issues surrounding benchmarking, driver optimizations, and next-gen graphics hardware. I will not attempt to bring you up to speed on all of them here. I believe Newell's presentation summarized the relevant issues nicely and stated Valve's position with little ambiguity, so I'll present my pictures of those slides with some commentary as necessary.
The context of Newell's remarks was ATI's "Shader Days" event. Select members of the press spent the day taking in presentations by employees of ATI, Microsoft, and SoftImage about the latest developments in DirectX 9-class shaders, with an emphasis on pixel shader hardware. The briefings were full of useful information about how innovations made possible by DirectX 9-class programmable graphics hardware promise to vastly improve visual fidelity in games and other interactive apps without compromising performance. The capstone of the day's presentations was Newell's talk, which was accompanied by a stunning demonstration of Half-Life 2's Source engine in action.
Newell began by establishing Half-Life 2's credentials as a true DirectX 9 app.
Newell then talked about his consternation over techniques IHVs have used to skew benchmark results, a problem he considers serious because it threatens the gaming experience for Valve's customers.
As you can tell from the list in the slide above, Newell was particularly concerned with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said Valve had seen cases where the graphics driver completely removed fog from a level in one of the company's games in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2; apparently, this activity has gone on while the game is still in development. He also mentioned that he has seen drivers detect screen capture attempts and output higher-quality images than what's actually shown in-game.
Newell summed up his problem with bad benchmarks in a nice zinger line: "Our customers will be pissed."