As you know if you have been following graphics hardware over the past six months or so, there is a rich thicket of issues surrounding benchmarking, driver optimizations, and next-gen graphics hardware. I will not attempt to bring you up to speed on all of them here. I believe Newell's presentation summarized the relevant issues nicely and stated Valve's position with little ambiguity, so I'll present my pictures of those slides with some commentary as necessary.
The context of Newell's remarks was ATI's "Shader Days" event. Select members of the press spent the day taking in presentations by employees of ATI, Microsoft, and SoftImage on the latest developments in DirectX 9-class shaders, with an emphasis on pixel shader hardware. The briefings were full of useful information about how innovations made possible by DirectX 9-class programmable graphics hardware promise to vastly improve visual fidelity in games and other interactive apps without compromising performance. The capstone of the day's presentations was Newell's talk, which was accompanied by a stunning demonstration of Half-Life 2's Source engine in action.
Newell began by establishing Half-Life 2's credentials as a true DirectX 9 app.
Newell then talked about his consternation over techniques IHVs have used to skew benchmark results, a problem he considers serious because it threatens the gaming experience of Valve's customers.
As you can tell from the list in the slide above, Newell was particularly concerned with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said Valve had seen cases where graphics driver software completely removed fog from a level in one of Valve's games in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempts and output higher-quality data than what's actually shown in-game.
Newell summed up his problem with bad benchmarks in a nice zinger line: "Our customers will be pissed."