As you know if you have been following graphics hardware over the past six months or so, there is a rich thicket of issues surrounding benchmarking, driver optimizations, and next-gen graphics hardware. I will not attempt to bring you up to speed on all of them here. I believe Newell's presentation summarized the relevant issues nicely and stated Valve's position with little ambiguity, so I'll present my pictures of those slides with some commentary as necessary.
The context of Newell's remarks was ATI's "Shader Days" event. Select members of the press spent the day there taking in presentations by employees of ATI, Microsoft, and SoftImage about the latest developments in DirectX 9-class shaders, with an emphasis on pixel shader hardware. The briefings were full of useful information about how innovations made possible by DirectX 9-class programmable graphics hardware promise to vastly improve visual fidelity in games and other interactive apps without compromising performance. The capstone of the day's presentations was Newell's talk, which was accompanied by a stunning demonstration of Half-Life 2's Source engine in action.
Newell began by establishing Half-Life 2's credentials as a true DirectX 9 app.
Newell then talked about his consternation over techniques IHVs have used to skew benchmark results, a problem he considers serious, because it threatens the gaming experience for Valve's customers.
As you can tell from looking at the list in the slide above, Newell was particularly concerned with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said Valve had seen cases where graphics driver software completely removed fog from a level in one of Valve's games in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempts and output higher-quality data than what's actually shown in-game.
Newell summed up his problem with bad benchmarks in a nice zinger line: "Our customers will be pissed."