As you know if you have been following graphics hardware over the past six months or so, there is a rich thicket of issues surrounding benchmarking, driver optimizations, and next-gen graphics hardware. I will not attempt to bring you up to speed on all of them here. I believe Newell's presentation summarized the relevant issues nicely and stated Valve's position with little ambiguity, so I'll present my pictures of those slides with some commentary as necessary.
The context of Newell's remarks was ATI's "Shader Days" event. Select members of the press spent the day taking in presentations by employees of ATI, Microsoft, and SoftImage about the latest developments in DirectX 9-class shaders, with an emphasis on pixel shader hardware. The briefings were full of useful information about how innovations made possible by DirectX 9-class programmable graphics hardware promise to vastly improve visual fidelity in games and other interactive apps without compromising performance. The capstone of the day's presentations was Newell's talk, which was accompanied by a stunning demonstration of Half-Life 2's Source engine in action.
Newell began by establishing Half-Life 2's credentials as a true DirectX 9 app.
Newell then talked about his consternation over techniques IHVs have used to skew benchmark results, a problem he considers serious because it threatens the gaming experience for Valve's customers.
As you can tell from the list in the slide above, Newell was particularly concerned with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said Valve had seen cases where the graphics driver completely removed fog from a level in one of Valve's games in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempts and output higher-quality data than what's actually shown in-game.
Newell summed up his problem with bad benchmarks in a nice zinger line: "Our customers will be pissed."