Hey, guys. I've not written many Etc. posts lately, mostly because I had a long string of back-to-back deadlines over the past couple of months. Sorry about that.
I'm a little more chipper than usual today, since this week has been one of the greatest sports weeks in my native Kansas City's history. I still can't believe that game last night. It's been 29 years since the Royals were last in the post-season, and we got all the drama we'd been missing out of a single game.
In other news, I don't know how I missed this until now, but thanks to Robin Bradley for pointing it out. An engineer at Nvidia named Iain Cantlay wrote a blog post about how Nvidia uses frame time percentiles to evaluate game performance and optimizations. From the post:
Although stutter in games has always been a problem, my own approach to measuring and fixing stutter changed completely, back in 2011, when The Tech Report’s Inside the Second article introduced me to percentiles. Thank you, Scott Wasson! Since then, the use of frame-time percentiles has grown and spread within NVIDIA. We now have a tool to help compute percentiles and we measure stuttering as a matter of course when testing new games.
Percentiles are not the only way to measure stutter and debate has raged many times, here at NVIDIA, about the best approach. But I consistently find that percentiles are easy to measure; they have an intuitive interpretation; they can be used to quantitatively compare different results; and – perhaps most importantly – they always match my subjective game-play experience.
That's humbling, and quite the endorsement from inside one of the two leading GPU firms.
Notably, Cantlay goes on to explain how he uses percentiles, what they can and cannot do for you, and how to avoid some pitfalls when dealing with frame-time distributions. He covers a range of approaches to analyzing frame-time data and producing a meaningful interpretation, some of which involve normalization and ratios of frame-time differences. Many folks have suggested similar approaches to us since we started down this path. Ultimately, Cantlay comes to an important conclusion: "Absolute values matter more when measuring frame time."
That's something I've been saying for a while now. In a real-time system, time itself is what we want to track, not just the shape of the distribution. I'm happy to see that other folks get it!
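To make the idea concrete, here's a minimal sketch of the metric in question: a percentile of raw frame times, computed with the simple nearest-rank method. The frame times below are made-up sample data, and this is an illustration of the general technique, not Cantlay's actual tool.

```python
def percentile(values, pct):
    """Return the pct-th percentile of values using the nearest-rank method."""
    ordered = sorted(values)
    # Nearest rank: the smallest value with at least pct% of samples at or below it.
    rank = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[rank]

# Hypothetical per-frame render times in milliseconds, in the style of a
# Fraps frametimes log. One 45 ms hitch hides among smooth ~16.7 ms frames.
frame_times_ms = [16.7, 16.5, 17.1, 16.8, 45.2, 16.6, 16.9, 17.0, 16.4, 16.7]

print(f"Median frame time: {percentile(frame_times_ms, 50):.1f} ms")
print(f"99th-percentile frame time: {percentile(frame_times_ms, 99):.1f} ms")
```

The average and median look fine here, but the 99th percentile immediately surfaces the 45 ms hitch, and because it's an absolute frame time in milliseconds, you can compare it directly against a 16.7 ms (60 Hz) or 33.3 ms (30 Hz) budget.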
For a while, I've thought that I should package up some of our tools, spreadsheets, and a sample data set so the wider world can see how we do things. Finding the time has been difficult, but happily, Cantlay has made the effort to package up his own spreadsheet, a percentile tool for processing Fraps data, and some example data. The download links are provided at the end of his blog post. If you do any GPU benchmarking—or heck, any testing of CPU gaming performance—you owe it to yourself to check out these resources.