In the wake of AMD's Ryzen launch, I've been reading a lot of comments around the web about the inclusion or usefulness of the "minimum frame rate" in many reviewers' results. Readers seem to think folks who report a minimum frame rate alongside an average are providing a more complete picture of gaming performance from these new CPUs. That may be true in a way, but the picture is still about as shallow as reporting a frames-per-second average on its own.
The problem is that a minimum frame rate—or any instantaneous frame-rate value—doesn't tell us much at all about the overall gaming experience. A minimum frame rate indicates that somewhere in a test run, the graphics card rendered a frame in a certain number of milliseconds. It doesn't tell us how many frames were produced in a similar amount of time, or how closely together they occurred. The way I see it, focusing on what might be one frame in a set of thousands places undue value on an outlier.
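To see why a single slow frame dominates the "minimum frame rate" while barely moving the average, consider a quick sketch (the frame times here are made up purely for illustration):

```python
# Hypothetical frame times (ms): a steady 16.7-ms stream with one 80-ms outlier.
frame_times = [16.7] * 999 + [80.0]

# Average FPS over the whole run: total frames / total seconds.
avg_fps = 1000 * len(frame_times) / sum(frame_times)

# The "minimum frame rate" is set entirely by the single worst frame.
min_fps = 1000 / max(frame_times)

print(f"average: {avg_fps:.1f} FPS, minimum: {min_fps:.1f} FPS")
```

One frame out of a thousand drags the reported minimum down to 12.5 FPS, even though the run averaged nearly 60 FPS and would have felt almost perfectly smooth.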
That said, a minimum frame rate value (or maximum frame time) and its friends might occur more than once in a test run. Anybody familiar with how The Tech Report presents frame-time data knows that we already have better ways of dealing with this problem. We set a number of crucial thresholds in our data-processing tools—50 ms, 33.3 ms, 16.7 ms, and 8.3 ms—and determine how long the graphics card spent on frames that took longer than those values to render. Those values correspond to instantaneous frame rates of 20 FPS, 30 FPS, 60 FPS, and 120 FPS.
If even a handful of milliseconds start pouring into our 50-ms bucket, we know that the system is struggling to run a game at times, and it's likely that the end user will notice severe roughness in their gameplay experience if lots of frames take longer than that to render. Too much time spent on frames that take more than 33.3 ms to render means that a system running with traditional vsync on will start running into equally ugly hitches and stutters. Ideally, we want to see a system spend as little time as possible past 16.7 ms rendering frames, and too much time spent past 8.3 ms is starting to become an important consideration for gamers with high-refresh-rate monitors and powerful graphics cards.
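The bucket accounting above can be sketched in a few lines of Python. This is only an illustration with hypothetical frame times, not our actual data-processing tools, and it assumes the metric sums only the portion of each frame time past the cutoff:

```python
# Hypothetical frame times in milliseconds for a short test run.
frame_times = [16.7, 14.2, 35.0, 16.9, 52.1, 16.5, 17.0]

# Thresholds in ms, corresponding to 20, 30, 60, and 120 FPS.
thresholds = [50.0, 33.3, 16.7, 8.3]

# For each threshold, accumulate the time spent past the cutoff on
# frames that took longer than that threshold to render.
time_beyond = {
    t: sum(ft - t for ft in frame_times if ft > t) for t in thresholds
}

for t in thresholds:
    print(f"time spent beyond {t} ms: {time_beyond[t]:.1f} ms")
```

In this toy run, the lone 52.1-ms frame is the only contribution to the 50-ms bucket, while the 33.3-ms and 16.7-ms buckets accumulate progressively more time as the slower frames pile up.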
In tandem with these buckets, our frame-time plots tell us where and how often major slowdowns occurred, and whether they occurred in close succession. On these charts, low and flat-looking lines are best. Lots of humps, spikes, or too much "furriness" indicate a sub-optimal experience. If we see just one or two spikes over an entire test run, it's likely that the experience was still enjoyable, but flat lines are still best. In our recent Ryzen review, all of the CPUs delivered a solid gaming experience for the most part—some were just faster while doing it.
Enter the histogram with Crysis 3
Even with all of those data-visualization methods at our disposal, there may still be useful ways to plot frame times that we haven't yet considered when thinking about minimum frame rates. While I was musing on the characteristics of these rates on Twitter, a TR reader asked me to consider whether a histogram would be a good visualization of the minimum-frame-rate (or maximum-frame-time, if you will) problem. Histograms are a natural fit for this type of analysis, since my primary complaint about a minimum frame rate value alone is that it doesn't tell us how many times such an event occurred during a test run. I figured it was worth a shot, so I started crunching numbers and inserting graphs in Excel.
Here are a couple of histograms of test runs from Crysis 3 in our Ryzen review, from the Ryzen 7 1800X and the Core i7-7700K. These chips were closely matched in Crysis 3. I've converted raw frame times into instantaneous frame-rate values (not averages over a second) so the left-to-right ordering makes sense.
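For the curious, here's a rough sketch of how raw frame times can be converted into instantaneous frame rates and binned for a histogram. The data and the 15-FPS bin width are hypothetical choices for illustration; our actual charts were built in Excel:

```python
import collections

# Hypothetical raw frame times in milliseconds.
frame_times = [16.7, 14.2, 35.0, 16.9, 8.1, 16.5, 80.0, 17.0]

# Bin width in FPS; an arbitrary choice for this example.
bin_width = 15

# Convert each frame time to an instantaneous frame rate (1000 / ms),
# then snap it down to the lower edge of its bin and count occurrences.
counts = collections.Counter(
    int(1000 / ft) // bin_width * bin_width for ft in frame_times
)

for fps_bin in sorted(counts):
    print(f"{fps_bin}-{fps_bin + bin_width} FPS: {counts[fps_bin]} frames")
```

The lone 80-ms frame lands by itself in the lowest bucket, which is exactly the sort of isolated outlier that a bare "minimum frame rate" number blows out of proportion.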
In both cases, we can see that only a couple frames out of thousands would qualify as "minimum frame rate" frames, and it seems highly unlikely that a gamer would care about their contribution to the overall picture. Our "time-spent-beyond-X" graphs for Crysis 3 back up that analysis.
We can pick over these histograms quite a bit, but they tell the same basic story as our 99th-percentile and average-FPS charts from Crysis 3. I'm not actually sure how valuable it is to present this histogram information, because it seems easy to draw the wrong idea from these charts without familiarity with frame-time testing.
Still, it's clear that we want more and taller bars toward the middle or right of the chart, as demonstrated by the FX-8370's Crysis 3 performance.