In the wake of AMD’s Ryzen launch, I’ve been reading a lot of comments around the web about the inclusion or usefulness of the “minimum frame rate” in many reviewers’ results. Readers seem to think folks who report a minimum frame rate alongside an average are providing a more complete picture of gaming performance from these new CPUs. That may be true in a way, but the picture is still about as shallow as reporting a frames-per-second average on its own.
The problem is that a minimum frame rate—or any instantaneous frame-rate value—doesn’t tell us much at all about the overall gaming experience. A minimum frame rate indicates that somewhere in a test run, the graphics card rendered a frame in a certain number of milliseconds. It doesn’t tell us how many frames were produced in a similar amount of time, or how closely together they occurred. The way I see it, focusing on what might be one frame in a set of thousands places undue weight on an outlier.
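To make that concrete, here's a rough sketch in Python of how a "minimum frame rate" falls out of a frame-time log. The numbers are invented for illustration, not real benchmark data:

```python
# Hypothetical frame-time log in milliseconds -- invented data, not a real run.
frame_times = [16.2, 16.5, 17.1, 16.8, 55.0, 16.4, 16.9, 16.6, 17.0, 16.3]

# The "minimum frame rate" comes from the single worst (longest) frame time.
worst = max(frame_times)
min_fps = 1000.0 / worst          # instantaneous FPS of that one frame

# How many frames were anywhere near that bad? Here, just the one outlier.
near_worst = sum(1 for t in frame_times if t > 33.3)

print(round(min_fps, 1))          # the headline "minimum FPS" number
print(near_worst)                 # frames past 33.3 ms: 1
```

One 55-ms frame drags the reported minimum down to about 18 FPS, even though every other frame in this toy run landed near 16.7 ms. That single number says nothing about whether the hitch happened once or a hundred times.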
That said, a minimum frame rate value (or maximum frame time) and its friends might occur more than once in a test run. Anybody familiar with how The Tech Report presents frame-time data knows that we already have better ways of dealing with this problem. We set a number of crucial thresholds in our data-processing tools—50 ms, 33.3 ms, 16.7 ms, and 8.3 ms—and determine how long the graphics card spent on frames that took longer than those values to render. Those values correspond to instantaneous frame rates of 20 FPS, 30 FPS, 60 FPS, and 120 FPS.
If even a handful of milliseconds start pouring into our 50-ms bucket, we know that the system is struggling to run a game at times, and it’s likely that the end user will notice severe roughness in their gameplay experience if lots of frames take longer than that to render. Too much time spent on frames that take more than 33.3 ms to render means that a system running with traditional vsync on will start running into equally ugly hitches and stutters. Ideally, we want to see a system spend as little time as possible past 16.7 ms rendering frames, and too much time spent past 8.3 ms is starting to become an important consideration for gamers with high-refresh-rate monitors and powerful graphics cards.
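The bucket tally described above can be sketched in a few lines of Python. I'm assuming one plausible accounting here—summing the portion of each frame time in excess of the threshold—and the frame times are made up for the example:

```python
# "Time spent beyond X" sketch, assuming the metric sums the milliseconds
# each frame spent past the threshold (one plausible accounting).
THRESHOLDS_MS = [50.0, 33.3, 16.7, 8.3]   # 20, 30, 60, and 120 FPS

def time_beyond(frame_times_ms, threshold_ms):
    """Total milliseconds spent past a threshold across a test run."""
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

frame_times = [16.2, 40.0, 16.5, 60.0, 16.8]   # invented example data
for th in THRESHOLDS_MS:
    print(th, round(time_beyond(frame_times, th), 1))
```

In this toy run, only the 60-ms frame contributes to the 50-ms bucket, while the 40-ms frame starts showing up at 33.3 ms—exactly the kind of breakdown a lone minimum-FPS figure can't give you.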
In tandem with these buckets, our frame-time plots tell us where and how often major slowdowns occurred, and whether they occurred in close succession. On these charts, low and flat-looking lines are best. Lots of humps, spikes, or too much “furriness” indicates a sub-optimal experience. If we see just one or two spikes over an entire test run, it’s likely that the experience was still enjoyable, but flat lines are still best. In our recent Ryzen review, all of the CPUs delivered a solid gaming experience for the most part—some were just faster while doing it.
Enter the histogram with Crysis 3
Even with all of those data-visualization methods at our disposal, there may still be useful ways to plot frame times that we haven’t yet considered when thinking about minimum frame rates. While I was musing on the characteristics of these rates on Twitter, a TR reader asked me to consider whether a histogram would be a good visualization of the minimum-frame-rate (or maximum-frame-time, if you will) problem. Histograms are a natural fit for this type of analysis, since my primary complaint about a minimum frame rate value alone is that it doesn’t tell us how many times such an event occurred during a test run. I figured it was worth a shot, so I started crunching numbers and inserting graphs in Excel.
Here are a couple of histograms of test runs from Crysis 3 in our Ryzen review, from the Ryzen 7 1800X and the Core i7-7700K. These chips were closely matched in Crysis 3. I’ve converted raw frame times into instantaneous frame-rate values (not averages over a second) so the left-to-right ordering makes sense.
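The conversion and binning behind charts like these is straightforward. Here's a minimal Python sketch with invented frame times (we built the actual charts in Excel), using 10-FPS-wide buckets as one arbitrary choice:

```python
# Invented per-frame render times in milliseconds, for illustration only.
frame_times_ms = [16.2, 16.5, 33.9, 16.8, 12.1, 16.4, 54.0, 16.6]

# Instantaneous frame rate of each individual frame, not a per-second average.
fps_values = [1000.0 / t for t in frame_times_ms]

# Bin into 10-FPS-wide buckets; the count in each bucket shows how often
# each rate occurred, which a lone minimum-FPS number cannot.
def histogram(values, width=10.0):
    buckets = {}
    for v in values:
        lo = int(v // width) * width   # lower edge of this value's bucket
        buckets[lo] = buckets.get(lo, 0) + 1
    return dict(sorted(buckets.items()))

print(histogram(fps_values))
```

In this toy data, most frames pile up in the 60-FPS bucket, while the 54-ms hitch contributes a single count way out in the 10-FPS bucket—the "minimum frame rate" frame, visible in context.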
In both cases, we can see that only a couple frames out of thousands would qualify as “minimum frame rate” frames, and it seems highly unlikely that a gamer would care about their contribution to the overall picture. Our “time-spent-beyond-X” graphs for Crysis 3 back up that analysis.
We can pick over these histograms quite a bit, but they tell the same basic story as our 99th-percentile and average-FPS charts from Crysis 3. I’m not actually sure how valuable it is to present this histogram information, because it seems easy to draw the wrong conclusions from these charts without familiarity with frame-time testing.
Still, it’s clear that we want more and taller bars toward the middle or right of the chart, as demonstrated by the FX-8370’s Crysis 3 performance.
Another look at Grand Theft Auto V
Crysis 3 proved good for both the Core i7-7700K and the Ryzen 7 1800X, but Grand Theft Auto V wasn’t so kind. This less-threaded game opened up a considerable performance gap between the two chips, so we wanted to re-examine its performance with histograms to see what that looked like.
That’s interesting, we think. The Core i7-7700K produces a nice, fat dromedarian hump with most of its frames clustered to the right of the chart, while the bactrian Ryzen 7 1800X exhibits a seemingly more bimodal distribution. Not only is the Core i7-7700K faster, but its frame delivery is more consistent—and our frame-time data bears that out.
At the end of the day, I don’t think it’s worth putting too much stock in minimum frame rates. Our histogram analysis lets us see that they’re extreme outliers that might not contribute more than a couple frames (if that) to the overall picture. We already have much better tools to make conclusions about component performance in the 99th-percentile frame time, frame-times-by-percentile, and plots of frame-time data that we present. We might start including frame-rate or frame-time histograms in our future reviews, however, because hey, they’re interesting. Let me know what you think.
Oh, and yeah. Average FPS continues to be terrible. You can take that to the bank.