So... what now?
We have several takeaways after considering our test data and talking with Nalasco and Petersen about these issues. One of the big ones: ensuring frame-rate smoothness is a new frontier in GPU "performance" that's only partially related to raw rendering speeds. Multi-GPU solutions are challenged on this front, but single-GPU graphics cards aren't entirely in the clear, either. New technologies and clever algorithms may be needed in order to conquer this next frontier. GPU makers have some work to do, especially if they wish to continue selling multi-GPU cards and configs as premium products.
Meanwhile, we have more work ahead of us in considering the impact of jitter on how we test and evaluate graphics hardware. Fundamentally, we need to measure what's being presented to the user via the display. We may have options there. Petersen told us Nvidia is considering creating an API that would permit third-party applications like Fraps to read display times from the GPU. We hope they do, and we'll lobby AMD to provide the same sort of hooks in its graphics drivers. Beyond that, high-speed cameras might prove useful in measuring what's happening onscreen with some precision. (Ahem. There's a statement that just cost us thousands of dollars and countless hours of work.)
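To illustrate why the measurement point matters, here's a hypothetical sketch (the numbers and the metering model are our own simplification, not Nvidia's actual algorithm) of how frame metering can make Present-call timestamps, which tools like Fraps record, diverge from the times frames actually reach the display:

```python
def frame_time_deltas(timestamps_ms):
    """Frame-to-frame intervals from a list of timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

# Hypothetical AFR pair: Present calls arrive in a jittery 5/25-ms pattern...
present_times = [0, 5, 30, 35, 60, 65, 90]
# ...but a metering delay evens things out by the time frames hit the display.
display_times = [0, 15, 30, 45, 60, 75, 90]

fraps_view = frame_time_deltas(present_times)    # [5, 25, 5, 25, 5, 25]
display_view = frame_time_deltas(display_times)  # [15, 15, 15, 15, 15, 15]
```

A tool reading only Present times would report severe jitter here, even though the display sees a perfectly even cadence; the reverse mismatch is equally possible. That gap is exactly why hooks exposing true display times would be valuable.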
Ultimately, though, the user experience should be paramount in any assessment of graphics solutions. For example, we still need to get a good read on a basic question: how much of a problem is micro-stuttering, really? (I'm thinking of the visual discontinuities caused by jitter, not the potential for longer frame times, which are easier to pinpoint.) The answer depends very much on user perception, and user perception will depend on the person involved, on his monitor type, and on the degree of the problem.
Presumably, a jitter pattern alternating between five- and 15-millisecond frame times would be less of an annoyance than a 15- and 45-millisecond pattern. The worst example we saw in our testing alternated between roughly six and twenty milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem. We may need to spend more time with (ugh) faster TN panels, rather than our prettier and slower IPS displays, in order to get a better feel for the stuttering issue.
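As a rough way to reason about how jarring a given pattern might be, one can compare the FPS average against the frame-to-frame swing and the slowest frame in the cycle. A minimal sketch (the metrics are our own illustration, not a validated perceptual model):

```python
def jitter_profile(short_ms, long_ms):
    """Summarize an alternating (short, long) frame-time pattern."""
    avg_ms = (short_ms + long_ms) / 2.0
    return {
        "avg_fps": 1000.0 / avg_ms,      # what a plain FPS average reports
        "delta_ms": long_ms - short_ms,  # frame-to-frame swing in milliseconds
        "worst_ms": long_ms,             # the slow frame bounds perceived smoothness
    }

mild = jitter_profile(5, 15)    # averages 100 FPS with a 10-ms swing
harsh = jitter_profile(15, 45)  # averages ~33 FPS with a 30-ms swing
```

The two patterns share the same 3:1 ratio, but the 15/45-ms case pairs a lower average with a three-times-larger swing and a 45-ms worst frame, which is presumably why it would be the more noticeable of the two.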
At the same time, we're very interested in getting reader feedback on this matter. If you have a multi-GPU setup, have you run into micro-stuttering problems? If so, how often do you see it and how perceptible is it? Please let us know in the comments.
Although they've been a little bit overshadowed by the issues they've helped uncover, we're also cautiously optimistic about our proposed methods for measuring GPU performance in terms of frame times. Yes, Nvidia's frame metering technology complicates our use of Fraps data with SLI setups. But for single-GPU solutions, at least, we think our new methods, with their focus on frame latencies, offer some potentially valuable insights into real-world performance that traditional FPS measurements tend to miss. We'll probably have to change the way we review GPUs in the future as a result. These methods may be helpful in measuring CPU performance, as well. Again, we're curious to get some reader feedback about which measures make sense to use and how they might fit alongside more traditional FPS averages. Our sense is that once you've gone inside the second, it may be difficult to look at things the same way when you zoom back out again.
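To make the contrast concrete, here's a minimal sketch (the frame-time data and the 50-ms threshold are our own illustration, not results from our testing) of how a latency-focused summary can flag a problem that an FPS average hides:

```python
def summarize(frame_times_ms):
    """Contrast a traditional FPS average with latency-focused metrics."""
    times = sorted(frame_times_ms)
    avg_fps = 1000.0 * len(times) / sum(times)
    # 99th-percentile frame time: all but the slowest 1% of frames beat this.
    p99_ms = times[min(len(times) - 1, int(len(times) * 0.99))]
    # Total time spent on the portion of frames beyond a 50-ms threshold.
    beyond_50_ms = sum(t - 50 for t in times if t > 50)
    return avg_fps, p99_ms, beyond_50_ms

# A mostly smooth run with occasional long frames: the average looks playable,
# but the latency-focused metrics expose the spikes.
frames = [16] * 95 + [80] * 5
avg_fps, p99_ms, beyond_50_ms = summarize(frames)  # ~52 FPS, 80 ms, 150 ms
```

The run averages a healthy-looking 52 FPS, yet one frame in twenty takes 80 ms, and 150 ms of the run is spent grinding through those hitches. An FPS average alone reports none of that.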