So... what now?
We have several takeaways after considering our test data and talking with Nalasco and Petersen about these issues. One of the big ones: ensuring frame-rate smoothness is a new frontier in GPU "performance" that's only partially related to raw rendering speeds. Multi-GPU solutions are challenged on this front, but single-GPU graphics cards aren't entirely in the clear, either. New technologies and clever algorithms may be needed in order to conquer this next frontier. GPU makers have some work to do, especially if they wish to continue selling multi-GPU cards and configs as premium products.
Meanwhile, we have more work ahead of us in considering the impact of jitter on how we test and evaluate graphics hardware. Fundamentally, we need to measure what's being presented to the user via the display. We may have options there. Petersen told us Nvidia is considering creating an API that would permit third-party applications like Fraps to read display times from the GPU. We hope they do, and we'll lobby AMD to provide the same sort of hooks in its graphics drivers. Beyond that, high-speed cameras might prove useful in measuring what's happening onscreen with some precision. (Ahem. There's a statement that just cost us thousands of dollars and countless hours of work.)
Ultimately, though, the user experience should be paramount in any assessment of graphics solutions. For example, we still need to get a good read on a basic question: how much of a problem is micro-stuttering, really? (I'm thinking of the visual discontinuities caused by jitter, not the potential for longer frame times, which are easier to pinpoint.) The answer depends very much on user perception, and user perception will depend on the person involved, on his monitor type, and on the degree of the problem.
|Ultimately, though, the user experience should be paramount in any assessment of graphics solutions.|
Presumably, a jitter pattern alternating between five- and 15-millisecond frame times would be less of an annoyance than a 15- and 45-millisecond pattern. The worst example we saw in our testing alternated between roughly six and twenty milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem. We may need to spend more time with (ugh) faster TN panels, rather than our prettier and slower IPS displays, in order to get a better feel for the stuttering issue.
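As a rough illustration of why this sort of jitter hides inside an FPS average, here's a short sketch (not TR's actual tooling; the frame times are hypothetical, mimicking the alternating ~6/20-ms pattern described above) that quantifies jitter from a Fraps-style frame-time log as the average swing between consecutive frames:

```python
# Sketch: quantifying frame-time jitter from a Fraps-style log.
# Frame times below are hypothetical, chosen to mimic the alternating
# ~6/20 ms pattern described in the text.

def mean_jitter(frame_times_ms):
    """Average absolute difference between consecutive frame times."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

smooth = [13.0] * 8        # steady cadence, ~77 FPS
jittery = [6.0, 20.0] * 4  # same average frame time, alternating pattern

# Both sequences have the same mean frame time, so an FPS average
# reports them as identical...
assert sum(smooth) / len(smooth) == sum(jittery) / len(jittery)

# ...but the jitter metric separates them cleanly.
print(mean_jitter(smooth))   # 0.0
print(mean_jitter(jittery))  # 14.0
```

The point of the toy metric is simply that two cards can post identical FPS numbers while delivering very different frame-to-frame consistency.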
At the same time, we're very interested in getting reader feedback on this matter. If you have a multi-GPU setup, have you run into micro-stuttering problems? If so, how often do you see it, and how perceptible is it? Please let us know in the comments.
Although they've been somewhat overshadowed by the issues they've helped uncover, we're also cautiously optimistic about our proposed methods for measuring GPU performance in terms of frame times. Yes, Nvidia's frame metering technology complicates our use of Fraps data with SLI setups. But for single-GPU solutions, at least, we think our new methods, with their focus on frame latencies, offer some potentially valuable insights into real-world performance that traditional FPS measurements tend to miss. We'll probably have to change the way we review GPUs in the future as a result. These methods may be helpful in measuring CPU performance, as well. Again, we're curious to get some reader feedback about which measures make sense to use and how they might fit alongside more traditional FPS averages. Our sense is that once you've gone inside the second, it may be difficult to look at things the same way when you zoom back out again.
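To make the contrast with FPS averages concrete, here is a minimal sketch of what latency-focused metrics might look like when applied to a list of per-frame render times. The specific percentile and threshold choices are illustrative assumptions on our part, not a published methodology:

```python
# Sketch: latency-focused metrics from per-frame times, contrasted with
# a plain FPS average. The 99th percentile and 50-ms threshold are
# illustrative choices, not a fixed standard.

def fps_average(frame_times_ms):
    """Traditional average frame rate over the whole run."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_frame_time(frame_times_ms, pct=99):
    """Frame time that pct% of frames come in under."""
    s = sorted(frame_times_ms)
    idx = max(0, min(len(s) - 1, round(pct / 100.0 * len(s)) - 1))
    return s[idx]

def time_beyond(frame_times_ms, threshold_ms=50.0):
    """Total milliseconds spent past the threshold on long frames."""
    return sum(max(0.0, t - threshold_ms) for t in frame_times_ms)

# Hypothetical run: mostly smooth 16-ms frames with a few 60-ms spikes.
frames = [16.0] * 95 + [60.0] * 5

print(round(fps_average(frames), 1))        # ~54.9 FPS, looks healthy
print(percentile_frame_time(frames, 99))    # 60.0 ms, exposes the spikes
print(time_beyond(frames, 50.0))            # 50.0 ms stuck past 50 ms
```

The FPS average still looks respectable here, while the percentile and time-beyond-threshold numbers surface exactly the long frames a player would feel as hitching.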