
So... what now?
We have several takeaways after considering our test data and talking with Nalasco and Petersen about these issues. One of the big ones: ensuring frame-rate smoothness is a new frontier in GPU "performance" that's only partially related to raw rendering speeds. Multi-GPU solutions are challenged on this front, but single-GPU graphics cards aren't entirely in the clear, either. New technologies and clever algorithms may be needed in order to conquer this next frontier. GPU makers have some work to do, especially if they wish to continue selling multi-GPU cards and configs as premium products.

Meanwhile, we have more work ahead of us in considering the impact of jitter on how we test and evaluate graphics hardware. Fundamentally, we need to measure what's being presented to the user via the display. We may have options there. Petersen told us Nvidia is considering creating an API that would permit third-party applications like Fraps to read display times from the GPU. We hope they do, and we'll lobby AMD to provide the same sort of hooks in its graphics drivers. Beyond that, high-speed cameras might prove useful in measuring what's happening onscreen with some precision. (Ahem. There's a statement that just cost us thousands of dollars and countless hours of work.)
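Until GPU makers expose true display times, Fraps-style data is what we have to work with. As a rough sketch of the kind of analysis involved: given a log of cumulative per-frame timestamps (the column layout and file format here are assumptions, loosely modeled on Fraps' frametimes capture), per-frame latencies and a simple jitter figure fall out directly.

```python
# Sketch: deriving per-frame latencies and a jitter figure from a
# frametimes log. Assumes a CSV of cumulative timestamps in
# milliseconds, one row per frame (column layout is an assumption).
import csv

def frame_times(path):
    """Return per-frame latencies (ms) from cumulative timestamps."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        stamps = [float(row[1]) for row in reader]
    # Latency of frame N is the gap between consecutive timestamps.
    return [b - a for a, b in zip(stamps, stamps[1:])]

def jitter(latencies):
    """Mean absolute change between consecutive frame times (ms)."""
    deltas = [abs(b - a) for a, b in zip(latencies, latencies[1:])]
    return sum(deltas) / len(deltas)
```

The key caveat, as discussed above, is that these timestamps come from the point where frames are handed to the API, not from the display, so frame metering and buffering can make them diverge from what the user actually sees.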

Ultimately, though, the user experience should be paramount in any assessment of graphics solutions. For example, we still need to get a good read on a basic question: how much of a problem is micro-stuttering, really? (I'm thinking of the visual discontinuities caused by jitter, not the potential for longer frame times, which are easier to pinpoint.) The answer depends very much on user perception, and user perception will vary with the person involved, the monitor type, and the degree of the problem.


Presumably, a jitter pattern alternating between five- and 15-millisecond frame times would be less of an annoyance than a 15- and 45-millisecond pattern. The worst example we saw in our testing alternated between roughly six and twenty milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem. We may need to spend more time with (ugh) faster TN panels, rather than our prettier and slower IPS displays, in order to get a better feel for the stuttering issue.
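The numbers behind that intuition are worth spelling out. An alternating pattern's FPS average is set by the mean of the two frame times, but the pacing the user sees is arguably no better than a steady stream of the slower frames. A small worked sketch (the pattern values are illustrative, not measured):

```python
def fps_from_pattern(a_ms, b_ms):
    """For a hypothetical jitter pattern alternating between a_ms and
    b_ms frame times, return (average FPS, FPS implied by the slower
    frame). The gap between the two hints at how misleading the
    average can be."""
    avg_fps = 1000.0 / ((a_ms + b_ms) / 2.0)
    worst_fps = 1000.0 / max(a_ms, b_ms)
    return avg_fps, worst_fps

# A 5/15 ms pattern averages 100 FPS, but its slow frames pace out
# like a steady 66.7 FPS. A 15/45 ms pattern averages 33.3 FPS while
# delivering its slow frames at only 22.2 FPS, in territory where
# stutter is far more likely to be visible.
```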

At the same time, we're very interested in getting reader feedback on this matter. If you have a multi-GPU setup, have you run into micro-stuttering problems? If so, how often do you see it, and how perceptible is it? Please let us know in the comments.

Although they've been somewhat overshadowed by the issues they've helped uncover, we're also cautiously optimistic about our proposed methods for measuring GPU performance in terms of frame times. Yes, Nvidia's frame metering technology complicates our use of Fraps data with SLI setups. But for single-GPU solutions, at least, we think our new methods, with their focus on frame latencies, offer some potentially valuable insights into real-world performance that traditional FPS measurements tend to miss. We'll probably have to change the way we review GPUs in the future as a result. These methods may be helpful in measuring CPU performance, as well. Again, we're curious to get some reader feedback about which measures make sense to use and how they might fit alongside more traditional FPS averages. Our sense is that once you've gone inside the second, it may be difficult to look at things the same way when you zoom back out again. TR
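To make the contrast concrete, here is a minimal sketch of one latency-focused measure set against the traditional average. The percentile approach is one plausible way to summarize frame latencies; the specific function names and the 99th-percentile choice are our illustration, not a settled methodology.

```python
def avg_fps(latencies_ms):
    """The traditional summary: total frames over total time."""
    return 1000.0 * len(latencies_ms) / sum(latencies_ms)

def percentile_frame_time(latencies_ms, pct=99):
    """A latency-focused summary: the frame time (ms) under which
    pct percent of frames complete. High values flag the slow
    frames an FPS average glosses over."""
    ordered = sorted(latencies_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

# Example: 99 frames at 10 ms plus one 50 ms hitch still averages
# about 96 FPS, yet the 99th-percentile frame time of 50 ms makes
# the hitch impossible to miss.
```

The point of pairing the two is that a run can look excellent by the first measure and poor by the second, which is exactly the behavior FPS averages tend to hide.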
