So... what now?
We have several takeaways after considering our test data and talking with Nalasco and Petersen about these issues. One of the big ones: ensuring frame-rate smoothness is a new frontier in GPU "performance" that's only partially related to raw rendering speeds. Multi-GPU solutions are challenged on this front, but single-GPU graphics cards aren't entirely in the clear, either. New technologies and clever algorithms may be needed in order to conquer this next frontier. GPU makers have some work to do, especially if they wish to continue selling multi-GPU cards and configs as premium products.
Meanwhile, we have more work ahead of us in considering the impact of jitter on how we test and evaluate graphics hardware. Fundamentally, we need to measure what's being presented to the user via the display. We may have options there. Petersen told us Nvidia is considering creating an API that would permit third-party applications like Fraps to read display times from the GPU. We hope they do, and we'll lobby AMD to provide the same sort of hooks in its graphics drivers. Beyond that, high-speed cameras might prove useful in measuring what's happening onscreen with some precision. (Ahem. There's a statement that just cost us thousands of dollars and countless hours of work.)
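Until the GPU makers expose display times directly, per-frame intervals have to be derived from Fraps-style logs. A minimal sketch of that derivation (the list of cumulative millisecond timestamps is illustrative, not Fraps' actual file format):

```python
def frame_times(timestamps_ms):
    """Convert cumulative frame timestamps (ms) into per-frame intervals (ms)."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def jitter(intervals_ms):
    """Frame-to-frame swing: absolute change between consecutive intervals."""
    return [abs(b - a) for a, b in zip(intervals_ms, intervals_ms[1:])]

# Example: timestamps produced by an alternating 5/15 ms pattern
stamps = [0, 5, 20, 25, 40, 45, 60]
print(frame_times(stamps))          # [5, 15, 5, 15, 5, 15]
print(jitter(frame_times(stamps)))  # [10, 10, 10, 10, 10]
```

The catch, as Petersen's comments imply, is that these numbers reflect when frames were handed off by the game, not when they reached the display.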
Ultimately, though, the user experience should be paramount in any assessment of graphics solutions. For example, we still need to get a good read on a basic question: how much of a problem is micro-stuttering, really? (I'm thinking of the visual discontinuities caused by jitter, not the potential for longer frame times, which are easier to pinpoint.) The answer depends very much on user perception, and user perception will depend on the person involved, on his monitor type, and on the degree of the problem.
Presumably, a jitter pattern alternating between five- and 15-millisecond frame times would be less of an annoyance than a 15- and 45-millisecond pattern. The worst example we saw in our testing alternated between roughly six and 20 milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem. We may need to spend more time with (ugh) faster TN panels, rather than our prettier and slower IPS displays, in order to get a better feel for the stuttering issue.
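Part of what makes these patterns insidious is that an FPS average can't distinguish them: a steady stream and a badly jittered one can produce the exact same number. A quick check, using the 5/15-millisecond pattern above (pure arithmetic, nothing vendor-specific):

```python
def avg_fps(intervals_ms):
    """Average FPS over a run of per-frame intervals in milliseconds."""
    return 1000.0 * len(intervals_ms) / sum(intervals_ms)

steady = [10.0] * 100        # a flat 10 ms per frame
jittered = [5.0, 15.0] * 50  # alternating 5/15 ms pattern

print(avg_fps(steady))    # 100.0
print(avg_fps(jittered))  # 100.0 -- same average, very different delivery
print(max(steady), max(jittered))  # 10.0 vs. 15.0: the worst case differs
```

Both runs report 100 FPS, yet one delivers every third frame late relative to its neighbors.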
At the same time, we're very interested in getting reader feedback on this matter. If you have a multi-GPU setup, have you run into micro-stuttering problems? If so, how often do you see it, and how perceptible is it? Please let us know in the comments.
Although they've been a little overshadowed by the issues they've helped uncover, we're also cautiously optimistic about our proposed methods for measuring GPU performance in terms of frame times. Yes, Nvidia's frame metering technology complicates our use of Fraps data with SLI setups. But for single-GPU solutions, at least, we think our new methods, with their focus on frame latencies, offer some potentially valuable insights into real-world performance that traditional FPS measurements tend to miss. We'll probably have to change the way we review GPUs in the future as a result. These methods may be helpful in measuring CPU performance, as well. Again, we're curious to get some reader feedback about which measures make sense to use and how they might fit alongside more traditional FPS averages. Our sense is that once you've gone inside the second, it may be difficult to look at things the same way when you zoom back out again.
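Latency-focused summaries of the sort described here might take the shape of a high-percentile frame time plus a "time spent beyond a threshold" figure. A sketch under those assumptions (the 99th percentile and 50-ms cutoff are illustrative choices, not a settled methodology):

```python
def percentile_frame_time(intervals_ms, pct=99):
    """Frame time at the given percentile (nearest-rank method)."""
    ordered = sorted(intervals_ms)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def time_beyond(intervals_ms, threshold_ms=50.0):
    """Total milliseconds spent past the threshold -- time the player waited."""
    return sum(t - threshold_ms for t in intervals_ms if t > threshold_ms)

frames = [16.0] * 95 + [60.0] * 5  # mostly smooth, with a few long frames
print(percentile_frame_time(frames))  # 60.0
print(time_beyond(frames))            # 50.0 (five frames, each 10 ms over)
```

Unlike an average, both numbers are dominated by the slowest frames, which is exactly where the perceived hitches live.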