So... what now?
We have several takeaways after considering our test data and talking with Nalasco and Petersen about these issues. One of the big ones: ensuring frame-rate smoothness is a new frontier in GPU "performance" that's only partially related to raw rendering speeds. Multi-GPU solutions are challenged on this front, but single-GPU graphics cards aren't entirely in the clear, either. New technologies and clever algorithms may be needed in order to conquer this next frontier. GPU makers have some work to do, especially if they wish to continue selling multi-GPU cards and configs as premium products.
Meanwhile, we have more work ahead of us in considering the impact of jitter on how we test and evaluate graphics hardware. Fundamentally, we need to measure what's being presented to the user via the display. We may have options there. Petersen told us Nvidia is considering creating an API that would permit third-party applications like Fraps to read display times from the GPU. We hope they do, and we'll lobby AMD to provide the same sort of hooks in its graphics drivers. Beyond that, high-speed cameras might prove useful in measuring what's happening onscreen with some precision. (Ahem. There's a statement that just cost us thousands of dollars and countless hours of work.)
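Until such an API exists, the raw material for this sort of analysis is the per-frame timestamp log Fraps already writes. Here's a minimal sketch of turning those cumulative timestamps into per-frame times; the exact column layout is an assumption based on Fraps' "frametimes" CSV output, so treat it as illustrative:

```python
import csv
import io

def frame_times_from_fraps(csv_text):
    """Turn Fraps' cumulative 'Time (ms)' timestamps into per-frame durations."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip the "Frame, Time (ms)" header row
    stamps = [float(row[1]) for row in reader if row]
    # A frame's time is the gap between consecutive timestamps.
    return [b - a for a, b in zip(stamps, stamps[1:])]

# A tiny hand-made log showing an alternating short/long pattern:
sample = "Frame, Time (ms)\n1, 0.000\n2, 5.100\n3, 20.200\n4, 25.000\n"
times = frame_times_from_fraps(sample)
print([round(t, 1) for t in times])  # [5.1, 15.1, 4.8]
```

Note the caveat from earlier in the article: with SLI frame metering, what Fraps records at the front of the pipeline may not match what reaches the display.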
Ultimately, though, the user experience should be paramount in any assessment of graphics solutions. For example, we still need to get a good read on a basic question: how much of a problem is micro-stuttering, really? (I'm thinking of the visual discontinuities caused by jitter, not the potential for longer frame times, which are easier to pinpoint.) The answer depends very much on user perception, and user perception will depend on the person involved, on his monitor type, and on the degree of the problem.
Presumably, a jitter pattern alternating between five- and 15-millisecond frame times would be less of an annoyance than a 15- and 45-millisecond pattern. The worst example we saw in our testing alternated between roughly 6 and 20 milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem. We may need to spend more time with (ugh) faster TN panels, rather than our prettier and slower IPS displays, in order to get a better feel for the stuttering issue.
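The arithmetic behind those patterns also shows why FPS averages miss the problem entirely: an alternating sequence can have exactly the same average frame time as a perfectly smooth one. A quick illustration, using made-up sequences matching the figures above:

```python
def fps_average(frame_times_ms):
    """Average FPS implied by a list of per-frame times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def jitter_range(frame_times_ms):
    """Spread between the slowest and fastest frame, in milliseconds."""
    return max(frame_times_ms) - min(frame_times_ms)

smooth = [10, 10, 10, 10]   # steady 10 ms per frame
jittery = [5, 15, 5, 15]    # alternating 5/15 ms pattern

# Both sequences report an identical 100 FPS average...
assert fps_average(smooth) == fps_average(jittery) == 100.0

# ...but one holds a steady cadence while the other swings by 10 ms per frame.
print(jitter_range(smooth), jitter_range(jittery))  # 0 10
```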
At the same time, we're very interested in getting reader feedback on this matter. If you have a multi-GPU setup, have you run into micro-stuttering problems? If so, how often do you see it, and how perceptible is it? Please let us know in the comments.
Although they've been a little overshadowed by the issues they've helped uncover, we're also cautiously optimistic about our proposed methods for measuring GPU performance in terms of frame times. Yes, Nvidia's frame metering technology complicates our use of Fraps data with SLI setups. But for single-GPU solutions, at least, we think our new methods, with their focus on frame latencies, offer some potentially valuable insights into real-world performance that traditional FPS measurements tend to miss. We'll probably have to change the way we review GPUs in the future as a result. These methods may be helpful in measuring CPU performance, as well. Again, we're curious to get some reader feedback about which measures make sense to use and how they might fit alongside more traditional FPS averages. Our sense is that once you've gone inside the second, it may be difficult to look at things the same way when you zoom back out again.
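The latency-focused measures we have in mind are simple to compute from a list of frame times. The specific metric choices below, a 99th-percentile frame time and the total time spent beyond a 50-ms threshold, are illustrative sketches, not a settled methodology:

```python
import math

def percentile_frame_time(frame_times_ms, pct=99):
    """Nearest-rank percentile: the frame time that pct% of frames come in under."""
    ordered = sorted(frame_times_ms)
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

def time_beyond(frame_times_ms, threshold_ms=50):
    """Total milliseconds spent past the threshold: measures how bad the slow frames were, not just how many there were."""
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# 98 quick frames plus two nasty spikes: the FPS average barely notices...
frames = [10.0] * 98 + [100.0, 100.0]
avg_fps = 1000.0 * len(frames) / sum(frames)
print(round(avg_fps))                 # 85 -- looks perfectly healthy
# ...but the latency-focused measures flag the hitches immediately.
print(percentile_frame_time(frames))  # 100.0 ms
print(time_beyond(frames))            # 100.0 ms spent beyond 50 ms
```

The design point is the last one in the paragraph above: an average compresses a whole second of frames into one number, while these measures keep the worst frames, the ones a player actually feels, visible.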