The Elder Scrolls V: Skyrim
Our test run for Skyrim was a lap around the town of Whiterun, starting up high at the castle entrance, descending the stairs into the main part of town, and then doing a figure-eight around the main drag.
Since these are pretty capable graphics cards, we set the game to its "Ultra" presets, which turns on 4X multisampled antialiasing. We then layered on FXAA post-process anti-aliasing, as well, for the best possible image quality without editing an .ini file.
At this point, you may be wondering what's going on with the funky plots shown above. Those are the raw data for our snazzy new game benchmarking methods, which focus on the time taken to render each frame rather than a frame rate averaged over a second. For more information on why we're testing this way, please read this article, which explains almost everything.
If that's too much work for you, the basic premise is simple enough. The key to creating a smooth animation in a game is to flip from one frame to the next as quickly as possible in continuous fashion. The plots above show the time required to produce each frame of the animation, on each card, in our 90-second Skyrim test run. As you can see, some of the cards struggled here, particularly the GeForce GTX 560 Ti, which was running low on video memory. Those long waits for individual frames, some of them 100 milliseconds (that's a tenth of a second) or more, produce less-than-fluid action in the game.
Notice that, in dealing with render times for individual frames, longer waits are a bad thing—lower is better, when it comes to latencies. For those who prefer to think in terms of FPS, we've provided the handy table at the right, which offers some conversions. See how, in the last plot, frame times are generally lower for the GeForce GTX 680 than for the Radeon HD 7970, and so the GTX 680 produces more total frames? Well, that translates into...
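For readers who want to play with the numbers themselves, the conversion between frame times and FPS is just a reciprocal. Here's a quick sketch (not our actual tooling) that reproduces the sort of entries you'd find in a conversion table like the one at the right:

```python
# Convert between per-frame render times (in milliseconds) and the
# equivalent frames-per-second rate. The relationship is simply
# FPS = 1000 / frame_time_ms, and vice versa.

def ms_to_fps(frame_time_ms):
    """Equivalent FPS for a given per-frame render time in milliseconds."""
    return 1000.0 / frame_time_ms

def fps_to_ms(fps):
    """Per-frame time budget in milliseconds for a target FPS."""
    return 1000.0 / fps

# A few reference points: 16.7 ms ~ 60 FPS, 50 ms = 20 FPS, 100 ms = 10 FPS.
for ms in (16.7, 33.3, 50.0, 100.0):
    print(f"{ms:5.1f} ms  ~  {ms_to_fps(ms):5.1f} FPS")
```

This is why a 100-millisecond frame is such a problem: in isolation, it's the equivalent of a 10 FPS slideshow, no matter what the per-second average says.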
...higher FPS averages for the new GeForce. Quite a bit higher, in this case. Also notice that some of our worst offenders in terms of long frame times, such as the GeForce GTX 560 Ti and the GTX 560 Ti 448, produce seemingly "acceptable" frame rates of 41 and 50 FPS, respectively. We might expect that FPS number to translate into adequate performance, but we know from looking at the plot that's not the case.
To give us a better sense of the frame latency picture, or the general fluidity of gameplay, we can look at the 99th percentile frame latency—that is, the time within which 99% of all frames were rendered. Once we do that, we can see just how poorly the GTX 560 Ti handles itself here compared to everything else.
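Computing that number from raw frame-time data is straightforward. Here's a minimal sketch using the nearest-rank method on some hypothetical frame times (our own analysis pipeline isn't shown here, and real percentile implementations vary slightly in how they interpolate):

```python
import math

def percentile_latency(frame_times_ms, pct=99.0):
    """Frame time within which pct% of all frames were rendered
    (nearest-rank method)."""
    times = sorted(frame_times_ms)
    rank = math.ceil(pct / 100.0 * len(times))
    return times[rank - 1]

# A mostly smooth run poisoned by a small fraction of very long frames:
frames = [16.0] * 985 + [80.0] * 15
print(percentile_latency(frames))  # -> 80.0
```

Note how the handful of 80-millisecond frames dominates the 99th-percentile figure even though nearly every frame came in at 16 milliseconds—exactly the behavior that an FPS average papers over.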
We're still experimenting with our new methods, and I'm going to drop a couple of new wrinkles on you here today. We think the 99th percentile latency number is a good one, but since it's just one point among many, we have some concerns about using it alone to convey the general latency picture. As a bit of an experiment, we've decided to expand our look at frame times to cover more points, like so.
This illustrates how close the matchup is between several of the cards, especially our headliners, the Radeon HD 7970 and GeForce GTX 680. Although the GeForce generally produces frames in less time than the Radeon, both are very close to that magic 16.7 ms (60 FPS) mark 95% of the time. Adding in those last few percentage points, that last handful of frames that take longer to render, makes the GTX 680's advantage nearly vanish.
Our next goal is to focus more closely on the tough parts, places where the GPU's performance limitations may be contributing to less-than-fluid animation, occasional stuttering, or worse. For that, we add up all of the time each GPU spends working on really long frame times, those above 50 milliseconds or (put another way) below about 20 FPS. We've explained our rationale behind this one in more detail right here, if you're curious or just confused.
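The calculation behind that metric can be sketched in a few lines. The version below counts only the portion of each frame's time that runs past the 50-millisecond threshold, applied to some hypothetical frame-time samples; treat it as an illustration of the idea rather than our exact methodology:

```python
def time_beyond(frame_times_ms, threshold_ms=50.0):
    """Total time (ms) spent working past the threshold on long frames.

    Only the excess beyond the threshold is counted, so a 110 ms frame
    contributes 60 ms, and frames at or under the threshold contribute
    nothing.
    """
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# Hypothetical run with two long frames:
run = [16.7, 18.2, 110.0, 15.9, 65.0, 17.1]
print(time_beyond(run))  # -> 75.0  (60 ms + 15 ms beyond the 50 ms mark)
```

A card that never exceeds 50 milliseconds scores a perfect zero here, which is why the well-behaved cards in the pack barely register on this measure.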
Only the two offenders we've already identified really spend any significant time working on really long-to-render frames. The rest of the pack (and I'd include the GTX 580 in this group) handles Skyrim at essentially the highest quality settings quite well.