The Elder Scrolls V: Skyrim
Our Skyrim test involved running around the town of Whiterun, starting from the city gates, all the way up to Dragonsreach, and then back down again.
We tested at 1366x768 using the "medium" detail preset.
Now, we should preface the results below with a little primer on our testing methodology. Along with measuring average frames per second, we delve inside the second to look at frame rendering times. Studying the time taken to render each frame gives us a better sense of playability, because it highlights issues like stuttering that can occur—and be felt by the player—within the span of one second. Charting frame times shows these issues clear as day, while charting average frames per second obscures them.
For example, imagine one hypothetical second of gameplay. Almost all frames in that second are rendered in 16.7 ms, but the game briefly hangs, taking a disproportionate 100 ms to produce one frame and then catching up by cranking out the next frame in 5 ms—not an uncommon scenario. You're going to feel the game hitch, but the FPS counter will only report a dip from 60 to 56 FPS, which would suggest a negligible, imperceptible change. Looking inside the second helps us detect such skips, as well as other issues that conventional frame rate data measured in FPS tends to obscure.
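To make that arithmetic concrete, here's a quick sketch in Python. The frame times are the hypothetical values from the example above, not measured data:

```python
# One hypothetical second of gameplay: 54 smooth frames at 16.7 ms,
# one 100 ms hang, and a 5 ms catch-up frame. Illustrative numbers only.
frame_times_ms = [16.7] * 54 + [100.0, 5.0]

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds
worst_frame_ms = max(frame_times_ms)

# The average looks nearly smooth, but the worst frame tells another story.
print(f"average: {avg_fps:.0f} FPS")            # average: 56 FPS
print(f"worst frame: {worst_frame_ms:.0f} ms")  # worst frame: 100 ms
```

The FPS counter reports a near-smooth 56 FPS for this second, while the per-frame data exposes the 100 ms hitch directly.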
We're going to start by charting frame times over the totality of a representative run for each system—though we conducted five runs per system to make sure our results are solid. These plots should give us an at-a-glance impression of overall playability, warts and all. (Note that, since we're looking at frame latencies, plots sitting lower on the Y axis indicate quicker solutions.)
From this vantage point, it's obvious the A10-4600M and Radeon HD 7660G IGP combo pulls off the lowest, most consistent frame times of the bunch. Ivy Bridge and its HD 4000 IGP suffer from a greater number of latency spikes and seem to exhibit more variance in general, as well. Sandy Bridge is the worst of the bunch by far, with embarrassingly high frame latencies and a huge spike over 250 ms at the end of the run.
We can slice and dice our raw frame-time data in other ways to show different facets of the performance picture. Let's start with something we're all familiar with: average frames per second. Though this metric doesn't account for irregularities in frame latencies, it does give us some sense of typical performance.
Next, we can demarcate the threshold below which 99% of frames are rendered. The lower the threshold, the more fluid the game. This metric offers a sense of overall frame latency, but it filters out fringe cases.
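As a sketch of how such a threshold can be derived, here's a simple nearest-rank 99th-percentile calculation in Python. The input list is synthetic stand-in data, not our measured results:

```python
import random

# Synthetic frame times (ms): mostly smooth frames plus a handful of spikes.
random.seed(1)
frame_times_ms = [random.gauss(18.0, 3.0) for _ in range(990)]
frame_times_ms += [random.uniform(40.0, 80.0) for _ in range(10)]

# Nearest-rank estimate: the latency below which 99% of frames complete.
ordered = sorted(frame_times_ms)
rank = max(int(0.99 * len(ordered)) - 1, 0)
p99_ms = ordered[rank]

# The 1% of outlier spikes barely moves the average frame time,
# but they sit beyond this threshold and so don't distort it either.
print(f"99th percentile: {p99_ms:.1f} ms")
```

This is exactly the filtering behavior described above: the worst 1% of frames falls outside the threshold, so isolated fringe spikes don't dominate the number.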
Of course, the 99th percentile result only shows a single point along the latency curve. We can show you that whole curve, as well. With integrated graphics or single-GPU configs, the right-hand side of the graph—and especially the last 10% or so—is where you'll want to look. That section tends to be where the best and worst solutions diverge.
These latency curves are nice and neat, with no solution crossing over another to become slower in the last 5% or so. Things won't always be so tidy, as we'll likely see shortly.
Finally, we can rank solutions based on how long they spent working on frames that took longer than 50 ms to render. The results should ideally be zero across the board, because the illusion of motion becomes hard to maintain once frame latencies rise above 50 ms or so. (A 50-ms frame time is equivalent to a steady 20 FPS.) Simply put, this metric is a measure of "badness." It tells us about the scope of delays in frame delivery during the test scenario.
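One way to compute such a "badness" figure is to sum the amount by which each slow frame overshoots the cutoff. Here's a minimal Python sketch; the function name and sample values are ours, and the exact accounting behind the charts may differ:

```python
def time_beyond_ms(frame_times_ms, threshold_ms=50.0):
    """Total milliseconds spent past the threshold, summed across all
    frames that took longer than threshold_ms to render."""
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# Two slow frames overshoot the 50 ms cutoff by 10 ms and 50 ms: 60 ms total.
badness = time_beyond_ms([16.7, 60.0, 100.0, 33.3])
print(badness)  # 60.0
```

A run with no frames over 50 ms scores a perfect zero, while every long frame adds only its excess over the cutoff, so the metric scales with how badly frame delivery was delayed rather than merely counting slow frames.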
No question about it: Trinity's integrated graphics are fast. They're substantially quicker than even Llano's, and the contest with Intel's solutions is really no contest at all. From a seat-of-the-pants perspective, only the A10-4600M and Radeon HD 7660G are really playable at these settings. Llano is borderline, and the Intel offerings are just too choppy.