IGP performance - Skyrim
We'll start by looking at integrated graphics performance. For these tests, we decided to focus on the higher-end A10 and Core i3 processors, since they have the faster IGPs and are more likely to be compelling offerings. We've also taken a look at the impact of memory speed on IGP performance, since memory bandwidth can be a pretty notable constraint. Our default test config used 1600MHz memory, and we also tested the A10 and Core i3 with 1866MHz memory at pretty tight timings: 9-10-9-27 1T. We'd hoped to test even higher memory frequencies, but neither platform took well to 2133MHz memory, even with relatively conservative timings and extra voltage.
We tested performance while taking a stroll around the town of Whiterun in Skyrim. You can see the image quality settings we used above, which are about as spartan as possible in Skyrim.
Our gaming tests are very different from what you're likely to see elsewhere. We've captured the time required to render every single frame from each of our five test runs, because we believe FPS averages tend to mask the short slowdowns that can break the sense of fluid animation. For more information on how we test and why, please see this article.
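To make the distinction concrete, here's a minimal sketch in Python of how an FPS average can hide a nasty stall. The frame times below are made-up numbers for illustration, not our actual captures:

```python
# Illustrative only: hypothetical frame times in milliseconds.
# 59 quick frames plus one long stall still average out to a "smooth" rate.
frame_times_ms = [15.0] * 59 + [115.0]

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds

print(f"Average FPS: {avg_fps:.1f}")             # 60.0 FPS: looks perfectly fluid
print(f"Worst frame: {max(frame_times_ms)} ms")  # one 115 ms hitch the average hides
```

That single 115-millisecond frame spans nearly seven refresh intervals on a 60Hz display, yet the run still averages a flawless-looking 60 FPS.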
You can click the buttons beneath the plots above to see results for the different types of processors. Since we're plotting frame times, lower numbers are better, and big spikes upward are bad—they represent delays in frame delivery. If you're new to the idea of latency-focused game testing, the table to the right may help. It shows frame times and how they correspond to FPS rates. A quick look at the raw plots above will tell you much of what you need to know about how these CPUs perform. The Core i3-3225 produces fewer frames at generally higher latencies than the A10, and its frame time spikes tend to be more dramatic.
Although FPS averages can be deceiving, in this case, the relatively high average numbers tend to be backed up by our alternative method, the 99th percentile frame time. (This metric just says that 99% of all frames were rendered in x milliseconds or less.) The overall latency picture for all of the IGPs isn't bad. Except for the last 1% of frames, all of these solutions produce a constant flow of updates at a rate of over 30 FPS. Skyrim doesn't look pretty at these settings, but it will run smoothly enough on any of these IGPs.
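For the curious, the 99th percentile figure is simple to compute from raw frame times. A minimal sketch, assuming the captured times live in a plain array of milliseconds (the data here is hypothetical):

```python
import numpy as np

# Hypothetical frame times in ms; in practice these come from the capture logs.
frame_times_ms = np.array([22.1, 25.3, 24.8, 23.0, 31.5, 24.2, 38.7, 23.9])

p99 = np.percentile(frame_times_ms, 99)
print(f"99th percentile frame time: {p99:.1f} ms")

# 1000 / frame time converts a frame time to an instantaneous FPS rate,
# so 33.3 ms corresponds to 30 FPS and 50 ms to 20 FPS.
print(f"Equivalent FPS rate: {1000 / p99:.1f}")
```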
The A10 is measurably faster than the Core i3-3225, and you can feel the difference while playing. The difference between the A10 and its Llano predecessor, the A8-3850, is much subtler, only a couple of milliseconds in the 99th percentile metric. Even slighter is the impact of faster memory on the A10.
The 99th percentile frame time is just one point along a curve, and we can have a look at the broader curve to give us a better sense of the overall latency picture. As you can see, the A10 generally produces much lower frame latencies than the Core i3.
The 99th percentile frame time attempts to capture a sense of the overall latency picture while ruling out the outliers. We can also focus on the worst-case frame times, which makes sense, since we want to avoid those hiccups and pauses while playing. Our method of quantifying "badness" is adding up all of the time spent working on frames beyond a given threshold—usually, we set the mark at 50 milliseconds, which equates to 20 FPS. We figure if frame rates drop below about that mark, the illusion of motion is at risk. Also, 50 milliseconds is equal to three vertical refresh intervals on a 60Hz display. If you're waiting longer than that for the next frame, there's likely some pain there.
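Here's a rough sketch of how that "time spent beyond 50 ms" figure could be tallied, again with hypothetical frame times rather than our actual captures:

```python
# Sum only the portion of each frame's render time past the 50 ms mark.
# Hypothetical data; real test runs capture thousands of frames.
THRESHOLD_MS = 50.0
frame_times_ms = [24.0, 31.0, 72.0, 26.0, 55.0, 28.0]

time_beyond_ms = sum(t - THRESHOLD_MS for t in frame_times_ms if t > THRESHOLD_MS)
print(f"Time spent beyond {THRESHOLD_MS:.0f} ms: {time_beyond_ms:.0f} ms")  # 27 ms
```

Counting only the excess above the threshold, rather than the whole frame, keeps the metric focused on how much pain those slow frames actually add.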
As you might expect given the other numbers above, most of these solutions don't spend much time beyond our threshold. They really can run Skyrim pretty well at these (kinda lousy) image quality settings. Interestingly enough, the Core i3 benefits quite a bit from the move to 1866MHz memory; its time spent beyond our threshold drops to zero from nearly a third of a second before.
IGP performance - Batman: Arkham City
We tested Arkham City while grappling and gliding across the rooftops of Gotham in a bit of Bat-parkour. We're moving rapidly through a big swath of the city, so the game engine has to stream in more detail periodically. You can see the impact in the frame time plots: every CPU shows occasional spikes throughout the test run.
Again, we've had to reduce image quality settings to their lowest possible level in order to accommodate these relatively pokey integrated graphics processors.
Even with all of the frame time spikes, the numbers above look reasonably good for the most part. The FPS average and 99th percentile frame times pretty much mirror each other, which is usually a sign of health, and the latency curves are all similar in shape, with no big spikes upward until we reach the last few percentage points worth of frames.
All of the numbers point to the same thing, too, which is a clear playability advantage for the A10-5800K over the Core i3-3225.