
The Elder Scrolls IV: Oblivion
We tested Oblivion by manually playing through a specific point in the game five times while recording frame rates using the FRAPS utility. Each gameplay sequence lasted 60 seconds. This method has the advantage of simulating real gameplay quite closely, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent results. In addition to average frame rates, we've included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we've reported the median of the five low frame rates we encountered.
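For those curious how the raw numbers become the figures in the charts, here's a minimal sketch in Python of that aggregation step, assuming each run's per-second frame rates have been exported to a one-column CSV (the file names are hypothetical): the average frame rate is the mean of the per-run averages, and the low frame rate is the median of the five per-run minimums.

```python
# Minimal sketch: aggregate five benchmark runs into an average frame rate
# and a median "low" frame rate. Assumes each run's per-second frame rates
# were exported to a one-column CSV; file names are hypothetical.
from statistics import mean, median

RUN_FILES = [f"oblivion_run{i}.csv" for i in range(1, 6)]

def load_fps(path):
    """Read one frame-rate sample per line, skipping blank lines."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

runs = [load_fps(path) for path in RUN_FILES]

# Average frame rate: mean of the per-run averages.
avg_fps = mean(mean(run) for run in runs)

# Low frame rate: median of the five per-run minimums, to blunt outliers.
low_fps = median(min(run) for run in runs)

print(f"Average FPS: {avg_fps:.1f}")
print(f"Median low FPS: {low_fps:.1f}")
```

Taking the median of the lows, rather than the single worst dip, keeps one fluke stutter from dominating the result.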

For this test, we set Oblivion's graphical quality to "Medium" but with HDR lighting enabled and vsync disabled, at 800x600 resolution. We chose this relatively low display resolution to keep the graphics card from becoming a bottleneck, so differences between the CPUs can shine through.

Notice the little green plot with four lines above the benchmark results. That's a snapshot of the CPU utilization indicator in Windows Task Manager, which illustrates how well the application takes advantage of up to four CPU cores when they're available. I've included these Task Manager graphics whenever possible throughout our results. In this case, Oblivion really only takes full advantage of a single CPU core, although Nvidia's graphics drivers use multithreading to offload some vertex processing chores.
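If you'd like to capture something similar to those Task Manager plots yourself, the sketch below shows one way to log per-core utilization over a 60-second session using the third-party psutil package for Python. It's purely illustrative; Task Manager is what produced the graphics in this article.

```python
# Minimal sketch: log per-core CPU utilization during a 60-second gameplay
# session, roughly what the Task Manager graphs show. Uses the third-party
# psutil package; run this alongside the game, then read the summary.
import psutil

SAMPLE_SECONDS = 60

samples = []
for _ in range(SAMPLE_SECONDS):
    # Blocks for one second and returns a utilization percentage per core.
    samples.append(psutil.cpu_percent(interval=1, percpu=True))

# Average utilization per core over the whole session.
core_count = len(samples[0])
for core in range(core_count):
    avg = sum(s[core] for s in samples) / len(samples)
    print(f"Core {core}: {avg:.0f}% average utilization")
```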

Obviously, any and all of these CPUs will run Oblivion at acceptable performance levels. If you're keeping score, though, note that Core 2 processors take the top five spots, generally outpacing their like-priced competition and even jumping ahead of their more expensive Athlon 64 rivals.

Interestingly, the dual-socket Athlon 64 FX systems trail their dual-core siblings, probably because the Quad FX platform's non-uniform memory architecture exacts a slight performance penalty: a thread running on one socket can end up accessing memory attached to the other socket, which adds latency. Even though Windows Vista is supposed to be better at handling NUMA-type architectures, these systems show a small yet measurable disadvantage in this application and, as you'll see, in some of our other tests. Since this game doesn't make use of four cores, the FX processors can't use their extra cores to make up the difference here. The Quad FX platform's four cores and plentiful memory bandwidth can produce good results in other applications, and we'll see that, too.

Rainbow Six: Vegas
Rainbow Six: Vegas is based on Unreal Engine 3 and is a port from the Xbox 360. For both of those reasons, it's one of the first multithreaded PC games, and it ought to provide an illuminating look at CPU gaming performance.

For this test, we set the game to run at 800x600 resolution with high dynamic range lighting disabled. "Hardware skinning" (via the GPU) was disabled, leaving that burden to the CPU. Shadow quality was set to very low, and motion blur was enabled at medium quality. I played through a 90-second sequence of the game's Terrorist Hunt mode on the "Dante's" level five times, capturing frame rates with FRAPS, as we did with Oblivion.

Once again, the Core 2 processors lead their Athlon 64 competition, but this is a much tighter contest. Note to self: Don't expect a game designed for the weakling Xbox 360 CPU to stress any modern PC processor.