Blockbuster games capable of making high-end PCs sweat have become a rarity in recent years. That’s unfortunate but unavoidable in a world where consoles remain saddled with six-year-old graphics hardware—yet are, sadly, far more popular than high-powered gaming PCs. Today’s PC titles typically feature the same graphics engines and pull off the same visual tricks as their predecessors. They don’t look bad… but they could look so much better.
We PC gamers respond with understandable giddiness when a game bucks that trend. From the get-go, EA DICE has boasted that Battlefield 3 was designed for the PC first and foremost, then scaled back to accommodate the Xbox 360 and PlayStation 3. The impact of that design choice was apparent even in the first trailers. With a sophisticated rendering engine bolstered by DirectX 11 effects like terrain tessellation and DirectCompute-accelerated lighting, the war-themed shooter offers visual fidelity above and beyond most of today’s shooters. Large swaths of the game look good enough to put Crysis and its sequel to shame. That’s saying something.
Oh, sure, Battlefield 3 doesn’t have the most engaging single-player campaign, as cinematic and visually awesome as it might look. EA’s Origin distribution scheme and the awkward web-based server browser have also earned their fair share of critics. Nevertheless, this game looks gorgeous and features delightfully engrossing multiplayer. I’ve clocked in 10 hours of Battlefield 3 MP so far, and I’m still itching for my next fix. Few multiplayer games let players sneak around subway tunnels with silenced weapons, snipe enemies across vertiginous valleys, drive tanks around large wooded hills, and engage in dogfights with fighter jets. Battlefield 3 doesn’t just let you do all of those things—it makes them all fun.
Such a game must benefit from a fast graphics card with the latest bells and whistles, of course… but which one? Nvidia has pimped Battlefield 3 since even before its release, and although the title doesn’t feature any Nvidia-only graphical effects that we know of, one has to wonder whether it runs better on Radeons or GeForces. Just as importantly, exactly how much GPU horsepower does one need to enable most of the game’s eye candy? Is this another Crysis, or is it more forgiving of mid-range graphics hardware?
To answer these questions, we enlisted three pairs of competing graphics cards: the GeForce GTX 460 and Radeon HD 6850, which are both priced just under $150; the GeForce GTX 560 and Radeon HD 6870, which reside closer to $180-190; and the GeForce GTX 560 Ti and Radeon HD 6950 1GB, which you can usually find priced south of the $250 mark. These six cards represent today's mid-range landscape. You might have seen them, perhaps with slightly different coolers and clock speeds, in the three cheapest builds from our latest system guide. We fired up Battlefield 3 on each card and got to tweaking graphical settings, measuring frame rates and frame times, and keeping a close eye on our seat-of-the-pants-o-meter.
Our testing methods
Now, we should clarify a couple of choices we made in our testing. First, we benchmarked the game’s single-player missions. While the same graphics engine powers the single- and multiplayer campaigns, testing a multiplayer game makes a number of variables rather difficult to control: network latency to the server, how many players are running across (and flying above) a map, how those players interact with the tester, and so forth. It can be done, but single-player skirmishes can be just as violent and explosive as multiplayer ones—sometimes more so—and they’re much more consistently repeatable.
You’ll also note that we tested at a resolution of 1920×1080 throughout. Battlefield 3’s lack of a built-in, scripted benchmark forced us to test manually, so covering multiple resolutions would have been a significant time sink. We therefore settled on what’s unarguably the most popular resolution, not just for mid-range builds but in the market overall. Right now, Newegg lists no fewer than 182 1080p LCD monitors; the next most popular resolution, 1280×1024, accounts for only 61 displays. 1080p displays range in price from $109.99 to well over $600, and they come with all manner of panel types and sizes, from 21.5″ TN designs to larger IPS offerings.
As ever, we did our best to deliver clean benchmark numbers, with tests run five times per card. Our test system was configured as follows:
|Processor|Intel Core i5-750|
|North bridge|Intel P55 Express|
|Memory size|4GB (2 DIMMs)|
|Memory type|Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz|
|Memory timings|9-9-9-24 1T|
|Chipset drivers|INF update 18.104.22.1685, Rapid Storage Technology 10.1.0.1008|
|Audio|Integrated Via VT1828S with 22.214.171.12400 drivers|
|Graphics|XFX Radeon HD 6850 1GB (HD-685X-ZNFC) with Catalyst 11.10 WHQL drivers|
||Asus Radeon HD 6870 1GB 915MHz (EAH6870 DC/2DI2S/1GD5) with Catalyst 11.10 WHQL drivers|
||Gigabyte Radeon HD 6950 1GB 870MHz (GV-R695OC-1GD) with Catalyst 11.10 WHQL drivers|
||Zotac GeForce GTX 460 1GB (ZT-40402-10P) with GeForce 285.62 drivers|
||MSI GeForce GTX 560 1GB 870MHz (N560GTX Twin Frozr II/OC) with GeForce 285.62 drivers|
||Asus GeForce GTX 560 Ti 1GB 830MHz (ENGTX560 TI DCII/2DI/1GD5) with GeForce 285.62 drivers|
|Hard drives|Samsung SpinPoint F1 HD103UJ 1TB SATA, Western Digital Caviar Green 1TB|
|Power supply|Corsair HX750W|
|OS|Windows 7 Ultimate x64 Edition with Service Pack 1|
Thanks to Asus, Intel, Corsair, Kingston, Samsung, and Western Digital for helping to outfit our test rigs with some of the finest hardware available. Thanks also to XFX, Gigabyte, Zotac, and MSI for supplying the graphics cards for testing.
We conducted testing using the Catalyst 11.10 WHQL driver from AMD and the GeForce 285.62 driver from Nvidia. We left optional AMD optimizations for tessellation and texture filtering disabled. Otherwise, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.
We used the Fraps utility to record frame rates while playing a 90-second sequence through each level we tested. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card, in order to counteract any variability.
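Fraps can log per-frame timestamps to a benchmark file; here's a minimal sketch of how such a log can be reduced to individual frame times and an average frame rate. The single-column cumulative-millisecond format is an assumption for illustration, not a description of Fraps' exact output:

```python
# Sketch: reduce a Fraps-style frame-time log to per-frame render times.
# Assumes cumulative millisecond timestamps, one frame per entry
# (the real Fraps log format may differ in detail).

def frame_times_ms(timestamps_ms):
    """Convert cumulative timestamps into individual frame render times."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def average_fps(timestamps_ms):
    """Average frame rate over the whole captured run."""
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return (len(timestamps_ms) - 1) / elapsed_s

# Hypothetical example: four frames logged over roughly 50 ms
stamps = [0.0, 16.7, 33.4, 50.1]
print(frame_times_ms(stamps))  # three frames of ~16.7 ms each
print(average_fps(stamps))     # ~59.9 FPS
```

Averaging five such runs per card, as we did, simply repeats this reduction and pools the results.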
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
Introducing the graphics detail settings
Over the next few pages, you might catch a glimpse of screenshots showing Battlefield 3‘s video options screens. The various individual quality settings and sliders are largely self-explanatory, but what do they actually mean in terms of image quality?
EA DICE Chief Architect Johan Andersson provided some useful background information during his keynote speech at the Nvidia GeForce LAN 6 event. This slide from the presentation perhaps best sums up the different settings:
Andersson explained that the “low” detail preset roughly corresponds to the level of detail in the console versions of Battlefield 3. So, Battlefield 3‘s PC-centric goodness should be apparent even when we bump cards down to the “medium” detail preset—if they require it, that is.
One of the PC-centric goodies Battlefield 3 puts front and center is antialiasing, which is enabled by default in the “medium,” “high,” and “ultra” presets. The first two presets both rely exclusively on FXAA, which smoothes out jagged edges using a post-process shader instead of conventional multisampling. The “ultra” preset enables both 4X multisampling and FXAA, which produces the smoothest images but requires a powerful graphics card with lots of video memory. According to Andersson, running MSAA without FXAA leaves some jagged edges, because additional rendering stages are run after the multisampling is applied. (In our testing, we found that enabling MSAA by itself reduced average frame rates almost as much as combining FXAA and MSAA.)
Andersson says the “ultra” setting also enhances detail in other ways, by sharpening up shadows and increasing the amount of DirectX 11 tessellation applied to the game’s terrain. Based on the slide above, though, it sounds like the “high” setting is the sweet spot. That’s good news, because the game seemed to default to that preset when we first ran it on all six of our cards.
Now that we have a firmer grasp on things, we can try to dissect the implications of the three image quality settings we used. Here are three screenshots from the first mission of the game, Operation Swordbreaker. We tried grabbing screenshots in the two missions we chose to benchmark, but those featured too much movement across the screen—smoke, dust, and swaying tree shadows—to allow for fair image quality comparisons. This parking lot area in Operation Swordbreaker stayed mostly static as we took our screenshots.
You can click the thumbnails for full-sized screenshots in lossless PNG format. Zooming in on certain parts of the scene gives us a better look at the impact of the different detail levels, though:
The game’s ambient occlusion and shadowing effects look visibly nicer and more realistic as we ramp up detail levels. The “ultra” preset clearly delivers the smoothest antialiasing, as well. That said, the “high” preset isn’t a huge step down in image quality, and FXAA alone does a reasonably good job of keeping jaggies at bay.
How do Battlefield 3‘s image quality settings affect areas with foliage and vegetation? Such areas are commonplace in the multiplayer mode, and they appear in several of the single-player missions. One of those missions is called Kaffarov, and we fired it up to take our next batch of screenshots:
The differences are much less obvious, but they’re there. The yellow grass looks flat in the “medium” preset, but it gains shading in the “high” preset and subtle amounts of extra detail in “ultra” mode. The first screenshot looks a little brighter as a result.
Before we move on to our benchmarks, I’d like to encourage readers to open the full-sized screenshots for both Operation Swordbreaker and Kaffarov, and compare image quality across the entire scenes. Slight image quality differences are easily noticeable in cropped closeups, but they can be tougher to identify if you’re not specifically looking for them—especially when you’re busy shooting bad guys and dodging grenades. With things in perspective, you may find yourself questioning the value of a GPU upgrade, or even wondering if you wouldn’t be best served by dropping to a lower detail preset and enjoying higher performance.
Fear no evil
We kicked off our performance testing by playing the beginning of the game’s Fear no evil mission—the part where you get to drive a tank through city streets and, at one point, through an office building. This sequence progresses from narrow alleyways through to a big, open city plaza, and it features explosions, particle effects, and smoke aplenty.
We started testing with the “high” detail preset, which seems to be what all the cards defaulted to. We’ll do some further tinkering with other detail presets on the next page.
Not too shabby. We’ve got two cards near the 60 FPS mark and a couple others just shy of it. The GeForce GTX 460 and Radeon HD 6850 seem to be dragging their feet just a tad, though.
Now, frame rates like those above only tell part of the story. Graphing individual frame rendering times, as we did in our article Inside the second: A new look at game benchmarking, gives us a much better sense of overall smoothness and playability. In a perfect world, we’d want cards to spit out frames in about 16.7 milliseconds each (which would mean 60 frames rendered per second). Just as importantly, we’d want to ensure consistently low frame rendering times. Even momentary spikes in frame rendering times can translate into perceived choppiness—and overall FPS numbers can’t capture that.
For example, imagine one hypothetical second of gameplay. Almost all frames in that second are rendered in 16.7 ms, but the game briefly hangs, taking a disproportionate 100 ms to produce one frame and then catching up by cranking out the next frame in 5 ms—not an uncommon scenario. You’re going to feel the game skip, but the FPS counter will only report a dip from 60 to 56 FPS, which would suggest a negligible, imperceptible change. Looking inside the second helps us detect such skips, as well as other issues that conventional frame rate data measured in FPS tends to obscure.
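The arithmetic behind that hypothetical second is easy to check. This sketch uses the made-up frame sequence described above, nothing more:

```python
# One hypothetical second of gameplay: mostly 16.7 ms frames, plus one
# 100 ms hiccup followed by a 5 ms catch-up frame, per the example above.
smooth = [16.7] * 60                 # a perfectly smooth second: 60 FPS
hitchy = [16.7] * 54 + [100.0, 5.0]  # ~1007 ms total for 56 frames

def fps(frame_times_ms):
    """Average FPS implied by a list of frame render times."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(round(fps(smooth)))  # 60
print(round(fps(hitchy)))  # 56 -- the 100 ms skip barely dents the average
print(max(hitchy))         # 100.0 -- but the worst frame time exposes it
```

The FPS counter reports an innocuous-looking four-frame dip, while the per-frame data makes the 100-ms skip impossible to miss.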
In the three graphs below, we’ve plotted individual frame rendering times for our cards across the duration of the Fraps run. We’ll be comparing competing pairs of cards in each graph. Note that faster cards produce more frames and thus longer lines in the graphs.
Interesting. The Radeons and GeForces are almost neck-and-neck in the FPS chart, but as our frame time data shows, the GeForces seem to produce frames at a more consistent rate, with fewer spikes.
It’s also worth noting that the GeForce GTX 460 and Radeon HD 6850 produce a fair number of frames in 25 ms or more (which would work out to 40 FPS or less on average). In a game like Battlefield 3 where instant reactions to in-game stimuli are prized, frame times of that length can be a handicap. That tracks with our seat-of-the-pants impression, which is that the GTX 460 and 6850 are a little too slow to provide a truly fluid experience in Battlefield 3 at these settings. We’ll see how these two cards fare at the “medium” detail setting on the next page.
We’re not done with our frame time data yet, though. We can use it to calculate the total number of frames that took longer than 40 ms to render across our five Fraps runs for each card. (40 ms per frame, in case you’re wondering, would yield a sluggish 25 FPS average).
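That tally amounts to a simple threshold count over the pooled frame times. A sketch, with made-up numbers standing in for real captures:

```python
# Count frames that exceed a latency threshold. 40 ms per frame is the
# equivalent of a sluggish 25 FPS, so frames beyond it are felt as hitches.
def frames_over(frame_times_ms, threshold_ms=40.0):
    """Number of frames slower than the given threshold."""
    return sum(1 for t in frame_times_ms if t > threshold_ms)

# Hypothetical pooled run: mostly quick frames with a few long ones
run = [15.2, 16.9, 42.5, 17.1, 55.0, 16.4, 41.0]
print(frames_over(run))  # 3
```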
The first chart confirms our earlier observations: in this test, the Radeons produce more frames that take over 40 ms to render than the competing GeForces do, which translates into a generally less smooth gameplay experience. The slowest cards, in particular, exhibit a substantial number of spikes above 40 ms. Even though the GeForce GTX 460 and the Radeon HD 6850 produce comparable FPS rates, the GTX 460 clearly does a better job of avoiding long frame latencies.
Knowing about those spikes is important, especially in a case like the Radeon HD 6850 above. However, at least in the example above, the top few cards don’t fare too poorly, and the differences between them are minimal. Rather than focus on the handful of outliers, we may want to think about things a different way. As we’ve said, what we want from our graphics subsystem is consistently low frame latencies. We can consider the general picture of frame latencies for each GPU by looking at the 99th percentile frame times: the threshold below which 99% of frames are rendered. We think this may be our best overall performance metric. Simply put, the percentile calculation doesn’t let unusually short frame times cancel out unusually long ones, the way average FPS results can. At the same time, this calculation excludes the rarest and most extraordinary latency spikes—the 1%, if you will—so it better reflects overall playability.
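In code, the 99th-percentile cutoff amounts to sorting the frame times and reading off the value 99% of the way up. Here's a minimal sketch using the nearest-rank method; exact index-rounding details vary between percentile definitions:

```python
import math

def percentile_frame_time(frame_times_ms, pct=99.0):
    """Frame time below which pct% of frames were rendered (nearest rank)."""
    ordered = sorted(frame_times_ms)
    # Nearest-rank method: the smallest value covering pct% of the frames
    rank = max(1, math.ceil(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# 99 quick frames plus one 100 ms outlier: the lone spike is excluded
times = [16.7] * 99 + [100.0]
print(percentile_frame_time(times))  # 16.7
```

A single extreme spike thus can't drag the metric around, while a card that routinely produces long frames will see its 99th-percentile number climb.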
These results reflect what we experienced when playing the game. The Radeons just don’t feel much choppier than the GeForces overall. The greater numbers of long frame times could be a point of note for competitive gamers who play Battlefield 3 online, though, since occasional skips can wreak havoc on your kill/death ratio.
Fear no evil—further tinkering
On the previous page, we saw that the GeForce GTX 460 and Radeon HD 6850 struggled to some extent with the “high” detail preset, while the GeForce GTX 560 Ti and Radeon HD 6950 1GB seemed like they might be able to do more work.
Let’s see how the slower pair of graphics cards performs at the “medium” detail preset. Since we’ve established that raw FPS numbers can be misleading, we’re going to lead with our frame latency graph.
Both cards clearly offer a smoother experience at this detail preset, with frame times hovering closer to, and often below, the 20-ms mark. Our FPS graph corroborates that—52 FPS works out to an average frame time of 19 ms. The FPS results don’t tell the whole story, though. Again, the Radeon exhibits more frame latency spikes than the GeForce. When we filter out the 1% of highest frame latencies with our 99th-percentile calculation, the Radeon HD 6850 proves to be a higher-latency solution than the GeForce GTX 460, by a handful of milliseconds. Since we’re talking about frame times in the 25-31 ms range, we expect these differences could be perceptible, though not huge.
What about “ultra” detail on the higher-end cards?
So much for that. The GTX 560 Ti and 6950 1GB perform worse at the “ultra” setting than the GTX 460 and 6850 do at the “high” setting, with all too many frames taking 30 ms or longer to render. That might pass muster in other titles, but it severely degrades the experience in a twitch shooter like Battlefield 3.
Rock and a hard place
Since Battlefield 3 offers such varied forms of gameplay, we benchmarked our cards in another mission: Rock and a hard place. We started recording performance as our jeep ran into an ambush, after which we disembarked and took pot shots at Russian soldiers. This sequence takes place in a small canyon covered by a thick canopy of trees. Shadows from the vegetation sway across the scene with the wind, and light rays shoot between the leaves. Exploding grenades and muzzle flashes punctuate the scene.
Again, we started testing at the “high” preset. This time, too, we’re going to start by looking at frame times before moving on to average FPS:
Whoa. The tables turn completely in this test, where the GeForces are the ones exhibiting wanton frame latency spikes, and the Radeons are the ones keeping sane. One thing still holds true, though: the GTX 460 and Radeon HD 6850 aren’t quite cut out for the “high” detail level. (We’ll see how they perform in the scene at the “medium” setting on the next page.)
Yep. The GeForces crank out so many frames in 40 ms or more, it’s not pretty. In fact, it’s much, much uglier than the show the Radeons put on in Fear no evil. Even when we buff out the very highest peaks with our 99th-percentile calculation, the GeForces come out looking very weak. It’s not just about numbers, either. Playing this section with a GeForce, the latency spikes were very palpable, causing animations to seemingly speed up and slow down at random. The frame rate felt high, and we didn’t notice any huge skips, but the illusion of motion was compromised—much more so than with the Radeons in Fear no evil.
Note that, looking at the raw FPS averages, you’d think nothing was wrong—fresh evidence of that metric’s pitfalls.
Rock and a hard place—further tinkering
As we did with Fear no evil, let’s see how our slowest and fastest card pairs handle themselves at the “medium” and “ultra” detail levels, respectively.
Turning down the detail doesn’t help the GeForce produce more consistent frame times. While both cards are neck-and-neck in our FPS graph, it’s pretty clear that the Radeon produces considerably smoother gameplay here.
What about the GTX 560 Ti and 6950 1GB in “ultra” mode?
Even when we look past the sheer number of frame latency spikes, neither the GeForce GTX 560 Ti nor the Radeon HD 6950 1GB is terribly fluid at the “ultra” setting. You’re going to need a much quicker GPU to enjoy all the eye candy Battlefield 3 has to offer—so long as you want the game to be fluid, that is.
Clearly, our testing shows that both AMD and Nvidia have issues with frame time spikes in Battlefield 3. Our testing also shows that the Radeons’ transgressions in Fear no evil absolutely pale in comparison with what the GeForces pulled off in Rock and a hard place. Take, for example, the Radeon HD 6870 and GeForce GTX 560: the former produced 57 frames that took longer than 40 ms to render in the first test, but the latter spat out a staggering 528 such frames in the second test.
Based on that data, we’d be tempted to recommend Radeons for use with Battlefield 3 on mid-range systems. Then again, the strongest message our data has to offer is that we need more of it. There are evidently stark differences between GeForces and Radeons in this game, and we wonder if, perhaps, other missions and levels would tip the odds the other way.
Now, Nvidia did throw us a bit of a curve ball earlier today, releasing a beta GeForce driver with purported Battlefield 3 performance improvements for DirectX 10 cards. After some spot checking, we’ve concluded that the new driver doesn’t improve either frame times or frame rates in our test scenarios. We still recorded astronomical numbers of high-latency frames in Rock and a hard place. Besides, the results you saw today were obtained with WHQL-certified drivers in a game that’s been around in some form or other for about a month and a half.
Although we can’t offer a clear-cut recommendation between brands without more testing, we’ve at least come away with a clear grasp of which classes of cards can handle BF3’s different detail settings. At a 1080p resolution, $150 offerings like the GeForce GTX 460 and Radeon HD 6850 deliver the best experience using Battlefield 3‘s “medium” preset. Stepping up to the “high” setting requires an upgrade to at least a $180-190 solution like the GeForce GTX 560 or Radeon HD 6870. Finally, the “ultra” mode seems to be off limits to even $250 cards. EA DICE believes it’s suited to multi-GPU configs, and that may well be the case.
This little exercise has also taught us that Battlefield 3 has one heck of a graphics engine. Yes, this game can make $250 cards sweat if we turn up the heat, but we don’t have to. The “high” preset works nicely on mid-range cards, looks absolutely gorgeous, and can be difficult to discern from the “ultra” preset. That’s quite remarkable. Fluid frame rates and dazzling, antialiased graphics don’t often go hand-in-hand on this class of GPU—not with games designed to milk gaming PCs for all they’re worth.