Today’s mid-range GPUs in Battlefield 3

Blockbuster games capable of making high-end PCs sweat have become a rarity in recent years. That’s unfortunate but unavoidable in a world where consoles remain saddled with six-year-old graphics hardware—yet are, sadly, far more popular than high-powered gaming PCs. Today’s PC titles typically feature the same graphics engines and pull off the same visual tricks as their predecessors. They don’t look bad… but they could look so much better.

We PC gamers respond with understandable giddiness when a game bucks that trend. From the get-go, EA DICE has boasted that Battlefield 3 was designed for the PC first and foremost, then scaled back to accommodate the Xbox 360 and PlayStation 3. The impact of that design choice was apparent even in the first trailers. With a sophisticated rendering engine bolstered by DirectX 11 effects like terrain tessellation and DirectCompute-accelerated lighting, the war-themed shooter offers visual fidelity above and beyond most of today’s shooters. Large swaths of the game look good enough to put Crysis and its sequel to shame. That’s saying something.

Oh, sure, Battlefield 3 doesn’t have the most engaging single-player campaign, as cinematic and visually awesome as it might look. EA’s Origin distribution scheme and the awkward web-based server browser have also earned their fair share of critics. Nevertheless, this game looks gorgeous and features delightfully engrossing multiplayer. I’ve clocked in 10 hours of Battlefield 3 MP so far, and I’m still itching for my next fix. Few multiplayer games let players sneak around subway tunnels with silenced weapons, snipe enemies across vertiginous valleys, drive tanks around large wooded hills, and engage in dogfights with fighter jets. Battlefield 3 doesn’t just let you do all of those things—it makes them all fun.

Such a game must benefit from a fast graphics card with the latest bells and whistles, of course… but which one? Nvidia has pimped Battlefield 3 since even before its release, and although the title doesn’t feature any Nvidia-only graphical effects that we know of, one has to wonder whether it runs better on Radeons or GeForces. Just as importantly, exactly how much GPU horsepower does one need to enable most of the game’s eye candy? Is this another Crysis, or is it more forgiving of mid-range graphics hardware?

To answer these questions, we enlisted three pairs of competing graphics cards: the GeForce GTX 460 and Radeon HD 6850, which are both priced just under $150; the GeForce GTX 560 and Radeon HD 6870, which reside closer to $180-190; and the GeForce GTX 560 Ti and Radeon HD 6950 1GB, which you can usually find priced south of the $250 mark. These six cards represent today’s mid-range landscape. You might have seen them, perhaps with slightly different coolers and clock speeds, in the three cheapest builds from our latest system guide. We fired up Battlefield 3 on each card and got to tweaking graphical settings, measuring frame rates and frame times, and keeping a close eye on our seat-of-the-pants-o-meter.

Our testing methods

Now, we should clarify a couple of choices we made in our testing. First, we benchmarked the game’s single-player missions. While the same graphics engine powers the single- and multiplayer campaigns, testing a multiplayer game makes a number of variables rather difficult to control: network latency to the server, how many players are running across (and flying above) a map, how those players interact with the tester, and so forth. It can be done, but single-player skirmishes can be just as violent and explosive as multiplayer ones—sometimes more so—and they’re much more consistently repeatable.

You’ll also note that we tested at a resolution of 1920×1080 throughout. Battlefield 3’s lack of a built-in, scripted benchmark forced us to test manually, so covering multiple resolutions would have been a significant time sink. We therefore settled on what’s unarguably the most popular resolution not just for mid-range builds, but also in the market overall. Right now, Newegg lists no fewer than 182 1080p LCD monitors; the next most popular resolution is 1280×1024, with only 61 displays listed. 1080p displays range in price from $109.99 to well over $600, and they come with all manner of panel types and sizes, from 21.5″ TN designs to larger IPS offerings.

As ever, we did our best to deliver clean benchmark numbers, with tests run five times per card. Our test system was configured as follows:

Processor: Intel Core i5-750
Motherboard: Asus P7P55D
Chipset: Intel P55 Express
Memory size: 4GB (2 DIMMs)
Memory type: Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz
Memory timings: 9-9-9-24 1T
Chipset drivers: INF update 9.2.0.1025, Rapid Storage Technology 10.1.0.1008
Audio: Integrated Via VT1828S with 6.0.1.8700 drivers
Graphics:
  XFX Radeon HD 6850 1GB (HD-685X-ZNFC) with Catalyst 11.10 WHQL drivers
  Asus Radeon HD 6870 1GB 915MHz (EAH6870 DC/2DI2S/1GD5) with Catalyst 11.10 WHQL drivers
  Gigabyte Radeon HD 6950 1GB 870MHz (GV-R695OC-1GD) with Catalyst 11.10 WHQL drivers
  Zotac GeForce GTX 460 1GB (ZT-40402-10P) with GeForce 285.62 drivers
  MSI GeForce GTX 560 1GB 870MHz (N560GTX Twin Frozr II/OC) with GeForce 285.62 drivers
  Asus GeForce GTX 560 Ti 1GB 830MHz (ENGTX560 TI DCII/2DI/1GD5) with GeForce 285.62 drivers
Hard drives: Samsung SpinPoint F1 HD103UJ 1TB SATA, Western Digital Caviar Green 1TB
Power supply: Corsair HX750W
OS: Windows 7 Ultimate x64 Edition with Service Pack 1

Thanks to Asus, Intel, Corsair, Kingston, Samsung, and Western Digital for helping to outfit our test rigs with some of the finest hardware available. Thanks also to XFX, Gigabyte, and MSI for supplying graphics cards for testing.

We conducted testing using the Catalyst 11.10 WHQL driver from AMD and the GeForce 285.62 driver from Nvidia. We left optional AMD optimizations for tessellation and texture filtering disabled. Otherwise, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

We used the Fraps utility to record frame rates while playing a 90-second sequence through each level we tested. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card, in order to counteract any variability.
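Fraps logs each frame as a cumulative timestamp, in milliseconds, measured from the start of the benchmark, so subtracting consecutive timestamps yields the individual frame times we discuss throughout this article. Here is a minimal sketch of that conversion; the file name is made up, and the two-column (frame number, cumulative time) layout of the frametimes CSV is an assumption.

```python
# Minimal sketch: turn a Fraps frametimes log into per-frame render times (ms).
# Fraps records each frame as a cumulative timestamp from the start of the run,
# so the difference between consecutive timestamps is the time spent on a frame.
# The file name and the two-column (frame, time) layout are assumptions here.
import csv

def load_frame_times(path):
    """Return a list of per-frame render times in milliseconds."""
    timestamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                          # skip the header row
        for row in reader:
            timestamps.append(float(row[1]))  # cumulative time in ms
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

if __name__ == "__main__":
    frame_times = load_frame_times("frametimes_gtx560_run1.csv")
    print(f"{len(frame_times)} frames, "
          f"average {sum(frame_times) / len(frame_times):.1f} ms per frame")
```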

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Introducing the graphics detail settings

Over the next few pages, you might catch a glimpse of screenshots showing Battlefield 3’s video options screens. The various individual quality settings and sliders are largely self-explanatory, but what do they actually mean in terms of image quality?

EA DICE Chief Architect Johan Andersson provided some useful background information during his keynote speech at the Nvidia GeForce LAN 6 event. This slide from the presentation perhaps best sums up the different settings:

Andersson explained that the “low” detail preset roughly corresponds to the level of detail in the console versions of Battlefield 3. So, Battlefield 3‘s PC-centric goodness should be apparent even when we bump cards down to the “medium” detail preset—if they require it, that is.

One of the PC-centric goodies Battlefield 3 puts front and center is antialiasing, which is enabled by default in the “medium,” “high,” and “ultra” presets. The first two presets both rely exclusively on FXAA, which smoothes out jagged edges using a post-process shader instead of conventional multisampling. The “ultra” preset enables both 4X multisampling and FXAA, which produces the smoothest images but requires a powerful graphics card with lots of video memory. According to Andersson, running MSAA without FXAA leaves some jagged edges, because additional rendering stages are run after the multisampling is applied. (In our testing, we found that enabling MSAA by itself reduced average frame rates almost as much as combining FXAA and MSAA.)

Andersson says the “ultra” setting also enhances detail in other ways, by sharpening up shadows and increasing the amount of DirectX 11 tessellation applied to the game’s terrain. Based on the slide above, though, it sounds like the “high” setting is the sweet spot. That’s good news, because the game seemed to default to that preset when we first ran it on all six of our cards.

Now that we have a firmer grasp on things, we can try to dissect the implications of the three image quality settings we used. Here are three screenshots from the first mission of the game, Operation Swordbreaker. We tried grabbing screenshots in the two missions we chose to benchmark, but those featured too much movement across the screen—smoke, dust, and swaying tree shadows—to allow for fair image quality comparisons. This parking lot area in Operation Swordbreaker stayed mostly static as we took our screenshots.

Medium High Ultra

You can click the thumbnails for full-sized screenshots in lossless PNG format. Zooming in on certain parts of the scene gives us a better look at the impact of the different detail levels, though:

Medium

High

Ultra

The game’s ambient occlusion and shadowing effects look visibly nicer and more realistic as we ramp up detail levels. The “ultra” preset clearly delivers the smoothest antialiasing, as well. That said, the “high” preset isn’t a huge step down in image quality, and FXAA alone does a reasonably good job of keeping jaggies at bay.

How do Battlefield 3’s image quality settings affect areas with foliage and vegetation? Such areas are commonplace in the multiplayer mode, and they appear in several of the single-player missions. One of those missions is called Kaffarov, and we fired it up to take our next batch of screenshots:

Medium High Ultra

Medium

High

Ultra

The differences are much less obvious, but they’re there. The yellow grass looks flat in the “medium” preset, but it gains shading in the “high” preset and subtle amounts of extra detail in “ultra” mode. The “medium” screenshot looks a little brighter as a result.

Before we move on to our benchmarks, I’d like to encourage readers to open the full-sized screenshots for both Operation Arrowhead and Kaffarov, and compare image quality across the entire scenes. Slight image quality differences are easily noticeable in cropped closeups, but they can be tougher to identify if you’re not specifically looking for them—especially when you’re busy shooting bad guys and dodging grenades. With things in perspective, you may find yourself questioning the value of a GPU upgrade, or even wondering if you wouldn’t be best served by dropping to a lower detail preset and enjoying higher performance.

Fear no evil

We kicked off our performance testing by playing the beginning of the game’s Fear no evil mission—the part where you get to drive a tank through city streets and, at one point, through an office building. This sequence progresses from narrow alleyways through to a big, open city plaza, and it features explosions, particle effects, and smoke aplenty.

We started testing with the “high” detail preset, which seems to be what all the cards defaulted to. We’ll do some further tinkering with other detail presets on the next page.

Not too shabby. We’ve got two cards near the 60 FPS mark and a couple others just shy of it. The GeForce GTX 460 and Radeon HD 6850 seem to be dragging their feet just a tad, though.

Now, frame rates like those above only tell part of the story. Graphing individual frame rendering times, as we did in our article Inside the second: A new look at game benchmarking, gives us a much better sense of overall smoothness and playability. In a perfect world, we’d want cards to spit out frames in about 16.7 milliseconds each (which would mean 60 frames rendered per second). Just as importantly, we’d want to ensure consistently low frame rendering times. Even momentary spikes in frame rendering times can translate into perceived choppiness—and overall FPS numbers can’t capture that.

For example, imagine one hypothetical second of gameplay. Almost all frames in that second are rendered in 16.7 ms, but the game briefly hangs, taking a disproportionate 100 ms to produce one frame and then catching up by cranking out the next frame in 5 ms—not an uncommon scenario. You’re going to feel the game skip, but the FPS counter will only report a dip from 60 to 56 FPS, which would suggest a negligible, imperceptible change. Looking inside the second helps us detect such skips, as well as other issues that conventional frame rate data measured in FPS tends to obscure.
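To put numbers to that hypothetical second, here’s a quick sketch; the frame times below are invented to match the example above.

```python
# One hypothetical second of gameplay: steady 16.7-ms frames versus the same
# second with a single 100-ms hang followed by a 5-ms catch-up frame.
smooth = [16.7] * 60                     # perfectly paced frames
hiccup = [16.7] * 54 + [100.0, 5.0]      # one long frame, one catch-up frame

for label, times in (("smooth", smooth), ("hiccup", hiccup)):
    seconds = sum(times) / 1000.0
    fps = len(times) / seconds
    print(f"{label}: {fps:.0f} FPS, worst frame {max(times):.0f} ms")
# The FPS counter only drops from ~60 to ~56, but the 100-ms frame is a visible skip.
```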

In the three graphs below, we’ve plotted individual frame rendering times for our cards across the duration of the Fraps run. We’ll be comparing competing pairs of cards in each graph. Note that faster cards produce more frames and thus longer lines in the graphs.

Interesting. The Radeons and GeForces are almost neck-and-neck in the FPS chart, but as our frame time data shows, the GeForces seem to produce frames at a more consistent rate, with fewer spikes.

It’s also worth noting that the GeForce GTX 460 and Radeon HD 6850 produce a fair number of frames in 25 ms or more (which works out to 40 FPS or less). In a game like Battlefield 3, where instant reactions to in-game stimuli are prized, frame times of that length can be a handicap. That tracks with our seat-of-the-pants impression, which is that the GTX 460 and 6850 are a little too slow to provide a truly fluid experience in Battlefield 3 at these settings. We’ll see how these two cards fare at the “medium” detail setting on the next page.

We’re not done with our frame time data yet, though. We can use it to calculate the total number of frames that took longer than 40 ms to render across our five Fraps runs for each card. (40 ms per frame, in case you’re wondering, would yield a sluggish 25 FPS average).
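Here’s a rough sketch of that tally, assuming one Fraps frametimes log per run (the file names are hypothetical) and the load_frame_times() conversion sketched back in the testing-methods section.

```python
# Count frames that took longer than 40 ms across five benchmark runs.
# File names are hypothetical; load_frame_times() is the helper sketched earlier.
runs = [f"frametimes_hd6850_run{n}.csv" for n in range(1, 6)]

long_frames = 0
for path in runs:
    frame_times = load_frame_times(path)                 # per-frame times in ms
    long_frames += sum(1 for t in frame_times if t > 40.0)

print(f"Frames beyond 40 ms across all five runs: {long_frames}")
```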

The first chart confirms our earlier observations by telling us that, in this test, the Radeons produce more frames that take over 40 ms to render than the competing GeForces. In other words, in this scene, the GeForces produce a smoother gameplay experience than the Radeons, generally speaking. In this case, we’re seeing a substantial number of spikes above 40 ms from the slowest cards. Even though the GeForce GTX 460 and the Radeon HD 6850 produce comparable FPS rates, the GTX 460 clearly does a better job of avoiding long frame latencies.

Knowing about those spikes is important, especially in a case like the Radeon HD 6850 above. However, the top few cards, at least in the example above, don’t fare too poorly, and the differences between them are minimal. Rather than focus on the handful of outliers, we may want to think about things a different way. As we’ve said, what we want from our graphics subsystem is consistently low frame latencies. We can consider the general picture of frame latencies for each GPU by looking at the 99th percentile frame times: the threshold below which 99% of frames are rendered. We think this may be our best overall performance metric. Simply put, the percentile calculation doesn’t let unusually short frame times cancel out unusually high ones, the way average FPS results can. At the same time, this calculation excludes the rarest and most extraordinary latency spikes—the 1%, if you will—so it better reflects overall playability.
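The percentile calculation itself is nothing exotic. A quick sketch, using invented frame times:

```python
# 99th-percentile frame time: the threshold below which 99% of frames finished.
# Unusually fast frames can't cancel out slow ones, and the very rarest spikes
# are excluded. The sample data below is invented for illustration.
def percentile_99(frame_times):
    ordered = sorted(frame_times)
    index = int(0.99 * (len(ordered) - 1))   # nearest-rank style cutoff
    return ordered[index]

sample = [16.7] * 95 + [22.0, 31.0, 45.0, 52.0, 18.0]   # 100 hypothetical frames
print(percentile_99(sample), "ms")   # 45.0 -- the lone 52-ms spike is ignored
```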

These results reflect what we experienced when playing the game. The Radeons just don’t feel much choppier than the GeForces overall. The greater numbers of long frame times could be a point of note for competitive gamers who play Battlefield 3 online, though, since occasional skips can wreak havoc on your kill/death ratio.

Fear no evil—further tinkering

On the previous page, we saw that the GeForce GTX 460 and Radeon HD 6850 struggled to some extent with the “high” detail preset, while the GeForce GTX 560 Ti and Radeon HD 6950 1GB seemed like they might be able to do more work.

Let’s see how the slower pair of graphics cards performs at the “medium” detail preset. Since we’ve established that raw FPS numbers can be misleading, we’re going to lead with our frame latency graph.

Both cards clearly offer a smoother experience at this detail preset, with frame times hovering closer to, and often below, the 20-ms mark. Our FPS graph corroborates that—52 FPS works out to an average frame time of 19 ms. The FPS results don’t tell the whole story, though. Again, the Radeon exhibits more frame latency spikes than the GeForce. When we filter out the 1% of highest frame latencies with our 99th-percentile calculation, the Radeon HD 6850 proves to be a higher-latency solution than the GeForce GTX 460, by a handful of milliseconds. Since we’re talking about frame times in the 25-31 ms range, we expect these differences could be perceptible, though not huge.

What about “ultra” detail on the higher-end cards?

So much for that. The GTX 560 Ti and 6950 1GB perform worse at the “ultra” setting than the GTX 460 and 6850 do at the “high” setting, with all too many frames taking 30 ms or longer to render. That might pass muster in other titles, but it severely degrades the experience in a twitch shooter like Battlefield 3.

Rock and a hard place

Since Battlefield 3 offers such varied forms of gameplay, we benchmarked our cards in another mission: Rock and a hard place. We started recording performance as our jeep ran into an ambush, after which we disembarked and took pot shots at Russian soldiers. This sequence takes place in a small canyon covered by a thick canopy of trees. Shadows from the vegetation sway across the scene with the wind, and light rays shoot between the leaves. Exploding grenades and muzzle flashes punctuate the scene.

Again, we started testing at the “high” preset. This time, too, we’re going to start by looking at frame times before moving on to average FPS:

Whoa. The tables turn completely in this test, where the GeForces are the ones exhibiting wanton frame latency spikes, and the Radeons are the ones keeping sane. One thing still holds true, though: the GTX 460 and Radeon HD 6850 aren’t quite cut out for the “high” detail level. (We’ll see how they perform in the scene at the “medium” setting on the next page.)

Yep. The GeForces crank out so many frames in 40 ms or more, it’s not pretty. In fact, it’s much, much uglier than the show the Radeons put on in Fear no evil. Even when we buff out the very highest peaks with our 99th-percentile calculation, the GeForces come out looking very weak. It’s not just about numbers, either. Playing this section with a GeForce, the latency spikes were very palpable, causing animations to seemingly speed up and slow down at random. The game didn’t feel outright slow, and we didn’t notice any single, huge skip, but the illusion of motion was compromised—much more so than with the Radeons in Fear no evil.

Note that, looking at the raw FPS averages, you’d think nothing was wrong—fresh evidence of that metric’s pitfalls.

Rock and a hard place—further tinkering

As we did with Fear no evil, let’s see how our slowest and fastest card pairs handle themselves at the “medium” and “ultra” detail levels, respectively.

Turning down the detail doesn’t help the GeForce produce more consistent frame times. While both cards are neck-and-neck in our FPS graph, it’s pretty clear that the Radeon produces considerably smoother gameplay here.

What about the GTX 560 Ti and 6950 1GB in “ultra” mode?

Even when we look past the crazy amounts of frame latency spikes, neither the GeForce GTX 560 Ti nor the Radeon HD 6950 1GB is terribly fluid at the “ultra” setting. You’re going to need a much quicker GPU to enjoy all the eye candy Battlefield 3 has to offer—so long as you want the game to be fluid, that is.

Conclusions

Clearly, our testing shows that both AMD and Nvidia have issues with frame time spikes in Battlefield 3. Our testing also shows that the Radeons’ transgressions in Fear no evil absolutely pale in comparison with what the GeForces pulled off in Rock and a hard place. Take, for example, the Radeon HD 6870 and GeForce GTX 560: the former produced 57 frames that took longer than 40 ms to render in the first test, but the latter spat out a staggering 528 such frames in the second test.

Based on that data, we’d be tempted to recommend Radeons for use with Battlefield 3 on mid-range systems. Then again, the strongest message our data has to offer is that we need more of it. There are evidently stark differences between GeForces and Radeons in this game, and we wonder if, perhaps, other missions and levels would tip the odds the other way.

Now, Nvidia did throw us a bit of a curve ball earlier today, releasing a beta GeForce driver with purported Battlefield 3 performance improvements for DirectX 10 cards. After some spot checking, we’ve concluded that the new driver doesn’t improve either frame times or frame rates in our test scenarios. We still recorded astronomical numbers of high-latency frames in Rock and a hard place. Besides, the results you saw today were obtained with WHQL-certified drivers in a game that’s been around in some form or other for about a month and a half.

Although we can’t offer a clear-cut recommendation between brands without more testing, we’ve at least come away with a clear grasp of which classes of cards can handle BF3’s different detail settings. At a 1080p resolution, $150 offerings like the GeForce GTX 460 and Radeon HD 6850 deliver the best experience using Battlefield 3‘s “medium” preset. Stepping up to the “high” setting requires an upgrade to at least a $180-190 solution like the GeForce GTX 560 or Radeon HD 6870. Finally, the “ultra” mode seems to be off limits to even $250 cards. EA DICE believes it’s suited to multi-GPU configs, and that may well be the case.

This little exercise has also taught us that Battlefield 3 has one heck of a graphics engine. Yes, this game can make $250 cards sweat if we turn up the heat, but we don’t have to. The “high” preset works nicely on mid-range cards, looks absolutely gorgeous, and can be difficult to distinguish from the “ultra” preset. That’s quite remarkable. Fluid frame rates and dazzling, antialiased graphics don’t often go hand-in-hand on this class of GPU—not with games designed to milk gaming PCs for all they’re worth.

Comments closed
    • kamikaziechameleon
    • 8 years ago

    I think the community would benefit from a broad-spectrum roundup of GPUs with this fall’s games. I don’t buy a GPU for any one game but rather for general performance. The last great GPU roundup didn’t include micro-stutter analysis.

    • indeego
    • 8 years ago

    Why such an old Fraps Version? Changes in newer versions might be significant for testing:
    [sub<]Fraps 3.4.7 - 22nd October, 2011
    - Fixed recording rate being locked to a low multiple when Vsync enabled
    - Fixed unlocked recording speed after temporary slowdowns
    - Fixed Alt key interference with VMware/remote desktop applications
    Fraps 3.4.6 - 31st July, 2011
    - Fixed overlay color in Fear 3
    - Fixed detection of IL2: Cliffs of Dover and Bloodline Champions
    - Fixed counter appearing in Thunderbird and Pale Moon applications
    - Other minor bug fixes
    Fraps 3.4.5 - 28th May, 2011
    - Fixed some videos being recorded with scrambled colors
    Fraps 3.4.4 - 26th May, 2011
    - Increased performance capturing Vista/Windows 7 aero desktop (DWM)
    - Reduced memory footprint of Fraps process
    - Fixed Fraps crash while idling on desktop
    - Fixed gem glowing while recording The Dark Mod
    - Fixed hotkeys not responding when simulated from autohotkey/macro apps
    - Fixed graphic corruption in some OpenGL titles
    Fraps 3.4.3 - 12th May, 2011
    - Improved DirectDraw capture speed
    - Fixed recording crash on single CPU systems
    - Fixed invalid colors/crashing recording from 16-bit color games
    - Fixed corrupt/oversized AVIs being written at high resolutions
    - Fixed Dirt not loading with Fraps running
    - Other minor bug fixes and optimizations
    Fraps 3.4.2 - 14th April, 2011
    - Improved OpenGL capture speed
    - Fixed freeze when clicking on Minimize button
    - Fixed GL state not being restored properly
    - Fixed blank video showing in some AVIs
    Fraps 3.4.1 - 2nd April, 2011
    - Fixed crash when starting Fraps for some users
    - Fixed hotkeys not being detected when modifier keys held
    - Fixed keys getting stuck when used by both the game and Fraps
    Fraps 3.4.0 - 29th March, 2011
    - Added configurable buffer size for loop recording mode
    - Increased video capture performance at high resolutions
    - Fixed slow recording of Minecraft at default window size
    - Fixed benchmark crashing when configured to stop automatically after 1 second
    Fraps 3.3.3 - 15th March, 2011
    - Fixed mouse not responding in Ragnarok and some applications[/sub<]

    • kamikaziechameleon
    • 8 years ago

    I’m really happy to see TR shining a light on the frame stutter issues in these games etc. I’m also curious to see if any of the graphics companies are listening. I hope to see something addressing these issues in the next gen cards coming in the next 3-4 months.

    • Biggins
    • 8 years ago

    [quote<]Before we move on to our benchmarks, I'd like to encourage readers to open the full-sized screenshots for both [b<]Operation Arrowhead[/b<] and Kaffarov[/quote<] Did you mean [b<]Operation Swordbreaker[/b<]? It's at the bottom of page 3. I play a lot of Arma II so I realised that name didn't belong in a BF3 article.

    • lopri
    • 8 years ago

    Have you tried testing the frametime data on AMD platform? You will see quite a different picture.

    I know, I know. No one tests video cards on AMD systems for obvious reasons. But if you were going with something this technical, you can’t simply ignore the different platforms. It’s not too far fetched to think that AMD will first test their hardware on their platform, but for NV (and Intel) it is not likely a priority.

    Just throwing it out there so that you have it in mind as you move forward with this metric. You don’t want to declare something without knowing the full picture.

      • indeego
      • 8 years ago

      Do you have an example of a platform where this would make a difference? Are you claiming flaws in the Intel Core i5-750 or the chipsets or some other portion?

    • odizzido
    • 8 years ago

    awesome article, the information provided is so much more useful than just the standard FPS metric.

    I was really surprised at the complete reversal in frame times between the two stages tested.

    • njsutorius
    • 8 years ago

    I need to run Fraps. I’m a pretty steady gamer, and have played 20+ hours of BF3 multiplayer as of now. Everything is on high running a 6850, and I do not notice any frame lag at all.. these charts would indicate otherwise. I am running at 1920x as well.

    edit: and it looks fricking awesome

    • Ruiner
    • 8 years ago

    Nice read. Is a C2Q enough to push these cards in this game?

      • derFunkenstein
      • 8 years ago

      A Q6600 at stock speeds is probably not. OC’d to 3GHz it’s probably closer to worth it.

      • Krogoth
      • 8 years ago

      Q6600 will be alright at stock. A OC unit will handle the game fine, but the GPUs in question will be held back by the Q6600, unless you are gaming at Ultra settings.

      • BestJinjo
      • 8 years ago

      This game is almost entirely GPU limited.

      Bulldozer FX-4100 @ 2.0ghz produces nearly identical framerates with a GTX580 as it does @ 4.0ghz:
      [url<]http://www.techspot.com/review/458-battlefield-3-performance/page7.html[/url<] Therefore, almost certainly, even a stock C2Q won't bottleneck you in this game if you are playing on High or Ultra visual settings. It's 99% all about the GPU in this game.

        • Airmantharp
        • 8 years ago

        Switch to MP and watch the Bulldozer get run flat over.

        Single-player in BF games is notoriously CPU light- MP is notoriously CPU-limited.

    • ModernPrimitive
    • 8 years ago

    Thank you Cyril. Very informative and a refreshing and useful way of benchmarking. Hopefully this will push the industry to smoother hardware / software in the future.

    • Chrispy_
    • 8 years ago

    Man, I love your “frames over 40ms” results – so much more useful than an isolated fps result. With the illusion of motion failing above 40ms, that really is the difference between a slideshow and fluid gaming.

    In reality, it means that the 460 would need to drop down to “low”, despite its 60fps average, whilst the 6850 could be left on “medium”. That’s a seriously important distinction that the 59fps average result completely fails to show.

    • Pantsu
    • 8 years ago

    Great article! It would’ve also been nice to see what say 460 sli or 6850 cf would have produced as a result.

    Personally I found out that 1080p ultra worked exceptionally well when it comes to micro stutter using 6950 CF. It wasn’t perfect but the stutter wasn’t noticeable. On the other hand at Eyefinity resolutions the stutter was prominent as ever. It depends a lot on the settings like AA and SSAO/HBAO. Really hard to track down the reason or make much sense of it.

    BF3 does have a nice console command to check frametimes on the fly though, “render.perfoverlayvisible 1”. It actually showed that it was the CPU graph doing the stuttering, but adjusting clockspeeds didn’t seem to solve anything.

    • Kollaps
    • 8 years ago

    Interesting results, I’m just not sure how useful they are.

    My HD6870 handles all Ultra settings at 1080p with 2x AA just fine… when playing multiplayer. The nature of Battlefield 3 (and the series as a whole) makes me feel that testing single-player portions of the game gives readers the wrong impression of the hardware required to enjoy the game’s real meat, its multiplayer.

    I’m aware that getting consistent results in multiplayer is virtually impossible, but I’m also aware that a $180 video card is all that’s required to enjoy BF3 to its fullest.

      • ChronoReverse
      • 8 years ago

      You must have very forgiving standards. I have a 6950 and certainly can’t play on all ultra settings much less MSAA (which destroys FPS)

        • Kollaps
        • 8 years ago

        I guess? I haven’t touched the single player of Battlefield 3. I fully believe it’s far more demanding than the multiplayer, as that’s the usual case. But I certainly have no problems running the game on Ultra settings by my standards when playing online.

        But that’s my real point. There seems to be a significant difference in performance demand between the single player and multiplayer. Considering Battlefield 3 is a multiplayer-first game and that’s what the audience is looking for, it’s a bit misleading to suggest such expensive graphics cards are required to play the game at max or near-max settings.

        I’ve put about 20 hours into BF3 so far and my personal opinion is that it’s smoother than BF:BC2 on the same hardware. The player movement feels much smoother, in BC2 you’d get that jittery feeling even on much more powerful than should be necessary hardware.

        I haven’t read any articles on BF3’s performance scaling besides this TR article. I’d be curious to see how the game scales with changes in CPU or RAM. Especially RAM, it seemed odd to me that the test system had only 4GB of RAM, given such low prices it would seem logical to upgrade the system in order to make certain this isn’t a potential issue. In addition to a close examination of differences between single player and multiplayer.

          • njsutorius
          • 8 years ago

          I made a separate post above, but I fully agree with you. I notice no hardware-related lag when playing on high settings with the 6850. This article should read “single player”.

            • SHOES
            • 8 years ago

            Im playing at mostly high settings on a 4890 😉 looks great smooth as silk multiplayer.

          • ChronoReverse
          • 8 years ago

          I’m referring to multi too but each to their own I guess.

      • odizzido
      • 8 years ago

      Some people are more sensitive to frame rate issues than others. You are probably on the low end.

      Personally I hate 120hz TVs because the frame rate is horribly inconsistent, but a lot of people really like them.

      I also find the frame rate on console games to be really low. Just another reason I just don’t like console gaming.

        • ChronoReverse
        • 8 years ago

        120Hz TVs is a good thing if used properly.

        If you’re talking about the frame interpolation they use though to “invent” between frames, then it’s absolutely horrible. Not only does it look rather unnatural, it often generates garbage with motion since the interpolator has to guess.

        But the real use for 120Hz is that material like 24FPS film can be displayed properly since it’s a divisor of 120 (but not 60). With 60Hz, there’ll always be a bit of stutter since one frame will have to be display longer than the others. It’s most easily noticeable during a pan.

          • odizzido
          • 8 years ago

          I find movies, TV, and everything else shot at 24fps runs really choppy regardless of how it is displayed.

          It’s actually really distracting for me when I am at a theatre because the picture is so large.

          I hope that the next format we move to after blueray they push the frame rate to at least 50. It shouldn’t be too difficult since we are moving to digital storage anyways.

            • Firestarter
            • 8 years ago

            That will probably never happen. Sorry to bust your bubble odizzido, but the 24fps standard is there because of the ‘cinematic’ feel. Video material meant for TV broadcast has long been filmed at a higher frame-rate, which gave a TV soap a distinctive different look than a movie. This different (smoother) look therefore has an association with the lower quality production of a TV movie or soap, which is something that the traditional movie producer will absolutely want to prevent.

            That our brains can barely puzzle together a somewhat fluid picture from 24fps and that any panning scene looks like absolute crap is apparently beside the point. You can sit there in the cinema, admiring a great sharp shot of a landscape, which then suddenly turns into a blurry mess because of a slight pan towards the protagonist. At a higher frame-rate, it would most definitely have been a more beautiful scene.

            Oh and let’s not forget that everything that’s been digitally added these days would have to be processed for all the extra frames as well, adding to the cost.

            • indeego
            • 8 years ago

            The “standard” is there due to previous film technical limitations, and that has been lifted. You will see fewer and fewer 24 fps movies and shows over time. Action/panning scenes are very much improved with higher FPS, and it makes a huge difference.

            “Cinematic” experience my ass. In case people haven’t noticed you can get 20+ movies in the home for the cost of one movie in the theater for a family of four. Holding on to these “ideals” is a quick way of watching old dinosaurs fall fast… in 60+ fps.

            • BobbinThreadbare
            • 8 years ago

            There is one minor technical hurdle. It’s harder to light scenes properly when the camera shutter is closing faster.

            • Firestarter
            • 8 years ago

            Hey I’m not saying I like 24fps, just that the industry does. If you hadn’t noticed, hollywood is pretty conservative in its methods, they’d rather stick with what they know works, and that includes shooting at 24fps.

            • ChronoReverse
            • 8 years ago

            Oh I agree. I’d prefer 60FPS though just to avoid the hassle with syncing to 1080p60 though.

            However, I was really referring to the “judder” effect when you try to display 24FPS on a 60Hz screen.

            Obviously you have to repeat frames but some frames get repeated more. The judder is actually even worse than 24FPS itself (like you, I can easily see the effect even outside of pans) since not only is motion choppy for larger motions, it’s not uniformly choppy.

            • BobbinThreadbare
            • 8 years ago

            TV is 30fps

            • ChronoReverse
            • 8 years ago

            The standards are 720p60, 1080i60, 1080p30 in general.

            However, 60Hz is the most common refresh rate of the TV set itself. But for an input source of 24FPS (i.e., a movie), you need 120Hz to properly display it.

            • --k
            • 8 years ago

            Ever heard of 1080/24p? There are many tvs that can display it w/o using 3:2 pulldown.

            Btw, James Cameron is pushing for 60fps:

            [url<]http://m.hollywoodreporter.com/news/avatar-james-cameron-eyeing-60-frames-237522[/url<]

      • TardOnPC
      • 8 years ago

      This is the most ignorant post I have seen in a long time and I have been troll hunting on the BF3 forums. The results here are quite useful to the rest of us despite how happy you may be with your mediocre standards.

      A) The settings you use aren’t quite the stock “Ultra” settings now are they? Maybe you might be happy with jaggies, but how about the rest of us? I game at 1920×1200 and while most jaggies are taken care of with 2xFSAA they are still present and distracting in shooters.

      B) Your third sentence is so ignorant I can’t even rant properly.

      C) Your awareness of everyone else’s gaming requirements is non existent.

      Even though I disagree with your recommendation I am not going to be ignorant and say a mid range card does not cut it for everyone. I will say that the results listed here are enough for us to make informed decisions based on OUR REQUIREMENTS.

    • aggies11
    • 8 years ago

    Nice work on the article.

    BF3 seems pretty smooth to me (or rather, the type of gameplay leads to not noticing it as much).

    I’d be very curious to see what your results are if you do an “inside the second” article on Skyrim, as my initial experiences (on a Radeon 6950 at least) is that there is lots of micro-stutter (some preliminary looks at the FRAPS frametimes seems to confirm this).

    One thing about the “inside the second” concept in general: I think it might be worthwhile trying to define some terminology with regards to this subject. “Microstutter” gets tossed around a lot and there really isn’t that strong a consensus on what exactly it means.

    With your frametime spikes, I’d say that would qualify as something along the lines of “hitching”, where smooth gameplay hits a bump, a momentary “hitch”,where it is interrupted, and then resumes on. I think you guys have that one covered with your testing methodology.

    However I think there is another one that you hint at in the descriptions, but don’t really have a bead on in the data plots. The idea of “jitter”. Where smooth movement actually does not appear smooth, but rather jitters back and forth. The idea being that it’s not a spike in frametimes, but rather an oscillation in changes BETWEEN frame times. Eg. If the time between two consecutive frames changes 5ms, back and forth. There is no spike, but since the time each frame is rendered is changing from one to the next, movement appears to happen at different rates, ie. jittery vs smooth movement. (Often when panning the camera, or moving in a straight line). (This is the one I experience a lot of in Skyrim so far, and to be fair, Oblivion before it, with different brand and architecture GPUs).

    It would seem that this jitter doesn’t have to oscillate every single frame, but if there are enough shifts in a 1second sample, you start to notice it. So depending on framerate, even a shift in frametime every 5-10frames can still make a noticable difference (3-6 times per second). The smaller the time between shifts, the more jittery movement appears.

    It’s a tricky business, and I admire your pioneering attempt to really quantify this. You would think that as frames complexity doesn’t really change, then the frametime shouldn’t either, but I guess there are other issues going on, both in the driver and engine level, that can introduce latency into the rendering process.

    It is something that does make a noticable difference (depending on the style of game) so worth bringing some attention to!

      • xeridea
      • 8 years ago

      The original article on the new testing methodology has frame-by-frame graphs. The jitter you speak of is apparent on some setups, most commonly SLI/CF, though the frame times in multi-GPU are generally low enough not to be too disruptive, though jitter can often be huge…. alternating 5ms and 30ms frames. It seems to happen on both AMD and Nvidia cards… in different games/scenes. Nice that you bring up jitter though… it is a common metric used for internet latency as well as average latency.

    • south side sammy
    • 8 years ago

    this may be one of those questions alluded to in your last “etc.” thread…………….. so, the 1gig cards suck, what about the cards with 2 gigs. Any smoother game play ?

    and I don’t mean more fps but smoother game play…………

    • luisnhamue
    • 8 years ago

    With my 6950 I mix settings. I’ve managed to get a cream from ultra

    • dashbarron
    • 8 years ago

    [quote<] EA DICE believes it's suited to multi-GPU configs, and that may well be the case. [/quote<] Well.... 🙂 We finally have a game that can push PC cards. I enjoyed this article and would LOVE to see a broad range test of higher (including multi-GPU) and lower cards: let's see what cards can chew this game up and what ones are really stressed. Go TR reviewers go!

      • Krogoth
      • 8 years ago

      Current high-end GPUs (6970 and 580) laugh at BF3 under 2 Megapixel gaming. It only gives mid-range cards a problem if you set it at “Ultra” settings a.k.a (High + forcing high-levels of AA/AF). They get by with High settings as long as you don’t care for AA/AF (majority of the FPS junkies).

      Multi-card setups are only needed if you want to game at 4 Megapixels at Ultra settings.

      It is a bit of an exaggeration to say that BF3 pushes current GPUs to their knees. It simply sets the bar higher, that is all.

        • dashbarron
        • 8 years ago

        It’s been so long since anything has though. Benchmarks are always yummy.

          • paulWTAMU
          • 8 years ago

          i hate to admit it but in the last 2 years or so I’ve realized I’m actually a graphics whore 🙁 I never noticed so long as things kept advancing but i want improvements, and in the last 3-4 years they’ve been slow…

    • Jon1984
    • 8 years ago

    Excellent review as always.

    I have a overclocked 560 Ti (950/1100) and I’ve had a really pleasant experience at 1080p Ultra settings with 2x MSAA. Although multiplayer certainly is another story 😉

    • kamikaziechameleon
    • 8 years ago

    I know it’s outside the price range of this review, but I would have loved to see a 570 in here as well as a 6970.

      • d0g_p00p
      • 8 years ago

      I guess you missed the name of the article.

        • kamikaziechameleon
        • 8 years ago

        I’d also like to see What kinda result we get with crossfire vs SLI in this scenario since its apparent the single gpu stutter is becoming an ever present issue.

    • anotherengineer
    • 8 years ago

    Cyril, do you use fraps to also do the milliseconds/frame benchmarks?

      • Cyril
      • 8 years ago

      Yep.

        • anotherengineer
        • 8 years ago

        Cool.

        Is there anything extra needed for data interpolation or does it just spit out the numbers?

        I am curious to see if spikes I get in CS:S are attributed to micro stutter. (yes I’m a geek)

          • Cyril
          • 8 years ago

          Fraps reports cumulative frame times relative to the start of the test (so the numbers start at 0 and go up incrementally). You have to do a little Excel magic to work out absolute frame times, but it’s not hard.

            • anotherengineer
            • 8 years ago

            Excel………magic……………pffffffffffffff

            Sounds like a job for Matlab !!!!!!!!!! 🙂

            O wait I forgot how to use it too 🙁

    • S_D
    • 8 years ago

    Fantastic article, and it puts a whole new perspective on the frames-per-second debate.

    FWIW I’m playing BF3 on our 40″ Bravia with an overclocked 5770, and when leaving settings on ‘auto’ BF3 picks ‘high’ for just about everything. It may not be buttery smooth, but I’m impressed by the visuals nonetheless, as I’m holding out on an upgrade till the 7xxx series launches early next year.

    I fear my enjoyment of this will be curtailed today anyway, as Skyrim is due to arrive at home and the wife has ‘booked’ the bravia for the next 6 months. Time to move the PC back into the study…

    Definitely do more investigation of this type, please, on future gfx card reviews.

      • bcronce
      • 8 years ago

      time to get a 120hz 27″

        • odizzido
        • 8 years ago

        120hz TVs are terrible, so I really hope you are talking about a monitor 🙁

          • ChronoReverse
          • 8 years ago

          In themselves 120Hz TVs are a good thing but I don’t believe any of them actually takes a 120Hz input so it’ll be doing interpolation garbage.

            • paulWTAMU
            • 8 years ago

            wth is the point if they can’t take 120 input?!

            • ChronoReverse
            • 8 years ago

            24FPS playback is the primary reason. 24FPS doesn’t divide into 60Hz properly, so you get frames that end up being displayed longer than others, resulting in a weird “judder” effect.

            I sometimes wish it wasn’t pointed out to me because now I can’t “unsee” it.

    • Bensam123
    • 8 years ago

    Curiously how do you know that the Radeons aren’t dropping frames to process them faster? What is the implication of doing such a thing too?

    That aside, you guys really have to look into Mechwarrior: Living Legends and Hard Reset. They look amazing graphically… Has anyone else played them so they can add a bit of merit to this?

      • bcronce
      • 8 years ago

      Reporting a frame rendered, but not rendering would be considered cheating. nVidia does drop frames to reduce micro-stuttering, but they still render the frame so it doesn’t render faster.

        • Bensam123
        • 8 years ago

        Is it really cheating though? I’m sure you also read about Lucids take on how they’re trying to reduce micro-stuttering (which does drop frames) and Scotts take on it, which actually makes the whole experience more fluid.

        I’m uncertain how you can render a frame and drop it at the same time…

    • swiffer
    • 8 years ago

    I very much enjoy the Tech Report’s new method of ranking cards. Even though I have no interest in BF3, this was a great read.

      • SomeOtherGeek
      • 8 years ago

      That is exactly what I wanted to say. Thank you, TechReport, keep up the good work. At 3 AM it is still a great read!

      • cheapFreeAgent
      • 8 years ago

      Agreed. Hopefully the developer will note this.
      Thanks, TR.

        • Duck
        • 8 years ago

        I see it more as being educated/enlightened at least 1 step over regular tech nerds.

        The job TR are doing is most excellent.

    • DrDillyBar
    • 8 years ago

    A very good read; But I don’t play BF3.
    I’ve been playing the ME’s in prep for 03/12.
    🙂

    • C-A_99
    • 8 years ago

    At BF3 beta, lowest settings still looked about twice as good as on the 360, perhaps due to glitchy settings and how the 360 takes forever to properly load textures into memory. In any case, 1920×1080 resolution and anisotropic filtering (which is relatively inexpensive compared to how much it improves the image quality) make it look much better.

    • Compton
    • 8 years ago

    I’m readying Skyrim as I read this, and I’m hoping that it’s another “envelope pusher” for the PC.

    I have to confess that I’m not as concerned with the technical stagnation of console ports, but more the gameplay and mechanics becoming vestigial appendages on what should be unquestionably the best gaming platform.

    I still dig the “inside the second” methodology too.

      • yokem55
      • 8 years ago

      I’ve put a couple of hours in with my GTX260 and as far as I can tell it handles Skyrim on Ultra settings fine at 1680×1050. Quite the sharp game too – especially in texture detail.
