Battlefield 4 system requirements call for more RAM, storage

Battlefield 4 is one of the most anticipated games of the season, and now, we know what sort of system you’ll need to run it. The official Battlefield Twitter has posted the PC system requirements.

The biggest changes from BF3’s requirements are in the memory and storage departments. You need at least 2GB of RAM and 20GB of storage to play BF3, but 4GB of memory and 30GB of storage are required by its successor. Keep in mind that those are the bare minimums. EA’s recommended spec calls for 8GB of system memory, doubling the memory recommendation attached to BF3.

Interestingly, the minimum CPU and graphics requirements really haven’t changed. BF4 calls for a 2.4GHz Core 2 Duo, 2.8GHz Athlon X2, or better. On the graphics front, you’ll need at least a Radeon HD 3870 or a GeForce 8800 GT with 512MB of onboard memory. The recommended specs have been tweaked, though. EA now suggests six-core AMD processors instead of quad-core ones. (The quad-core Intel rec is unchanged.) It also recommends newer Radeon HD 7870 or GeForce GTX 660 graphics cards, both of which are mid-range offerings priced around the $200 mark. Oddly, the specs go on to recommend 3GB of graphics memory. Only a handful of GeForce GTX 660s are available with 3GB of RAM, and I can’t find any Radeon HD 7870s with more than 2GB.

Battlefield 4 is due on October 29, so there’s plenty of time to upgrade in preparation for its arrival. Based on the specs changes, it seems like the latest chapter in the franchise will use higher-quality art assets than its predecessor, which already has great graphics. I can’t wait to see what BF4 looks like with all the eye candy turned up—and without the compression artifacts that plague YouTube trailers.

Comments closed
    • Bensam123
    • 6 years ago

    I don’t really know why any of the specs increased. Based on what I’ve seen the game looks identical to BF3. Perhaps they increased the amount of physics or something? Maybe the wave tech?

      • Airmantharp
      • 6 years ago

      Remember BF:BC2? Remember when you could literally raze 99% of the map? You can’t do that in BF3. They actually turned down the physics in BF3, as well as some of the detail, and deliberately shipped lower-resolution textures and so on to keep the game size and the memory footprint down.

      BF4 is a chance to exceed everything they’ve done before; and the specs are actually below what most people would want to use for BF3 in a real multiplayer scenario, i.e. a huge map with all 64 players, with all vehicles in play, all wreaking havoc and blowing stuff up at the same time :).

        • Bensam123
        • 6 years ago

        Yeah, I remember the highly scripted buildings, pretty cool stuff… I didn’t really notice that in BF3, but that’s about right. Now that you mention that, hopefully it’s back in BF4.

          • Airmantharp
          • 6 years ago

          In the end it’s all scripted- it’s just a matter of how fine they want to go, which is a matter of how much resources they want to commit. But the fact that you could actually knock everything down to craters and rubble was very, very cool, and really enhanced the gameplay. Need to enter a building on the other side where there’s no door to hit the objective? No problem, just make one! Stuff like that was pretty cool, and mostly doesn’t exist in BF3.

            • Bensam123
            • 6 years ago

            Physics doesn’t need to be scripted.

            • DeadOfKnight
            • 6 years ago

            I sometimes wonder if people even know what physics is.

            Most games don’t even have any physics calculations.

            • Klimax
            • 6 years ago

            I think he wants full simulation, but I’m not sure any number of Titans paired with a 3930K would be sufficient…

            • Bensam123
            • 6 years ago

            You can still do impressive approximations, even if they aren’t ‘pure’. PhysX did a pretty good job of that before it was neutered. Cellfactor comes to mind.

            • Airmantharp
            • 6 years ago

            Well, my perspective (and I’m being naturally anal here) is that any physics that’s based on code, without any analog inputs, is scripted. The possibilities may well be statistically infinite, but then it’s not a question of whether or not it’s scripted, but to what degree.

      • JohnC
      • 6 years ago

      Commander mode, dude. Lots of RAM needed for those huge “satellite view” textures 😉

        • Airmantharp
        • 6 years ago

        LOL. And I’m actually looking forward to ‘Commander Mode’.

          • JohnC
          • 6 years ago

          Me too! Can’t wait for EA’s app with “click to buy another Tomahawk for 30 BFcoins” built-in microtransactions!

    • Chrispy_
    • 6 years ago

    Recommended specs? (copy and paste from XBone/PS4 hardware list) Done.

    I smell a super-lazy, console-port cash-in coming….

    Oh wait, it’s EA:

    I [i]know[/i] a super-lazy, console-port cash-in is coming [i]for sure[/i].

      • Airmantharp
      • 6 years ago

      Hey now, the only reason it’d be a super-lazy, console port cash-in is because these consoles actually exceed the average gaming machine by quite a bit, and exceed all but the highest-end gaming machines with respect to video memory available to games (whether or not they actually use it).

      And that doesn’t offend me in the least, because it means that the game will look great and play great on current hardware while benefiting from future hardware down the road. Remember that DICE/EA saved the more graphically challenging maps for the expansions in BF3? Expect those ‘recommended’ requirements to become ‘minimum’ requirements as the game matures :).

    • SternHammer
    • 6 years ago

    Another PR BS.

    I’ve played the Alpha trial release, and the game plays and looks identical to BF3.
    It uses the same primitive, outdated physics engine it has had since BF2, and there’s no cover mechanics of any sort. I really don’t see the reason behind needing a new CPU.

    On the graphics side, there’s no tessellation, and textures look exactly the same as they did in BF3, with the addition of a grey tint to cover the ugly low-res pop-ups in many areas.

    There’s absolutely no reason to get anything new for this game.
    Just a waste of $$$ XD

      • JohnC
      • 6 years ago

      Don’t get it, then. Noone is forcing you.

        • Airmantharp
        • 6 years ago

        That, and he just complained that the Alpha actually looked like an Alpha- he must be really pissed!

        • SternHammer
        • 6 years ago

        Gonna get it for 8 hours, complete the campaign and then refund it XD

          • Airmantharp
          • 6 years ago

          So, I’m 99% certain you’re just trolling- but very poorly- and 1% certain you’re just an idiot. Please don’t come back.

            • SternHammer
            • 6 years ago

            You sounded like an angry, cheated-on girlfriend lol.

            • Airmantharp
            • 6 years ago

            Honestly? I’m just being honest :).

            But I’ll give you the benefit of the doubt:

            Battlefield is all about multiplayer. The campaigns have improved significantly, and I did enjoy the campaigns in BF:BC2 and BF3 as intentionally corny as the first one was and as overtly serious as the second one was, regardless of their B-movie action-flair content. They were at least fun, though quite linear and scripted.

            And I get your ‘8 hours’ comment- play the game, return it on Origin, call it done. I see where you’re coming from. But I also think that, if you’re being serious (the chances of which I’ve outlined above), you’re doing the series a grievous disservice. And that’s the only reason I responded to your post :).

            • JustAnEngineer
            • 6 years ago

            Those are not mutually exclusive possibilities.

            • Airmantharp
            • 6 years ago

            And that’s the point!

    • squeeb
    • 6 years ago

    6 cores for AMD eh…is it really time for me to finally ditch my phenom II x4 965 @ 3.6 ? What are the opinions on current FX chips?

      • JohnC
      • 6 years ago

      They make nice space heaters. Especially the FX-9590.

      • Airmantharp
      • 6 years ago

      Along with JohnC’s jab at AMD’s super-TDP high-end CPUs with guaranteed high clocks, I’ll point out that it’ll take a lot of work on your part to really exceed the performance of your current CPU in any meaningful way.

      If you’re running decent DDR3, your best bet is to get an Intel quad-core and motherboard combo on sale, and overclock the piss out of that, before investing in a new AMD CPU.

      • JustAnEngineer
      • 6 years ago

      It’s probably not safe to ask that question in TR’s processors forum.
      https://techreport.com/forums/viewforum.php?f=2

        • squeeb
        • 6 years ago

        🙂

      • maxxcool
      • 6 years ago

      @3.6GHz you will see zero improvement for games. No worse, though… and you will see other tasks such as ripping and encoding get better, but Bulldozer/Piledriver still struggles to beat a mildly overclocked K10/K10.5 core.

      • sschaem
      • 6 years ago

      If you trust this:

      http://gamegpu.ru/action-/-fps-/-tps/battlefield-4-alpha-test-gpu.html

      the 955 @ 3.2GHz gets 37/43 fps (min/avg) and the FX-8350 gets 48/57. That’s about 33% faster at stock, on a Titan at 1080p.

      At 3.6GHz, even if you overclock the FX-8350 another 10%, it’s not worth the $179… Keep that $ and use it to buy a next-gen AMD GPU. (I assume you don’t have a GTX 780 or Titan already.)

    • kamikaziechameleon
    • 6 years ago

    I’m a little put off from getting truly excited about this game because much of the footage they’ve released is underwhelming. From what I’ve gleaned, the game is going to be mainly a cash-in.

      • JohnC
      • 6 years ago

      …but, but “Commander mode”! Tanks falling through pre-scripted “holes”! Pre-scripted building destruction! Very innovative! Not as much as the fish AI in the next CoD, but still impressive!
      😉

        • Airmantharp
        • 6 years ago

        lol… I’m actually optimistic with everything they’re doing- but I know not to take anything from EA without a reasonably large grain of salt 😀

          • kamikaziechameleon
          • 6 years ago

          To both of you gentlemen: I perceive the changes they show, but I’m honestly dubious about whether this needed to be a numbered release. Looks like a map pack, honestly. The features are flipping cool, but the core game is so similar that, beyond the commander tablet, there is nothing else that needs a sequel to justify itself.

            • Airmantharp
            • 6 years ago

            Much the same could have been said about BF:BC2 -> BF3; and yet, so much had been improved that the new game really was a new game, even though they’d actually turned down the detail levels for a number of things!

            I’m expecting many of the changes to be procedural, as I believe that you’re imagining, but I’m also expecting enough new ‘stuff’ to keep the game exciting.

            BF3 is really showing its age, by the way. Cool as it was/is, the graphics are highly dated, with low-resolution textures, unoptimized levels (from a performance standpoint), and a lack of detail in everything that’s not immediately in your face.

            • JohnC
            • 6 years ago

            My dear friend, “dated graphics” is a complete non-issue for multiplayer FPS games. I know many, many players who simply ALWAYS set the multiplayer game on lowest graphical settings (to get best performance and less visual distraction from pretty shadows, lens flares and flying bits of “destructible environment”) and still enjoy playing it, day after day. The “graphics” thingie is ONLY relevant for dedicated single-player FPS/RPG or MMORPG games with fairly slow pacing and large explorable vegetation-filled non-urban areas/levels. Even there it’s still less relevant than the storyline or good dialog writing or good AI scripting.

            The ONLY useful aspect of “improved graphics” in a game like BF4 is to impress the simple-minded idiots with game trailers, and for use as a benchmark by “armchair hardware experts” (who spend more time throwing bar graphs at each other than actually playing any games) and hardware review sites.

            • Airmantharp
            • 6 years ago

            I’m among those that sets MP games to lower settings to make sure the other bastard dies instead of me (as much as I can help it…).

            But I do appreciate good graphics; I just don’t have the horsepower on either of my gaming systems to reliably play with everything reasonably turned up :).

      • travbrad
      • 6 years ago

      [quote]From what I gleamed about this the game is going to be mainly a cash in.[/quote]

      Why not? It works for Call of Duty. They release what is mostly the same game every single year and have record sales. Cash-ins are apparently what gamers want.

    • odizzido
    • 6 years ago

    I bet the game won’t use anything near 8gigs of ram on max. These recommended and minimum specs are always so off.

      • Airmantharp
      • 6 years ago

      They’re usually off, but they’re usually far too low :).

    • Airmantharp
    • 6 years ago

    –WARNING- Wall of text cross-post from the same thread at the [H] Forums follows, primarily concerning VRAM usage–

    Guys, remember frame buffers- the amount of memory needed for the final rendered ‘frame’ that is output to a monitor or to an array of monitors- are actually extremely small, on the order of 32MB for 4k (that’s 3840x2160x32bpp). It’s not like increasing resolution by itself has any real, immediate effect on VRAM usage until the game starts performing hugely memory intensive operations on a per-pixel basis like MSAA, some shaders, and super-sampling based AA.
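    To put rough numbers on that, here’s a back-of-the-envelope sketch (a hypothetical Python calculation, not anything from DICE or the GPU vendors) of how small a final frame buffer really is compared to the VRAM on a typical card:

```python
def framebuffer_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Memory for a single output frame (32bpp = 4 bytes per pixel)."""
    return width * height * bytes_per_pixel

# Common output resolutions and their single-frame cost
for name, (w, h) in {"1080p": (1920, 1080),
                     "1600p": (2560, 1600),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {framebuffer_bytes(w, h) / 2**20:.1f} MiB")
# 4K works out to roughly 32 MiB per frame
```

    Even a triple-buffered 4K swap chain stays under 100 MiB, which is why output resolution alone barely dents VRAM; it’s the assets and per-pixel AA work that do.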

    This is even more true for the incoming consoles, which will likely have the most efficient forms of AA enabled and accelerated so that it can always be left on.

    Now, what’s going to eat VRAM in next generation games is going to be assets and those per-pixel shader routines, both of which are necessary in order to take the next step towards photo-realistic real-time rendering, and neither of which we’ve really seen pushed to a consumer-level scale to date.

    Essentially, if a game can even be released to run on the current console generation, including BF4, it does not fit into this category- so yeah, all of us, including myself, will probably be fine with our current generation cards.

    BUT, and here’s the real point: games designed around the resources available on next-generation consoles will have FIFTEEN to TWENTY times the graphics resources of those designed around the current generation of consoles. That’s a tremendous jump, and it’s far larger than the jump from the generation before last (the PS2, Xbox, and GameCube) to the current PS3 and Xbox 360.

    Now, consider for a moment what happens when a new console generation arrives. It usually takes a year or two for developers to really get comfortable with the new systems and start bumping into their limits, even though consoles are usually between one and two generations behind current PC technology, since the overall hardware and software architectures are usually so different, and because there are usually tremendous memory limitations that need to be creatively minimized.

    Once the developers get comfortable, though, they can start focusing on PC games again. They know how to keep the game engines in check, but they can let the artists, level designers, character designers, and effects engineers (shader coders) go wild, since all of these things can be quickly scaled back selectively for each platform that the game will be released for. Hell, most commercial game engines these days can be used to scale the same game from high-end PCs down to the most limited consoles (Wii U) and even most tablets, and across platforms like Linux, Mac OS, iOS, and Android, at the same time.

    So there’s no real reason for the developers to hold back; they can set their ‘creative’ limits as high as they can afford, and then tailor the assets for each platform release, getting the most out of what each platform is capable of. And that’s where the unprecedented resources available on these incoming consoles from Sony and Microsoft have me excited.

    Essentially, if history holds true, developers will go buck wild with game assets for truly next-generation games. Cross platform games like BF3, Crysis 3, and even the much older Dragon Age II are good examples here. These games ran plenty well on the current geriatric consoles, yet the PC releases were and are capable of bringing even the most high-end PCs to their knees. You think that 1GB of VRAM is enough? Try turning on the high-resolution textures in DA2. It’s not even a terribly complex or wide-open game like Skyrim.

    And that’s my point here- Game developers haven’t had to develop ‘to a point’ for quite some time. If they have 3GB or 4GB of VRAM allocated for a particular game’s graphics, they’ll typically overshoot that by 50% during asset development and then scale it back as they put together the different versions for each platform they’re releasing on. Almost all AAA cross-platform titles that hit the shelf in the last couple of years are examples of this process, where the PC versions shipped with far more graphics assets than the console versions.

    And I don’t expect this trend to stop- rather, since it’s the dream of graphics artists to put as much detail as they can into the games that they work on, I expect it to only get ‘worse’ from a VRAM usage standpoint. If their final target is in the 3GB to 4GB range like I mentioned above, I expect them to build to a 6GB to 8GB asset range, and while that will be scaled down for the console releases, I fully expect developers to give PC gamers the option to try and play their games with all of the assets they created. And why shouldn’t they? Everyone still wins- the graphics artists get to see more of their work in the final product, the developer gets to tout a better looking game, graphics card vendors will have a new, more demanding game that they can throw into bundles and use as a marketing tool to sell more cards, and we, the gamers, get better looking games!

    Final point- this whole situation arises because memory is cheap, which itself arises both because foundries get better at making memory wafers all the time, and because memory die densities constantly increase as new nodes come online, which the foundries relentlessly pursue.

    So you can expect that making an HD7870 or GTX770 with 8GB of RAM is actually pretty easy and cheap to do, as would an HD7970 or GTX780/Titan with 12GB. Actually, we know that it’s easy because the professional versions of these cards already have double the memory of the consumer versions; so it really just comes down to cost!

    And cost is far less of an issue than most think. Yes, GPU vendors usually charge significant premiums for models that have above-average amounts of memory, just like Apple will happily charge you $100 for that extra 32GB of flash on your iDevice even though it costs them all of $3 to $5 more to produce.

    But there’s plenty of other factors here that can quickly nullify the cost issue- demand, cost of goods, competition, and economy of scale all work together to bring the cost to the consumer down faster than you’d expect. In reality, when games start shipping that can actually make use of ~6GB of VRAM on the PC, the early adopters at that time (and now, if you bought an HD7970 or GTX Titan with 6GB) will pay a higher price, but competition will very quickly erase that premium.

    So, here’s what I expect. By the time the holiday season rolls around next year, a smattering of games PC gamers actually want to play that can make use of 6GB to 8GB of VRAM per GPU will be on the market, and GPUs with 6GB+ of VRAM will be available at reasonable prices, based both on respins of the GPUs coming out this fall (at least from AMD) and on GPUs from both AMD and Nvidia built on TSMC’s next node, whatever that turns out to be.

    In the interim, and in summary, my advice to anyone that has a competent graphics setup today is to wait. When AMD releases their stuff in the next month or so, and then that stuff goes on sale for the holidays, that’s the time to buy- just make sure you buy something with at least 6GB of VRAM, only settling for 4GB cards if you absolutely must. And good luck!

      • JohnC
      • 6 years ago

      TL;DR version? :-/

        • Airmantharp
        • 6 years ago

        I did put the warning at the top… and no one piece can really be said without the support of the other pieces, in a public discussion like this.

        • B.A.Frayd
        • 6 years ago

        Read the last paragraph.
        You should read the whole thing because it is well stated and reasoned, but I understand lazy. 😉

      • kilkennycat
      • 6 years ago

      [quote]In the interim, and in summary, my advice to anyone that has a competent graphics setup today is to wait. When AMD releases their stuff in the next month or so, and then that stuff goes on sale for the holidays, that’s the time to buy- just make sure you buy something with at least 6GB of VRAM, only settling for 4GB cards if you absolutely must. And good luck![/quote]

      Maybe by that time AMD will have a stable WHQL driver with no residual CrossFire issues… 🙁 🙁

      Better keep wallets closed and wait until all the next-gen GPU offerings from BOTH AMD and nVidia are exposed, methinks…

        • Airmantharp
        • 6 years ago

        Oh, I’m personally watching the Crossfire stuff closely- my intention is to upgrade my 30″ 1600p monitor to ~30″ 4k, and AMD tends to be the best value for such setups. But it damn well better work flawlessly.

          • HisDivineOrder
          • 6 years ago

          If you want two cards that work flawlessly together, then… well, you need to readjust your expectations. If you want the two-card setup that has the LEAST problems and is MOST OFTEN useful, then I suggest you not lock your brain on “saving money.” You’ll be inviting yourself to have problems if you go with the platform from the company that didn’t even figure out that frame latency was an issue on their own, until their competition decided to do the honorable thing and walked them step by step through how to check the issue, after several sites pointed out how badly AMD was screwing it up (after years of doing so).

          Consider that AMD only just figured this out in 2013. Do you really expect they could implement a hardware fix to their problem (like the frame metering that nVidia’s built into SLI for years now and advertised that they did, too) when the hardware they’re putting out in a month-ish is the same hardware that should have come out at the beginning of this year if AMD wasn’t the company that delays things just cuz?

          You just have to have realistic expectations. AMD is fixing the Crossfire issues right now with 7xxx series on a case by case basis, taking their time, and only slowly catching up to “important” games (read: benchmarks). If you want full support for systems with two GPU’s, don’t try to save the petty cash. Pay up for nVidia and be sure of your purchase.

          It’s sad because AMD has great hardware otherwise with lots of VRAM, but when you can’t count on them to do the most basic things in one of their main business markets, well… there’s the reason they’ve traditionally been the “el cheapo” route. They constantly screw up the little things.

            • Airmantharp
            • 6 years ago

            Well, I’m running a pair of 2GB GTX670’s in SLi right now (and an x8/x8 PCIe 2.0 configuration at that), and it works pretty darn well. As smooth as one GTX670 would at 1080p? Not a chance. But two cards do work far better at 1600p than one would; and in reality, it was the cheapest, fastest, quietest way, at the time, to get the performance I was looking for, over a pair of 2GB HD6950s, that were the previous champions in that regard (except, you know, in playability).

            AMD will have to prove that they’ve got it down, and they’ll have to exceed Nvidia with this next generation in every meaningful way. And I believe that they can do it.

            So here’s the thing- AMD’s ‘large’ GPU is far more capable, resource-wise, than Nvidia’s mainstream GPU (used in the GTX670/680/770 at least), and should have been faster all along. AMD also has a propensity to include more VRAM on their cards, which originally pushed me to HD6950’s instead of getting a second GTX570 SC, as I only had the 1.25GB version and the 2.5GB version was far too expensive- though in hindsight if I’d gone that route I’d still be running a pair of them. AMD really falls down in two key areas for me, though, which are:

            1. Drivers. I don’t mean general stability or compatibility, I mean AMD’s ability to properly finesse their drivers to consistently produce excellent gameplay experiences. Skyrim was great on my HD6950’s (once they figured out the CF issue), but BF3 was really bad. I just thought it was the game until I read more, and then bought a single 2GB GTX670 on release. Technically, the single Nvidia card was at best 80% as fast as my AMD cards, but in BF3 at least, it felt like it was twice as fast- even while putting out lower average FPS.

            2. Cooling. Holy shit AMD. Nvidia has this down- their Titan cooler is the new benchmark, and before that the GTX690 cooler was, and before that, the GTX680 cooler was, and all of them are better than anything AMD has put out to date. AMD’s coolers on this next generation better be blowers that knock it out of the park. I want cool cards, with good boosting, that can overclock a decent amount while sacrificing only a few dB over stock, that are silent at idle and produce a muted, neutral tone whisper under full load. And I don’t want any of that open-air crap that their vendors have been pushing, I want blowers that get all of the exhaust air out of the case right away.

            Oh, and yeah- one more thing. I want at least three DP ports, and at least two HDMI 2.0 ports, and I want retail availability of an inexpensive selection of powered MST hubs that allow multiple DVI/HDMI monitors to be hooked up, I want all of those panels to be fully synchronized, including panels with differing refresh rates, and I want to be able to push up to 7.1 audio out any or all of those ports, including the ones beyond the MST hubs, with a separate audio device available to Windows for each interface.

            Make it happen, and earn my dollars. Or don’t. Nvidia sure is looking good lately…. 😀

          • kamikaziechameleon
          • 6 years ago

          As an AMD single-GPU consumer, don’t ever bet on their drivers. I love their hardware to death, but their driver support is serviceable at best. It’s the drivers that keep the cards from working well in light workstation environments, it’s the drivers that make multi-monitor configs so quirky, and it’s the drivers that don’t let you push audio over HDMI without a video signal.

      • torquer
      • 6 years ago

      It’s easy to see what your VRAM usage is in-game with EVGA PrecisionX. Even better if you have a G15 or comparable keyboard. At 2560x1440 on a pair of GTX 670s, I’m using around 1700MB of VRAM at any given time in BF3. If I turn on MSAA, I’m going over 2GB (which is the physical RAM on these cards). There’s a noticeable drop-off in framerate, but it’s hard to tell how much of that is due to VRAM use and how much is due to the demands of the MSAA calculations.

      I have a pair of 780s spanning 3 monitors on another desktop and without MSAA I’m at around 2200MB of VRAM used. 4xMSAA bumps me up to around 2800, close to the 3GB physical RAM on these cards.

      Crysis 3 uses relatively similar amounts of VRAM but tends to be more than BF3 (with appropriately better looking textures).
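      The jump seen when enabling MSAA tracks with how multisampled render targets grow: each pixel stores one color and one depth value per sample until the final resolve. A rough illustration (hypothetical buffer sizes; real drivers add compression, G-buffers, and other allocations on top):

```python
def msaa_target_bytes(width, height, samples, color_bpp=4, depth_bpp=4):
    # A multisampled render target stores `samples` color and depth
    # samples per pixel before the final resolve pass.
    return width * height * samples * (color_bpp + depth_bpp)

w, h = 2560, 1440
print(f"no AA:  {msaa_target_bytes(w, h, 1) / 2**20:.1f} MiB")
print(f"4xMSAA: {msaa_target_bytes(w, h, 4) / 2**20:.1f} MiB")
```

      That is just one render target; a deferred renderer like Frostbite keeps several in flight, which is how flipping on 4xMSAA can push a 2GB card over its limit.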

        • Airmantharp
        • 6 years ago

        Thanks for sharing your experience- I also find that I cannot enable any MSAA at all at 1600p in BF3 on my 2GB GTX670s while maintaining playable performance in multiplayer.

        Now, imagine just how much VRAM developers will be using when they have ~4GB for graphics available on consoles as a development target; that’s why I’m pushing for 6GB minimum, 8GB to 12GB preferred depending on the memory controller, for current and future GPU purchases actually meant to last.

      • DeadOfKnight
      • 6 years ago

      Memory may be cheap, but that hasn’t stopped board makers from charging you $60-$90 to double up on GDDR5 VRAM.

        • Airmantharp
        • 6 years ago

        I did allude to this issue above with an Apple iDevice analogy, but essentially, supply and demand takes care of the costs when competition kicks in. It’s really just a matter of time, because if it doesn’t, it results in anti-trust and collusion litigation for all parties involved.

    • Krogoth
    • 6 years ago

    I call BS on the recommended specs. They’re probably built with the assumption that you are going to crank up the AA/AF at 2 megapixels and beyond. 3GB or more of VRAM is only required for high levels of AA/AF on the current crop of modern titles that push the GPU envelope.

    8GiB of system memory is dubious at best. BF4 probably uses a little more than 4GB of system memory at most. It may use around 6GiB if they are taking 64-128 player matches into account. There’s a large number of users that operate 32-bit CPUs and OSes, and EA/DICE isn’t stupid enough to alienate its userbase. That’s assuming BF4 doesn’t suffer from severe memory-leak issues at launch…

    Quad-core CPUs are going to be a practical requirement, although hex- and octo-cores from Intel aren’t going to net any tangible benefits, since BF4 is only going to use up to 4 threads at most. Piledriver- and Bulldozer-based units have a large uphill battle here, unless BF4 is integer-dependent.

    I think it is probably a good idea to sit this one out before jumping the gun. The BF franchise is always plagued by stupid issues (balance, performance, server-side stability) at launch that take almost a year to address.

      • Airmantharp
      • 6 years ago

      I call BS on your BS call :).

      I take ‘recommended’ specs as the specs that will ‘mostly play the single-player game okay’. I fully expect to need at least 50% more resources to play the game decently at 1080p60Hz, and obviously more if you plan on playing at a higher resolution or refresh rate.

      Has your experience with the recommended specs on games been different than that? Even BF3 followed that pattern :).

      As for release-day bugs; I agree with you, but I’ll also point out that DICE has had a very long time to iron most of this stuff out, so there’s a chance that they’ll actually be able to pull off a smooth launch. Not likely, but definitely a chance, since most of the stuff that could/should go wrong already went wrong with BF3 and was mostly fixed.

        • JohnC
        • 6 years ago

        Relax, he’s just trying to “troll” people 😉

        • Krogoth
        • 6 years ago

        BF3 grossly overstated its recommended specs at launch, unless you were trying to play a 32-64 player match when things got very hot (30 players spamming explosives and stuff in the same general area). Not even the most high-end platform at launch could handle that without becoming a slide-show.

        I expect the same thing for BF4. To be honest, BF4 is nothing more than an evolutionary upgrade of BF3 with the Commander returning, just as BF3 was really an evolutionary upgrade of BF:BC2 with jet aircraft thrown in.

          • Airmantharp
          • 6 years ago

          You sure?

          Actually, I know you are, it’s just that you’re wrong. The ‘high-end’ platforms at the time most certainly could handle a full-on 64 man bash, but that’s what it took, and the recommended stats were grossly understated for that purpose.

          And for most of us, that ‘purpose’ was the whole point of BF3.

        • DeadOfKnight
        • 6 years ago

        Not everyone has the same standards of performance. The 120Hz guys want to run at >120 fps all day, every day. The devs probably set these for 30 fps minimum at 1080p. Still, 50% more resources only makes sense for some resources and not others. You can get a 50% improvement with a gpu upgrade alone.

      • chuckula
      • 6 years ago

      So Krogoth… you aren’t impressed with these specs?

      • PixelArmy
      • 6 years ago

      *Facepalm*

      TL;DR I confuse the min and recommended specs.

        • Krogoth
        • 6 years ago

        No, just smelling BS when I see it. That’s unless EA/DICE is being honest about the state of BF4 at launch (buggy, memory leaks, everything), because EA (not DICE) wants to catch the holiday season rather than release a solid title at launch.

          • Airmantharp
          • 6 years ago

          Given that BF4 really is an evolution and extension to BF3, and there’s nothing wrong with that, I fully expect this launch to go far better than the BF3 launch did- and BF3 launched fairly well, given the number of moving parts involved. We played it from the first public beta on.

      • Srsly_Bro
      • 6 years ago

      I call BS on both of you!! *Evil laugh*

      • bcronce
      • 6 years ago

      I remember reading the BF3 blog, and one of the things that caught my eye was them mentioning something like a 300MB frame-buffer working set.

      The current working frame carried a lot of metadata, with several different types of frames and even past-frame information.

      Most of this data was worked on by the compute system and some even by the CPU.

      Once each frame type was finished, the final step would composite the different frames together into one regular 12MB frame.
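      The arithmetic behind numbers like these is easy to sanity-check. Below is a back-of-the-envelope calculation of render-target sizes at 1080p; the target list and formats are illustrative guesses on my part, not Frostbite’s actual layout:

      ```python
      # Back-of-the-envelope render-target math for a deferred renderer at 1080p.
      # The target list below is a hypothetical example, not DICE's actual layout.
      WIDTH, HEIGHT = 1920, 1080

      def target_mb(bytes_per_pixel):
          """Size of one full-screen render target, in MB."""
          return WIDTH * HEIGHT * bytes_per_pixel / (1024 * 1024)

      gbuffer = {
          "albedo (RGBA8)":            4,
          "normals (RGBA16F)":         8,
          "depth/stencil (D24S8)":     4,
          "material params (RGBA8)":   4,
          "HDR light accum (RGBA16F)": 8,
          "history frame (RGBA16F)":   8,  # past-frame data for temporal effects
      }

      total_mb = sum(target_mb(bpp) for bpp in gbuffer.values())
      final_mb = target_mb(4)  # one composited 32-bit output frame

      print(f"intermediate targets: {total_mb:.0f} MB, final frame: {final_mb:.1f} MB")
      ```

      A 32-bit final frame at 1080p works out to roughly 8MB; a ~12MB frame implies a somewhat larger resolution or deeper format. Either way, the intermediate targets dwarf the final frame, which is why a working set in the hundreds of MB is plausible.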

      • maxxcool
      • 6 years ago

      cputemp says bf3 utilizes all six of my k10 cores, each at around 80% utilization.

        • Airmantharp
        • 6 years ago

        That’s cool! What’s your testing situation and CPU speed/configuration?

          • maxxcool
          • 6 years ago

          um amd thuban k10… aka 1090t running 4ghz with 16gigs of 2100ddr3 gskill.

          just pop open core temp and have it log the cpu load on a 1/sec polling rate.
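          Core Temp’s logger is Windows-only, but the same 1/sec per-core sampling can be sketched in a few lines of Python on Linux by diffing the cumulative jiffy counters in /proc/stat (a rough illustration of the technique, not Core Temp’s actual method):

          ```python
          import time

          def parse_cpu_line(line):
              """Parse one 'cpuN ...' line from /proc/stat into (name, idle, total)."""
              fields = line.split()
              values = [int(v) for v in fields[1:]]
              idle = values[3] + (values[4] if len(values) > 4 else 0)  # idle + iowait
              return fields[0], idle, sum(values)

          def utilization(prev, curr):
              """Percent busy between two (name, idle, total) samples of one core."""
              idle_delta = curr[1] - prev[1]
              total_delta = curr[2] - prev[2]
              return 0.0 if total_delta == 0 else 100.0 * (1 - idle_delta / total_delta)

          def sample_cores():
              """One snapshot of per-core counters (Linux only)."""
              with open("/proc/stat") as f:
                  return [parse_cpu_line(l) for l in f
                          if l.startswith("cpu") and l[3:4].isdigit()]

          # Poll at a 1-second rate, like Core Temp's logger:
          #   a = sample_cores(); time.sleep(1); b = sample_cores()
          #   for prev, curr in zip(a, b):
          #       print(curr[0], f"{utilization(prev, curr):.0f}%")
          ```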

            • Airmantharp
            • 6 years ago

            Wow, that’s quite nice.

            But what testing environment in game? BF3 varies considerably depending on a number of factors, between various game settings, the map being played, your location and point of view on the map, and the degree of action going on, etc.

    • GasBandit
    • 6 years ago

    All those requirements I can stomach.

    The one I can’t? Origin.

      • JohnC
      • 6 years ago

      Well, then don’t.

      • Airmantharp
      • 6 years ago

      I have cut-rate aluminum foil to sell you. Send me a PM!

        • DeadOfKnight
        • 6 years ago

        There’s more reasons to hate Origin. Like how I have to use a VPN to make it think I’m in the US, otherwise the whole thing is in Japanese and language settings won’t fix it. It’s garbage software thrown together to compete with Steam. They should have just stuck with web-based like all the others, but oh no they can’t do that. That would undermine their plans for DRM.

          • Klimax
          • 6 years ago

          Funny, I am more likely to get upset with Steam than Origin. At least Origin doesn’t lock up the main window. (Also, its download manager seems more intelligent.)

          And lastly, Origin has better DLC management. (There are more issues with Steam; that’s just the one I encounter most often.)

      • Farting Bob
      • 6 years ago

      Why? It doesn’t interfere with your gaming; treat it like a launcher for your EA games.

        • JohnC
        • 6 years ago

        …but it’s evil! It steals your “order confirmation” emails and uploads them to the IRS!

      • kamikaziechameleon
      • 6 years ago

      In this market, with the selection of games out there, never feel obligated to buy a game that goes against your consumer rights or personal preferences.

      • bittermann
      • 6 years ago

      Agreed…let me attach it to my Steam account then GTFO…

    • TREE
    • 6 years ago

    Recommended GPUs: GTX 660 and HD 7870
    Recommended GPU memory: 3GB

    GTX 660 tops out at 2GB…

    Dice, still not taking PC seriously!

      • Bomber
      • 6 years ago

      There are actually several models of GTX660 that include 3GB of RAM.

        • Airmantharp
        • 6 years ago

        And none that couldn’t ship with 4GB, 6GB, or 8GB in a heartbeat, depending on memory controller configuration.

    • Wildchild
    • 6 years ago

    If 8GB starts to become the norm this quickly, then I won’t feel as bad about buying 16GB of RAM a year ago knowing full well I wouldn’t utilize all of it.

      • cmrcmk
      • 6 years ago

      I got 8GB in the computer I built a few years ago and never until last week have I seen it page to disk. These BF4 specs are just confirmation that it’s time for a new build with more RAM.

        • bcronce
        • 6 years ago

        My 6GB of RAM system pages all the time. Just running BitTorrent and Chrome turns into about 8GB of commit in a few days.

          • travbrad
          • 6 years ago

          I think you need a new bittorrent client. Mine uses 30-100MB maximum. Chrome is a memory hog though I’ll give you that.

            • Airmantharp
            • 6 years ago

            Chrome is usually my worst offender too- but then again, I have at least 20 synced tabs open at all times. And it doesn’t really slow my system down (at all), except for the RAM usage, and I have plenty of RAM to go around :).

            • cygnus1
            • 6 years ago

            no idea how to tell how much ram chrome is using total… but currently open in chrome: 10 windows, 120 tabs

            16GB ram, 10.2GB commit, 7.2GB physical memory in use.

            • Airmantharp
            • 6 years ago

            Yeah, it’s like that :).

            • cygnus1
            • 6 years ago

            Yeah, I don’t really use bookmarks so much. I just pop a tab and save it for later. Then once I’ve read it or I know I won’t want to look at it again I close it. I remember I had TR FNT about good books open for over a year and was referencing it occasionally when looking for something to read.

        • Pettytheft
        • 6 years ago

        What are you running that required paging to disk with 8 gigs of RAM? I have 16 gigs, but even with multiple browsers, tabs, and background programs I don’t go over 8 gigs when I game.

          • cmrcmk
          • 6 years ago

          I think what triggered it was when I was running two VMs + Chrome + Firefox (web development) and then I tried to take a break by firing up Civ V. My development setup usually consumes about 60-70% of RAM after several hours and I was hoping Civ wouldn’t push it too far, but I was wrong.

      • colinstu12
      • 6 years ago

      Happy my X79 machine is rocking 64GB.

      People thought I was crazy.

        • willmore
        • 6 years ago

        Still do. +1 anyway.

          • Airmantharp
          • 6 years ago

          Yup! But seriously, I regularly exceed 8GB of memory usage on both my gaming desktop and mobile workstation. Not for gaming, mind you, but I use the RAM all the same, and I can see the need for 32GB of RAM in my next system (whenever that happens).

            • B.A.Frayd
            • 6 years ago

            What are you doing that requires so much ram, if you don’t mind me asking?

            • Airmantharp
            • 6 years ago

            I’m actually pretty sure that it’s just Windows 8 (and 7 before it) doing its prefetching thing, to be honest. I have a Windows 8 x64 install with a mobile Sandy quad i7 and 6GB of RAM, and it handles the same desktop tasks with ease, including the picture editing.

            But when I start actually pushing the system- rendering JPEGs from Lightroom, movies from Premiere, layered photos in Photoshop, etc.- I’ve easily approached 15GB of memory usage, which is my max due to carving off 1GB for the integrated HD graphics, as I have a pair of monitors hooked up to that as well (three on one GTX670, two on the integrated video).

        • Anomymous Gerbil
        • 6 years ago

        …so you can play BF4 simultaneously in eight VMs?

      • bittermann
      • 6 years ago

      So you all equate needing more than 8GB of RAM with poor programming and memory leaks…got it!

        • Airmantharp
        • 6 years ago

        I don’t equate anything to needing more than 8GB of RAM- except actually needing more than 8GB of RAM. The real issue here, though, is that BF4 has been developed in a 64bit world, and that bears out both in the graphics and gameplay fidelity they’re delivering, and in the requirements to achieve that fidelity.

        As for everything else- well, there are lots of crappy programmers out there. But I’m not one :D.

    • DeadOfKnight
    • 6 years ago

    8GB. That’s a first.

      • Waco
      • 6 years ago

      I wonder if they’ll actually ship a 64-bit binary…

        • ClickClick5
        • 6 years ago

        I’ll put down $64 that they don’t.

        EDIT: I would LOVE it if they ship a native 64bit exe, but, being honest here….will they?
        The last game I remember having a native 64bit exe was Crysis. Warhead had a patch for a 64bit exe….then ghost town.

          • Airmantharp
          • 6 years ago

          I’ll put down $64 hoping that you’re wrong- but you can only collect by coming to Dallas and letting my buy you dinner if you’re right :).

          • bcronce
          • 6 years ago

          LAWL about Win8, but yeah.

          “‘We’ll have Frostbite-powered games in 2013 that will require a 64-bit OS,’ Andersson told his followers (his emphasis.) ‘If you are on 32-bit, [it’s a] great opportunity to upgrade to Windows 8.'”

          [url<]http://www.bit-tech.net/news/gaming/2012/05/23/frostbite-engine-64-bit/1[/url<]

        • Haserath
        • 6 years ago

        They recommend a 64-bit OS and 8GB of memory along with 3GB of VRAM. I’d be surprised if they didn’t!

          • Waco
          • 6 years ago

          I wouldn’t be. It’s EA.

          EDIT: To make myself clear – I bet the only reason they’re recommending a 64 bit OS and 8 GB of RAM / 3 GB of VRAM is that the game will be able to use 2 GB without issue and your memory space won’t be polluted by memory mapping.
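          The address-space arithmetic behind that point is straightforward; a quick sketch (the Windows split sizes are the standard defaults, and the 1.5GB mapping figure is a made-up example):

          ```python
          # Rough 32-bit address-space budget for a game process.
          GB = 2 ** 30

          def user_space_left(pointer_bits, mapped_bytes, large_address_aware=False):
              """User-mode address space remaining after memory-mapping mapped_bytes.

              32-bit Windows defaults to a 2GB user split; large-address-aware
              binaries can get 3GB (4GB under a 64-bit OS). 64-bit processes
              have vastly more than any game will ever map.
              """
              if pointer_bits == 32:
                  user_space = 3 * GB if large_address_aware else 2 * GB
              else:
                  user_space = 2 ** 47  # typical x86-64 user-mode half
              return user_space - mapped_bytes

          # A 32-bit binary that maps 1.5GB of assets leaves only ~0.5GB for
          # heap, stacks, and DLLs -- hence the pressure to stay under 2GB
          # or go 64-bit.
          left = user_space_left(32, int(1.5 * GB))
          ```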

        • homerdog
        • 6 years ago

        A better question is whether they’ll ship a 32bit binary, since it was stated very clearly months ago that the game would be 64bit only. They may have changed their mind since then.

          • Airmantharp
          • 6 years ago

          They list 32bit Vista as an option- so yeah, they’re shipping the 32bit binary. I just wouldn’t want to be the poor bastard who has to play on that system :).

            • homerdog
            • 6 years ago

            So they did change their minds! And I didn’t read the article!

            • Airmantharp
            • 6 years ago

            Or look at the top line of the picture posted above? Really?

      • Aliasundercover
      • 6 years ago

      8GB is $60 of memory.

        • TheEldest
        • 6 years ago

        And a 4 core processor is $200 of processor?

          • Airmantharp
          • 6 years ago

          If you’re actually serious about playing games, you’ve had a quad-core CPU for years. If you’re not- well, welcome to PC gaming!
