Battlefield 1 system requirements ask for a Core i5-6600K

The system requirements for EA's upcoming World War I-inspired first-person shooter Battlefield 1 are out. While the video card requirements are surprisingly modest, the baseline CPU that developer DICE calls for is a Core i5-6600K, a quad-core CPU with a 3.9 GHz turbo clock and 6MB of cache. That requirement looks to be a bit of hyperbole, since DICE also states that an AMD FX-6350—a much slower CPU—will handle the game too. Full system requirements are shown below:

Minimum

  • Windows 7/8/10 (64-bit OS required)
  • Intel Core i5-6600K or AMD FX-6350 CPU
  • 8 GB of RAM
  • NVIDIA GTX 660 2GB or AMD Radeon HD 7850 2GB graphics card
  • 50GB of free HDD space

Recommended

  • 64-bit Windows 10 or later
  • Intel Core i7-4790 or AMD FX-8350 CPU
  • 16 GB of RAM
  • NVIDIA GTX 1060 3GB or AMD Radeon RX 480 4GB graphics card

Even the recommended configuration's graphics card spec is relatively modest compared to what some gerbils are packing in their chubby little computers. The Frostbite engine has always scaled well to various levels of performance, and DICE says that BF1 uses the most optimized version of the engine yet. Battlefield 1 hits Origin on October 21, although purchasers of the deluxe edition (or EA Access members) can get a three-day head start on October 18.

Comments closed
    • dikowexeyu
    • 3 years ago

    Oh no! It requires a processor 5% faster than a decade-old one!

    I will not buy a 5% faaaaster processor just to play this game.

    • Welch
    • 3 years ago

    Betting money my 2500K is still more than enough :P. Ohhh and look, my 7850 meets the “Minimum” specs too. This is DICE we are talking about, so I don’t suspect the game is optimized to a degree worth talking about. Man, I’ve gotten so much out of this rig it’s not even funny.

      • StefanJanoski
      • 3 years ago

      I played the beta with a 2500K, stock clocked at the time, and it was mostly running 80+fps on ultra settings. So, yeah, your 7850 will be a bottleneck long before your 2500K, especially if it’s overclocked.

    • Smeghead
    • 3 years ago

    Of course you need a fast CPU; all those glitches really eat cycles, dont’cha know…

    [url<]https://youtu.be/gw9KcqQSRRI[/url<]

    • Chrispy_
    • 3 years ago

    Those minimums sound bogus.

    Looks like it needs a 4T CPU, 8GB RAM and a boatload of disk space. I know for a fact it runs on integrated graphics (a lowly HD4000), so the GPU [b<]requirement[/b<] is just silly.

    It doesn't even run that badly on an HD4000. Admittedly, 15fps isn't exactly competitive, but I'd imagine something more modern like an HD 530 would provide enough frames for casual players.

    [i<]edit - I forgot to mention that the HD4000 was the integrated graphics on an [b<]i3-3110m[/b<]. The world did not end because I wasn't using an i5-6600K[/i<]

    • Ninjitsu
    • 3 years ago

    Funny thing is, all the CPU benchmarks for Frostbite I’ve seen since BF3 indicate that beyond a dual core, 4 thread CPU, you’re not going to see much of a difference in terms of FPS.

    Don’t know what it means for frame times though.

    But yeah, for Battlefield, general trend has always been that GPU matters more than CPU, after a point.
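
    Ninjitsu's observed plateau is what Amdahl's law predicts: if only part of the frame loop parallelizes, additional threads stop paying off quickly. Here is a back-of-the-envelope sketch; the 0.6 parallel fraction is purely an illustrative assumption, not a measured property of Frostbite.

    ```python
    # Amdahl's-law sketch of why FPS plateaus with thread count.
    # The 0.6 parallel fraction is an assumed, illustrative value,
    # not a measured property of Frostbite.
    def speedup(parallel_fraction: float, threads: int) -> float:
        """Serial portion stays fixed; parallel portion divides across threads."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / threads)

    for threads in (1, 2, 4, 6, 8):
        print(f"{threads} threads -> {speedup(0.6, threads):.2f}x")
    # 1 -> 1.00x, 2 -> 1.43x, 4 -> 1.82x, 6 -> 2.00x, 8 -> 2.11x
    # Going from 4 to 8 threads buys only ~16% here, which is why FPS
    # differences shrink past a 2C/4T CPU once the GPU becomes the limiter.
    ```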

      • Krogoth
      • 3 years ago

      Multiplayer games in BF3 are CPU-bound if you have 64-player matches during heated engagements (lots of fireworks and effects going on).

        • Waco
        • 3 years ago

        Are there any benchmarks showing this? I’ve heard it for years, but I don’t know that I’ve ever seen proof.

          • Krogoth
          • 3 years ago

          That is because it is almost impossible to properly benchmark a multiplayer game, since you cannot fully replicate the results of a “timedemo”.

          The Battlefield franchise does put a lot more strain on the CPU, if you take a glance at CPU utilization during a big MP match versus going SP-only.

            • Waco
            • 3 years ago

            Even anecdotal evidence is welcome. 🙂

            • Ninjitsu
            • 3 years ago

            Same, I’ve heard this for years (admittedly forgot about MP while making my original post), but I’ve never seen proof.

            And sure, it’s hard to replicate in exactly the same way, but I think if there is indeed a noticeable gain to be had by a faster or more threaded CPU in MP vs SP then it’s a fairly straightforward thing to show.

            • ikjadoon
            • 3 years ago

            [url<]http://www.overclock.net/t/1438222/battlefield-4-ram-memory-benchmark[/url<]
            If RAM increases FPS... there's a CPU-centric bottleneck.
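
            The inference is mechanical: hold the GPU and settings constant, vary only the memory configuration, and attribute any FPS change to the CPU/memory side of the system. A toy sketch of that reasoning; the numbers are placeholders, not figures from the linked benchmark.

            ```python
            # Toy illustration of the bottleneck inference: with the GPU held
            # constant, FPS moving with RAM speed points at the CPU/memory side.
            # These FPS values are placeholders, not the linked benchmark's data.
            results = {  # RAM config -> average FPS, same GPU and settings
                "DDR3-1333": 62,
                "DDR3-1866": 68,
                "DDR3-2400": 73,
            }

            baseline = results["DDR3-1333"]
            for config, fps in results.items():
                print(f"{config}: {fps} fps ({fps / baseline - 1:+.1%} vs baseline)")
            # A fully GPU-bound game would show ~0% change here; a measurable
            # gain means memory was holding the CPU (and thus the GPU) back.
            ```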

    • puppetworx
    • 3 years ago

    This is so stupid that only marketing can be responsible. Well at least they got people talking about their game.

    • Fieryphoenix
    • 3 years ago

    Gosh, no wonder the beta ran well on my 6600k.

    • Vaughn
    • 3 years ago

    lol so based on this I better throw my i7-970 @ 4.2 GHz in the dumpster.

    Which I might add is still faster than an FX-8350!

    [url<]http://gamegpu.com/action-/-fps-/-tps/battlefield-1-open-beta-test-gpu[/url<]

      • fullbodydenim
      • 3 years ago

      I believe I got more consistent slowdown on the Tatooine level in SW:BF on my AMD FX-8350, stock clocked at 4GHz, than on my Xeon W3570 at 3.2GHz (which is equivalent to a Core i7 960), both with a Radeon R9 285 2GB.

    • Krogoth
    • 3 years ago

    I smell a ton of BS or really bad optimizations at work here. BF1 is a marginally updated BF4 using WWII-style warfare with WWI-reskins.

    If your system can handle BF4 and the new Star Wars Battlefront just fine, then it will most likely handle BF1 without an issue, barring glaring optimization issues *cough* Arkham Knight *cough* Deus Ex: Mankind Divided *cough*.

    For crying out loud, a bloody Xbox One and PS4 can handle BF1 at 720p60 without a hitch. They are equipped with CPUs that are far weaker than an i5-6600K or i7-4790K despite having “eight” Bulldozer cores.

    Methinks it is because the development team only had time to test the game on the latest, hottest hardware platforms out there. They just want to cover their asses.

      • K-L-Waster
      • 3 years ago

      Was thinking of pointing out DICE’s recent.. ahhh… *close* relationship with AMD as a possible contributor…. but the GPU recommendations don’t really fit with that theory.

      I say we chalk it up as “just plain nonsense.”

      • Firestarter
      • 3 years ago

      BF1 runs as well as or even better than BF3 did for me. I don’t know what they were smoking when they made these requirements up but it must have been some grade A ganja

      • Voldenuit
      • 3 years ago

      [quote<]For crying out loud, a bloody Xbox One and PS4 can handle BF1 at 720p60 without a hitch. They are equipped with CPUs that are far weaker than a i5-6600K and i7-4790K despite having "eight" Bulldozer cores.[/quote<] Jaguar cores, which are more Atom-class, but I see the point you were making.

      • the
      • 3 years ago

      The consoles don’t even have Bulldozer cores, they have the weaker Jaguar mobile cores.

    • Rza79
    • 3 years ago

    I don’t think it’s a matter of the actual CPU performance, but more that the engine is heavily optimized for six or seven threads, or, better said, for consoles. That’s probably why 8-thread CPUs are recommended.

      • K-L-Waster
      • 3 years ago

      Then why wouldn’t they specify a Sandy Bridge i7?

        • Bensam123
        • 3 years ago

        Hyperthreading isn’t the same thing as modules found on AMD processors.

          • Rza79
          • 3 years ago

          True, but a hyperthreaded CPU still handles 8 threads better than one without, especially if those threads are not too heavy.

        • Rza79
        • 3 years ago

        Because that’s obsolete from a marketing point of view. Surely AMD and Intel have some say in these recommended specs.
        For the same reason you could ask why they don’t put a Phenom X6 as a minimum for AMD.

          • Krogoth
          • 3 years ago

          It is because marketing is still stuck in the “Moore’s Law” meme, which has been dead for almost a decade now.

          The semiconductor industry is just starting to creep up onto a plateau.

    • egon
    • 3 years ago

    Tried to make sense of it and thought perhaps this might be some sly Intel marketing strategy, but most probably, it doesn’t make sense for the same reason an 8-foot tall Wookie living on Endor with a bunch of 2-foot tall Ewoks doesn’t make sense.

      • chuckula
      • 3 years ago

      [url=https://www.youtube.com/watch?v=clKi92j6eLE<]IF CHEWBACCA LIVES ON ENDOR YOU MUST ACQUIT![/url<]

        • Voldenuit
        • 3 years ago

        If Chewbacca lives on endor you must Ctrl-Quit.

      • JustAnEngineer
      • 3 years ago

      Wookies come from Kashyyyk, not Endor. Turn in your geek card.

        • ColeLT1
        • 3 years ago

        It’s a South Park quote. The wookie is out of place on Endor.

    • Billstevens
    • 3 years ago

    For reference, the FX-6350 passmarks below an i5-3570K, an Ivy Bridge chip… You can probably play fine on a freakin’ 2500K Sandy Bridge or other 4-5 year old chips…

      • Growler
      • 3 years ago

      The GTX 660 is also 4 years old, so that makes sense.

      • _ppi
      • 3 years ago

      To me this looks like Intel is paying game devs to artificially inflate game requirements.

    • Billstevens
    • 3 years ago

    CPU requirements have always traditionally seemed pretty stupid, or at least targeted only at what is available today. So it’s marketing crap… Which is abundantly clear when you have a brand-new mid- to high-end Intel quad core sitting next to an AMD chip that gets railroaded by probably the last three generations of Intel chips.

    • Concupiscence
    • 3 years ago

    You look at the system requirements, and then you check YouTube, because various people with lots of time on their hands record video of themselves playing new games on all kinds of computers. And you find [url=https://www.youtube.com/watch?v=uG0F2LDdSJ0<]this[/url<] - Battlefield 1 running on an A10 7850K with single-channel memory and no discrete video.

    The IGP's letting it down, but there's nothing really CPU-blazing going on here. What is EA's QA department smoking? Any i3 or Athlon X4 should be able to run this without problems.

      • jokinin
      • 3 years ago

      Agreed, and so any i5 will be enough for that game. And the i5 6600K will be faster than that poor AMD in gaming.

        • Concupiscence
        • 3 years ago

        Given that the 7850K’s raw performance numbers look a lot like a seven year old Core i5 750’s, I think any i5 could manage too. That probably even applies to Clarkdale parts, which are first-gen i3 chips in all but name.

      • TwistedKestrel
      • 3 years ago

      Re: that A10 7850 – wow. That actually went shockingly well.

        • Chrispy_
        • 3 years ago

        If you read the comments you realise he’s running that with single-channel RAM too. Another commenter says he gets around 35fps in dual-channel mode with the same A10 7850.
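
        The single- versus dual-channel gap is straightforward arithmetic: each 64-bit DDR3 channel contributes eight bytes per transfer of peak bandwidth, and an IGP shares that pool with the CPU. A quick sketch; DDR3-2133 is an assumed speed for a typical Kaveri A10-7850K pairing, not a figure from the video.

        ```python
        # Peak theoretical DRAM bandwidth: transfers/s * 8 bytes per 64-bit channel.
        # DDR3-2133 is an assumed speed for a Kaveri A10-7850K build.
        def peak_bandwidth_gbs(mt_per_s: int, channels: int) -> float:
            return mt_per_s * 8 * channels / 1000  # MT/s -> GB/s

        print(f"single-channel: {peak_bandwidth_gbs(2133, 1):.1f} GB/s")
        print(f"dual-channel:   {peak_bandwidth_gbs(2133, 2):.1f} GB/s")
        # single-channel: 17.1 GB/s, dual-channel: 34.1 GB/s. Halving the pool
        # an IGP lives on hits framerates hard, consistent with the ~35fps
        # dual-channel figure quoted above.
        ```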

      • travbrad
      • 3 years ago

      Yep. I played the beta on my overclocked 2500K and I think it actually ran better than Battlefield 4. The minimum for BF4 was “AMD Athlon X2 2.8 GHZ Intel Core 2 Duo 2.4 GHZ”

      • Andrew Lauritzen
      • 3 years ago

      FWIW Skull Canyon ran the beta @ those settings at ~60fps… just to give people some perspective on the whole “AMD SoCs are better at graphics” meme 😉 (Or 1080p medium @30.)

      I’m not sure why the CPU requirements are so bizarre. I could definitely see wanting a real quad core for multiplayer, but beyond that, unless you’re trying to run 120Hz+ or something, I’m not sure it’s going to make a huge difference.

        • chuckula
        • 3 years ago

        That IGP running in a 45 watt power envelope compared to the 95 watt 7850K [b<]DOESN'T COUNT[/b<] because [b<]REASONS[/b<].

          • Andrew Lauritzen
          • 3 years ago

          People will point out that it costs a lot more – which is totally fair – but I think most can agree that it’s still an impressive piece of technology and architecture.

          I was initially going to point to the 15W i7 NUC w/ Iris 540 getting 40-50fps, but that would have been rubbing it in a little much 😉 [url<]https://www.youtube.com/watch?v=wFtnTHALxjw[/url<]

            • Magic Hate Ball
            • 3 years ago

            Let’s reconvene after Raven Ridge lands 🙂

            • chuckula
            • 3 years ago

            If Raven Ridge doesn’t have on-package RAM then it won’t beat that Skylake at GPU.

        • Concupiscence
        • 3 years ago

        edit: Apparently the 7850K in the linked video’s running in a single channel memory config, so that should be taken into account… It’s hard to perform an honest comparison of two cars when one’s been smacked by an overaggressive governor. The rest of my post follows, unedited.

        I’ve no doubt. Skull Canyon’s also a much newer part. But Intel should be proud: they’ve gone from being the “well, if we [i<]have[/i<] to, at least it’s not SiS” option a decade ago to being very good within the thermal and bandwidth limits an IGP imposes. That’s even more true in Linux; you guys are killing it over there.

        I suspect the CPU requirements are inflated because nobody bothered to test on anything lower. At least it’s not as egregious as the Steam releases of mid- to late-’90s Win32 games, which frequently suggest a Core 2 Duo to run software released in the days before [b<]MMX[/b<].

          • Andrew Lauritzen
          • 3 years ago

          Yeah, unfortunate about the single-channel memory, but even if you double its performance, it’s still outperformed somewhat by a 15W NUC/tablet and by a lot by a 45W NUC.

          Yes old vs new and yes I’m excited as much as anyone to see what AMD brings to the table with their next generation, but I do think it’s worth pointing out how far ahead Intel is at the moment on graphics (not to mention features and API support) as it’s not something most folks would be able to accurately quantify if asked their “feelings” 🙂

          Anyways the real kudos is to Frostbite/DICE here: the game looks great even when running on lower settings and it’s awesome to be able to play something like it reasonably on a tablet (SP4)!

      • Rza79
      • 3 years ago

      And he’s running it in a single channel memory configuration. LOL

        • Concupiscence
        • 3 years ago

        Christ, I wish they’d mentioned that… 😡

      • southrncomfortjm
      • 3 years ago

      Wouldn’t the CPU requirements increase with higher graphical fidelity? So, if you go from something an IGP can barely handle, to ultra settings, wouldn’t you get a pretty huge increase in CPU load?

    • selfnoise
    • 3 years ago

    I played in the beta using an i3-4150 and the game ran great. I wouldn’t worry about this, it’s common in game requirements to list something ludicrous on the intel side for whatever reason.

      • Noinoi
      • 3 years ago

      Makes you wonder if it’s intentional, due to lack of hardware, or if someone actually meant to write in a first-gen i5 part number and someone “corrected” the model number. If I’m not wrong, even a Sandy Bridge i3 will steamroll the game?

    • RAGEPRO
    • 3 years ago

    The part that really cracks me up about this is that the [url=http://ark.intel.com/products/88191/Intel-Core-i5-6600K-Processor-6M-Cache-up-to-3_90-GHz?q=Core%20i5%206600K<]Core i5-6600K[/url<] is [i<]probably[/i<] faster in most games than the [url=http://ark.intel.com/products/80806/Intel-Core-i7-4790-Processor-8M-Cache-up-to-4_00-GHz<]Core i7-4790[/url<]. Also that the official requirements specifically list the "FX-8350 Wraith". I guess all those guys running Hyper212s and Noctuas on their FX rigs are out of luck.

      • DancinJack
      • 3 years ago

      TOO SLOW RAGEPRO

      edit: just saw your comment on mine 🙂

      • techguy
      • 3 years ago

      You would be wrong.

      [url<]http://www.anandtech.com/bench/product/1260?vs=1544[/url<]
      Haswell->Skylake is at most a 10% difference in IPC, but the clockspeeds of the SKUs in question differ by more than that (6600K: 3.5GHz base, 3.9GHz turbo; 4790K: 4.0GHz base, 4.4GHz turbo). Throw in HT for the 4790 and all the threaded workloads end up being faster on it.
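
      The argument reduces to IPC ratio times clock ratio. A back-of-the-envelope sketch using the thread's own figures (the ~10% IPC delta is techguy's number; note, per the reply below, that the non-K 4790 turbos at 4.0GHz, not 4.4GHz):

      ```python
      # Single-thread estimate: relative performance ~ IPC * clock.
      # The 10% Haswell->Skylake IPC gain is the figure quoted above.
      haswell_ipc, skylake_ipc = 1.00, 1.10  # normalized to Haswell

      i5_6600k = skylake_ipc * 3.9  # Skylake at 3.9GHz turbo -> 4.29
      i7_4790k = haswell_ipc * 4.4  # the K part quoted above  -> 4.40
      i7_4790  = haswell_ipc * 4.0  # the part DICE lists      -> 4.00

      print(f"6600K ~{i5_6600k:.2f}  4790K ~{i7_4790k:.2f}  4790 ~{i7_4790:.2f}")
      # Against the 4790K, the clock deficit roughly cancels the IPC gain;
      # against the plain 4790, the 6600K comes out ahead single-threaded.
      # Hyper-threading still favors the i7 in threaded workloads.
      ```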

        • DancinJack
        • 3 years ago

        They said 4790, not 4790k, but good try!

          • techguy
          • 3 years ago

          No, I saw that. So on a list where we’re already questioning the legitimacy of the list and the intent of its creators, you don’t think a typo is possible?

          Even if it wasn’t a typo, the price difference between the two SKUs is negligible. You would be stupid as a consumer to have purchased the non-K part.

            • DrDominodog51
            • 3 years ago

            Prebuilts.

            • techguy
            • 3 years ago

            I’m well aware of OEM sales typically outnumbering direct-to-consumer sales; however, this series of SKUs would be the singular exception to that rule, being the unlocked enthusiast series and all.

            • w76
            • 3 years ago

            Isn’t there a virtualization feature neutered on ‘K’ series parts? Okay, you said consumer, maybe that’s… prosumer.

            • techguy
            • 3 years ago

            They fixed that in the Devil’s Canyon series. Know how I know that? I bought one when it came out and I needed to know if they finally fixed Directed-IO (VT-d) for the K parts, as it was disabled on all previous K SKUs including the 4770K which my 4790K replaced.

      • 1sh
      • 3 years ago

      The Frostbite engine is pretty well optimized and it can use up to 8 threads, so that explains some of it.

      • Aquilino
      • 3 years ago

      The i5-6600K fares almost as well as an i7-6700K in many videogames (with top-class videocards, that is). It even mildly outperforms the i7-5820K, a 6-core processor.

      These “requirements” go beyond bullsh*t. I hope they get “updated” (i.e., the lies removed) soon.

        • Krogoth
        • 3 years ago

        I’m willing to wager the requirement for BF1 is actually a Core 2 Quad Q6600 or Phenom X4 8100e as far as CPUs are concerned. The memory requirement is going to be 4GiB minimum.

        It will run the game, but you will have to crank down the eye candy and effects to get a “playable” framerate and avoid virtual memory swapping.

        8GiB of memory will work fine for high settings, but you may have to watch your background programs to avoid virtual memory swapping, which means “no streaming”. 16GiB provides plenty of comfort space for background processes.

        It kinda sounds like the requirements are really for the “streaming live at 1080p60” crowd.
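
        Those RAM tiers amount to a simple headroom budget: the OS, background programs, and the game's working set have to fit in physical RAM or the system swaps. A sketch with purely illustrative working-set sizes, not BF1 measurements:

        ```python
        # RAM-headroom budget behind the 4/8/16 GiB tiers. All working-set
        # sizes (GiB) are illustrative assumptions, not BF1 measurements.
        def swaps(total, os_use, background, game):
            return os_use + background + game > total

        scenarios = {
            "4 GiB, lean system":          (4,  1.5, 0.5, 3.0),
            "8 GiB, game only":            (8,  2.0, 1.0, 4.5),
            "8 GiB, plus streaming stack": (8,  2.0, 3.5, 4.5),
            "16 GiB, plus streaming":      (16, 2.0, 3.5, 4.5),
        }
        for name, cfg in scenarios.items():
            print(f"{name}: {'swaps' if swaps(*cfg) else 'fits'}")
        # 4 GiB swaps; 8 GiB fits until a streaming stack piles on; 16 GiB
        # fits either way, matching the "streaming at 1080p60" reading.
        ```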

          • BurntMyBacon
          • 3 years ago

          I’d rather see it specified as a Core 2 Quad Q9550 or Phenom II X4 955. That’s about the closest I can recall Intel and AMD coming to performance-parity processors in the post-Athlon era.

    • DancinJack
    • 3 years ago

    The low end and high end are both a little strange. The 4790 vs. 6600K is a very strange rec, considering that if you take them literally (4790, non-K) the 6600K is likely going to be the higher-performing chip.

      • RAGEPRO
      • 3 years ago

      Beat me to it by seconds, man. 🙂

    • Firestarter
    • 3 years ago

    huh and here I thought my i5-2500K was doing just great in the beta

    welp, better toss it in the bin then

      • Voldenuit
      • 3 years ago

      Welp, looks like DiCE doesn’t even know what a CPU is.

      Better toss them in the bin… oh wait, I already have.

        • Magic Hate Ball
        • 3 years ago

        The dichotomy in performance between the required AMD CPU and the required Intel CPU… pretty hilarious.

      • StefanJanoski
      • 3 years ago

      Same here, mostly 80-100+ fps paired with a GTX 970 with everything on ultra settings. On a 75Hz monitor, so all’s good.

    • Srsly_Bro
    • 3 years ago

    I’d rather have a Skylake i5 over the AMD 8350….

    • chuckula
    • 3 years ago

    [quote<]That requirement looks to be a bit of hyperbole, since DICE also states that an AMD FX-6350—a much slower CPU—will handle the game too.[/quote<] Yeah Zak, I'm looking out for you in the joint, man. Be careful saying stuff like that, the AMD fanboys make really mean toothbrush shanks.

      • RAGEPRO
      • 3 years ago

      Hey, you seen my username? It hurt me to write, bro. Truth’s the truth though.
