Evolve PC will put beefy gaming rigs to use

Remember Evolve? Developed by Turtle Rock Studios, the folks behind the original Left 4 Dead, this sci-fi multiplayer shooter features asymmetrical gameplay reminiscent of L4D's tank battles. Evolve is scheduled for a cross-platform release on February 10, and Turtle Rock has now posted the hardware requirements for the PC version.

MINIMUM REQUIREMENTS:

OS: Windows 7 64-bit

INTEL CPU: Core 2 Duo E6600

AMD CPU: Athlon 64 X2 6400

SYSTEM RAM: 4GB

NVIDIA VIDEO CARD: GeForce GTX 560

ATI VIDEO CARD: Radeon HD 5770

VIDEO MEMORY: 1GB

HARD DRIVE: 50GB

RECOMMENDED REQUIREMENTS:

OS: Windows 7 64-bit

INTEL CPU: Core i7-920

AMD CPU: A8-3870K

SYSTEM RAM: 6GB

NVIDIA VIDEO CARD: GeForce GTX 670 or GTX 760

ATI VIDEO CARD: Radeon R9 280

VIDEO MEMORY: 2GB

HARD DRIVE: 50GB

The minimum specs are relatively tame, but the recommended ones call for plenty of RAM—and some pretty fast graphics cards by today's standards. (Both the GeForce GTX 760 and Radeon R9 280 sell for around $200 right now.) The CPU recommendations make less sense, since the Core i7-920 came out three years before the A8-3870K and has four extra threads. I'm guessing you'll probably be fine with any new-ish quad-core CPU, though.

Oh, and yeah, be sure to free up 50GB of storage space. Here's hoping Evolve won't fill that up with a bunch of uncompressed audio.

According to Turtle Rock, the PC version of Evolve will support 4K resolutions and will include a fair number of graphical options. Players will be able to adjust detail levels for textures, shaders, models, shadows, and particle effects. Anti-aliasing and tessellation settings will also be tweakable. The game won't support SLI or CrossFire multi-GPU setups at launch, but the studio plans to remedy that omission "as soon as possible."

Comments closed
    • rahulahl
    • 8 years ago

    Mine is an Asus ROG Swift, so 144 FPS.
    That's why I was agreeing with you when I said “Yea. Probably.”
    Get a monitor with a higher refresh rate and you can make better use of the new GPUs.

    • Crackhead Johny
    • 8 years ago

    Yep.
    This thing OCs like a Celly 300 or 366.

    • Crackhead Johny
    • 8 years ago

    How many frames is your monitor capable of? Mine is locked to 60.

    • jessterman21
    • 8 years ago

    SMAA TX2 is awesome.

    • Deanjo
    • 8 years ago

    Not all that pricey; 8GB of DDR2 dual-channel kits could be had for about $100 around the time I put them in my 4200+ system.

    • DPete27
    • 8 years ago

    Nice catch, I didn’t notice that.

    • Ninjitsu
    • 8 years ago

    “I hope it’s not uncompressed audio” should be a meme or something.

    • puppetworx
    • 8 years ago

    I see what you’re getting at. Remember, AMD has a new series launching soon, when it does the situation won’t look nearly as bad for AMD.

    Nvidia is still pumping out releases regularly. Excluding the 900 series, seven cards have come out faster than the 680 since it launched in March 2012: the 690, 770, 780, 780 Ti, Titan, Titan Black, and Titan Z. AMD said they wouldn't build any more 'monster' GPUs way back before Hawaii, though, so it makes sense that they have only offered one competitor to the Titans.

    • Krogoth
    • 8 years ago

    8GiB of memory capacity became commonplace once 2GiB and 4GiB DDR3 DIMMs became affordable, which I believe was right around the time Bloomfield and Phenom IIs were the new hotness.

    It was doable with 2GiB DDR2 DIMMs on older platforms, but 4GiB DDR2 DIMMs were and still are pricey.

    • rahulahl
    • 8 years ago

    When I need drivers, I still type ATI.COM.
    Even though it doesn't really make sense, it seems easier than typing AMD.COM.

    • rahulahl
    • 8 years ago

    Yea. Probably.
    Simply change your aim from a solid 60 FPS to a solid 144 FPS, and you will once again be wanting higher-end GPUs.
    Even my GTX 980 can't get over 100 in many new games at max settings.

    • brucethemoose
    • 8 years ago

    All those are Hawaii GPUs… I guess I meant GPU design.

    The point is that the 4890, 5770, 5870, 6870, and 6970 are all unique GPUs that lie between the 4870 and 7970.

    And after 3 years, there’s nothing between the 7970/280X and 290X. The 285 hardly counts, as it seems to be slightly slower than the 280X.

    • puppetworx
    • 8 years ago

    Fourth actually. The 290, 290X and 295X2 are all significantly faster too.

    • ronch
    • 8 years ago

    I see some people still see AMD’s graphics unit as ATI. ATI is and always will be ATI, I guess.

    • Vergil
    • 8 years ago

    Getting an A8-7650K APU once it's out later this month. Would be sweet if they added Mantle support to this game. I'd really love to run it at 1080p, high/ultra settings, no AA.
    Looking forward to this game, as L4D is one of my favorite games of all time. Not 2 though, only 1.

    • moose17145
    • 8 years ago

    Looks like I still have no reason to upgrade my CPU…

    Current system is an i7-920 OCed to 3.2GHz,
    24GB of RAM,
    and an R9 290.

    Built the system back in Dec of '08… it still plays any game out there on maxed-out settings, and at this rate it looks like it is going to see another 1 or 2 years of service.

    Sure, I upgraded the video card from its original 4870… but doing a simple video card upgrade is a lot cheaper than a whole system rebuild.

    • brucethemoose
    • 8 years ago

    The 7970 was released 3 years ago… And it’s AMD’s 2nd fastest GPU today.

    • tipoo
    • 8 years ago

    And that's actually still behind the 920:

    http://www.notebookcheck.net/Mobile-Processors-Benchmarklist.2436.0.html

    • tipoo
    • 8 years ago

    Tbh nothing there looks terribly exotic, in fact I thought the specs were light in comparison to some newer games built around the 8th gen consoles. I think even my 4770HQ/Iris Pro 5200 should be comfortable, given where it sits between the minimum and recommended. And that’s an integrated GPU, albeit a good one.

    My only gripe of course is how much storage games are taking now. And yup, you hit the nail on the head: a lot of it is audio. Since both big consoles support 50GB discs now, developers don't bother paring much down even when porting to PC. Uncompressed audio is cool for those who appreciate it; me, I'd rather take a game under a fifth that size without the uncompressed audio. It's not like texture sizes suddenly shot up that much.

    The funny thing about that four-year-older i7-920, from the oldest i7 series, is that it's still ahead of the AMD recommendation in performance:

    http://www.notebookcheck.net/Mobile-Processors-Benchmarklist.2436.0.html

    • Deanjo
    • 8 years ago

    Ya, I didn't think the “8GB of RAM” was all that exotic. Heck, I used to have 8GB of RAM on my AM2 boards.

    • Welch
    • 8 years ago

    And how little Intel has really done to improve each generation of the Core i series.

    With Sandy Bridge being the last major change, there hasn't really been that large a change up to today's generation of Core i CPUs. I mean, the i7-920 in all respects is not a bad CPU for most games that are more GPU-intensive.

    The A8-3870K was alright, but I'd be curious to see something like the Phenom II 965 in the mix, as it roughly matches the i7-920 in generation and is actually faster than the A8-3870K. It's also a pure CPU like the 920: no on-chip GPU to generate heat and waste space, since we aren't using that at all playing games like this.

    8GB of RAM is commonplace these days for gaming, so I don't think it matters that they recommend 6. Hell, I have 16 and I know people with 32. All you have to do is take a look at Tech Report's RAM surveys/polls.

    • Krogoth
    • 8 years ago

    An overclocked i7-920 rig can still game and handle mainstream stuff as well as the majority of stock Sandy Bridge-to-Haswell rigs out there. The only differences are that an overclocked i7-920 system is a bit toasty and power-hungry when loaded, and it lacks PCIe 3.0 support.

    There’s a reason why on the professional-tier Intel has been increasing the number of cores and cache on their chips with clockspeed taking a backseat.

    • Krogoth
    • 8 years ago

    Intel first started placing the PCIe controller on-die with Lynnfield, which was the consumer-grade version of Bloomfield. Intel had to remove a memory channel, axe a few QPI links (only needed for multi-socket configurations), and shrink the memory controller on Lynnfield to make room on the die for the PCIe controller.

    • travbrad
    • 8 years ago

    What kind of games do you like?

    P.S. This is a bit different from most “multiplayer shooters” in that you don’t shoot any people and one side doesn’t actually shoot anything. It’s more in the vein of Natural Selection or Left 4 Dead.

    • JosiahBradley
    • 8 years ago

    I’m glad to see SMAA taking off with better options without the need for hacks like sweetFX.

    • Krogoth
    • 8 years ago

    It also shows how little has changed on the Intel side for the mainstream market. Outside of clock speed, power consumption, and PCIe 3.0 support, the old-fangled Bloomfields are almost as fast as current Haswell and somewhat older Ivy Bridge chips for mainstream stuff.

    • chuckula
    • 8 years ago

    So…. [url=https://www.youtube.com/watch?v=Ug75diEyiA0]where's the beef?[/url]

    • Meadows
    • 8 years ago

    Actually, GPU performance hasn't stagnated nearly as badly as the rest of the PC ecosystem, so a high-end card today will obliterate a high-end card from 3 years ago.

    That’s where the incentives end, however.

    • bfar
    • 8 years ago

    I have one overclocked to 4GHz, and I haven't come across any titles it can't handle with ease. I gather Broadwell won't offer much in the way of temptation, so I guess Skylake may be the time to upgrade.

    • Meadows
    • 8 years ago

    TR highly recommended the title, so I thought I’d give it a shot and got a pirate copy to see for myself. What I beheld appalled me. An ugly game with an uninteresting story and boring gameplay.

    I’m not lying when I say I went and uninstalled it somewhere around the second level or so. It was *that* flat. I literally got the game for “free” and it still wasn’t good enough to slog through.

    In fact, I had a bad feeling about it from the get-go. The story has an abrupt start that made no sense whatsoever and you had a sudden deluge of random backstory and unexplained events crashing against you. You’re asked to suspend disbelief and assume the role of some character in some situation and you don’t even know yet who the hell you’re supposed to be or where you are at all.

    Well, I wasn’t able to get immersed. Not one bit. If that means there’s something wrong with me, then I feel I’m better off this way. At least I have taste.

    • cmrcmk
    • 8 years ago

    Are those the first processors from AMD and Intel to bring PCIe onto the CPU itself instead of through the chipset? [url=http://en.wikipedia.org/wiki/Nehalem_(microarchitecture)]Wikipedia[/url] seems to confirm that for Nehalem, but I'm having trouble finding confirmation for Llano.

    • bthylafh
    • 8 years ago

    Pfft. We all know there’s something wrong with /you/. 😛

    • Concupiscence
    • 8 years ago

    Unless there’s a 960 Ti, I honestly think the vanilla 960’s planned 128-bit bus is going to hobble it pretty badly. The 970’s the safe bet here.

    And I’m right there with you. I could afford a 970 right now, but I’d feel like a tool for grabbing one to replace my 660 Ti, given that the game I’m playing the most is The Binding of Isaac Rebirth…

    • sweatshopking
    • 8 years ago

    I didn’t love it either.

    • ozzuneoj
    • 8 years ago

    You have a 2500K don’t you? 😉

    I do, and I feel the same way. And it's a good thing, I guess… I don't really play the latest, most system-intensive games, so it's nice to not have that itch to upgrade for no reason. I have no itch to upgrade at all… except maybe my GTX 660 to a 960 at some point…

    • brucethemoose
    • 8 years ago

    “Gaming” CPUs pretty much stagnated after Sandy Bridge and the last Phenom II. And it’ll stay that way through 2015.

    Skylake/Zen could be significantly wider than today’s CPU cores… But I’m not getting my hopes up.

    • DPete27
    • 8 years ago

    Look at it this way: Devs are finally getting most of the game stuff that GPUs WERE MADE TO DO to run on the GPU. That leaves less and less for the CPU to grind away at.

    • bthylafh
    • 8 years ago

    I knew there was something wrong with you.

    • Meadows
    • 8 years ago

    That game is rubbish. I know because I’ve tried.

    • DPete27
    • 8 years ago

    The good thing with Evolve is that each opposing side has vastly different tactics and abilities. Might help level the playing field.

    • Crackhead Johny
    • 8 years ago

    How beefy?
    My 3+ year old PC has 8? or 16GB of RAM (because it was cheap and I wanted it, that’s why!)
    The CPU is still clocked faster than anything you can buy off the shelf.
    So buy a new graphics card and my rig will be “beefy”?

    I like that PCs have stagnated for years and we no longer have to look into the grim face of obsolescence every 6 months. Am I wrong or is it now about a “beefy” graphics card and a “meh” PC? As long as I can upgrade graphics cards, when will I actually need a new CPU to keep up? Or a build with more RAM?

    Come to think of it I’d probably need to replace my 24″ Ultrasharp to justify a new graphics card.

    • bthylafh
    • 8 years ago

    Dishonored.

    • ColeLT1
    • 8 years ago

    Keep in mind the i7-920 is the slowest i7 ever made: 2.66GHz (2.83 single-thread). When it came out, it was Intel's slowest chip of any i series. Released Q4 '08, it predates the first i5 (the 750) by almost a year, and any newer i5/i7 is faster than the i5-750 and i7-920, except for some later low-power chips.

    • Meadows
    • 8 years ago

    Bleh, not another multiplayer shooter. Give me something I can play instead.

    • Concupiscence
    • 8 years ago

    Yeah, true. It could just be an artifact of how long it’s been in development at this point.

    • derFunkenstein
    • 8 years ago

    If it's just that they wanted cores, why didn't they set the bar at an Athlon II X4 or Phenom II X4?

    Then again, maybe we're all overthinking it.

    • Concupiscence
    • 8 years ago

    A Nehalem i7 and Llano quad core make for a weird combination. But the conclusion I draw is not that AMD’s hopelessly outgunned (at least in this instance), but that the game’s heavily multithreaded and probably likes an abundance of available FPUs. Performance benchmarks should be interesting to see… I wonder how FX-8000 chips will do.

    edit: On that note, I wonder if i3 and dual module APUs will perform on par with each other. It’ll also be interesting to see whether the Pentium G3258 and A4 and A6 APUs can cope trying to manage this.

    • Maff
    • 8 years ago

    “Just goes to show how far behind AMD is in their processors.”

    Actually, it doesn't show that at all. If anything, they are in a way comparing Intel's almost-top-of-the-line i7 with an APU that wasn't even meant to produce top-notch CPU performance.

    To be honest, the only thing this really shows is that this game probably won't easily be bottlenecked by any modern CPU, even with a high-end graphics card. That, plus that the CPUs mentioned in these minimum/recommended specs are kind of arbitrary and only serve as a rough example.

    Edit: also not trying to say AMD isn't behind Intel, but I was actually kind of surprised that they would compare the A8 with the i7-920 in this way. I've always regarded the 920 as superior.

    • derFunkenstein
    • 8 years ago

    I was thinking maybe there’s some sort of instruction extension set they want to use on higher details, but I can’t really find anything that Phenom II couldn’t do that Nehalem can.

    • Deanjo
    • 8 years ago

    [quote]The CPU recommendations make less sense, since the Core i7-920 came out three years before the A8-3870K and has four extra threads.[/quote] Just goes to show how far behind AMD is in their processors.

    • willmore
    • 8 years ago

    Hmm, I have way more CPU and memory than needed, but my HD7850 (w/2GB) might have a harder time. Well, it’s a good deal above the minimum at least.
