Here’s what kind of PC you’ll need for Crysis 3

The Crysis series has always been notorious for its steep hardware requirements. Crytek posted minimum, recommended, and "hi-performance" PC specs for Crysis 3 on the game’s official website today, and by the look of it, this latest chapter will be no different from its predecessors.

The minimum requirements aren’t anything to write home about. Crytek says you’ll be able to run the game with anything equivalent to (or better than) a dual-core processor and a DirectX 11 graphics card with a gig of RAM. The recommended requirements are a little bit more demanding. See below:

• Windows Vista, Windows 7 or Windows 8

• DirectX 11 graphics card with 1GB Video RAM

• Quad core GPU

• 4GB Memory

• Example 1 (Nvidia/Intel):

          • Nvidia GTX 560

          • Intel Core i5-750

• Example 2 (AMD):

          • AMD Radeon HD5870

          • AMD Phenom II X4 805

I assume that should be "Quad core CPU" up there. In any case, recommended requirements describe pretty much the kind of PC I’m running today: a Core i5-750 with 4GB of RAM and a Radeon HD 6870. (Yes, I realize I’m a few generations behind.)

So, what about those "hi-performance" specs?

• Windows Vista, Windows 7 or Windows 8

• Latest DirectX 11 graphics card

• Latest quad core CPU

• SLI / Crossfire configurations will run even better

• 8GB Memory

• Example 1 (Nvidia/Intel):

          • NVidia GTX 680

          • Intel Core i7-2600k

• Example 2 (AMD):

          • AMD Radeon HD7970

          • AMD Bulldozer FX4150

Yikes. We’re talking about something like the Editor’s Choice build in our System Guide, and even that has a slower graphics card than what Crytek is suggesting.

Looking at that single-player video that was posted last month, it’s pretty clear Crytek didn’t skimp on shader effects, luxuriant foliage, and big, debris-laden explosions. I hope lower-end configs can enjoy all that eye candy, even if they have to live without uber-high texture detail or insane resolutions. I also hope the use of DirectX 11 tessellation is a little more sensible this time around. (Thanks to Rock, Paper, Shotgun for the link.)

Comments closed
    • rrr
    • 7 years ago

    Sky-high requirements; they'd better make a game worth it. I don't have high hopes for it, though.

    • clone
    • 7 years ago

    you really have to wonder about the CPU recommendations when Crytek believes that an AMD Bulldozer FX4150 will fare just as well as an Intel i7 2600k…. seriously?

    apparently Crytek believes consumers will require either an Editor's Choice build if they're "foolish" (total sarcasm) enough to use Intel, or they can go with a low-end piece of junk to run the game at hi-perf settings so long as it's an AMD-based comp.

    if the cpu specs are true I guess I can hang onto my X3 455 because I'm pretty sure it'll spank the 4150….. at least I hope it will.

    p.s. if the cpu specs are accurate Crysis 3 will be yet another modern game held back by the current consoles.

      • mesyn191
      • 7 years ago

      If they made an executable designed to deal with BD's quirks, then I wouldn't be surprised if it did well against a stock-clocked i7 2600k.

        • clone
        • 7 years ago

        I suspect Crysis 3 will be pretty easy on CPU’s overall because it was built with cross platform support in mind leaving the tweaks limited to the graphics side of things.

        that and the spec “guys” threw in requirements to make the game look like it’ll offer something different for desktop enthusiasts.

          • jessterman21
          • 7 years ago

          I dunno, the Alpha was killing my i3-2100 – but that was 16-player multi, too.

    • l33t-g4m3r
    • 7 years ago

    I still haven’t found a game that’s unplayable on my 470. lol. Might just hang on to it a bit longer.

    • gmskking
    • 7 years ago

    • Example 1 (Nvidia/Intel):
    • NVidia GTX 680
    • Intel Core i7-2600k
    • Example 2 (AMD):
    • AMD Radeon HD7970
    • AMD Bulldozer FX4150

    Interesting to see the FX4150 here compared to the 2600k. Not even close.

      • geekl33tgamer
      • 7 years ago

      Maybe they mean 8150. Oh, wait, even that isn’t even close…

    • redwood36
    • 7 years ago

    @ Yogibear (since for whatever reason this message board system really doesn't like my chrome plugins)
    Yeah, I guess I associate the two titles because I thought they were made with the same engine. Actually, as it turned out, that isn't true at all either. I'm aware that the streaming spaces of FC3 are more taxing than the enclosed spaces of Hitman, but there has to be some middle ground.

    • mcnabney
    • 7 years ago

    The minimum/recommended/high performance levels are USELESS without knowing what resolution you are playing at.

    Specifically, what is required to play Crysis 3 at high settings @1080p, @2560×1600, or @4K?

      • Vasilyfav
      • 7 years ago

      Well, for starters to play it @4k, you’ll need a $5000 monitor that doesn’t get released until next year.

        • JustAnEngineer
        • 7 years ago

        For five times as much, you could have it now.
        [url<]http://store.sony.com/webapp/wcs/stores/servlet/CategoryDisplay?catalogId=10551&storeId=10151&langId=-1&identifier=S_4KTV[/url<]

    • yogibbear
    • 7 years ago

    I like high end specs. I’ve never complained about them, and I don’t intend to start. More reason for AMD/Nvidia to innovate and make Intel’s integrated on-die GPUs look as tragic as they really are.

    • deinabog
    • 7 years ago

    Finally, a game to give my GeForce GTX 670 cards a workout. I wonder if Dragon Age 3 will have similar specs. And before anyone complains about the system specs let’s remember that we’ve spent the last few years lamenting the lack of high-end games that require muscular hardware to run with all of the bling turned on.

      • BestJinjo
      • 7 years ago

      It's one thing to give a GPU a workout through awesome graphics, and another to make a game that runs poorly and looks merely good, not spectacular.

      [url<]http://www.youtube.com/watch?v=h2NuT-zeAvk[/url<] Other games fall into this category, like Hitman, Sleeping Dogs, and Dirt Showdown. They look good but not spectacular, and yet they destroy GTX 670/680/HD 7970 GHz cards.

    • redwood36
    • 7 years ago

    I installed this game running an i7 920 and a 285 GPU and yeah – I gotta say it runs like crap. Which is basically ridiculous, since I run basically every game fine for the most part, obviously lacking DirectX 11. I played it anyway, thinking well, it's OK. But then I installed Hitman and tried that and realized what I was missing again from the smooth framerates. What's odd is that on my rig Hitman runs a lot better than Far Cry and looks a lot better too. Not being an expert on such things I'm not trying to say definitively, but it just seems like it's a poorly optimized game for anything other than the latest and best rigs, which seems like lazy coding to me. I understand we all hate consolification, but since when did that mean dropping support for older graphics cards so quickly? Three years ago my card was, what, like $300? I expect (and have seen) it to last longer than that in general (for most games; Witcher 2 gives some trouble, but only if one turns up the settings).

      • yogibbear
      • 7 years ago

      Ya do realise this article is about Crysis 3…. by Crytek….. not Far Cry 3… by Ubisoft……

      The 285 can’t handle the textures on high settings and AA and bloom effects etc. with the draw distance of Far Cry 3… whereas Hitman is set within confined locations with minimal draw distance issues and can focus on pumping out high resolution textures in the smaller environment.

    • Meadows
    • 7 years ago

    On a nigh unrelated note, I've tried Far Cry 3 and it's an amazingly positive surprise. On my brother's PC, with "Low" settings across the board, the game not only looks as good as the original Crysis did with medium/high settings but also runs considerably faster on the same old 8800 GT despite the higher resolution of 1440×900.

    And the game environment is two orders of magnitude greater, and can be traversed freely. The game is engaging and also requires a lot less driving around than FC2, or even none at all, if you prefer it that way.

    Crysis 3 will have to surprise me greatly in one of the above areas before I’ll pay attention. It just might; according to all this, third time’s the charm, right?

      • sweatshopking
      • 7 years ago

      it’s a pretty banging game. it’s my joint.

    • Arclight
    • 7 years ago

    You guys are trippin for nothing. I’m sure it will run better than the original Crysis….after all it’s the same game as Crysis 2 with new maps.

      • BestJinjo
      • 7 years ago

      You are too optimistic. Crytek and NV will work together to make sure people want to upgrade to GTX780 SLI. You can count on it :). You have tessellated vegetation and tessellated frogs ffs! For that alone, you need GTX690 just to get 30 fps minimums at 1080p.

      [url<]http://gamegpu.ru/images/stories/Test_GPU/Action/Crysis%203%20MP%20Alpha/test/crysis%203%20vh%201920.png[/url<] vs. Crysis 1 [url<]http://www.techpowerup.com/reviews/HIS/HD_7970_X_Turbo/10.html[/url<] Crysis 2 [url<]http://www.techpowerup.com/reviews/HIS/HD_7970_X_Turbo/11.html[/url<]

        • Arclight
        • 7 years ago

        If they have a slider for tessellation or an on/off option, I'm certain most mid-range cards will do just great. The link you posted said it was on very high quality and multiplayer……usually you get lower fps in multiplayer, and it's usually recommended you dial the settings down to guarantee 60 fps or higher. In single player I have no doubt the game will run fine even on 500-series cards.

    • Krogoth
    • 7 years ago

    Crytek doesn’t say what in-game settings and resolution they are using.

    As far as we know they could be setting the bar to 2560×1600 with 4xAA/AF, maximum in-game settings.

    • ronch
    • 7 years ago

    [quote<]Quad core GPU[/quote<] Does that mean my Radeon HD5670 with "400 Radeon Cores" is 100x faster than the recommended specs?

      • Srsly_Bro
      • 7 years ago

      It’s even faster if you add the frequencies of all 400 cores together.

        • Saribro
        • 7 years ago

        And multiply with the bitness of each core.

      • Arclight
      • 7 years ago

      Technically you are correct, the best kind of correct.

    • brucethemoose
    • 7 years ago

    These “example” specs smell fishy.

    -The i5-750 is faster than the FX 4150 (which is a budget processor, after all).
    -The 7970/GTX 680/2600k aren’t significantly faster than the 7950/GTX 670/2500k, so why list them at all?

    And if that doesn’t smell fishy to you…

    -[quote<]Quad core [b<]GPU[/b<][/quote<]
    -I think it's being released on the 360, PS3, and PC at the same time

    This is just marketing. As much as we want it to be, this isn't the second coming of Crysis.

    • kamikaziechameleon
    • 7 years ago

    After upgrading my computer and playing the original Crysis and Crysis 2 with all the HD expansions you have to download, I have to say the prospect of what they're promising this time around isn't so tantalizing.

    • ish718
    • 7 years ago

    I guess Crysis 3 will be the new benchmark for video cards

      • BestJinjo
      • 7 years ago

      I would put my $ on an unoptimized port of GTA V. That's bound to run like garbage on any high-end PC for 5+ years. 🙂

      The graphics in Metro Last Light screenshots look way more impressive than Crysis 3 gameplay. It’s easy to make a game that tanks any GPU (Assassin’s Creed 3 MSAA anyone?) but to make it a new benchmark for GPUs, it has to be next gen looking to warrant the type of GPU upgrades that Crysis 1 spurred after its release. Crysis 3’s graphics look nothing special.

    • colinstu12
    • 7 years ago

    Amazing. You need a decent PC to play a game at high-end settings. :eyeroll:

    no crap. Build a machine out of used parts for a good price… nothing that fancy looking to me.

      • DancinJack
      • 7 years ago

      What? A GTX 680 or HD 7970 will set you back a MINIMUM of 380 bucks. If you want to buy me one, go for it.

        • BIF
        • 7 years ago

        LOL, don’t get your hopes up, dude. With a name like “Jack”, it probably ain’t gonna happen. Unless it’s short for Jacqueline. And then that dancing had better be pretty damned good! 😀

    • Welch
    • 7 years ago

    Well I guess my i5-2500k will be fine for the high-performance build (not hi) and my 16GB of DDR3 1600 @ 1333 should cover it….. but ummmm…. All I need to do is win the 2 x 7870's and I should be fine 🙂

    It's sad that they don't ever mention what "speed" your memory should be running at… 8GB is pretty damn vague. I guess with memory controllers on chips as of the last 6-7 years, why care about the speed or latency of memory for games anymore, not to mention the pretty much 1GB standard for any video card (again, assuming GDDR5, as a lot of people forget to ask that).

    Then again, they are called “Guidelines” for a reason.

      • xeridea
      • 7 years ago

      Memory speed is never mentioned because it is irrelevant. There is only a small performance improvement for going above average, and the CPU matters 10 times more. Memory speed was more of a factor in the past.

      • Chrispy_
      • 7 years ago

      are you not overclocking your 2500K?

      It’s so easy that you can practically guarantee 4.2GHz even with the stock cooler and no voltage adjustments whatsoever. Just enter the BIOS and set the multiplier at 42.

      Maybe you’re overclocking already, but if you aren’t there really isn’t an excuse assuming you bought a Z or P-series chipset to go with your K.

    • willmore
    • 7 years ago

    HD6870 < HD5870, right? They did a bit of a changeup on the 6xxx generation so that the 69xx was the real successor to the 58xx, right? I sort of lost track for a while there. Living with a GF9800GTX+, I didn’t pay much attention to the stuff that was way faster than me.

      • DancinJack
      • 7 years ago

      They’re virtually identical according to TR’s review.

      • xeridea
      • 7 years ago

      The 6870 is virtually identical in performance for games, but power draw is a lot lower. There are fewer, more optimized shaders. The 6970 and 6950 are faster.

    • PenGun
    • 7 years ago

    Just blew up one of my two GTX 460s; guess it's time for a GTX 680. My LGA 2011 build is scheduled for February and I'll put it in that when it's built.

      • Srsly_Bro
      • 7 years ago

      Keep us posted. Your story intrigues us.

        • sweatshopking
        • 7 years ago

        yeah! let us know what version of linux you’re going to be running that 680 on! love to see what games it can’t play!

        • PenGun
        • 7 years ago

        Just seeing how bad my numbers could be but I actually am making out better than usual.

        You think I care. How sweet.

      • BestJinjo
      • 7 years ago

      Why would you waste $ on a GTX680 if you are doing a rebuild around February? Save your $400+ for GTX700/HD8000 series. GTX680 will be crushed like a little girl Bioshock in Crysis 3:

      [url<]http://gamegpu.ru/images/stories/Test_GPU/Action/Crysis%203%20MP%20Alpha/test/crysis%203%20vh%201920.png[/url<] "Future proofing" with GTX680 for Crysis 3 today is Lulz when next gen is coming out around spring 2013.

        • PenGun
        • 7 years ago

        I could wait and probably will wait till the sales get stupid, but I’m not going to need a monster hammer for all the games that are made these days.

        I very much doubt I will buy Crysis anything and waiting till March/April is longer than I am willing to wait.

    • cegras
    • 7 years ago

    Considering the i3-2120 is probably faster than the FX4150, I believe my system should be able to play this.

      • zzz
      • 7 years ago

      Y’know, there is a videocard component to this ‘can my machine play it’ equation.

        • cegras
        • 7 years ago

        I’m pretty sure I was discussing the CPU requirements.

        • Srsly_Bro
        • 7 years ago

        Go back to bed you crabby old man! Don’t forget to take your meds either!

    • bthylafh
    • 7 years ago

    So this will definitely have a 64-bit executable? Can’t use 8GB of RAM otherwise.

    About time 64-bit EXEs started showing up in games.

      • Narishma
      • 7 years ago

      Doesn’t say anything about the game using 8 GB of RAM, just that it’s recommended you have 8 GB.

      • sweatshopking
      • 7 years ago

      crysis 1 had a 64 bit executable.

      • jensend
      • 7 years ago

      Even if a game is 32-bit and doesn't use more than 4GB of memory per process, it can still benefit quite a bit from running on a system with more memory. There's tons of stuff that doesn't have to share the same process address space: OS, drivers, system services, caching, whatever other background stuff you've got running (tray apps, Steam, etc.), and more. There are even instances where it could be worthwhile to split some of the game's code into a separate process too.

      Honestly, while it’s important to use a 64-bit [i<]OS kernel[/i<], for the time being, most [i<]applications[/i<] would be best served by something like the [url=http://en.wikipedia.org/wiki/X32_ABI<]X32 ABI[/url<]- all the architectural advances of modern 64-bit x86 processors' long mode available but using 32-bit pointers. 64-bit pointers can be quite costly. Unfortunately, since nobody really put any emphasis on that kind of ABI back in the initial years of the 64-bit transition, it's unlikely that it'll catch on outside of specialized niches before we get to the point where more applications are bloated enough to actually benefit from a larger process space.
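      To make the pointer-size cost concrete, here's a minimal C sketch (my own toy example, with a made-up struct, nothing from any real engine). Build it once as 32-bit and once as 64-bit (e.g. gcc -m32 vs gcc -m64) and the pointer-heavy struct roughly doubles in size, which is exactly the overhead an x32-style ABI avoids:

      #include <stdio.h>

      /* A linked-list node: 'next' is 4 bytes on a 32-bit/x32 build and 8 bytes
         on a typical LP64 64-bit build, so the whole struct (with padding) grows
         from about 8 to about 16 bytes even though the payload is identical. */
      struct node {
          struct node *next;
          int value;
      };

      int main(void) {
          printf("sizeof(void *)      = %zu\n", sizeof(void *));
          printf("sizeof(struct node) = %zu\n", sizeof(struct node));
          return 0;
      }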

        • willmore
        • 7 years ago

        You are correct, but no one will +1 you. Sucks to be right sometimes.

          • HallsMint
          • 7 years ago

          I did. HAH, proved you wrong!

            • willmore
            • 7 years ago

            At least someone is helping, thank you.

            • HallsMint
            • 7 years ago

            I think that since it’s Monday people are in a more mischievous mood. Mondays are a drag

        • stdRaichu
        • 7 years ago

        If memory serves me correctly, 32bit programs running on 64bit windows are still limited to 2GB per process (or 3GB with the /3GB boot switch, which most applications don’t support) – there’s a 4GB address space, 2GB of which is reserved by default for the kernel. In reality, many apps will crash out at just over 1.8GB (hi, firefox!), and in a game there’s very little point in having any data above the 2GB boundary pushed out to swap.

          • jensend
          • 7 years ago

          Not so. [url=http://msdn.microsoft.com/en-us/library/windows/desktop/aa384219%28v=vs.85%29.aspx<]Thus spake MS:[/url<] "If the application has the IMAGE_FILE_LARGE_ADDRESS_AWARE flag set in the image header, each 32-bit application receives 4 GB of virtual address space in the WOW64 environment. If the IMAGE_FILE_LARGE_ADDRESS_AWARE flag is not set, each 32-bit application receives 2 GB of virtual address space in the WOW64 environment." That flag also allows a process to use 3GB of memory under regular 32-bit Windows if /3GB is set in boot.ini. Every memory-intensive program should be setting that flag. So WOW64 programs do get 4GB of per process address space. [url=https://bugzilla.mozilla.org/show_bug.cgi?id=556382<]Firefox included[/url<] starting with FF17.
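            If you want to see that from the program's side, here's a rough sketch (mine, plain Win32 calls, nothing game-specific): GetSystemInfo() reports the top of the user-mode address range the process actually received, roughly the 2 GB mark for an ordinary 32-bit build and roughly the 4 GB mark for a large-address-aware 32-bit build running under WOW64. Link with /LARGEADDRESSAWARE in MSVC (or flip it afterwards with editbin /LARGEADDRESSAWARE) and rerun to see the difference.

            #include <windows.h>
            #include <stdio.h>

            int main(void) {
                /* lpMaximumApplicationAddress is the highest usable user-mode address:
                   around 0x7FFEFFFF for a plain 32-bit build, around 0xFFFEFFFF for a
                   large-address-aware 32-bit build under WOW64 (exact values vary). */
                SYSTEM_INFO si;
                GetSystemInfo(&si);
                printf("top of user address space: %p\n", si.lpMaximumApplicationAddress);
                return 0;
            }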

            • just brew it!
            • 7 years ago

            [quote<]Every memory-intensive program should be setting that flag.[/quote<] No. The reason it's a flag (and not done automatically whenever you've got /3GB set or a 64-bit OS) is that there's a lot of legacy 32-bit code out there that does not properly handle addresses above the 2 GB boundary. Using the /3GB boot flag can also have adverse effects on the OS, since it reduces the amount of address space available to the kernel.

            • jensend
            • 7 years ago

            Bah. If you have problems with addresses above the 2GB boundary, the solution is to fix the signed/unsigned bugs, not to leave your application hamstrung with an artificial memory limit.

            Yes, /3GB is not for everyone who’s using 32-bit, and there are a lot of misbehaving programs and drivers out there. But it’s not as though setting the flag forces your 32-bit users to start using the /3GB switch. The point here is that it gives your WOW64 users the capability of using twice the process space; that it removes one more roadblock for those of your 32-bit users who would otherwise like to use the /3GB switch is a nice side effect. It has no drawbacks.

            Well, no drawbacks except the effort required to fix any old signed/unsigned bugs. For legacy non-memory-intensive apps that might not be worth the hassle, but people have now had about a *decade* to shake out the bugs and thus allow substantial improvements to the performance of their memory-intensive applications, and the discussion here was centered around new games.

            • jensend
            • 7 years ago

            Besides, even if your app doesn’t really need more than 2GB, having a larger process space speeds up mallocs by reducing address space pressure and fragmentation, and though I don’t know much about the security end of things, I’d imagine it also helps with ASLR.

            • just brew it!
            • 7 years ago

            Fixing the signed/unsigned bugs may be problematic if the bug happens to be in a library you don’t have source code for. I agree fixing the bug is the best technical solution; but it may not always be practical or cost-effective.

            • stdRaichu
            • 7 years ago

            As just brew it! points out, there are actually very few programs that have the large address aware flag set – simply because it was often an absolute nightmare to get it working. When every app and (most importantly) every device driver written for windows assumed the standard 2GB/2GB split, you can bet your bottom dollar that a great many programs would crap out completely whenever they came across something that didn’t respect the same memory boundaries. There’s a nice article on the potential pitfalls of using the /3GB switch here:

            [url<]http://blogs.technet.com/b/askperf/archive/2007/03/23/memory-management-demystifying-3gb.aspx[/url<]

            Edit: a quick google will show you a metric duckton of people who've experienced a great deal of driver breakage (especially video drivers) when the /3GB switch is set. Personally, dollars to doughnuts Crysis will be available as 64bit native, since it's *much* easier to compile a 64bit version than it is to make sure your application (and by extension all your target platforms) honours the /3GB switch properly - even MS ditched it as soon as they could, which is why they went x86-64-only for Exchange 2007. As an added bonus you don't have to ask users to edit their boot.ini 🙂

            • just brew it!
            • 7 years ago

            It’s not even that developers [i<]assumed[/i<] the normal 2GB/2GB split, per se. It's that they didn't think about it [i<]at all[/i<]. The 2GB boundary is where 32-bit addresses wrap around to negative numbers if treated as signed 32-bit integers. There's a lot of code out there that (incorrectly) treats addresses as signed integers instead of unsigned integers, and blows up if the storage for a piece of data happens to get allocated at a "negative" memory address. (As an aside, a similar issue is scheduled to cause a Y2K-style headache a few decades down the road. *NIX-style OSes -- and many of the applications that run on them -- store and manipulate timestamps as a 32-bit integer representing the number of seconds since midnight 1/1/1970. Systems which incorrectly treat this as a signed value will blow up when timestamps roll over to negative numbers early in the year 2038...)
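            A toy illustration of both failure modes (my own snippet, not from any real codebase): stash an address in a signed 32-bit int and anything past the 2 GB line turns "negative"; keep time as a signed 32-bit count of seconds since 1970 and it wraps in January 2038. The exact wrapped values are technically implementation-defined, but every mainstream compiler does the two's-complement thing:

            #include <stdio.h>
            #include <stdint.h>

            int main(void) {
                /* Address bug: an allocation that lands just past the 2 GB boundary. */
                uint32_t addr = 0x80000001u;
                int32_t bad = (int32_t)addr;      /* buggy code keeps addresses in signed ints */
                if (bad < 0)
                    printf("0x%08x looks negative (%d) to signed code\n", (unsigned)addr, bad);

                /* Y2038 bug: the last second a signed 32-bit time_t can represent. */
                int32_t t = INT32_MAX;            /* 2038-01-19 03:14:07 UTC */
                uint32_t wrapped = (uint32_t)t + 1u;
                printf("one second later: %d\n", (int32_t)wrapped);
                return 0;
            }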

            • stdRaichu
            • 7 years ago

            Indeed, the vast majority of drivers thought of the 2GB/2GB split as sacrosanct and many of them effectively had it hardcoded. IIRC if you wanted your hardware to be validated for use on an NT5 server, MS insisted it work in 2/2, 3/1 splits and both with/without PAE enabled, but there were no such restrictions for desktop systems.

            Re: the 2038 problem, it’s highly likely that it’ll cause problems much sooner than that – we had a legacy app at my last place that ran on an AS/400 (which was thankfully soon migrated over to running on 64bit RHEL) used for tracking tenancy agreements – many of which go a long way into the future. As soon as the first entry was put in past 2038 (think it was 2040-01-01) all hell broke loose. The amount of code in the app that had to be edited and then audited to use 64bit ints for time_t was quite staggering. As the saying goes, when you make assumptions you make an ass of u and mptions. And I say that as a sysadmin who’s also made some stupid mptions myself 🙂

            • Waco
            • 7 years ago

            This. I can’t count the number of times I’ve seen signed integers used to hold memory addresses in code…

            • chuckula
            • 7 years ago

            I’m already stockpiling canned goods for when the Y2K+38 bug hits!

    • internetsandman
    • 7 years ago

    I like how my specs are essentially equal to the high performance ones Crytek recommends; I have a 4GB classified 680 and a 2600k at 4.6GHz. Can’t wait to see if this game challenges my rig properly

      • moose17145
      • 7 years ago

      If the coding is as sloppy as Crysis 1's was, it will be challenging home PCs and supercomputers for years to come…

    • glacius555
    • 7 years ago

    I'm guessing they're tessellating the sky this time around. Or worse – the air…

      • internetsandman
      • 7 years ago

      I actually LOL’d at the idea of tessellated air

        • BIF
        • 7 years ago

        Me too, and I have no idea what tessellation is. I’m thinking “exploding vacuum cleaner bag”, but that’s probably not it, and it makes me want to sneeze anyway!

          • BIF
          • 7 years ago

          -5? Did I insult your grammas or something?

          Awww come on, you guys can do better than that. Up your tesselators with your downvotes, the whole lot of you! 😉

      • nicktrance
      • 7 years ago

      Well, they had tessellated water underneath most maps in Crysis 2, draining performance for no reason, so that's not too far-fetched.

        • willmore
        • 7 years ago

        Your reality makes me sad.

        • Deo Domuique
        • 7 years ago

        Jebus!… I thought it was just a joke or something, and I truly laughed. But then I saw your comment and by luck decided to type it into Google… Damn! I didn't know it was a full-blown story! It drove me back here, to TechReport… lol

    • HallsMint
    • 7 years ago

    Woo!!! My computer can play Crysis!!!

      • tbone8ty
      • 7 years ago

      but can it play crysis……..3?

      lolz

        • HallsMint
        • 7 years ago

        I see what you did there…

        My rig is almost compliant with the “high-performance” specs, so I would have to say yes

          • Jigar
          • 7 years ago

          Wait till the game comes out and you see your high performance system slowing down.. Muhahahaha..

            • HallsMint
            • 7 years ago

            I would think myself less of a man if that were the truth…

    • tbone8ty
    • 7 years ago

    hi performance specs: AMD FX-4150

    lolz

    I'd say minimum FX-6300 or higher

      • MadManOriginal
      • 7 years ago

      Yeah, I wondered about that counting as a 'Latest quad core processor.' Technically it's true if you go by AMD's marketing of it as a quad-core processor and the fact that it's their latest one, but the overall performance is, at best, similar to the 'recommended specs' quad cores.

      • just brew it!
      • 7 years ago

      Not sure that would make much difference; other than the “half load turbo” mode the 4150 is actually clocked higher than the 6300. I don’t know if the extra 2 cores or Piledriver tweaks are enough to make up for the clock speed deficit; it probably depends on whether Crysis 3 can make effective use of the 5th and 6th cores.

        • tbone8ty
        • 7 years ago

        ok then FX-4300

          • xeridea
          • 7 years ago

          You also forgot the FX-4170 (4.2 GHz base, 4.3 GHz turbo).

        • xeridea
        • 7 years ago

        At half-load turbo it's down by ~2.5%, but that's with 3 active cores rather than 2. The PD IPC boost is ~5%. If a 5th thread were used for anything at all it would be a clear win. I wonder why the FX-6300 is only $10 more than the FX-4300; the clock difference is negligible at worst, and that's with more active cores, plus you get 2 more cores. It's like -2% to +48% performance for 7% more cost.

        • MadManOriginal
        • 7 years ago

          It would be a freakish programming feat if Crytek could get an FX-4xxx CPU to noticeably outperform a first-generation Intel Core true quad-core that only counts as 'recommended'.

          • BestJinjo
          • 7 years ago

          [url<]http://gamegpu.ru/images/stories/Test_GPU/Action/Crysis%203%20MP%20Alpha/test/crysis%203%20proz.png[/url<] People who played the Alpha would tell you that GTX690 can't even max this game out at 1920x1080: [url<]http://gamegpu.ru/images/stories/Test_GPU/Action/Crysis%203%20MP%20Alpha/test/crysis%203%20vh%201920.png[/url<] I never pay attention to PC specs for any game anymore as they are never right. The recommended specs are useless. This game is going to crush current generation of GPUs unless performance somehow improves 2-3x from Alpha.

    • south side sammy
    • 7 years ago

    and here we are, again, in dire need of a new higher-end graphics card because nvidia and amd nickel-and-dime us to death with small incremental advances in graphics card performance, release after release.

    they do have the knowledge and the wherewithal to give us something adequate to "future proof" but it's not good business, so we keep getting screwed……….. just like processors. They won't give you 5GHz out of the box but you can overclock it and void your warranty………

    what happened to the new millennium's 8800gtx? even the 680 gets pounded. that came out when? …………..

      • internetsandman
      • 7 years ago

      Show me a game in which an 8800gtx beats a 680. Show me any game in which a 7970 or a 680 isn't among the highest-performing graphics cards (all other things being equal). The reason they don't give 5GHz straight out of the box in CPUs is that not every chip/motherboard can deliver that, and not every heatsink can dissipate that heat.

      Your conspiracy theory is bad, and you should feel bad

        • south side sammy
        • 7 years ago

        hmmmmm……….. the "8800gtx of the new millennium" ……….. remember when the 8800gtx came out? leaps and bounds ahead of its time. wasn't a game to bring it to its knees for 3 years. Now you buy a gtx680 and in 6 months it's irrelevant………. kind of like your post.

        the processors. for how many years has the performance stagnated? I want you to think about that before spurting out. we've been in the 2.8 to 3.5 area for how long? the only way to get beyond that was to overclock……. voiding your warranty. all the while we were being told what we really wanted was something other than pure speed, while they were doling out very small speed advances to the consumer.

        how does low processor speed affect your life in anything that you might be using your computer for? if you say it doesn't, you should invest in an APU………. granny.

          • internetsandman
          • 7 years ago

          First of all, I hate using the voting system, but already you’re getting down votes and I’m getting upvotes, so there’s that.

          Second of all, if you expect a massive leap in technological capability from every single generation, you’re gonna be sorely disappointed, because as these chips get more and more advanced, it becomes harder to advance them further.

          A GTX 680 is only irrelevant in 6 months for people whose e-peen is incredibly sensitive to the opinions of others. People are still using 5870s for a lot of games and not experiencing any problems at all, and in fact I'm currently using a 680 and from the performance it's given me I don't expect to NEED to replace it for several years. I may WANT to upgrade or buy another for SLI, but it certainly will not become outdated anytime soon.

          What's wrong with being at 3.2GHz for a stock clock? IIRC 3.2 lines up rather nicely with some math that I don't quite understand but am given to assume is very convenient for default speeds and such. Overclocking is so painless now anyway, and the processors are so reliable, why does it even matter if you void your warranty if you get to have 5GHz or more? Especially since you can get CPUs that can do that across six hyperthreaded cores as long as they have the cooling for the job. There are many, many more things that contribute to improved performance than mere clock speed; any half-decent CPU or GPU review will tell you this.

          Lastly, I routinely transcode large (2-4GB) video files on my system as well as play numerous games, so obviously processor speed is important to me; however, I'm not so ignorant as to demand that Intel start selling six-core, 5GHz unlocked processors for a couple hundred bucks. I'm aware of the technological challenges each successive node shrink and architectural advance face, and I find it damn impressive what CPUs have managed to evolve into, both on the far low end as well as the bleeding-edge high end.

            • south side sammy
            • 7 years ago

            I don’t know what the voting system is and i don’t care. if nobody likes my opinion, too bad. you got it anyway.

            about the 680 being irrelevant…….. you still don’t get it ?

            • internetsandman
            • 7 years ago

            I do hope you’re aware that trolling doesn’t make you better in any way (if that’s indeed what you’re doing, which at this point is my best guess)

            • south side sammy
            • 7 years ago

            first it's conspiracy theories and then it's popular votes, now it's trolling…………. get over it Mr. Romney, you got beat. Leave us alone now.

            • internetsandman
            • 7 years ago

            Yup, trolling. I thought the Tech Report played host to better people than this

            • MadManOriginal
            • 7 years ago

            You thought wrong! buahaha

            No, but seriously, stick around and get to know frequent users and you’ll see people here are mostly light hearted and have fun even when arguing about stuff.

            • yogibbear
            • 7 years ago

            THUMBS DOWN FOR ALL!!!! Except SSK cause he’s a cutie pie!

            • sweatshopking
            • 7 years ago

            <3

            • Beelzebubba9
            • 7 years ago

            How is a GPU that’s too fast to be limited by any current game at the most common gaming resolutions ‘irrelevant’?

            My Radeon 4890 lasted me three years, I expect my 660 Ti to do the same.

          • modulusshift
          • 7 years ago

          You do realize clock speeds are not an accurate measure for comparison of CPU performance? Intel has massively increased the Instructions Per Clock since we first hit these speeds, because it found out trying to increase the GHz rating at the expense of everything else was a sucky strategy right around Pentium 4 and the NetBurst architecture.

          • ThorAxe
          • 7 years ago

          Clearly you never played the first Crysis. I ran 8800GTX SLI and that could barely cope with Crysis at 1920×1200 without turning down some settings.

            • south side sammy
            • 7 years ago

            the gtx260 wasn’t any better in that game either. vram limitations.

            that's what I meant about small incremental changes from one card release to the next. Nickel-and-dimed. Otherwise Crysis would have fared better, because we would have had faster and more capable cards. Think back about how slow the progression was. It's like the 8800 series lasted forever. Even the 560 sucked……… it was just another 8800 derivative. Nothing special. Not until the 600 series are we now seeing a change. With AMD's next release (8k) there's no way nvidia would dare to pull that with the 600 series or nobody would buy those cards. Think about it.

          • geekl33tgamer
          • 7 years ago

          You're an idiot. Clockspeed advances slow down in favour of higher core counts. It's much better to have 2/4/6/8 cores clocked at 3GHz than a single core at 5GHz.

          Modern-day CPUs can't (reliably, in most cases) run all their cores at very high clockspeeds either. You've got thermal and design limits to consider, not to mention the power consumption. You find me a single benchmark that shows a modern-day multi-core CPU being beaten by the single-core CPUs of the early 2000s.

          I know there are none. A 3.6GHz P4 is beaten by a 2GHz i3 CPU by a MILE in everything from gaming to encoding and general productivity. Your argument is pathetic.

          • ultima_trev
          • 7 years ago

          Wasn't a game to bring the 8800 GTX to its knees for three years? Um, no…

          The 8800 GTX came out late 2006.

          Crysis came out late 2007. Even at 720p, an 8800 GTX could barely manage high settings, let alone max it out.

      • albundy
      • 7 years ago

      i believe the term you are looking for is consumer milking. and no, it's not a conspiracy anymore, it's down to a science. nvidia wouldn't even exist without microsoft and directx, and ms wouldn't prosper so much without gaming companies using their api. in between it all, there has to be some major swine backscratching going on under the table for all these companies to collaborate for a piece of the pie.

      and to be on topic, it looks like i won't be playing crysis 3 anytime soon since i really don't want to buy another video card at the moment. probably will play it when steam has it in their dollar bin in the near future and when i decide to upgrade my video card, hopefully by the time radeon hits the 9 series. right now, i have a plethora of games lined up, with serious sam 3 coming up that i picked up for 5 bucks last month along with a $2 dlc from amazon.

      • Jigar
      • 7 years ago

      Your trolling was so bad that even Tejas felt ashamed.

        • south side sammy
        • 7 years ago

        stop already. the only one trolling is you……… or is it constantly spamming a thread with BS. grow up.

          • Jigar
          • 7 years ago

          This comes from the one that has -11 Thumbs down and counting ? LOL.

    • MadManOriginal
    • 7 years ago

    No mention of Origin required, but I’m just going to assume it is so I can look forward to a lot of QQ’ing about it.

      • StuG
      • 7 years ago

      Considering Crysis and Crysis 2 have shown back up on Steam, I do not think this will be Origin only anymore.

        • Phishy714
        • 7 years ago

        We can only pray..

          • Srsly_Bro
          • 7 years ago

          Do you think it’ll help?

            • yogibbear
            • 7 years ago

            We can only Prey. 😛

            • Phishy714
            • 7 years ago

            Well.. it sure as hell can’t hurt.

    • sweatshopking
    • 7 years ago

    watch me run it just fine on my q6600 and 4890

      • henfactor
      • 7 years ago

      I’d rather watch something else…..

        • sweatshopking
        • 7 years ago

        like me and no51 hug???!!?

          • no51
          • 7 years ago

          *~*~huggles~*~*

          • Srsly_Bro
          • 7 years ago

          I down voted you for not inviting me to the hug. I think we can fit Neely in too.

            • sweatshopking
            • 7 years ago

            you don’t need an invite. we’re bro’s for life, you know that.

      • Phishy714
      • 7 years ago

      …. at 15fps?

      Enjoy!

        • sweatshopking
        • 7 years ago

        i will!

        • willmore
        • 7 years ago

        Slide Show King?

        • Meadows
        • 7 years ago

        It runs fine on my brother’s old 8800 GT. Granted, “Low” detail, but it looked fine to me. The vegetation doesn’t allow you to notice detail culling often, and none of the textures were blurry to the point of annoying. Considerably higher FPS than you assert, too.
        I was actually impressed.

      • Arclight
      • 7 years ago

      Where’s the “hold my beer”?

      • jessterman21
      • 7 years ago

      Looks like it may be DX11-only, like the Alpha was…

      • Meadows
      • 7 years ago

      Krogoth? Is that you?

    • CampinCarl
    • 7 years ago

    I think that should be “Quad Core CPU” in the recommended list…

      • LauRoman
      • 7 years ago

      It's the second site we've seen this typo on.

      • derFunkenstein
      • 7 years ago

      [quote<]I assume that should be "Quad core CPU" up there. In any case, recommended requirements describe pretty much the kind of PC I'm running today: a Core i5-750 with 4GB of RAM and a Radeon HD 6870. (Yes, I realize I'm a few generations behind.)[/quote<]

      • MadManOriginal
      • 7 years ago

        GEE YOU THINK? DOES THAT MEAN MY GEFORCE FX 5600 WITH 4 PIXEL SHADERS=QUAD CORE GPU WON'T RUN CRYSIS?!

        • ULYXX
        • 7 years ago

        only on 640×480 inches.

          • moose17145
          • 7 years ago

          [quote<]only on 640x480 inches.[/quote<] DOOOOOD... that'd be a HUGE monitor! I want one!!! 🙂

    • jessterman21
    • 7 years ago

    Interesting that 1GB VRAM is a minimum requirement…

      • Chrispy_
      • 7 years ago

      Textures > blurry console-port mess.

      \o/

      • Airmantharp
      • 7 years ago

      I think it’d be hard to get a GPU with less than 1GB of VRAM that’s also worth even booting this game up on.

      • OneArmedScissor
      • 7 years ago

      It probably doesn’t exceed 512MB by much, but that could still really slow you down on an older card.

      I got burned that way by going with what was at the time a faster 128MB 6600GT instead of a cheaper 256MB card from that generation, which games went on to require for years to come lol.
