Reports: Assassin’s Creed Unity is a glitchy performance hog

Assassin’s Creed Unity came out yesterday, and folks who picked up the game on the PC sound not-so-pleasantly surprised. While Unity made headlines prior to release for its steep hardware requirements, even users with high-end hardware are reporting lackluster performance—not to mention a bevy of miscellaneous glitches.

Alec Meer of Rock, Paper, Shotgun, for one, says he “cannot get this sucker to consistently run faster than just over 40 frames per second.” Meer is playing on a Core i7-980X system with a Radeon R9 290X and 8GB of RAM. “What’s odd is that the performance changes relatively little if I pump the resolution up to 1920×1080 (or even 2560×1440) and stick the settings on High or Very high,” Meer adds. “Generally, it’s stuck around between 30 and 40.”

Radeon users aren’t the only ones complaining. In this thread on the NeoGAF forums, more than a few people with high-end GeForce cards say they’re seeing generally low frame rates—particularly during cut scenes—as well as occasional stuttering in Unity. Some recommend turning off PCSS soft shadows to improve performance, but by the look of it, not even a GeForce GTX 970 can sustain 60 FPS at 1080p with detail levels maxed out.

What about lower-end cards? PC World’s Hayden Dingman tried to play Unity on a GeForce GTX 760, which is slower than the GTX 680 Ubisoft lists as a minimum requirement for the game. Dingman says performance was “[r]eally rough.” While “most of the issues come during cutscenes instead of the actual game,” Dingman says he saw characters “popping into place every five feet” during Unity’s more crowded sections.

Also in the glitch department, Polygon has compiled a series of 11 videos showing various bugs, from jacked-up ragdoll animations to gravity-defying characters. Ubisoft’s QA department clearly has its work cut out.

A number of yesterday’s reviews suggest even the core game is a little half-baked, too. Polygon gave Unity a 6.5 rating and stated in its conclusion, “Ubisoft Montreal failed to fix the problems that have accumulated over so many annual releases. Combined with an uninspiring story, and a long list of considerable technical problems, Unity falls short of the fresh start Assassin’s Creed needs.”

Oh well. On the flip side, folks with lower-end PCs may not miss much if they pass this game over.

Comments closed
    • PainIs4ThaWeak1
    • 6 years ago

    http://www.metacritic.com/game/pc/assassins-creed-unity

    That is likely the lowest User Review score I've seen for any game reviewed on Metacritic. ... But most definitely THE lowest I've seen for any AAA title.

    (*Sits back and laughs to himself at Ubisoft*)

    • Ninjitsu
    • 6 years ago

    Well, I think most of us here screamed “POOR PORT” the moment the system requirements were announced.

    It’s telling that FC4 needs less hardware to run, and while it may well be a buggy launch, I’m not getting the feeling it’ll be a shit show like Unity is proving to be.

    This game is clearly unoptimised and CPU bound in a wasteful way.

    • Krogoth
    • 6 years ago

    Protip: Unity is a beta game rushed to the market to meet the holiday season.

    There is no other explanation.

    • sschaem
    • 6 years ago

    Most likely reported already, but UBI blames AMD (though they really blame DirectX 11 in their statement).

    In short, UBI said DX11 was made to handle only about 10,000 draw calls per frame, but they are doing 50,000 per frame. That’s five times what DX11 was designed to handle, and they give that as the reason for the glitches and performance issues.

    But this high draw call count is not a problem on consoles (as we’ve known for years now), just an issue on the PC platform.
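
    To make that concrete, here’s a rough sketch (illustrative, not Ubisoft’s code) of why draw call count is a CPU cost on DX11, plus the usual instancing workaround. The Npc struct is a made-up stand-in; the two Direct3D 11 calls are real API:

        #include <d3d11.h>
        #include <vector>

        struct Npc { UINT indexCount; };  // hypothetical stand-in for a crowd member

        // Naive: one DrawIndexed per NPC. Each call costs a few microseconds of
        // runtime/driver CPU time; at 50,000 calls, even ~2 us apiece is ~100 ms,
        // several times a 33 ms (30 FPS) frame budget, all on one thread.
        void DrawCrowdNaive(ID3D11DeviceContext* ctx, const std::vector<Npc>& crowd) {
            for (const Npc& npc : crowd) {
                // per-object constant buffer updates omitted
                ctx->DrawIndexed(npc.indexCount, 0, 0);
            }
        }

        // Batched: one instanced call submits the whole crowd. Per-instance
        // transforms would live in a vertex buffer declared with
        // D3D11_INPUT_PER_INSTANCE_DATA elements.
        void DrawCrowdInstanced(ID3D11DeviceContext* ctx, UINT meshIndexCount, UINT instances) {
            ctx->DrawIndexedInstanced(meshIndexCount, instances, 0, 0, 0);
        }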

    So this game is a prime candidate for Mantle. But being an Nvidia title, fat chance UBI even considered it. Better to blame AMD instead and wash the blame off their shoulders.

    Instead, UBI works with Nvidia on optimized drivers to push DX11 to its breaking point, and in their latest statement they put the entire blame on AMD. Nice, UBI.

    But the glitches are also reported on Intel/Nvidia machines… so WTF is up with UBI hating AMD so much that they run over them in those public statements?

    Knowing this game was developed for the Xbox One, with a low-end 1.6GHz CPU and 7870-class hardware, the blame belongs on DX11. Yet UBI puts it all on AMD.

    And this after the UBI fiasco of telling PS4 users that they won’t get 1080p, even though the game is not GPU-limited on the XB1.

    Thumbs down to Ubisoft for this release, all the hype, and the lame excuses.

      • Essence
      • 6 years ago

      “In short, UBI said DX11 was made to handle only about 10,000 draw calls per frame, but they are doing 50,000 per frame. That’s five times what DX11 was designed to handle, and they give that as the reason for the glitches and performance issues.”

      Nope, this was AMD’s reply when WCCF asked for a response to Ubisoft blaming AMD for the issues. Here is the reply WCCFTECH got from AMD:

      The game (in its current state) is issuing approximately 50,000 draw calls on the DirectX 11 API. Problem is, DX11 is only equipped to handle ~10,000 peak draw calls. What happens after that is a severe bottleneck with most draw calls culled or incorrectly rendered, resulting in texture/NPCs popping all over the place. On the other hand, consoles have to-the-metal access and almost non-existent API Overhead but significantly underpowered hardware which is not able to cope with the stress of the multitude of polygons. Simply put, its a very very bad port for the PC Platform and an unoptimized (some would even go as far as saying, unfinished) title on the consoles.

      I thought this was an Nvidia GameWorks TWIMTBP title??? It’s also funny how Nvidia and Ubisoft are blaming AMD for their “game work.” ROFLMAO

      • Pwnstar
      • 6 years ago

      Lazy devs blame their tools.

    • moose17145
    • 6 years ago

    Apparently a big part of the crappy frame rates can be “fixed” by disabling your internet / putting your computer offline.

    https://www.youtube.com/watch?v=bVdRBGfMze0&list=UU__Oy3QdB3d9_FHO_XG1PZg

      • estheme
      • 6 years ago

      This didn’t work for me. Still in the 40-50 range with all settings maxed at 1920×1080, with two GTX 970s in SLI on a 4930 with 16GB of RAM, using the Windows 10 evaluation. Crazy awful performance, considering.

    • UnfriendlyFire
    • 6 years ago

    So apparently the actual system requirement calls for a 5 GHz 4-core Cannonlake CPU.

    • UnfriendlyFire
    • 6 years ago

    If the game chokes on an i7-980X, then that means light gaming (30 FPS, low-medium graphic settings) on a laptop is not possible.

    • derFunkenstein
    • 6 years ago

    PC buyers are still getting the better experience. PS4 and XBone versions drop into the teens during fights:

    http://www.eurogamer.net/articles/digitalfoundry-2014-assassins-creed-unity-performance-analysis

      • Pwnstar
      • 6 years ago

      That is no excuse.

    • Krogoth
    • 6 years ago

    I invoke a new meme….. “Will it run Unity?” 😉

      • dashbarron
      • 6 years ago

      A wiseguy, eh? No one likes it when memes try to propagate new memes.

        • derFunkenstein
        • 6 years ago

        But Krogoth himself is a human meme. A meme has come up with a meme propagated from another meme. It’s memeception.

          • Krogoth
          • 6 years ago

          http://inception.davepedu.com/

    • Pax-UX
    • 6 years ago

    Glad I didn’t pre-order this; Ubisoft can thank Watch Dogs for that. I love Assassin’s Creed, and I’ll play this at some point when it hits the bargain bins and gets a few patches. I’m hoping for a solid release for Far Cry 4, though again I won’t pre-order. FC3 is one of the best all-round games I’ve played, even if a little short. Hopefully some bad sales increase quality, but I’d guess they’ll just blame Edward Kenway.

    • ChangWang
    • 6 years ago

    TBH, I didn’t expect any different. If you recall, Black Flag was a similar situation when it launched. Once I saw the PC recommended specs, I knew it was a wrap. Couple that with the “anonymous Ubisoft dev” who said we’re already at the max of the new consoles…

    It’s like, lemme get this straight… Other companies can release games all day and get good performance, but not you? So you automatically blame the hardware? Not going for it, dawg…

    • albundy
    • 6 years ago

    I’m just wondering if any testing was ever done on this game? Sounds like another driver update is necessary.

      • derFunkenstein
      • 6 years ago

      No time. It was delayed earlier in the year as it is. Launch day buyers are the testers.

        • albundy
        • 6 years ago

        Then why all the shock and awe? ’Splains it all, then.

    • BoBzeBuilder
    • 6 years ago

    Thanks, Obama!

      • albundy
      • 6 years ago

      I want what he’s smoking, drinking, and shooting! That is one awesome high!

    • srenia
    • 6 years ago

    http://psychcentral.com/lib/the-5-stages-of-loss-and-grief/000617

    Denial: “PS4 is 50% more powerful, no way.”
    Anger: “The game developer is lazy, boycott!”

    The last three stages haven’t been reached yet. Just know, PS4 fanboys, that in bargaining you’ll realize that Sony lied. In your depressive stage, realize that there is a better console: the Xbox One. Accept this in the last stage and buy the One.

    The PS4 is the same old tech as in PCs anyway. Don’t worry, new DirectX 12 cards are coming for PCs soon. In the meantime, enjoy the One and realize it’s already next gen.

      • Krogoth
      • 6 years ago

      https://www.youtube.com/watch?v=QbSQ8JbyQo8

    • odizzido
    • 6 years ago

    haha the link with all the videos of bugs. This is right after release too. What a mess.

    • juzz86
    • 6 years ago

    Not sure why he’s whingeing about 40 FPS, he’s getting 10 more than Ubisoft said he should.

    • ThorAxe
    • 6 years ago

    A GTX 760 is nowhere near a GTX 680. A GTX 760 is about 18% slower.

    A GTX 680 is more like a GTX 770.

      • Cyril
      • 6 years ago

      Doh. Fixed. Hard to keep track of all these rebrands.

    • jihadjoe
    • 6 years ago

    Someone just posted this on NeoGAF:
    [url<]http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-Action-Assassins_Creed_Unity-test-ac_proz.jpg[/url<] Looks like the game is indeed CPU limited. 5960X runs it very well. Also looks like it uses a LOT of cores, because the 3970k is significantly outperforming the 4770k. Edit: Reverse image search for the article that graph came from: [url<]http://gamegpu.ru/action-/-fps-/-tps/assassin-s-creed-unity-test-gpu.html[/url<]

      • the
      • 6 years ago

      Hrm, I have an uber 32-core box at home with a GTX 970. I wonder how much of the hardware it’d actually use.

        • jihadjoe
        • 6 years ago

        Here are the CPU load % charts:
        [url<]http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-Action-Assassins_Creed_Unity-test-ac_intel.jpg[/url<] [url<]http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-Action-Assassins_Creed_Unity-test-ac_amd.jpg[/url<] It seems to load 8 core CPUs just fine, which is sort of expected since the consoles are 8-cores. Intel seems to get lower overall utilization, probably because Haswell-E has high enough IPC to not be CPU-limited, but it does spawn 16 threads on the 5960X, and the FX8350 sees 90% load across all its cores.

          • the
          • 6 years ago

          Actually it looks like they’re getting close to 100% on the Haswell-E cores. The catch is that they’re using Hyper-Threading and running two threads per core, so neither thread can fully monopolize a core. Only when combined do you get close to 100%, as expected.

          Being able to use 16 threads in a game is rather impressive considering that most titles (and yes, there are a handful of exceptions) don’t scale beyond 4 threads.

            • jihadjoe
            • 6 years ago

            Those NPCs better have the smarts to show for all the CPU time eaten.

    • the
    • 6 years ago

    Any report on how well multithreaded this game is? Considering the scaling problems seem to be centered on the numerous NPCs running around, one would think that multithreading the AI would be a straightforward way to make this game scale better.
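
    For what it’s worth, here’s a minimal sketch of that idea (Npc and UpdateAi are hypothetical stand-ins, and a real engine would use a persistent job system rather than spawning threads every frame):

        #include <algorithm>
        #include <thread>
        #include <vector>

        struct Npc { float x, y; int state; };

        // Per-NPC AI step: must only touch this NPC's own state for the
        // lock-free partitioning below to be safe.
        void UpdateAi(Npc& npc, float dt) { /* pathfinding, state machine, ... */ }

        void UpdateCrowd(std::vector<Npc>& crowd, float dt) {
            unsigned workers = std::max(1u, std::thread::hardware_concurrency());
            size_t chunk = (crowd.size() + workers - 1) / workers;
            std::vector<std::thread> pool;
            for (unsigned t = 0; t < workers; ++t) {
                size_t begin = t * chunk;
                size_t end = std::min(crowd.size(), begin + chunk);
                if (begin >= end) break;
                pool.emplace_back([&crowd, begin, end, dt] {
                    for (size_t i = begin; i < end; ++i)
                        UpdateAi(crowd[i], dt);
                });
            }
            for (auto& th : pool) th.join();  // barrier: AI done before rendering reads it
        }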

      • Ninjitsu
      • 6 years ago

      But it’s also hard (if they hadn’t planned for it), and Ubi would rather have your money.

    • brucethemoose
    • 6 years ago

    In case anyone forgot, the “minimum” AMD CPU requirement for Unity is a Phenom II X4 940 (vs. a 2500K for Intel), and the “recommended” AMD CPU is an 8350.

    … Am I going crazy, or does that seem a bit off? There’s a BIG gap between the 940 and the 2500K; hell, there’s a big per-thread gap between the 8350 and the 2500K. If Unity can barely keep 30 FPS on a 4.0GHz 980X, I doubt a stock Phenom II X4 will be playable at all.

    • puppetworx
    • 6 years ago

    Is this the same game that had a review embargo set 12-hours after release so that consumers couldn’t know what they were getting into?

    • tipoo
    • 6 years ago

    Alt title: Unity is like every modern Ubisoft game.

    • Chrispy_
    • 6 years ago

    I’ll pick this up in the December 2015 Steam sale when it:

    1) Runs properly
    2) Is finished
    3) Is a $10 GOTY edition with all the DLC

    If it can run on an XBone there is no excuse for poor performance at all.

      • brucethemoose
      • 6 years ago

      That’s just it… It CAN’T run well on the XBone.

        • Waco
        • 6 years ago

        Yet we have CPUs that are 5-10x faster performing poorly too…

          • nanoflower
          • 6 years ago

          Ubisoft did up their design requirements for the PC so you get much larger crowds when playing on a PC vs playing on a console. I suspect that if the PC version had the same design requirements as the console we wouldn’t be seeing the issues with performance. Though the graphical glitches would still be there.

          Hopefully all of this is something that can be fixed in future patches from Ubisoft and not something that’s truly inherent in the design of the game.

      • Prestige Worldwide
      • 6 years ago

      I’ll get it for free from a relative (non-dev, so no pointing fingers for this mess) at Ubi.

      But to be honest I probably wouldn’t pay any amount of money for this game.

    • Meadows
    • 6 years ago

    Not sure what people expected after those ridiculous requirements.

      • derFunkenstein
      • 6 years ago

      I think they expected it to not actually need those requirements. As far as GPUs go, I still kind of believe that, but I think the CPU is definitely a hard requirement.

    • LoneWolf15
    • 6 years ago

    Really wish I’d pulled the trigger on Maxwell when Borderlands TPS was still being offered.

    Now I get to choose between three Ubisoft games and the one that sounded the most interesting looks to be half-baked. Nuts.

    • sschaem
    • 6 years ago

    I’ve always wondered: the people posting broken-game videos, are they running overclocked systems?

    Because many people seem to overclock beyond the “breaking point,” and it only takes one error in the FPU (which rarely causes a crash) for a small error to propagate into some crazy results.

    I don’t think UBI recoded the game logic for the PC, so unless we see the same weird videos from Xbox One gameplay, I would reserve judgment.

      • Beelzebubba9
      • 6 years ago

      In the 16 years I’ve been building overclocked PCs, I have NEVER experienced in-game slowdowns that were fixed by reducing the clock speed of my CPU/GPU.

        • lilbuddhaman
        • 6 years ago

        You need to add, “on a system that has been properly stress tested”.

          • Beelzebubba9
          • 6 years ago

          Very true. I can’t imagine completing a build without a stress test, so I didn’t explicitly call that out.

        • Deanjo
        • 6 years ago

        Insufficient cooling will cause the CPU to go into self-preservation mode, cutting clock speed, which of course shows up as slowdowns in a game.

        • sschaem
        • 6 years ago

        Not slowdowns. Glitches.

        You can see this running Prime95.

        Your system won’t crash, but it can fail mathematically.

        If you don’t run something like Prime95 overnight, you might never know your system is not 100% stable (stable != no blue screens).

        And this can also happen with different types of code. Your system might even pass Prime95 but fail with some other code that stresses other computational units.

      • tipoo
      • 6 years ago

      The XBO and PS4 versions have plenty of weirdness.

      http://abload.de/img/acv-21msbq.jpg
      http://abload.de/img/acvncsd2.jpg

        • sweatshopking
        • 6 years ago

        AT LEAST IT LOOKS GORGEOUS? NOPE.

      • Waco
      • 6 years ago

      Small errors in overclocks generally cause crashes versus glitches in games…

      • nanoflower
      • 6 years ago

      Everyone (PC/Xbox One/PS4) is reporting the same sort of issues. The glitches are in the Ubisoft code and not due to issues with the PCs or consoles.

        • sschaem
        • 6 years ago

        My comment was not about Unity on consoles, but about any report from PC gamers of random glitches.

        How many are attributable to game code vs. an overclocked PC on the borderline?

        I also assumed the glitches were PC-only, since this article focuses on the PC version.

        If the exact same glitches happen on all platforms, then in this case it’s game-code-related.

      • Meadows
      • 6 years ago

      That’s not how this works. That’s not how any of this works.

        • sschaem
        • 6 years ago

        You still have a LOT to learn young padawan…

      • ozzuneoj
      • 6 years ago

      As crazy as it sounds, in the 14 years I’ve been overclocking systems, I did have one instance of a system that would run seemingly stably but have extremely odd problems when overclocked… without crashing.

      It was my old Athlon XP 1700+ on an Abit NF7-S 2.0 back in 2002-2003. At one point I pushed it too far (from 1.46 to 2.2GHz, I believe) and it seemed to boot fine. I probably stress tested it, and it probably failed Prime95 or something similar without crashing the system, but I do remember trying to run my favorite stress-testing game at the time as well… Battlefield 1942 with a 64-bot single-player game on Stalingrad (tiny map). It was the most ridiculous thing I’d ever seen. There were soldiers getting stuck in walls and barricades, tanks flying through the air, bullets that did nothing, random deaths, the player getting lodged in the ground… I didn’t even know that a game could do these things without just crashing. Obviously I scaled back the overclock and the problems went away.

      To this day I’ve never had another game (or anything else) get “glitchy” from an overclock, but I also haven’t been as desperate for CPU performance as I was back in those days.

        • Waco
        • 6 years ago

        XP was a lot more tolerant of overclocking than any modern OS in my experience. Even Vista was great for finding “stable, but not” overclocks during upgrades from XP.

        • sschaem
        • 6 years ago

        Mathematical errors result in strange behavior but not crashes, so your experience seems in line with that.

        Floating point is rarely used for memory indexing (which would cause access violations/crashes) or for indexing jump tables (where executing garbage code results in an exception).

        And you can see this with Prime95 when pushing a system to its limit: you might be able to run it for an hour, and then suddenly “core 4” reports a computational error.

        So many of those crazy glitches people record could come from borderline systems, where 99.9999% of the time it’s fine, but then you get a floating-point error that causes visual problems (without any crash).
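
        A toy illustration (a chaotic update rule standing in for a physics step, nothing from the game): one tiny FPU error grows into a macroscopic difference within a second’s worth of frames.

            #include <cmath>
            #include <cstdio>

            int main() {
                double good = 0.5;
                double bad  = 0.5 + 1e-12;  // same state plus one tiny FPU error
                for (int frame = 0; frame < 60; ++frame) {
                    // logistic map: a stand-in for a nonlinear per-frame update
                    good = 3.9 * good * (1.0 - good);
                    bad  = 3.9 * bad  * (1.0 - bad);
                }
                // After ~60 frames (one second at 60 FPS) the two states differ
                // on the order of the values themselves: visibly "crazy results."
                std::printf("divergence: %f\n", std::fabs(good - bad));
                return 0;
            }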

    • derFunkenstein
    • 6 years ago

    Yeah, I guess I’ll be passing on this. Doesn’t change my feelings on its prequel, though. Black Flag is one of my all-time favorite games. Sounds like a letdown for the new one.

      • UberGerbil
      • 6 years ago

      Black Flag had its share of glitches too, didn’t it? At least I remember seeing videos of flying / submarining ships and such. Nothing like this, though (and if people are finding this many just hours after release, one can only imagine what kind of anomalies are still lurking, awaiting discovery). That said, in a sandbox world like this I might want to play strictly to be entertained by the glitches.

        • derFunkenstein
        • 6 years ago

        I’ve only ever played it on PS4 and haven’t noticed anything weird other than the fact Edward wants to climb on everything while I’m running around. I got used to it eventually and could limit his monkey tendencies. Maybe on other platforms though. Also I didn’t play it until a couple months after launch so maybe it was patched by then.

        • Meadows
        • 6 years ago

        A pirate submarine. Oh yes.

    • brucethemoose
    • 6 years ago

    Y’all are looking at this wrong. Unity is the first truly “next gen” game, as none of today’s hardware can run it well!

    Those “ragdoll animations”, floating, clipping, and partially invisible faces are all just human abilities from the future. What appears to be “falling through the map” is just a representation of trans-dimensional travel.

    Also, our perception of time is much slower in the future, so the low framerate compensates for it.

    • superjawes
    • 6 years ago

    Looking at the attached picture, the obvious explanation is that this game is haunted by the ghost of Michael Jackson.

    • ClickClick5
    • 6 years ago

    At least Far Cry 4 is back on steam.

    As before, I’ll wait for about two patches before buying FC4.

    As for Unity….eh.

      • sweatshopking
      • 6 years ago

      Wait for the new Ziggy’s mod. I seriously can’t tell you enough how much better that made the game. Seriously. If you have FC3, play it with Ziggy’s. The regular game is mediocre; Ziggy’s is fantastic.

        • nanoflower
        • 6 years ago

        Only if you want to play a harder game as that seems to be the main purpose of the mod.

          • sweatshopking
          • 6 years ago

          Not just harder. Much of the gameplay is fundamentally changed.

    • geekl33tgamer
    • 6 years ago

    Yay, Ubimization – It’s like optimization, but hasn’t improved since 1986.

    • sschaem
    • 6 years ago

    If you run a game at 1920×1080 and get the same FPS as at 2560×1440, you are not GPU-limited. You are CPU-limited.

    The i7-980X is 5 years old… yep, time flies. This was a CPU released in early 2010. The game doesn’t fully use all cores, so you’re really left with an old core architecture and 1066MHz memory.

    Ubi recommends an i5-2500K as its minimum, and the i7-980X falls just short of it? It’s most likely up to 25% slower than the Sandy Bridge i5 for 2-to-4-thread workloads.

    I wonder how an overclocked i7-4790K and an R9 290X perform… Or in other words, what CPU does it take to get a system that’s GPU-limited rather than CPU-limited in this game?

    CPU/GPU scaling in modern games would be interesting.

    And let’s bring back the Pentium Anniversary Edition for a laugh…

    Makes me wonder if the game is having big problems with DirectX 11 / OpenGL. Could Mantle have fixed most of the problems we’re seeing? (Or DirectX 12, when it’s ready?)
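
    For illustration, the resolution test from the first paragraph as a toy program (the FPS numbers are hypothetical, roughly what Meer reported; plug in your own measurements):

        #include <cstdio>

        int main() {
            // Hypothetical frame rates measured in the same scene at two resolutions.
            double fps1080 = 40.0, fps1440 = 38.0;
            double px1080 = 1920.0 * 1080.0;  // ~2.07 MP
            double px1440 = 2560.0 * 1440.0;  // ~3.69 MP, about 1.78x the pixels
            // If the GPU were the bottleneck, frame time would grow roughly with
            // pixel count, so 1440p FPS should fall toward fps1080 / 1.78 (~22.5).
            double expectedIfGpuBound = fps1080 * (px1080 / px1440);
            std::printf("expected if GPU-bound: ~%.1f FPS, measured: %.1f FPS\n",
                        expectedIfGpuBound, fps1440);
            std::printf(fps1440 > 0.9 * fps1080 ? "likely CPU-limited\n"
                                                : "at least partly GPU-limited\n");
            return 0;
        }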

      • derFunkenstein
      • 6 years ago

      Yeah, I have to agree for the most part. The 980X on a per-thread basis was surpassed by the i5-2500K. I’d be interested in seeing how a 290X system with a current-gen CPU does. My guess is the framerate is better, even if it’s a lowly i5.

      The one place I might disagree is that the effective bandwidth of that 1066MHz memory with 3 channels is roughly the same as 1600MHz dual-channel memory on a current Haswell system. Memory bandwidth is probably not the issue as much as the older CPU architecture.
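
      The arithmetic, assuming DDR3’s 8 bytes per channel per transfer:

      DDR3-1066, triple channel: 1066 MT/s × 8 B × 3 ≈ 25.6 GB/s
      DDR3-1600, dual channel: 1600 MT/s × 8 B × 2 ≈ 25.6 GB/s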

      • Laykun
      • 6 years ago

      http://cpuboss.com/cpus/Intel-Core-i7-980X-vs-Intel-Core-i5-2500K

        • derFunkenstein
        • 6 years ago

        Unless the game uses 8 threads maxing the CPU out (which, btw, it isn’t), then it’s slower than a 2500K. People forget just how badly Intel turned the high-end CPU market on its head with Sandy Bridge.

          • Laykun
          • 6 years ago

          The difference isn’t 25%, though; generally it’s closer to 15%. A small nit-pick. I think the point is that the 980X is still up there close to the top of the pile, and it’s definitely not considered a bottleneck for the 290X. Expecting anyone to have more than a 980X for this game is asking far too much, as it’s still much faster than current mid-range CPUs.

          I very much doubt the problem is in the graphics API; it’s more probably a game-logic problem. If I get my hands on the game I’d be tempted to run it through Nvidia Nsight or an API trace to see what they’re doing to build the scene; only once you’ve done something like that will you understand why it runs so slowly.

            • derFunkenstein
            • 6 years ago

            If the CPU is doing non graphic tasks it can absolutely be a bottleneck.

            • Rza79
            • 6 years ago

            Funny how derFunkenstein gets upvoted for being wrong about the 980X being a quadcore processor and you get downvoted for actually being right. :/

            • derFunkenstein
            • 6 years ago

            lololol whoops. 6 cores it is.

            My main point is that games are still mainly constrained by single-thread performance, because they’re not using more than 3-4 of them. That’s probably why.

            • Rza79
            • 6 years ago

            Agree, but the thing is that I’ve not seen more than a 10% difference between Nehalem and SB in games (at the same clock), and Alec Meer is running his 980X at 4GHz. That basically means that even a 2500K wouldn’t suffice, which is just ridiculous. If this game can run on four Jaguar cores, which are like 100% slower in games, then why does it even need a 2500K as a minimum in the first place? Lazy programming, that’s all.

            • derFunkenstein
            • 6 years ago

            It “can run” on those Jaguar cores but XBone and PS4 are both seeing fairly consistent sub-30FPS (edit: dipping into the teens if you watch the video below), so it’s not like the PC version is magically running worse.

            http://www.eurogamer.net/articles/digitalfoundry-2014-assassins-creed-unity-performance-analysis

          • Meadows
          • 6 years ago

          But that begs the question, why doesn’t it use 8 threads?

            • UberGerbil
            • 6 years ago

            Because threading is hard, and keeping 8 threads fully active in a game with a lot of shared state isn’t as easy as laypeople presume it to be. Even before you take into account all the non-technical considerations (ie time/resources, repeatability for testing, and so on). Given how much more QA this title appears to require, it’s probably just as well they didn’t also push the Dining Philosophers envelope.

            • Rza79
            • 6 years ago

            But you don’t need 8 threads to get good performance out of the 980X; 6 would do it. Some new games do actually use 6 threads.
            Don’t forget that the 980X is a 6-core CPU! The only thing it’s really lacking is AVX, but I highly doubt that this game makes any use of it.

            • Meadows
            • 6 years ago

            Well, they did complain about CPU usage on the consoles, and dozens of simultaneous AI simulations, so they might as well have done it properly.

          • jihadjoe
          • 6 years ago

          http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-Action-Assassins_Creed_Unity-test-ac_amd.jpg

          Maxes out all 8 cores of an FX8350.

            • derFunkenstein
            • 6 years ago

            then I officially have no idea what’s going on

            • dragontamer5788
            • 6 years ago

            WTF is this benchmark?

            • Meadows
            • 6 years ago

            No it doesn’t. If you add up the numbers, you’ll find that it’s 6 “full” cores at most (plus background tasks and OS overhead), and then you have to take into account the fact that only every other core is a full-fat one on an AMD FX.

            • BobbinThreadbare
            • 6 years ago

            “then you have to take into account the fact that only every other core is a full-fat one on an AMD FX.”

            Why? What does that have to do with utilization?

            • Meadows
            • 6 years ago

            I don’t know how much of the game stresses the FPU, of which an FX only has 4 and not 8.

            Edit: therefore, high utilisation may just be a side effect of the processor being inadequate the way it is.

            • jihadjoe
            • 6 years ago

            Isn’t gaming mostly integer math? I would’ve thought a well-threaded game coming from the new console ports might be something the FX series would excel at.

            • Jason181
            • 6 years ago

            No, it’s not. FP is used for the additional precision in the 3d rendering. I’m sure some games use the fpu more heavily than others, but there’s definitely a heavy fpu component.

      • Andrew Lauritzen
      • 6 years ago

      Agreed, but I don’t think that excuses the developers from doing a poor job on PC optimization. A 980x is still vastly more powerful than the console CPUs…

      • nanoflower
      • 6 years ago

      The answer to how well an overclocked i7-4790K and an R9 290X perform with this game is: poorly. For instance, John Bain (TotalBiscuit) is running an i7-5930K, 16GB of RAM, and dual 980s, and he can’t get the game to run at a consistent 60 FPS at 1080p. He has reported frame rates down into the 30s in some cut scenes. That’s after spending quite a lot of time playing around with the graphics settings to see if there were options that would significantly boost performance.

      The game just isn’t ready for general release, given the performance issues and the number of other bugs being discovered. Perhaps the majority of the issues will be resolved in a few weeks, but I wouldn’t advise picking up the game today.

        • ThorAxe
        • 6 years ago

        What drivers was Total Biscuit using? Just curious to see if the new ones make any difference.

          • nanoflower
          • 6 years ago

          Unknown, but given how much time he put into trying to get it working right, he was likely using the latest. Here’s his video on his troubles:

          https://www.youtube.com/watch?v=SgpzT5V5Mgs

        • Ninjitsu
        • 6 years ago

        He said he was using a 5930K, if I didn’t mishear him.

      • NTMBK
      • 6 years ago

      He also has that 980X overclocked to 4GHz… and that platform has triple-channel memory, so the 1066MHz memory gives the same bandwidth as 1600MHz memory on a dual channel platform.

      And it’s a game built for 8 core consoles. If it’s not utilizing all of the cores, something went seriously wrong with their engine design.

        • Ninjitsu
        • 6 years ago

        Poor multithreading can actually be detrimental to performance vs single threaded performance, and given the general poor performance of the game, I’m pretty sure this was poorly done too.

      • l33t-g4m3r
      • 6 years ago

      Meanwhile, consoles are using AMD CPUs that aren’t half as powerful as an i7-980X and probably can’t even match a Phenom II. This is an optimization problem, not a go-spend-another-$2,000-upgrading-your-already-decent-system problem.

      • UnfriendlyFire
      • 6 years ago

      Not everyone has the latest rig. Check Steam’s hardware survey: nearly half of gamers use dual-core CPUs (most likely mobile CPUs). The majority of them also have somewhere between an Intel HD 3000 and a mid-range desktop GPU.

        • torquer
        • 6 years ago

        Do we drive the industry forward by programming for the lowest common denominator?

        Setting aside the issue of poor code and optimizations…

        Everyone complained to high heaven about “consolitis” and the dumbing down of games and graphics to work on the last generation of consoles. Now you have new games coming out and people complain they won’t run well on a $300 PC. You can’t have it both ways, folks: you either want the industry to be pushed forward by demanding software or you don’t.

        People will reply and say this game is written poorly, and maybe it is, but every time there is a post about high hardware requirements, everyone freaks out. Seriously, people: if you want your $300 PC to be able to play the latest games, maybe you should buy a console. I for one welcome more demanding software; it’s what justifies my significant hardware investments.

          • Ninjitsu
          • 6 years ago

          Your argument would hold water if the game actually ran splendidly on high end hardware, which it doesn’t.

          No one has a problem with higher system requirements if the game actually needs them and uses them. High system requirements shouldn’t be a cover for poor coding, which all too often they are.

            • torquer
            • 6 years ago

            Can you show me an example recently of a game with high system requirements that wasn’t met with moaning, wailing, and gnashing of teeth?

            Not every game with high system requirements is that way because it’s poorly written. This one may very well be, but not all are. I was also replying directly to UnfriendlyFire, who was pointing out how most people have low-end machines according to the Steam hardware survey, so my point holds.

          • BobbinThreadbare
          • 6 years ago

          What is this game doing with the horsepower that gamers should be pleased with?

      • Joel H.
      • 6 years ago

      I benchmarked it with a Core i7-4790K and then re-tested with a Core i7-5960X just to be certain on this point.

      http://www.extremetech.com/gaming/194123-assassins-creed-unity-for-the-pc-benchmarks-and-analysis-of-its-poor-performance

      Link provided not as spam, but to demonstrate the point. No performance differences, even when I overclocked the Core i7-5960X. This game isn’t CPU-limited in a meaningful way at the high end. Turning MSAA off impacts performance far more dramatically, even at 1080p, suggesting that the GPU is saturated and the entire product non-optimized.

      EDIT to add: There’s some evidence that more cores can help SLI configurations (it’s not clear if they had AA enabled or disabled in these tests).

      http://www.pcgameshardware.de/Assassins-Creed-Unity-PC-258436/News/Benchmarks-GPU-CPU-AMD-Nvidia-Intel-1142535/galerie/2290866/#?a_id=1142535&g_id=-1&i_id=2290872

      I didn’t see differences in single-card performance, but I used 4x MSAA and above for testing.

      • Ninjitsu
      • 6 years ago

      I agree that the game’s CPU limited, but I strongly disagree that a hexacore Nehalem processor clocked at 4 GHz is not enough for a game like Unity.

      If four measly (by comparison) Jaguar cores get 15-30 fps at 1.6-1.75 GHz, I’d expect the 980X to be good for 60 fps at the very least.

      It’s poor optimization and the absence of a good multithreaded implementation, which speaks poorly of a AAA title released at the end of 2014.

    • Narishma
    • 6 years ago

    This is Ubisoft, that’s to be expected. They even managed the feat of making the game run choppier on PS4 than Xbox One.

      • sweatshopking
      • 6 years ago

      is that a feat or is it because THE PS4 IS GARBAGE AND THE XBOX ONE IS THE BEST THING THAT GOD EVER ALLOWED HUMANITY TO USE?

        • derFunkenstein
        • 6 years ago

        it’s a feat, because they have very similar APUs with MS’s being the lesser of the two.

        The only thing I can think of is that the Xbox CPU got a bump right before release. It runs at 1.75GHz instead of 1.6GHz.

        If it’s really as intensive on the CPU as certain Ubisoft nerds said, then it seems reasonable the extra 10% CPU is helping.

          • sweatshopking
          • 6 years ago

          PS4 boys will tell you that the PS4 has a faster CPU, because Sony never officially released its speed or how many cores it has, and the XBone only has 6.
          The XBone also has cloud offloading that the PS4 doesn’t.

          I honestly have no idea, nor have I console gamed since the SNES, but you know. I LIKE TO POST IN ALL CAPS.

            • derFunkenstein
            • 6 years ago

            They’ll tell you that but the reality is that according to Chipworks, they both have identical Jaguar-based 8-core designs with the Xbox running at 1.75 and the PS4 running at 1.6. The Xbox reserves 2 cores for the OS and the widely-held belief is the PS4 does the same.

            • lilbuddhaman
            • 6 years ago

            My guess is that the limitations of the cpu could be resolved with some code rewrites, but they were unwilling to do so (time+money).

            • tipoo
            • 6 years ago

            There is no way for Chipworks to know the clock speed. Physical features, sure, but clock speed isn’t something that’s visible in a die scan obviously.

            1.75 and 1.6 seem likely, though; not denying that. There was, however, a benchmark floating around showing the PS4’s CPU could do more; not sure why that is.
            Edit: Just curious, given the comment-less downvote: does anyone disagree that a die scan can’t tell you the clock speed?

            • lilbuddhaman
            • 6 years ago

            The XBone’s slow RAM evens things out (CPU-wise), I wager.
            The graphics performance gap will close tighter in the next wave of games (next 6-12mo), but from there on out the PS4 will dominate in fidelity + frame rate.

            • tipoo
            • 6 years ago

            I don’t think the bandwidth of DDR3 is a CPU disadvantage; CPUs don’t need nearly as much bandwidth as GPUs, and past a point latency matters more to them. The DDR3 there has slightly lower latencies than the GDDR5, I think, so wherever that weird CPU benchmark showing the PS4 in the lead came from, the RAM probably was not the reason.

            • BobbinThreadbare
            • 6 years ago

            Fluid clockspeeds are all the rage these days. I would not be surprised at all if the PS4 CPU has a dynamic clockspeed.

            • tipoo
            • 6 years ago

            I would be. Fully dynamic, I mean, because the point of consoles is predictable performance. Maybe a mode where a developer can choose 8 cores or 4 higher-clocked ones, but not a fully dynamic one.

          • Ninjitsu
          • 6 years ago

          I dunno, cause that implies a 980X at 4.0 GHz should be a lot better.

    • Peter.Parker
    • 6 years ago

    It shows once again that this is pure bad programming. That is why they had to cap the framerate on the XB1 and PS4 at 30. They will probably fix this one in the future, but they were definitely pushing the release date. UBI, once again, disappointing… That’s not to say I’m not going to buy this game; I’m still a sucker for some of their titles. I’m gonna wait a little while for an improved service pack.

      • nanoflower
      • 6 years ago

      The sad thing is that I’m hearing the game only gets about 20 FPS in many areas on the Xbox One and even the PS4 is having performance issues. So even though they reduced the graphics and cut down on the CPU requirements (fewer people in the cities to control) the game still performs at least as poorly on the consoles as it does on the PC.

    • Shoki
    • 6 years ago

    It’s pretty sad. I guess it runs poorly on all platforms.

      • christos_thski
      • 6 years ago

      Well, that about sums it up.

      According to the developers, next-gen consoles have “too feeble CPUs” to handle the game, while PCs with multiple times the CPU horsepower have… “something” too feeble to handle it, too.

      Call me cynical, but it seems like crap programmers all around.

        • Pez
        • 6 years ago

        This pretty much sums it up. TotalBiscuit on YouTube has an X99 system with GTX 980s in SLI and can barely get stable frame rates, and he reports lots of pop-ins and general bugs.

        All in all, it’s a sad release and very, very poorly optimised.

    • PrincipalSkinner
    • 6 years ago

    Is new DRM to blame?

      • nanoflower
      • 6 years ago

      No. While that was suspected at first, the number of problems being found with the game goes far beyond what any DRM would cause. The game just wasn’t ready for release. Likely someone committed to a date and, by George, they hit that date even though the game is buggy. I’m sure they’re happy about the pre-sales and the day-one sales they accomplished with their review embargo (no one was supposed to post a review until 12 hours AFTER the game went on sale), but the poor reviews are going to hurt any additional sales. At least until they finally fix the problems (which I’m sure they will).
