Assassin’s Creed Unity is too much for console CPUs

The latest game consoles aren’t even a year old, but developers are already bumping into the limits of the hardware. Loads of games run at sub-1080p resolutions, especially on the Xbone, which has less graphics oomph than its PlayStation counterpart.

Assassin’s Creed Unity is another casualty; senior producer Vincent Pontbriand told Videogamer the title will be limited to 1600×900 resolution and 30 frames per second on both consoles “to avoid all the debates and stuff.” GPU horsepower doesn’t appear to be the most important limiting factor in this case, though. According to Pontbriand, Unity is more bottlenecked by the CPU, which “has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.”

Source: Ubisoft

Developer Ubisoft Montreal apparently didn’t expect the CPU limitation. The team anticipated “a tenfold improvement over everything AI-wise,” Pontbriand said, but they were “quickly bottlenecked.” He describes the limitation as “frustrating” and says the game could be running at 100 FPS if it was “just graphics.” Seems like this generation of consoles has enough pixel-pushing prowess for prettier visuals but not enough CPU power for richer gameplay.

The Xbone and PS4 are both based on custom silicon with eight x86-compatible CPU cores. That might sound potent, but those cores are derived from AMD’s low-power Jaguar architecture, which is designed for mobile devices like cheap tablets and notebooks. Intel’s eight-core Core i7-5960X processor is a whole other class of CPU. So is the dual-core Pentium G3258 Anniversary Edition, for that matter.

Assassin’s Creed Unity is coming to the PC, and that version shouldn’t be locked to a specific resolution and frame rate. However, Pontbriand doesn’t say whether the AI and other systems will exploit the more powerful CPUs available on typical PCs. That outcome seems unlikely, especially since Ubisoft is optimistic about releasing Unity simultaneously for the PC and consoles.

Comments closed
    • tygrus
    • 5 years ago

    I assume the AMD CPU used in both has a turbo up to 2.75GHz, so the 1.6GHz-vs-1.75GHz difference is not a clear advantage for the Xbox.
    Maybe the CPU clock rate is being throttled because of the thermal budget. Reduce the GPU demand to increase the CPU speed and memory bandwidth available to the CPU. I wonder if a few system and game updates will improve performance?

    They should plan for a die shrink (below 20nm), 50% more max CPU speed and GPU improvement. If the CPU/GPU instructions and basic design are the same then game devs only need to add the option to increase resolution/FPS for the faster hardware. e.g. PS4.2, Xbox OneHD, both for 1080p60.

    • derFunkenstein
    • 5 years ago

    “Whoops, my bad, you guys.”

    [url<]http://blog.ubi.com/assassins-creed-unity-delivers-resolution-players/[/url<]

    • WiseInvestor
    • 5 years ago

    Those lazy bastards don’t know or don’t have the budget to fully use hUMA.

    You’re going to see 30 fps locked on almost all Ubisoft games. All that talk is aimed at providing an excuse for cutting corners and saving money.

    I’ve been boycotting Ubi-made games since 2002 and proud of it.

      • auxy
      • 5 years ago

      Consoles don’t have hUMA. Dirty little secret, kept quiet by the console makers, but AMD confirmed it around the same time the console makers were trashing on Mantle. (not that the two are related; not saying that.)

        • auxy
        • 5 years ago

        Downvote me all you want, but it’s true~ ( *´艸｀)

    • akaronin
    • 5 years ago

    Crappy code will make the mightiest CPU fall on its knees.

      • Airmantharp
      • 5 years ago

      Or GPU, or both- all hail Crysis!

    • squeeb
    • 5 years ago

    Let’s be real here, Ubisoft just blows.

    • maxxcool
    • 5 years ago

    This is laughable... all the rage at Ubi... I can’t find it because searching this site is a pain, but this has been discussed already at length.

    Games have a very sharp point of diminishing returns when it comes to core usage. I remember 4 being pretty much it. Maybe someone else can find the discussion... hell, it may have been elsewhere... dunno... but with DSPs doing most of the work there is not a lot left for the cores to do, and when they’re weak to begin with... well, surprise! Fail.

    Even on my rig with my almighty 1090T@4GHz and well-coded games, all I see is about 50% utilization across 6 cores. I’m pretty certain that won’t change because of magic APU dust; it will only get worse.

    • Arag0n
    • 5 years ago

    Xbone… should I say ps4 is “piece of shit 4”?

    • Meadows
    • 5 years ago

    I’ve never seen a game saturate 8 cores with one of today’s CPU architectures, no matter which type.

    This is literally an admission that Ubisoft’s programmers suck.

    Either that, or they’re lying about the bottleneck.

      • tipoo
      • 5 years ago

      That, and the fact that the PS4 and XBO are geared towards offloading a lot of compute to the GPU. I suspect these have a lot of headroom yet.

        • Meadows
        • 5 years ago

        But seriously. “AI and a [i<]number of[/i<] NPCs"? The devs can't possibly be using more than 2 cores if they get "bottlenecked" by things like that.

    • Billstevens
    • 5 years ago

    It’s a sad day when a budget two-core chip could have been easier to code for and more potent than the solution chosen. Consoles have the least excuse to bottleneck their own hardware given that they have full architectural and software control.

      • Narishma
      • 5 years ago

      They have a good excuse and that is cost. A major reason consoles are successful is that they are cheap. You can make them cheap either by using cheap and slow components or by subsidizing the high cost of high performance components for a few years. The latter didn’t work so well for them (especially Sony) last gen, so they went with the former, which seems to be working much better.

      • tipoo
      • 5 years ago

      Anything from Intel is pretty much a no-go for consoles. Microsoft tried with the OG Xbox, but Intel and Nvidia proved bad partners. Both want to keep control of their chips too much, while console makers want control of the chip to be able to tweak and shrink it on their own schedule. IIRC, Nvidia wouldn’t allow shrinks in proper time for RSX, and Intel for the Pentium in the Xbox.

      On top of that, Intel charges far higher margins and doesn’t want to touch low-margin stuff (AMD’s margins in consoles are low even now).

      AMD and IBM are much more willing to license things out and do custom work.

      And I don’t know that a Core 2 Duo would be more potent; per-core yes, but not in total. Jaguar, core for core, clock for clock, was about half as performant as an Ivy Bridge i5, so if six of them are well used the total would be well above a Core 2 Duo.

    • joyzbuzz
    • 5 years ago

    B-O-G-U-S. Title of this article is SOOOO bogus. Mark Cerny said there’s at least five years performance headroom built into the PS4 hardware – it will take at least five years for developers to fully tap the performance potential of the PS4 hardware. Ubisoft isn’t remotely close to tapping the power of the PS4. Guarantee there will be games coming Q4 2015 that blow away Assassin’s Creed Unity graphics AND AI. And games coming in Q4 2016 that blow THEM away.

    To state “The latest game consoles aren’t even a year old, but developers are already bumping into the limits of the hardware” is nonsensical.

      • Airmantharp
      • 5 years ago

      Performance headroom- like performance gains from improved drivers and game patches- comes from a better understanding of how to optimize for the hardware and intermediate software platform(s) involved.

      Thing is, unlike the last Xbox and every Playstation, these consoles are the least complicated to develop for since the first Xbox’s Celeron and pseudo-Geforce 3. It seems rather logical that they’ll be bumping up against the limitations of the hardware pretty quickly.

        • Bensam123
        • 5 years ago

        How long have oct-core processors been the norm for?

          • Airmantharp
          • 5 years ago

          Wouldn’t matter if they’d used a 16-core CPU, given that the last cores are going to be doing exactly the same thing.

            • Bensam123
            • 5 years ago

            I don’t understand. I’m talking about breaking down a program into multiple threads to properly utilize a higher-core-count processor. If it’s built around single-thread performance or lumping everything onto one or two cores, it won’t work well on an eight-core processor... or sixteen.

            If people are used to designing around 2/3/4 cores, it won’t be designed to utilize eight cores well.
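
            (Purely as an illustration of the kind of work-splitting being described here, and not anything from Ubisoft’s engine: a hypothetical sketch that chunks the per-NPC AI work across however many hardware threads exist, assuming the NPC updates are independent of each other.)

            [code<]
            #include <algorithm>
            #include <cstddef>
            #include <thread>
            #include <vector>

            struct Npc { float x = 0.f; int state = 0; };

            // Hypothetical stand-in for pathfinding, decision making, etc.
            void updateAi(Npc& npc, float dt) {
                npc.x += dt;
                npc.state = (npc.state + 1) % 4;
            }

            // Chunk the NPC list across all available hardware threads.
            void updateAllNpcs(std::vector<Npc>& npcs, float dt) {
                const std::size_t workers = std::max(1u, std::thread::hardware_concurrency());
                const std::size_t chunk = (npcs.size() + workers - 1) / workers;

                std::vector<std::thread> pool;
                for (std::size_t w = 0; w < workers; ++w) {
                    const std::size_t begin = w * chunk;
                    const std::size_t end = std::min(begin + chunk, npcs.size());
                    if (begin >= end) break;
                    pool.emplace_back([&npcs, begin, end, dt] {
                        for (std::size_t i = begin; i < end; ++i)
                            updateAi(npcs[i], dt);   // chunks never touch the same NPC
                    });
                }
                for (auto& t : pool) t.join();
            }

            int main() {
                std::vector<Npc> crowd(10000);
                updateAllNpcs(crowd, 1.0f / 30.0f);  // one 30 fps tick's worth of AI
            }
            [/code<]

            The catch is that real AI work is rarely this independent: NPCs react to each other and to the player, which is presumably where the "quickly bottlenecked" part comes from.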

            • Airmantharp
            • 5 years ago

            The assertion seems to be that they did break the game’s work down for the six or eight cores available, and that still wasn’t enough for their design goal.

            • Bensam123
            • 5 years ago

            I was pointing out that developers have been working with 2/3/4 core processors for a really long time and may not yet possess the expertise to take full advantage of an eight-core processor. If you see one core sitting at 100% it doesn’t matter what the rest of them are doing.

            • Airmantharp
            • 5 years ago

            I agree that that has been the case; but the reality of programming for these new consoles and having to split up work is several years old. We’ve already seen with Crysis 3 that even rudimentary video-game code can split its work up amongst many cores, and all the major engine houses have certainly also moved in that direction.

            In the case of Assassin’s Creed, it looks like the extra grunt is needed completely for AI, and if they were able to split AI among two cores, then they could just as easily have split it amongst four or five.

            Of course, I’m not saying that they even remotely programmed the game efficiently. I’m just suggesting that it’s possible that they did, and that they really have found a hard limit for their usage case.

            • Bensam123
            • 5 years ago

            Yes, for 2/3/4 core processors. Eight cores hasn’t been the norm. Remember everyone a few years ago was saying how games don’t use more than two cores so a four-core processor is pointless?

            You can verify this yourself by looking at games and task manager. They don’t all use cores evenly and it very much depends on programming. Even if something is made for quad-core, that’s not made for octa-core. Games like Planetside 2 for instance still very much have problems with high core count processors and favor faster processors with fewer cores.

            Saying things like ‘easily’ and ‘should’ does not mean that’s what happens, especially when it comes to Ubisoft.

          • maxxcool
          • 5 years ago

          You mean 6-7 cores with OS priority override for messaging, streaming, network stack offloading, disk access, rendering two separate screens, etc., etc.

          There is not a lot of CPU power there to begin with, and the way MS uses the base OS is not helping.

      • maxxcool
      • 5 years ago

      “”Mark Cerny said there’s at least five years performance headroom built into the PS4 hardware””

      Really... every other day we see devs “cutting back” on rez, cutting back on graphics detail just to get it to work. I think we’re at peak now.

      • ronch
      • 5 years ago

      Considering many PC titles are ported over to these new consoles and most rely on the fewer but faster cores that PCs have, you realize those Jaguar cores are going to be overwhelmed. I reckon many were expecting these new consoles to force devs to use a ton of cores to benefit PC gaming, but it looks like it’s the other way around. Devs will still be lazy and just stick to fewer but faster cores (PC) and then hope the title runs well enough on these consoles. That’s the downside of x86 inter-compatibility between PC and console.

        • Airmantharp
        • 5 years ago

        Pretty sure the ports go in the other direction…

    • sschaem
    • 5 years ago

    Just to repeat once more, what many jumped at reading this.

    So this game could run at 100+ FPS if it wasn’t for the CPU….

    Then why in hell is it rendering at 1600×900 and not 1920×1080?!?

    What a freaking LOAD of baloney.

    The PS4 in Ubisoft’s hands is like giving pearls to a pig 🙁

      • exilon
      • 5 years ago

      Maybe he’s talking about the CPU frame time? That is a thing.

        • sschaem
        • 5 years ago

        Direct quote, it’s not ambiguous:

        “We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realised it was going to be pretty hard. It’s not the number of polygons that affect the framerate. We could be running at 100fps if it was just graphics, but because of AI, we’re still limited to 30 frames per second.”

        Also, about his CPU expectations:

        Xbox 360 is triple-core with 6 HW threads at 3.2GHz.
        Xbone is 8 cores at 1.6GHz (1 or 2 cores reserved).

        If you expect a 10x speedup, the IPC would need to have jumped by a factor of over 8x.
        And we knew from day one that Jaguar could not execute 8 to 16 instructions per cycle.

        But one thing Ubi should have known is that the next gen HSA GPU was fully programmable.

        Anyways, my problem is them saying the CPU is holding them back, not the GPU side.
        Yet they chose to run the GPU at 900p vs 1080p.

          • exilon
          • 5 years ago

          That’s very ambiguous as to whether he’s referring to CPU frame time or GPU frame time unless you were fishing for conclusions.

          There’s no way the GPU on those consoles can push 100 FPS at that quality to begin with, so why are you assuming he’s referring to GPU time?

          • Jason181
          • 5 years ago

          Two threads on one core is probably only good for about 30% additional performance at most, not 100%.

          3.2 GHz x 3 cores x 1.3 = 12.5
          1.6 GHz x 8 cores = 12.8

          I’m sure the overhead is a little more on the xbone, but not that much. Sounds like you’re comparing IPC per thread, and not IPC per core. Big difference when you’re running multiple threads per core versus one thread per core.

          • tipoo
          • 5 years ago

          “a tenfold improvement over everything AI-wise” doesn’t mean a tenfold IPC improvement. It just means in AI specifically, and I believe it could be close. AI is inherently branchy code, and that absolutely slaughtered the Cell and Xenon, which had long pipelines with massive stalls. Jaguar’s pipeline is much shorter and more AI-friendly.

          • maxxcool
          • 5 years ago

          “”But one thing Ubi should have known is that the next gen HSA GPU was fully programmable.””

          HSA does not accelerate INT at any reasonable rate like it does FP... AI is mostly branching logic as I recall, so HSA would be largely useless, as the cost would be stupidly high and detract from the GPU and reserved GPU assets.

          Cell was a complete bitch to program for but would make a better AI engine than that APU ever can be.

            • mesyn191
            • 5 years ago

            Cell was awful at branchy code and the LSU management was so horrible that even if it did actually perform well at branchy code on a per-SPE basis, overall performance still wouldn’t meet or beat a Jaguar APU at the same task.

            Cell was very good, particularly for its time, at the same sort of tasks GPUs are (i.e. relatively simple, highly parallel workloads) but as a general-purpose CPU it was fairly crappy. Most PS3 developers ran all the AI on the PPE instead of the SPEs for a reason. The SPEs mostly ended up being used to augment the mediocre RSX where they excelled, though they were also sometimes used for surround sound and physics too.

          • ronch
          • 5 years ago

          I would think those Jaguar cores aren’t really that much faster than IBM’s cores even if we assume Jaguar has 50% higher IPC, and if we also assume that two of those cores are system-reserved. Here’s the math:

          6 x Jaguar cores = 6 x 1.6 x 1.5 = 14.4

          3 x IBM cores = 3 x 3.2 = 9.6

          14.4 Γ· 9.6 = 1.5

          So, I would think Xbone is only about 50% more powerful than X360 on the CPU side, and that’s also assuming devs would fully harness all 6 cores, which is gonna be tougher than utilizing 3 cores.

            • mesyn191
            • 5 years ago

            Xenon’s IPC was 0.2 in real-world use.

            [url<]http://i.imgur.com/aWDYoew.png[/url<]

            Bear in mind that the PPE in Cell was very similar architecturally and was also made and designed by IBM. The harsh reality is that both Xenon and Cell were at best mediocre general-purpose CPUs despite the massive peak performance numbers they showed on paper.

            Jaguar has an IPC of 1.1 in real-world use.

            [url<]http://www.extremetech.com/wp-content/uploads/2012/12/Jaguar-IPC.png[/url<]

            Modern high-end x86 CPUs like Haswell don’t get an IPC greater than 2, FWIW. It’s a fundamental limitation of the x86 ISA, which is part of the reason why Intel tried to develop a VLIW ISA with IA64. Despite the low-clock-speed, low-power-oriented nature of the Jaguar core, its IPC really isn’t bad at all.

            Real-world performance of the Xb1’s CPU is probably more like 5-6x Xenon’s. It’s also much easier to hit that performance due to it being an OoOE CPU with a fairly modern cache structure that runs at the same clock speed as the CPU core.
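
            (A rough back-of-the-envelope check of that 5-6x figure, using the IPC numbers above and assuming six game-available Jaguar cores against three Xenon cores, ignoring any SMT gains on Xenon:)

            \[
            \frac{6 \times 1.6\,\text{GHz} \times 1.1}{3 \times 3.2\,\text{GHz} \times 0.2} \approx 5.5
            \qquad
            \frac{6 \times 1.75\,\text{GHz} \times 1.1}{3 \times 3.2\,\text{GHz} \times 0.2} \approx 6.0
            \]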

        • Pwnstar
        • 5 years ago

        He’s not.

        • Bensam123
        • 5 years ago

        How would lowering the resolution reduce CPU frame times?

      • Jason181
      • 5 years ago

      [quote<]the title will be limited to 1600x900 resolution and 30 frames per second on both consoles "to avoid all the debates and stuff."[/quote<]

      The 1600x900 is a concession to the Xbox, it sounds like. The article’s title is completely correct in that it’s a problem with AI.

      • MathMan
      • 5 years ago

      Maybe the CPUs are BW limited? And the BW is shared with the GPU. Running at 1080p then would only make it worse.

    • Mat3
    • 5 years ago

    Are they making good use of all the available CPU cores? Or is it being bottlenecked by one core with others less than full load?

    GCN was designed for more than just processing graphics. Did they look into offloading some of the AI and physics to the GPU?

    Things we don’t know…

    • Bensam123
    • 5 years ago

    Oh god, consoles are holding back PCs, this has never happened before! Stop the presses!

    Jokes aside, I wouldn’t want an entire generation of consoles based off of two cores. Putting aside how fast those cores are, it essentially is going to destroy any sort of PC ports as everything will be made and optimized for JUST two cores. Four cores or more is a good place to go. It’s taken long enough to start moving away from four cores for PCs, we don’t need another decade of dual cores.

    Also keep in mind games have been made for low core counts for the last decade or so; it’ll take a while before they’re able to optimize for eight cores properly.

    • ish718
    • 5 years ago

      A Core 2 Quad Q6600 would probably outperform Jaguar by far, lol.

      • sweatshopking
      • 5 years ago

      Yes.

      • sschaem
      • 5 years ago

      If you run legacy code, the Q6600 would have a 10 to 20% lower execution throughput.
      With AVX code, the delta could be much greater.

      • Meadows
      • 5 years ago

      Only if you let the console overheat.
      [url<]http://www.cpu-world.com/Compare/563/AMD_FX-Series_FX-8350_vs_Intel_Core_2_Quad_Q6600.html[/url<]

      AMD’s architecture has higher single-thread performance and about twice the potential performance as a result. While the Jaguar implementation is indeed slower, it also doesn’t have a TDP of either 125 or 105 W but considerably less. Had they gone with a Q6600 the consoles would either sound like a hair dryer or burn to the ground. And had they underclocked said Q6600, then the console performance would be even worse than it is now.

        • derFunkenstein
        • 5 years ago

        Comparing to a Vishera CPU is irrelevant.

      • tipoo
      • 5 years ago

      Copy-pastaing my own comment, but anything from Intel is pretty much a no-go for consoles. Microsoft tried with the OG Xbox, but Intel and Nvidia proved bad partners. Both want to keep control of their chips too much, while console makers want control of the chip to be able to tweak and shrink it on their own schedule. IIRC, Nvidia wouldn’t allow shrinks in proper time for RSX, and Intel for the Pentium in the Xbox.

      On top of that, Intel charges far higher margins and doesn’t want to touch low-margin stuff (AMD’s margins in consoles are low even now).

      AMD and IBM are much more willing to license things out and do custom work.

      And I don’t know that a Core 2, even a quad, would be more potent; per-core yes, but not in total. Jaguar, core for core, clock for clock, was about half as performant as an Ivy Bridge i5, so if six of them are used well they could probably edge out a Core 2 Quad, and then if you add in the use of modern advanced SIMD like AVX, the gap widens.

    • GTVic
    • 5 years ago

    Maybe the higher level AI should be offloaded to the cloud, it shouldn’t require instant response πŸ™‚ Or if the XBONE detects a PS4 or a PC on the network it could offload the AI processing there.

      • sweatshopking
      • 5 years ago

      Only the xbone supports cloud computing.

    • odizzido
    • 5 years ago

    “We tried to power through our poor optimizations with CPU power, but those netbook CPUs just can’t do it :(” says the dev team after Ubisoft gave them no time to work on the game.

    • HisDivineOrder
    • 5 years ago

    Every year, it’s the same refrain.

    The PC version is coming out day and date with the console releases. It never happens with a mainline AC game.

    I don’t think it’ll happen this year either.

    As for the consoles having low-end CPUs, well yes. They have low-end CPUs. We know this. They’re based on AMD’s netbook platform. What do you expect? Something better than a netbook APU with twice the CPU cores?

    You can paste together three Jaguar CPUs and you’d still have a netbook CPU underneath it all running the show. Picking the weakest AMD core they could find was probably not the best course if they wanted high-end performance...

      • Pwnstar
      • 5 years ago

      Exactly.

      • ronch
      • 5 years ago

      Those are weak cores, alright, and they hoped that throwing more of them in there would just pass the problem to developers to extract max performance out of 8 weak cores for the sake of keeping hardware costs down. Copy-pasting cores is much, much easier and cheaper to do than developing faster cores. And why not? MS and Sony are after maximum profit from each console sold and royalty fees from devs.

    • torquer
    • 5 years ago

    Wii U 4 Life

    • Chrispy_
    • 5 years ago

    This stinks of Ubisoft making excuses for terrible coding.

    None of their statements correlate with the facts, they’re just blaming their tools like all bad workmen.

    • wof
    • 5 years ago

    At 100Hz they have 128,000,000 machine cycles available per frame on a two-way, out-of-order, superscalar, SIMD-capable CPU.

    My guess is that 99% of those cycles go to untangling bad abstraction layers, running messy code and waiting on memory stalls due to bad memory layout.

    I blame it on the code and I bet there are coders on their team who can fix it if they are given enough time.
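
    (Sanity-checking that number, assuming all eight Jaguar cores at 1.6GHz and a hypothetical 100 fps target:)

    \[
    \frac{8 \times 1.6 \times 10^{9}\ \text{cycles/s}}{100\ \text{frames/s}} = 1.28 \times 10^{8}\ \text{cycles per frame}
    \]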

    • GasBandit
    • 5 years ago

    “MORE CORES!” -AMD

      • Pwnstar
      • 5 years ago

      MOAR COARS

    • Airmantharp
    • 5 years ago

    To me, this sounds like very good news.

    It means that developers are really pushing to make use of the new consoles, particularly when it comes to using all of those ‘wussy’ cores, and it means to me that they’ll be making better use of desktop CPU cores by default rather than attempting to half-ass some sort of multi-threading for their ports after the fact.

    Means better games on the PC!

      • Arclight
      • 5 years ago

      You overestimate Ubisoft and co. Maybe their engine is so bad it barely works on desktop so no wonder it breaks down on the console port.

    • blastdoor
    • 5 years ago

    It is amazing that both Sony and Microsoft thought having 8 wussy cores is better than having 2 kick-butt cores.

      • shank15217
      • 5 years ago

      At that time their alternative was a gen-1 Atom or a Nehalem Celeron, so gimme a break. None of Intel’s desktop CPUs developed at the time these consoles were engineered had that level of performance per watt. It takes time to validate a console platform, not to mention engineer it.

        • blastdoor
        • 5 years ago

        Xbone came out in the fall of 2013. Ivy Bridge came out in 2012.

        Microsoft and Sony went for cheap over good. But since they both did the same thing, neither will pay a price for it. In fact, it’s probably the most profitable thing they could have done?

          • Narishma
          • 5 years ago

          Intel isn’t an option in consoles if you want to sell them at a reasonable price and not lose a ton of money.

          • khands
          • 5 years ago

          Ivy wasn’t anywhere near cost-effective for a console. Show me a $25 Ivy Bridge part in bulk from then, please.

            • blastdoor
            • 5 years ago

            1. Why does it have to be a $25 CPU? Why not $50?
            2. I doubt Intel would have charged Sony and Microsoft retail prices.
            3. No doubt Sony and MS both seem to be doing fine, so maybe they both made the right call. But I think it would be interesting if one of them had decided to go with a much more powerful CPU and a console that cost $50 more. But we will never know.

            • khands
            • 5 years ago

            Because there are no margins for consoles at launch; they’re already making pennies, if not losing cash, when the consoles launch. It’s only after the first year that consoles generally start making money per console sold. That’s a big reason why the last generation lasted so long: the 360 and, much more so, the PS3 took a long time to get their ROI.

          • Pwnstar
          • 5 years ago

          Yup.

        • Pwnstar
        • 5 years ago

        They could have used a better AMD chip.

      • shiznit
      • 5 years ago

      6 wussy cores

    • ronch
    • 5 years ago

    MS and Sony both deserve some flak for cheaping out on the CPU cores. They could’ve chosen IBM again, or Intel (and went for Nvidia graphics, perhaps). As for AMD, does anyone think they’d sell them cheap if they had better perf/watt? If Jaguar had, say, 30% higher per-core grunt you can bet AMD would price accordingly, giving MS and Sony less incentive to choose AMD. As it happened, AMD sold ’em cheap, and MS and Sony took the offer. Of course, part of the reason was also because AMD is a one-stop chip shop, but still, if AMD had faster cores to offer at the same price then devs would have less reason to gripe.

    And I don’t think MS and Sony had to really desperately keep the BOM costs down too. Those AMD SoCs probably contain pretty much all the systems’ core components, and the rest are just peripherals (drives, wireless controllers, etc.). Given how AMD looks like they’re selling them a dime a dozen, I would think MS and Sony both still have a fair bit of wiggle room.

      • daviejambo
      • 5 years ago

      Not sure why you are blaming AMD for this; nobody else would have made the chips for Sony and Microsoft at the price they were willing to pay.

      They (Sony and MS) could have put a better chip in their consoles but it would of course cost more money.

        • ronch
        • 5 years ago

        Not blaming AMD. Putting more blame on MS and Sony for choosing AMD. Just explaining AMD’s position on the whole matter.

        It’s the same thing with someone who’s building a gaming rig. You can buy a Core i7 if you have the money, but if you wanna pay less you go with AMD. AMD isn’t necessarily bad, but the fact is, they’re selling slower CPUs for less. Could they, say, sell faster CPUs at the same price? Sure, if they had faster cores.

          • shank15217
          • 5 years ago

          Oh, so are you saying that the PS3/Xbox 360 CPUs were amazing when they came out? I think you have quite the memory if you believe that was the case. The Cell CPU was very difficult to program for and the three-core Xbox 360 CPU was anemic almost immediately after it came out. Ubisoft is making excuses as they always do, and you’re AMD-bashing.

      • the
      • 5 years ago

      There was a bunch of drama between Sony and MS on using IBM for their respective consoles. Essentially the Xbox 360’s CPU was underwritten by Sony’s relationship with IBM for Cell. The other factor is that IBM was rumored to be charging a high premium for any custom design for their successors.

      While not directly related to IBM, managerial incompetence also affected Sony’s decision to look elsewhere. Their dream for Cell was so vast that Sony bought licensing for IBM’s fab technology to build their own foundry. This cost Sony billions; the company later sold the fab to Hitachi for roughly a quarter of the amount they spent. The executives who desired their own production capabilities, who were also the big proponents of IBM’s Cell design, were removed from the company.

      Both Sony and MS have some bad blood with nVidia as well. There was the lawsuit between MS and nVidia back during the original Xbox days about NV2A pricing and how it didn’t decrease over time. Sony went to nVidia for the RSX chip but it was initially more expensive than Sony would have liked: nVidia was using exotic laptop packaging to reduce board area. nVidia did over time revise their chip’s packaging and moved to GDDR3 on the motherboard which they could source independently.

      Intel and Sony were rumored to be in talks about getting Larrabee into the console market. Intel was willing to go for the razor thin profit margins on their part in an effort to gain market share and developer relationships against AMD and nVidia. This deal fell through as Intel cancelled all consumer versions of Larrabee. (Only select developers got original Larrabee hardware with the second generation chip being released as Xeon Phi.) MS had a sour relationship with Intel during the original Xbox days as well due to them keeping chip pricing high like nVidia.

      Thus the only company that hadn’t burned any bridges was AMD. MS was happy with the GPU design, including obtaining the core layout, which they could then modify on their own. (This is the reason why MS has had chip engineers on their payroll for years now, and they designed a good chunk of the SoC themselves.) Sony had no reference for working with AMD but no active reason not to.

    • zorg
    • 5 years ago

    This is the number one reason why the next-gen consoles have unified memory with platform atomics. If the CPU is not enough, then use the iGPU.
    If they have so many free iGPU resources, then a simple static partitioning can help a lot. Dedicating just one CU block for AI is nearly 0.5 TFLOPS of extra performance. Anybody who wrote an AI system for the PS3 should know how to do it.

    • maxxcool
    • 5 years ago

    Allow me to join SSK in the fire ::

    I #$%^ing told you AMD zealots that my Phenom X6 was a better CPU than this... so all those downvoters and haters from a year+ ago claiming this gimped APU was soooooo awesome... here you go:

    HAHAHAHAHA…

      • sweatshopking
      • 5 years ago

      Who argued that the phenom II x6 was slower?!?!? They’re idiots and that is clear information.

    • MadManOriginal
    • 5 years ago

    Have fun, NeelyCam!

      • NeelyCam
      • 5 years ago

      This one is difficult… on one hand, I’d cheerfully ridicule AMD’s Jaguar cores, but on the other hand, I wouldn’t want to say anything bad about consoles… Argh!

      Hmm.. maybe that dilemma is what you were referring to?

        • Narishma
        • 5 years ago

        On the bright side, I don’t think anybody will mind if you say bad things about Ubisoft.

    • DPete27
    • 5 years ago

    If they have so much GPU headroom, why not increase the resolution and/or make the visuals prettier? That way you might be able to hide the “dumb” AI. I’m not sure how many players would notice if all the townsfolk were running the same AI pattern as long as you give the pertinent characters unique actions.
    Like others have said, the statements by Ubisoft don’t add up. It seems that their “AI group” got a little greedy and hogged too many system resources from the “artistic group”, resulting in an unbalanced game.
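
    (A minimal sketch of that shared-crowd-AI idea, purely hypothetical and not how AnvilNext actually works: background NPCs reuse one behaviour that is evaluated once per frame, while key characters keep their own logic.)

    [code<]
    #include <vector>

    // One behaviour evaluated once per frame and reused by the whole crowd.
    struct CrowdBehaviour {
        int sharedWaypoint = 0;
        void tick(float /*dt*/) { sharedWaypoint = (sharedWaypoint + 1) % 8; }
    };

    struct Npc {
        bool isKeyCharacter = false;
        int waypoint = 0;
        void tickUnique(float /*dt*/) { waypoint = (waypoint + 3) % 8; }  // bespoke AI
    };

    void updateCrowd(std::vector<Npc>& npcs, CrowdBehaviour& crowd, float dt) {
        crowd.tick(dt);                               // paid once, not once per NPC
        for (auto& npc : npcs) {
            if (npc.isKeyCharacter)
                npc.tickUnique(dt);                   // pertinent characters stay unique
            else
                npc.waypoint = crowd.sharedWaypoint;  // everyone else just copies the result
        }
    }

    int main() {
        CrowdBehaviour crowd;
        std::vector<Npc> town(5000);
        town[0].isKeyCharacter = true;                // e.g. a named story NPC
        updateCrowd(town, crowd, 1.0f / 30.0f);
    }
    [/code<]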

      • MadManOriginal
      • 5 years ago

      The game looks spectacular, so I don’t think that’s the issue.

    • kamikaziechameleon
    • 5 years ago

    He describes the limitation as “frustrating” and says the game could be running at 100 FPS if it was “just graphics.”

    This would be a game engine limitation if I’m not mistaken, not a hardware limitation: connecting frame rate and resolution (GPU-driven) to the AI and other systems (CPU-driven).

    Anyone remember Supreme Commander 1??? The frame rate could stay high but the AI computations would slow down in larger games, causing units to get stupider and slower to respond to commands.

    I’m confident that the AMD CPU is underwhelming, as it’s an underwhelming component, but the explanation for the GPU rendering issues is rather silly.

      • exilon
      • 5 years ago

      Your dismissal of the explanation is not an informed one. The CPU has to drive the GPU. You cannot decouple the two.

      The difference is that these console APIs allow rendering threads to eat up multiple cores with low overhead. DX11 had the capability but it was rarely used because the extra threads would collide with simulation threads. So if the simulation logic was removed, the rendering thread would be free to use up all the resources. Unfortunately this would make a very bland game.

      You need to refresh yourself on SupCom as well:

      SupCom’s rendering was isolated on its own thread, and it had even more room in multiplayer when the game slowed down thanks to someone else’s toaster. In AI games, SupCom units didn’t get stupider, [b<]the whole simulation side slowed down[/b<].

      If you were the “toaster” of the game, it took the rendering side with it because the AI/pathfinding thread started clobbering the rendering thread. This made for a very frustrating RTS and is definitely not acceptable in an action game.

        • sschaem
        • 5 years ago

        The graphics call list is 100% identical whether you render at 1600×900 or 1920×1080.
        CPU load is 100% identical.
        And even on the GPU side the vertex shader load can be identical.
        The difference is all in the shader units & ROPs, about +30%.

        If you can render the game at 100 FPS graphically but you frame-limit to 30 FPS,
        that’s roughly a 3x drop in GPU requirement.

        The fact that they can only do 900p and not 1080p is a red flag that they are lying.

        Or, if you can render at 100FPS and limit to 30FPS, you have 70% of the HSA GPU unused.
        What could Ubi do with 70% of a GCN2+ HW with unified HSA and EDR or high-speed GDDR5?

        And at a minimum, it seems like a total waste not to run the game at 1080p if you don’t even use 30% of the GPU.

          • exilon
          • 5 years ago

          Why do you think the console GPUs can do 1080p at 30 fps for this game consistently? There are GPU bottlenecked scenes and CPU bottlenecked scenes in any game. Consoles shoot for 30 fps, so the render target is based on both cases.

        • kamikaziechameleon
        • 5 years ago

        Acceptable? No, it was not playable to be sure, but the point I made was that I could see CPU-intensive slowdown without rendering issues.

        I’m fairly confident in my assertion about the rendering resolution not taxing the CPU any more but I leave that up to others like sschaem to comment on as I’m not that intimately familiar.

        I’m still confused how the 8-core processor can be maxed out so quickly, as this should have a lot of headroom for them; maybe the per-thread yield on this AMD architecture is poorer than expected. I bet AMD won on the price side of this proposal and not the performance-per-watt side; a Xeon-inspired design should be able to clobber this AMD CPU at 8 cores, could have Hyper-Threading adding another bump on there, and still come in at a similar or better thermal rating.

    • Voldenuit
    • 5 years ago

    This is Ubisoft. Where creating a female playable character is “too hard”.

    I’d take anything they say with an asteroid-sized grain of salt.

      • willmore
      • 5 years ago

      If only that asteroid could be steered to a precise location in Montreal…..

    • Techgoudy
    • 5 years ago

    [quote=””<]According to Pontbriand, Unity is more bottlenecked by the CPU, which “has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.” [/quote<]

    This is a crap excuse imo for the Xbox One. I say that because developers have access to Microsoft’s cloud infrastructure, Azure, which would alleviate this issue for the Xbox One version of this game. All of those listed above were implemented in Titanfall using the power of Azure datacenters. Don’t try to blame this issue on the console, but rather the developers for not thinking this all the way through.

    [url=http://mp1st.com/2014/03/16/titanall-respawn-engineer-xbox-live-compute-internet-sceptical-real/#.VDPrtPldUk0<]Microsoft Azure powering Titanfall AI[/url<]

    On the other hand though, the consoles could use a little bit more power.

      • kamikaziechameleon
      • 5 years ago

      That would make the SP game online only.

        • Techgoudy
        • 5 years ago

        You are right, it isn’t fair for me to assume that people have the internet. 🙁

          • dragontamer5788
          • 5 years ago

          Seriously though, a number of military guys were complaining about the XBox One “online only” features. Something about terrible internet on nuclear submarines… and Afghanistan.

          “Online Only” means giving the middle finger to a fairly large portion of the population. It’s why the Xbox One-Eighty was somewhat successful.

    • EzioAs
    • 5 years ago

    As much as I’d like to bash and trash the PC version, I’ll hold out until it’s actually released.

    As for the consoles, this was expected.

    • gmskking
    • 5 years ago

    When the Xbox One hits $299.00 I might think about getting it. That’s about what it’s worth.

      • jessterman21
      • 5 years ago

      $299 with Halo:MCC and an extra controller bundled.

    • Generic
    • 5 years ago

    “Assassin’s Creed Unity is coming to the PC, and that version shouldn’t be locked to a specific resolution and frame rate. However, Pontbriand doesn’t say whether the AI and other systems will exploit the more powerful CPUs available on typical PCs. That outcome seems unlikely, especially since Ubisoft is optimistic about releasing Unity simultaneously for the PC and consoles.”

    That’s a shame, I may have actually paid full retail price for an AC game with excellent AI.

    The PC platform is once again hamstrung by its slow cousins, but it keeps my wallet fatter in the long run. So there’s that.

    • tbone8ty
    • 5 years ago

    Uuuuubersoft

    • who_me
    • 5 years ago

    Such CPU. Very next-gen. WoW!

    • guardianl
    • 5 years ago

    “a tenfold improvement over everything AI-wise,” Pontbriand said, but they were “quickly bottlenecked.” He describes the limitation as “frustrating” and says the game could be running at 100 FPS if it was “just graphics.”

    Resolution increases are effectively “free” for the CPU (minus the shared memory bandwidth for these SoC designs), so this statement is suspicious. Also, it’s possible (common) to run A.I. updates not in sync with frame rendering, so again, even just being limited to 30 FPS “because of AI” is odd.

    Ubisoft has their own in-house engine (AnvilNext) for this game, so it doesn’t necessarily mean much for other next-gen games.
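
    (For what it’s worth, a minimal sketch of that decoupling, purely illustrative and not Ubisoft’s loop: the AI ticks at a fixed 10Hz no matter how fast frames are rendered.)

    [code<]
    #include <chrono>

    // Stand-ins for the real work, defined empty so the sketch compiles.
    static void updateAi(double /*dt*/) { /* pathfinding, NPC decisions, ... */ }
    static void renderFrame()           { /* draw-call submission, ... */ }

    int main() {
        using clk = std::chrono::steady_clock;
        constexpr double aiStep = 1.0 / 10.0;        // AI ticks at a fixed 10 Hz
        double accumulator = 0.0;
        auto previous = clk::now();

        for (int frame = 0; frame < 1000; ++frame) { // bounded for the sketch
            auto now = clk::now();
            accumulator += std::chrono::duration<double>(now - previous).count();
            previous = now;

            // Run however many fixed AI steps have come due; rendering below is
            // free to run at 30, 60, or 100 fps independently of the AI rate.
            while (accumulator >= aiStep) {
                updateAi(aiStep);
                accumulator -= aiStep;
            }
            renderFrame();
        }
    }
    [/code<]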

      • Laykun
      • 5 years ago

      With first-party titles like this there are certain budgets on CPU and GPU that are allocated to their respective sub-systems. I get the feeling they needed a higher budget for AI to get the results they were happy with while sacrificing whatever CPU budget they had for serialising commands for the GPU. 30fps is highly acceptable on consoles.

      Their reason for downing the resolution doesn’t make any sense though, if what they say about 100fps on the GPU is true (perhaps they mean without any characters on screen :P).

        • Narishma
        • 5 years ago

        This game isn’t first party.

      • Zizy
      • 5 years ago

      Another possibility – they have AI draining almost all CPU resources and the CPU cannot process enough draw calls and prepare textures to render at a higher resolution and framerate.

      Simpler explanation – the Xbox has a small CPU advantage and had to suffer because of the PS4, while the PS4 has a large GPU advantage and had to suffer because of the Xbox, so Ubi does not piss off anyone. Well, except PC gamers. But we don’t matter, as we don’t have a large company behind us 🙂

    • sweatshopking
    • 5 years ago

    FOR THE RECORD, SINCE THERE SEEMS TO BE CONFUSION:
    THE XBONE HAS A FASTER CPU THAN THE PS4. THE SYSTEM HOLDING IT BACK IS NOT THE XBONE IT IS THE PS4.

    ALSO, XBONE IS GETTING DX12, WHICH WILL LOWER CPU USAGE, AND THE PS4 IS NOT.
    NOT SAYING BOTH CPUS DON’T SUCK, CAUSE THEY DO.

    [url<]http://www.extremetech.com/gaming/156273-xbox-720-vs-ps4-vs-pc-how-the-hardware-specs-compar[/url<]

    Down voting for FACTS?!?!? NOT ON TR?!?!?

      • tanker27
      • 5 years ago

      Caps lock is cruise control for the cool.

        • sweatshopking
        • 5 years ago

        THESE SUCKAFOOLS AINT UNDERSTANDING BEING A FLY DADDY.

          • tanker27
          • 5 years ago

          Regardless, you know how it is around here. People either vote with their ‘feelings’ or respond as much.

          • kamikaziechameleon
          • 5 years ago

          The CPU diff is NEGLIGIBLE!!!!!!!!!!!!

            • sweatshopking
            • 5 years ago

            MAYBE, BUT IT STILL EXISTS!!!!

            COME ON GUYS. NOBODY LIKES JOKES THAT ARE IN POOR TASTE. DO BETTER NEXT TIME.

            • tanker27
            • 5 years ago

            Seriously SSK was that even necessary?

            • kamikaziechameleon
            • 5 years ago

            I THOUGHT IT WAS FUNNY!

      • maxxcool
      • 5 years ago

      Careful, AMD Stockholm syndrome zealots can be dangerous …

      • l33t-g4m3r
      • 5 years ago

      Basic logic shows Vincent is both a liar, and a cheat. He’s lying about the real bottleneck, and cheating ps4 owners out of increased resolution.

      If there is any actual problem with the AI, it’s their fault for not optimizing more for the hardware. Games have had sufficient cpu for AI since last gen. Hell, even HL1 had reasonably good AI, and that game could run on a 400mhz single core cpu with 100mhz sdr. If you can’t get your AI to work here, it’s your own fault.

      Also, resolution limitations on the xbone are purely optimization issues. They could easily tone down effects to gain higher resolution. Apparently that option’s not on the table, and they’re making ps4 users suffer to equalize the playing field with the xbone.

      If Carmack was still making games, he’d make something that would blow everyone else out of the water with full HD resolution and 60fps. Ubisoft just can’t make good games, and that can be interpreted in more ways than one.

        • sweatshopking
        • 5 years ago

        That may well be the case, but given the excuse he gives, the ps4 would run slower, not the xbone.

          • cobalt
          • 5 years ago

          Eh? You’re saying the PS4 has a lower CPU clockspeed, and this would limit their rendering resolution? That’s unlikely. It might make sense if he was talking about a framerate cap, but the whole discussion was about a resolution cap.

            • sweatshopking
            • 5 years ago

            No. He is saying they are CPU-bound for AI and assets. I don’t think it has anything to do with res either. I took it that AI and unit numbers are what’s bound.

            • l33t-g4m3r
            • 5 years ago

            Bunch of crap. It’s a multicore cpu, so use the other cpu’s. 30 fps is arbitrary. 1600×900 is arbitrary. Vincent is a liar.

            The problem is 100% optimization, or lack of it.

            • sweatshopking
            • 5 years ago

            I don’t necessarily disagree. I am hesitant to call a man a liar though.

      • jessterman21
      • 5 years ago

      Down voting because SSK

      πŸ˜›

      • zorg
      • 5 years ago

      This article is full of bullshit.

      The PS4 has used the GNM API from day zero. It is a very forward-looking, low-level, high-performance API. Writing GPU commands into the command buffer takes just a few CPU cycles. Mantle is the only API that keeps up with this speed. The Xbox One uses a special Direct3D API, based on Direct3D 11 with low-level extensions. A year ago, writing GPU commands into the command buffer was nearly a million times slower than GNM and Mantle. Now the new XDKs allow much more speed and much more control, but only Direct3D 12 will have comparable speed.

        • sweatshopking
        • 5 years ago

        WHO KNOWS, ZORG.

      • HisDivineOrder
      • 5 years ago

      Dude.

      That’s based on old info from launch era 2013. The PS4 CPU can be clocked higher by applications. Not only that, but you are ignoring the very real implications of Microsoft’s choice to go with a highly niche, hard to fully realize caching system with low-speed DDR3L in lieu of straight up and conventional high speed GDDR5.

      Face it. The Xbox One is holding all of gaming back by its mere existence. It’s the lowest common denominator. It is literally the shackles that keep gaming from doing so much more than it is. At least last generation, you could ARGUE that Cell offered more performance potential in scenarios where it was used properly. With the Xbox One, you can’t even make that claim. At best, it runs as well as a PS4 and at worst, it runs considerably worse.

      Thank you, Microsoft. We all love Kinect so much it was totally worthwhile hobbling consoles for another 10 years with a crappy memory subsystem and sub-$200 GPU (even at launch). That is why they did it, you know.

      For Kinect. To get it in the box and have the system “only” cost $500. At least Sony understood that gamers want one thing from their hardware: gaming hardware. No stupid Kinect peripherals whose primary feature that most use is Voice Command that could have been done with a $5 boom mic or via the headset mic they were forced to include after people whined about it being missing.

      How many people used Kinect for only Voice Command versus used Gestures, too, I wonder? Enough to hobble the whole console (in terms of hardware, in terms of port quality, and in terms of sales)?

        • sweatshopking
        • 5 years ago

        Do you have up to date top speeds for the ps4?? Cause it sure can’t go to a bizzilion ghz. Numbers or the information isn’t useful.

        Nobody is ignoring anything, and your post is crazy long and off topic.

        • Meadows
        • 5 years ago

        “Keeps gaming from doing [b<][i<]so much more[/i<][/b<] than it is"? Please.

      • Chrispy_
      • 5 years ago

      I thought the XBone CPU was 1.75GHz but only six cores because two were reserved for the secondary OS.

      The PS4 is the same CPU hardware at 1.6GHz but all eight are available.

      Unless I’m missing something, the XBone has the bottlenecking CPU of the two for devs, and it has slower DDR3 system RAM too.

    • ronch
    • 5 years ago

    I don’t remember console devs complaining about the Xbox 360 when it came out (the PS3’s questionable choice of CPU setup is/was another matter though). While AMD sure gained some extra points for bagging all three current-gen consoles CPU/GPU-wise (or just the GPU with the Wii U), all this complaining about those Jaguar cores holding the consoles back is a bit embarrassing, isn’t it?

    Seems to me these consoles would be fine if devs could utilize all available cores (some are reserved for the OS and other services, I reckon), but it looks like things aren’t going as easy as they had hoped. Would going for two Piledriver modules running at, say, 3.0GHz be the better route?

      • Rza79
      • 5 years ago

      I was thinking exactly the same. Two Piledriver modules would have made a whole lot more sense and wouldn’t have been too power hungry at 3Ghz.
      Or Sony/MS thought too much of Jaguar and thought multi-threaded programming could solve everything. Or they were just too cheap to cough up the dough for Piledriver.

      BTW, on Xbone, games have access to 6 cores and 5GB of RAM, afaik.
      Also it’s more like two quad cores. So two of those six cores are not sharing cache.

      On topic, one thing doesn’t make sense to me though. If the CPU is the bottleneck and if it were just graphics they could pump out 100fps, why not go for 1080 at 30fps?

        • willmore
        • 5 years ago

        Or you could just realize that they don’t *have* a Piledriver core synthesized for that manufacturing process, so they used a core that they had which was.

        Sure, they could have thrown in a ton of NRE to port some larger processor over to that process and hope to make the money up in licensing?

        They’re not a charity. If Sony or MS wanted more/different CPU, they would have had to pay for it.

          • ish718
          • 5 years ago

          Yes, ultimately it came down to corporate greed. Since they were targeting a certain price point, using a stronger CPU would have cut their margins.

            • Narishma
            • 5 years ago

            They don’t have any margins to begin with. The consoles are typically initially sold at a loss.

            • ish718
            • 5 years ago

            Yes, true but you want to minimize loss so it won’t take as long to start profiting on console sales.
            Sony started profiting on the PS4 7 months after release…

            [url<]http://www.geek.com/games/sony-says-the-ps4-hardware-is-already-profitable-1594697/[/url<]

            • w76
            • 5 years ago

            There are different strategies you can take... like never making a dollar, ever, on console sales and getting it back in game licenses. Anyway, on the whole hardware thing, one man’s “corporate greed” is another man’s cost-benefit analysis.

            • chuckula
            • 5 years ago

            In AMD’s defense, so-called “corporate greed” on their part means: “We aren’t intentionally going to Kamikaze our entire company just so that Microsoft/Sony/Game companies can get rich.” AMD needed to design a chip that could be produced while actually making money for AMD, and making chips for consoles has *not* been a historically lucrative niche to occupy.

            • willmore
            • 5 years ago

            Okay, I was critical of the idea that some people seem to have a strange concept of self-entitlement, but I’m starting to see the validity of the argument.

      • Zizy
      • 5 years ago

      That PD configuration would be slower, would have worse perf/W (which is the limiting factor here), and couldn’t be fabbed everywhere. And it probably wouldn’t help here, as (non-strategy) AI is rarely limited by single-thread performance.
      Sure, AMD could create another PD version for TSMC, but this costs money (and still doesn’t fix the perf/W issue) 🙂 If cost was not a concern I think the consoles would have an Intel+NV combo anyway.

        • Airmantharp
        • 5 years ago

        Intel + Nvidia was the only other technically feasible alternative at the time that the consoles were developed, but would have been far more expensive as the first version would likely have to have been a multi-chip solution, aside from the fact that both Intel and Nvidia would probably have wanted to charge as much or more individually as AMD charged for the whole part.

        • Rza79
        • 5 years ago

        Can’t be fabbed everywhere? True.
        Slower? Definitely not true.

        [url<]http://www.anandtech.com/show/8067/amd-am1-kabini-part-2-athlon-53505150-and-sempron-38502650-tested/9[/url<]

        I don’t think that adding two cores would overcome a 100% deficit. Two Piledriver modules at 3GHz would definitely be faster. The Piledriver config wouldn’t even use too much more power. I assume the 8-core Jaguar config uses somewhere around 35-40W during gaming. The Piledriver config would be in the 50-55W region. But as Willmore said, Piledriver isn’t ready for TSMC.

          • Zizy
          • 5 years ago

          Doubling cores, not adding 2. We are comparing 8C@1.75 vs 2M@3 (or 6C @ 1.75 vs 1.5M@3). If OS takes 2 cores on Jaguar it is reasonable to assume it will reserve 1 PD core as well.
          Sure, single threaded performance will be much better for PD, but throughput will be slightly lower at worse perf/W.

      • w76
      • 5 years ago

      I’m no fan of AMD hardware by any stretch, but I don’t think it’s embarrassing. These things are meant to be affordable, and it appears developers ability to make use of processing power has simply outpaced vendors ability to chase Moore’s Law in the low-end. But to a degree, you’ll always be able to find someone who wished any piece of equipment had higher performance. I wish my 2600K had higher performance, but cost-benefit… I’ll be quite content to keep it until, probably, Skylake, unless Skylake gets delayed. Then Broadwell. But still. It’s a tradeoff for me, just like console hardware.

      • Airmantharp
      • 5 years ago

      They complained.

      They complained that the new-fangled triple-core multi-threaded in-order PowerPC CPU was only about twice as fast as the single-core out-of-order Celeron it replaced in the real world.

      They also complained heavily when the retail version of the console was cut down to 512MB of total RAM from the 1GB that was in the development consoles.

      In both cases, this new console generation is a much bigger leap than the last one was, compared to available hardware at the time and the preceding console generation.

        • ronch
        • 5 years ago

        Ok, so if they complained then I guess these devs keep complaining all the time! I say, give them a console with 32 Cortex A5 cores running at 500MHz!

          • Airmantharp
          • 5 years ago

          Honestly if they had another year or two, they could very well have been running ARM setups. Of course, the next generation very well might, and it might still be designed by AMD :).

          (but it could also be AMD x86 with GCN, Intel x86 with Intel graphics, Nvidia ARM with whatever replaces Maxwell, Samsung ARM with off-the-shelf graphics, or Qualcomm, or…)

            • ronch
            • 5 years ago

            If AMD plays nice during this console generation I would expect MS and Sony to stick with AMD. Whether they use Zen or K12 would be anyone’s guess, but if Zen shapes up really well and would be economically viable it would obviously be the better choice, as backwards compatibility with current-gen games would definitely be a huge selling point for anyone choosing between a PS5 or Xbox Two.

      • Pwnstar
      • 5 years ago

      Not really AMD’s fault, as they produce the chip that Sony and Microsoft want. They make better chips, but Sony/Microsoft didn’t want them.

    • Jigar
    • 5 years ago

    [b<]Seems like this generation of consoles has enough pixel-pushing prowess for prettier visuals but not enough CPU power for richer gameplay[/b<]

    Mantle could have helped, but it’s a no-go on the Xbone.

      • Laykun
      • 5 years ago

      These consoles have their own low-level graphics API that’s much closer to the metal than Mantle. Mantle is no help here.

      • Narishma
      • 5 years ago

      Mantle is for PC. Consoles have always had low level (even lower than Mantle) APIs.

    • I.S.T.
    • 5 years ago

    [url<]http://www.eurogamer.net/articles/2014-10-07-ubisoft-defends-assassins-creed-unity-graphics-lock-for-parity-on-ps4-xbox-one[/url<]

    In short: foot in mouth syndrome lols.

      • wierdo
      • 5 years ago

      [quote<]“We’re a bit puzzled by the ACU situation and had a chat about it amongst ourselves. Internal bandwidth is shared between CPU and GPU on both machines which can result in a battle for resources, and if ACU is as CPU-heavy as Ubisoft says it is, that could potentially have ramifications - the only problem with this theory is that there’s very little evidence that other titles have experienced the same issue, certainly not judged by Ubisoft’s own output.[/quote<]

      Ok, so it’s not the CPUs but rather an interconnect limitation? Sounds like they ran out of bandwidth with the CPU and GPU sharing resources here. I guess that explains why the resolution was bumped down; I was confused about that for a minute.

        • cobalt
        • 5 years ago

        Thanks for the quote. (For those that didn’t read the Eurogamer article, that quote is from Digital Foundry editor Rich Leadbetter.)

        I, too, was surprised that an [i<]increase in resolution[/i<] could be CPU-bound. As we see here in benchmarks all the time, for instance, higher resolutions are more GPU-bound, so CPU prowess should have little impact on the ability to increase your resolution.

        Bandwidth limitations might make sense, EXCEPT: doesn’t the PS4’s GDDR5 have far more bandwidth than the XB1?

        Anyway, I suppose it’s possible that if they’re only at 30fps, they’re already riding a very fine line, and any increase of any sort on either platform bumps down the playability too much. I’ll be interested to see if the 900p cap stands.....

    • I.S.T.
    • 5 years ago

    “to avoid all the debates and stuff.”

    Seems like they handicapped one version because another couldn’t keep up. The same thing happened last gen as well a few times, only it was Sony’s machine being the one that couldn’t keep up…

      • bfar
      • 5 years ago

      Yup, seems like a lazy reason to cap both platforms.

      • internetsandman
      • 5 years ago

      That happens every single time a console game comes to the PC. Ubisoft had a particularly notable example with how they hobbled Watch Dogs on the PC (and even then certain aspects of the game were choppy regardless of your hardware)

      • derFunkenstein
      • 5 years ago

      well if it’s really the CPU, then maybe it’s still the PS4 that can’t keep up. MS boosted clocks across the board when they found out how much faster the PS4 APU was, including a CPU bump of ~9%

      [url<]http://www.eurogamer.net/articles/digitalfoundry-xbox-one-cpu-speed-increased-in-production-hardware[/url<]

    • Maff
    • 5 years ago

    Right from the start, it should’ve been clear that heavy use of physics and AI should be offloaded to the GPU. It might not be what game developers are used to doing, but it is what PS4 devs said in their presentations.
    They didn’t give the GPU all this compute and scheduling ability for it to be used as just a graphics accelerator.
    Whether or not the SoC as a whole is powerful enough is another thing, ofc.

      • Ninjitsu
      • 5 years ago

      Unless AI is massively parallel, I’m not sure they can do that.

      • Laykun
      • 5 years ago

      Run the whole game on the GPU, problem solved.

    • TwoEars
    • 5 years ago

    It’s hard to program physics and AI for 6 slow cores instead of 1 fast one. You can often make it work if you have at least 1 fast core that can “run the show” and then slower “support cores” around it. But if they’re all slow you’re in trouble.
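
    (Illustrative only, with made-up subsystem names: the “one fast core runs the show” shape usually ends up as a coordinator thread that keeps the serial, branchy logic and fans the independent chunks out to the support cores each frame.)

    [code<]
    #include <future>

    // Hypothetical per-frame subsystems; real ones are far more entangled.
    static int stepAi(float /*dt*/)         { return 0; }
    static int stepPhysics(float /*dt*/)    { return 0; }
    static void stepScripting(float /*dt*/) { /* serial game logic */ }

    void frame(float dt) {
        // Fan the independent work out to the "support" cores...
        auto ai = std::async(std::launch::async, stepAi, dt);
        auto physics = std::async(std::launch::async, stepPhysics, dt);

        // ...while the "show-runner" core keeps the serial, branchy logic.
        stepScripting(dt);

        // Join before anything that depends on the results runs.
        ai.get();
        physics.get();
    }

    int main() {
        for (int i = 0; i < 100; ++i)
            frame(1.0f / 30.0f);
    }
    [/code<]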

    • chuckula
    • 5 years ago

    [quote<]According to Pontbriand, Unity is more bottlenecked by the CPU, which "has to process the AI, the number of NPCs we have on screen, all these systems running in parallel."[/quote<] But but but... MANTLE AND HSA!!

    • SoM
    • 5 years ago

    omg, the PC is dead :rolls eyeball:

      • Price0331
      • 5 years ago

      Rolling a single eyeball is a bit… unsettling.

        • Grigory
        • 5 years ago

        What if he is a pirate?

          • ronch
          • 5 years ago

          A software pirate.

          • superjawes
          • 5 years ago

          I just assumed he had a glass eye…

          • Nike
          • 5 years ago

          Or a cyclops?

    • geekl33tgamer
    • 5 years ago

    Doesn’t run properly? This isn’t hardware limits – It’s Ubisoft’s standard approach to “optimization”.

      • geekl33tgamer
      • 5 years ago

      Stop with the downvotes. I was taking the P after the Watch Dogs game on PC. We all know how well that runs.

    • drfish
    • 5 years ago

    Shocker! *rolls eyes*
