Deus Ex: Mankind Divided won’t support DX12 at launch

You may be thinking "Whiskey Tango Foxtrot!?" after reading that headline, but yes, it's true. Deus Ex: Mankind Divided, a title meant to be among the "first wave" of DX12-enabled games, won't actually support the new API on the game's August 23 release day.

This move comes as a surprise, to say the least. Last year, Square Enix made a point of showing off a video of the game's "Dawn Engine" that highlighted plenty of fancy graphical effects. The publisher went on to say specifically that Mankind Divided would be infused with Microsoft's new graphics API at launch. AMD has also touted the game's DirectX 12 support in its own promotional efforts.

Fans looking forward to enjoying DX:MD's DX12 renderer will have to wait a while, though. In a Steam Community post, the Deus Ex team revealed that it needed some time for "extra work and optimizations" on the DirectX 12 front. On a more positive note, it seems the team needs only two weeks, as it says an updated version with DirectX 12 support should arrive around September 5.

Deus Ex: Mankind Divided has already suffered a six-month delay, as it was originally set to be released this February. A late-but-polished release is always better than a rushed one, though, so we can only hope the extra time means we'll get a solid, optimized DX12 implementation when the patch finally lands.

Comments closed
    • Kaleid
    • 3 years ago

    Works fine with 4670k, 16GB, rx480 8GB in DX11. Has some extremely sharp textures, uses about 5.5GB VRAM.

    • LostCat
    • 3 years ago

    Ooo. First game I know wanting more than 4GB for a lot of the options. Guess those 1060s and 480s will keep movin.

    • Laykun
    • 3 years ago

    Unless you have a below-average CPU, this really shouldn’t matter. The DX11 implementation should run just fine with all the same visual features.

    • Krogoth
    • 3 years ago

    Need DX12 augmentation canister a.k.a “Skullgun”.

    • cldmstrsn
    • 3 years ago

    since when is 2 weeks “a while”?

    • Chrispy_
    • 3 years ago

    Wait until the games that started development [i<]after[/i<] the major engines shift to DX12 come out. Right now, for example, Unreal 4.12 is still DX11, with DX12 being classed as "experimental support" so games being made [i<]now[/i<] that will come out in 2017 are still being made for DX11.

      • LostCat
      • 3 years ago

      As much as I used to love Unreal I can’t even remember all the titles using it with DX11 having major issues…or Unreal titles that didn’t even bother with a DX11 renderer because of that…

        • Airmantharp
        • 3 years ago

        Not that I would dispute your claim, but would you consider those issues ‘misuses’ of the engine? Every game does try to differentiate itself by doing something ‘different’, right? And plenty manage to screw things up along the way.

          • LostCat
          • 3 years ago

          Smoke+fire…It was common enough to make me suspicious. I’m sure UE4 is good tho.

      • Laykun
      • 3 years ago

      Generally, with Unreal Engine 4, unless you make some pretty fundamental changes to the renderer it should be incredibly simple to go from DX11 to DX12 since the content pipeline should be exactly the same. So games made now can easily support DX12 when it becomes stable in Unreal by simply keeping their engine version up to date (some studios do not do this, but some do, and most should).
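
      For anyone who wants to poke at the experimental path themselves, the switch is (if I remember right) just a launch flag in the 4.12 era, no content changes required. The executable name here is only an example:

          MyUE4Game.exe -dx12

      Launch without the flag and you get the default DX11 RHI; the same cooked assets run on either renderer.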

        • Chrispy_
        • 3 years ago

        Well, exactly, but since UE4 still doesn’t officially support DX12, don’t expect to see anything that leverages DX12’s advantages for some time.

    • exilon
    • 3 years ago

    Just saw the benchmarks on gamegpu…

    There’s no way they’re going to release a DX12 enabled version when it’s half the FPS on Nvidia and even slower on AMD.

      • LostCat
      • 3 years ago

      I highly doubt they have actual review code, though I didn’t run it through a translator. DX12 is most likely disabled for the reviewers as well as the final version.

    • Klimax
    • 3 years ago

    Surprise? Why? It should be clear by now that support for low-level APIs is very expensive and expansive. You have to have GPU-specific tweaks and a system for efficiently selecting them. And as a bonus, you might not even beat the baseline set by the driver in DX11. And the features you might like are available in DX11 too, so the decision is fairly simple…

    ETA: And no matter what you do, the DX12/Vulkan code path will always be extremely fragile. And any and EVERY new GPU will require a new patch. Good luck with that…

    • TwoEars
    • 3 years ago

    It might just as well be DX9 for all I care. Just give me a good RPG experience and I’m happy.

    • Yan
    • 3 years ago

    No more “infused”, please. It’s starting to challenge “cloud” as a meaningless marketing buzzword.

      • Rectal Prolapse
      • 3 years ago

      Seconded. We need a paradigm shift in how these buzzwords are used!

        • RickyTick
        • 3 years ago

        We could use KPI’s to measure the deliverables.

        • UberGerbil
        • 3 years ago

        Buzzwords aren’t used, they’re actualized.

      • Meadows
      • 3 years ago

      Still not as bad as “disruptive”.

        • Stochastic
        • 3 years ago

        Eh, disruptive is overused (as is innovative), but it serves a useful purpose. A lot of Silicon Valley/tech companies truly are disruptive (Netflix, Amazon, Youtube, Spotify, Uber, Tesla).

        • Redocbew
        • 3 years ago

        “disruptive” is one of the worst so far.

      • Anonymous Coward
      • 3 years ago

      Tech people who don’t get “the cloud” are out of touch.

        • Krogoth
        • 3 years ago

        Cloud computing is a 2000s re-branding of terminal computing.

          • Anonymous Coward
          • 3 years ago

          Out of touch.

            • Krogoth
            • 3 years ago

            No, it is the technically correct answer.

            The only difference is that the network is the “internet” rather than some LAN/WAN intranet. You have a bunch of dumb terminals/computers tied to “servers/clusters/mainframes” that do the brunt of the work.

            • Anonymous Coward
            • 3 years ago

            Ah, so you are focused on the [i<]location[/i<] of the servers and clients as the defining feature of the cloud. If that were actually the interesting part of the cloud, then yes, it would be pretty boring, [i<]but that's not what is interesting about the cloud[/i<]. It's not even the impressive ability to create and destroy servers and storage on demand, in a vast array of configurations, and in response to events without human interaction.

            Once you see that you don't even need to run servers... or administer databases... then you start to see what is going on. The whole question of OS starts to fall by the wayside, for many purposes. A new world opens before you. It's just you, your objective, and an astonishing field of services and resources for rent.

            • Krogoth
            • 3 years ago

            All of that can and has been done with terminal computing.

            Cloud computing is terminal computing with a new brand name.

            • Anonymous Coward
            • 3 years ago

            What a bizarre thing to claim: that I could be doing the work I am doing right now, between checking this forum, on some mainframe. I not only laugh in the face of your claim that “it’s all been done before”, but it’s also insane to suggest that mainframes were accessible to all sizes of firms, down to one-man shops, the way a variety of cloud platforms are. One person with an idea and the cloud can make it happen, starting immediately.

            You need to admit you have no idea what you’re talking about.

            • Krogoth
            • 3 years ago

            They are both forms of centralized computing where your clients use dumb terminals that don’t have the resources and computing power to do complicated tasks/work. The servers, mainframes, and computing clusters would do the bulk of the computing work. Creating separate personal computers for each client that could do the complicated tasks/work would be cost-prohibitive. The dumb terminals were nothing more than overglorified I/O devices for the centralized computing resources. The only difference is the type of network involved.

            Cloud computing uses the bloody internet while older terminal computing typically ran on some kind of intranet LAN/WAN.

            You are confusing architecture with functionality and accessibility.

            • Anonymous Coward
            • 3 years ago

            You’re astonishing.

            Let me guess some other things you believe:
            1) GPUs not revolutionary because it can all be done with CPUs
            2) Cars not revolutionary because people already had horse-drawn wagons
            3) Radio not revolutionary because people could already yell
            4) Printing press not revolutionary because people could already write

            Really, I insist, the [i<]least remarkable[/i<] features of the cloud include:
            1) It's far away
            2) It has computers in it
            3) Thin devices can talk to those computers

            Notably, those are the things you keep talking about.

            Also, another thought for you (and the handful of downvoters): the cloud is not necessarily about saving money. Arguably it's more about [i<]saving time[/i<] and [i<]boosting flexibility[/i<].

            • Krogoth
            • 3 years ago

            Keep flying onto those clouds bud.

            Cloud computing is not revolutionary by any means. It is the old paradigm of “dumb” terminals slaved to mainframes/clusters, with a modern facelift.

            • Anonymous Coward
            • 3 years ago

            Well you can believe what you want, but for the people on TR who are open-minded and have so far thought that the cloud is a mere buzzword, I hope you pay a bit of attention to what I have said. The cloud is in a position to transform the professions of many on this forum.

            Some people will work with it, others will have their jobs taken by it. Probably the cloud is going to go full circle and come back to “mini clouds” in machine rooms, replacing VMware or mainframes or just jumbles of servers, with a software stack that has API compatibility with the “big clouds” out on the net. However, only large companies will be able to justify that investment.

            The cloud is way beyond mere piles of servers. The cloud is services, the cloud is flexibility, the cloud is automation and rapid implementation and best practices and redundancy, and it is low capital investment. Seriously, when you go around saying the cloud is a buzzword, you are just illustrating your own ignorance. Maybe that’s fun when you hang out with all the other ignorant boys, but it’s still ignorance, and the cloud could very well undermine your career, so pay attention.

            Seriously.

            • Krogoth
            • 3 years ago

            *sigh*

            Nothing has changed in the computing paradigm in the past 40 years. You just keep throwing out more marketing-speak nonsense that tries to polish up old computing concepts for the kiddies and ignorant masses.

            • Anonymous Coward
            • 3 years ago

            You haven’t changed in the past 40 years, computing has.

            • Krogoth
            • 3 years ago

            Computing itself has not fundamentally changed in the past 40 years. It has just become widespread and commercialized due to miniaturization of the hardware; paradoxically, software became more complicated and difficult to debug.

            • DrCR
            • 3 years ago

            Pure entertainment this thread

            • Anonymous Coward
            • 3 years ago

            OK, I sort of agree with that. Though higher-level abstraction and increased productivity have been significant, not to mention the types of problems which are approachable with the resources now available. I am not planning on writing C in vi on a green screen without the internet.

            • Beelzebubba9
            • 3 years ago

            You are correct. 🙂

            • Anonymous Coward
            • 3 years ago

            Yay, somebody supports me!

            • Beelzebubba9
            • 3 years ago

            I’m an Enterprise Cloud Architect in the Fortune 50 space who’s led/designed some of the largest-scale Azure projects in the world.

            Anonymous Coward is correct, and your opinion about the cloud is as lame as that of the dozens of people my designs have replaced.

            • Krogoth
            • 3 years ago

            Can’t accept the fact that “cloud computing” is just old tech with a new face?

            • Anonymous Coward
            • 3 years ago

            Well, it’s fundamentally based on computers which use transistors etched in silicon; also, the concept of “0” and “1” is holding on well, and data is still written and read, locally and remotely.

            But none of that is much of a victory to claim or an interesting thing to discuss.

            • Yan
            • 3 years ago

            Oh, so you’re the guy with no server, no database and no OS.

          • Yan
          • 3 years ago

          “Cloud” is just a way of saying “one or more servers”.

            • Anonymous Coward
            • 3 years ago

            As an answer to a test question, that might merit 0 points.

            • Wonders
            • 3 years ago

            Imma call you Anonymous Clouward from now on.

            • Anonymous Coward
            • 3 years ago

            I’ll print that on my coffee cup.

            • Beelzebubba9
            • 3 years ago

            No.

      • thermistor
      • 3 years ago

      See “Weird Al” Yankovic’s “Mission Statement” for the latest hotness in buzzwords.

    • yogibbear
    • 3 years ago

    Alternative article title “Deus Ex: Mankind Divided PC launch media coverage won’t be focused on DX12 launch issues”

    • sweatshopking
    • 3 years ago

    The entire problem with programming to the metal is it takes a ton of work. A ton of work means a ton of developers. A ton of developers means cost. It doesn’t matter whether it’s vulkan, dx12, whatever apple is using today, the reality is that it takes good developers and a ton of cash for dx12 to make a difference. Until the major engines have done 99.9% of the work built into the engine, dx12/vulkan/metal is a waste of time.

      • hansmuff
      • 3 years ago

      Exactly. Until they sell enough games *because of* DX12/Vulkan to make up for the extra dev cost, why bother? I was quite surprised DOOM did it; it runs very well on OpenGL already.

      • bill94el
      • 3 years ago

      Since I primarily just game, I’m yet to be convinced there is one single reason to move to Win10. If DX12 won’t be implemented consistently for years, I see no point, really. Now I understand the underhanded MS tactics to get users to move (off Win7) to Win10. Win7 is this gen’s XP.

        • sweatshopking
        • 3 years ago

        Win7 this gen’s XP? Sure. And just as silly this time too. 10 is objectively better. Ask The Tech Report. They’ve been recommending it for a year.

          • bill94el
          • 3 years ago

          Subjectively better. Again, it offers nothing I need at this point. Free beta testing has ended (to some extent), so maybe I’ll consider it by New Year’s. Quite content where I am for now.

            • psuedonymous
            • 3 years ago

            “Again, it offers nothing I need at this point.”

            If you consider continuing security updates as something you don’t need, you are part of everybody else’s problem.

            • jihadjoe
            • 3 years ago

            Security updates will be available for Windows 7 until Jan 14, 2020.

            • kvndoom
            • 3 years ago

            Yep… and something tells me it will extend beyond that. Either way I’m going to enjoy 7 for the next 3 or so years.

        • Klimax
        • 3 years ago

        Sorry, but you just don’t transplant WDDM 2.0 into an old OS…

          • bill94el
          • 3 years ago

          Well, you got me there. Guess I’ll be downloading Win10 tomorrow. …can’t do without WDDM 2.0. Forgot all about that. Darn. Now why would I need WDDM 2.0? Oh, to get DX12, which I can do without because it’s not being implemented. Sorry, guess I won’t be downloading. Don’t need, don’t want.

        • LostCat
        • 3 years ago

        Win7 doesn’t have the new input APIs, the improved audio system, or D3D11.1+/WDDM 2.0, obviously, so like it or not it makes for a subpar gaming OS. The XP reference only goes so far, since XP’s support was extended largely because of netbooks, and 7’s is not likely to be.

        Might not matter too much for the near future, but the uptake of DX12 is already far ahead of where DX11 was at a similar point, so we’ll see.

          • Voldenuit
          • 3 years ago

          Win7 is also less likely to break your controller support, webcams, and misc peripherals with every other update. I’m on 10, but I certainly don’t feel as if the 7ers are missing out on any must-haves.

          • I.S.T.
          • 3 years ago

          XP’s support was also extended because Vista was a flop.

            • Krogoth
            • 3 years ago

            Nah, it was because there was no killer app that required x64 on the SMB and mainstream market until recently.

      • Klimax
      • 3 years ago

      Trouble is, that work will have to be redone every time a new GPU is released. And unless you use a fully stock engine configuration, you will have to do that work yourself anyway. You thought Batman: Arkham Knight was bad? Transpose that wreck into DX12/Vulkan…

      • Kretschmer
      • 3 years ago

      Yeah, “making developers optimize for us” might sound great to cash-starved AMD, but it’s tough to realize with limited budgets. Especially when DX12 is X% of your market.

    • selfnoise
    • 3 years ago

    Has there been a game yet which really benefited from the unique qualities of DX12? I’m not sure about Ashes of the Singularity, but both Hitman and ROTR have DX12 versions that seem like they don’t serve any real purpose.

      • UberGerbil
      • 3 years ago

      Anything that has been released so far necessarily took a low-hanging-fruit approach, which won’t benefit much. Anything that was actually re-architected to take advantage of the unique qualities of DX12 is unlikely to be finished yet. As SSK says in another comment, the real benefit will come when the underlying engines are rebuilt to use the low-level APIs; at that point a lot of people can leverage the benefits without the huge investment in development time.

        • Klimax
        • 3 years ago

        Maybe. But it will be a fairly short-lived gain. (Until a new GPU is released.)

      • Voldenuit
      • 3 years ago

      The true benefit of DX12 is vendor lock-in for Microsoft. Hopefully Vulkan takes off as a cross-platform alternative, but I’m not holding my breath.

      • chuckula
      • 3 years ago

      Unlike older versions of these big API updates, there’s no easily definable graphical effect present in DX12 that you can’t do in DX11. For example, DX11 standardized the tessellation shader (less-standardized forms of tessellation had existed earlier), and that was a pretty big graphical effect that you got from DX11. There’s nothing like that in DX12.

      The idea for DX12 is that, if everything is done properly, you can do more of what you could already do with DX11 using the same hardware, because you can get the hardware to run more efficiently by having more explicit control along with reduced overhead for bookkeeping and accounting… of course, it has to be done right, and nobody said that’s easy.
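
      Roughly, the “explicit control” part looks like this in D3D12 (just a sketch, not anything from the Dawn Engine; g_device, g_queue, pso, rootSig, and viewport are placeholders for objects you’d have created during setup):

          #include <d3d12.h>
          #include <wrl/client.h>
          using Microsoft::WRL::ComPtr;

          // The app, not the driver, owns the memory behind command recording...
          ComPtr<ID3D12CommandAllocator> allocator;
          ComPtr<ID3D12GraphicsCommandList> cmdList;
          g_device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&allocator));
          g_device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, allocator.Get(), pso.Get(), IID_PPV_ARGS(&cmdList));

          // ...and decides exactly when work gets recorded and submitted.
          cmdList->SetGraphicsRootSignature(rootSig.Get());
          cmdList->RSSetViewports(1, &viewport);
          cmdList->DrawInstanced(3, 1, 0, 0);   // one triangle, for illustration
          cmdList->Close();

          ID3D12CommandList* lists[] = { cmdList.Get() };
          g_queue->ExecuteCommandLists(1, lists);

      In DX11 the driver does most of that bookkeeping behind each Draw call; in DX12 the app has to do it, which is where both the potential win and the extra work come from.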

        • Klimax
        • 3 years ago

        Also, it will be very expensive, and a continual expense to boot. New GPU? New patch and new work. And you still won’t likely beat the driver…

      • Klimax
      • 3 years ago

      There is nothing interesting in DX12/Vulkan. Async? Only if you have some unused resources on the GPU and the task won’t starve already-running code. And a few API-only features, which can be ported to DX11 at any time if Microsoft wants. Everything else is available in DX11.

        • Airmantharp
        • 3 years ago

        Graphical feature-wise, you’re probably right, at least to a great degree. But the basic premise that has most of us sold on DX12 etc. is the possibility of reduced CPU usage and the ability to spread that usage across multiple threads, and thus cores, in a way that DX11 doesn’t make possible.

        But as we’ve seen, even meeting that promise is going to be a challenge, both for the developers and for AMD and Nvidia.
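
        To make the multi-core idea concrete, here’s a rough sketch of the pattern DX12 allows (not taken from any shipping engine; g_device, g_queue, g_pso, and RecordDraws are placeholders):

            #include <d3d12.h>
            #include <wrl/client.h>
            #include <thread>
            #include <vector>
            using Microsoft::WRL::ComPtr;

            const int kWorkers = 4;  // say, one recording thread per core
            std::vector<ComPtr<ID3D12CommandAllocator>> allocs(kWorkers);
            std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kWorkers);
            std::vector<std::thread> threads;

            for (int i = 0; i < kWorkers; ++i) {
                g_device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&allocs[i]));
                g_device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT, allocs[i].Get(), g_pso.Get(), IID_PPV_ARGS(&lists[i]));
                // Each worker records its own slice of the frame into its own list.
                threads.emplace_back([&, i] {
                    RecordDraws(lists[i].Get(), i);  // placeholder for per-thread draw recording
                    lists[i]->Close();
                });
            }
            for (auto& t : threads) t.join();

            // Submission is still one cheap call on a single thread.
            ID3D12CommandList* raw[kWorkers];
            for (int i = 0; i < kWorkers; ++i) raw[i] = lists[i].Get();
            g_queue->ExecuteCommandLists(kWorkers, raw);

        DX11’s deferred contexts were supposed to offer something similar, but in practice the driver still serialized most of the real work, which is why people keep hoping DX12 finally delivers it.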

          • Klimax
          • 3 years ago

          And a very continuous, near-infinite expense. (It’ll end only when GPUs are sufficiently powerful that a sudden case of pessimization doesn’t matter.) And there’s no reason why DX11 can’t be fitted with similar mechanisms without forcing developers to spend time and money chasing ephemeral “better than the driver” code for each GPU. (Immutable pipeline state objects and full command buffers, especially, could be trivially fitted into the DX11 API.)
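
          For reference, since pipeline state objects came up, this is roughly what baking one immutable PSO looks like in D3D12. It’s only a sketch: rootSig, vsBlob, psBlob, inputElems, and g_device are assumed to exist, and the CD3DX12 helpers come from the d3dx12.h header shipped with Microsoft’s samples.

              #include <d3d12.h>
              #include <wrl/client.h>
              #include "d3dx12.h"   // helper structs from Microsoft's D3D12 samples
              using Microsoft::WRL::ComPtr;

              D3D12_GRAPHICS_PIPELINE_STATE_DESC desc = {};
              desc.pRootSignature = rootSig.Get();
              desc.VS = { vsBlob->GetBufferPointer(), vsBlob->GetBufferSize() };
              desc.PS = { psBlob->GetBufferPointer(), psBlob->GetBufferSize() };
              desc.InputLayout = { inputElems, _countof(inputElems) };
              desc.RasterizerState = CD3DX12_RASTERIZER_DESC(D3D12_DEFAULT);
              desc.BlendState = CD3DX12_BLEND_DESC(D3D12_DEFAULT);
              desc.SampleMask = UINT_MAX;
              desc.PrimitiveTopologyType = D3D12_PRIMITIVE_TOPOLOGY_TYPE_TRIANGLE;
              desc.NumRenderTargets = 1;
              desc.RTVFormats[0] = DXGI_FORMAT_R8G8B8A8_UNORM;
              desc.SampleDesc.Count = 1;

              // Every bit of shader and fixed-function state is baked in up front;
              // the object never changes after creation, so the driver validates it
              // once instead of on every draw call.
              ComPtr<ID3D12PipelineState> pso;
              g_device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));

          Whether something equivalent could have been bolted onto DX11 is exactly the argument here, but that is the shape of the thing in DX12 as it stands.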

            • Airmantharp
            • 3 years ago

            Optimizing for DX12 sounds like a slippery slope, but it really isn’t, and succeeding generations of hardware will likely still run optimized code quite well. Whatever is lost due to less-than-optimal tuning for newer architectures will likely be more than made up for by the increased performance.

            (if there’s a big loss, patches will likely be forthcoming, be they from the software developer or from the driver developers)

      • Pettytheft
      • 3 years ago

      Maybe you should look at Hitman again. ROTR’s second DX12 patch boosts performance; don’t look at the first DX12 benchmarks, they actually made things worse for both sides. Total War: Warhammer has a huge boost. Forza Apex runs incredibly smoothly and looks amazing in action, and it’s a DX12 beta. There are quite a few more games coming down the pipeline. I don’t get all the backlash against DX12, as it seems to be taking off quicker than any other DX version. Not every game will benefit, but what is wrong with getting a boost in image quality and performance?

        • Voldenuit
        • 3 years ago

        [quote<]I don't get all the backlash against DX12 as it seems to be taking off quicker than any other DX version.[/quote<]

        When DX10 came out, there was a backlash against it as well. Notwithstanding that DX10 was frequently slower than DX9 (much like DX12 is cf DX11 today), there were probably a lot of people who did not care for Vista's registration and activation features (again, much like DX12 today and telemetry).

        From a purely technical point of view, I'm surprised that DX12 performance is worse than DX11 in so many cases; I'd have thought that with the multi-threaded scheduling and lower overhead, there'd be some immediate benefit even without deep optimization. I suppose there's always a learning curve with new APIs, and what works well in DX11 may not work well in DX12.

        DX12 is the future, but the future is not necessarily imminent, and if you're developing a game today, you support 42% of users in DX12, and 99% of users with DX11. The sensible path for any developer [i<]not getting paid by MS to screw consumers[/i<] is to prioritize their DX11 pipeline first, and add DX12 later.

          • sweatshopking
          • 3 years ago

          Which developers have been paid to screw customers? Can you provide an example? Quantum Break, which is probably the game most offensive to you nerds, has done literally nothing to screw customers, except being a timed exclusive, which isn’t new or unusual. Stop the melodrama. It is ridiculous.

            • Voldenuit
            • 3 years ago

            Well, Quantum Break was originally supposed to be an XBone exclusive. Then it was announced for Win10 via the Windows Store. The Steam release was not announced as an eventuality at first; however, abysmal sales of the Win10 version probably had something to do with it.

            Also, Gears of War 4 is still supposed to be Win10-exclusive, which is not the same thing as DX12-exclusive, but from all accounts it’s said to be pretty buggy. From what I read, the DX12 support was a hack job, which makes it sound like the developers were more interested in (or contractually obligated to) ticking a checkbox rather than pursuing the most sensible path.

            There’s no such thing as a free lunch. If you’re adding in support for any one API, it takes programmers, testers, and manpower away from everything else. I think Square made the right decision to prioritize DX11 for now; they can add DX12 later if it makes sense for them to do so.

        • Freon
        • 3 years ago

        In DX11 mode, Hitman still heavily favors AMD, and likewise RotTR favors NV. The most highly correlated variable here is card brand, not DX version number.

      • jihadjoe
      • 3 years ago

      DX12’s biggest problem is that it’s an even-numbered D3D! When was the last relevant even-numbered D3D?

      7 Hardware T&L! Boom! DX7 was great and there were tons of games for it.
      8 Pixel shaders right? I can’t remember any game that specifically required DX8 to run.
      9 HLSL, MRT, MET, WDDM! Tons of games used it, even to this day!
      10 Some new lighting stuff. Aside from a few outliers like Alan Wake nobody used DX10.
      11 Compute shaders, Tessellation, Multi-threaded rendering! Console Ports! Everyone loved DX11!
      12 “close to the hardware” which makes programming more difficult. Hmmm let’s wait and see.

    • Voldenuit
    • 3 years ago

    Looking forward to another buggy DX12 port.

    • Firestarter
    • 3 years ago

    DOOM at 50% off with Vulkan support was a way better deal than launch-day DOOM without Vulkan support, especially for people with AMD cards. If these guys have a DX12 build brewing, it might be a good idea to wait it out.

      • LostCat
      • 3 years ago

      I think playing without it for two weeks won’t kill anyone. It’s not likely a short enough game that you could finish it in two weeks anyway.
