AMD and Firaxis join forces to bring DX12 goodies to Civilization VI

The Civilization family of games is without a doubt one of the most popular and long-running franchises out there for the PC. Developer Firaxis has big plans for the next title in the series, Civilization VI. Owners of Radeon graphics cards will be pleased to learn that AMD and Firaxis have joined forces to implement a DirectX 12 rendering path for the game, including explicit multi-adapter and asynchronous compute support.
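
For the curious, asynchronous compute is less exotic than it sounds at the API level. In Direct3D 12, it boils down to feeding the GPU through a second command queue of the compute type, which the hardware is then free to overlap with graphics work. Here's a minimal sketch of our own against the public D3D12 API (not code from the game, and with error handling omitted):

```cpp
// Minimal async-compute setup sketch (illustrative only, not Civ VI code).
// Link against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Default adapter, minimum feature level 11_0.
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // The ordinary graphics ("direct") queue.
    D3D12_COMMAND_QUEUE_DESC directDesc = {};
    directDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> directQueue;
    device->CreateCommandQueue(&directDesc, IID_PPV_ARGS(&directQueue));

    // A second, compute-only queue: at the API level, this is all "async
    // compute" means. Whether the GPU actually overlaps the two queues'
    // work is up to the hardware and driver.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // Cross-queue synchronization is done with fences:
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    computeQueue->Signal(fence.Get(), 1); // compute marks its work done...
    directQueue->Wait(fence.Get(), 1);    // ...and graphics waits on it.
    return 0;
}
```

Note that the API only expresses the opportunity for overlap. How much concurrency actually materializes depends on the GPU's scheduler, which is exactly why the feature pays off differently on Radeons than on GeForces.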

While asynchronous compute support is arguably of greater benefit to Radeon graphics cards than GeForces, it's exciting to see another title adopt DirectX 12's explicit multi-adapter support. That feature allows developers to split workloads across multiple cards without the use of proprietary multi-GPU setups like SLI and CrossFire. No matter whether you bleed red or green, it should be possible to gang together graphics cards from either major vendor (or even cards from both companies in the same system) to get some kind of performance boost.
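
Explicit multi-adapter is similarly plain at the API level: the application enumerates every adapter DXGI reports and creates an independent D3D12 device on each one, then decides for itself how to split the work. Again, a minimal sketch of our own rather than anything from Firaxis's renderer:

```cpp
// Minimal explicit multi-adapter enumeration sketch (illustrative only).
// Link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // One device per physical GPU. Nothing here cares about the
        // vendor, so a Radeon and a GeForce can land in the same list.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }

    // From here, the application might render alternate frames (or split
    // each frame) across the devices and copy results between them using
    // cross-adapter shared resources.
    return 0;
}
```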

This partnership isn't the first time AMD and Firaxis have teamed up. Civilization: Beyond Earth was one of a few high-profile titles that offered support for AMD's Mantle low-overhead API before that API was reborn as Vulkan, and the companies also worked together to implement an unusual split-frame rendering method on multi-GPU rigs for that title. Civilization VI is up for pre-order now, and it'll hit store shelves October 21 for $59.99.

Comments closed
    • oldog
    • 3 years ago

    “There is no end to our imagination…”

    I suppose this quote refers to a gamer’s consternation after viewing this promotional vid for Civ VI.

    • Mat3
    • 3 years ago

    "While asynchronous compute support is arguably of greater benefit to Radeon graphics cards than GeForces"

    Uhmm, "arguably"? Seriously?

      • Shobai
      • 3 years ago

      Well, it probably is at the moment. If and when Nvidia’s long-foretold ASync driver appears, I guess the review sites will be able to determine the truth of that statement.

        • Pwnstar
        • 3 years ago

        He means there is no question it helps AMD. There is no arguing about it. It’s an accepted fact.

          • Shobai
          • 3 years ago

          Yes, absolutely: it helps AMD. AMD probably stands to gain the most, because of their architectural decisions, but that’s not what he’s quoting. What he’s quoted is ‘arguably of greater benefit’.

          It remains to be seen whether GCN’s untapped potential can be converted to performance (and the Doom results are looking pretty good), but it also remains to be seen whether Nvidia can extract any performance from its designs with Async. Until both can be investigated, I would think it definitely remains ‘arguably’.

            • Kougar
            • 3 years ago

            I think it's fairly conclusive. In DX12 async mode, Hitman, Ashes, and Doom would have the Fury X easily on top if it weren't for the 1080.

            "Nvidia can extract any performance from its designs with Async"

            NVIDIA can't fix it in drivers; it's a fundamental part of the hardware design. Remember, NVIDIA knew about its async difficulties years ago; it would already have fixed them after the 900 series launched if it could have.

            • Shobai
            • 3 years ago

            I definitely agree with the first bit, don’t get me wrong. I’m running an R9 290, so I’m stoked that the improvements are materialising.

            "NVIDIA can't fix it in drivers"

            It would seem that this is still the case, and again there is no disagreement here. My point is that Nvidia appears to have promised an ASync driver in spite of those apparent deficiencies [if you will - it definitely hasn't hampered them in DX11] in their hardware.

            Going back to the quote in question, then, the only case I see for Jeff having used "arguably of greater benefit" rather than simply "of greater benefit" is to leave room for comparing what benefits come from both camps enabling ASync. I don't think it is wise for him to paint himself into a corner [by removing the "arguably", for instance] until such time as that driver can be tested - which comes back to Nvidia releasing it.

            As it stands, I think a good case can be made that there will be little improvement due to ASync in previous generation hardware in light of an ASync-enabled driver from Nvidia. However, as this *cannot* be known at this point, I would avoid throwing around the "can't" word. I hope I'm making sense...

            • Shobai
            • 3 years ago

            Having said all that, here it is straight from the horse’s mouth:

            "I wrote about this in our GTX 1080 review, but Nvidia says right in the GP104 whitepaper that Maxwell's async implementation is rather coarse-grained and has the potential to deliver worse performance when it's enabled than when it's not. When Nvidia says the feature isn't enabled in the driver, it's probably a matter of not giving programmers the rope to hang themselves with. It would genuinely surprise me if we ever see Nvidia enable the feature in the driver, or if async code will ever see any kind of performance increase on Maxwell."

            As such, there appears to be no need for Jeff to have said 'arguably', as he already had this information [and 'GeForces' must at this point refer to previous generation hardware in the vast majority of cases].

    • Pettytheft
    • 3 years ago

    Well, so much for DX12 not taking off.

    • TheMonkeyKing
    • 3 years ago

    Absolutely the worst game announcement. And by that statement I mean: where was any shred of game design? I guess they got the loading screen down pat; the rest should be easy, right?

    • Yan
    • 3 years ago

    Civilization doesn't have particularly demanding graphics. The recommendation for Civ 5 (http://support.2k.com/hc/en-us/articles/201332873-Civilization-V-PC-System-Requirements) is merely "512 MB ATI 4800 series or better, 512 MB nVidia 9800 series or better". It doesn't seem to be the kind of game that would profit from DirectX 12.

      • tipoo
      • 3 years ago

      It’s more CPU bound, which is exactly where these low overhead APIs help the most. And moving more compute to the GPUs with async is also good. You can get by with a slow CPU, but the turn times take longer.

        • sweatshopking
        • 3 years ago

        Huge map, slow speed, 16 civs, turn 199 ILL BE BACK IN 10 SO I CAN CLICK END TURN IMMEDIATELY

      • ChicagoDave
      • 3 years ago

      I play Civ 4 on a 970 @ 3440×1440 with high details and it starts to spin up after a few dozen turns. I'm sure the combination of high resolution and the extra tiles an ultrawide display shows puts a lot more demand on the card than a standard 16:9 screen would.

      BTW Ultrawide on Civ is…awesome 🙂

    • tipoo
    • 3 years ago

    I have a confession to make. I didn't get Civ. I got Civ 5, played a match, and it seemed to go on forever over multiple days of picking up the save file... Does that sound about right? I didn't do a lot of winning, nor did my opponents; it just seemed to... go?

      • ChicagoDave
      • 3 years ago

      Try playing Civ 4, it’s a much better game IMO.

        • sweatshopking
        • 3 years ago

        I disagree. Civ V was the high point of the series, but yeah, you've got to give a game at least 10 hours for a good one.
        I play Civ quite a bit, and beat Sid difficulty regularly. I find the single units, espionage, faith, and general improvements of V quite a bit better than previous iterations. BE was garbage.

          • Shobai
          • 3 years ago

          Civ2 is still the high point of the series for me =P

        • kalelovil
        • 3 years ago

        Although in most ways I agree, it's really hard to go back to Civ 4's terrible combat system after using hexes.

        Civ 5 is IMHO a good game in its own right, provided you have both expansions. A pity, though, that the AI still doesn't know how to use some of the game systems.

      • Sam125
      • 3 years ago

      I like Civ5 because that was the first Civ where Korea was introduced properly. Otherwise I like Civ3 because I think Joan of Arc is hot. Korea should be one of the starter nations of Civ6, IMO.

    • DPete27
    • 3 years ago

    Hopefully Civ6 is better than CivBE….pretty low bar, I know.

      • drfish
      • 3 years ago

      I’m optimistic. BE never scratched the itch, and I’m pretty burned out on Civ5.

      • KeillRandor
      • 3 years ago

      Are they changing the One-Unit-Per-Tile system for this one? It pretty much killed Civ V for me after I saw through everything about how the game was built around it, and didn’t like it much (compared to III/IV). Am playing Age Of Wonders 3 atm, waiting to see what Civ VI is like/does…

        • sweatshopking
        • 3 years ago

        Not quite. There are new changes, with units that can link up with others as support units, but largely the one-unit-per-tile rule remains.

        • Zizy
        • 3 years ago

        Well, the game has linking of units of the same type into more powerful squads, linking of support units to main attacking units, and linking of useless civilians to your settlers and workers. So you can at least rely on having a settler and a warrior going together.

        Still closer to civ5 than to AOW3.

          • KeillRandor
          • 3 years ago

          I guess we'll have to wait and see how well it all works, then... (I know AOW3 is fairly shallow, but it's just to tide me over until something else comes along (that's cheap)).

        • Timbrelaine
        • 3 years ago

        Yeah. I think they were right in that the old stacking system was a weakness, but their solution just wasn’t great. Instead of uber-stacks rolling over the land, you just had blankets of units covering up literally every tile. I’m hopeful about the new “support units” idea though. Being able to stack units in many (but still finite) ways sounds like a better approach to me. We’ll see.

      • Platedslicer
      • 3 years ago

      I liked BE… but it was somewhat lacking in character, especially compared to Alpha Centauri. It’s obvious that Firaxis didn’t really put their creative backs into it.

    • PrincipalSkinner
    • 3 years ago

    Good move by AMD.
    DX12 multi-adapter sounds like it could render ultra-high-end GPUs obsolete, or at least keep their prices in check.

      • tipoo
      • 3 years ago

      I'm excited to see the results for it. Hope it's much better at frame pacing than SLI/CrossFire.

    • chuckula
    • 3 years ago

    Yay, hardware vendors “teaming up” with developers to implement vendor-specific features in low-level software that can’t easily port to other platforms.

    It’s like 1992 when you had to select your soundcard model from a menu at game installation time all over again.

    [Thanks for all the downthumbs from completely honest and objective people who would TOTALLY call me out and disagree with me using rational arguments if the headline had included the word "Nvidia" instead of "AMD"]

      • maxxcool
      • 3 years ago

      Oh man, what do I pick? MiniGL GLQuake? Matrox m3D? Voodoo Glide? My AMD 3DNow! drivers? VQuake.exe for my Rendition card!?

        • Voldenuit
        • 3 years ago

        "Oh man, what do I pick? MiniGL GLQuake? Matrox m3D? Voodoo Glide? My AMD 3DNow! drivers? VQuake.exe for my Rendition card!?"

        BitBoys Oy, of course!

          • maxxcool
          • 3 years ago

          Ah man, Warp5! 🙂

        • DoomGuy64
        • 3 years ago

        Except DX12/Vulkan is a standard, and Nvidia is simply not following it. Why? Because they can beat AMD with synchronous shaders, so they're holding back to stagnate progress and keep their lead. Kinda like they did with PhysX, when AMD beat them to market with DX11.

        This is also perfect for planned obsolescence. When Nvidia finally supports async, it will force existing users to upgrade by updating Gameworks with async effects, obsoleting all their existing hardware. At this point Nvidia is more or less competing with themselves rather than AMD, and the only people who actually escape from this are AMD users who already have async supported cards.

          • Waco
          • 3 years ago

          Careful, your bias is showing.

            • DoomGuy64
            • 3 years ago

            So? I'm allowed to have an opinion. However, DX12/Vulkan being an industry standard isn't one. It's not Glide, and Nvidia could easily support async if they had the hardware to do it.

            As far as what happens when Nvidia finally supports Async, well that’s a pretty safe hypothesis. Nvidia likes to mix compute like PhysX with Dx11 graphics and VR, and obviously Async would give them a huge boost if it was supported.

            We just saw the requirements for Nvidia's VR Funhouse, and 1080 SLI is recommended. I can confidently predict that if Nvidia supported async in a mid-range next-gen architecture, it would blow 1080 SLI out of the water on this tech demo. The demo doesn't need brute force, it needs better efficiency, which would exist with an async-supporting architecture.

            • Waco
            • 3 years ago

            Async compute isn't magic, I hope you realize that.

        • MOSFET
        • 3 years ago

        I usually picked VQuake for *my* Verite. Later on, though, GLQuake was possible, once the mini-ICD/wrapper improved.

      • nanoflower
      • 3 years ago

      I don't have a problem with developers putting in Windows-specific options if that's the platform they are targeting. Putting in AMD- or Nvidia-specific options is a different matter for me. Ignoring Mac and Unix/Linux platforms is just a consequence of their size. Though nothing says that Civ VI won't also be made available on those platforms even if Firaxis did implement a DX12 path for the game.

      • slowriot
      • 3 years ago

      Since when did explicit multi-adapter and asynchronous compute support become vendor specific features?

        • chuckula
        • 3 years ago

        Well, according to AMD’s official press statement in the link:

        "DirectX® 12 Asynchronous Compute: Asynchronous compute is a DirectX® 12 feature exclusively supported by the Graphics Core Next or Polaris architectures found in many AMD Radeon™ graphics cards."

        [Edit: Funny how the AMD fansquad is downthumbing me for posting official statements from AMD. It's amusing how "fans" don't like to acknowledge the pronouncements of their own leaders.]

          • sweatshopking
          • 3 years ago

          Nvidia's support is subpar, but it does exist. It's a core part of DX12, not a vendor tech. We trashed AMD for its subpar tessellation; this is no different.

          • xeridea
          • 3 years ago

          Async isn't really supported in any games yet by Nvidia (even with Pascal), at least not to the extent that games actually benefit, so AMD's statement is pretty much correct. Even though it is not a vendor feature, they are the only ones actually supporting it.

          It's like how AMD is currently the only one with support for Adaptive-Sync, even though it is an open standard (Intel says they will support it in the future; Nvidia vows never to touch it).

          • Ninjitsu
          • 3 years ago

          The feature is part of DX12. The best support is from AMD. The salt and pepper is AMD marketing. It's hardly something like PhysX. ¬_¬

          • BurntMyBacon
          • 3 years ago

          Regarding this statement:
          "Funny how the AMD fansquad is downthumbing me for posting official statements from AMD. It's amusing how "fans" don't like to acknowledge the pronouncements of their own leaders."

          They aren't downthumbing you for posting official statements from AMD. They are downthumbing you because you don't seem to know the difference between a vendor-specific feature and an exclusively supported feature.

          Vendor-Specific Features - Proprietary features specific to a single vendor. These features are not part of a standard or third-party API and are often, but not always, restricted from other vendors through technical or legal means. Examples include G-Sync, PhysX, and Gamestream.

          Exclusively Supported Features - Features that have only been implemented by a single vendor. These features may or may not be proprietary (Vendor-Specific Features are a subset of Exclusively Supported Features). Features that are not proprietary are often part of a standard or third-party API. While currently implemented by a single vendor, the vendor employs no legal or technical means to artificially restrict adoption by other vendors. There is a grey area with licensing that can sometimes become prohibitively expensive. Examples of non-proprietary Exclusively Supported Features include Adaptive-Sync (VESA standard) until Intel actually implements it, HSA (AMD open standard) before ARM vendors started picking up on it, and Asynchronous Compute (DX12/Vulkan feature) until nVidia support is released.

          Now, lest we get carried away, I should mention that proprietary standards aren't completely without merit. When you control all aspects of the standard (think Apple) you can better control the user experience. As an example, G-Sync monitors universally have good functional frame rate ranges, whereas you have to be very careful to check the functional range of FreeSync monitors to make sure it is usable. It is up to the end user to decide whether the advantage is worth the trade-off.

          • blastdoor
          • 3 years ago

          Funny how you're whining about being downvoted for intentionally trolling, after you more or less admit that you're trolling.

        • Concupiscence
        • 3 years ago

        I believe he’s referring to Direct3D 12 versus Vulkan.

      • Concupiscence
      • 3 years ago

      Eurgh, why couldn’t it be Vulkan?

      • tipoo
      • 3 years ago

      If the industry only moved forward when both AMD and Nvidia implemented the same things, it would be a lot slower going, and it would give each veto power over any new feature they didn't feel advantaged by. AMD pushing async is great, and yes, of course it plays to their strengths; it wouldn't do for AMD to push Nvidia's strengths instead.

      Other DX12 features mentioned, like multi-adapter, are for both of them.

        • DancinJack
        • 3 years ago

        I don't think Chuck would disagree with that. It's the fact that if this were an Nvidia GameWorks article, there would be AMD fanboi outcry like you've never seen before. But when it's an AMD "exclusive" feature (or so says AMD), people are seemingly fine with it.

          • sweatshopking
          • 3 years ago

          But it isn't an AMD exclusive. Nvidia just hasn't supported DX12 as fully. That's on them.

            • DancinJack
            • 3 years ago

            Tell AMD that.

            • sweatshopking
            • 3 years ago

            It currently is "exclusive" to AMD, but that's because Nvidia didn't include it. It isn't an AMD technology; they're just the only ones using it right now.

            Some developers have recently said that Nvidia's implementation actually slows things down, and they have to explicitly disable it in code.

            • Ninjitsu
            • 3 years ago

            They know, we know, but the average consumer doesn't! So they've got to do the marketing.

            And really, they do clearly state that it's a DX12 feature and only the support is exclusive... which isn't technically correct, but until Nvidia shows the feature working in the wild, it can't really challenge AMD on that in front of the mass market.

            • Tirk
            • 3 years ago

            It's just Chuckula and DancinJack changing the facts to fit their narrative. I highly doubt that your factual statement will change their minds.

            Who cares if multi-adapter support by its nature can't be an AMD exclusive. Who cares if async compute is a DX12 feature that is fully open for any GPU vendor to implement. Who cares if GameWorks and G-Sync are features that any GPU vendor is free to implement.......... wait, I'm sorry, that last thing is not true, but it must be true if Chuckula and DancinJack are to make any sense whatsoever, right?

            Error! Error! The laws of nature are collapsing in on themselves. Chuckula has pulled the "anyone who disagrees with him is a fanboy" argument and must therefore be right. Error must be corrected..............

            Corrected:
            AMD IS EVIL!!!!!!! They have the largest market share because of all the EVIL things they do to Nvidia and Intel. AMD has single-handedly destroyed PC gaming! Down with AMD! Down with AMD's massive market share!!!!!! Once more, the Sith will rule the galaxy. And, we shall have peace.

            • derFunkenstein
            • 3 years ago

            Not sure if you've noticed, but Nvidia hasn't really had to do anything about it.

          • tipoo
          • 3 years ago

          What DX standard features did GameWorks adopt earlier than AMD did? AMD isn't making stuff up here; it's in the spec. GameWorks didn't improve performance, it tanked it.

            • stefem
            • 3 years ago

            And since when has adding effects improved performance?
            Don't want to offend, but that's a really stupid way to see things.

            • tipoo
            • 3 years ago

            GameWorks was being compared to a technology like async that *improves* performance. GameWorks gimps even Nvidia GPUs. The analogy being stupid is exactly what I'm saying.

          • xeridea
          • 3 years ago

          It is widely accepted that GameWorks is junk. Review sites tend to turn the features off because they give horrible performance for subpar visual results. The features tend to just brute-force things with tessellation rather than implementing them properly.

          It is widely accepted that DX12 is the future; async, multi-adapter, etc. all have huge benefits, and nothing in DX12 is vendor-specific. If a vendor chooses not to support a feature, or barely supports it, that is their choice. Developers shouldn't avoid awesome features just because vendor green, red, or blue doesn't want to support them.

            • Ninjitsu
            • 3 years ago

            "Developers shouldn't avoid awesome features just because vendor green, red, or blue doesn't want to support them."

            Well, developers have to consider their user base and sales as well, which is probably why these features are additional goodies for now (which isn't a bad thing).

            • stefem
            • 3 years ago

            That's just a bunch of baseless assumptions. GameWorks comprises some very advanced techniques: VXAO, HFTS, FleX, HairWorks, Clothing...
            Are they overkill? Maybe, but that's an entirely different story.
            That has nothing to do with your claimed "brute force" approach, and if you have a problem with tessellation, you have a slider in your driver's control panel, and in some games' options too, to reduce it, right?

        • blastdoor
        • 3 years ago

        I think this is a fair point.

        Also, it’s clearly not like the 1992 sound card analogy. There are only two PC GPU vendors out there (I guess 2.33, if you count Intel).

      • wierdo
      • 3 years ago

      It's an open standard; nVidia can support these features if they wish to do so. Perhaps they're waiting for adoption to pick up first; we can only guess what their plans are, as there's no "legal" barrier to entry here.

      There's nothing vendor-specific at play here, if by that you mean "proprietary": the technology is on an open table, and vendors can freely choose how much or how little of it they want to take advantage of.

      • Puiucs
      • 3 years ago

      Async shaders are an open standard, with support in both Vulkan and DX12. The fact that Nvidia decided to ignore it for better DX11 performance (my own speculation) is their problem, not ours. AMD has had them implemented since the first GCN cards.
      It's the same with G-Sync: Nvidia ignored the open standard and decided to make something just for them.

        • stefem
        • 3 years ago

        Well, NVIDIA was the first to develop a variable-refresh solution that eliminates tearing and V-Sync-induced quantization effects while maintaining visual quality, unlike many FreeSync monitors.
        And, to be honest, was NVIDIA even aware of AMD's proposal for a new API while AMD was designing GCN? It's strange to blame someone for ignoring something they were purposely kept in the dark about.

          • Pwnstar
          • 3 years ago

          Why do people keep falling for nVidia’s propaganda? They were only first to market, not first to develop.

            • stefem
            • 3 years ago

            There were other variable refresh rate implementations before but, as I said, none of them solved screen tearing and V-Sync-added latency and quantization.
            Or do you know of someone that had a working solution before NVIDIA?

            • DoomGuy64
            • 3 years ago

            Yeah, eDP. Which Nvidia STOLE and claimed as their own. Total liars about inventing it. Nvidia stole tech from the mobile market and brought it to the desktop as a proprietary product.

            This whole G-Sync thing was a fiasco from day one, as AMD immediately proved Nvidia to be lying thieves by showcasing FreeSync on existing eDP monitors. The only issue with FreeSync was that it did not yet exist in the desktop space, and Nvidia beat DisplayPort 1.2a to market by releasing G-Sync as a custom FPGA module that could just be slapped into existing monitors.

            If you actually followed your history, you'd know that the first G-Sync monitors to market were existing 144 Hz monitors that you had to modify yourself with the G-Sync kit.

            G-Sync is a hack, has been from day one, and its sole purpose was to create a proprietary standard and split the market. Also, because of Nvidia's refusal to follow industry standards, Nvidia now has problems running products like the Vive, which require support for modern DisplayPort standards. Hilarious, considering they are marketing their new products specifically for VR.

            Nvidia. Most unethical graphics company ever.

            • BurntMyBacon
            • 3 years ago

            I learn something new every day. 🙂
            I was under the [i<]misconception[/i<] that nVidia did in fact implement G-sync first, but refused allow anyone else to use the technology prompting VESA (no doubt at the request of AMD) to release the DP 1.2a standard. That said, I knew that eDP monitors existed in the mobile space prior to G-sync. I just didn't know there was a "Sync" implementation that actually took advantage of the adaptive capabilities beyond power efficiency. [i<]Edit: Apparently admitting you had a misconception and learning from someone else is frowned upon here. Who would've figured.[/i<]

            • DoomGuy64
            • 3 years ago

            Yeah, “panel self refresh”. Made to save power in mobile devices like laptops, also exists in cell phones.
            http://www.theregister.co.uk/2011/09/14/intel_demos_panel_self_refresh_display_tech/
            http://www.anandtech.com/show/7208/understanding-panel-self-refresh
            https://techreport.com/news/26451/adaptive-sync-added-to-displayport-spec

            "Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA's embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync."

            • Pwnstar
            • 3 years ago

            Fear not, I upvoted you back to neutral.

      • BurntMyBacon
      • 3 years ago

      "Yay, hardware vendors "teaming up" with developers to implement vendor-specific features in low-level software that can't easily port to other platforms."

      You do know that nVidia is releasing support for asynchronous compute, do you not? Explicit multi-adapter is also a non-issue. What exactly is this fabled vendor-specific feature you speak of?

      • sweatshopking
      • 3 years ago

      WELL POSTED, SIRE! MISSION ACCOMPLISHED!

      • Goty
      • 3 years ago

      You mean like G-Sync? PhysX? Any of the whole host of other GameWorks “technologies”?
