DirectX 12 to support existing hardware; first games due in late 2015

GDC — At the Game Developers Conference this morning, Microsoft revealed the first details about its next-gen DirectX 12 API. I’ve got a little free time between briefings and keynotes just now, so let me try to cover the highlights briefly.

DirectX 12 is, of course, all about improving performance by cutting CPU overhead and giving developers more direct control of the hardware. There was talk of "lower level abstraction than ever before" and "unprecedented performance." Microsoft showed DirectX 11 and DirectX 12 versions of the latest 3DMark release running on a Core i7-4770K-powered system. In the DX11 version, most of the CPU load was on a single thread, and the other cores were underused. In the DX12 version, workload distribution was even across the cores, and overall CPU utilization was down 50%.

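For a concrete picture of what spreading that workload means: in the Direct3D 12 API as it eventually shipped, each CPU thread records its own command list and the queue accepts them all in a single submission call. Below is a minimal, illustrative sketch of that pattern; the function, its parameters, and the empty recording lambda are invented for illustration (this is not Microsoft's demo code), and error handling is omitted.

```cpp
#include <d3d12.h>
#include <thread>
#include <vector>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Record one command list per worker thread, then submit them all in one call.
void RecordFrameInParallel(ID3D12Device* device, ID3D12CommandQueue* queue,
                           unsigned threadCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(threadCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(threadCount);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < threadCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Each worker records draws for its own slice of the scene; nothing
        // here takes a shared lock, which is the point of the new design.
        workers.emplace_back([&lists, i] {
            // ... SetPipelineState / DrawInstanced calls would go here ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submission itself is a single cheap call on the queue.
    std::vector<ID3D12CommandList*> submit;
    for (auto& l : lists) submit.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(submit.size()), submit.data());
}
```
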
In this respect, DirectX 12 will mirror many of the improvements AMD implemented in its own Mantle API. We’ve suspected this development since the first DX12 pre-announcements were made.

The big news today was that DX12 will support more than just the PC and will work on existing hardware. Microsoft will bring DX12 to all of its platforms, including the Xbox One and, if I understand correctly, Windows Phone. This decision should have huge implications for portability across platforms, and one would expect it to make life easier for developers in a number of ways.

Which hardware will be DX12-compatible? AMD said all of its Graphics Core Next-based Radeon GPUs (i.e. Radeon HD 7000 series and newer) will work with the new API. Nvidia pledged support for all Fermi, Kepler, and Maxwell (i.e. GeForce GTX 400 series and newer) parts. The keynote included a demo of Forza 5 running in DirectX 12 mode atop an Nvidia GPU. Finally, Intel said the integrated graphics in its existing Haswell processors will also have DX12 support. All in all, Microsoft estimates that 100% of new desktop GPUs, 80% of new gaming PCs, and 50% of PC gamers will be able to take advantage of the new API.

Microsoft said DirectX 12 will premiere in next year’s crop of holiday-season games. A preview release of the API is coming later this year.

Comments closed
    • Chrispy_
    • 6 years ago

    "Microsoft will bring DX12 to all of its platforms, including the Xbox One and, if I understand correctly, Windows Phone. This decision should have huge implications for portability across platforms"

    Holy hell, it's taken almost two decades, but have Microsoft finally worked out the point of having a *standard*?!?

    • Welch
    • 6 years ago

    So it looks like Microsoft is ripping off AMD’s Mantle. If that’s the case, AMD, you might want to look a little closer at that Mantle-on-Linux feasibility study you were talking about doing. It might be time to start working on it, to beat M$ to the DX12 release and give SteamOS a way to compete with a low-level API right out of the gate.

    • Rza79
    • 6 years ago

    So MS uses Chrome? Lol

      • Meadows
      • 6 years ago

      Surely only for testing purposes.

        • sweatshopking
        • 6 years ago

        Chrome is the worst. I won’t update my Opera past 12, since it’s so bad.

      • kamikaziechameleon
      • 6 years ago

      keen observation!

      • derFunkenstein
      • 6 years ago

      Gotta have something standards-compliant to compare IE’s page rendering to. :p

    • TwoEars
    • 6 years ago

    Well color me unimpressed.

    Did MS drop the ball completely on this one or what?

    I guess their marketing research dept. told them we’d all be sitting around enthralled like stupid monkeys by now, singing and dancing to the tune of their glorious Surface interface, and that there was no point in developing DX any further.

    Ha – fat chance.

    • albundy
    • 6 years ago

    I was hoping to see something more than this, since DX11 was released 5 years ago. So no new shader model? This feels more like a small update instead of an all-new release.

    • anolesoul
    • 6 years ago

    With Microsoft, it’s ALWAYS... too little... TOO late.

    • Milo Burke
    • 6 years ago

    If CPU utilization is down 50% and it’s able to utilize all cores evenly, this translates to an 8x improvement for people with quad-core CPUs like any i5, assuming core counts *don’t* increase in the next two years.

    - Remember how Inside The Second testing showed that frame latencies can be more consistently low when using a grossly overpowered CPU? This will be good for smoother frame delivery.
    - Aren’t multiplayer maps hard on CPUs anyway?
    - What about enormous armies in RTS games? Will this become much easier to accomplish? I always hate the ridiculously low population limits they always set.
    - Will this improve AI by expanding how much CPU power is available to dedicate to it? In future games, of course.
    - And let’s not forget how laggy tower defense games get on the super high levels. Let’s hope Onslaught and Bloons TD 5 will be the first to receive DX12 support. =D

      • indeego
      • 6 years ago

      If it were an 8x improvement, they would state it as such. Amdahl’s Law would prevent such an improvement from occurring anyway.
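
      A quick worked example of why (the 90% parallel fraction below is a made-up, generous figure, purely for illustration):

      ```latex
      % Amdahl's Law: speedup from parallelizing a fraction p of the work across n cores.
      \[ S(n) = \frac{1}{(1-p) + p/n} \]
      % Even assuming a generous p = 0.9 on a quad-core:
      \[ S(4) = \frac{1}{0.1 + 0.9/4} = \frac{1}{0.325} \approx 3.1 \]
      % And no core count ever beats the serial fraction's ceiling:
      \[ \lim_{n \to \infty} S(n) = \frac{1}{1-p} = 10 \]
      ```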

    • torquer
    • 6 years ago

    I’m amused at how AMD fanboys think that somehow Mantle forced Microsoft’s hand here. You don’t just throw together an API in 3 months in response to a competitor. I have no doubt whatsoever that Microsoft has been working on this for quite some time, long before Mantle came out.

    Lower-level access to hardware and more efficient use of a CPU are no-brainers – they weren’t revolutionary ideas that came from any one vendor (AMD or anyone else).

    The end result is positive – a STANDARD and cross-platform method of getting better performance that isn’t vendor-specific.

      • ptsant
      • 6 years ago

      If low-level access to the hardware and efficient CPU usage are no-brainers, how do you explain the fact that several previous iterations of DX were only minor updates, to the point that NVidia didn’t even bother supporting them? I mean, if efficiency were a priority, we wouldn’t have gotten to that point, no?

      Furthermore, don’t forget that AMD *publicly* announced Mantle 5 months ago, and DX12 is far from being a usable product today, i.e. it’s not shipping. MS was probably aware of Mantle several months before that. Knowing that MS has already worked with AMD on the XBone, I wouldn’t be surprised to learn that they modified DX12 accordingly in a similar timeframe, say 6-9 months.

      Anyway, the end result is positive, as you say, but don’t forget that DX12 IS vendor-specific; only the vendor is MS (versus Linux/SteamOS, OS X, etc., for example) instead of AMD.

      • Bensam123
      • 6 years ago

      Dude, if all they’re doing is making the calls multithreaded, they definitely can do that. Just because it has an extra +1 at the end of DirectX does not mean it’s a fully developed brand-new API. Nvidia did it with GameWorks.

      • WaltC
      • 6 years ago

      I’m amused that you don’t understand that this is just an “announcement” of D3d 12 from Microsoft…and that you won’t be seeing it until *next year,* if then, either…;) Over the years Microsoft has said many things about its “commitment” to PC gaming–but precious few of its promises have been kept, more’s the pity. Is it simply a coincidence that Microsoft starts actually talking about DX12 *months* after AMD started talking about Mantle? I think not. It’s fine with me. *Somebody* needs to kick Microsoft square in the beJeebers these days to get a rise out of them–they don’t seem to have a clue as of late. I’m hoping the new CEO has a much better grasp of reality than the last one.

      You don’t seem to be able to properly appreciate the fact that the reason AMD started working on Mantle in the first place is that Microsoft was being very unreceptive and non-responsive. Don’t you think AMD would be much happier having Microsoft do it? Of course they would – AMD isn’t making any direct money from it, remember. Had Microsoft been energetic about the advancement of DX12 and listened to AMD *and* nVidia a bit more ardently, I doubt Mantle would have seen the light of day. Don’t tell me you haven’t noticed that, as of DX10, DX development at Microsoft has slowed to a crawl…? (And of course, AMD manufactures Microsoft’s xBone.)

      Do you think it’s “bad for the industry” that nVidia does things like PhysX and AMD doesn’t? I don’t. Likewise I don’t think Mantle is bad for anybody–it’s just an AMD technology that, like PhysX, may or may not get very far. Did you also know that SLI and Crossfire aren’t supported by any version of DX (D3d)? The IHVs have to support those *outside* of anything Microsoft does with D3d.

      Lots of cool, useful things are supported without “standards” like D3d, which is something you should definitely think about.

    • Ninjitsu
    • 6 years ago

    My interpretation is that current cards will support DirectX 12 Feature Level 11.x, not full Direct3D 12.

    Same for operating systems; it will probably be a feature of Windows 9. Late 2015 makes sense, seeing that Windows usually has an October launch.

      • encia
      • 6 years ago

      From https://www.amd.com/us/press-releases/Pages/amd-demonstrates-2014mar20.aspx

      "Full DirectX 12 compatibility promised for the award-winning Graphics Core Next architecture"

      AMD's PR has claimed "FULL DirectX 12 compatibility" for their current GCNs. NVIDIA has yet to claim "FULL DirectX 12 compatibility".

        • Ninjitsu
        • 6 years ago

        Yup, Nvidia’s clarified that they don’t have that:
        https://techreport.com/news/26210/directx-12-will-also-add-new-features-for-next-gen-gpus

          • encia
          • 6 years ago

          From http://www.eurogamer.net/articles/digitalfoundry-2014-directx-12-revealed-coming-to-xbox-one

          "When asked about specific enhancements for Xbox One, Microsoft confirmed that DX12 was on the roadmap for the console, but 'beyond that, we have nothing more to share.'"

          The Xbox One factor.

    • JosiahBradley
    • 6 years ago

    With all these game engines now working on Linux (OpenGL) variants, the future finally seems to be evolving into a sea of open standards, hopefully with zero driver overhead (ZDO), massive parallelism, and heterogeneous computing. ZDO OpenGL combined with OpenCL and the embedded and web spinoffs allows any application, not just games, to expand across devices, from smart watches and mobile phones all the way up to monster machines with 8+ core CPUs and 8+ GPUs running in tandem.

    TL;DR: Not impressed and looking forward to a more open future.

    • Bensam123
    • 6 years ago

    So basically what they could’ve done years ago, but didn’t. I think a testament to that is that AMD and Nvidia don’t even need to add hardware support. Seriously, MS?

    I wonder if “all its platforms” includes Windows XP… Or just Windows 8.

    Do games even need to be compatible with this? If all it does is tweak the CPU thread scheduler for DX, the games wouldn’t need to do anything. People would just need to update their version of DX… That means a new version of DX isn’t even necessary, unless they’re adding other things on top of it. Current games would be compatible with it… I think that deserves another WTF, MS.

      • LostCat
      • 6 years ago

      Why would you even wonder if XP will get DX12?

      Put down the crack pipe.

        • Bensam123
        • 6 years ago

        I would also wonder if *nix would get DX12… and OS X… It’s about supporting everyone if they’re going to compete with Mantle.

        • JustAnEngineer
        • 6 years ago

        People still using Windows XP at the end of 2015 don’t deserve games. 😉

          • chuckula
          • 6 years ago

          YES THEY DO!

          Minesweeper, Solitaire, and Hearts are all games that deserve the best in D3D XII support!

    • capricorn
    • 6 years ago

    AFAIK Kepler does not even fully support DX 11.1, but DX 12?

    • shank15217
    • 6 years ago

    Here is a perfect example of AMD asserting its influence over the industry. All you AMD haters should be ashamed of yourselves. Nvidia gave you G-Sync, AMD gave you DX12.

      • UnfriendlyFire
      • 6 years ago

      G-Sync was derived from an upcoming DisplayPort standard, from what I’ve heard.

        • Pwnstar
        • 6 years ago

        Yes, nVidia is part of VESA (who made DisplayPort) and shamelessly stole it.

      • Klimax
      • 6 years ago

      Since DX12 was already in development, AMD gave us nothing.

        • forumics
        • 6 years ago

        DX12 was only really in development after AMD announced Mantle.
        I think credit must be given to AMD for speeding up DX development.

          • Klimax
          • 6 years ago

          According to NVidia, it was already in development for 4 years, so you are wrong.

            • memorylane
            • 6 years ago

            They “talked” about it four years ago.

            • Klimax
            • 6 years ago

            You discuss and gather, then design, and then develop. Standard development cycle. Where other products and other companies come in, you can’t do cowboy-style development.

            But in all cases, Mantle never had any influence, and at best I’d say AMD came on board very late; otherwise their last statements on DX 12 would be outright bald lies.

        • ptsant
        • 6 years ago

        Obviously, DX12 was already in development in the same way that DX12 comes after DX11 and there was bound to be some DX12 at some time.

        However, when exactly did Microsoft describe DX12 in detail and when exactly did they promise the specific features (less overhead etc) that they are now showing? DX12 would have existed even without AMD, but I am not sure it would have had the same features and priorities.

        What people are saying is that the kind of DX12 we are getting has been influenced by AMD/Mantle, or at least AMD/XBone, and I believe this is a reasonable assumption.

          • Klimax
          • 6 years ago

          Don’t think so in the least. People are seeing what’s not there. And Mantle’s influence came so far after DX 12 started that it is pure nonsense to claim that Mantle had any effect on DX.

          Four years in development. And remember, it was AMD who claimed DX 12 doesn’t exist at all…

          • encia
          • 6 years ago

          From http://www.eurogamer.net/articles/digitalfoundry-2014-directx-12-revealed-coming-to-xbox-one

          "When asked about specific enhancements for Xbox One, Microsoft confirmed that DX12 was on the roadmap for the console, but 'beyond that, we have nothing more to share.'"

    • HisDivineOrder
    • 6 years ago

    So when is Forza coming to PC? 😉

      • Aerugo
      • 6 years ago
      • nanoflower
      • 6 years ago

      Christmas 2015. 😉

    • odizzido
    • 6 years ago

    100% of new desktop GPUs but only 50% of gamers, huh? Sounds like they’re not planning to release this for anything but the latest version of Windows. That’s fine, I don’t need it, and I am hoping that SteamOS will be good to go by then anyways. They seem to be getting a lot of support for it.

      • gamoniac
      • 6 years ago

      Who will have the guts (and dishonesty) to claim that it will support 100% of gamers, many of whom own older GPUs? Remember we are talking about end of 2015… I think even 70% would be a bold estimate.

        • HisDivineOrder
        • 6 years ago

        At least we know how long our current GPUs are going to last us. 😉

          • BestJinjo
          • 6 years ago

          February 2015.

          http://www.ign.com/games/the-witcher-3/pc-134497
          http://www.rockpapershotgun.com/2014/01/30/the-witcher-3-wild-hunt-is-a-pretty-pretty-thing/

            • JustAnEngineer
            • 6 years ago

            That should be just in time for the first wave of 20 nm GPUs, anyway.

    • Krogoth
    • 6 years ago

    Wake me up when there’s compelling *games* that use this API rather than silly tech demos.

    I’m afraid it is going to be a long sleep.

      • HisDivineOrder
      • 6 years ago

      Forza forza PCza?

        • Pwnstar
        • 6 years ago

        He said “compelling”.

      • nanoflower
      • 6 years ago

      If you are waiting for a game to be released, then of course it’s going to be a long wait, since they have already said DirectX 12 won’t be available until 2015. However, I think what they did with Forza is more than just a simple tech demo. But even that clearly won’t be released until DirectX 12 is available to end users.

      • moose17145
      • 6 years ago

      Upvoted for simply being your usual unimpressed self lol 🙂

    • Concupiscence
    • 6 years ago

    Well… worst-case scenario, this isn’t going to hurt the long-term usefulness of my FX-8320. I’m keen to see what the Khronos Group has planned for OpenGL, too.

    • Laykun
    • 6 years ago

    I’m happy to be wrong about it requiring new hardware 🙂

      • dragosmp
      • 6 years ago

      But does it require new software? Like an OS upgrade?

        • HisDivineOrder
        • 6 years ago

        “DirectX 12. Only on Windows 9 Pro Ultra Elite Hyper Fighting Edition PCs. Now you’re playing with power. Multi-threaded, unlimited draw-call power.”

    • snook
    • 6 years ago

    Johan Andersson @repi 1h
    Direct3D 12 blog with some more details: http://blogs.msdn.com/b/directx/archive/2014/03/20/directx-12.aspx … You may recognize the design 😉

    repi knows I was correct. 😛

      • encia
      • 6 years ago

      From https://www.amd.com/us/press-releases/Pages/amd-demonstrates-2014mar20.aspx

      "Full DirectX 12 compatibility promised for the award-winning Graphics Core Next architecture" - AMD.

      AMD's PR has claimed "FULL DirectX 12 compatibility" for their current GCNs. NVIDIA has yet to claim "FULL DirectX 12 compatibility".

        • Voldenuit
        • 6 years ago

        Does full compatibility = full capability?

        If MS is bringing back capability flags with fallbacks, a GPU might still be considered fully compatible as long as there is fallback codepath, depending on how flexible your definitions are.

        I’m not trying to spread FUD here – there are still very few specifics on what new additions there will be. So far, only two have been announced: new blend modes and a new rasterization method. From what Nvidia has said, it sounds like their current Kepler chips won’t support these modes, but they will still be able to run and take advantage of other DX12 features, like possibly the lower-level abstractions.

        Does anyone know definitively if AMD will support the new blend and conservative rasterization? As well as every other DX12 feature that hasn’t even been announced yet?
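
        That capability-flag-with-fallback pattern is, for what it’s worth, exactly how the shipping D3D12 API ended up exposing conservative rasterization; a minimal sketch (the structures and calls are real, the surrounding policy is invented for illustration):

        ```cpp
        #include <d3d12.h>

        // Ask the device for its conservative rasterization tier and choose a path.
        bool UseConservativeRaster(ID3D12Device* device)
        {
            D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
            if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                                   &opts, sizeof(opts))))
                return false;
            // Tier 0 = unsupported: callers take a shader-based fallback instead.
            return opts.ConservativeRasterizationTier !=
                   D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED;
        }
        ```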

          • encia
          • 6 years ago

          Using a CPU example: Intel Sandy Bridge/Ivy Bridge has full compatibility with the x86-64 ISA.

          Microsoft already has Direct3D Feature Levels. For example, from http://en.wikipedia.org/wiki/Direct3D#Feature_levels

          NVIDIA Kepler/Maxwell = Feature Level 11_0 with DirectX 11.2, e.g. it's missing 64-slot UAVs for all shader types (DirectX 11.0 introduced 8-slot UAVs for pixel and compute shaders) and supports only Tiled Resources Tier 1.

          AMD PC GCN = Feature Level 11_1 with DirectX 11.2, e.g. 64-slot UAVs for all shader types and Tiled Resources Tier 2.

          AMD Xbox One GCN = a Direct3D 11.X superset. From http://blogs.windows.com/windows/b/appbuilder/archive/2013/10/14/raising-the-bar-with-direct3d.aspx

          "The Xbox One graphics API is 'Direct3D 11.x' and the Xbox One hardware provides a SUPERSET of Direct3D 11.2 functionality"

          In terms of functionality, Xbox One's GCN hardware and Direct3D 11.X have exceeded the PC's DirectX 11.2 Feature Level 11_1.

          From http://www.eurogamer.net/articles/digitalfoundry-2014-directx-12-revealed-coming-to-xbox-one

          "Microsoft confirmed that DX12 was on the roadmap for the console, but 'beyond that, we have nothing more to share.'"

          If Xbox One's GCN ~= PC GCN, then AMD's PC GCN already exceeds the PC's DirectX 11.2 Feature Level 11_1. This is why AMD can claim "FULL DirectX 12 compatibility" for their PC GCNs, i.e. the Xbox One factor. NVIDIA has yet to claim "FULL DirectX 12 compatibility".

          If you read http://timothylottes.blogspot.com.au/2013/08/notes-on-amd-gcn-isa.html, the AMD GCN ISA has exceeded the DX and GL APIs:

          "DX and GL are years behind in API design compared to what is possible on GCN. For instance there is no need for the CPU to do any binding for a traditional material system with unique shaders/textures/samplers/buffers associated with geometry. Going to the metal on GCN, it would be trivial to pass a 32-bit index from the vertex shader to the pixel shader, then use the 32-bit index and S_BUFFER_LOAD_DWORDX16 to get constants, samplers, textures, buffers, and shaders associated with the material. Do a S_SETPC to branch to the proper shader."

          Both the S_SETPC and S_BUFFER_LOAD_DWORDX16 instructions come from the scalar processor within each GCN CU and are not directly exposed by DX and GL. For the above test case, AMD GCN doesn't need the embedded ARM CPU nor the host x86 CPU.
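
          For reference, Feature Levels carried straight into the D3D12 API as it shipped; a minimal sketch of querying a device's highest supported level (real structures and calls, error handling omitted):

          ```cpp
          #include <d3d12.h>

          // Report the highest feature level this D3D12 device supports.
          D3D_FEATURE_LEVEL MaxFeatureLevel(ID3D12Device* device)
          {
              static const D3D_FEATURE_LEVEL candidates[] = {
                  D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
                  D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
              };
              D3D12_FEATURE_DATA_FEATURE_LEVELS info = {};
              info.NumFeatureLevels        = _countof(candidates);
              info.pFeatureLevelsRequested = candidates;
              device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                          &info, sizeof(info));
              return info.MaxSupportedFeatureLevel;
          }
          ```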

    • SCR250
    • 6 years ago

    Question for Cyril to ask Microsoft.

    With DX12 installed on a PC, will any of the changes in DX12 improve DX11 games, or are the improvements only for DX12 games?

    "Microsoft showed DirectX 11 and DirectX 12 versions of the latest 3DMark release running on a Core i7-4770K-powered system. In the DX11 version, most of the CPU load was on a single thread, and the other cores were underused. In the DX12 version, workload distribution was even across the cores, and overall CPU utilization was down 50%."

    The above is confusing. Is there a special DX12 version of 3DMark, or was the DX11 version of 3DMark run on the DX12 system, where it gained multithreading and 50% lower CPU usage? If it was the latter, that implies there should be improvements to existing DX11 games.

      • strange_brew
      • 6 years ago

      I believe they worked with Futuremark to recompile it for DX12.

      • Cyril
      • 6 years ago

      Yes, it was a special build. To take advantage of DX12, developers need to do some amount of rewriting for the lower-level abstraction. Same deal as with Mantle.
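
      To give a rough flavor of that rewriting in the D3D12 API as it eventually shipped (illustrative only, not the actual 3DMark code; both helper functions are invented): scattered per-draw state calls in D3D11 become a pipeline state object baked once up front in D3D12.

      ```cpp
      #include <d3d11.h>
      #include <d3d12.h>

      // D3D11 style: mutable state set piecemeal on the immediate context per draw.
      void DrawD3D11(ID3D11DeviceContext* ctx, ID3D11VertexShader* vs,
                     ID3D11PixelShader* ps, ID3D11BlendState* blend, UINT indexCount)
      {
          ctx->VSSetShader(vs, nullptr, 0);
          ctx->PSSetShader(ps, nullptr, 0);
          ctx->OMSetBlendState(blend, nullptr, 0xffffffff);
          ctx->DrawIndexed(indexCount, 0, 0);
      }

      // D3D12 style: the same state baked at load time into a single PSO,
      // so the draw-time cost is one bind plus the draw itself.
      void DrawD3D12(ID3D12GraphicsCommandList* cmdList, ID3D12PipelineState* pso,
                     UINT indexCount)
      {
          cmdList->SetPipelineState(pso);
          cmdList->DrawIndexedInstanced(indexCount, 1, 0, 0, 0);
      }
      ```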

        • nanoflower
        • 6 years ago

        Did any of the developers talk about how much of an effort it was to take advantage of DX12 over DX11/10? How does it compare to the work necessary to support Mantle? I would think there would be less work necessary to support DX12, but how much effort does it take? Can they quickly take advantage of the ability to better spread the workload over multiple cores? That would be great if they can do that quickly, even if it means they have to skip some of the nicer DX12 features, since better usage of cores is obviously going to help most games if you have a decent GPU.

        Edit: Looks like about four man-months to do a port from DirectX 11 to DirectX 12, from what I read over on Reddit. Of course some games may take more effort and others less, but still, it doesn’t seem particularly onerous given the apparent benefits of DirectX 12 (the lower CPU overhead and ability to better use multiple cores seeming to be the biggest wins).

    • The Egg
    • 6 years ago

    “Works Across All Microsoft Platforms”

    Does that include Windows 7? Because DirectX 11.2 is currently Windows 8 only.

      • nanoflower
      • 6 years ago

      Doubtful. Win7 is an older version of the Windows OS. The question is whether DX12 will work on Windows 8.1 or will require Windows 9. They clearly have it working on Windows 8/8.1, since developers have drivers they can test with now.

        • HisDivineOrder
        • 6 years ago

        Windows 9 is just Windows 8.1 Update 3.

        So that’s not saying much because we know how Microsoft LOVES to rebrand their OS to sell it anew and also LOVES to tie new DirectX releases to new Windows releases.

          • nanoflower
          • 6 years ago

          It may be that Windows 9 won’t be that different from Windows 8/8.1 but I think Microsoft will be charging money to upgrade to Windows 9 from Windows 8/8.1. That is going to lead to some fragmentation of the market, with some people sticking to Win8/8.1, unless Microsoft again makes the upgrade price very low.

    • anotherengineer
    • 6 years ago

    “workload distribution was even across the cores, and overall CPU utilization was down 50%.”

    Save me a CPU upgrade, maybe 🙂

    Well, it’s good to see some work going into the software, as it’s lagging behind the hardware by almost a decade in some cases.

      • indeego
      • 6 years ago

      Do you have any games that are currently severely CPU limited?

        • Pwnstar
        • 6 years ago

        Civ 5.

        TERA with 400 players on screen.

          • Klimax
          • 6 years ago

          If run on NVidia’s HW, I suggest looking into the TargetJobSize option in the config file. It can have a massive influence on performance.

          • LostCat
          • 6 years ago

          Does TERA even use DX11 yet?

          • indeego
          • 6 years ago

          TERA has a *recommended* CPU from 2008! By the time this DirectX comes out, modern CPUs will be capable of about 10x the performance.

            • Pwnstar
            • 6 years ago

            I don’t think you’ve actually played the game. The CPU goes to 100% with a few hundred players on-screen.

            That recommendation obviously doesn’t include the end-game content. Most of the rest of the game has only a dozen models onscreen.

        • Ninjitsu
        • 6 years ago

        Arma 3, FreeSpace 2 SCP, Total War: Rome II, Borderlands 2 (seems to be, even at 1080p), Planetside 2, Far Cry 3 in some parts.

        I have a Core 2 Quad.

        • UnfriendlyFire
        • 6 years ago

        Starcraft 2… Wargame: AirLand Battle with 10v10 players (5 FPS on a 1.6 GHz quad-core i7-720QM)… Planetside 2…

        AMD’s APUs should benefit, especially mobile Trinity/Richland APUs, which have a locked CPU but an OC-able GPU.

          • indeego
          • 6 years ago

          A lot of you are posting… well, quite old CPUs; even if SOFTWARE reduced the CPU load by half (best case), you still wouldn’t be up to par with mid-range CPUs of today. I was kinda looking for examples of modern CPU/GPU combinations being CPU-limited in games.

          I know it’s possible, I just don’t think it’s a problem for the industry right now. I play games and run 20-30 programs in the background. This is 2014, folks.

            • UnfriendlyFire
            • 6 years ago

            Even if I was running a first gen i7 at 4.8 GHz, I would probably get around 15 FPS in a W:AB 10v10.

            • Marshal
            • 6 years ago

            Dude, indeego’s right. Take your Lenovo ThinkPad out back, shoot it, then get an actual gaming rig. That processor is junk for gaming, and chances are the graphics card paired with it ain’t so hot either. My 3-year-old setup with an AMD Phenom 3.2 GHz hex-core and 5830 graphics card would still run Crysis 2 at 30 FPS or better.

            • sweatshopking
            • 6 years ago

            Civ V is still a monster. A huge map with 30 civs takes like 1.5 minutes for AI turns by turn 400. It’s unplayable. On my Phenom II 940 it just crashes to the desktop, but faster CPUs take a looonnnggg time.

          • indeego
          • 6 years ago

          http://www.youtube.com/watch?v=HZeFvoFa7RQ

          What am I missing here? Seems to run fine...

          • derFunkenstein
          • 6 years ago

          Ever since the engine update in StarCraft II it’s taken a lot more CPU power, since it does physics on every single unit, plus foliage and such. My CPU (i5 3570K OC’d to 4.5) can handle it but the i3 2100 I had before it really couldn’t.

      • ptsant
      • 6 years ago

      If this is true, maybe my FX-8350 will prove much more useful in the near future. Nicely multithreaded software has been really late in coming.

    • Ryu Connor
    • 6 years ago

    So they actively demonstrated DX12 in software and drivers at the conference?

    Seems Microsoft has been working on this since before Mantle.

      • cynan
      • 6 years ago

      Oh, I dunno. Mantle’s been in the media for at least six months. Which probably means industry insiders had inklings for even longer. Is that not long enough to scrape together some demo drivers?

      And MS probably always had their finger in the DX12 pie. It just probably took Mantle to make it a priority – a reality within any reasonable time frame.

        • Ryu Connor
        • 6 years ago

        I’m presuming they opened with a development bible that detailed the needs and wants of DX12. They would also have to determine what hardware could support those needs and wants. Breaking from the past, this article implies they mapped most (perhaps all?) of the features back to existing hardware logic. That bible could take anywhere from a month to six months depending on the scope of the project.

        Microsoft’s increased focus on security development would have also included white boarding out the logical design of the new features and evaluating if these changes open up a potential vulnerability.

        The work load would then need to be distributed out by the project lead and the individual developers would finally get to programming. Their code as it evolved would have to undergo at minimum QA testing (fuzzing for example) and peer review. IIRC I’ve read an article detailing that the commit system that Microsoft uses also applies some heuristics to code being checked in and looks for blatant input validation flaws.

        Once the development team got to a point where they were ready to share a product (not an idea, but a product, even if it might be alpha) with their partners, they would need to pull them in, and those companies would effectively need to do the same work listed above. I suppose it’s possible companies like Futuremark cut corners and that the DX12 version is a separate development-only branch that didn’t undergo the same level of scrutiny written above. Legal is also likely engaged at this stage (NDAs) and could cause some slowdown within the typical corporate bureaucracy.

        You also have to expect some back and forth between these companies as bugs are uncovered.

        Since 3DMark is worthless without supporting drivers, that would be a step in this as well. Pull in NVIDIA, Intel, or AMD and get them to dedicate valuable resources to this beta (alpha) project instead of their mainline driver development. It’s possible they have someone within the driver group designated “research” who got the task, leaving the production driver team free and clear.

        Just as above, you have to expect some back and forth between these companies as bugs are uncovered.

        I see months of work here and that doesn’t even include how long they’ve been reaching out to developers to get feedback to help them flesh out the development document into needs and wants.

        I suppose one can’t rule out cowboy programming. Everyone involved could be coding first and asking questions later, but I doubt it.

          • cynan
          • 6 years ago

          Perhaps one possibility is that much of this core code was already done for the Xbone running on GCN AMD hardware, with the last few months largely consisting of packaging it in the DX API and getting feedback from certain core developers (e.g., Futuremark).

          Just because MS has confirmed that current Xbone games don’t use Mantle-esque techniques, and are essentially based on DX 11.x APIs, doesn’t mean there hadn’t been development looking to squeeze more performance out of the Xbone for future releases.

          In the end, I think we’re almost arriving at the same thing. Just that MS may not have been in any particular rush to export these developments to PC gaming without the push from Mantle.

          • Klimax
          • 6 years ago

          Did you say cowboy programming?
          https://twitter.com/fearthecowboy 😀

        • HisDivineOrder
        • 6 years ago

        I doubt that. Mantle was no threat to DirectX. Mantle was nothing. Seriously. Look at it unbiased. It’s an API supported by only one GPU maker, on only the cards it has made in the last two years and the APUs it has made in the last three-ish months. It isn’t supported by nVidia or Intel. It isn’t supported even by Qualcomm, which is AMD reborn.

        So you think Microsoft became a-feared of an API with such a limited audience? Oh, I bet Microsoft was scared straight by Mantle… yeah…

        …no.

        OpenGL scared Microsoft. For the first time in recent memory, developers are really taking a hard, long look at OpenGL due to its superior cross-compatibility. If a developer is going to make an OpenGL version of a game to take to SteamOS, Linux, Android, OSX, or iOS, then why not consider just making the game in OpenGL to start with?

        DirectX keeps Microsoft’s stranglehold on PC gaming intact. A threat to DirectX is a threat to their hold on PC gaming which is a threat to their hold on the enthusiast PC bracket. And that is a threat to all the users who rely on those enthusiasts for recommendations of software. Where the enthusiast/gamer goes, the casuals often follow belatedly.

        So riddle me this, Batman:

        Which is more likely? That an API with a limited audience of just one GPU maker’s most recent discrete cards and its last three months’ worth of APUs, with a series of delayed post-release patches to Mantle, is scaring Microsoft into a reactionary DirectX release…

        OR

        …the first real threat to DirectX from OpenGL in almost twenty years on the PC platform is turning Microsoft’s heads back to PC gaming, because Xbox is no longer the sure thing it once seemed and suddenly Xbox appears to be at its weakest since before the 360. Losing Xbox or PC gaming/Windows dominance of the enthusiast individually would be bad. Losing both simultaneously would be catastrophic. They already failed to make even an impact on the tablet and smartphone worlds with gaming or anything else.

        So yeah. Mantle’s “improvements” were inspired by DirectX 12, not the other way around. AMD aped what Microsoft did for Xbox One to get something out quick ‘n dirty as a response to nVidia’s GameWorks initiative. If you want more evidence of this, look at how delayed Mantle has been from start to finish. Not only were BF4 and Thief 4 delayed in their implementation of Mantle, but even news of Mantle was delayed. As though it were being finished hurriedly.

        Given the bugs at the initial release of the first game using Mantle (some two months after it was supposedly to come), I’d say they didn’t quite finish it in time.

        But notice how they were in such a rush to get it ahead of March. Why? Because they KNEW the real deal, DirectX 12, would show up and rain on their parade. Meanwhile, Microsoft’s not worried about the cheap copy knockoff Glide wannabe. They’re focused on their old nemesis come back from the dead with new allies right when Microsoft most needs a rest break…

          • Ninjitsu
          • 6 years ago

          I think you and Ryu have summed it up perfectly. It’s like Nvidia coming out with Gsync before the DP standard is updated…

          • LastQuestion
          • 6 years ago

          If AMD had foreknowledge of DirectX 12 improvements, and Mantle is “inspired” by them, why focus so much time and money on Mantle? Their h/w would already run on DirectX 12, so why bother?

          Moreover, if OpenGL was the true threat, why spend all those resources developing Mantle instead of devoting them towards improving AMD performance for OpenGL?

          AMD’s motivation for Mantle must be to sell h/w, but how? With foreknowledge of DirectX 12, and OpenGL becoming a threat, it would be a high-risk investment with, at best, a short-term return.

          More likely, in my mind, is that Mantle is about ensuring AMD can profit from the Windows 7 install base regardless of whether or not MS releases DirectX 12 for Windows 7. Furthermore, their efforts in developing a low-overhead API, and announcing it before MS, might serve to spur the interest of other parties, such as those working on OpenGL.

          So, let’s assume Mantle is in response to DX12. Left to itself, MS will probably only release DX12 on W8/9. The cost of buying a Windows license reduces the amount of funds available for upgrading hardware. AMD wants to sell hardware. So, developing an API ensures the dominant platform remains competitive, on a performance level, with those running DX12.

          Nvidia has to respond. It would be unlikely they would develop their own API. I don’t think anyone expected them to rely on AMD’s Mantle either. So, they either deal with platform fragmentation and reduced sales or embrace OpenGL.

          Meanwhile, developers are none-too-interested in platform fragmentation from a new DX tied to a new OS requiring them to support multiple APIs. So, they start developing engines that support Mantle and looking seriously at OpenGL. Money pours in, talent gets to work, and a threat is born.

        • joselillo_25
        • 6 years ago

        There are games running Mantle, and some of the biggest engines have been updated with Mantle support. That is vaporware to some, but a PowerPoint with a “more direct than ever” sentence and a release date nearly 2 years out is the holy grail of gaming.

          • Klimax
          • 6 years ago

          Those games and engines only use it to work on AMD’s hardware, no other reason. As for your vaporware assertion, it is beyond wrong and never was correct for DirectX.

      • ptsant
      • 6 years ago

      Or they agreed with AMD to integrate the work that had been done on Mantle. It’s much easier to “innovate” when someone else has already paved the road. Obviously, Microsoft would never admit to actually copying Mantle’s design, but I don’t see why AMD wouldn’t sell or at least reveal their secrets, especially since they have said it’s supposed to become an open protocol.

      Adopting Mantle and calling it DX12 is a plausible alternative to your theory.

        • Klimax
        • 6 years ago

        Not in the least. Your entire assertion is wrong from start to finish.

      • encia
      • 6 years ago

      One problem: AMD’s PR has claimed “FULL DirectX 12 compatibility” for their current GCNs.

      From https://www.amd.com/us/press-releases/Pages/amd-demonstrates-2014mar20.aspx

      "Full DirectX 12 compatibility promised for the award-winning Graphics Core Next architecture" - AMD.

      NVIDIA has yet to claim "FULL DirectX 12 compatibility". To claim "FULL DirectX 12 compatibility" for AMD's current GCNs, AMD basically claims DirectX 12 ~= AMD GCN.

    • DPete27
    • 6 years ago

    "Microsoft said DirectX 12 will premiere in next year's crop of holiday-season games."

    1.5 YEARS OUT YET?!?!?! You've got to be kidding me. This needed to happen 3 years AGO. Surely it shouldn't take any self-respecting company that long to reverse engineer Mantle (https://techreport.com/news/26184/nvidia-gameworks-meets-warface-in-clash-of-weirdo-names)... unless you've pulled all your funding out of gaming support in abandonment...

      • puppetworx
      • 6 years ago

      They said that developers already have drivers and that ‘early access’ would be available. It’s not clear what form the early access will take.

      • Concupiscence
      • 6 years ago

      Hell, I still remember Microsoft talking up how DirectX *10* was going to reduce CPU overhead from draw calls and shine a brave new light forward for PC gaming. Boy, that was really something, wasn't it?

        • sweatshopking
        • 6 years ago

        It does. AMD doesn’t support most of the CPU reduction in their drivers, so most developers don’t bother. It’s not this good, but it is there. The failure is on the driver companies.

          • HisDivineOrder
          • 6 years ago

          AMD made Mantle to fix that. I mean, fixing their drivers is a lot harder than making a whole new low-level access API to push the burden off onto game developers.

          Duh? 😉

          Kinda reminds me of certain politicians. They’ll block anything and everything, then complain when things are going to hell because nothing is getting done and use that as motivation to push their own agenda instead.

          Say what you like about nVidia, but they DO support multithreaded CPU usage far better than AMD and have for some time now. Long enough that it becomes questionable how committed AMD is to truly updating their drivers…

      • Billstevens
      • 6 years ago

      Yeah, these all sound like pretty logical steps to take for a graphics API in a world of multi-core processors and powerful GPUs… But now that they are finally pushing a new API, new games have to be built with it, so yeah… 1-2 years….

      DirectX work probably got shelved in the midst of Microsoft’s crap over mobile, Surface, Windows 8, and the Xbone. They probably didn’t see much purpose in updating their APIs until SteamOS and Mantle started getting some attention.

        • End User
        • 6 years ago

        Competition is good.

    • ninjagai
    • 6 years ago

    asdf

      • bthylafh
      • 6 years ago

      http://lmgtfy.com/?q=dictionary

    • JuniperLE
    • 6 years ago

    so my DX11 card (5850) is not going to support it?
    And Fermi can support it; AMD’s support for older cards seems to be a lot worse… like they moved their DX10 GPUs to “legacy” status 2 years before Nvidia 🙁

      • USAFTW
      • 6 years ago

      You have to think about that next time you buy an AMD card. What if you buy a 290X now? Will the support you get for it over time be as good, or at least as lengthy, as nVidia’s would be for a 780?

      • windwalker
      • 6 years ago

      Won’t you want to upgrade your card before the end of 2015?

        • JuniperLE
        • 6 years ago

        probably not, but why the question? A 5850 still beats slower cards with DX12/Mantle support in most games (thinking about stuff like an R7 250 and so on), so it should be somewhat relevant, and it’s the card I have and use for gaming… The fact is, a GTX 460, which is not much newer or faster/slower, is going to have support and my 5850 is not. You really can’t understand why I can’t feel positive about that?

          • USAFTW
          • 6 years ago

          That’s the point I’m trying to make, and I’m getting downvoted for it in every single comment.

            • [TR]
            • 6 years ago

            While you are right, if there’s a 2-year gap in support between AMD and nvidia, late 2015 is almost 2 years away. It stands to reason that even nvidia could be dropping support for some cards that could do DX12.
            Either way, announcing this *now* for use *then* adds really little to existing cards.

            • JuniperLE
            • 6 years ago

            I agree that it’s a long way off, but the information I have is: nvidia announced support for Fermi (which I understand as the GTX 480 and newer, like the 460, 580, and so on), and AMD announced GCN support (which I understand as the 7970 and newer, like the 7850, 290X, and so on). Fermi spent most of its life competing with VLIW5 and VLIW4 GPUs.

          • windwalker
          • 6 years ago

          If you indeed intend to keep using your 5850 well after the release of DirectX 12, I understand your position.
          But if you’re just sore because the 460 will be supported and your card won’t, that’s silly.

          The support cut-off is based on internal architecture (GCN), not level of performance.
          It doesn’t seem to me that AMD is refusing to support it just to save development costs, but because of hardware compatibility.

            • Voldenuit
            • 6 years ago

            "The support cut-off is based on internal architecture (GCN), not level of performance."

            Shouldn't the cutoff be based on technical capability rather than internal architecture? If two generations of video cards have the same or similar technical capabilities (yes, I know there were some updates to DX11.x after VLIW5/VLIW4), then they should both be technically capable of being supported. You might need to have two driver teams, which may be a sticking point with 'AMD Driver Guy'. Saying "we won't support our older architecture" then sounds a lot more like "we can't be bothered" rather than "it's not possible".

            • windwalker
            • 6 years ago

            Considering the new APIs are all about lower level access it seems reasonable that they depend more on architecture.
            Previous versions were indeed focused on exposing capabilities in an architecture independent way and the price for that is exactly the complexity and indirection that the new APIs are trying to optimize away.

            • JuniperLE
            • 6 years ago

            Yes, I intend to keep using this card for as long as possible;
            the comparison with the GTX 460 is not silly, it’s an example of a competing product.

            I think it’s based on costs. The 5850 has full DX11.0 support, and if DX12 is compatible with Fermi (also DX11.0), I don’t see what is missing for the 5850 here? Obviously VLIW5 and GCN would require specific work for each one.

            I can understand that AMD is trying to save some money/time by only supporting more recent products, but it doesn’t make me feel positive about it if nvidia can support their older architecture.

          • gamoniac
          • 6 years ago

          I think if you can squeeze six years out of a card, you have got your money’s worth, although I think you totally have the right to complain (not being sarcastic).

      • Mat3
      • 6 years ago

      I got a 6870 way back in mid-2011 (only $150 or so). By the time these DX12 games come out, it’ll be about time for a new card anyways. If you have a 5000-series card, it’s even older than my 6870. It’s OK!

        • JuniperLE
        • 6 years ago

        Who cares if it’s older? With a small OC it’s as fast as the 6870 and supports the same features.
        And the competing $150 card from mid-2011 (GTX 460?) is going to support DX12.

      • sschaem
      • 6 years ago

      Isn’t the 5850 from 2009?
      And DX12 will be released in 2015… My guess is it won’t have much traction until 2016.

      And DX12 won’t expose any feature the 5850 has that is not already exposed in DX11.

        • Voldenuit
        • 6 years ago

        "And Dx12 won't expose any feature the 5850 got that is not already exposed in dx11."

        What features will DX12 expose on GCN? The selling point of this update seems to be reducing the API overhead.

        AMD is already losing market share thanks to the miners. The last thing it should be doing is pissing off loyal customers trying to decide which card to buy next.

          • sschaem
          • 6 years ago

          The Nvidia GTX 280 won’t be supported, and it was sold around the same time as the 5850…

          So should GTX 280 owners ditch nvidia for AMD?

          The point would be more valid if the OP were talking about a 6870, because those were sold in late 2010 and 2011.

          But having a 2010 card run games in 2016 using DX11 instead of DX12 is not piss in the wind.
          Want to be pissed? Ask Microsoft why DX11.2 won’t even run on Windows 7.

            • JuniperLE
            • 6 years ago

            The GTX 280 is from 2008; it’s a DX10.0 card that can’t even run Crysis 3, and it competed for most of its life with the HD 4800 series.

            The HD 5850 is DX11.0. Nvidia released their first DX11.0 cards (Fermi) a little (months) later, but Fermi for almost its entire existence competed with VLIW5 and VLIW4 DX11.0 cards from AMD (like the 5800s, 6800s, 6900s). The 6870 uses basically the same architecture as the 5800s (and it’s slower than a 5870), so you can extend my argument to that card…

            DX12 doesn’t seem to require hardware any different from DX11.0, so that’s why I’m not happy about it, and why the GTX 280 is below the requirements anyway.

          • encia
          • 6 years ago

          “AMD is already losing market share thanks to the miners” is hypocritical when NVIDIA has CUDA-based non-gaming app markets.

          From PC Perspective, click on http://cdnmo.coveritlive.com/media/image/201403/thumb900_phpsvgfplp1010612.jpg

          D3D12 goes beyond "nearly zero D3D resource overhead".

            • Voldenuit
            • 6 years ago

            I wasn’t talking about corporate tactics, merely that the price inflation from cryptocoin miners is already driving gamers away from buying AMD cards:

            http://www.jonpeddie.com/publications/market_watch/

            The mining boom is something beyond AMD's control, but if they were hoping to keep their existing customers currently on 5xxx and 6xxx cards, it would have been a *beau geste* to promise DX12 API (if not feature) support for VLIWx.

      • l33t-g4m3r
      • 6 years ago

      I’d say their DX11 6xxx cards are legacy too. They no longer care about supporting any cards or OSes other than the current gen.

      • HisDivineOrder
      • 6 years ago

      Didn’t you know?

      AMD makes cards obsolete faster than nVidia. Waaay faster. Years faster. nVidia only JUST took its DirectX 10 cards off the “new updates” list for their UPCOMING drivers. AMD killed off support for its equivalent cards two years ago.

      It’s pretty much been a known thing that if you want longterm support for your cards with new updates that are actually relevant to them, you stick with nVidia.

      AMD likes to save a buck wherever they can and one of those places is reduced support for their products beyond their time in the spotlight as the premier product.

        • BestJinjo
        • 6 years ago

        Nice theoretical discussion, but by holiday season 2015, cards like the HD5850, GTX460, and even the GTX570-580 with their crippled 1.28-1.5GB of VRAM will not be sufficient for next-generation gaming. It’s a ridiculous notion that playing 2015 DX12 games on a 2009 HD5850/5870 (6-year-old GPUs!) should somehow be expected to be the norm. Your argument would be far more convincing if you said that the GTX400/500 series are more futureproof (even though there is no such thing) for their tessellation performance vs. the HD5000/6000 series, but instead you mention DX12 support, which will be largely irrelevant for such old GPUs.

        Further, you still haven’t grasped the point of GPU upgrading cycles. The GPU industry’s planned obsolescence is good. It means PC game developers can focus on the latest features and don’t need to develop for the lowest common denominator. New GPU purchases ensure that NV/AMD use the profits for R&D on next-generation tech. Moves to DX12 around Skylake and 20nm GPUs spur system rebuilds or serious overhauls. 5+ year old GPUs becoming paperweights is actually very positive for PC gaming, not negative as you spin it. A knowledgeable PC gamer also accounts for this and should be smarter than to buy a $500 GTX480/580 and keep it for 5-6 years, when a $250 HD7870 just 1.5 years after the 580’s launch, and now a $150 GTX750Ti, is trading blows with that $500 flagship.

        http://gamegpu.ru/images/remote/http--www.gamegpu.ru-images-stories-Test_GPU-Action-Titanfall_-test-1920_i.jpg

        Let’s not forget that those who bought the HD5850/5870/6950/6970 paid far lower prices than for NV’s performance equivalents, which means they saved $80-130 upfront that can be used to upgrade to a Maxwell/20nm AMD GPU. You missed all of these points. One would also hope there aren’t many people dumb enough to buy a $700 GTX780Ti and keep it for 5-6 years, as such upgrading strategies have proven themselves worthless over the last 20 years of GPU development.

          • JuniperLE
          • 6 years ago

          Games allow you to adjust settings; it’s impossible for you to say that 1GB cards will not run 2015 games.
          6 years old by the end of 2015, OK, but an 8800 GTX was 6 years old by the end of 2012, and it could still handle most games. Obviously it didn’t have the hardware features for DX11 or whatever; I’m just saying that things have changed. With long console life cycles GPUs stay relevant for a long period, and DX12 is not going to be irrelevant to any GPU if it’s a lower-overhead API; that’s a fact.
          Even the Haswell IGP, which can’t play games even as well as a 5670, is going to support it…
          Tessellation is not THE limiting factor for many current games…

          About your obsolescence talk: it’s simply irrelevant when Nvidia is apparently going to support their 2010 GPUs, and AMD is not going to support high-end cards they were selling at the end of 2011 (not to mention the 6670s and Richland APUs sold until recently and based on VLIW5 and VLIW4).

          So if you bought a 6800K 4 months ago you are not getting DX12 support, but if you had bought a Haswell i3 for the same money you would be… which is funny, because the 6800K IGP is much faster. AMD is just not interested in supporting anything older than GCN, even if it has the hardware features appropriate for DX12 and would benefit from it.

        • encia
        • 6 years ago

        No, AMD focused on “FULL DirectX 12 compatibility” for their current GCNs.

        From https://www.amd.com/us/press-releases/Pages/amd-demonstrates-2014mar20.aspx

        "Full DirectX 12 compatibility promised for the award-winning Graphics Core Next architecture" - AMD.

        AMD's PR has claimed "FULL DirectX 12 compatibility" for their current GCNs. NVIDIA has yet to claim "FULL DirectX 12 compatibility".

      • UnfriendlyFire
      • 6 years ago

      Are you going to use that GPU past 2015? Or 2016, when a large number of games start supporting DX12?

      • ptsant
      • 6 years ago

      At least for the moment, the 5850 is fully supported, and it is 3 generations behind. I know, because I bought one in 2009 and today it is 2014. Obviously, everyone would want the longest support possible, but 5 years for a GPU seems decent, especially since no DX12 game has yet shipped.

      On the other hand, I would really like to keep my Windows 7. I suspect that DX12 will be Win8+ only. This bothers me much more, since I have no other major reason to upgrade my system.

      • jihadjoe
      • 6 years ago

      Probably more to do with how the chips themselves are architected, than outright malice or neglect.

      The 5000 and 6000 series are still essentially VLIW designs, whereas Fermi, from the GTX 400 series onwards, is already a compute design. My guess is that DX12, like Mantle, is geared toward taking advantage of the newer, slightly more general-purpose, compute-centric GPUs to offload certain tasks from the CPU.

    • USAFTW
    • 6 years ago

    So nVidia claims that its post-2010 GPUs will support this API, and AMD says it’ll be supported by just its GCN parts. It seems those of us who had invested in 6000-series and 5000-series AMD cards were mistaken.

      • Airmantharp
      • 6 years ago

      We were.

      • [TR]
      • 6 years ago

      What does it matter? Games in “late 2015”. One and a half years or more away, if that estimate is to be believed.
      If you buy a decent card now, by then you might want to upgrade if you didn’t before. If you have an older card, I don’t know that it will last you that long without some hits to graphical quality.
      Maybe it will allow you a slight bump up in quality when it arrives, but you’ll be salivating over the new and shiny DX12 GPUs anyway.

      It’s not all bad, though. It’s great that an API can change like this and have an impact on existing hardware.

        • USAFTW
        • 6 years ago

        Yeah, it’s true that at least a lot of current and previous GPUs will support it. But it rings some bells in my mind. In my other comment, I said that nVidia’s support for older GPUs is longer and probably more prominent than AMD’s.

      • HisDivineOrder
      • 6 years ago

      You don’t buy an AMD card expecting long-term support. They’ve been like this ever since AMD bought ATI. Maybe even before.

      With nVidia cards, I remember having new GeForce drivers to update to for all my old cards, for years; with AMD, I always had to struggle to find new drivers once a card lost its luster in the eyes of AMD’s executives. After that, I actually decided to buy nVidia cards at least partly BECAUSE I want long-term support for my expensive (for me) video card purchases, no matter how old they are.

        • BestJinjo
        • 6 years ago

        The flaw in this reasoning is that neither NV’s nor AMD’s 5+ year old cards will run DX12 games at acceptable fps/settings. By the time DX12 games come out, a $199 GPU will mop the floor with a GTX 480. Heck, a $150 GTX 750 Ti is already as fast as the 480 and uses less than 60W. Long-term GPU support is far more critical in the mobile/laptop segment, where upgrading the GPU is difficult or often impossible. In the desktop space, any GPU 5 years or older is a paperweight anyway.

        As 4K monitors drop in price and 2560×1440 becomes even more affordable, even 7970/R9 280X/680/770 GPUs will be too slow by the time DX12 games come out. The HD 7970 itself will be nearly 3 years old by holiday 2015, and by 2016 we should have Volta, too, which would make cards like the 680/7970 ancient and the Maxwell architecture 2 years old!

      • Klimax
      • 6 years ago

      For AMD, that would mean two drivers in one, because the hardware cores are far too different, right down to their basic structure.

      ETA: Remember VLIW versus GCN’s scalar instructions.
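
      To illustrate (with a toy model of my own, not anything from AMD’s actual drivers): a VLIW core depends on the driver’s shader compiler to statically pack independent operations into wide bundles, while a GCN-style scalar core just issues one operation per cycle and doesn’t care. A toy scheduler makes the gap obvious:

      [code<]
      #include <cstdio>
      #include <vector>

      // Each op depends on at most one earlier op; -1 means no dependency.

      // Scalar (GCN-style) model: one op issues per cycle, in order.
      int scalarCycles(const std::vector<int>& deps) {
          return static_cast<int>(deps.size());
      }

      // VLIW model with `width` slots per bundle: greedily pack any op whose
      // dependency completed in an *earlier* bundle (static scheduling).
      int vliwCycles(const std::vector<int>& deps, int width) {
          std::vector<int> issuedIn(deps.size(), -1);
          int cycle = 0;
          size_t scheduled = 0;
          while (scheduled < deps.size()) {
              int slots = width;
              for (size_t i = 0; i < deps.size() && slots > 0; ++i) {
                  if (issuedIn[i] != -1) continue;              // already issued
                  int d = deps[i];
                  if (d != -1 && (issuedIn[d] == -1 || issuedIn[d] >= cycle))
                      continue;                                 // dep not ready
                  issuedIn[i] = cycle;
                  --slots;
                  ++scheduled;
              }
              ++cycle;
          }
          return cycle;
      }

      int main() {
          std::vector<int> independent(10, -1);                     // 10 independent ops
          std::vector<int> chain = {-1, 0, 1, 2, 3, 4, 5, 6, 7, 8}; // each needs the last

          std::printf("independent: scalar %d cycles, VLIW5 %d bundles\n",
                      scalarCycles(independent), vliwCycles(independent, 5));
          std::printf("chain:       scalar %d cycles, VLIW5 %d bundles\n",
                      scalarCycles(chain), vliwCycles(chain, 5));
          return 0;
      }
      [/code<]

      Ten independent ops finish in 2 VLIW5 bundles versus 10 scalar cycles, but a dependent chain degrades to the same 10 because nothing can be packed. Real drivers are vastly more complex, but that packing problem is exactly the kind of logic that doesn’t carry over between the two architectures.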

      • flip-mode
      • 6 years ago

      Nonsense. The war between AMD fans and Nvidia fans is so silly. Perhaps you forgot that the 5000 series launched in 2009. Radeon 5000 and 6000 cards were very good buys at the time, and there is really no reason anyone should have expected one of those cards to be updated to DX12. AMD’s GPU architecture has changed much more than Nvidia’s over that time, too. And lastly, a personal opinion: I’m totally in favor of conserving driver development effort. Once a video card hits three, and especially four, years of age, it is totally reasonable for it to be put on the legacy track.

      • Scrotos
      • 6 years ago

      It’s true! nVidia supports its cards longer because it keeps the same core over 3 or 4 generations (the G86 going into the 8600, 9600, and one or two series after that (OEM), if I recall correctly from the last time I looked into it).

      Rebranding = the best guarantee of long-term support. Why, nVidia is supporting the 400, 500, 600, and 700 series, while ATI is supporting the 7000, 8000, and R9 series. That’s only one generation fewer than what nVidia is supporting.

        • swaaye
        • 6 years ago

        Both companies slide previous-generation chips into new lineups. And the GeForce 9600 = G94, not G86.

        Let’s consider the GeForce 6, supported until 2013. Nine nice years, with Windows 8 support: you can use a GeForce 6 on everything from Windows 98 through Windows 8. The Radeon 4890’s support ends at the same OS.

      • ptsant
      • 6 years ago

      Late 2015 is a long way off for a GPU, especially since GCN launched in late 2011, i.e. 4 years earlier. My own upgrade cycle was very, very long during my student years, and I respect the desire to get the maximum out of your investment. Nevertheless, unless you bought the absolute best at the time (a 6990?), I think you’ll be in need of an upgrade either way. Going from a 5850 to a $150 AMD or NVidia card in 2015 is going to be a great upgrade in all possible terms (performance, features, compatibility, noise, etc.).

      That being said, not all games will be DX12 at first, and if you have a higher-end card like a 6950 that can keep up, you can probably hope that developers will include a DX11 or DX10 codepath. This has happened quite often, at least for a transitional period, during previous DX transitions.
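
      For what it’s worth, the engine side of that fallback is straightforward with the existing D3D11 API: you pass D3D11CreateDevice a list of feature levels in order of preference, and it hands back the best one the hardware supports. A minimal sketch (these are real API calls, but the preference list and error handling are just illustrative):

      [code<]
      #include <cstdio>
      #include <d3d11.h> // link with d3d11.lib

      int main() {
          // Try for a DX11-class device first, then fall back to DX10-class.
          const D3D_FEATURE_LEVEL wanted[] = {
              D3D_FEATURE_LEVEL_11_0,
              D3D_FEATURE_LEVEL_10_1,
              D3D_FEATURE_LEVEL_10_0,
          };

          ID3D11Device* device = nullptr;
          ID3D11DeviceContext* context = nullptr;
          D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_10_0;

          HRESULT hr = D3D11CreateDevice(
              nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
              wanted, sizeof(wanted) / sizeof(wanted[0]), D3D11_SDK_VERSION,
              &device, &got, &context);
          if (FAILED(hr)) {
              std::fprintf(stderr, "no DX10-class or better GPU available\n");
              return 1;
          }

          // A renderer would branch on `got` here to pick its codepath.
          std::printf("using feature level 0x%x\n", static_cast<unsigned>(got));
          context->Release();
          device->Release();
          return 0;
      }
      [/code<]

      That’s how a game can ship one binary with DX11 and DX10 codepaths and light up whichever one the installed card supports.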

        • UnfriendlyFire
        • 6 years ago

        Wargame: AirLand Battle supports both the DX9 and DX11 APIs. Not sure if they’re going to keep the DX9 support for the Red Dragon sequel coming out this spring.

      • encia
      • 6 years ago

      AMD has claimed “FULL DirectX 12 compatibility” for its current GCN parts.

      From [url<]https://www.amd.com/us/press-releases/Pages/amd-demonstrates-2014mar20.aspx[/url<]

      "Full DirectX 12 compatibility promised for the award-winning Graphics Core Next architecture" - AMD

      NVIDIA has yet to claim "FULL DirectX 12 compatibility."

    • derFunkenstein
    • 6 years ago

    What are new gaming PCs shipping with that’s older than GCN/Fermi, such that only 80% of them will be able to take advantage of DX12?

      • Concupiscence
      • 6 years ago

      Maybe a veiled reference to all but the newest Intel video hardware?

        • derFunkenstein
        • 6 years ago

        Well, Intel says Haswell is compatible, which I assume includes the Crystal Well models, and I don’t know that I’d call that a “gaming PC” anyway.

      • HisDivineOrder
      • 6 years ago

      Many Intel GPUs are left out in the cold.

      All the more reason for Intel to raise the bar on EVERY GPU it includes with its CPUs, not just a select few.

        • derFunkenstein
        • 6 years ago

        What’s that got to do with the article? It said 80% of NEW gaming PCs. Gaming PCs don’t have unsupported Intel GPUs.

          • derFunkenstein
          • 6 years ago

          Well, they HAVE them, but they’re not USING them, just to clarify. :p

    • Voldenuit
    • 6 years ago

    Windows 8 exclusive?

      • Sargent Duck
      • 6 years ago

      If they did that…someone needs to be fired.

        • UnfriendlyFire
        • 6 years ago

        Windows 7 never got DX11.2, and Vista is still on DX10.

        So most likely, DX12 is going to be Windows 8 and 9 only.

        Hopefully they don’t botch Windows 9.

          • Voldenuit
          • 6 years ago

          XP never got DX10 either, despite its huge installed base compared to Vista.

          I wouldn’t put it past Microsoft to feature-restrict W7 from DX12 as a backhanded means of spurring adoption of their latest [s<]catastrophe[/s<]*cough*product; it’s something they’ve often done historically. But it might backfire on them, especially if, say, SteamOS (and by extension Linux) supported the DX12 feature set* out of the gate.

          * Obviously, it wouldn’t support DX12 itself (except maybe on Wine), but it might support similar capabilities.

          • l33t-g4m3r
          • 6 years ago

          Vista did get updated to DX11, just not 7’s DX11.1. Oh, and since AMD completely quit releasing drivers for Vista while I was still using it, and their drivers suck in more ways than one, especially on performance, they got the finger for my DX11 upgrade, and they’ll get it for future upgrades too.

      • puppetworx
      • 6 years ago

      Windows 9 would be my bet.

      Windows Vista: Jan 2007 (delayed)
      Windows 7: October 2009
      Windows 8: October 2012
      Windows 8.1: October 2013

      Right on schedule.

        • yogibbear
        • 6 years ago

        Windows RT.

          • [TR]
          • 6 years ago

          Well, by then Intel will be bankrupt and we’ll all be using ARM SOCs, anyway. So, yeah, Windows RT 2015 Edition!

            • UnfriendlyFire
            • 6 years ago

            I think AMD would go first.

        • Firestarter
        • 6 years ago

        Right on schedule to not be adopted till 2018

      • Billstevens
      • 6 years ago

      Considering support for Windows 7 is going away soon, yeah, most likely Windows 8+. Your only hope is that they unscrew their next release of Windows.

      Or that Linux drivers come far enough along that you have an alternative.

        • mnemonick
        • 6 years ago

        That’s just [i<]Mainstream[/i<] support, which means (free) phone support, design and feature requests, and warranty support.

        Extended support (i.e. all you really need), which includes security and major bugfix updates, online support, and Knowledge Base support, doesn’t end until [i<]January 14, 2020[/i<].

        [url=http://windows.microsoft.com/en-us/windows/lifecycle<]MS Support fact sheet[/url<]
        [url=http://support.microsoft.com/gp/lifepolicy<]MS Support Policy FAQ[/url<]

        edit: added URLs

          • Billstevens
          • 6 years ago

          Well, then you have hope, but Microsoft tends to see DirectX as another feature it can use to push you to upgrade…

            • mnemonick
            • 6 years ago

            True. I’m not holding my breath for DX12 availability on Windows 7. πŸ˜€

      • HisDivineOrder
      • 6 years ago

      I bet it’ll be Windows 9 exclusive.

        • Voldenuit
        • 6 years ago

        [quote<]I bet it'll be Windows 9 exclusive.[/quote<]

        It wouldn’t surprise me. But this time, MS is in the unique position of worrying about [i<]losing[/i<] its user base instead of trying to persuade locked-in users to upgrade.

        On the gaming front (for which DX is most relevant), we have Steam (which is multiplatform) and SteamOS, and Gog.com has [url=http://www.rockpapershotgun.com/2014/03/19/another-crack-in-windows-gog-lines-up-linux-support/<]announced that it will be supporting Linux[/url<]. Outside of that, we have Mac market share higher than it has been in a long time, Chrome OS, and the bleeding of mainstream PC users into tablets and smartphones.

        And of course, “next year will be the year of Linux,” just in time to meet whatever shenanigans MS might try to pull with DX12. Instead of rolling over for MS, customers actually have to be convinced to stay now.

    • LostCat
    • 6 years ago

    Snicker. (I didn’t say anything…)
