AMD displays first DirectX 11 graphics processor

COMPUTEX — At an event here in Taipei this morning, AMD gave the first public glimpse of working DirectX 11 hardware. AMD’s Rick Bergman and TSMC’s Rick Tsai displayed a wafer of an upcoming 40nm, DirectX 11 AMD GPU scheduled for release by the end of this year.

(Check out the image gallery below for a higher-res shot of the wafer on its own.)

AMD also showed several demos of DirectX 11 in action. In games, the demos focused on tessellation and the DirectX 11 Compute Shader. The Froblins demo showed both at once: tessellation was used to increase polygon counts for character models, while Compute Shader allowed artificial-intelligence processing to run on the GPU.
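
As a rough illustration only (this is not AMD’s demo code), here is a minimal sketch of driving a DirectX 11 compute shader from C++. The kernel is a trivial stand-in for real work such as the Froblins pathfinding, and RunKernel and its buffer are hypothetical names; the D3D11 and D3DCompile calls are the actual API.

```cpp
// Sketch: compiling and dispatching a trivial D3D11 compute shader.
// Error handling omitted; link against d3d11.lib and d3dcompiler.lib.
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstring>

// Stand-in HLSL kernel: doubles every float in a read/write buffer.
static const char* kKernel = R"(
RWStructuredBuffer<float> data : register(u0);

[numthreads(64, 1, 1)]
void main(uint3 id : SV_DispatchThreadID)
{
    data[id.x] *= 2.0f;
}
)";

void RunKernel(ID3D11Device* dev, ID3D11DeviceContext* ctx,
               ID3D11UnorderedAccessView* dataUav, UINT elementCount)
{
    // Compile for cs_5_0, the DX11 compute-shader profile.
    ID3DBlob* bytecode = nullptr;
    D3DCompile(kKernel, std::strlen(kKernel), nullptr, nullptr, nullptr,
               "main", "cs_5_0", 0, 0, &bytecode, nullptr);

    ID3D11ComputeShader* cs = nullptr;
    dev->CreateComputeShader(bytecode->GetBufferPointer(),
                             bytecode->GetBufferSize(), nullptr, &cs);

    // Bind the output buffer and launch one thread per element,
    // rounded up to whole 64-thread groups.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &dataUav, nullptr);
    ctx->Dispatch((elementCount + 63) / 64, 1, 1);

    cs->Release();
    bytecode->Release();
}
```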

If you’d like to see tessellation in more detail, AMD has put up another demo on YouTube. In essence, tessellation works somewhat like the ATI TruForm technology of old.

DirectX 11 Compute Shaders have uses outside of games, too, much like OpenCL. AMD showed a DX11, GPU-accelerated video transcoding application running on Windows 7.

A software-only transcoder was running on another display to showcase the performance gains of the GPU-accelerated version. Interestingly, though, Windows 7’s Aero graphical interface was disabled on the system running the Compute Shader transcoder.

AMD didn’t announce a specific release time frame for its first DX11 GPU. However, whispers around Computex suggest a launch could take place in the late third quarter—think late September or early October.

Comments closed
    • MadManOriginal
    • 11 years ago

    I’m really looking forward to the ~$100 DX11 part from AMD. Given their recent history at that price point, with the 4830 and then the 4770, it ought to be a killer for price/performance. Maybe we’ll even see a Radeon 9800-like era where a card lasts a good long time; with console-itis in PC games and the next round of consoles not due out for a few years, it could happen. One difference is that this time even fairly budget cards will last.

    • UberGerbil
    • 11 years ago

    I don’t want to get into this, but if you’re going to bat around rumors, the rumor/assertion/conspiracy theory is that DX11 (not 10.1) was what DX10 was supposed to be before nVidia forced MS to dilute it. As described by Mr Anti-nVidia, late of the Inq, here:
    http://www.theinquirer.net/inquirer/news/1137331/a-look-nvidia-gt300-architecture

    DX10.1 indeed includes some of the optimizations that are present in 11, which is why it improves performance (beyond any additional advantages certain hardware might have with 10.1).

      • rythex
      • 11 years ago

      Do you seriously believe that crap on the inq? /rolleyes/

        • UberGerbil
        • 11 years ago

        There’s a debate going on elsewhere in this thread involving this. I posted the link merely so that everyone’s on the same page.
        (I probably should’ve posted it as a reply to that subthread, but hey, I messed up)

    • BrynS
    • 11 years ago

    #50 A key thing to keep in mind is that although many of these incremental API changes appear quite trivial, in many instances they require architectural re-workings that are anything but. At least that’s the impression I get from following some of the chatter on Beyond3D over the past few years.

    ATI was able to transition to 10.1 so quickly because most of the groundwork was laid in R600. Presumably if NVIDIA could have made minor revisions to attain 10.1 compliance they would have done so by now instead of regurgitating the venerable G80 for the past 30+ months.

    Who knows, they could pull another G80 out of the hat for DX11 and it would be about time too.

      • Meadows
      • 11 years ago

      NVidia never adopted Dx 10.1.

        • Flying Fox
        • 11 years ago

        But they are going to. Read the Shortbread of the day yet?

          • Meadows
          • 11 years ago

          Well, if so, then Assassin’s Creed will need to be patched once more, I guess.

            • Flying Fox
            • 11 years ago

            They don’t have to, you can read the comments to the Shortbread.

            • Meadows
            • 11 years ago

            I will, can’t be here 24/7 though.

    • ssway
    • 11 years ago

    Mmmmmmm… imagine those being zombies and this being a survival-action L4D/Dawn of the Dead MMO.

    • billyconnection
    • 11 years ago

    You know what would really piss me the hell off…?

    DX11 exclusively in Windows 7, just like DX10 was for Vista. I want improvements without having to upgrade.

        • stmok
        • 11 years ago

        Line of the day!


          • Grigory
          • 11 years ago

          Awesome! We don’t need you. 🙂

            • stmok
            • 11 years ago

            Who says I was to be needed? 🙂

            • Grigory
            • 11 years ago

            Good point. 🙂

          • shaq_mobile
          • 11 years ago

          sooo if you can’t play games with the newest graphics, you won’t play any games at all? what?

      • JustAnEngineer
      • 11 years ago

      Windows Vista and DirectX 10 introduced a new driver model. DirectX 11 should work fine with Vista.

      The way that graphics drivers are handled under Windows Vista is one of the reasons that buggy NVidia drivers don’t crash the whole OS under Windows Vista like they did under Windows XP.

      • UberGerbil
      • 11 years ago

      …which is precisely why Microsoft isn’t doing it. DX11 will work for Vista as well. They aren’t backporting it to XP, just as they didn’t for DX10, because of the driver model issues JAE explained.

      DX11 won’t even require DX11-class hardware, though because that path will force the CPU to pick up the slack, some DX11 games may not be particularly enjoyable without the new hardware. (But then again, when has that not been the case in PC gaming?)
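
      The mechanism behind that flexibility is D3D11’s “feature levels.” A minimal sketch, assuming a typical fallback order, of creating a device that also accepts DX10-class (and even DX9-class) hardware; the function name is made up, the API calls are real:

      ```cpp
      // Sketch: D3D11 device creation with feature-level fallback. The
      // runtime returns the highest level the installed GPU supports, so
      // one DX11 code path can run on older hardware.
      #include <d3d11.h>

      bool CreateDeviceWithFallback(ID3D11Device** dev,
                                    ID3D11DeviceContext** ctx)
      {
          const D3D_FEATURE_LEVEL requested[] = {
              D3D_FEATURE_LEVEL_11_0, // full DX11: tessellation, cs_5_0
              D3D_FEATURE_LEVEL_10_1, // DX10.1-class parts
              D3D_FEATURE_LEVEL_10_0, // DX10-class parts
              D3D_FEATURE_LEVEL_9_3,  // DX9-class parts, reduced features
          };
          D3D_FEATURE_LEVEL obtained;

          HRESULT hr = D3D11CreateDevice(
              nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
              requested, sizeof(requested) / sizeof(requested[0]),
              D3D11_SDK_VERSION, dev, &obtained, ctx);

          // Anything beyond the obtained level must be skipped or done on
          // the CPU: the "CPU picks up the slack" path mentioned above.
          return SUCCEEDED(hr);
      }
      ```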

      DX10’s big problem was that it was tied to Vista, and Vista didn’t sell well. (It also required new hardware but that wasn’t as big a hurdle because people were buying DX10 hardware anyway since it offered a boost to DX9 apps as well). So game devs were reluctant to do DX10 titles, because the potential market was so small. They also had to get up to speed on DX10 (and, since they certainly weren’t going to be DX10-exclusive, DX10 just added more dev and testing on top of the DX9 code-path schedules they already had). And then there were the driver issues, since a new driver model meant AMD and nVidia were back to square one with stability and optimization.

      With Win7 looking like it is going to be more popular than Vista ever was, and Vista + Win7 representing a common target for DX10/11, and game devs now mostly up to speed with the new API (especially the game engines and libraries so many studios use), and with DX 10.1/11 offering significant performance improvements vs. DX10 in some cases, and AMD and nVidia further along with their drivers, we may finally be at the point where we’ll see tangible benefits from the new APIs.

        • Meadows
        • 11 years ago

        As much as I like nVidia, they were the reason DirectX 10 was retarded, and subsequently, they hurt ATI sales, they hurt Microsoft sales, and…

          • factor
          • 11 years ago

          Can you please cite sources for your info?

          Do you realize how small of a step 10.1 was over 10.0? They added (see http://msdn.microsoft.com/en-us/library/bb694530(VS.85).aspx):

          The big ones:
          - standardized MSAA patterns, with more flexibility in how they interact
          - a new shader instruction: gather4
          - texture cube arrays (in addition to texture1d/2d arrays)
          - 32 IA/VS inputs

          10.0 added: GS (entirely new), texture arrays, huge 8Kx8K textures, many new formats, huge shader limits (instructions, 128 texture stages, 15 CBs), and shader flexibility (RT indexing, vertex/primitive/instance IDs, multiple scissors/viewports).

          So you think nvidia took all the big stuff but decided to punt on those (useful, but minor) 10.1 changes?
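
          To make one of those 10.1 additions concrete: a minimal, hypothetical HLSL fragment (shader model 4.1, held in a C++ string; the resource names are made up) showing what gather4 buys you, i.e. the four texels of a bilinear footprint in one instruction, handy for a cheap four-tap shadow test.

          ```cpp
          // Illustrative HLSL for gather4 (D3D10.1 / shader model 4.1).
          // Gather() returns the four texels a bilinear fetch would touch,
          // replacing four separate Sample/Load calls.
          static const char* kGatherExample = R"(
          Texture2D<float> shadowMap : register(t0);
          SamplerState     pointSamp : register(s0);

          float ShadowFactor(float2 uv, float receiverDepth)
          {
              float4 taps = shadowMap.Gather(pointSamp, uv); // one gather4
              float4 lit  = step(receiverDepth, taps);       // 1 where lit
              return dot(lit, float4(0.25, 0.25, 0.25, 0.25));
          }
          )";
          ```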

            • Meadows
            • 11 years ago

            Probably. I’ve heard this from at least two IT workers as recently as two years ago, and they were unrelated to each other. Now in the past month or so, I’ve seen the Inquirer recite it again, and while I know what a bad source they are, it still takes some truth to fabricate a good lie. And I’m pretty firm in my belief that the part described above was the truth.

            • Fighterpilot
            • 11 years ago

            It might have been a small step, but have you seen how that little change affects some games? Take a look at HAWX running under DX10.1.

            • Lans
            • 11 years ago

            Yea, the list is very short, but there is definite improvement for MSAA (reading back the MSAA depth buffer instead of doing another render pass: better performance; see the sketch after this comment) and MSAA coverage masks (better quality).

            And if it is so trivial, you’ve got to wonder why Nvidia didn’t want to support it.

            Also, DX10 was supposed to be really hard to support on XP (the reason Microsoft didn’t want to support it on XP initially), but the form we see today could easily be supported on XP now (there are a few hacks floating around). Got to wonder about this one too, but there’s no hard evidence Nvidia did anything here (going by public information, as far as I know).

            Sources (googled, as I’ve forgotten where I read it initially):
            http://www.pcgameshardware.com/aid,660528/Exclusive-interview-about-Stormrise-DX-101-support/News/
            http://news.softpedia.com/news/DirectX-10-Vista-Exclusive-47224.shtml
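
            As a sketch of that saved render pass (illustrative HLSL held in a C++ string, not from any shipping game): D3D10.1 lets a multisampled depth buffer be bound as a shader resource and read per sample, where D3D10.0 forced an extra depth-only pass into a second texture.

            ```cpp
            // Illustrative HLSL (D3D10.1): reading individual samples of a
            // 4x MSAA depth buffer directly. Under D3D10.0 this couldn't be
            // bound for reading, so engines re-rendered depth separately.
            static const char* kMsaaDepthRead = R"(
            Texture2DMS<float, 4> depthMS : register(t0);

            float MaxSampleDepth(int2 pixel)
            {
                float d = 0.0;
                [unroll]
                for (int s = 0; s < 4; ++s)
                    d = max(d, depthMS.Load(pixel, s));
                return d;
            }
            )";
            ```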

            • factor
            • 11 years ago

            I completely agree. The features are great, but they’re nowhere near the scale of the DX9-to-D3D10 jump. I don’t know why nvidia didn’t transition earlier. Schedule? Maybe they realized 10.0 adoption was already so low that they didn’t want to spend more time on another discontinuity; I don’t know.

            But I just don’t understand the (baseless) accusations that nvidia actively gimped 10.0 so they could get their part out the door. If that were the case and ATI was ready to go with all the 10.1 support, why weren’t their first 10.0 parts also 10.1 compliant? The sad thing is that most of the people that spout this don’t even realize what 10.0 brought. All they claim is that nvidia made 10.0 crappy and is paying devs to not use 10.1.

            • Lans
            • 11 years ago

            Speculation and slow news days I suppose?

            Just my take on the rumors (so you can ignore the rest if you like).

            I know I ask myself why DX10 wasn’t supported on XP. Then I find “oh, there was a technical reason for it,” then “oh wait, they removed all the technical barriers before releasing it”… News/rumor sites probably hoping it’ll cause a sensation? Then fanbois turn half-truths/rumors into “facts”?

            My personal view is that there is some truth (half-truths, to me) to it, but unfortunately there just isn’t enough evidence… And it doesn’t look like Microsoft/Nvidia/AMD(ATI) is going to try to set the record straight anytime soon, so tough luck for me… 🙁

            I am guessing the rumors about ATI having DX10.1 all ready to go were based on the Radeon HD 3870 (ATI’s first DX10.1 GPU) and the HD 2900 (its first DX10 GPU) being from the same generation and released relatively close together. I guess this one can be proven without word from the horse’s mouth (hack the driver to see how many of the DX10.1 features are broken on the HD 2900? Let’s see what the open-source community digs up).

        • billyconnection
        • 11 years ago

        Thanks for the info, UberGerbil. I’ll take your word for it, as you always seem to know what you’re talking about. What a relief; I hadn’t heard any news of DX11 compatibility until now.

        I always liked Vista, but getting the hardware and driver support in place was a slow transition. I think Vista is more user-friendly now, more than XP ever was. Windows 7, blah.

      • WaltC
      • 11 years ago

      DX10 for Vista supported DX9 hardware without a problem, and I can’t think of a game that runs in DX10 but won’t run at all under DX9…;)

      What changed between XP and Vista was the Windows driver model, which is why XP won’t support DX10 and higher: the newer driver model is incompatible with XP. XP shipped in ’01, and Vista shipped two years ago, btw. Hardly seems like anyone is being “rushed” to upgrade anything these days. Mainly, I think, people bitch because everything isn’t free, which is sort of pathetic. Even lowly consoles are updated every 4-5 years.

    • albundy
    • 11 years ago

    i guess this is why gfx cards are so cheap these days. glad i waited!

    • Code:[M]ayhem
    • 11 years ago

    And DX11 will be an even bigger over-hyped scam than 10!

      • Meadows
      • 11 years ago

      Guess who is to blame for Dx 10.

        • MadManOriginal
        • 11 years ago

        The New World Order?

          • Meadows
          • 11 years ago

          NVidia.

          • 5150
          • 11 years ago

          nWo 4 Life

            • MadManOriginal
            • 11 years ago

            hmm yeah but no, I’m pretty sure you’re thinking of a different NWO than I was.

      • mesyn191
      • 11 years ago

      I thought it was always considered an incremental improvement over DX10/10.1.

        • khands
        • 11 years ago

        This, and DX10/9 cards will still benefit from DX11.

      • asdsa
      • 11 years ago

      Mmm… no. You are confusing this technology with PhysX. I’m at least looking forward to new graphics API standards.

      • Lans
      • 11 years ago

      Maybe, since DX11 sounds like it’s Windows 7 and probably Vista only (though DX11 should be “DX10 compatible”). So in that sense it could be a repeat of DX10.

      At least this time, if you have Vista already, you probably don’t have to get Windows 7 just for DX11, unlike the XP-to-Vista move. I haven’t been paying a whole lot of attention to Windows 7, but I didn’t get the feeling they were touting the GUI update as much as they did Aero Glass…

      Also, I still think Nvidia screwed everyone over with DX10.1, as it definitely showed significant performance gains, especially with fewer render passes required for multi-sampled stuff. And it looks like DX11 is going to inherit those DX10.1 improvements.

        • MadManOriginal
        • 11 years ago

        I think he’s being a bit of a whiny baby. It’s like people expect that the same day a new API is released there will be games to take advantage of it. That’s just not the way it works, and it never has. We only just started seeing games that both looked and performed better with DX10 with Far Cry 2, I believe.

    • CheetoPet
    • 11 years ago

    random thought of the day – Windows Task Manager needs to add a chart for GPU usage. Not that it’s practical, I just like moving bar charts that might mean something.

      • valrandir
      • 11 years ago

      GPU-Z can show and log GPU usage for most modern GPUs. TR should really consider adding it as a monitoring tool; it would be useful to see which tasks are GPU- or CPU-limited.

    • danny e.
    • 11 years ago

    it will be around the time for a whole new system build for me… ’cept I might try to hold off till Jan

    • poulpy
    • 11 years ago

    These videos are pretty nifty, but I’d have liked to see some more of the Cinema technology they demoed after the 4800 release (aka ray tracing on AMD GPUs), especially with Intel talking more and more about Larrabee.

    Can’t find the official presentation, but Google returns quite a few links: http://www.tgdaily.com/content/view/38145/135/

      • Mourmain
      • 11 years ago

      I don’t know, I wasn’t impressed at all. For one thing, I couldn’t…

        • poulpy
        • 11 years ago

        Found the link again: http://www.amd.com/us-en/Corporate/AboutAMD/0,,51_52_15438_15106,00.html?redir=uve001

        Not quite the same old stuff, IMO, as it was real-time ray tracing on an existing GPU (demoed on HD3000 cards), back when everybody and his dog started raving about Larrabee and how awesome ray tracing was. The videos were pretty low-res, run off HD3000 cards, but they had sweet scaling graphs using a single HD4000 and then dual/quad setups.

        Anyway, just my 2 cents, but with Larrabee ever more present, I’d have thought a demo of Cinema 3.0 would have kicked ass, instead of a “TruForm 2 – back with a vengeance” 🙂

    • Game_boy
    • 11 years ago

    It could just be a shrink and refit of the existing HD4xxx series. That would explain the very small die size (estimated at 180mm^2 from images).

    But I don’t see Nvidia’s 40nm anywhere (AMD had it in March), so it might not matter.
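
    For context on that 180mm^2 estimate, here is a back-of-the-envelope gross-die count for a standard 300mm wafer, using the usual dies-per-wafer approximation; it ignores defects and yield, so it only bounds the number of candidate dies.

    ```cpp
    // Gross dies per wafer: pi*r^2/A minus an edge-loss term pi*d/sqrt(2A).
    #include <cmath>
    #include <cstdio>

    int main()
    {
        const double pi       = 3.141592653589793;
        const double wafer_mm = 300.0; // wafer diameter
        const double die_mm2  = 180.0; // estimated die area

        const double r     = wafer_mm / 2.0;
        const double gross = pi * r * r / die_mm2
                           - pi * wafer_mm / std::sqrt(2.0 * die_mm2);

        std::printf("~%.0f candidate dies per wafer\n", gross); // ~343
        return 0;
    }
    ```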

      • MadManOriginal
      • 11 years ago

      Considering the HD4000s were already halfway to DX11 with things like the tessellator, that might not be too far from the truth.

        • khands
        • 11 years ago

        Except the way AMD did it is not the way DX11 wants it done. But yeah, the technologies must have a similar root; it may be as simple as changing a few lines of code.

        • ish718
        • 11 years ago

        Not to mention, DX10.1

          • Meadows
          • 11 years ago

          Dx 10.1 is pretty much what 10 should’ve been, but nVidia thought otherwise.

          • MadManOriginal
          • 11 years ago

          Exactly what I meant, because some of the stuff that’s in 10.1 but not 10.0 is in 11.

      • FuturePastNow
      • 11 years ago

      It’s also possible that this first DX11/40nm GPU will be a mid-range part. AMD might want to perfect it before scaling up the die size.

      Edit: RV740, by comparison, is 137mm^2 with its 640 SPs. So 180mm^2 could pack in a lot more.

        • OneArmedScissor
        • 11 years ago

        I think you are all pretty much right.

        Wasn’t the new ATI strategy to do something like update the present tech every six months or so, as possible? Of course, they could release “something” new at those intervals, but what they mean is that they will keep building off of what they’ve got.

        In other words, sometimes it’s incremental, like the 4890, and sometimes it’s cutting cost down to $100 (or less) with cards such as the 4770 and 4730, which perform on the level of a 4850.

        The next one isn’t supposed to be TOO crazy, as it is said to run at similar clock speeds, with 1,200 stream processors.

        Then it will probably be overdue to go from 3800 to 4800 again, once they’ve got the DX11 market down. That could be what happens with a potential 5870 X4, and a not-necessarily-cost-prohibitive X2.

    • valrandir
    • 11 years ago

    What a wonderful convergence of new technologies coming around October — Windows 7, Lynnfield, and DX11 GPUs.

      • khands
      • 11 years ago

      Sounds like new computer time to me.

        • indeego
        • 11 years ago

        You betatest, let us know <.<

          • khands
          • 11 years ago

          Alright, when I manage to win a few extra grand, I’ll put it all towards betatesting it for ya.

            • Flying Fox
            • 11 years ago

            You can always hope for the best in the Christmas giveaway. 😉

            • khands
            • 11 years ago

            Did that last year… no go.

          • ew
          • 11 years ago

          I thought Vista was the beta test.

            • OneArmedScissor
            • 11 years ago

            And Core i7. 🙂
