Microsoft talks DirectX 11 graphics features

As rumored, Microsoft started shedding light on DirectX 11 at the Gamefest 2008 event in Redmond, Washington yesterday. Shacknews has the skinny on the graphics portion of the next-gen application programming interface toolkit, which will bring new functionality while retaining compatibility with Windows Vista.

Quoting Microsoft, the Shack says DirectX 11 will introduce compute shaders, a technology that will “[lay] the groundwork for the GPU to be used for more than just 3D graphics.” Other DX11 features will include tessellation, which will make 3D models smoother up close (like the old ATI TruForm tech), and multi-threaded resource handling, which will help game developers tap multiple CPU cores.
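
Microsoft hasn't published API details yet, but the multi-threading pitch is easy to picture: resource loading and creation that can happen on worker threads instead of serializing on the render thread. A rough, purely illustrative C++ sketch of that pattern (all names here are hypothetical):

    #include <cstdio>
    #include <future>
    #include <string>
    #include <vector>

    // Hypothetical stand-ins for engine types; DX11's actual interfaces
    // hadn't been detailed at the time of writing.
    struct Texture { std::string name; };

    Texture LoadTexture(const std::string& path)
    {
        // Pretend to decode a file. In a DX11-style engine, the GPU
        // resource could also be created from this worker thread.
        return Texture{ path };
    }

    int main()
    {
        std::vector<std::string> paths = { "rock.dds", "grass.dds", "sky.dds" };

        // Kick off the loads on worker threads so they spread across
        // CPU cores instead of queuing behind the render thread.
        std::vector<std::future<Texture>> jobs;
        for (const auto& p : paths)
            jobs.push_back(std::async(std::launch::async, LoadTexture, p));

        for (auto& job : jobs)
            std::printf("loaded %s\n", job.get().name.c_str());
    }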

Interestingly, Shacknews mentions that DirectX 11 will add functionality to existing, DirectX 10/10.1-class graphics processors. Microsoft apparently didn’t reveal which features will require new hardware, however.

Comments closed
    • 0g1
    • 11 years ago

    I like that GPGPU and CPU multithreading are being made more accessible. Intel will probably port Havok to DirectX 11, although since Havok isn't a separate runtime but lives inside each application, you'll need each application to implement the new Havok SDK. CUDA will become pretty much obsolete for GPGPU programming, I guess :(. If nVidia ports PhysX to DX11, we'd get PhysX acceleration on any AMD and Intel hardware that supports DX11.

    Hardware tessellation support is good, but it's also slow. It's always going to be faster to use multiple pre-built LOD meshes than to calculate them dynamically, and the development time saved is minimal. It's a nice visual improvement, but it's not always worth the performance penalty, and sometimes it looks worse.
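
    For illustration, the classic approach is just a lookup over pre-built meshes, something like this toy sketch (thresholds and names are made up):

        #include <cstddef>
        #include <cstdio>

        // Toy sketch: pick one of several pre-built meshes by distance.
        // No runtime geometry generation; artists author each level
        // ahead of time. Thresholds are arbitrary illustration values.
        struct Mesh { int triangleCount; };

        const Mesh  kLodMeshes[]    = { {10000}, {2500}, {600}, {150} };
        const float kLodDistances[] = { 10.0f, 30.0f, 90.0f }; // switch points

        const Mesh& SelectLod(float distance)
        {
            std::size_t level = 0;
            while (level < 3 && distance > kLodDistances[level])
                ++level;
            return kLodMeshes[level];
        }

        int main()
        {
            for (float d : {5.0f, 20.0f, 60.0f, 200.0f})
                std::printf("distance %5.1f -> %d triangles\n",
                            d, SelectLod(d).triangleCount);
        }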

    DX11 is going to be pretty cool, but I could definitely stay with DX9c.

    • Ryu Connor
    • 11 years ago

    q[

    • pogsnet
    • 11 years ago
    • [TR]
    • 11 years ago

    ” Quoting Microsoft, the Shack says DirectX 11 will introduce compute shaders, a technology that will “[lay] the groundwork for the GPU to be used for more than just 3D graphics.” ”

    Er… groundwork?! I can understand the need for a “standard” language for GPGPU, but that has been done already. And I don’t know whether it would be a good thing to have MS setting the rules on that, too.
    Although they could be in a position to get the GPU makers to sit down and discuss it, my guess is they’ll forgo the discussion as soon as it turns into a fight and do whatever they want.

      • Scrotos
      • 11 years ago

      It’s been done? Which “standard” do you mean? Cg? CUDA? CTM? OpenCL? Something else?

      • UberGerbil
      • 11 years ago

      Historically, it’s situations like this — when several hardware manufacturers are advancing their own proprietary standards for the same thing — that MS has stepped in and offered a common standard that is then supported by everybody, and it’s usually a net benefit. Remember GLIDE? Remember the state of sound APIs before DirectX? Remember the days of DOS when every word processing program had to ship with its own printer drivers?

        • ish718
        • 11 years ago

        I guess it’s MS’s responsibility to do things like that, since they’re the top dog in the PC industry.

          • UberGerbil
          • 11 years ago

          Well, sometimes they saw an opportunity and/or sometimes the Windows group got sick of trying to support umpteen different variations of the same thing. But privately, the hw guys have sometimes asked MS to step in because they don’t have a vested interest in the success of a particular hardware implementation, and it gives everybody a chance to climb down while saving face. These days it’s all a little more complicated, but it still happens. (They even got Intel to adopt AMD’s 64bit architecture without any niggling little “variations”)

      • Flying Fox
      • 11 years ago

      It may not be GPGPU right away; it could just be in-game physics as a first step. There was no mention of GPGPU in the announcement.

    • Umbragen
    • 11 years ago

    DirectX 11? Microsoft must be working on the next incarnation of the XBox.

    • Draxo
    • 11 years ago

    “Interestingly, Shacknews mentions that DirectX 11 will add functionality to existing, DirectX 10/10.1-class graphics processors. Microsoft apparently didn’t reveal which features will require new hardware, however.”

    They should have said it will require a new Windows version.

      • Corrado
      • 11 years ago

      Why shouldn’t it? If you want a car with a feature your car doesn’t have, say a convertible top or a hybrid drivetrain, you need to buy a new car. Some people buy new cars every five years; some people drive ten-year-old cars. It’s tough to get support for a ten-year-old car from the manufacturer sometimes, especially if the car has been replaced by a new generation.

        • d2brothe
        • 11 years ago

        Not true: parts are available for most old cars. You can get parts even for ’70s and ’80s cars.

          • nonegatives
          • 11 years ago

          But those old parts don’t add new features. You have to do custom work, and then everyone is driving around in one-off vehicles that are no longer compatible with each other – sorta like Linux.

      • d2brothe
      • 11 years ago

      Why would they say that? They said specifically that it would NOT require a new version of Windows, and that Vista would have full support.

      • UberGerbil
      • 11 years ago

      DX11 runs now, in its current form, on Vista.

    • Ryu Connor
    • 11 years ago

    q[

      • d2brothe
      • 11 years ago

      Seriously, reply…what are you replying to?

    • Ryu Connor
    • 11 years ago

    q[

      • Mystic-G
      • 11 years ago

      Beta testing ftw

        • greeny
        • 11 years ago

        Why waste resources beta testing for a ten-year-old OS when you have a nice shiny new one you can beta test instead? Like it or not, Vista WILL become the standard, just like XP did.

          • Mystic-G
          • 11 years ago

          Haven’t you heard? Windows 7 is coming. XD

          Vista will become just like Windows 2000.

            • jroyv
            • 11 years ago

            Hey, for an MS OS, Windows 2000 was one of the better ones… don’t compare it to Vista.

            • Scrotos
            • 11 years ago

            Seconded! It was a good OS.

            • Mithent
            • 11 years ago

            It’s quite a good comparison, actually, I think. 2000 and Vista were both major reworkings of the architecture; XP and 7 are refinements of the existing one.

            Besides, Vista’s a fine OS; not much better than XP in many ways, true, but that doesn’t make it bad. Its reputation is rather unfair.

            • WaltC
            • 11 years ago

            Well, Microsoft has already said that the next version of Windows will keep the current Vista driver model, and in that regard DX11 or 12 or 13 and maybe beyond won’t change anything. I imagine it will be quite some time before Microsoft moves away from the driver model introduced in Vista.

            I’m amazed at the number of people who don’t understand that what changed with Vista is the driver model, and that that’s *why* DX10 and beyond will only be supported on Vista and its successors. WinXP will not support the Vista driver model. Ever.

            • pogsnet
            • 11 years ago
      • Anomymous Gerbil
      • 11 years ago

      Oh dear, Ryu *[

        • Krogoth
        • 11 years ago

        No, he prefers the old, flat layout scheme. 😉

          • willyolio
          • 11 years ago

          or maybe he’s just narcissistic enough to want to make sure everyone sees what he posts, even if it means it’ll be out of context.

    • Usacomp2k3
    • 11 years ago

    I wonder about ray-tracing…

      • ish718
      • 11 years ago

      10 more years pal, 10 more years…

    • Mystic-G
    • 11 years ago

    How about Microsoft gives DX10 to XP so it’ll become more streamlined first.

    The last thing we need is more DXs that go pretty much unused.

      • Meadows
      • 11 years ago

      How about you stop whining and pull the plug on the life support of your wrinkly OS that doesn’t deserve to be up anymore? 😉

        • Mystic-G
        • 11 years ago

        Oh yeah, I forgot, developers don’t make games on DX9 anymore. Oh wait…

          • derFunkenstein
          • 11 years ago

          then why bother porting DX10 to XP?

            • Mystic-G
            • 11 years ago

            So DX10 becomes mainstream.

            • Master Kenobi
            • 11 years ago

            DX10 will become mainstream without being on XP. Accept the change and stop trying to fight it. What’s that old saying? Change is like a massive tidal wave: you either ride it or get run over by it.

        • Forge
        • 11 years ago

        Please, I beg you, stop trolling.

          • Corrado
          • 11 years ago

          Hell, let’s backport it to 2000 too.

            • maxxcool
            • 11 years ago

            F$CK yeah!!! <3 win2k …..

            • Scrotos
            • 11 years ago

            The ONLY reason I upgraded from Win2K to XP a year ago was because some of the newer games required XP before they’d let me install them.

            I didn’t even care that drivers weren’t being released or made anymore for the OS since my video card was pretty old anyway.

            I keep wishing the Vista that I’m running on my new Q6600 setup would be more like 2K. You know, simple UI, just work with me instead of against me. I’m even trying to “do it right” by not disabling the security popup thing that for some reason I can’t recall the name of at the moment. UAC or something. I’m trying to integrate Vista into my workflow, but… it’s just not working too well for me, at the moment.

      • stmok
      • 11 years ago

      Now why would MS want to do that? Vista brings a whole host of wonderful opportunities!

      (1) Gets people upgrading their hardware…Which makes the hardware makers really happy!

      (2) Reinvigorates that “upgrade treadmill” for the consumer (that was slowed down by the 5yr development of Vista).

      I mean, you gotta keep buying, right? What? You stopped upgrading?! Shame on you! Don’t you know you should upgrade regularly, because the corporations say so? It helps drive the economy! Keep up with the times!

      What? You don’t like Vista? Upgrade anyway! Make sure it’s Windows Vista Ultimate! You gotta keep up with technology, you know! You wouldn’t want to be left behind, would you?

      I can’t wait for DirectX 12 and Windows 7! Yeah!!!!

    • bogbox
    • 11 years ago

    Let’s hope DX11 is a couple of years away, because there aren’t many games with DX10 even two years after its launch.
    I mean games that are really DX10 (maybe Crysis), not just pretending with a few textures (CoH).

    And I really need to replace some hardware this summer, and I don’t want to upgrade again in less than three years.

      • Kurotetsu
      • 11 years ago

      Well, even if it does come out soon you won’t necessarily need to upgrade. As you said, DX10 adoption in games is pretty damn slow, so DX11 adoption certainly won’t be going anywhere fast. Plus, most if not all DX10 games have the option to fall back to DX9. I imagine DX11 games will have the same feature for DX10, maybe even DX9.

      • Meadows
      • 11 years ago

      Crysis is the worst example in the history of bad examples.
      It’s not Dx10. Simple. Crytek sucks.

      If you’re looking for noticeable differences, you should try Lost Planet, Hellgate: London or Call of Juarez.

        • BKA
        • 11 years ago

        Hellgate: London had a very noticeable difference with DX10. Unfortunately, the gameplay was only so-so. The game does look beautiful, though.

          • Meadows
          • 11 years ago

          As far as the game goes, I’d rather play Crysis than Hellgate, true, but Hellgate had some nice visuals too and it had something Crytek have been overlooking: *gasp* the ability to scale up with better hardware.

          Also, you may or may not believe this, but I could max out the game’s graphics in Dx9 mode with an overclocked 8600 GT at 1600×1200. That’s how a game should be made, in my opinion: it runs well on moderate gear but has the options to make high-end PCs sweat too. Crysis promised the same, but instead they delivered only the second half, and that was outrageous.

        • bogbox
        • 11 years ago

        First, I said “maybe” because it does look great; the graphics at Very High are awesome, and nothing in DX9 games comes close. The rest of the game is crap, yes, I know; even Crytek knows.
        Crysis is like the Xbox 360: out too early, and it needed more time for optimization. But if Crysis had launched at a normal time, in Q3 or Q4 2008, its graphics would look merely ordinary compared to Far Cry 2 and the others.

    • Satyr
    • 11 years ago

    Tessellation? Wow, that’s a generic name for what is probably a subdivision scheme.

      • d2brothe
      • 11 years ago

      Yes, it’s the process of turning a curved surface into polygons, I believe. If I understand correctly, tessellation would allow curves to be sent to DX and the appropriate level of tessellation to be selected based on distance and so on, so that models up close get more polygons than those far away.
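
      Roughly like this, as a toy sketch (the falloff and clamp values are made up):

          #include <algorithm>
          #include <cstdio>

          // Toy sketch: derive a tessellation factor from camera distance,
          // so nearby models get more polygons than distant ones.
          float TessFactorFromDistance(float distance)
          {
              const float maxFactor = 64.0f; // finest subdivision allowed
              const float minFactor = 1.0f;  // coarsest (base mesh)
              // Halve the factor each time the distance doubles past 1 unit.
              float factor = maxFactor / std::max(distance, 1.0f);
              return std::clamp(factor, minFactor, maxFactor);
          }

          int main()
          {
              for (float d : {1.0f, 4.0f, 16.0f, 64.0f, 256.0f})
                  std::printf("distance %6.1f -> tess factor %5.2f\n",
                              d, TessFactorFromDistance(d));
          }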

        • [TR]
        • 11 years ago

        That would mean that the game engine could do without LOD, or am I mixing two different concepts?

          • Meadows
          • 11 years ago

          They’re comparing it to ATI’s old “TruForm” gimmick, which was “doing LOD in reverse”, but increasing polygon count without knowing the theoretical curves is tricky and potentially difficult.

          I’d imagine DirectX 11 will provide a way for developers to mark the theoretical curves so that polygon count can be meaningfully increased along them.

            • ew
            • 11 years ago

            Actually, there are common and well-understood methods for interpolating a polygon mesh:

            http://en.wikipedia.org/wiki/Subdivision_surface

            These techniques have been used for quite a while in professional 3D animation packages.
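
            For a flavor of what a single refinement step does, here’s a toy sketch (midpoint splitting only; the weighted-averaging rules that make Loop or Catmull-Clark surfaces actually smooth are left out):

                #include <cstdio>
                #include <vector>

                struct Vec3 { float x, y, z; };
                struct Tri  { Vec3 a, b, c; };

                static Vec3 Mid(const Vec3& p, const Vec3& q)
                {
                    return { (p.x + q.x) * 0.5f, (p.y + q.y) * 0.5f,
                             (p.z + q.z) * 0.5f };
                }

                // One refinement step: split every triangle into four at its
                // edge midpoints. Real subdivision schemes also reposition
                // vertices with weighted averages to smooth the surface.
                static std::vector<Tri> Subdivide(const std::vector<Tri>& mesh)
                {
                    std::vector<Tri> out;
                    out.reserve(mesh.size() * 4);
                    for (const Tri& t : mesh) {
                        Vec3 ab = Mid(t.a, t.b), bc = Mid(t.b, t.c), ca = Mid(t.c, t.a);
                        out.push_back({ t.a, ab, ca });
                        out.push_back({ ab, t.b, bc });
                        out.push_back({ ca, bc, t.c });
                        out.push_back({ ab, bc, ca });
                    }
                    return out;
                }

                int main()
                {
                    std::vector<Tri> mesh = { { {0,0,0}, {1,0,0}, {0,1,0} } };
                    for (int level = 0; level < 4; ++level) {
                        std::printf("level %d: %zu triangles\n", level, mesh.size());
                        mesh = Subdivide(mesh);
                    }
                }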

            • Meadows
            • 11 years ago

            But that raises the question: why aren’t people using the well-known methods already? The way things stand, they shouldn’t need DirectX 11 to do something like that.

            • UberGerbil
            • 11 years ago

            People are doing it. On the CPU, mostly, and each in their own way. DX11 moves it onto the GPU and makes it available to everybody. People could write all their own vertex setup code and texture mapping and the rest, but they don’t, because there’s an implementation already in DX and it’s good enough, so why not use it? One less piece of foundation code you need to write or optimize or debug. DirectX, like framework libraries in general, is about factoring out a standard implementation of commonly used code; in DirectX’s case it also gives the GPU vendors an impetus to provide hardware acceleration of the feature, because there’s now a common standard they can spec against.

            • Meadows
            • 11 years ago

            Physics on GPU, geometry and textures on GPU, tessellation on GPU, general processing on GPU (soon AI along with it?), what next? Soon, games will fall back to good old single-threaded simplicity while the bulk of the job is being done by a “videocard” with 65,536 shader processors. Golly.

            Would be a shining future.

            • UberGerbil
            • 11 years ago

            If what you’re doing involves massively parallel floating point operations, it kind of makes sense to use a massively parallel floating point processor to do it, if you have one available. Which is why Intel wants to pull that onto the CPU die.

            Some things do not lend themselves to that kind of hardware, however. AI is probably the most notable example in gaming (though various attempts have been made to do parametric behaviors, it still tends to be branchy integer code)

            • Meadows
            • 11 years ago

            Well, that’s sort of sad.

            • UberGerbil
            • 11 years ago

            I don’t see why. Transistors are transistors. Is it sad your CPU uses a cache instead of getting everything from memory every time?

            • Meadows
            • 11 years ago

            No, but I would’ve liked a future where the GPU is called CPU and the CPU is called Assistive Processing Unit, with intel falling down like a meteor for nVidia and AMD to take its place. Capital but fitting punishment for developing such garbage videochips back in the day, in my opinion.

            • d2brothe
            • 11 years ago

            Umm…dream on then.

            • Flying Fox
            • 11 years ago

            You are missing the point: “controller-style” code is always going to be branchy and should run on the CPU. The parallel-calculation “workers” are going to do the “assisting”.

            The amount of code or the running time is not what determines which processor is the “main” one.

            • Satyr
            • 11 years ago

            Heh. Yeah, I realise this. My point was more that they’re calling it tessellation; I just thought that was an incredibly ambiguous name for it, and, assuming it is just some polygon subdivision scheme implemented on the GPU, a poorly chosen one. That’s why the word “subdivision” was coined for this process; it’s not simply tessellation, at least not in the geometry/graphics sense.

            • crazybus
            • 11 years ago

            I recall TruForm markedly improving the look of UT2003 player models. It’s a shame I don’t have the screenshots anymore.

          • UberGerbil
          • 11 years ago

          No, they still generally need LOD in textures. You could start out with the largest, most detailed texture and generate the rest on the fly, but the results tend to be unpleasing compared to letting artists tweak them while the game is under development. You also don’t want to be doing that work (loading the largest textures just to generate low-detail versions for far-away objects) during the game, or even up front during level loads; that’s a lot of bandwidth to the card and a lot of thrashing of the card’s memory.
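
          To be clear, generating the lower-detail levels is mechanically simple; the catch is just that the naive result is what you get. A toy 2x box-filter downsample, for illustration (grayscale, power-of-two sizes assumed):

              #include <cstdio>
              #include <vector>

              struct Image {
                  int w, h;
                  std::vector<unsigned char> px; // grayscale, w*h bytes
              };

              // Each output pixel averages a 2x2 block of the input.
              // Repeating this yields the full chain of detail levels
              // that artists would otherwise hand-tweak.
              Image Downsample(const Image& src)
              {
                  Image dst{ src.w / 2, src.h / 2, {} };
                  dst.px.resize(dst.w * dst.h);
                  for (int y = 0; y < dst.h; ++y)
                      for (int x = 0; x < dst.w; ++x) {
                          int sum = src.px[(2*y)     * src.w + 2*x]
                                  + src.px[(2*y)     * src.w + 2*x + 1]
                                  + src.px[(2*y + 1) * src.w + 2*x]
                                  + src.px[(2*y + 1) * src.w + 2*x + 1];
                          dst.px[y * dst.w + x] = static_cast<unsigned char>(sum / 4);
                      }
                  return dst;
              }

              int main()
              {
                  Image mip{ 256, 256,
                             std::vector<unsigned char>(256 * 256, 128) };
                  while (mip.w > 1) {
                      mip = Downsample(mip);
                      std::printf("level: %dx%d\n", mip.w, mip.h);
                  }
              }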

            • Satyr
            • 11 years ago

            He didn’t mention textures, just LOD. And by “LOD in textures”, I assume you mean MIP mapping? And yes, [TR], you’re right: this is a different but similar solution to LOD, basically increasing the density of the points making up the tessellation as a function of distance from the camera.

            • Satyr
            • 11 years ago

            This reminded me of two fairly straightforward papers on the topic, actually:

            http://kucg.korea.ac.kr/seminar/2005/src/PA-05-16.pdf and http://algorithmicbotany.org/papers/subgpu.sig2003.pdf, the latter actually concerning subdivision curves.

    • Jigar
    • 11 years ago

    So now they realize the eye candy was missing in DX 10…

      • Meadows
      • 11 years ago

      It never was about eye candy. It was about making the code more streamlined, more efficient – Dx 10.1 took that even further. It’s just that nobody wants to exploit these options.

        • cegras
        • 11 years ago

        Blame nvidia.

          • Meadows
          • 11 years ago

          That’s not the point either. It isn’t about blame and “would have been” scenarios.

        • Jigar
        • 11 years ago

        Sorry, I should have completed the sentence… *[

          • ish718
          • 11 years ago

          *thinks back to early DX10 performance on 8800GTX*
          DX10 was a failure from the start. DX10.1 just isn’t taking off.

        • carburngood
        • 11 years ago

        If it were really about streamlined code, why does every DX10 game run slower than it does in DX9 mode?

          • Meadows
          • 11 years ago

          Because those are all Dx9-optimized engines with bells&whistles© instead of actual streamlining of any particular sort.

    • titan
    • 11 years ago

    Didn’t DirectX originally ship as a software feature, with hardware support coming later? Maybe that’s what’s going to happen with DX11?

    • titan
    • 11 years ago

    Man, I need a new computer so I can get Vista, so I can get DX10/10.1/11.

      • 0g1
      • 11 years ago

      Get a new computer to run everything faster, not just DX10/10.1/11. If you’re a gamer, I think you’d prefer running DX8/9 at 300 fps to DX10 at 60 fps. Gamers don’t care about cinematic effects like fog, depth of field, and other post-rendering distortions; cinematic effects belong in low-frame-rate cinema. Gamers care about gameplay, and that means responsiveness, clarity, and interactivity.

      A lot of progress has been made running DX10 on XP, although I haven’t wanted to try it.
