DirectX 11 benchmark goes up for download

With both DirectX 11 graphics cards and Windows 7 now out in stores, Shacknews reports that middleware developer Unigine has released what may be the very first public DX11 benchmark. Dubbed "Heaven," the 127MB application is available at FileShack.

In the benchmark, a disembodied camera pans through a floating fantasy town with grassy hills, paved roadways, and a square with a dragon in the center. The user can press keys to enable or disable tessellation, which increases the polygon count of various objects, from the dragon to walls, floors, and rooftops. The Heaven benchmark also lets you jump into the world and walk around, should you want a closer look at some of the eye candy.
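For a sense of scale, tessellation's polygon multiplication is easy to sketch: uniform subdivision splits each triangle into four per level. A toy illustration (the counts are invented, and this is not Unigine's actual subdivision scheme):

```python
def tessellated_triangle_count(base_triangles, levels):
    """Each uniform subdivision level splits every triangle into 4."""
    return base_triangles * 4 ** levels

# A hypothetical 500-triangle coarse dragon at 3 subdivision levels:
print(tessellated_triangle_count(500, 3))  # 32000
```

This is why toggling tessellation can swing frame rates so much: three levels of subdivision already means a 64x jump in geometry.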

The application appears to run even on DirectX 9 graphics cards, although of course you’ll need one of the new Radeons running on Windows 7 to enable the DX11 effects. Otherwise, you can check out a video of the benchmark in action here.

Comments closed
    • UberGerbil
    • 10 years ago

    I haven’t spent any time looking at or running this benchmark, and without seeing the code (or at least more detailed claims by its developers) it’s impossible to know exactly what the various codepaths are actually (or at least supposedly) doing.

    However, your DX 10.1 parts are not giving you DX11 features without falling back on the CPU. Yes, you might have a “tessellator,” but that isn’t sufficient to give you DX11-style (fully GPU-accelerated) tessellation. The DX11 pipeline defines …

      • Waco
      • 10 years ago

      Tessellation simply doesn’t work in DX11 mode without a DX11 GPU. It looks identical to the DX10 version. Scores are *slightly* lower in DX11 mode though…so it’s doing something extra.

        • UberGerbil
        • 10 years ago

        Well, like I said, without more information from the developers on what they’re actually doing in the various codepaths when various hardware is available, it’s hard to know what the benchmark is actually measuring.

    • YeuEmMaiMai
    • 10 years ago

    lol at all of the “it’s useless” comments…..

    • ltcommander.data
    • 10 years ago

    Seeing that this benchmark has DirectX 9, 10, 11, and OpenGL code-paths and is available for Windows XP through 7 and the Unigine engine itself has a Linux version, I wonder why they don’t make a Mac OS X version? Then we’ll finally get a cross-platform graphics benchmark. Although given that the OpenGL code-path in Windows already looks un-optimized compared to DirectX and OpenGL drivers in OS X and Linux aren’t exactly speed demons, the expected conclusion would be that Windows would be faster. Still, putting the actual numbers out there might put pressure on all parties to improve things.

    • Sahrin
    • 10 years ago

    Looks very good; but I’m not wild about the Field-of-focus blur. That’s something that is happening in your eyes; it will naturally occur when you don’t focus on something. It’s actually pretty distracting when they try to ‘force’ it (especially on the ‘path walk’ up to the dragon-square). It’s like being drunk in 60% of your field of view.

      • Meadows
      • 10 years ago

      It’s called depth-of-field, and it’s used in photography and movies all the time to guide your attention. Done right, it complements your own focus against an otherwise flat screen.

      It’s overdone here, though. Not in the later bridge balancing scenes and such, but during the floating island overview.
      I suppose it’s overdone because that might make cards work harder, but it sure as hell doesn’t look …
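Depth-of-field effects like the one discussed here are typically driven by the thin-lens circle-of-confusion model: geometry at the focus distance stays sharp, and the blur kernel grows the farther an object sits from that plane. A minimal sketch, with made-up camera parameters (this is the standard thin-lens formula, not necessarily what Heaven implements):

```python
def circle_of_confusion(aperture, focal_len, focus_dist, obj_dist):
    """Thin-lens CoC diameter; all distances in the same units.
    Zero at the focus plane, growing as objects move off it."""
    return abs(aperture * focal_len * (obj_dist - focus_dist)
               / (obj_dist * (focus_dist - focal_len)))

# In focus -> no blur; well off the focus plane -> larger blur kernel.
print(circle_of_confusion(0.05, 0.05, 10.0, 10.0))   # 0.0
print(circle_of_confusion(0.05, 0.05, 10.0, 100.0))  # larger than at 20.0
```

Cranking the aperture parameter up is the usual way to exaggerate the effect, which may be what makes the overview shots feel "drunk."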

    • sigher
    • 10 years ago

    Yeah, it’s funny, but I remember DX8 demos that had much more awe than stuff that came after, and not just because it was new-ish.


    • Meadows
    • 10 years ago

    Can we get an official reply on whether this will be integrated into TR’s suite?

      • BoBzeBuilder
      • 10 years ago

      Everyone quiet. Damage is about to make a statement at the TR press conference. Stock prices are at a record high.

      • UberGerbil
      • 10 years ago

      That seems premature. The gaming-oriented tests TR uses are either actual in-game benchmarks, or synthetic benchmarks that have something of a track record. Right now, this is neither, and at best would seem to hope to one day be the latter. I guess if the TR folks have extra time they could run the benchmark, but without more detailed knowledge about what its various options actually mean it’s hard to know how to interpret the results.

    • redpriest
    • 10 years ago

    There is a massive difference between tessellation on and tessellation off in DX11 mode. Every stone has depth to it – and I mean every one: the stones in the walls, the cobblestones. Even the roofing has depth to it. Tessellation is used everywhere and it makes a huge quality difference. Without it, everything looks flat and unrealistic.

      • Tony Neville
      • 10 years ago

      Must be nice to have a Radeon 58xx series card. I have a DX10 card and took a snapshot of a stone structure with tessellation on and then off. They’re identical.

      Quick, someone give me a fermi!

        • designerfx
        • 10 years ago

        what do you know, someone like you who games the scores (per your other post) also can’t see tessellation on something that doesn’t have dx11 capability?

        your score with an i7 965 and a gtx290 or whatnot is completely inaccurate unless you are running that 965 nitrogen cooled above 5ghz along with the same for the graphics.

        boy, is that a surprise.

          • crazybus
          • 10 years ago

          Huh? If you’d bother to look around, you’ll see that the demo runs quite a bit faster on nvidia hardware at the moment.

          • Tony Neville
          • 10 years ago

          I’ll tell you what. I’m not allowed to post a link here but if you know what tinyurl is then you will know what to do with this: yguaraq

          Say what???! You see the same figures I do?! Shhhh, keep it secret lest someone questions your honesty, too.

            • Waco
            • 10 years ago

            That’s almost exactly the same score I got with my 4870X2. :shrug:

            EDIT: Q6600 @ 3.2 GHz, Windows 7 x64.

      • UberGerbil
      • 10 years ago

      Obviously, as players you’re going to be most interested in the visual and performance features that tessellation offers, but for the developers the promise is at least as much in streamlining the asset development workflow and getting better productivity out of their artists. Because the tessellation stages in the pipeline can work with the higher-order topology used by many 3D tools (Catmull-Clark quads, for example) it should be possible to reduce a lot of the grunt work — leaving it to the game engine (and the GPU) to convert the models into triangles, rather than hand-rendering them at various levels of detail in advance. This can be more performant, also, because you don’t have to load multiple versions of the model onto the graphics card, but just work with one high-level description. That’s the promise, anyway; as with everything, it takes time for the toolchain to get worked out and for folks to experiment and tune and figure out what works best. And of course you still have to do the work to handle all the people who have DX9-level hardware and/or XP. It’ll be a while before anybody does a major title that is DX10/11-only.
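UberGerbil's workflow point can be put in rough numbers. A toy comparison of a precomputed LOD chain versus a single coarse control mesh, with entirely hypothetical mesh sizes and a simplistic 36-bytes-per-triangle layout:

```python
def lod_chain_bytes(tri_counts, bytes_per_tri=36):
    """Memory for a classic precomputed LOD chain (3 verts x 12 bytes each)."""
    return sum(n * bytes_per_tri for n in tri_counts)

def coarse_mesh_bytes(base_tris, bytes_per_tri=36):
    """With GPU tessellation, only the coarse control mesh is stored;
    refined triangles are generated on the fly each frame."""
    return base_tris * bytes_per_tri

# Hypothetical character: 50k/12k/3k LOD chain vs. just the 3k control cage.
print(lod_chain_bytes([50_000, 12_000, 3_000]))  # 2340000
print(coarse_mesh_bytes(3_000))                  # 108000
```

Real engines share vertices between triangles and add normals, UVs, and tangents, so the absolute numbers are fiction, but the ratio illustrates why shipping one control mesh is attractive.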

    • Tony Neville
    • 10 years ago

    When I run this benchmark in fullscreen mode the mouse pointer positioning gets more confused the further out it is from the top left corner of the display.

    Anywayz, I have an i7-965 and a GTX295 running with W7 Ultimate 64-bit.

    DX10, 1920×1200, everything else unchanged
    FPS=66.7
    Score=1679

    DX11, 1920×1200, everything else unchanged
    FPS=62.1
    Score=1573

    I don’t see any visual difference, and given that my card is a DX10 card I wasn’t really expecting any. Yet there is a sizable difference between the results, though way too small for tessellation to have anything to do with it.
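For reference, the drop reported above is easy to express as a percentage; at roughly 6%, it is larger than typical run-to-run noise:

```python
def pct_drop(baseline, other):
    """Percentage drop of 'other' relative to 'baseline'."""
    return 100.0 * (baseline - other) / baseline

# Tony Neville's GTX 295 numbers from the post above:
print(round(pct_drop(1679, 1573), 1))  # score: 6.3
print(round(pct_drop(66.7, 62.1), 1))  # FPS:   6.9
```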

    • wingless
    • 10 years ago

    Tessellation is not supposed to kill frame rates as much as some of you are reporting. AMD’s tessellation engine on the 2000 to 4000 series seemed more efficient than this DX11 implementation. What happened?

      • Arag0n
      • 10 years ago

      They are using tessellation, but they are also increasing the polygon count dramatically, even on narrow items. Look at the path and you will see it has stones, while in DX10 it’s just flat. The FPS may be down, but it would have dropped even further if they had used full high-polygon models instead of the tessellator.

      • Goty
      • 10 years ago

      Umm, we don’t know anything about the tessellator in the 2000-4000 series as it was never used (AFAIK).

    • oldDummy
    • 10 years ago

    DX10 maxed at 1920 X 1200
    gf285, i7 @ 3.6G, win7 pro x64

    19.9 fps, 502 score

    • ssidbroadcast
    • 10 years ago

    Hey guys: running in DX11 mode != actually seeing it run in DX11.

      • Meadows
      • 10 years ago

      Could the inferior speed on my PC then be explained by the beta state of the Vista DX11 update?

      I’m still trying to figure out where the performance was spent, since scores were lower well beyond the margin of error.

    • TravelMug
    • 10 years ago

    The bumpiness on the road is a bit silly, but the worst part is probably the wheelchair-friendly platforms with a texture over them in non-tessellated scenarios instead of real stairs.

    • Arag0n
    • 10 years ago

    wops…
    soz

    • Arag0n
    • 10 years ago

    Has anyone noticed that DirectX 10/11 is much more CPU-efficient than OpenGL?

      • Meadows
      • 10 years ago

      OpenGL has been dead for years now.

      Edit: as far as gaming is concerned, I mean.

        • Arag0n
        • 10 years ago

        Snow Leopard uses OpenGL for UI rendering, I guess, if I’m not wrong.

        • Game_boy
        • 10 years ago

        PS3? Wii? DS? PSP?

        They don’t use Direct3D.

          • Arag0n
          • 10 years ago

          But it is rare… I tested the three APIs with a Phenom 9950 / NV260GTX.

          DX10/11: 1 core~50%, the other 3 were ~5-15%
          OpenGL: 1 core at 100%, the other 3 unused.

          DX9: http://img33.imageshack.us/i/dx9setup.jpg/
          DX10: http://img21.imageshack.us/i/dx10setup.jpg/
          DX11: http://img14.imageshack.us/i/dx11setup.jpg/
          OpenGL: http://img40.imageshack.us/i/openglsetup.jpg/
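Taking midpoints of the ranges reported above (my assumption), the two paths' aggregate CPU load is in the same ballpark; the difference is that the OpenGL path pins a single core at 100%, making that one core the throughput ceiling:

```python
def total_util(per_core):
    """Aggregate CPU load, in 'percent of one core' units."""
    return sum(per_core)

# Midpoints of the reported ranges (an assumption on my part):
dx_cores = [50, 10, 10, 10]    # DX10/11: work spread across threads
ogl_cores = [100, 0, 0, 0]     # OpenGL: everything on one thread
print(total_util(dx_cores), max(dx_cores))    # 80 50
print(total_util(ogl_cores), max(ogl_cores))  # 100 100
```

A saturated single core is the classic symptom of a single-threaded driver/API path: adding cores can't help once that one thread is the bottleneck.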

            • StashTheVampede
            • 10 years ago

            This is more of a function of the driver and its optimizations than the API itself.

            • Arag0n
            • 10 years ago

            Anyway, the CPU utilization difference is impressive, and it’s good to know that OpenGL sucks on Windows, at least.

            • TheTechReporter
            • 10 years ago

            Nice screenshots, but IMHO you need to turn on anti-aliasing for a true comparison.
            Normally, I wouldn’t complain, but the jagged edges are distracting and make the subtle differences between the images more difficult to notice.

            Also, no offense to Unigine, but I’m more excited about other engines, since they stand a better chance of actually being used in the games that I will play.

            • Arag0n
            • 10 years ago

            Well, I’m not trying to compare the image quality of the different engines. I’m just trying to compare the CPU utilization across the cores of the system.


      • stmok
      • 10 years ago

      http://blogs.msdn.com/kamvedbrat/archive/2006/02/22/537624.aspx

      …Vista or newer implements OpenGL in one of three ways. Which way is this application/benchmark using? How does each implementation affect performance?

      “…it’s good to know that OpenGL sucks on windows at least.”

      Considering Microsoft isn’t really fond of using standards that they didn’t come up with OR are not in control of, should you really be surprised?

        • Arag0n
        • 10 years ago

        Well, I can say that OpenGL makes Windows drop into a basic display mode, so it’s true that the DWM is disabled. Anyway, that doesn’t explain the higher CPU consumption. Like someone said, it should be a driver issue, because ATI users had a much bigger performance drop than me (Nvidia).

        • Shining Arcanine
        • 10 years ago

        It is using Legacy ICD, as it does not work with desktop composition.

    • odizzido
    • 10 years ago

    Reminded me a bit of unreal 1, so nice video 🙂

    • Shinare
    • 10 years ago

    nothing like a good benchmark/eye candy to drive sales… The “look what my card can do” urge is… almost.. irresistible!

    • PRIME1
    • 10 years ago

    And in a few short years there will be actual DX11 games to play

      • Meadows
      • 10 years ago

      You mean months. And they’ll be worth 3 times more than the best PhysX title.

      Keep trying, green man.

    • StuG
    • 10 years ago

    I have a 5870 and the benchmark looked fantastic: around 33 FPS at maxed-out settings in DX11 (at 1920×1080), and around 55 in DX11 with tessellation turned off.

    In DX10 I didn’t fall below 50 the entire time, DX9 was about 55, and OpenGL was around 28FPS.

    Overall nice, and very exciting for me because I get to see the new goods at work!

    Thanks TR 😀

    • mortifiedPenguin
    • 10 years ago

    Quick benchmarks for those curious:

    System: Q9450 @ 3.6 GHz, HD4870 512MB, 4 GB RAM
    Unfortunately, not a 5xxx series card, but hopefully it gives a good impression for the rest of us.

    Settings: 1920×1200, default max settings (aniso: 4, no AA, all features enabled)


      • Ryhadar
      • 10 years ago

      Thanks for the reference, though I’m curious as to why you wrote this:

      …

        • mortifiedPenguin
        • 10 years ago

        I wanted to make sure that those wondering if a DX10.1 part with tessellation capabilities would work with DX10 tessellation knew that it did not (at least for this version of the Unigine).

        There was …

      • moriz
      • 10 years ago

      tried it also on my machine:

      E7200 @3.2ghz, 4gb
      HD4890
      2048×1152 resolution, AF16, AAx0

      with everything turned on, i averaged around 32 fps. again, tessellation seems to be disabled, even though my card is capable of it.

      edit: some more solid numbers:
      in DX9:
      35.4 FPS, 891 score

      in DX10/11:
      31.6 FPS, 797 score

        • mortifiedPenguin
        • 10 years ago

        Interesting. Perhaps I need to retest DX9 mode later.

          • moriz
          • 10 years ago

          i wasn’t able to do a meaningful OGL test, simply because every time the camera pans across the entire floating island thingy, the sky is entirely BLACK.

            • mortifiedPenguin
            • 10 years ago

            Same here. Comparing the shots with the DX ones, I figured there were some GLSL render target issues with the depth of field effect causing the near and far planes to be rendered as black instead of blending with the “in focus” portion. If I had to guess, though, the OpenGL render path was probably added as an afterthought for reference, seeing as even the non-black parts had some artifacting issues.

            • ltcommander.data
            • 10 years ago

            http://unigine.com/download/

            The problems you are seeing in OpenGL with alpha-testing and depth of field are known problems from the Unigine release notes. Seeing the developers say only ATI GPUs have these issues, it’s probably just a case of ATI’s traditionally weaker OpenGL driver support compared to nVidia. Looking at Unigine’s previous benchmarks, they all have OpenGL issues on ATI GPUs, so it’s a persistent thing. Although, seeing that Unigine isn’t the most widely used engine, ATI can probably be forgiven for not devoting too much effort to optimizing their drivers for Unigine’s OpenGL renderer.

            • mortifiedPenguin
            • 10 years ago

            I suppose I could have checked there. Thanks for the notification. Personally though, I think both parties are partly to blame but that’s mostly unfounded speculation.

      • Meadows
      • 10 years ago

      My 8800 GT didn’t even fall far behind the “big guys”.
      DirectX 10, desktop resolution of 2048×1536 (quite a bit more strain than yours), 4× anisotropy and no antialiasing, and everything turned on …

        • Shining Arcanine
        • 10 years ago

        I do not believe you can run DirectX 11 mode on your GeForce 8800.

        I tried it on my GeForce 250 GTS on 64-bit Windows 7 and I received an error.

      • larchy
      • 10 years ago

      I got 830 on my 5850 at 1680×1050 with 4xAA

      Why would you expect any visual difference running DX11 mode on your 8800GT, Meadows? Get a DX11 card.

        • Meadows
        • 10 years ago

        I was expecting a difference because it allowed me to run the mode, but performance was lower. That’s why. I was looking for an explanation of why I got fewer points there.

      • designerfx
      • 10 years ago

      886/35.2 FPS for me at 1920×1200 with an i7-920 and a 4890 Vapor-X.

      link to my result:

      http://docs.google.com/Doc?docid=0AUFlShbrc8o5ZGdmNGtyOGdfMzFjdHJnZHIydg&hl=en (publicly hosted Google doc)

    • Anomalous Gerbil
    • 10 years ago

    Just watched the video. No doubt technically very impressive but it didn’t have much “wow” factor for me. The stuff at the end with variable lighting as the sun sets is very nice, but the majority of the video set in very bright light seemed fake, as if the buildings were made of marzipan. The focus blur seemed overdone, and failed to induce a sense of scale as presumably intended. Also did anyone else find that crazy foot-shredding cobblestone road distracting? It’s nice that you have a bountiful polygon budget, but you don’t have to spend it on making the road surface look like the Himalayas.

    Sorry if I’m missing the point by critiquing it as 3D animation. Perhaps the effort and budget required to do something like this up to really high modern standards is too much for a free product?

      • shank15217
      • 10 years ago

      I don’t think they are as inspired as artists as they are good as software engineers. In reality no road is that blocky, nor a roof so bumpy; they were trying to emphasize DX11 features, not use them artistically. Remember, this is a benchmark…

      • Sahrin
      • 10 years ago

      Thank you for pointing the road out! I was like “well, I guess if it’s a brand new old town…”

        • Fastidious
        • 10 years ago

        Those jagged stones on the road were painful just to look at. LOL

    • BoBzeBuilder
    • 10 years ago

    I hope TR includes this in their future benchmarks.

      • JustAnEngineer
      • 10 years ago

      For what?

      “Heaven” Benchmark score:
      ATI: 4, NVidia: 0

      That’s not going to look very entertaining on a multi-colored bar graph.

        • Reputator
        • 10 years ago

        There’s lots to analyze here. DX9 vs DX10 vs DX11 performance. DX10 vs DX11 image quality. 5750 vs 5770 vs 5850 vs 5870 performance in arguably the first, non-partisan DX11-intensive benchmark.

        And of course NVIDIA will eventually get their DX11 stuff out too.

        TR, any possibilities of the above-mentioned happening?

          • OneArmedScissor
          • 10 years ago

          No matter how many tests and cards it supports, it’s still just what the name says…

          A benchmark, not reality.

            • BoBzeBuilder
            • 10 years ago

            Yes, but that doesn’t mean it’s useless. Far from it.

            • asdsa
            • 10 years ago

            It’s still better than keeping on using Vantage for years to come. There’s no news about Futuremark doing a DX11 benchmark.

          • Shining Arcanine
          • 10 years ago

          The various levels of DirectX have this benchmark do more processing for better image quality, so it is not an apples-to-apples comparison.

        • Meadows
        • 10 years ago

        This comment almost invalidates your existence as a sentient person.

        Please, don’t.

          • flip-mode
          • 10 years ago

          Seriously, Meadows, if you’re going to slam someone’s comment, do it in a constructive way. I don’t know what your post is even supposed to mean. How does calling him “non-sentient” address his point? He made a point and perhaps you could provide a counterpoint…. I mean this in a friendly way, because your more thoughtful comments are often worth reading.

          • ScythedBlade
          • 10 years ago

          Lols, it was humorous too … Meadows … come on hahaha …

      • Skrying
      • 10 years ago

      Maybe an analysis of the benchmark itself but putting it in a review of a video card is useless.

        • BoBzeBuilder
        • 10 years ago

        How is it any more useless than other benchmarks in TR’s reviews?

        Not only can you compare all cards to each other under DX9 and DX10, we also get a glimpse of the impact DX11 has on performance.

          • Skrying
          • 10 years ago

          Because it isn’t a game. We get to see the impact of DirectX 11 as far as possible in a benchmark. The circumstances could be entirely different in a real game, and from game to game, etc. Like all synthetic benchmarks it might have a use, but it really doesn’t tell us much. I don’t see the point of having it waste more of TR’s time for that little bit of information.

            • BoBzeBuilder
            • 10 years ago

            You do realize this is the “only” DX11 benchmark around. Until something better comes along, TR should waste their time with this benchmark.

            • Skrying
            • 10 years ago

            You’re confusing being “first” with being of any use.

            • BoBzeBuilder
            • 10 years ago

            So you consider L4D (which, by the way, runs fine on a 4850 @ 2560×1600 with 4×AA) to be more “useful” than a DirectX 11 benchmark that makes use of tessellation for testing the 5xxx series?
            Perhaps TR should stop testing L4D to make time for this benchmark.

            • Arag0n
            • 10 years ago

            I agree that this benchmark is a unique engine tested with DX11, but as someone said, it’s the only one available. So you may be right that using only this benchmark to test DX11 could mislead people, because as we’ve seen over time, games run much better or much worse on ATI or Nvidia depending on the engine, and we may get the wrong picture of who has the best DX11 hardware.

            But this is the first one; let’s hope we get more DX11 benchmarks available.

    • DrDillyBar
    • 10 years ago

    Wasn’t tessellation in previous-gen ATI cards? Could it not be enabled, even if there is a monster performance hit? Just a stab in the dark.

      • moriz
      • 10 years ago

      you won’t know until you try. even if it doesn’t work, i suspect some driver hack will enable it on HD4000 series cards.

        • OneArmedScissor
        • 10 years ago

        Not just 4000, but back to 2000 series cards. They’ve had this stuff in them all along. It was supposed to be in DX10. Then they made DX10.1 and Nvidia still gave everyone the finger on that…so here we are, finally at DX11…and no Nvidia DX11 cards in sight. Somebody’s a sore loser!

          • moriz
          • 10 years ago

          well if you really want to trace back, i recall the 9700PRO having a feature called “Truform”, which was basically a tessellator.

            • OneArmedScissor
            • 10 years ago

            Yes, there was similar stuff in even older cards.

            But what I’m getting at is that basically all ATI DX10 cards should be roughly DX11 compliant.

            It’s not really any mystery why they beat Nvidia to the punch so bad. A lot of the work was already done. Nvidia never wanted it and has avoided it at all cost, while ATI were expecting it years ago.

            • designerfx
            • 10 years ago

            wasn’t physx and 3dvision nvidia’s excuse?

            • crazybus
            • 10 years ago

            TruForm was a hardware feature of the Radeon 8500. The R300 removed the tessellation hardware. The only game I really used it in was UT2003/2004, where it made the character models look quite a bit nicer.

            • moriz
            • 10 years ago

            ah well, i stand corrected.

          • albundy
          • 10 years ago

          i guess AMD is loving giving the finger to them now! LoL! NV should just sell itself to intel.

          • shank15217
          • 10 years ago

          Actually the Nvidia NV1 had a tessellation engine; it was used to increase polygon count in various Sega Saturn ports to the PC.

      • ltcommander.data
      • 10 years ago

      The tessellator in the HD2000, HD3000, and HD4000 is not directly supported in any DirectX API, even DirectX 11 as a fallback. ATI released special extensions to DirectX 9 and DirectX 10 to allow developers to access the tessellator, but you have to use ATI’s own SDK and specifically code for it with it being a separate code path from the DirectX 11 tessellator. Given the more proprietary and niche nature of this older tessellator, it’s doubtful that anyone would code a Windows engine to support it. Although, with the number of games being console ports, you’d think the tessellator in the HD2000-HD4000 series which is the same as in the XBox 360 would have seen some use, since presumably the code can be reused. But, I haven’t heard of a Windows game using this older tessellator.
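The situation described here amounts to a capability-dispatch problem: an engine has to pick a tessellation code path per GPU. A hypothetical sketch (the capability names and return values are invented for illustration; they are not any real SDK's API):

```python
def pick_tessellation_path(caps):
    """Choose a renderer code path from a set of capability flags."""
    if "dx11_tessellation" in caps:
        return "dx11"          # standard hull/domain shader pipeline
    if "amd_dx9_tess_ext" in caps:
        return "vendor_ext"    # ATI's proprietary DX9/DX10 extension
    return "no_tessellation"   # flat geometry, as on the DX10 cards here

print(pick_tessellation_path({"dx11_tessellation"}))  # dx11
print(pick_tessellation_path({"amd_dx9_tess_ext"}))   # vendor_ext
print(pick_tessellation_path({"dx10"}))               # no_tessellation
```

The maintenance cost of that middle branch, a separate code path reaching few users, is exactly why ltcommander.data doubts anyone shipped it on Windows.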

        • OneArmedScissor
        • 10 years ago

        Good point. If no one used it all along, they’re probably not going to start now.
