DirectX 12 DXR and Nvidia RTX bring real-time ray-tracing to 3D engines

Real-time ray tracing has been the elusive Holy Grail of computer graphics for as long as there have been computer graphics. Ray tracing has long been used in films and other settings where it's okay to take a few hours or more to render a single frame, but historically the technology has just been too demanding to use in games. That said, computers have gotten awfully fast in the past couple of decades. Microsoft and Nvidia both announced new software at GDC that could help bring real-time ray tracing to upcoming games: DirectX 12 DXR and Nvidia RTX.

This is not a photograph.

If you're not familiar with ray tracing, you can pore over this article from Nvidia that explains the difference between ray tracing and the rasterization methods commonly used in games. The short version is that rather than approximating or outright faking environment lighting and shadowing, ray tracing calculates lighting, reflections, and refractions in a scene by casting virtual rays that realistically simulate the way light interacts with objects. The technology is hardly new—we used POV-Ray as a benchmark in some of our oldest CPU tests, and we still use Corona as a useful test today.
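
To make the idea concrete, here's a minimal sketch of that core loop: cast one virtual ray per pixel, intersect it with the scene (a single sphere here), and shade the hit point against a light. This is a toy illustration of the technique, not DXR or RTX code, and every name and constant in it is invented for the example; it prints the shaded sphere as ASCII art.

```cpp
// Toy ray tracer: one primary ray per pixel, one sphere, one light.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec operator*(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec normalize(Vec v) { return v * (1.0 / std::sqrt(dot(v, v))); }

// Solve |o + t*d - c|^2 = r^2 for the nearest hit distance t along the ray.
bool hitSphere(Vec o, Vec d, Vec c, double r, double& t) {
    Vec oc = o - c;
    double b = dot(oc, d), disc = b * b - (dot(oc, oc) - r * r);
    if (disc < 0) return false;       // ray misses the sphere entirely
    t = -b - std::sqrt(disc);         // nearer of the two intersections
    return t > 0;
}

int main() {
    const int W = 64, H = 32;
    const Vec center{0, 0, -3};
    const Vec light = normalize({1, 1, 1});
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // One primary ray per pixel from a pinhole camera at the origin.
            Vec d = normalize({(x - W / 2) / double(H),
                               (H / 2 - y) / double(H), -1});
            double t;
            if (hitSphere({0, 0, 0}, d, center, 1.0, t)) {
                Vec n = normalize(d * t - center);        // surface normal
                double s = std::max(0.0, dot(n, light));  // Lambert term
                std::putchar(" .:-=+*#%@"[int(s * 9.99)]);
            } else {
                std::putchar(' ');
            }
        }
        std::putchar('\n');
    }
}
```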

Microsoft's new technology is DirectX Raytracing (DXR), an addition to DirectX 12 comprising a suite of functions that can be used to accelerate ray tracing on modern GPUs. The company says that DXR is supported on all current hardware, so you don't need specialized gear to enjoy the benefits. Microsoft also notes that DirectX handles the ray-tracing work like a compute workload. Because of that characteristic, it's possible and even expected that developers will use ray tracing only for some elements in a scene while the rest of it is rasterized normally.
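
That hybrid approach might be structured something like the sketch below. Every type and function here is a hypothetical stand-in rather than the actual Direct3D 12 API (in real DXR, ray passes are dispatched through calls such as DispatchRays on a command list), but it shows how ray-traced passes can slot into an otherwise rasterized frame:

```cpp
// Hand-wavy hybrid frame structure: most passes stay rasterized, and ray
// tracing is dispatched much like a compute pass for the effects that
// benefit. All names below are invented stand-ins, not the D3D12 API.
struct Frame {};
struct GBuffer {};  // per-pixel depth/normals/albedo from the raster pass

GBuffer rasterizeScene(Frame&)                   { return {}; }
void    traceReflections(Frame&, const GBuffer&) {}  // ray-traced pass
void    traceShadows(Frame&, const GBuffer&)     {}  // ray-traced pass
void    compositeAndPresent(Frame&)              {}

void renderFrame(Frame& frame) {
    // 1. Rasterize geometry as usual. The G-buffer records each pixel's
    //    position and normal, i.e. the origin for any secondary rays.
    GBuffer gbuf = rasterizeScene(frame);

    // 2. Ray trace only the effects that need it, dispatched the way a
    //    compute workload would be.
    traceReflections(frame, gbuf);
    traceShadows(frame, gbuf);

    // 3. Everything else (post-processing, UI) stays conventional.
    compositeAndPresent(frame);
}

int main() { Frame f; renderFrame(f); }
```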

Along similar lines, Nvidia announced RTX. The company doesn't go into too much detail on what RTX specifically entails, but does say it's meant to accelerate ray tracing on Volta GPUs. The only such chip in the wild today sits atop the Nvidia Titan V, so it's not impossible that this announcement is subtle foreshadowing for a forthcoming series of GeForce cards based on Volta. RTX is compatible with Microsoft's DXR, so apps that use DXR on a compatible system will automatically make use of RTX if the appropriate Nvidia hardware is in place.

The demo above is from Remedy Entertainment's Northlight engine, and you might notice some noise in the video. Ray tracing can be implemented in multiple ways, but typical "path-tracing" implementations like Brigade have heavy noise in their output due to the limited number of rays that they can cast in a single real-time frame. Last year, Nvidia demonstrated an AI-based ray-tracing denoiser with very promising results. The company says that GameWorks now includes such a denoiser—possibly the same one shown in the linked demo. Nvidia also says that the next version of GameWorks will include further enhancements for ray-traced rendering.
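
To see why denoising is plausible at all, consider the simplest relative of these techniques: temporal accumulation, which blends each new noisy frame into a running history so the per-pixel estimate converges over several frames. The sketch below is only this naive version (real renderers reproject the history using motion vectors and apply far smarter filters, never mind an AI denoiser), and the buffers and alpha value are assumptions for illustration:

```cpp
// Naive temporal accumulation: the per-pixel Monte Carlo estimate from a
// few rays per frame converges as frames are blended into a history buffer.
#include <cstdio>
#include <vector>

// Blend the current noisy frame into the accumulated history, in place.
// alpha = 1.0 keeps only the new frame; smaller values average more frames.
void temporalAccumulate(std::vector<float>& history,
                        const std::vector<float>& noisyFrame,
                        float alpha = 0.1f) {
    for (std::size_t i = 0; i < history.size(); ++i)
        history[i] = (1.0f - alpha) * history[i] + alpha * noisyFrame[i];
}

int main() {
    std::vector<float> history(1, 0.0f);
    // Feed a "1-ray-per-pixel" signal that averages 0.5 but flickers hard;
    // the accumulated history settles near 0.5 anyway.
    for (int frame = 0; frame < 200; ++frame)
        temporalAccumulate(history, {frame % 2 ? 1.0f : 0.0f});
    std::printf("accumulated value: %f\n", history[0]);
}
```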

Both companies say that a few big names in the games industry are working with the new ray-tracing technologies. Alongside Remedy, the list includes 4A Games, EA Games, Epic Games, and Unity. That means that before long, game developers using the Unreal, Unity, or Frostbite engines should be able to experiment with DXR and RTX themselves. 

Comments closed
    • Shinare
    • 2 years ago

    Hello beautiful VR, I’m ready when you are!

    • Scrotos
    • 2 years ago

    Anyone remember Caustic and their ray tracing hardware? Comment from Carmack at the bottom.

    https://arstechnica.com/gadgets/2013/01/shedding-some-realistic-light-on-imaginations-real-time-ray-tracing-card/

    I think they migrated to this, which also failed: https://en.m.wikipedia.org/wiki/OpenRT

    No, wait, that was for SaarCOR. And turned into OpenRL, maybe? https://en.m.wikipedia.org/wiki/Ray-tracing_hardware

    You may remember some custom Quake 3 versions: https://youtu.be/2uwAAepv9TE

    Intel had some guy who did a master's thesis on this do some demos for their cribs: https://arstechnica.com/information-technology/2010/09/intel-shows-off-whats-left-of-larrabee-ray-tracing-wolfenstein/

    I guess that stuff turned into Embree: https://software.intel.com/en-us/articles/embree-photo-realistic-ray-tracing-kernels

    Just gonna leave these here:
    https://www.ospray.org
    https://www.imgtec.com/legacy-gpu-cores/ray-tracing/

      • Scrotos
      • 2 years ago

      And more Intel tech demos:

      https://en.m.wikipedia.org/wiki/Quake_Wars:_Ray_Traced

      And another Quake 3 project, Star Trek: Elite Force, doing ray tracing: https://youtu.be/jLFrP0c7VWw

      There were two thesis students I remember working with Quake 3 for ray tracing.

    • deruberhanyok
    • 2 years ago

    "Microsoft also notes that DirectX handles the ray-tracing work like a compute workload."

    This is the kind of thing people need to keep in mind when they say that GPU makers could just cripple compute capabilities on consumer gaming cards to deter miners from buying up all the stock and causing crazy prices. Compute is already used for a lot more than just mining, and it will be used for even more in the future. Mining was just the low-hanging fruit. (It helped that people were making money off of it.)

    • Anovoca
    • 2 years ago

    Ray-tracing is exciting, but what I really can't wait for is to see ray-traced content with fully realized HDR, where looking at light sources washes out the surrounding colors. Once the camera can mimic the dilation of the iris, we will be stepping into a whole new level of realism.
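
    The effect described here is usually implemented as eye adaptation (auto-exposure) in an HDR tone-mapping pass: measure the scene's average luminance and drift the exposure toward it over time, so that a bright light source washes out everything around it. A minimal sketch, with invented constants and function names:

    ```cpp
    // Eye adaptation: exposure drifts toward the scene's measured brightness.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Log-average luminance, the classic Reinhard-style scene measure.
    float averageLuminance(const std::vector<float>& lum) {
        double sum = 0.0;
        for (float l : lum) sum += std::log(1e-4 + l);
        return std::exp(float(sum / lum.size()));
    }

    // Drift exposure toward the value that maps the scene average to
    // mid-grey, with an exponential lag that mimics the iris adjusting.
    float adaptExposure(float exposure, float sceneAvg, float dt,
                        float speed = 1.5f) {
        float target = 0.18f / sceneAvg;
        return exposure + (target - exposure) * (1.0f - std::exp(-dt * speed));
    }

    int main() {
        float exposure = 1.0f;
        std::vector<float> dimScene(16, 0.2f), brightScene(16, 50.0f);
        // Stare at a light source for one second at 60 fps: exposure drops,
        // so previously mid-toned surfaces now render washed out.
        for (int i = 0; i < 60; ++i)
            exposure = adaptExposure(exposure, averageLuminance(brightScene),
                                     1.0f / 60);
        std::printf("adapted exposure: %f (dim-scene target was %f)\n",
                    exposure, 0.18f / averageLuminance(dimScene));
    }
    ```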

    • Chrispy_
    • 2 years ago

    Wake me up when we have a raytracing/rasterising hybrid engine that can dynamically render a pixel either way, using the most efficient method for each pixel based on its geometry/lighting complexity.

      • NTMBK
      • 2 years ago

      The Nvidia blog post actually talks about this. The example they give is to rasterize the scene normally into a G-buffer, then use that as the start points for raycast reflections.

        • NTMBK
        • 2 years ago

        Blog post from Enscape discussing a similar method: https://gpuopen.com/deferred-path-tracing-enscape/

      • Andrew Lauritzen
      • 2 years ago

      The entire point of this stuff being in DirectX is to mix various techniques to get the best image-quality and performance trade-off. I don’t expect any realistic real-time workloads to be doing pure raytracing in the near future.

      For our demo almost all of the ray tracing passes also involve significant temporal reprojection, filtering and reconstruction for instance, and I expect this to be the case for everyone. Many of our effects can use a mixture of screen space effects (SSAO, SSR, shadow maps) and raytracing, but honestly in most cases so far we’ve found that once you have bothered to generate an appropriate acceleration structure, it is often higher quality and even faster to just do pure raytracing for the relevant passes. The screen space effects that have been trying to emulate global light transport have significant limitations; this is really the whole reason to do raytracing after all.

      So while I don’t really get your point about choosing rasterization vs. raytracing “per pixel”, integrating a mixture of techniques deeply in the renderer is precisely the point of having this API in DirectX, and indeed already what we (and I imagine other folks) are doing.

    • Klimax
    • 2 years ago

    Why the hell are they tying it to a bad idea like DirectX 12? (Same would be the case with Vulkan, BTW)

    • GrimDanfango
    • 2 years ago

    Most modern high-fidelity games already have raytracing – Screen Space Reflections.

    As with anything where the “holy grail” of realtime raytracing comes up… this won’t herald the obsolescence of traditional raster-based rendering whatsoever. Whatever raytracing can do with basic geometry and materials, raster-based pipelines will be able to do 100x faster, and ultimately look better.

    This is certainly cool stuff however… and it may mean that there’ll be more raytracing-based effects introduced into games’ rendering pipelines.
    …but any talk of it being a workable total alternative is nonsense. Raytracing makes much more sense as a complement to current graphics, not as a replacement.

    Edit: Hmm, perhaps I spoke too soon… it does sound like raytracing could be making some fairly significant inroads in realtime rendering after all, and I suppose these demos aren’t really making the claim that it’ll be a total replacement for the time being anyway, so much as providing powerful new tools.

      • Pancake
      • 2 years ago

      Ray tracing does much better with tonnes of geometry. Rasterisation scales linearly with the number of triangles to render; ray tracing is sub-linear. So, at a certain point in scene complexity – all being equal – ray tracing will beat rasterising lots of geometry.

      Of course, dynamic level of detail and other techniques can change the balance somewhat, which is why GTA V can render a whole city view using rasterization. But it isn’t as simple as you imply.

      What is interesting is the difference between CPU and GPU ray tracing. Even with a mighty 8700K you’ve got at most 12 threads in play. Ray tracing is pointer-chasing through sparse data structures located all over memory (assuming a complex scene). So even with 12 threads, they can all be stalled waiting on the poor memory system trying to keep them fed.

      With thousands of threads, as on a GPU, a whole bunch of them can stall without it really mattering: the memory scheduler can more optimally batch up threads waiting on about the same part of memory at the same time. So you’re getting better throughput from your RAM.

      So, RAM latency sucks.
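
      A minimal sketch of that sub-linear behaviour, assuming a toy pointer-based BVH (real implementations flatten the tree into arrays, and the pointer-chasing cost described above lives exactly in this traversal): a ray skips every subtree whose bounding box it misses, so it only ever tests a small fraction of the scene's triangles.

      ```cpp
      // Toy BVH traversal: count how many triangles a ray actually tests.
      #include <algorithm>
      #include <cstdio>
      #include <vector>

      struct Vec  { float x, y, z; };
      struct Aabb { Vec lo, hi; };

      // Standard slab test: does the ray from o along d hit box b?
      bool hitAabb(const Aabb& b, Vec o, Vec d) {
          float tmin = 0.0f, tmax = 1e30f;
          auto slab = [&](float lo, float hi, float oc, float dc) {
              float inv = 1.0f / dc;
              float t0 = (lo - oc) * inv, t1 = (hi - oc) * inv;
              if (t0 > t1) std::swap(t0, t1);
              tmin = std::max(tmin, t0);
              tmax = std::min(tmax, t1);
              return tmin <= tmax;
          };
          return slab(b.lo.x, b.hi.x, o.x, d.x) &&
                 slab(b.lo.y, b.hi.y, o.y, d.y) &&
                 slab(b.lo.z, b.hi.z, o.z, d.z);
      }

      struct Node {
          Aabb box;
          const Node* left;       // null for leaves
          const Node* right;
          std::vector<int> tris;  // triangle indices, leaves only
      };

      int trianglesTested(const Node* n, Vec o, Vec d) {
          if (!n || !hitAabb(n->box, o, d)) return 0;  // subtree skipped
          if (!n->left && !n->right) return int(n->tris.size());
          return trianglesTested(n->left, o, d) + trianglesTested(n->right, o, d);
      }

      int main() {
          Node leafL{{{-2, -1, -1}, {-1, 1, 1}}, nullptr, nullptr, {0, 1, 2}};
          Node leafR{{{ 1, -1, -1}, { 2, 1, 1}}, nullptr, nullptr, {3, 4, 5}};
          Node root {{{-2, -1, -1}, { 2, 1, 1}}, &leafL, &leafR, {}};
          // A ray aimed at the right box never touches the left leaf's triangles.
          std::printf("triangles tested: %d of 6\n",
                      trianglesTested(&root, {1.5f, 0, -5}, {0, 0, 1}));
      }
      ```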

        • GrimDanfango
        • 2 years ago

        “… all being equal …”

        *If* all else was equal… maybe. But surely it’s not, is it? End-to-end raytracing of this sort requires a huge amount more computation to put a single shaded triangle on the screen. Rasterising starts out with a huge advantage.

        I’m certainly no expert, but surely that “break-even” point you’re suggesting would only come about in scenes with something on the order of many billions of polygons… at which point, sure, the raytracing would be quicker… say 0.1 fps to the raster pipeline’s 0.05 fps. Neither would be much use for realtime engines.

        I do see your point though… perhaps I am wrong to suggest that raytracing will never take over… and I do get the feeling more and more of the realtime pipeline will start to integrate raytraced elements as time goes on. I do think the point where rasterising is entirely replaced is easily 10+ years away yet, however.

          • Pancake
          • 2 years ago

          Ray-tracing doesn’t require a “huge amount more computation”, actually. You’ve basically got the same dot-product, cross-product linear-algebra functions that CPUs/GPUs are already very good at. Of course, with non-realtime rendering you have the luxury of complex procedural materials and other stuff. But *handwaves* all being equal…

          The BIG difference is in the lack of coherency (randomness) in memory access when feeding data to a ray-tracing engine.

          Consider traditional rasterisation. You have something like a great big vertex buffer; you read it in linear order and feed it to your triangle renderers. This works well with cache hierarchies and the way DRAM works: suck up lots of memory in linear order and feed it to the renderers.

          With ray-tracing you’re pointer chasing and basically stalling your threads on random reads from memory and not getting good use out of your cache hierarchy.

          CPUs and GPUs today have MASSIVE computational throughput provided you can keep the ALUs fed. But the problem is that memory has to be accessed in a certain way (not randomly) to keep them utilised. There are lots of algorithms/problems in the space I work in (GIS analysis) that have the same ALU-utilisation issues as ray tracing.
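
          A toy contrast of those two access patterns: streaming a buffer in order (the rasterizer's vertex-buffer case) versus chasing pointers from node to node (the ray tracer's case). The arithmetic is identical; on real hardware the linear walk runs several times faster per element because prefetchers and cache lines keep it fed. The nodes here are allocated contiguously for brevity, whereas a real scene scatters them:

          ```cpp
          // Linear streaming vs. pointer chasing: same math, different memory.
          #include <cstdio>
          #include <numeric>
          #include <vector>

          float sumLinear(const std::vector<float>& buf) {
              // Consecutive addresses: prefetchers pull whole cache lines ahead.
              return std::accumulate(buf.begin(), buf.end(), 0.0f);
          }

          struct Node { float value; const Node* next; };

          float sumChased(const Node* n) {
              // Each load's address depends on the previous load; a cache miss
              // here stalls the thread with nothing else to do (on a CPU).
              float s = 0.0f;
              for (; n; n = n->next) s += n->value;
              return s;
          }

          int main() {
              const std::size_t N = 1000;
              std::vector<float> buf(N, 1.0f);
              std::vector<Node> nodes(N);
              for (std::size_t i = 0; i < N; ++i)
                  nodes[i] = {1.0f, i + 1 < N ? &nodes[i + 1] : nullptr};
              std::printf("linear: %.0f, chased: %.0f\n",
                          sumLinear(buf), sumChased(&nodes[0]));
          }
          ```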

            • GrimDanfango
            • 2 years ago

            Interesting insights, thanks Pancake (and also Andrew Lauritzen in other comments)

            I’ll certainly welcome true raytracing in whatever form if it really is moving beyond pure proof-of-concept.

            Even if just for true ambient occlusion alone… I’m in the slowtime-rendering VFX side of things myself, and if one effect makes me cry more than any other in current realtime game tech, it’s Screen Space Ambient Occlusion.
            Well, more specifically, it’s the tendency most developers have of over-cranking it until every hint of a corner contains inky-black smudges, and every character walks around with a weird anti-halo. Even the likes of Witcher 3 and Deus Ex: Mankind Divided do it.

            I presume a raytracing-based solution would function a lot better as a straight “make picture look good” tool, and hopefully remove the need for developers to show restraint in how they apply filters 🙂

            • Pancake
            • 2 years ago

            Let’s compare memory latency in the old days and the new days *handwaves*:

            80486: random access fetch ≈ 200 ns, 1 MFLOPS

            i7-8700K: random access fetch ≈ 70 ns, 70,000 MFLOPS (per core!)

            So, in the last 30 or so years, RAM latency has improved roughly 3x but FLOPS 70,000x (per core!). And the good old 486 was already orders of magnitude faster at floating point than the old minicomputers that pioneered ray-tracing!

      • Andrew Lauritzen
      • 2 years ago

      Screen-space effects are not really “ray tracing” in the same sense as this technology, of course. Indeed, they have significant limitations… toggling between SSR and ray-traced reflections, for instance, is quite night and day. Obviously SSR is better than nothing, but there’s a reason why we as an industry are pursuing ray tracing after all: there are significant limitations to quality without true global effects.

      Certainly there have been a variety of methods around voxels and surfels and the like to accomplish similar things and none of that will go away I imagine. But ray tracing is a good fundamental tool to have available both to improve those methods and to enable stuff like physically accurate soft shadows that are difficult or impossible to achieve with traditional methods.

    • Tristan
    • 2 years ago

    No animations, no refractions. How does it handle antialiasing, smoke, fur, particles, and other such effects? It may be very difficult to merge rasterisation and raytracing while maintaining graphical consistency in the image.

      • stefem
      • 2 years ago

      They put up this demo in a very short time, and there is another demo (the EA/DICE one, I think) where you can see refraction. AA wouldn’t be a problem, nor would other effects.

    • Andrew Lauritzen
    • 2 years ago

    Shameless self-promotion: also see SEED’s demo (https://www.youtube.com/watch?v=LXo0WdlELJk). Was a lot of fun to work on 🙂

      • YukaKun
      • 2 years ago

      Looks neat.

      Do you know how well it scales with pixel growth? That scene, although complex, is incredibly small compared to, say, Witcher 3’s open world.

      Illumination techniques are currently by far the most expensive part of rendering, along with, maybe, real-time reflections. Both problems more or less get resolved with ray tracing, but that depends on the depth of the implementation.

      Which brings me to the question: how deep is it reflecting?

      Cheers!

        • Andrew Lauritzen
        • 2 years ago

        In general, ray tracing scales well with geometric complexity and pretty linearly with pixel counts. The main thing that has made this practical in real time (albeit on high end hardware currently) is vast improvements in temporal reprojection, filtering/reconstruction and denoising. That’s what allows you to turn a few rays per pixel from a pile of noise into a nice image 🙂

        There’s a variety of different ray-tracing effects happening in our demo including reflections, shadows and GI. I’ll avoid spoiling some of the details before Wednesday as we have some talks at GDC (I’m sure slides will get posted) that describe them at a high level. To answer your question about “depth”, the GI is iteratively updated so the depth can get quite high but large illumination changes will of course take a few frames to reconverge. In practice it works quite well.

        The scene is mostly simple because it’s a research prototype (generated largely procedurally, no less) and was done in a fairly limited time window. The performance is not particularly sensitive to the geometric complexity, though; it’s much more sensitive to the ray-tracing sample counts (~resolution, although there’s lots of temporal reprojection and filtering going on) and so on.

          • YukaKun
          • 2 years ago

          Oooh, neat!

          Thanks for taking the time to answer my questions.

          I hope you do well at GDC and have a lot of fun 😀

          EDIT: One question that just popped up, in case you can answer it… Were you using a Volta-gen video card (or cards) to render everything? 🙂

          Cheers!

            • Andrew Lauritzen
            • 2 years ago

            In our GDC demo we are indeed using a Titan V (or optionally multiple of them), but the renderer can of course scale down to somewhat more modest hardware.

            • dodozoid
            • 2 years ago

            Do you work at nVidia now, or another company utilising the tech?

            • stefem
            • 2 years ago

            I think he works for EA/DICE, judging by his self-promotion efforts 😉

            • ludi
            • 2 years ago

            And that tech demo is re-using music from EA’s SSX Blur, if I’m not mistaken…

            • Andrew Lauritzen
            • 2 years ago

            I work for EA’s SEED research group that did the demo/tech that I linked. (ea.com/seed)

            • YukaKun
            • 2 years ago

            What would you say is the baseline for it in the near future?

            For example… Say… Would a Radeon RX 480, or the Volta equivalent of a GTX 1060 6GB, be enough to take full advantage of RT?

            It is promising, but as with any tech-demo, reality strikes hard sometimes when the real products arrive =/

            Cheers!

    • JosiahBradley
    • 2 years ago

    Everything is way too shiny in the demo. Napkins are matte.

      • RAGEPRO
      • 2 years ago

      I think what you’re seeing is a magazine. There are several napkins visible that are clearly matte.

      That said, they’re like #FFFFFF white and insanely luminous. Nothing in real life is that luminous if it isn’t a light source. Still, I think this is probably a relatively hastily constructed demo, and it looks pretty good for real-time stuff.

      Personally, I think some of the work coming out of the Brigade guys is more impressive, though: https://www.youtube.com/watch?v=pXZ33YoKu9w&t=2m09s

        • morphine
        • 2 years ago

        “Nothing in real life is that luminous if it isn’t a light source”

        Speak for yourself, peasant.

          • psuedonymous
          • 2 years ago

          \o/

          • KeillRandor
          • 2 years ago

          (Sounds like someone’s pregnant and glowing)

    • BobbinThreadbare
    • 2 years ago

    Nitpick with the framing here: there were ray-tracing game engines in the early days, but rasterization took off, and countless sums of money have been spent building dedicated hardware acceleration for it, whereas ray tracing had to be done in software. There is probably an alternate timeline in another dimension where the reverse happened.

      • chuckula
      • 2 years ago

      Well duh! I knew I had the raytracing solution when I saw myself with the goatee and without the leather jacket!
      — Jen Hsun.

      • terranup16
      • 2 years ago

      It’s an interesting thought experiment, tbqh. I was going to toss something dumb back about ray-tracing not really being something different hardware architectures would have resolved: we’ve ended up with the primary use of die space in GPUs going toward highly parallel “generic” computational “cores” pushed to the highest speeds they can sustain, which I’d figure is exactly what ray-tracing would want anyway. But in the earlier days of rasterization I probably wouldn’t have expected tessellation to become such a significant hardware-accelerated lynchpin for improving geometric detail. It wouldn’t surprise me if ray-tracing has similar shortcuts (nVidia’s denoiser sounds like it could be one) that could be accelerated via dedicated hardware.

      • YukaKun
      • 2 years ago

      Didn’t id Tech 3 have ray-tracing? Am I remembering wrong?

      Also, wasn’t ray-tracing one of the “killer features” of Larrabee?

      Cheers!

        • smilingcrow
        • 2 years ago

        Well it had one killer feature which it used on itself.

      • ludi
      • 2 years ago

      Not so sure about that second part. Rasterization took off in a big way precisely because it reduced the computing and memory requirements to something that was possible with the hardware of the day. Even now with 30 years of hardware improvements beyond the wildest imagination of anyone tinkering with an Amiga in the late 1980s, this relatively simple demo is still not a trivial task. It is pretty impressive when you consider the technique but the visual impact is no better than what Crysis could do in 2007.

      • Mat3
      • 2 years ago

      I don’t think ray tracing could have worked in any alternate dimension as a replacement for rasterization but I could definitely imagine a world where voxels took off instead of polygons.
