Nvidia lists games that will support Turing cards’ hardware capabilities

Nvidia's just-announced GeForce RTX family of graphics cards opens up two new horizons of possibility for game developers: the use of ray tracing to improve the quality of reflections, ambient occlusion, shadows, and global illumination with acceleration from Nvidia's RT cores, and the use of deep learning models on consumer graphics cards with Nvidia's tensor cores. The first major uses of those tensor cores appear to be denoising algorithms for use in tandem with the RT cores, as well as a new form of high-quality anti-aliasing Nvidia calls “Deep Learning Super-Sampling.”
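
To make that division of labor concrete, here is a rough, hypothetical sketch of how one frame of hybrid rendering might fit together. The stage names and types below are illustrative stand-ins, not the actual RTX, DXR, or DLSS APIs.

```cpp
// Hypothetical sketch of a hybrid-rendering frame; NOT the actual RTX/DXR API.
// All types and function names here are illustrative stand-ins.
#include <cstdio>

struct GBuffer {};     // rasterized depth/normals/albedo
struct RayResults {};  // noisy ray-traced reflections, shadows, AO, GI
struct Image {};       // final frame

// Stage 1: traditional rasterization still produces primary visibility.
GBuffer rasterize_scene() { return GBuffer{}; }

// Stage 2: rays are traced only for the effects that benefit (reflections,
// shadows, ambient occlusion, global illumination), at a low sample count --
// this is the work the RT cores accelerate.
RayResults trace_secondary_rays(const GBuffer&) { return RayResults{}; }

// Stage 3: a learned denoiser cleans up the sparse ray results -- one of the
// first advertised uses of the tensor cores.
RayResults denoise(const RayResults& r) { return r; }

// Stage 4: composite the ray-traced effects over the rasterized image, then
// optionally reconstruct a higher resolution (the DLSS-style step).
Image composite_and_upscale(const GBuffer&, const RayResults&) { return Image{}; }

int main() {
    GBuffer gbuf = rasterize_scene();
    RayResults rays = denoise(trace_secondary_rays(gbuf));
    Image frame = composite_and_upscale(gbuf, rays);
    (void)frame;
    std::puts("one hybrid frame rendered (conceptually)");
}
```

The point of the hybrid approach is that rasterization still does the bulk of the work; rays are spent only where they pay off, and the tensor cores clean up and upscale the result.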

Shadow of the Tomb Raider with RTX enabled. Note the interplay of lighting colors and natural soft shadows

While those technologies are hardware-accelerated by Turing graphics processors, the work doesn't stop there. Developers have to bake support for those features into their titles, and they're not required to implement support for every Turing trick. Some games will offer support for hybrid rendering using Turing cards' traditional rasterization and ray-tracing resources—what one might call the full-fat RTX experience—as well as DLSS. Other titles will include support for DLSS only.

To help gamers figure out how Turing graphics cards will improve their gameplay experiences, Nvidia has compiled lists of titles that will support hybrid rendering and titles that will support DLSS. Here's the launch slate of games that will support hybrid rendering with rasterization and real-time ray tracing:

And here's the lineup of titles that will support DLSS, either on its own or alongside hybrid rendering:

On its GeForce blog, Nvidia posted detailed accounts of how several of the above games make use of Turing hardware. MechWarrior 5, Atomic Heart, and Assetto Corsa Competizione use RTX to produce ambient occlusion, reflections, and shadows. Battlefield V uses RTX reflections. Control uses ray tracing to simulate reflections, diffuse global illumination, and contact shadows. Metro Exodus shoots rays to simulate global illumination and ambient occlusion. Finally, Shadow of the Tomb Raider uses ray tracing to produce lifelike soft shadows.
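
As a rough illustration of what "shooting rays to simulate ambient occlusion" means, here is a self-contained CPU sketch. The one-sphere scene and every name in it are made up for the example; real games trace rays against full scene geometry, with the RT cores doing the intersection work.

```cpp
// Minimal CPU sketch of ray-traced ambient occlusion. Illustrative only.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { double x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Does a ray from `origin` along unit direction `dir` hit the given sphere?
static bool hits_sphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
    Vec3 oc = sub(origin, center);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * c;
    if (disc < 0.0) return false;
    double t = (-b - std::sqrt(disc)) / 2.0;
    return t > 1e-4;  // intersection in front of the origin
}

int main() {
    const double kPi = 3.14159265358979323846;
    Vec3 shading_point{0, 0, 0};  // point being shaded, surface normal = +Y
    Vec3 occluder{0, 1.5, 0};     // a sphere hovering above it
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);

    const int samples = 4096;
    int unoccluded = 0;
    for (int i = 0; i < samples; ++i) {
        // Sample a random direction in the hemisphere above the surface.
        double phi = 2.0 * kPi * uni(rng);
        double cos_theta = uni(rng);
        double sin_theta = std::sqrt(1.0 - cos_theta * cos_theta);
        Vec3 dir{sin_theta * std::cos(phi), cos_theta, sin_theta * std::sin(phi)};
        if (!hits_sphere(shading_point, dir, occluder, 1.0)) ++unoccluded;
    }
    // AO term: the fraction of the hemisphere that can "see" the sky.
    std::printf("ambient occlusion = %.3f\n", double(unoccluded) / samples);
}
```

Games compute something like this per pixel with far fewer rays per frame, which is why the tensor-core denoising step matters so much.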

Atomic Heart's RTX effects

Given the pre-release status of most, if not all, of the games that will support hybrid rendering, it's perhaps no surprise that Nvidia has deferred the hard launch of GeForce RTX cards until September 20. Alongside these titles, Nvidia says more than 40 developers are working on future games that will support RTX features, although it's not clear whether that means hybrid rendering support and DLSS, DLSS only, or other, as-yet-undisclosed possibilities opened up by Turing hardware. As always, we're eager to see what Turing cards can do in these titles once they're in the TR labs.

Comments closed
    • wingless
    • 1 year ago

    60+ FPS in PUBGGGGGGGGGG!!!!!!!!!!!!!!!!!! The future is here!…./sarcasm

    • Chrispy_
    • 1 year ago

    If game developers are truly going to use ray-tracing in games properly, it has to be ubiquitous in hardware support, otherwise it’ll just be a gimmicky feature like GPU Physx, Hairworks, TressFX, etc – something that even Geforce owners turn off, to improve performance.

    Game developers are not charities, they need to turn a profit and they do this by appealing to the broadest possible audience for the least amount of effort possible. This is why even most AAA games will run on an Intel Integrated Graphics Potato, as long as you turn the graphics down.

    Until ray-tracing is the [b<]default[/b<] and [b<]ONLY rendering mode[/b<] that the lowest common denominator of customer hardware supports, it will be limited to the sort of thing that can be bolted on to a game with very little effort. So enhanced water/floor reflections most likely, and quite possibly nothing else. Developers still have to make games with fake, simplified lighting, baked-in lightmaps and shadowmaps, optimising their game assets for this 'fake-first' way of lighting and shadowing everything. They need to do this for mobile games, the Wii U, the Switch, the XB1, the PS4, and gaming laptops. That short list already covers 99% of the umbrella term "gaming market" by revenue - and [b<]NONE[/b<] of those have, or can support, a 180W+ piece of proprietary raytracing hardware.

      • psuedonymous
      • 1 year ago

      [quote<]Until ray-tracing is the default and ONLY rendering mode that the lowest common denominator of customer hardware supports, it will be limited to the sort of thing that can be bolted-on to a game with very little effort.[/quote<] Until hardware texture & lighting is the default and ONLY rendering mode that the lowest common denominator of customer hardware supports, it will be limited to the sort of thing that can be bolted-on to a game with very little effort.

        • Chrispy_
        • 1 year ago

        Yup. It took about five years and two major releases of DirectX (6 and 7) between the GeForce 256 (the first hardware T&L engine) coming to market and the point when the first of a [url=https://en.wikipedia.org/wiki/List_of_games_using_hardware_transform_and_lighting<]rather short list of games[/url<] that [b<]required[/b<] HT&L [i<]started[/i<] to appear. That list is short because hardware T&L was replaced by unified shading, which was the direction the consoles moved in, and therefore the direction all the game engines moved in. I'm not saying it won't happen with raytracing, I'm saying it will take a long time and be driven primarily by the next round of console architecture, and the PS4 Pro and XB1X will still be smelling fresh for at least a couple of years.

      • Kretschmer
      • 1 year ago

      Yeah, this is early adopter tech right now. But we’ll all tip our hats to them in 3 years when RT is ubiquitous, just like when VR goes mainstream in 5 years.

        • DPete27
        • 1 year ago

        I agree. RT (or any other emergent tech) rarely gains an immediate foothold, but it will NEVER happen if someone doesn’t take the first step.

        I think Nvidia’s problem is that, while they certainly have enough money to push things into adoption, they like to keep their ideas proprietary. AMD’s problem is that they don’t have enough money to push things into adoption.

      • K-L-Waster
      • 1 year ago

      Sooo no one should introduce anything new because, since the existing install base doesn’t have it, it will never get adopted?

        • Chrispy_
        • 1 year ago

        The place to introduce stuff these days is the consoles. Back in the hardware T&L days and the pixel/vertex changeover, consoles were inferior hardware and a minority market segment. Not only were consoles as a whole outsold by PC gaming, they weren’t even consistent platforms. Developers had to work on XB360 or PS3 and the two platforms were drastically different.

        These days, they just code for the same architecture. PS4 and XBone are different flavours of the same thing, and the PC gaming market is now the minority market. Consoles drive game development now, and as a PCMR enthusiast, I don’t like that fact either, but it’s a hard truth that we need to accept.

          • chuckula
          • 1 year ago

          So you are arguing that AMD’s console monopoly harms consumers?

          Careful Icarus, you tread on thin ice by implicitly accusing AMD of the same behavior that is always attributed to Intel and Nvidia.

            • Anonymous Coward
            • 1 year ago

            A funny point, but “same behavior” is a stretch.

            • K-L-Waster
            • 1 year ago

            My objection here isn’t with AMD themselves, but with the positions their fans are taking.

            * NVidia introduces new tech, and the response is “too expensive! too proprietary! stick to incremental performance improvements! it’s useless if consoles don’t adopt it!”

            * Intel sticks to incremental performance improvements, and the response is “when did Intel ever innovate? only AMD is pushing things forward!”

            You can bet if it was AMD introducing real-time ray tracing the very same people would be saying “wow that’s innovative! NVidia never does anything cool like this!”

            It’s basically identity politics, semi-conductor edition.

          • K-L-Waster
          • 1 year ago

          So whoever has the console SOC contracts has the sole right to introduce new tech?

          Sorry, don’t buy it.

            • Anonymous Coward
            • 1 year ago

            Not the sole right, but where the consoles go, mainstream games are sure to follow.

            • Chrispy_
            • 1 year ago

            This is kind of it, right now. Look at the publicly available revenue figures.

            As I said earlier, developers are businesses that follow the money, regardless of how much love they have for PC gaming.

            No money = no business and there’s no circumventing that law.

          • Voldenuit
          • 1 year ago

          [quote<]The place to introduce stuff these days is the consoles.[/quote<] This is exactly the wrong place to introduce new tech, especially tech with a high silicon budget. The RT units on Turing take up 50% as much die space as the traditional shader cores. They're one reason the die sizes are huge on the Turing cards. Once 7nm production is mainstream and in mass volume, consoles could probably add these features at an acceptable cost (to price, die size, and rate). But consoles (and console users) are simply too price-sensitive to push the envelope with new tech like this.

      • Krogoth
      • 1 year ago

      Nvidia is playing the long-game. They are hoping that ray-tracing acceleration reaches critical mass in the mainstream market and that console vendors are going to be using their platform over AMD RTG.

      In the event that it doesn’t pan out, ray-tracing acceleration is going to solidify their hold on the professional graphics market and be a “halo feature” in the high-end PC gaming market.

    • ronch
    • 1 year ago

    Gee I wonder what AMD is doing right now…

      • Chrispy_
      • 1 year ago

      Selling Ryzens and Threadrippers.

        • ronch
        • 1 year ago

        AMD RTG, that is, son.

          • NTMBK
          • 1 year ago

          Selling consoles.

            • Goty
            • 1 year ago

            *Getting small licensing fees for consoles.

            • ronch
            • 1 year ago

            Yow!!!

      • Kretschmer
      • 1 year ago

      Refocusing on console-class GPUs and selling CPUs?

        • Anonymous Coward
        • 1 year ago

        Did they officially say they are refocusing small? I’d think that given the scalability of GPUs and their software, there is little incentive to back off from making a big range of chips. Not the biggest, but still substantial.

      • emphy
      • 1 year ago

      My guess is that they’ll be pretty relaxed whilst nvidia takes all the risk. AMD can afford to do so since they have de facto control of the gpu features list in the next generation of consoles.

        • Anonymous Coward
        • 1 year ago

        Yeah that Ryzen-Radeon combo seems unbeatable in consoles, a real solid foundation for keeping AMD relevant in PC gaming despite the competition there. Strategic.

    • USAFTW
    • 1 year ago

    Thanks, I’ll keep in mind to turn RTX off in these games.

    • albundy
    • 1 year ago

    All I care about are MechWarrior 5: Mercs and Serious Sam 4. Man, what’s it been, like 20 years for a real MechWarrior game to appear since MW4: Mercs? The ones in between were a joke and laughable at best.

    • Philldoe
    • 1 year ago

    Translation: nVidia details a list of games that will run like crap on their older hardware and on all AMD hardware because they shoved a bunch of BS into these games.

      • DancinJack
      • 1 year ago

      Another person that doesn’t read. Good good.

      • NTMBK
      • 1 year ago

      You know you’ll be able to turn these features off, right?

      • Kretschmer
      • 1 year ago

      This is why adult literacy matters. Thank you Barbara Bush.

    • sweatshopking
    • 1 year ago

    Physx, anyone?

      • shank15217
      • 1 year ago

      Killed, Nvidia style. Never mind the 6-8 core CPUs that do basically nothing these days; software PhysX with proper multi-thread support could have done so much good.
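
      As a generic, hedged sketch of the commenter’s point (this is not PhysX or its API; every name here is made up), a software physics step can be split across however many cores the machine reports:

      ```cpp
      // Generic illustration of a CPU physics step parallelized across cores.
      #include <algorithm>
      #include <cstddef>
      #include <cstdio>
      #include <functional>
      #include <thread>
      #include <vector>

      struct Particle { float x, y, z, vx, vy, vz; };

      // Integrate one slice of the particle array (gravity + Euler step).
      static void integrate_range(std::vector<Particle>& p, std::size_t begin,
                                  std::size_t end, float dt) {
          for (std::size_t i = begin; i < end; ++i) {
              p[i].vy -= 9.81f * dt;
              p[i].x += p[i].vx * dt;
              p[i].y += p[i].vy * dt;
              p[i].z += p[i].vz * dt;
          }
      }

      int main() {
          std::vector<Particle> particles(1'000'000, Particle{0, 100, 0, 1, 0, 0});
          const unsigned threads = std::max(1u, std::thread::hardware_concurrency());
          const float dt = 1.0f / 60.0f;

          // Split the work evenly across the available cores.
          std::vector<std::thread> pool;
          std::size_t chunk = particles.size() / threads;
          for (unsigned t = 0; t < threads; ++t) {
              std::size_t begin = t * chunk;
              std::size_t end = (t + 1 == threads) ? particles.size() : begin + chunk;
              pool.emplace_back(integrate_range, std::ref(particles), begin, end, dt);
          }
          for (auto& th : pool) th.join();

          std::printf("stepped %zu particles on %u threads\n", particles.size(), threads);
      }
      ```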

      • DPete27
      • 1 year ago

      Can’t upvote this enough.

      …except now they can add reflections and shadows to their pretty glitter clouds.

      • DragonDaddyBear
      • 1 year ago

      I can’t believe I’m about to link an LTT video.

      [url<]https://youtu.be/H9nZWEekm9c[/url<] This is from 2 years ago but, since the hardware in the review is just now being upgraded, it is still relevant. PhysX just destroys the frame rates for minimal gain.

      • ColeLT1
      • 1 year ago

      I just hope it sticks, and doesn’t go the way of Vulkan / Mantle. I enjoyed PhysX on the Borderlands series and used Vulkan on Doom.

        • chuckula
        • 1 year ago

        Vulkan isn’t dead yet!

        In fact, it has ray tracing too:
        [url<]https://www.phoronix.com/scan.php?page=news_item&px=Vulkan-Ray-Tracing-NVIDIA[/url<]

    • YukaKun
    • 1 year ago

    So they’ve been showering game devs with “”””help”””” to get RTX stuff into games, just to make them marketable enough for the grand reveal, only to find that real-world performance will most likely drop your FPS to the 20s on the cards you’ll actually be able to buy…

    I wonder who told Mr Huang this was a clever plan… Unless reviews are incredibly biased, I’m expecting a real sh!t show.

    Cheers!

      • Voldenuit
      • 1 year ago

      Microsoft worked with both nv and amd when adding DirectX Raytracing (DXR) to DX 12, so this is not some lone star effort.

      Ray-tracing will eventually become a standard tool in a game developer’s toolbox, just like all the other hardware advances in GPUs that were originally limited in support (Hardware TnL, Texture compression, Pixel and Vertex Shaders, Stencil Shadows, Tesselation, etc.)

        • Goty
        • 1 year ago

        I was under the impression that DXR and NVIDIA’s real-time raytracing were separate technologies. Am I mistaken in that?

          • Voldenuit
          • 1 year ago

          From [url=https://developer.nvidia.com/rtx/raytracing<]nvidia's page on RTX[/url<]: [quote<]NVIDIA RTX platform includes a ray tracing technology that brings real-time, cinematic-quality rendering to content creators and game developers. Developers can access NVIDIA RTX ray tracing through the NVIDIA OptiX application programming interface, through Microsoft’s DirectX Raytracing API (DXR) and, soon, Vulkan, the new generation, cross-platform graphics standard from Khronos Group.[/quote<]

    • Symmetry
    • 1 year ago

    Wow, seeing “Mechwarrior 5: Mercenaries” I was wondering why a game I grew up playing was using this but apparently half the Mechwarrior games get a “: Mercenaries” expansion.

      • meerkt
      • 1 year ago

      At least the number is different between MechWarrior 2: Mercenaries and MechWarrior 5: Mercenaries. Unlike Hitman 2 (2018) vs Hitman 2 (2002).

        • NTMBK
        • 1 year ago

        Hey don’t forget Mechwarrior 4: Mercenaries! That game was solid.

        Plus it had simple enough graphics that you could probably add ray tracing and run it at >30 fps.

        • kvndoom
        • 1 year ago

        You forgot MechWarrior 4: Mercenaries though.

        Ah, NTMBK beat me to it. :)

          • meerkt
          • 1 year ago

          It was an expansion of sorts, no? So it doesn’t count. Expansions have a different namespace, of course.

            • NTMBK
            • 1 year ago

            Nope, standalone game.

    • bandannaman
    • 1 year ago

    [quote<]Deep Learning[/quote<] eyeroll

      • chuckula
      • 1 year ago

      This is like when Mac users talk about all the games that Macs run.

      And Photoshop is always in the top 5.

        • derFunkenstein
        • 1 year ago

        I remember around a decade ago the Mac games I listed included StarCraft and Diablo 2. And, it should be noted, I was seriously arguing the point. You can “game” on a Mac, for various definitions of “game.”

        I mean, I still play those games in 2018, but that’s not the point of a Mac.

          • derFunkenstein
          • 1 year ago

          I didn’t say it was a good argument. 😆

        • Concupiscence
        • 1 year ago

        Whenever they officially drop OpenGL, it’s going to completely decimate their game backlog. I’ve heard rumors that 2020 is the hard cutoff date for GL support in macOS, at which point they’ll presumably start jumping to ARM because legacy software support won’t [i<]be[/i<] an issue on macOS ARM.

      • Freon
      • 1 year ago

      Powered by quantum blockchain!

    • hubick
    • 1 year ago

    As someone who plays Battlefield almost exclusively, I can’t wait to see BFV comparison shots with RTX enabled/disabled.

      • Krogoth
      • 1 year ago

      It’ll be very minor at best (just subtle differences in shadowing and reflections, with a noticeable tax on performance).

      Developers/artists have barely just got their hands on the hardware/tools. Give them time to get used to it. It has always been this way with every major change in the graphics rendering pipeline.

      BFV’s successor will most likely utilize it, but this time with a noticeable difference.

      • NTMBK
      • 1 year ago

      The difference is that with it on you’ll be dead, due to rubbish framerates getting you killed.

      • Goty
      • 1 year ago

      The reflections look good once you get past the EVERY MODERATELY REFLECTIVE SURFACE IS NOW A PERFECT MIRROR effect they seem to employ.

    • gecko575
    • 1 year ago

    So how many games are going to have obnoxious neon RGB lighting to show off “real” ray tracing? Also, Cyberpunk 2077 seems to be arriving at the right time.

      • tay
      • 1 year ago

      Yeah the linked video has obnoxious lighting with the lights flashing at a restaurant that would be sure to drive any patron to violence.

      edit – nm some of the other linked videos look good. Impressed.

    • Leader952
    • 1 year ago

    NVIDIA RTX Features Detailed for BFV, Shadow of the Tomb Raider, Metro Exodus, Control, and More Games

    [url<]https://wccftech.com/nvidia-rtx-features-detailed-games[/url<] This has much more detail on what RTX features are used in each game.

    • chuckula
    • 1 year ago

    [Looks hopefully for Dwarf Fortress]

    [Leaves sad]

      • Krogoth
      • 1 year ago

      Pffft, real men play solitaire!

        • anotherengineer
        • 1 year ago

        I didn’t see it on the list either!

          • Topinio
          • 1 year ago

          Hah, Solitaire is now an iPhone game; not sure why no Fortnite, WoW, or Rocket League on that hardcore PCMR gamelist though 😉

          • K-L-Waster
          • 1 year ago

          C’mon, man, 52 flying cards is too heavy a task for ray tracing…

        • jihadjoe
        • 1 year ago

        It would be super awesome to see the card material illuminated properly.

        • NTMBK
        • 1 year ago

        Don’t even joke, the performance of the Windows 10 Solitaire app is bad enough as it is 🙁

      • quock
      • 1 year ago

      You need this card for hardcore ASCII gaming:

      [url<]http://www.bbspot.com/News/2003/02/ati_ascii.html[/url<]

        • chuckula
        • 1 year ago

        [quote<]The ATI Radeon 9500 ASC has a list price of $600.[/quote<] See! Back in 2003 the GPU makers cared about giving cards reasonable prices for the consumers! Also, I like how the date on that article isn't April 1. It's refreshing to see people who aren't constrained to just one day of the year.

      • Voldenuit
      • 1 year ago

      Go on forums and convince Toady One to simulate Dorf AI with tensor cores.

    • Krogoth
    • 1 year ago

    I give it at least two to three years before content begins to use these new hybrid ray-tracing acceleration modes for anything beyond tech demos and minor stuff.

    It is pixel/vertex shading all over again.

      • Mr Bill
      • 1 year ago

      [quote<]However, AMD is on the same path with its new ProRender release, which now supports real-time GPU acceleration of ray-tracing techniques mixed with traditional rasterization-based rendering and is now built on the Vulkan 1.1 API, which is fully supported by GCN-based AMD GPUs with the latest Radeon Software Adrenalin Edition driver....[url=https://www.tomshardware.com/news/amd-radeon-rays-raytracing-software,36702.html<]March 2018 Toms Hardware post[/url<][/quote<]

      • Leader952
      • 1 year ago

      If you are trying to be funny, you are not. If you were serious, then stupid is your first, middle, and last name.

      Shadow of the Tomb Raider (September 14th – Ray Traced Shadows)
      Battlefield V (October 19th – Ray Traced Reflections)
      Metro Exodus (February 22nd, 2019 – Ray Traced Global Illumination and Ambient Occlusion)
      Control (2019 – Ray Traced Reflections, Diffuse GI and Contact Shadows)
      Atomic Heart (2019 – Ray Traced Shadows, Ambient Occlusion and Reflections)
      Assetto Corsa Competizione (September 2018 – Ray Traced Reflections, Shadows and Ambient Occlusion)
      MechWarrior 5: Mercenaries (2019 – Ray Traced Shadows, Ambient Occlusion and Reflections)

        • Krogoth
        • 1 year ago

        Tech demos, minor applications/effects. Just like Pixel/Vertex shading back when it was introduced.

        When future games start to “require” it, that’s when you know the feature has reached critical mass (you need RTX or whatever the AMD RTG equivalent ends up being, or you are SOL).

        • Pwnstar
        • 1 year ago

        Found the nGreedia fanboy.

        • GrimDanfango
        • 1 year ago

        The fact that Mechwarrior 5: Mercs is on this list pretty much proves Krogoth’s point. Nothing is going to save that game from being a half-baked hackjob, given Piranha Games are still the ones with the Mechwarrior licence. The last I saw of it looked like something out of 2005… it’ll take a lot more than badly-hacked-in ray traced shadows/AO/reflections to make it look good, let alone *be* good.

          • tay
          • 1 year ago

          Piranha got a lot of my money for MWO:Mercs. I’m done with them.

      • anotherengineer
      • 1 year ago

      Hopefully.

      I mean, DP standard 1.3 was Sept 2014 and 1.4 was March 2016, and 99% of monitors I see are still 1.2a.

      [url<]https://vesa.org/featured-articles/vesa-publishes-displayport-standard-version-1-4/[/url<]

      • thor84no
      • 1 year ago

      It’s almost like that’s a natural pattern with new technology like this…

        • Anonymous Coward
        • 1 year ago

        Also, the entire comment list here follows some natural pattern.

      • psuedonymous
      • 1 year ago

      I think it may be quite a bit less than 3 years. Compare to the timeline for adoption of hardware texture & lighting (with the GeForce 256 back in ’99), pixel shaders (GeForce 3 in 2001), and unified shaders (HD2000 in ’07). All resulted in almost total adoption within one hardware generation.
