GeForce 416.94 drivers flip RTX on for Battlefield V

Hot on the heels of the 416.81 release, Nvidia just pushed a new graphics driver out, with good reason. The 416.94 release is WHQL-certified and ready for Battlefield V, Fallout 76, and Hitman 2. There's a related bit of news regarding Battlefield V that's actually of bigger import, though: the game now has Nvidia RTX effects enabled for those lucky enough to own one of the green team's GeForce RTX cards. This also pins Battlefield V as the first-ever widely-available game with those ray-tracing effects, so write that one down in the history books.

We'll be taking a good look at the performance effects and image quality benefits that ray-tracing brings to DICE's game in a future article, so keep your eyes peeled.

If you're wondering what the performance hit is like, the folks at Hardwareluxx did some preliminary testing, and the summary so far appears to be "oof."

Should you own an RTX card and wish to get your ray-tracing on in Battlefield V, you'll also need to install the recently re-released Windows 10 October 2018 update (version 1809). That update is seeing the light of day once again after a disastrous initial rollout that saw some users get their documents devoured by a grue. Microsoft says the issue is now fixed—pinky promise.

Anyway, in the new driver release, Hitman 2 got support for Nvidia Ansel screenshots as well as Highlights. Both Hitman 2 and Fallout 76 now have 3D Vision profiles, too. The fixed-issues list for the 416.94 drivers has but one item: a bug that caused laggy desktop activity when using GeForce GTX 780 cards. Oddly enough, that same bug still appears on the list of open issues for this release, but we figure that's a copy-paste snafu.

Those eager to get into the action can download the 416.94 driver through GeForce Experience or directly from Nvidia's site, where the release notes are also available. Keep an eye out for the TR analysis of RTX effects in Battlefield V.

Comments closed
    • ermo
    • 11 months ago

    My take? Give it a year or two of development; that ought to give studios and NVidia the time necessary to figure out where this new tech gives the most bang for the buck compared to existing techniques.

    • Voldenuit
    • 11 months ago

    Hardware Unboxed has comparison videos as well as benchmarks. Their numbers are looking even worse than Hardwareluxx’s.

    https://www.youtube.com/watch?v=SpZmH0_1gWQ

      • Spunjji
      • 11 months ago

      Not sure what’s more painful, the video or the comments. The “just buy it” fans are in hiding…

    • JosiahBradley
    • 11 months ago

    What’s the point of the “dedicated” hardware if it performs like software ray-tracing? (I know this is hyperbole, but still)

    • elites2012
    • 11 months ago

    Why would I buy a 2000-series Nvidia card just for RTX? They stopped making games that use PhysX, so I know they won’t make many games that use RTX. The last Nvidia card I bought was a GTX 750 Ti.

      • derFunkenstein
      • 11 months ago

      great story, Grandma.

        • DancinJack
        • 11 months ago

        lol

      • albundy
      • 11 months ago

      Well, at least your card survived 5 years. All my cards died this year. How do four cards from different generations, in different PCs at separate locations, all die around the same time? Perfect timing, or did ngreedia put a kill switch on all cards so they could sell more ngreedia cards?

      • ColeLT1
      • 11 months ago

      You wouldn’t. You’d buy an RTX card because they’re the fastest cards in existence, if you need or want that speed.

    • synthtel2
    • 11 months ago

    It purportedly comes with a latency problem, too. Turning off future frame rendering apparently hurts framerates even more than is reflected in the numbers I’ve seen, but is necessary to make latency reasonable.
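
    For a rough sense of why that trade-off exists, here's a back-of-envelope sketch of a render-ahead queue; the queue depth and frame rate are illustrative assumptions, not measurements from Battlefield V:

    ```python
    # Hand-wavy model, not DICE's actual pipeline: each pre-rendered frame
    # sitting in the queue adds roughly one frame-time of input latency.
    def added_latency_ms(fps: float, queued_frames: int) -> float:
        return queued_frames * 1000.0 / fps

    # "Future frame rendering" on (a render-ahead queue of ~3) at 60 fps:
    print(added_latency_ms(60, 3))  # ~50 ms of extra input lag
    # Turning the queue off helps latency but costs framerate, because the
    # CPU can no longer prepare frames while the GPU is busy ray tracing.
    ```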

    • BigTed
    • 11 months ago

    The cynic in me thinks nVidia probably didn’t want any RTX games available when the 20 series cards launched…

    • Chrispy_
    • 11 months ago

    Three little things:

    1. The performance hit is game-breaking; all three RTX cards are under 60 fps.
    2. I don't think it looks any better than screen-space reflections (https://youtu.be/Q8V2F3NtCGk), if at all.
    3. The 2080 and 2070 are inadequate; 48 and 36 fps respectively, at just 1080p. 1440p is a big nope; forget 4K.

    Honestly, it's hard to tell the difference between RTX and "fake" reflections/occlusion/specular highlights at 1080p. When you can run 4K Ultra at 75 fps with RTX off, or 1080p Ultra at 55 fps with RTX on, the only things that any gamer is really going to notice are the massive drop in resolution and framerate.

    Edit: Here's a screengrab (https://i.imgur.com/wQfpFHg.jpg) from the RTX gameplay video I linked, to illustrate the point a bit better. In order to get raytracing performance even close to acceptable, the settings have to be lowered so far as to make the quality laughable. I'm all for graphics API progress, but raytracing is clearly several years out and RTX is just an early-adopter/idiot tax at this stage.

      • drfish
      • 11 months ago

      FWIW, I would have chosen shadows instead of reflections in a one-or-the-other decision about ray tracing for exactly that reason. It seems like they would be more noticeable, and maybe even have less of a performance hit (but I’m out of my depth here).

      I even thought the name Shadow of the Tomb Raider was playing that up a bit. My gut says that BFV took the wrong course...

        • Srsly_Bro
        • 11 months ago

        The new Metro game has a shadow implementation. It looked great.

        • Chrispy_
        • 11 months ago

        The vast majority of shadows in games are static world shadows from fixed light sources, and they’re almost cost-free for the GPU and also high-quality because they’re pre-baked into the world by the developer. The developer did the raytracing on their supercomputer/renderfarm so that you don’t have to on your graphics card.

        We’ve had decent/convincing dynamic shadowmapping techniques for the better part of a decade, at considerably lower cost than RTX. The multiple-light source, dynamic shadow demos that went with RTX launch looked really good, but they were all contrived scenes that don’t really reflect a typical game level.
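
        To make the comparison concrete, here's a bare-bones sketch of the depth test that those dynamic shadow-mapping techniques boil down to; the names and depth values are made up for illustration:

        ```python
        # Render depth from the light's point of view once; then shading
        # each pixel is just a lookup and a compare. Purely illustrative.
        def in_shadow(shadow_map, light_xy, light_depth, bias=0.005):
            x, y = light_xy
            occluder_depth = shadow_map[y][x]  # nearest surface the light "sees"
            # A small bias avoids self-shadowing ("shadow acne") artifacts.
            return light_depth - bias > occluder_depth

        # Toy 2x2 depth map: the light sees something at depth 0.5 in one texel.
        shadow_map = [[1.0, 1.0], [0.5, 1.0]]
        print(in_shadow(shadow_map, (0, 1), 0.9))  # True: blocked at depth 0.5
        print(in_shadow(shadow_map, (1, 1), 0.9))  # False: nothing closer
        ```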

          • drfish
          • 11 months ago

          I understand, I’m just saying it could have been souped up, and maybe for less performance penalty than what we’re seeing with the direction they’ve taken. There are plenty of games with vehicle headlights, crafted torches, gun flashes, dynamic day/night cycles, etc., that I think could look a lot better – especially when all those light sources are interacting. That’s not even considering stuff like shadows being cast by things like smoke or other particles…

      • caconym
      • 11 months ago

      Your screenshot is actually a great example of where raytracing works better than screen-space reflections. That’s meant to be a semi-gloss surface, not a mirror finish, which is why the reflection fuzzes out like that. That’s hard to mimic in traditional rasterization, but here it looks perfectly life-like. In normal off-line raytracing, a soft reflection like that requires hundreds of samples per pixel in order to not look noisy, but NVidia’s denoiser is getting a great result with just one sample.

      I agree that the performance hit, in a competitive game, just isn’t worth it, but that’s a lovely reflection.
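
      The sample-count point is easy to demonstrate. Here's a toy Monte Carlo experiment; the uniform-random "scene" is purely illustrative, but the 1/sqrt(N) noise behavior is the real reason offline renderers throw hundreds of samples at soft reflections:

      ```python
      import random, statistics

      # Each glossy pixel averages incoming light over a blurry reflection
      # lobe; with few samples, that average is noisy from pixel to pixel.
      def glossy_pixel(n_samples):
          return statistics.mean(random.random() for _ in range(n_samples))

      for n in (1, 16, 256):
          estimates = [glossy_pixel(n) for _ in range(2000)]
          print(f"{n:>3} spp -> stddev {statistics.stdev(estimates):.3f}")
      # 1 spp: ~0.29 (hopelessly noisy); 256 spp: ~0.018 (near-clean).
      # A denoiser is what lets real-time RTX get away with ~1 spp.
      ```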

    • Srsly_Bro
    • 11 months ago

    Performance figures of BFV with RTX have been released. There are multiple settings and the performance impact is as large as speculated. Other sites have the results up. Stay tuned in 3 weeks to a month for them to appear here.

      • Shobai
      • 11 months ago

      And it’s not like it’s the second coming of Fortnite, amirite? “3 weeks to a month”? This game will be irrelevant!

    • DeadOfKnight
    • 11 months ago

    Has DLSS not yet been implemented? I was under the impression that the performance hit was expected, and that DLSS is Nvidia’s solution to help mitigate the problem by basically lowering the rendering resolution without noticeably lowering the visual quality of the final image. If this is the result, then it’s pretty sad that it’s something everyone is going to turn off, even with a 2080 Ti. If it isn’t implemented, then it looks like we’re still going to have to wait…

      • techguy
      • 11 months ago

      This is literally the first commercial implementation of the technology. Programming techniques will improve as familiarity increases over time. Thus, it is logical to expect that performance will also improve over time. I probably won’t use RTX in the first batch of games that enable it unless the developers release performance-enhancing patches later, but I do expect to be able to use it within the lifecycle of my 2080 Ti.

        • DeadOfKnight
        • 11 months ago

        Yeah, everyone on Reddit seems pretty outraged that they spent $1,200 on a card whose highlighted feature doesn’t seem very valuable even when it becomes available. However, what they’re really paying for is the biggest piece of silicon ever put into a consumer graphics card. Of course it scores low on the value scale; such has always been the case for the bleeding edge.

        Personally, I bought my 2080 Ti so I can play games at 3440×1440 ultra settings at 100+ fps. I look at raytracing as a bonus. I know it’s not a great value, but there’s still nothing else available that can do what this card can even for plain old rasterization. The value is also stretched by the fact that I gave my old GPU to my wife, and it’s not just sitting on the shelf.

        $1,200 isn’t such a big pill to swallow when I’ve already dropped that same amount on a 3440×1440 100 Hz IPS display with G-Sync. Feels to me like I’m reducing another bottleneck, which at the end of the day is what we’re all trying to do as computer hardware enthusiasts. Raytracing and DLSS, whenever games come out that support them, are just icing on the cake.

          • DancinJack
          • 11 months ago

          I stopped reading after the first sentence.

          People on reddit are outraged because they didn’t know what they were doing? Shocking.

          • Spunjji
          • 11 months ago

          $1,200 is over the odds for that display too, though. It only looks like a good deal by comparison because Nvidia have boxed you in to G-Sync. Freesync displays with the same capabilities (3440×1440, 100+Hz) occupy the $550-900 range. It’s a shame that there’s no Freesync card capable of holding those high frame rates in the latest games.

          That “the best has always been expensive” argument holds less water. Empirically speaking, this is a new level of expensive.

    • djayjp
    • 11 months ago

    Seems DLSS is required. They really need to add support for it.

      • DeadOfKnight
      • 11 months ago

      Yeah, it would appear from the graphs that the performance impact does not scale linearly with resolution. Lowering the rendering resolution can make a big difference in how usable this really is. Those 1080p numbers don’t look too bad; if you can get them at effectively 1440p image quality, then that would be something I might use. But with these numbers at native resolution, of course I’m just gonna turn it off. Those 4K numbers are basically unplayable.
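
      For what it's worth, the raw pixel math behind that idea is simple; note the assumption that ray-tracing cost tracks pixel count is only approximate, as the non-linear scaling in the benchmarks suggests:

      ```python
      # Standard 16:9 resolutions; render fewer pixels, reconstruct to target.
      pixels = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
      print(pixels["1440p"] / pixels["1080p"])  # ~1.78x the shaded pixels
      print(pixels["4K"] / pixels["1080p"])     # exactly 4x
      # Rendering at 1080p and upscaling to 1440p skips ~44% of the
      # per-pixel work, if cost scaled linearly with pixel count.
      ```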

      • Voldenuit
      • 11 months ago

      Aren’t the Tensor units (the same units used for DLSS) also responsible for denoising shadows and reflections during ray-tracing? I don’t know if they are a bottleneck or operating close to full capacity during ray-tracing (it might depend on the game and the scene in question), but I wouldn’t expect them to be a magic bullet to cope with the performance impact of ray tracing.

        • Jeff Kampman
        • 11 months ago

        AI denoising is one approach to this problem, but it’s not mandatory. A developer could use a different denoising strategy and leave the tensor cores open for other uses.

          • Voldenuit
          • 11 months ago

          Hi Jeff, I’ve seen this mentioned before, but would like to know more. What are the alternative methods? CPU denoising? Cuda Core denoising*? Simply increasing the number of rays cast (which would drastically increase the load on the RT cores, because you need a *lot* of samples to get a clean image)? Or maybe one could just use RTX on shadow edges and rely on simple stencil shadows to shade in the center mass of shadows? That last one might need a lot of computational work to figure out the critical shadow areas in a scene, though, and would have to work quickly and efficiently enough to be worth the tradeoff.

          Of note, there are about 3 games that Nvidia has listed as supporting both RT and DLSS: JX3, Mechwarrior 5, and SOTR. I’d be interested to see how they fare once they have full RTX support enabled.

          * EDIT: It’s worth noting that using pixel shaders to denoise shadows is something that has been used already, so is technically plausible. There’s a good GDC presentation by the makers of INSIDE (available on youtube) where they talk about various methods they used to denoise shadows and lighting in the game. Worth a watch.

          EDIT2: Found the GDC video: https://www.gdcvault.com/play/1023783/Low-Complexity-High-Fidelity-INSIDE
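
          For readers curious what a non-AI, shader-friendly denoiser looks like, here's a minimal sketch of a cross-bilateral filter of the general kind such talks describe; the function and parameter names are invented for illustration, and production filters (e.g. SVGF) also weight by normals and reuse samples across frames:

          ```python
          import numpy as np

          # Average noisy 1-spp shadow values over a small window, but refuse
          # to blur across depth edges so shadows stay crisp at silhouettes.
          # Usage: denoised = cross_bilateral(noisy_shadow, depth_buffer)
          # where both inputs are 2-D float arrays of the same shape.
          def cross_bilateral(noisy, depth, radius=2, depth_sigma=0.05):
              h, w = noisy.shape
              out = np.empty_like(noisy)
              for y in range(h):
                  for x in range(w):
                      y0, y1 = max(y - radius, 0), min(y + radius + 1, h)
                      x0, x1 = max(x - radius, 0), min(x + radius + 1, w)
                      # Big depth gap => likely a different surface; don't mix.
                      dd = depth[y0:y1, x0:x1] - depth[y, x]
                      wgt = np.exp(-(dd * dd) / (2.0 * depth_sigma ** 2))
                      out[y, x] = np.sum(wgt * noisy[y0:y1, x0:x1]) / np.sum(wgt)
              return out
          ```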

    • Neutronbeam
    • 11 months ago

    “This also pins Battlefield V as the first-ever widely-available game with those ray-tracing effects, so write that one down in the history books.”

    I’m sensing in the Force that this will be used as a question for a gear giveaway–either for this holiday or for the barbecue next year. Difficult to say; always in motion the future is.

    • liquidsquid
    • 11 months ago

    Ah, glorifying war, death, and destruction in ever higher levels of detail!

      • morphine
      • 11 months ago

      Just like a game of D&D? 🙂

        • Spunjji
        • 11 months ago

        Yes, not enough young folks these days recall the Great Beholder Wars of 1272. How disrespectful of those Wizards folks to turn it into a game.

    • homerdog
    • 11 months ago

    I think the RTX stuff only works in DX12 mode, which is already a fair bit slower and stutterier than DX11.

      • DPete27
      • 11 months ago

      You’re creativerier with language than most. +1 to you.
