GeForce 417.35 drivers get ready for DLSS in Final Fantasy XV

A fair day, fine fans of Final Fantasy XV. If you happen to own one of Nvidia's RTX graphics cards, you're in for a treat. The company's GeForce 417.35 drivers add support for DLSS (deep-learning super-sampling) to Square Enix's magnum opus.

If you're wondering what all the hubbub is about, Nvidia DLSS lets RTX graphics cards render a game at a lower-than-native resolution, then uses a trained neural network to upscale each frame to what the network "thinks" the final image should look like. That training is based on ultra-high-quality reference frames fed to the DLSS model on a per-game basis in Nvidia's data centers. The technology could net gamers significant speed and smoothness boosts compared to native-resolution rendering, particularly for 4K gaming.
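For the curious, here's a minimal conceptual sketch of that render-low, upscale-with-a-network flow in PyTorch. It is emphatically not Nvidia's implementation (the real network, its training data, and its tensor-core execution path are all proprietary, and the class and layer sizes below are made up for illustration); the sub-pixel-convolution layout is simply a common way to build a learned upscaler:

```python
# Conceptual sketch only -- NOT Nvidia's DLSS. This just illustrates the
# general idea: render small, then let a trained network produce the
# full-resolution image. The network is an ESPCN-style sub-pixel
# convolution upscaler, a common learned-upscaling layout.
import torch
import torch.nn as nn

SCALE = 2  # e.g. a 960x540 internal render -> 1920x1080 output

class ToyUpscaler(nn.Module):
    def __init__(self, scale=SCALE):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=5, padding=2),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            # Emit scale^2 * 3 channels, then rearrange them into an
            # (H*scale) x (W*scale) RGB image (sub-pixel convolution).
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, low_res_frame):
        return self.net(low_res_frame)

# Stand-in for a frame the game rendered at reduced resolution.
low_res = torch.rand(1, 3, 540, 960)

model = ToyUpscaler()  # DLSS would load per-game weights trained by Nvidia
with torch.no_grad():
    high_res = model(low_res)
print(high_res.shape)  # torch.Size([1, 3, 1080, 1920])
```

The pixel shuffle at the end is what lets the network emit a full-resolution frame from a cheap low-resolution render, which is where the speedup over native-resolution rendering comes from.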

We took a look at the tech a while back and were pretty darn impressed with the potential performance gains from turning on DLSS in the FFXV canned benchmark. Nvidia thinks you will be, too, and it's published a video showing off portions of the full game in a 4K side-by-side comparison. The video pits output from a GTX 1080 Ti using temporal anti-aliasing (TAA) against that from an RTX 2080 Ti using DLSS. A quick Mark I Eyeball estimate suggests the newer card is consistently faster by well over 50%, sometimes by close to twice as much. The video only shows an FPS counter, but our previous testing showed that frame times saw a corresponding drop, too.

The game itself apparently has an incoming update on Steam to enable DLSS, an odd development after news in early November that remaining DLC for the title had been canceled and that its director had left Square Enix. There's no word on just when the update will arrive, but given that game-specific driver optimizations tend to coincide with the releases they optimize for, our best guess is "pretty soon."

Other than DLSS support in FFXV, the 417.35 driver offers a handful of fixes. Rocket League players should no longer see their game crash after a white screen, and texture corruption should no longer appear in Hitman 2. Those using Nvidia cards in notebooks might have been bitten by a now-fixed bug where frame rates in 3D games would drop below 30 FPS. Nvidia Ansel controls should work properly in Battlefield V, and owners of Titan Xp cards in SLI should no longer see that feature disabled by default.

GeForce Experience users have probably received a notification about the new driver by now. Everyone else can hit Nvidia's site to grab it, and the particularly nosy can check out the release notes.

Comments closed
    • wizardz
    • 8 months ago

    why didn’t they do a 2080 TI TAA @4K vs 2080 TI DLSS @4K comparison?

    i know there are no stupid questions but only inquisitive idiots.. but why???

      • willyolioleo
      • 8 months ago

      50% dishonesty to make DLSS look more amazing than it is

      50% because they’re not trying to get 2080 owners to switch to DLSS, they’re trying to get 1080 owners to buy new cards.

    • Krogoth
    • 8 months ago

    No chocobo? Sacrilege!

    • synthtel2
    • 8 months ago

    Hold the **** up here, they're messing with brightness and saturation in that video to artificially make DLSS look better. I took a screenshot of a frame close to the one shown in the preview there, cropped it to the largest box that didn't contain any 2D elements / labels, and:

    left side luminance: mean 0.606, median 0.643
    right side luminance: mean 0.637, median 0.678

    I don't think I have a way handy to get average saturation for the scene, but I sampled both images on a variety of surfaces and the DLSS one is always more saturated, often with a gap of over 0.1. This is seriously scummy.

    Edit: here's the cropped capture I used, FWIW: https://imgur.com/a/tiCtHzM
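(For anyone who wants to repeat the check, a rough Python sketch of this kind of measurement follows. The file name, the down-the-middle split of the side-by-side frame, and the Rec. 709 luma weights are assumptions on our part; synthtel2 doesn't spell out an exact method.)

```python
# Rough sketch of the luminance/saturation comparison described above.
# Assumes "comparison_crop.png" is a screenshot of the side-by-side video
# cropped to exclude UI overlays; the luma coefficients are Rec. 709.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("comparison_crop.png").convert("RGB")) / 255.0

# Split the side-by-side frame into TAA (left) and DLSS (right) halves.
mid = img.shape[1] // 2
left, right = img[:, :mid], img[:, mid:]

def luminance_stats(rgb):
    # Rec. 709 luma approximation over normalized RGB.
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    return luma.mean(), np.median(luma)

def mean_saturation(rgb):
    # HSV-style saturation: (max - min) / max per pixel.
    mx, mn = rgb.max(axis=2), rgb.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-8), 0.0)
    return sat.mean()

for name, half in (("left (TAA)", left), ("right (DLSS)", right)):
    mean, med = luminance_stats(half)
    print(f"{name}: luma mean {mean:.3f}, median {med:.3f}, "
          f"mean saturation {mean_saturation(half):.3f}")
```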

      • derFunkenstein
      • 8 months ago

      You can see that brightness/saturation difference in the thumbnail. Since I hadn’t watched the video I figured it might just be an unfortunate frame selection, but the whole thing is like that?

        • synthtel2
        • 8 months ago

        The thumbnail is particularly bad, but it’s visible the whole way through.

      • sweatshopking
      • 8 months ago

      If true, it would surprise me not at all.

        • synthtel2
        • 8 months ago

        Don’t take my word for it, it doesn’t take much work to check.

          • sweatshopking
          • 8 months ago

          You underestimate my laziness or lack of time

    • nanoflower
    • 8 months ago

    I can’t help but notice, looking at sharp edges, that DLSS looks worse than TAA. So it’s a trade-off of performance vs. appearance.

      • lycium
      • 8 months ago

      Depending on how it performs, if you’re on a high DPI 4K screen (which I personally don’t like for normal use, because as a gfx coder I need to see every pixel) then it might actually be a decent efficiency move.

      • Voldenuit
      • 8 months ago

      Anything textured will also look blurrier on DLSS than at native res, because they’ll be upsampling a bitmap.

    • lycium
    • 8 months ago

    I would be very interested to see lossless PNGs of the output (remember when we used to perv over each driver revision’s trilinear and anisotropic filtering changes? Riva TNT days!), as well as benchmarks and especially thermal measurements.

    I’m very curious to see what the power/heat influence of those bajillions of tensor units is, considering they sit idle pretty much 99% of the time. Maybe they will create so much heat / take so much power that it limits the normal FP/INT units…

      • derFunkenstein
      • 8 months ago

      “remember when we used to perv over each driver revision’s trilinear and anisotropic filtering changes? Riva TNT days!”

      Or even more recently, when both the Radeon 8000 series and the GeForce 5000 series all had their own IQ compromises.
