Nvidia promises major RTX boost with Battlefield V update

Our early performance testing of DirectX Raytracing effects in Battlefield V suggested that the tradeoff between frame rates and image quality with Nvidia's GeForce RTX cards was quite dear. We didn't write off the tech at the time, though, because a subsequent Eurogamer interview with DICE suggested there were plenty of bugs to squash and plenty of low-hanging fruit to pick for RTX performance. We're glad we didn't, because today, Nvidia previewed just how much of an RTX speed-up gamers can expect from the first major update to Battlefield V.

The company promises "up to 50%" boosts to the game's performance from the Tides of War Chapter 1: Overture update that's slated to go out to gamers tomorrow, December 4. Nvidia claims GeForce RTX 2080 Ti owners can expect to run the game at 2560×1440 with the DXR Ultra preset at frame rates above 60 FPS, RTX 2080 owners might be able to enjoy over 60 FPS at 2560×1440 with the medium DXR preset, and RTX 2070 owners can use the medium DXR preset at 1920×1080.

Nvidia didn't note whether it observed those performance improvements in multiplayer maps or in the game's single-player campaign, but in any case, they would represent major boosts over the performance we observed in our first round of testing. We should note that we tested RTX effects on a map that was apparently particularly troublesome for the first release of DICE's DXR implementation. Even so, the RTX 2070 barely held on to playability at 1920×1080 with the medium DXR preset, and the RTX 2080 couldn't clear that line at 2560×1440. The RTX 2080 Ti could only deliver 37 FPS, on average, at 2560×1440 with the DXR Ultra preset.

Presuming Nvidia's claims hold, gamers with GeForce RTX cards should be a lot happier after this update. For folks who want to see Battlefield V's ray-traced reflections for themselves, Nvidia points out that it's still throwing a copy of the game in with eligible RTX 2080 Ti, RTX 2080, and RTX 2070 cards, as well as systems with Turing inside. We'll see just how much of a performance boost this update delivers for ourselves as soon as it's available.

Comments closed
    • enixenigma
    • 9 months ago

    Battlefield V game update delayed due to last-minute issue:
    https://forums.battlefield.com/en-us/discussion/167622/battlefield-v-tides-of-war-chapter-1-overture-release-delay

    • Freon
    • 9 months ago

    I guess there’s a chance this is tooling or up-front cost in some respect, maybe BVH/tree optimization, which can happen at build time and is a tricky problem (a rough sketch of the refit-vs-rebuild idea is below). But I worry the outlook is still not that good. 1.5x still falls fairly short, and I honestly don’t think the quality improvement is worth that; say, 117 vs. 158 FPS for the 2080 Ti?

    HBAO+ isn’t all that bad, nor are a few imperfect cube maps.
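
    (To illustrate the "build time" point above: rebuilding a BVH re-sorts every primitive into a new tree each frame, while a refit keeps the topology that was computed up front and only recomputes node bounds from the primitives' new positions. Below is a minimal, self-contained C++ sketch of that refit idea; the Aabb/Node/refit names are made up for illustration and are not DICE's or Nvidia's code.)

```cpp
// Toy sketch: refitting an existing BVH after geometry moves.
// The expensive part (choosing the tree topology) was done at build time;
// per-frame work is just a bottom-up bounds update.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Aabb {
    float lo[3] = { 1e30f,  1e30f,  1e30f};
    float hi[3] = {-1e30f, -1e30f, -1e30f};
    void grow(const Aabb& o) {
        for (int i = 0; i < 3; ++i) {
            lo[i] = std::min(lo[i], o.lo[i]);
            hi[i] = std::max(hi[i], o.hi[i]);
        }
    }
};

struct Node {
    Aabb bounds;
    int left = -1, right = -1;  // child node indices; -1 means none
    int prim = -1;              // leaf only: index into primBounds
};

// Recompute bounds of an existing tree without touching its topology.
Aabb refit(std::vector<Node>& nodes, const std::vector<Aabb>& primBounds, int idx) {
    Node& n = nodes[idx];
    if (n.prim >= 0) {
        n.bounds = primBounds[n.prim];   // leaf: take the primitive's new box
    } else {
        n.bounds = refit(nodes, primBounds, n.left);
        n.bounds.grow(refit(nodes, primBounds, n.right));
    }
    return n.bounds;
}

int main() {
    // Two leaves under one root; pretend the primitives just moved here.
    std::vector<Aabb> primBounds(2);
    primBounds[0] = {{0, 0, 0}, {1, 1, 1}};
    primBounds[1] = {{2, 2, 2}, {3, 3, 3}};

    std::vector<Node> nodes(3);
    nodes[0].left = 1;
    nodes[0].right = 2;
    nodes[1].prim = 0;
    nodes[2].prim = 1;

    const Aabb root = refit(nodes, primBounds, 0);
    std::printf("root bounds: (%g, %g, %g) - (%g, %g, %g)\n",
                root.lo[0], root.lo[1], root.lo[2],
                root.hi[0], root.hi[1], root.hi[2]);
    return 0;
}
```

    (DXR exposes a similar choice, updating an existing acceleration structure rather than rebuilding it, which is the kind of knob that can be tuned without changing what ends up on screen.)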

    • DoomGuy64
    • 9 months ago

    Back when this tech was in its early stages, and we were learning about some of RTX’s efficiency features, I asked: why aren’t they optimizing with these features?

    I got a bunch of fanboy excuses for it at the time, but in reality I know why Nvidia didn’t: it’s the corporate culture of slow GameWorks performance, not that it was impossible. I was totally right that Nvidia could improve the performance.

    For the longest time, Nvidia’s GameWorks has been designed to be as slow as possible to sabotage performance on AMD products and justify Ti purchases. HairWorks, ruining AAA games like Batman, etc. Well, guess what? RTX doesn’t run on AMD. You’re sabotaging yourself, when it could be running at acceptable framerates. IMO, the low-level employees are finally getting the memo that GameWorks has to perform acceptably on $1300+ graphics cards since AMD isn’t in the picture for the effect. The question now is whether this will ever be playable on a 2070. Probably not, or at least not until AMD comes out with its own open implementation, which won’t be happening anytime soon.

    Raytracing is definitely a cool new technology. I just don’t trust Nvidia with running it, because we know the history of how they handle gameworks, and why they do it. It’s never going to perform acceptably on their “mid-range” products, even when those products cost the same as Ti’s several generations ago.

      • chuckula
      • 9 months ago

      So given how Ngreedia obviously gimps the performance of its old (and according to you its new) parts on purpose, why is it that the Fine Wine (TM) of Vega hasn’t implemented superior ray tracing already? I mean: Primitive Shaders Async!! [Drops Mic]

        • DoomGuy64
        • 9 months ago

        Because Vega doesn’t have RTX hardware, duh. You can’t implement a real time effect on hardware that doesn’t support it. The only way Vega could run the effect would be an unplayable software implementation. AMD needs to release new hardware to support real time raytracing, outside some sort of magical software hack that still wouldn’t be as fast as a hardware implementation.

        I like how you ignore my point about RTX optimizations not being used, though. Nvidia was hyping up all this new stuff like adaptive shading, all while obviously not using it for raytracing. This new video sure makes it seem like something along those lines is being used now: “variable rate ray tracing.”

        Back when I pointed this out, I got all sorts of retarded excuses. Well, I was right. They weren’t using it, and it was possible to use it. Raytracing very much seems to have the potential to be playable on CURRENT RTX cards when fully optimized. It’s more a question of support, than capability.

        The only controversial point I have made is WHY it wasn’t supported, and that is Nvidia’s corporate culture of not optimizing any of their gameworks effects. Well, looks like something has changed, and now RTX is playable.

        • Srsly_Bro
        • 9 months ago

        Read the monitor comments. The V56 is going to be faster than my 1080Ti FTW3 in a year because of dx12 that’s been out for how long?? Even Hitman 2 went DX11 only. That’s what I was told, anyway.

        We can’t rely on these overly emotional AMD shills to make decisions. The first mistake was buying V56/64 and they will never admit it.

    I’m worried for AMD GPU owners. Buyer’s remorse will be the end of them, and they perform mental gymnastics to rationalize it.

          • DoomGuy64
          • 9 months ago

          lol. How about framing the scenario in the correct context? Because the argument was about Vega running 1440p for a 1440p monitor, while Nvidiots kept trying to straw man the whole thing into 4k, and HURR DURR UPGRADE PATH, without any evidence of needing a faster card for a freesync 1440p monitor.

          The reason why dx12 was even brought up was the retarded upgrade path argument. Which dx12 is a VALID answer to, “HURR DURR UPGRADE PATH”. Not only that, but we all damn well know that future cards will be coming out. Just not right now, and nobody needs one on 1440p for the foreseeable future. Not to mention RTX will be bringing dx12 to the masses, since it properly supports it. Unlike your 1080Ti. Enjoy your $1300+ 2080Ti “upgrade path” though.

        • synthtel2
        • 9 months ago

        Maybe because even in the best-case scenario it’s still an astoundingly inefficient way to improve image quality?

        Nvidia’s ability to implement things like this is great, they just tend to give the end user experience the bare minimum priority when deciding what to implement and how to configure it. Of course real-time ray tracing is a hard problem, and of course Nvidia is doing as fine a job of solving it as anyone could hope to do right now. That doesn’t mean that choosing to take on that problem in the first place made any sense from any perspective other than marketing.

      • Srsly_Bro
      • 9 months ago

      Citations needed

        • DoomGuy64
        • 9 months ago

        Which part? The part where the 2070 costs the same as older high end cards? The part where gameworks was tessellating cement blocks that TR wrote an article about? The part where Hairworks ruined TW3 performance, and the developers admitted it? The part where PhysX was proven to be crippled on the CPU path with single core x86 code? The part where I questioned Nvidia’s RTX performance when they supported adaptive shading, and all you fanboys denied the possibility?

        Citation needed yourself.

        This is all you can do. Single sentence trolling, void of any evidence, and completely obvious to anyone not drinking the kool-aid that you’re all delusional.

        Here’s another kick while you’re down: RTX’s power consumption. It’s on a newer process, yet isn’t much different from Vega while costing WAY MORE. RTX just put Vega back into the completely viable mid-range graphics segment. If you don’t care about raytracing, Vega is a better choice than a 2070, especially since Nvidia’s now supporting next-gen APIs that help both RTX and Vega. I bet you’re all flipping out that this is happening, and it’s hilarious.

    • Srsly_Bro
    • 9 months ago

    The only thing Nvidia can promise is high prices. I’ll wait and see on this.

    • DavidC1
    • 9 months ago

    “Presuming Nvidia’s claims hold, gamers with GeForce RTX cards should be a lot happier after this update.”

    But, but,

    This is ONE game. What about the other titles that might not get the same love because the developer is small or support is mediocre? An AAA studio needed extra time to optimize, and such big gains indicate they released it almost prematurely. There’s almost no hope for anyone else.

    If Nvidia priced these cards right, we wouldn’t complain so much. And right means not how revenue-chasing accountants see it, but priced the same per tier as the previous generation.

      • Voldenuit
      • 9 months ago

      According to DICE, Nvidia provided them tools for profiling RTX operations and pipeline stalls that Microsoft’s DXR toolkit didn’t, so assuming Nvidia makes the same tools available to other devs, that would already be a step up over what DICE had to work with for the BF V launch.

      Then again, I’m not expecting indie devs to write big, ray-traced games when less than 5% of PC gamers would have the hardware to run them.

        • NovusBogus
        • 9 months ago

        A lot of AMD/NV driver development effort is spent cleaning up other companies’ messes. Simply making the tools available is no guarantee that developers will use them properly, and if the performance jump is that extreme it does not bode well for large scale adoption of RTX.

    • Chrispy_
    • 9 months ago

    The real question is this:

    Will Nvidia get RTX performance up to a level where owners of *expensive, high-refresh, vendor-locked G-Sync monitors* don't feel betrayed? For at least three or four years, Nvidia have been milking resolution increases and high-refresh capability with their most expensive gaming products, and now they're offering framerates and resolutions with equally expensive RTX cards that effectively nullify those previous investments. I don't need another reason to hate Nvidia, but I think that would qualify for a lot of people who have been stung by this change of direction.

      • chuckula
      • 9 months ago

      Considering G-Sync exists solely for situations in which the frame rate drops low enough to require frame pacing, I would expect RTX owners to feel betrayed *without* ray tracing, since the non-RTX performance is so high as to render G-Sync superfluous.

        • cygnus1
        • 9 months ago

        I disagree. Having a high-refresh G-Sync display, I can tell you that the frame rates they’re talking about are just too low to bother with the tech. I won’t be upgrading my GPU until their next gen; I can wait. Paying roughly $1000 to go back to a slideshow (under 60 FPS) is not any kind of upgrade. They need about a 50% speedup, not the 10% speedup they’re announcing, before this tech is at all attractive.

      • cygnus1
      • 9 months ago

      No. They won’t. I don’t care about the image quality of the slide shows they seem to think are OK. Having been playing for quite a while now on a 1440p 144Hz G-Sync display with a GTX 1070 usually sustaining well over 100 FPS, I won’t consider this technology as part of a purchasing decision until I see a card with it sustaining 120+ FPS with the settings maxed out. Until then it’s a laughable beta.

        • Chrispy_
        • 9 months ago

        Heh, laughable beta.

        The funny thing about beta access is that you usually get it for free 😉

          • cygnus1
          • 9 months ago

          Could be worse. The Fallout 76 Pre-Order Beta was basically non-existent. Even though it was advertised as included, it was still on an invite basis (which I never got), and run in such short windows that I never would’ve been able to play even if I’d got an invite.

    • sweatshopking
    • 9 months ago

    Dropping $1K CAD to play games on medium at 1080p

      • End User
      • 9 months ago

      DO IT! THEN WRITE A HORID REVIEW!

        • sweatshopking
        • 9 months ago

        Horrid*

          • DancinJack
          • 9 months ago

          LITTLE DO YOU KNOW…

          “HORID” IS ACTUALLY NVIDIA’S VERSION OF RTX’D “HORRID.” APPLYING RTX WHERE IT’S MOST IMPORTANT, BRO.

    • Leader952
    • 9 months ago

    “tradeoff between frame rates and image quality with Nvidia's GeForce RTX cards was quite dear.”

    dear should be clear

      • enixenigma
      • 9 months ago

      dear
      /dir/

      adjective
      adjective: dear; comparative adjective: dearer; superlative adjective: dearest

      1.
      regarded with deep affection; cherished by someone.
      “a dear friend”

      synonyms: beloved, loved, adored, cherished, precious, esteemed, respected, worshiped

      close, intimate, bosom, best
      “a dear friend”

      •precious, treasured, valued, prized, cherished, special
      “her pictures were too dear to part with”

      antonyms: hated

      •used in speech as a way of addressing a person in a polite way.
      “Martin, my dear fellow”

      •used as part of the polite introduction to a letter, especially in a formula denoting the degree of formality involved.
      “Dear Sir or Madam”

      •endearing; sweet.
      “a dear little puppy”

      synonyms: endearing, adorable, lovable, appealing, engaging, charming, captivating, winsome, lovely, nice, pleasant, delightful, sweet, darling
      “such a dear man”

      antonyms: disagreeable

      2.
      expensive.
      synonyms: expensive, costly, high-priced, overpriced, exorbitant, extortionate; informal: pricey, spendy, steep, stiff
      “the meals are rather dear”

    • Voldenuit
    • 9 months ago

    A 50% boost on top of 37 FPS at 1440p is still just 55.5 FPS, which is lower than we’d like, but it’s definitely a good step.
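
    (For reference, the arithmetic behind that figure, taking the best-case 50% claim at face value:)

$$37\ \text{FPS} \times 1.5 = 55.5\ \text{FPS}$$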

    • techguy
    • 9 months ago

    But you guys, RTX is a horrible failure, completely unusable even on $1200 graphics cards!

      • Spunjji
      • 9 months ago

      I get that you’re making a joke to point out the silliness of people drawing premature conclusions about performance, but given that we currently only have Nvidia’s word to go on re: the scale of these improvements, isn’t your mocking comment ironically premature?

      Full disclosure: As someone sold on high-res gaming and living in the UK, for me the minimum spend to play with this tech is ~£1075. For that price, if it’s doing anything less than making my hair stand on end with sheer glee then it is indeed a horrible failure. All signs still point to “meh”.

        • techguy
        • 9 months ago

        No, because unlike AMD, Nvidia tends to live up to their promises 😉

      • Chrispy_
      • 9 months ago

      If you watch the video it sure does seem like Nvidia are helping DICE by fundamentally reworking the engine and level design to improve where the limited raytracing GPU budget is spent. They’re providing a lot of support and pumping a lot of manpower into fixing the bad PR that BF5 RTX performance has caused.

      On one hand, it’s great to see such a dedicated effort to improve new technology. +1 to all parties involved.

      On the other hand, will Nvidia do this for every game developer? And even if Nvidia do, will every game developer go through all the obviously RTX-only extra steps to make RTX-only features run smoothly for a tiny sliver of the target market?

        • techguy
        • 9 months ago

        This is the reason why Nvidia has done better than AMD in PC graphics: dev relations. Nvidia has the size, the budget, and the tools to work with game devs in this capacity, and has done so many times over the years. Usually it’s a talking point for AMD zealots who think it’s a sign that Nvidia is evil, instead of, you know, a selling point (like it actually is).

          • Chrispy_
          • 9 months ago

          I wouldn’t call myself an AMD Zealot. I have more Nvidia hardware for personal use than AMD hardware, and I can see plenty of scenarios where the Nvidia card is just a better experience than the AMD card, despite a hardware disadvantage.

          Nvidia just irk me with the sheer quantity of BS they pull when they simply don’t need to, and it’s happening (predictably) all over again with RTX. I think I’ll just leave it at that.

          • synthtel2
          • 9 months ago

          It’s great when it works, but nobody has the budget to throw that kind of help at every game anyone releases. On games that aren’t big enough to get that attention, it’s at best a wash, and IME AMD does noticeably better on average. AMD, knowing they’re behind on this, seems to do a better job of setting things up so that game devs can take care of themselves.

          I don’t like how Nvidia’s basic strategy relies on them being able to provide this help, but there’s nothing inherently evil about that. The evil bit is that they often abuse these relationships to have games ship with their tech tweaked more to make AMD look bad than for any perf/IQ ratio benefits (Crysis 2 tessellation and Witcher 3 HairWorks being the most obvious examples).

          • Action.de.Parsnip
          • 9 months ago

          No, it’s not that. There was a ‘golden era’ for AMD from the HD 4000 to the HD 6000 series: three extremely solid, extremely compelling families of products. What Nvidia had at the time, at least until the Fermi revisions, was, to be frank, entirely underwhelming.
          In the end Nvidia outsold AMD by the truckload. That’s advertising and strength of brand at work. With the products they had, they had no business being above 40% market share. We all know it didn’t shake out that way.

            • Srsly_Bro
            • 9 months ago

            I had a 4850 on launch day and some months later picked up two Gtx 260 Core 216 cards to replace the AMD card.

            • DoomGuy64
            • 9 months ago

            Those cards were both the peak and the fall of AMD. DX10.1 showed promise early on, but because developers didn’t support it, it never went anywhere. Nvidia played a large hand in this; Assassin’s Creed, for example, pulled support, and it died after that. You also had PhysX ruining game performance en masse, and it was unfixable in any game where it couldn’t be disabled or that overly relied on it even for basic physics. That said, it’s not a problem today due to both updates and developers no longer using it in favor of better alternatives. Games like Red Dead 2 and Battlefield show how far games have come, making PhysX unnecessary for the exclusive features it once had, like cloth and destruction simulation.

            ATI was then bought out by AMD, people were laid off, and the drivers immediately went to hell. There were multiple driver teams that couldn’t synchronize bug fixes and performance updates, with many drivers having bugs repeatedly showing up even after being “fixed” previously. Rage was the peak of these problems, which eventually caused AMD to ditch the multiple team method and consolidate driver development.

            Nvidia released Fermi, and despite what people say about Fermi, it was a really good and future proof architecture compared to VLIW. Tessellation in particular, and Nvidia’s driver updates brought a lot of performance and new features.

            AMD then had completely horrible DX11 hardware and drivers up until GCN 1.1 (Hawaii), which peaked with Windows 10; Vega only offered hardware improvements, with no major efficiency or software updates other than the control panel. Does AMD even fully support multithreading in DX11 today? I don’t know. I know there have been improvements, but nothing on par with Nvidia, and high-core-count CPUs like Ryzen aren’t being taken advantage of, outside of what game developers do themselves.

            That said, it’s not the problem it once was; games are no longer ridiculously unplayable, although the drivers still give Nvidia the advantage in DX11 benchmarks. DX12 gives peak performance, but I question AMD’s commitment to that, considering Doom’s Vulkan mode doesn’t perform much better on Vega compared to Hawaii, and Hawaii was more stable. Vega likes to do weird things like downclocking when viewing the map, maybe for efficiency since I have all those features enabled, but it doesn’t give the same completely smooth experience that Hawaii did.

            I think RTX is a step in the right direction feature-wise, but the pricing is just crazy, and Nvidia needs to ensure everyone uses the efficiency features fully when using raytracing. There’s no point in trying to brute-force it when the hardware isn’t powerful enough. Battlefield V may serve as a good showcase, but it’s not getting good reviews as a game: it’s not what people wanted in a Battlefield game, bugs abound, content is limited, and the story is historical revisionism that takes real stories and distorts them into a false narrative. Metro will probably be a better game for the technology, but AFAIK it hasn’t been optimized the way BF5 has. RTX is pretty much a useless feature until more games support it efficiently and prices drop on these cards. If nobody owns the cards, unsponsored developers aren’t going to support it, making adoption even slower. Fanboys like to complain about AMD, but in reality AMD has over a year to get hardware out for this tech, while the people gaming at 4K can afford RTX prices, making Vega still a solid card for average gamers right now.

            RTX isn’t a “horrible failure”. It’s just early-adopter tech being sold at early-adopter prices, and the elitist 4K gamers can afford it. Don’t like it? Then stick with the 1080 Ti or don’t game at 4K, and buy a Vega. Plenty of viable options.

        • DPete27
        • 9 months ago

        I said this a while ago, but these pet projects live and die based on how much time and money AMD/Nvidia are willing to put into them. As we’ve seen here, you can’t trust game devs to handle these cutting-edge features efficiently (or at all), so the GPU manufacturers have to shoulder the brunt of the work.

        The big problem, and one that I don’t blame AMD/Nvidia for, is that the benefit of technologies like this is maximized by selling your own product over your competitors’. Nvidia is using RT as a marketing tool to sell more hardware. They have little motivation or reason to push it into ubiquitous hardware/software adoption. Once that happens, you’ve just paid an enormous amount of money for your competitor to easily adopt it as a “me too” feature, and you lose your competitive advantage.

          • Voldenuit
          • 9 months ago

          In related news, Nvidia just open-sourced PhysX (https://www.phoronix.com/scan.php?page=news_item&px=NVIDIA-Open-Source-PhysX), although that's definitely a case of "too little, too late".

            • DoomGuy64
            • 9 months ago

            Yes, but now we can potentially go back and play old PhysX titles with acceptable performance.

            • Krogoth
            • 9 months ago

            They gave up on the PhysX vendor lock-in and are betting on RTX.

            Hopefully, G-Sync 1.0 follows the same route and Nvidia finally implements VESA VRR spec on their desktop GPUs.

        • watzupken
        • 9 months ago

        This is my thought too. BF5 may basically be a one-off case where Nvidia is giving full support since it’s the maiden RTX title. Beyond that, I wonder how many game developers will be willing to invest time in this feature, since it will just delay the game launch. It’s not scalable. One can’t advertise an RTX feature in a game and then deliver poor performance. It’s definitely a good feature, but it may be too early for its time since the hardware is barely keeping up. A top-end card playing the medium RTX setting at 1080p doesn’t sound right.

        • djayjp
        • 9 months ago

        Great point about the effort involved. Initially, ray tracing with DXR was hailed as a time saver for development (since you don’t have to fake a bunch of lighting with sophisticated techniques anymore), but this seems like anything but simpler.

        The up-to-doubled performance looks very promising, though (about where you’d think it should have been to begin with).

        • Durante
        • 9 months ago

        Nvidia won’t need to do this for every developer.

        Just like with every dramatic change to how games are rendered, it takes some time for the technology and best practices for its use to be discovered and propagate through the community. Once that has happened (and it has made its way into most major engines) not everyone will have to re-invent the technical heavy-lifting.

      • Krogoth
      • 9 months ago

      Pixel and vertex shading had the same story back in 2002-2003, except that the price of entry wasn’t quite as steep.

        • Spunjji
        • 9 months ago

        About 50% less steep D:

    • chuckula
    • 9 months ago

    As usual you have failed Nvidia: You should have waited at least a year after launch to do this so we can call it Fine Wine!!!!

      • drfish
      • 9 months ago

      What do we call this then, Sharp Cheddar?

        • Concupiscence
        • 9 months ago

        Cave Aged Mimolette™!

        • chuckula
        • 9 months ago

        Good sharp cheddar has to be aged at least 5 years!

          • drfish
          • 9 months ago

          How about Tableside Mozzarella?

            • chuckula
            • 9 months ago

            Well, real mozzarella is made from *buffalo* milk, so sure!

            • drfish
            • 9 months ago

            Haha! I knew that, and I’ve had that, which is what prompted the comment in the first place, but I totally missed the connection. I don’t know what card that means I need to turn in, but I have failed.

        • K-L-Waster
        • 9 months ago

        Roast Pheasant?

        • Klimax
        • 9 months ago

        Beaujolais nouveau?

          • drfish
          • 9 months ago

          Interesting, thanks for teaching me something new today.
