Examining the performance of Nvidia RTX effects in Battlefield V

Nvidia’s Turing architecture promised to bring the holy grail of real-time ray tracing to desktop graphics hardware by blending the best of rasterization and ray tracing into a new form of hybrid rendering, but that revolution took longer to arrive than the company probably hoped. Although GeForce RTX cards have been available to gamers for a couple of months, Microsoft only just finished exorcising a troublesome file-deletion bug that inadvertently delayed the release of the essential-for-RTX Windows 10 October 2018 Update. On top of that, developer DICE and publisher EA pushed back the release of the headlining RTX title, Battlefield V, from October 19 to November 20. Gamers who sprang for Nvidia RTX-ready hardware have understandably been getting antsy for the day when they could finally fire up their cards’ RT cores and trace some rays for realistic takes on effects like reflections, ambient occlusion, global illumination, and more.

The GeForce RTX 2080 and RTX 2080 Ti, powered by the ray-tracing-accelerating Turing architecture

Last week, the support structure that the RTX hardware-and-software stack needed finally began to fall into place. As we noted above, Microsoft made the Windows 10 October 2018 Update available again after a long period of review and retesting, and DICE began opening up early access to Battlefield V for those who had paid for the privilege. Nvidia also released a corresponding driver to get Turing cards’ ray-tracing hardware up and running in that game. We ponied up for the Deluxe Edition of Battlefield V to get into the game early, and we’ve been working hard over the weekend to see just how RTX effects look—and perform—ahead of the game’s release to a wider audience tomorrow.

As we hinted at above, titles that incorporate RTX effects can use Turing cards’ ray-tracing hardware in a variety of ways. In the case of Battlefield V, DICE has opted to use the technology to improve the quality of certain reflections on cars, glass, small puddles of water, stone flooring, and more. The best way to appreciate these reflections is in motion, so I’ve created two 4K gameplay videos to try to show the differences.

To test the performance and appearance of RTX effects, we walked through a portion of the game’s “Tirailleur” single-player mission with an abundance of standing water for light and objects to reflect from. To establish a baseline for what to look for in our RTX-enabled walk through the woods, I traveled through the scene with RTX off and Battlefield V’s settings otherwise left on the ultra preset.

I’ve stopped in a few places in the video above to show some of the ways that the game’s default reflections fall short of realism. You’ll especially want to note how leaves passing over puddles create distracting flickering between dark and light tones in the water, as well as comet-trail-like artifacts in some cases. To my eye, there’s a lot happening in these reflections, but they often miss the mark in ways that break the sense of immersion for the watchful eye.

With RTX on at the game’s low preset, the appearance of reflections in these puddles becomes more subtle and, to my eye, more convincing. Part of that may be because the RTX reflection algorithm doesn’t seem to account for the leaves blowing across the puddles the way the default reflections do; casting that many rays on fast-moving objects may simply be too much to ask at the moment. Still, the fainter images of trees, structures, and such that do get reflected seem more true to life, in my estimation, and there’s much less of the jarring alternation between quite light and quite dark tones that we can sometimes see in Battlefield V’s default reflection rendering.

Despite the photorealism promised by ray tracing, RTX reflections aren’t free of artifacts of their own. As we approach and move up the hill past the guard shack in this video, you might see some of the puddles exhibit what appears to be noise. These artifacts appear as “sparkles” or cross-shaped areas of bright pixels whose source isn’t immediately evident. I didn’t see this kind of noise too often while poking around this section of the game, but when it did appear, it was evident enough to remind me that I was looking at a screen. Whether this noise can be worked out with future software updates and improved use of RTX resources remains to be seen, but at the moment, it’s a reminder that we haven’t quite grasped the holy grail of computer graphics even in the limited context of hybrid rendering.

Now that we have a basic idea of how RTX effects look in practice in Battlefield V, let’s see what those improved reflections ask of the hardware used to generate them.

 

Our testing methods

If you’re new to The Tech Report, we don’t benchmark games like most other sites on the web. Instead of throwing out a simple FPS average (or even average and minimum FPS figures), numbers that tell us only the broadest strokes of what it’s like to play a game on a particular graphics card, we go much deeper. We capture the amount of time it takes the graphics card to render each and every frame of animation before slicing and dicing those numbers with our own custom-built tools. We call this method Inside the Second, and we think it’s the industry standard for quantifying graphics performance. Accept no substitutes.
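To make that concrete, here’s a minimal sketch (in Python, and emphatically not our actual in-house tooling) of how two of the headline numbers we report, average FPS and the 99th-percentile frame time, fall out of a list of per-frame render times; the sample numbers are made up for illustration:

```python
import numpy as np

def summarize(frame_times_ms):
    """Reduce a list of per-frame render times (in ms) to average FPS
    and the 99th-percentile frame time."""
    ft = np.asarray(frame_times_ms, dtype=float)
    avg_fps = 1000.0 / ft.mean()     # mean frame time -> average FPS
    p99 = np.percentile(ft, 99)      # 99% of frames finish faster than this
    return avg_fps, p99

# A mostly smooth run with a few hitches at the end
fps, p99 = summarize([16.7] * 97 + [40.0, 45.0, 50.0])
print(f"Average: {fps:.1f} FPS, 99th-percentile frame time: {p99:.1f} ms")
```

Note how a handful of slow frames barely moves the average but shows up immediately in the 99th-percentile figure, which is exactly why we lean on the latter.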

What’s more, we don’t rely on canned in-game benchmarks—routines that may not be representative of performance in actual gameplay—to gather our test data. Instead of clicking a button and getting a potentially misleading result from those pre-baked benches, we go through the laborious work of seeking out test scenarios that are typical of what one might actually encounter in a game. Thanks to our use of manual data-collection tools, we can go pretty much anywhere and test pretty much anything we want in a given title.

Most of the frame-time data you’ll see on the following pages were captured with OCAT, a software utility that uses data from the Event Tracing for Windows (ETW) API to tell us when critical events happen in the graphics pipeline. We perform each test run at least three times and take the median of those runs where applicable to arrive at a final result. Where OCAT didn’t suit our needs, we relied on the PresentMon utility.
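If you’d like to poke at captures like ours yourself, OCAT and PresentMon both write their results out as CSV files. Here’s a rough sketch of pulling per-frame times from one; we’re assuming the MsBetweenPresents column that PresentMon-derived tools typically emit, and column names can vary between tool versions:

```python
import csv

def load_frame_times(path):
    """Read per-frame times (ms) from an OCAT/PresentMon CSV capture.
    Assumes the usual MsBetweenPresents column; check your tool's output."""
    with open(path, newline="") as f:
        return [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
```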

Here are our in-game settings for Battlefield V. Although it’s not captured in the screenshots below, we varied the game’s “DXR Raytrace Reflections Quality” setting, as well as the resolution, to achieve our desired test configs.

Battlefield V is a work in progress (recall that it launches to the general public tomorrow, November 20). Because early versions of Battlefield V can apparently have issues saving and applying the DXR Raytrace Reflections Quality setting, our results should not be compared to those from sites that did not manually verify the DXR Raytrace Reflections Quality settings they were using. We verified that our chosen settings were being saved in the configuration file by changing the parameter with a text editor, saving the file before launching the game, and checking that our change was reflected in the in-game settings menu.
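For those who want to run the same sanity check, the idea reduces to reading the stored value back out of the profile file before and after a change. The sketch below is illustrative only: the file path and key name are stand-ins we made up, not the game’s actual variable names, so verify both against your own install:

```python
from pathlib import Path

# Stand-in path and key name; the real profile file and DXR setting
# variable in Battlefield V may differ, so check your own install.
PROFILE = Path.home() / "Documents" / "Battlefield V" / "settings" / "PROFSAVE_profile"
KEY = "GstRender.DxrReflectionsQuality"

def read_setting(path=PROFILE, key=KEY):
    """Return the stored value for a key in a 'Key Value'-per-line file."""
    for line in path.read_text().splitlines():
        if line.startswith(key):
            return line.split()[-1]
    return None  # key not present

print(read_setting())
```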

As ever, we did our best to deliver clean benchmark numbers. Each test was run at least three times, and we took the median of each result. Our test system was configured like so:

Processor: Intel Core i9-9980XE
Motherboard: Asus Prime X299-Deluxe II
Chipset: Intel X299
Memory size: 32 GB (4x 8 GB)
Memory type: G.Skill Trident Z DDR4-3200
Memory timings: 14-14-14-34 2T
Storage: Intel 750 Series 400 GB NVMe SSD (OS), Corsair Force LE 960 GB SATA SSD (games)
Power supply: Seasonic Prime Platinum 1000 W
OS: Windows 10 Pro with October 2018 Update (ver. 1809)

We used the following graphics cards for our testing, as well:

Graphics card | Graphics driver | Boost clock speed (nominal) | Memory data rate (per pin)
Nvidia GeForce RTX 2080 Ti Founders Edition | GeForce Game Ready 416.94 | 1635 MHz | 14 Gbps
Gigabyte GeForce RTX 2080 Gaming OC 8G | GeForce Game Ready 416.94 | 1815 MHz | 14 Gbps
Asus ROG Strix GeForce RTX 2070 O8G | GeForce Game Ready 416.94 | 1815 MHz | 14 Gbps

Thanks to Intel, Corsair, Gigabyte, G.Skill, and Asus for helping to outfit our test rigs with some of the finest hardware available. Nvidia, Gigabyte, and Asus supplied the graphics cards we used for testing, as well. Have a gander at our fine Asus motherboard before it got buried beneath a pile of graphics cards and a CPU cooler:

And a look at our spiffy Gigabyte GeForce RTX 2080, seen in the background here:

And our Asus ROG Strix GeForce RTX 2070, which just landed in the TR labs:

With those formalities out of the way, let’s get to testing.

 

Battlefield V (1920×1080)


Our first and least-demanding round of testing suggests that while real-time ray tracing may be a reality in Nvidia’s new hybrid-rendering vision, it’s quite intensive for Turing graphics cards to accelerate. Moving from RTX off to BFV’s low RTX preset halves average frame rates and doubles 99th-percentile frame times for all of the cards on the bench. Those 99th-percentile frame times remain well-controlled, though, and that squares with the fact that BFV feels plenty smooth to play at this resolution on all of the graphics cards we tested, even if frame rates aren’t as fluid-feeling as they are with RTX off.

It’s only when we start to crank the RTX preset higher that the lesser pair of Turing cards on the bench begins to really show signs of strain. Let’s have a look at how long these cards had to spend working on the most demanding frames they had to render over the course of our test run using our advanced metrics.


These “time spent beyond X” graphs are meant to show “badness,” those instances where animation may be less than fluid—or at least less than perfect. The formulas behind these graphs add up the amount of time our graphics card spends beyond certain frame-time thresholds, each with an important implication for gaming smoothness. Recall that our graphics-card tests all consist of one-minute test runs and that 1000 ms equals one second to fully appreciate this data.

The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you’re not rendering any faster than 20 FPS, even for a moment, then the user is likely to perceive a slowdown. 33.3 ms correlates to 30 FPS, or a 30-Hz refresh rate. Go lower than that with vsync on, and you’re into the bad voodoo of quantization slowdowns. 16.7 ms correlates to 60 FPS, that golden mark that we’d like to achieve (or surpass) for each and every frame.

To best demonstrate the performance of these powerful graphics cards, it’s useful to look at our three strictest graphs. 8.3 ms corresponds to 120 FPS, the lower end of what we’d consider a high-refresh-rate monitor. We’ve recently begun including an even more demanding 6.94-ms mark that corresponds to the 144-Hz maximum rate typical of today’s high-refresh-rate gaming displays.
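For the curious, the arithmetic behind these charts is simple: for every frame that takes longer than a threshold, we add the overage to a running total. Here’s a minimal sketch of that accumulation at each of the thresholds above, again using made-up frame times:

```python
def time_spent_beyond(frame_times_ms, threshold_ms):
    """Sum the portion of each frame's render time past the threshold.
    A 25-ms frame contributes 8.3 ms against the 16.7-ms bar."""
    return sum(max(0.0, ft - threshold_ms) for ft in frame_times_ms)

frame_times = [16.7] * 97 + [40.0, 45.0, 50.0]   # a run with a few hitches
for fps in (20, 30, 60, 120, 144):
    threshold = 1000.0 / fps                     # 50, 33.3, 16.7, 8.3, 6.94 ms
    print(f"beyond {threshold:5.2f} ms ({fps:3d} FPS): "
          f"{time_spent_beyond(frame_times, threshold):7.1f} ms")
```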

Happily, none of our graphics cards accumulates any time working on frames longer than 50 ms, and even the RTX 2070 on the medium RTX preset only puts up a vanishing few milliseconds at the 33.3-ms mark. If we saw larger accumulations of time spent beyond those thresholds, we’d expect choppy gameplay that would badly tarnish any gains in fidelity from RTX reflections.

At the 16.7-ms mark, however, we can start to see some signs of strain. For the low RTX preset, the RTX 2080 Ti impresses with only a few unnoticeable milliseconds put up here, and the RTX 2080 puts up a similarly impressive performance. The RTX 2070 spends just four seconds of our test run working on tough frames that need longer than 16.7 ms to render, too—a plenty credible result for the most attainable Turing card so far. Even as we crank the RTX preset to high or ultra on the RTX 2080 Ti, the card has little trouble delivering a pleasant 60-FPS experience. The same can’t be said for the RTX 2080, though, and the RTX 2070 groans mightily even under the load of the medium preset.

For those looking for still-higher frame rates with RTX effects, the RTX 2080 Ti is the only game in town, and only at the low preset. The card spends a reasonable eight seconds of our one-minute test run working on tough frames that would drop the instantaneous frame rate below 90 FPS. Use any other Turing card or start cranking RTX presets, though, and an experience in the neighborhood of 60 FPS should be as high as you set your sights.

Our first taste of Turing performance with RTX reflections enabled puts a steep price on the extra visual fidelity that architecture can provide through ray tracing. Let’s see if any of these cards can maintain a playable experience at 2560×1440.

 

Battlefield V (2560×1440)


As we up our resolution to 2560×1440, the same basic formula we just saw holds for performance with the lowest level of RTX reflections enabled: take the average frame rate with RTX off and divide by two, or take the 99th-percentile frame time and double it.

Depending on your personal threshold of playability, the 99th-percentile frame times our Turing trio puts up suggest smooth frame delivery remains the order of the day at 2560×1440 with RTX effects set to low. All three cards stay inside of the 33.3-ms mark that indicates frame rates are holding above 30 FPS, and the RTX 2080 and RTX 2080 Ti stay well within that margin. That figure is also worthy of note to Nvidia graphics-card owners because 30 FPS is about the minimum frame rate at which G-Sync monitors can continue to do their thing with GeForces.

Crank the RTX effects preset to medium at this resolution, however, and only the RTX 2080 Ti maintains smooth sailing. This preset overwhelms the RTX 2080 and RTX 2070, dropping average frame rates to 30 FPS at best and spiking 99th-percentile frame times well past what we would consider playable. Using the high or ultra RTX presets pushes even the RTX 2080 Ti to the limit if we consider 33.3 ms a playable 99th-percentile frame time, and the RTX 2080 is well and truly out of gas. (We didn’t test the RTX 2070 at the highest RTX presets for that reason).


Our time-spent-beyond-50-ms chart captures a few milliseconds from one massive frame-time spike late in the RTX 2080 Ti’s run, but our other cards’ noses remain clean. Not quite so at the 33.3-ms mark, where the RTX 2070 spends eight seconds working on tough frames that drop instantaneous frame rates below 30 FPS when it’s asked to cope with the medium RTX preset. The RTX 2080 starts to lose its cool at the RTX medium, high, and ultra presets here, as well. Still, the low RTX preset lets all three of our cards at least stay off this important chart.

Flip over to the 16.7-ms mark, however, and all of our cards put at least some time up on the board with RTX on. The RTX 2080 Ti posts a hardly worrisome three seconds or so spent working on tough frames that would drop instantaneous frame rates below 60 FPS under the low preset, but upping the RTX quality to medium more or less doubles that figure. The RTX 2080 spends a quite-noticeable 11 seconds past the mark with RTX low, however, and the RTX 2070 spends six seconds more in trouble than that.

Cranking RTX presets higher imperils all of our cards to varying degrees if a consistent 60 FPS is the goal. The RTX 2080 Ti suffers a massive increase in time spent beyond 16.7 ms with the high or ultra presets in use, and the RTX 2080 at RTX medium spends half the period of our test run under 60 FPS. Cranking that preset higher on the RTX 2080 suggests there’s not much further left for that card to fall, while the RTX 2070 brings up the back of the pack even with the medium preset. To achieve even a consistent 60 FPS with RTX on at all at 2560×1440, get ready to empty your wallet.

 

Battlefield V (3840×2160)


Given the results on the preceding pages, it should come as no shock that 4K gaming with RTX on is an incredibly taxing workload. Going by 99th-percentile frame times, only the RTX 2080 Ti can stay on the right side of playability at all, and then only by a whisker with BFV’s low RTX preset. The RTX 2080 and RTX 2070 are already on the wrong side of the 33.3-ms line with the low RTX preset, and cranking the RTX 2080 Ti to medium serves only to tank its performance.


Our time-spent-beyond-X graphs help to put only a slightly better face on the RTX 2080 Ti’s performance with the low RTX preset. The uber-Turing puts only a vanishing 30 ms on the board at the 33.3-ms mark, suggesting that gamers with “cinematic” tastes might be able to enjoy ray-traced Battlefield V vistas with a consistent 30 FPS. Even the RTX 2080 spends only two seconds, but really, that’s not good enough for a game that demands precise mouse movement like Battlefield V does. Meanwhile, the RTX 2070 nopes out at the 33.3-ms mark even with the low RTX preset at 4K, while the RTX 2080 Ti at medium puts all the time on the board we need to call its work a chug-fest. It seems 4K gaming with RTX effects is going to be the next hurdle to clear for Nvidia’s ray-tracing products of the future.

 

Conclusions

After staring at way too many puddles for way too many hours over the past couple of days, I can confidently say that Nvidia’s RTX reflections are often more lifelike than the base reflection technique DICE has implemented for Battlefield V. With RTX on, I saw far fewer of the jarring reflection artifacts that gave away the fact that I was looking at a game. A few cases of distractingly noisy puddles aside, I got plenty of glimpses of lifelike reflections that made me feel more immersed in the simulated French forests I snuck through for our tests.

Those sometimes-superior reflections come at an incredible performance cost, however. Even at their lowest settings tier, RTX reflections cut Battlefield V’s average frame rates in half on every GeForce RTX card released so far at 1920×1080, although all three cards at least have some wiggle room to increase RTX presets at that resolution—plenty, in fact, in the case of the RTX 2080 and RTX 2080 Ti.

At least in our test area and at 2560×1440, the RTX 2070 hangs on to a tolerable 99th-percentile frame time at the low RTX preset, and the RTX 2080 stays well within the 33.3-ms mark critical to keeping G-Sync monitors happy. Turning up the RTX effects from there results in unplayable average frame rates and 99th-percentile frame times for our lesser RTX cards, leaving only the RTX 2080 Ti standing. 4K gaming with RTX on isn’t practical without an RTX 2080 Ti, and even then, that card isn’t offering anything more than bare-minimum performance for interactive gaming at the low RTX preset.

While the RTX 2070 will certainly let gamers dip their toes into what RTX can do, I felt the best balance of visual richness and performance came from playing at 2560×1440 with the low RTX preset on the RTX 2080 or the medium preset on the RTX 2080 Ti. I honestly didn’t see a ton of on-screen difference from turning Battlefield V’s RTX slider to high or ultra in our test area, but at 2560×1440, those settings murdered performance on the RTX 2080 and left the RTX 2080 Ti at the ragged edge of what even G-Sync can save. If you do want to turn those knobs to 11 for their own sake, the RTX 2080 Ti is unsurprisingly your best bet.

Despite the enormous hit to average frame rates, it’s commendable that RTX effects don’t result in horribly uneven or jerky frame delivery from Battlefield V. So long as gamers with GeForce RTX cards can tolerate 1920×1080 or 2560×1440 gaming and less-fluid frame rates overall, our results suggest they’ll still be able to enjoy consistently paced frames that help maintain a sense of immersion.

On the whole, I don’t think gamers who already have high-end graphics cards with performance comparable to an RTX card’s should ditch their current pixel-pusher and go buy one of Nvidia’s latest on account of Battlefield V’s reflections alone. We’re in the early days of this technology, and the effects I witnessed probably have room for improvement as developers get their coding fingers around real-time ray tracing and the hybrid rendering approach that Nvidia’s Turing cards require.

Still, if you were planning to upgrade to an RTX 2070, RTX 2080, or RTX 2080 Ti to begin with, you’ll be happy to have RTX reflections in your load-out when you fire up Battlefield V, and after this first taste of the tech, I’m eager to see what other studios can do with Nvidia’s hybrid-rendering tool kit for shadows, global illumination, ambient occlusion, and more. I’m hopeful it won’t be a long wait to find out.

Comments closed
    • morete
    • 11 months ago

    Um…ray-tracing was specifically meant for fixed objects, not moving objects or moving shadows and reflections. I don’t know if this was just marketing hype from Nvidia as a “me first” thing, or what, but it seems silly to me. The original concept of ray tracing was to take a real-life analog still object or even an entire structure and project imaginary light rays (“laser beams”) from a camera (called raycasting) on that still object to trace either its entirety or specific parts of the object. As the still object is traced, the analog capture is converted to a digital form. Ray tracing is so precise that it captures even the most minute details of an object. Because of this, there are so many pixels crammed into the entire capture that today’s graphics cards wouldn’t be able to process these images in a game without bringing a 60-frames-per-second gaming session down to 2 frames per minute. For moving objects to be ray-traced (if that is even possible), the constant changing of shades, shadows, hues, colors, and light reflections of that moving object would be so complex and involve so many pixels within that moving capture that it would be like dividing by zero for your graphics card. That said, ray-tracing can be used for any object and then extremely dumbed down for the sake of ease and shortcuts when it comes to game developers creating graphics for a game, but then we haven’t advanced towards photorealistic games. We only have rasterized games with ray-tracing monikers on them.

      • GrimDanfango
      • 11 months ago

      I think you’ve got your wires crossed just a bit there.

      “Raytracing” has absolutely nothing to do with real-world object capture. It’s an algorithm for turning polygonal data into a flat image, just the same as triangle rasterization is. Raycasting just means finding intersections with polygonal objects along the path from the scene camera in the direction of a particular pixel. Raytracing is the technique of emitting secondary raycasts from that intersection to also calculate reflections/refractions/shadows.

      Raytracing is only unsuitable for animation in so far as it’s slow, and raytracing 60 frames per second at a high quality level is very heavy. In the VFX industry, raytracing is used almost exclusively these days. In games, it’s just beginning to become viable to use it for certain effects that benefit from it the most, in a hybrid approach that raytraces certain layers and rasterizes others, combined with a heavy dose of GPU-accelerated de-noising to make up for the low quality level they need to run it at in order to keep it running at a reasonable framerate.

      Even considering 3D capture technologies, rather than rendering technologies, you seem to be conflating a bunch of techniques into one imagined holy-grail.
      Laser scanning has been around for decades, and will typically convert an object on a turntable into a dense point-cloud, without any shading or lighting information… that point cloud isn’t good for much on its own – it can either have a rough automatic algorithm applied to get a hacky 3D model out of it, or be used as reference for someone to manually create a well modelled approximation.

      Photogrammetry has fairly recently gained popularity – has nothing to do with lasers or rays, it’s an entirely software-based approach to constructing 3D point cloud data from photo-sets captured using ordinary digital cameras (requiring *very* careful, meticulous photography to put those photo-sets together, to get anything useful out of the software).
      That point-cloud data contains the baked-in colour information from the photographs, so it can be used to bake texture maps as well as to reconstruct 3D polygon meshes… again, usually by hand to get anything particularly clean and useful.

      There’s some recent cutting-edge R&D into light-field capture, which is probably the closest to what you seem to be talking about, but it’s so experimental, there’s almost no real-world application of it yet. When that does reach some kind of maturity, it’ll likely be something that we’ll see crop up in VR mostly, as it’ll be sort of analogous to taking a 3-dimensional photograph that you’ll be able to move your head around a small region inside, and see parallax depth in the photograph. That probably will be prohibitively difficult to extend to real-time animated sequences any time soon, as the data sets would indeed be insanely large.

      – but anyways, that’s light-field capture… not raytracing.

    • GrimDanfango
    • 11 months ago

    Why are there standing puddles of water on, and wind-blown ripples running *up* steep hillsides?

    It all looks horribly over-cranked too. Didn’t DICE used to be good at this photorealism lark? Feels like they’ve stopped using reference photos, and just started going with their imagination of how real lighting works.

      • DoomGuy64
      • 11 months ago

      That’s why there are robot-armed women in combat beating WWF wrestlers with cricket bats, and katana-wielding British soldiers. Realism went fully out the window with this game. They then told people not to buy it if they didn’t like it, had to fire people, delayed the game, and watched their stock price fall drastically.

      I don’t think BFV is going to be nearly the success BF4 was, but those empty servers will give you plenty of time to enjoy the RTX effects.

        • GrimDanfango
        • 11 months ago

        Mechanical realism has never been a thing for the BF series, even in its less absurd incarnations. There’s ARMA for that.

        They have always been about advancing the state-of-the-art of pure eye-candy though. At this point, I feel like they’re ending up getting swallowed up by the Hollywood VFX imagined-reality vortex – where everything aims to look like the movies, including the movies, resulting in this bizarrely over-processed mess, which others then use as reference, and so the ugly cycle continues.

    • fellix
    • 11 months ago

    In my opinion, Nvidia should have kept Turing for the professional SKUs until developers got used to it, and instead released Volta to the consumer market. I mean, Turing is just Volta + RTX, and Nvidia already supports DXR on Volta through compute emulation. Volta has enough performance and feature upgrades to be a worthy next-gen release on its own. And then 7 nm would be the perfect match for a consumer Turing premiere.

    • Gastec
    • 11 months ago

    The RTX effects for now are more useful for taking screenshots than for gameplay, and I’m not even kidding. Beautiful screenshots sell the product better.

    • Spunjji
    • 11 months ago

    Finally got to watching the videos in 4K. I felt the difference in frame-rate was more noticeable than most of the effects of RTX, though at times that definitely added… something, especially near the fire where the light effects on the gun just bring you into things a little.

    I didn’t really feel anything strongly about the puddles in either implementation. RTX-off tried to do something nice with reflecting leaves and nearly succeeded, but the puddles just look too bright and the artifacts are weird once you see them. RTX-on and… well, no leaves now, and still too bright, and still artifacting around the edges of the leaves. Better maybe, but neither is a great aesthetic choice. Muddy puddles just don’t look much like that in either case. Hard to feel anything conclusively for or against the technology in aesthetic terms.

    I definitely don’t think I’d be up for dropping to 1440p (or 1080p on a weaker card) just to get the frame rate back up where it should be. But then I’m running a GTX 980M and won’t be upgrading soon because all these cards are out of my price range, so this is all academic to me and who cares what I think. 😀

    • Tirk
    • 11 months ago

    Get 90% of the visuals and 2-3X the performance. Or get 10% more visuals and cut your cards performance by over half. Is that really a choice?

    • evilpaul
    • 11 months ago

    What are the clock speeds on the core and power consumption with the RTX cores on versus off?

    • rutra80
    • 11 months ago

    I wonder when miners will figure that mining is no longer profitable and will release all those neat graphics cards to aftermarket, and what will happen to RTX prices then…

    • Rakhmaninov3
    • 11 months ago

    The leaves are a cool effect for sure, but… none of the tree branches or overhead netting move enough to really convince me that it’s windy. I’d trade the reflections in either example for more convincing treatment of the rest of the scenes.

    • DPete27
    • 11 months ago

    So Nvidia has officially raised the entry for 1080p gaming to a $500 GPU…..clever

      • Gastec
      • 11 months ago

      The corporations are winning. They have figured it out that all we do is write text online 🙂

    • rutra80
    • 11 months ago

    Puddles of mercury.
    Now launch Half Life 2 Lost Coast from 2005 and tell me that the improvement looks like 13 years of progress worth a grand of expense.
    Don’t get me wrong I’m huge fan of ray-tracing and I’m glad that things go that way, but yet I’m more impressed by this:
    [url<]http://www.pouet.net/prod.php?which=78488[/url<]

      • Star Brood
      • 11 months ago

      Disclaimer: that link is NSFW.

      I had a look at HL2 Lost Coast and you are absolutely correct. Couple that with bumping up the texture resolution and you have a modern game where very few could tell the difference.

      Additionally, I felt like the video with RTX off looked better. Ironic. Disabling a niche feature to improve performance and visual quality at that.

    • blastdoor
    • 11 months ago

    At some point you have to ask yourself — if visually realistic puddles is your thing, why not just take a walk outside after a rainstorm?

      • ptsant
      • 11 months ago

      If you do that while carrying an assault rifle and start shooting stuff your life expectancy will be considerably reduced. Safer in a virtual environment.

      • Anonymous Coward
      • 11 months ago

      It’s hard to overstate how astonishingly large the effective computing power of the universe is.

        • ptsant
        • 11 months ago

        How many gigarays/sec does the sun have?

          • Kjella
          • 11 months ago

          The sun emits 2.8 * 10^26 J/s, and a 700nm photon is 2.7 * 10^-19 J, so about 1 * 10^45 photons/second; you’d have to integrate across the spectrum for an exact answer. But that’s roughly 1000000000000000000000000000000000000 gigarays/sec.

            • DoomGuy64
            • 11 months ago

            Kinda puts a damper on the whole universe is a sim theory. If it was a sim, it is a physically created one….

            Maybe matter was created by beings in another dimension, and everything in this dimension is “fake”, but IMO it’s not being processed by some server. Manipulated, perhaps. We’re probably more like Gerbils in a cage being watched for science and amusement. Space Odyssey stuff.

    • Laykun
    • 11 months ago

    I’m glad you have the option of turning it on or off, but as a graphics programmer, a setting that doubles your frame time for an effect that makes an admittedly minimal difference in visuals is considered unacceptable for a shipping title. This really is just a tech demo feature that nvidia paid for to hype RTX.

    EDIT: Minimal might be a bit of an understatement; it has an appreciable effect, but it’s not 50%-of-the-frame-time appreciable.

      • Laykun
      • 11 months ago

      I also like that the effect uses the fallback of SSR when the scope glass passes over the environment. Once you see that you can’t unsee it.

    • djayjp
    • 11 months ago

    I personally think the reflections (upon reflections) technique used in Hitman 2 is far superior to these given the results and the cost (negligible). DICE should’ve used that technique instead.

    • hubick
    • 11 months ago

    Let me know when we can have destructible environments, where you can arbitrarily crush a building with a tank because the results no longer have to be pre-rendered for lighting, etc.

    The whole benefit of tracing is that it works with the dynamic stuff, and we can’t even do leaves w this? 🙁

    • Chrispy_
    • 11 months ago

    How many games that take huge playability and performance nosedives with RTX features will it take before developers just ignore RTX?

    If the big boys can’t do it, why should the smaller devs bother?

    Sure, the tools Nvidia gives developers may improve over time, but at the moment the glitches and performance hit are so bad that it makes a terrible first impression, and it’s not competing in isolation. It’s competing against 3-4x higher framerates on the same card without RTX features.

      • albundy
      • 11 months ago

      The game looked playable to me. There’s no point in being sedentary. If 3dfx could take a chance decades ago, why can’t Nvidia?

        • Voldenuit
        • 11 months ago

        If you’re in multiplayer, the player with the 60 fps average frame rate will be at a disadvantage to the one with the 120 fps frame rate, even with a 60 Hz server tick.

        Also, I don’t think anyone who buys a $1200 GPU is happy with 60 fps at 1080p, even with all the pretties.

          • enixenigma
          • 11 months ago

          Another thing to consider: does it state anywhere in the article exactly what scenario was tested? Multiplayer tends to be subject to more variability than single player. A smooth 60 fps in a campaign mission may not translate to 60 fps in a 64-player multiplayer map.

            • Voldenuit
            • 11 months ago

            Very few publications test multiplayer, because it’s very hard to get reproducible results.

            I do believe a couple of youtube sites have tested (or at least done spot checks on) multiplayer, and I remember the numbers being lower than singleplayer, although I don’t think it was by much (the RT cores are really the bottleneck in BF V, leaving the CUDA cores and CPU idle for a lot of the time).

          • Chrispy_
          • 11 months ago

          Good luck getting 60fps when it matters, though. That’s a 60fps [i<]average[/i<], with muddy wet scenes hovering in the 30s. On a $1200 2080 Ti. At 1080p. Yes. I said nosedived, and by that I meant “jumped face-first into the floor with devastating side-effects”. But hey, at least your beaten, losing player character will look 2% prettier with RTX features!

        • Laykun
        • 11 months ago

          Being the target audience, spending that much on a graphics card, I would NOT be happy gaming at 30fps. Playable is not good enough when you spend that kind of dosh; I expect 90 FPS+ for the 144 Hz G-Sync monitor they sold me yesteryear.

          • Spunjji
          • 11 months ago

          I think this is the real crunch. They’ve straight-up switched narrative on their most profitable (read: highly invested) customers, and in the process have given them an either/or choice of features (high refresh rate, OR shiny new effects) while still charging top dollar “no compromise” prices. It stinks.

    • synthtel2
    • 11 months ago

    Jeff, how was input latency?

    • synthtel2
    • 11 months ago

    The line between reflective and dull surfaces is way too harsh in both cases, but it stands out a lot more in the non-RTX case because it looks like specular occlusion is lacking. Top-notch (long-range) specular occlusion could be implemented for a tiny fraction of the performance cost of this raytracing, and I think would get very close to the visual quality of the ray-traced reflections in those scenes.

    Urban scenes will probably show more benefit from having a 3D representation of the reflection environment, but I still think voxels are a better bet and will be for quite a while. VXGI would be fairly cheap if used for specular only, for example (the most expensive part of VXGI is sampling across a whole hemisphere to get diffuse results, and even that is much cheaper than raytracing).

    • Soossi
    • 11 months ago

    I won’t be staring at those small puddles if I’m playing this kind of fast paced action game. I suppose those nvidia owners are happy since their puddles render a bit more nicely if you really stare at them.

    • setaG_lliB
    • 11 months ago

    It’s like GeForce FX and shader model 2.0 all over again!

      • DoomGuy64
      • 11 months ago

      Maybe, but IMO it’s closer to GF6 and SM3/HDR. 30 FPS and no AA. The only difference between the two is that most people called the GF6 a “success” when SM3 was clearly a failure on that card. Nvidia also played hard with the shenanigans before and after AMD supported SM3, and that was also annoying.

      If you want to see how much of a failure Nvidia’s SM3 implementation really was, all you have to do is compare XB360 games to PS3 games, because both consoles were using the latest SM3 hardware. The 360, which was using a hybrid x1900, did it better.

      At least this time around, nobody is calling subpar quality and performance a “resounding success”. That was surreal.

      Not saying raytracing is bad, but this implementation certainly is, and I hope we don’t get a repeat of the SM3 war.

        • Laykun
        • 11 months ago

        How does comparing those two consoles show that Nvidia’s SM3.0 implementation was poor? I’m quite keen to know, because the substantial difference between the two, to me, is that the 360’s Xenos GPU had unified shaders while the PS3’s did not. That seems like it’d have a much more substantial effect on the outcome of any graphics comparison, seeing as you don’t have to gingerly balance vertex and pixel shader operations to make optimal use of the resources you have.

        Also if it were based on the X1900 GPU it’d have substantially more pixel shading power than it actually does. It’s based on the x1800, which is much closer to the 7800 the PS3 is based on.

          • DoomGuy64
          • 11 months ago

          Xenos is based on the x1800 the same as the x1900 was. It however is NOT in any way an x1800, as the x1800 had “equal ratio of texturing to pixel shading capability”. Both Xenos and the x1900 have ratios of 16:48

          The x1900 also had similar functions to Xenos, as its pixel shaders could do vertex work? Something along those lines. Render to vertex buffer, maybe. Much like the 4870 supported tessellation but was never updated to DX11. The hardware kinda supported it; the software did not, unless you directly programmed for it.

          Xenos was an updated x1900 with legacy features like vertex removed, and an added cache. ATI did this double feature support for a long time, as far back as the 8500 which supported both T&L and shaders that could emulate T&L. (T&L removed in the 9000 budget chips, but it still worked via shaders.)

          And yes, the Xenos has more shading power than the PS3, but it’s not an order of magnitude greater. It just performed better and was more flexible, because Nvidia’s SM3 was that bad. That said, there were platform limitations, but the 360 was still better. The RSX had 24:24, and it could do more texturing, but its shading was under-powered compared to Xenos. The PS3 obviously had to cut back on its shader effects. Nvidia never did have “good” SM3 shaders. They had marketing, and unlike consumers, console manufacturers stopped dealing with Nvidia after the PS3 because of their price gouging.

            • tipoo
            • 11 months ago

            Every console maker that partnered with Nvidia historically has switched off them in the following generation. It’s an interesting trend, and what comes after the Switch (not the 2019 refresh) will be interesting to see.

            Part of it is also AMD’s desperate position and willingness to make semicustom parts.

    • exilon
    • 11 months ago

    Not only do you lose >50% FPS with DXR, you also have to use BFV’s DX12 engine, which is slower and more stuttery compared to the DX11 implementation.

      • ptsant
      • 11 months ago

      They still haven’t fixed it? I’m playing BF1 and indeed DX11 is much smoother, even if the average fps with DX12 is higher. And it’s not one of these things that you need to measure, you can actually feel it.

        • exilon
        • 11 months ago

        Nope, still haven’t fixed it as of last week.

        [url<]https://www.youtube.com/watch?v=1x4PKh2LF3A[/url<] 15-20% loss in DX12 before DXR versus DX11 on a 2080 Ti. Maybe the latest patch shored it up. v0v

      • Jeff Kampman
      • 11 months ago

      I don’t see any evidence of issues with speed or stutter from the most recent update of the game under DX12 mode, at least with DXR off. As I exhaustively detailed in the article, turning DXR on halves the average frame rate but generally doesn’t disproportionately raise the 99th-percentile frame time unless you simply push the setting too high.

        • ptsant
        • 11 months ago

        Indeed, the cumulative graphs are really impressively flat. Very little variance in there.

    • WhatMeWorry
    • 11 months ago

    PPP: [url<]http://www.puddlespityparty.com/[/url<]

    • chuckula
    • 11 months ago

    I don’t see what the issue is.

    It’s easy to spot RTX.

    Here’s a simple example:

    [url=https://goo.gl/images/X6gNRH<]RTX off[/url<] [url=https://goo.gl/images/J5vwwv<]RTX on[/url<]

      • Krogoth
      • 11 months ago

      Meanwhile, AMD RTG just eats glue out of the bottle.

      • Star Brood
      • 11 months ago

      Nvidiayyy

    • brahman05
    • 11 months ago

    @Jeff
    Question, sir: any plans on doing a follow-up test with SLI? I am quite curious if this can scale across a distributed workload. As power draw decreases with RTX on, I truly feel that an add-in card without CUDA and just RT, and maybe tensor cores as well, might be in Nvidia’s best interests, once games start coming out to utilize it, that is. But I always try to build multi-GPU setups, and when I can, add an older Nvidia for PhysX. If they took a 750 Ti, the only Maxwell 100-series chip, a compute powerhouse, and added some RTX special sauce… With my luck I would put it all together, hit play, and my computer would say, “I’m sorry, I can’t do that, Dave.”

    • swaaye
    • 11 months ago

    Those are some creepy jumping leaves. Run!

    It looks fine but diminishing returns all around.

    • WaltC
    • 11 months ago

    I thought the RTX off videoclip looked much better than the RTX on clip, myself. In terms of the “reflections,” I saw nothing there that could not be done as well or better via rasterization. Yawn. I still find it laughable that nVidia was unable to ship real-time demos of RTX even with its $1k+ RTX products–right, nVidia, the card manufacturer, couldn’t do it–but a game developer could?…;) Color me skeptical–color me RTX off. Ugh. More marketing aimed at the gullible.

      • Voldenuit
      • 11 months ago

      [quote<]I thought the RTX off videoclip looked much better than the RTX on clip, myself. In terms of the "reflections," I saw nothing there that could not be done as well or better via rasterization.[/quote<] The TR clip was probably not the best representation of reflections, with the forest map only having reflective areas in muddy patches. The town maps have a lot more reflection on a variety of surfaces (windows, cars, water). Also, I've seen videos of RTX Off which had very obvious artefacting when leaves skittered across the water. Hardware Unboxed has comparison videos in different areas at different settings: [url<]https://youtu.be/SpZmH0_1gWQ?t=169[/url<]

      • stefem
      • 11 months ago

      You may prefer screen-space and cubemap reflections over ray-traced ones, but that doesn’t mean rasterized reflections aren’t wrong, or that what RTX reflections do is practically possible with rasterization.

    • ermo
    • 11 months ago

    @Jeff:

    Which sort of expectations did you have prior to doing this? And is the RTX tech “worth it” to you personally?

    (To me, it seems like this is *very* early adopter tech)

    P.S. Thanks for adding a link to the article in the comment section blurb. The lack of back-links has been a pet peeve of mine for I don’t know how long now…

      • JustAnEngineer
      • 11 months ago

      The links have always been there for the folks that know the secret handshake. :-p

      The image next to the blurb is a hyperlink back to the article.

        • ermo
        • 11 months ago

        Cheers — thanks for the tip!

    • Voldenuit
    • 11 months ago

    Thanks for adding the tested resolution to the chart titles, Jeff!

    I know it’s a small thing to harp on, but it makes it a lot easier to keep track when you’re flipping back and forth across pages comparing data. (And for engaging in Reddit arguments, let’s be honest here.)

    • leor
    • 11 months ago

    It’s as if thousands of voices all cried out “Meh.” and were suddenly… vindicated?

    This gen of products makes 0 sense at their current price points. They are literally double what was the general standard a few years ago (600, 400, and 250), and they functionally aren’t ready to run this tech. What am I slicing my frame rate in half for, pretty puddles?

    [Edit] I didn’t realize I had to explicitly spell this out, but when I said “THEY” I was referring to all 3 cards: the 2080 Ti at 1200, the 2080 at 800, and the 2070 at 500, half of which would be 600, 400, and 250. If you want to get down to the penny in an effort to make some kind of obscure point, by all means have fun, but I’ve been buying these cards for years and those ranges were the stable norm for a while.

    I’m sure at some point in the future this will all be very cool, but gouging early adopters before any of these numbers come out is just a dick move. I’m guessing there will be a lot of buyer’s remorse (or selective perception) floating around this week, especially for those who upgraded from last gen.

    But hey if you got the scratch, and you had a card a few gens old, screw it, enjoy what you got, I’m just very irritated at nvidia right now.

      • thedosbox
      • 11 months ago

      [quote<] They are literally double what was the general standard a few years ago - 600, 400, and 250 [/quote<] No idea what you've been smoking, but the MSRP for a GTX 1080 was $600 and the GTX 1070 was $380. The RTX range starts at $500 for a 2070 with performance that falls around the 1080/1080 Ti range.

        • leor
        • 11 months ago

        I feel like I said a FEW generations ago, so I guess I’ve been smoking reading comprehension?

        Here ya go, puff, puff, pass.

        [url<]https://www.techspot.com/article/1191-nvidia-geforce-six-generations-tested/[/url<] Question: how do you like your frogs legs? Answer: Slow boiled

          • thedosbox
          • 11 months ago

          Per your own link:

          a $600 GTX 1080 outperformed a $550 GTX 980
          which outperformed a $700 GTX 780 Ti
          which outperformed a $500 GTX 680 etc.

          None of those cards started at $400 or $250. And while it’s a fair point that no RTX cards have been released at those price points, I’m not sure that you’d *want* one given the performance results – better off sticking with Pascal.

            • leor
            • 11 months ago

            Edited my post for you so you can better understand the context I hoped we all shared at this point.

            Your comment about us not *wanting* one at that price point means you missed the boiling-frog reference and have bought into Nvidia’s hype. New generations of cards SHOULD outperform their previous-gen counterparts at a similar price to the previous gen’s launch price.

            I don’t think anyone would like it if Intel launched a consumer CPU at 500, then launched the next gen at 850 a couple of years later using the previous gen’s performance as the baseline price point.

            Well maybe chuckula would 😛

            • Voldenuit
            • 11 months ago

            [quote<]I don't think anyone would like it if Intel launched a consumer CPU at 500, then launched the next gen at 850 a couple of years later using the previous gen's performance as the baseline price point.[/quote<] Ironically*, the i9 9900K has been hitting $900 street prices lately. * Or is it aptly? I can't tell anymore.

            • thedosbox
            • 11 months ago

            [quote<] Your comment about us not *wanting* one at that price point means you missed the boiling frog reference, and have bought into nvidia's hype. New generations of cards SHOULD outperform their previous gen counterpoints for a similar price to the previous gen's launch price. [/quote<] They might not outperform them by a big enough margin for many people, but the new cards DO outperform their previous generation counterparts (based on price rather than product name) on the same workloads. DXR performance is a different matter, but I happen to think the RTX cards are effectively technology demonstrators at this point. As for the lower tier cards, it's not possible to judge how they do on existing workloads when they haven't been released yet. My comment was with respect to (presumably shitty) DXR performance of the $250 cards based on what we've seen thus far.

            • leor
            • 11 months ago

            Of course they outperform them; it would be nonsensical if they didn’t. My issue is, and only ever was, with the pricing. This is the nastiest gouging of early adopters with no hard performance numbers that I think I have ever seen.

            From a link posted elsewhere in this thread forwarded to the appropriate time frame.

            [url<]https://youtu.be/SpZmH0_1gWQ?t=864[/url<]

            • ptsant
            • 11 months ago

            I believe when he says 600, 400 and 250 he refers to the card tiers within a given generation, not the evolution of prices for the top part. I almost always bought the second-best card in a given generation and $400 is about right, even a bit generous. The 970, for example, was launched at $329 and soon sold below that.

      • drfish
      • 11 months ago

      [quote<]But hey if you got the scratch, and you had a card a few gens old, screw it, enjoy what you got, I'm just very irritated at nvidia right now.[/quote<] That's where I'm at, not happy about it, and I'm still waiting for a specific card to be released, but I don't feel like I have a better option. Three years + two days ago, I bought my 980 Ti hybrid for $175 after cashing in my Amazon credit card points. Funny enough, I could buy a 2080 Ti right now for $165 out of pocket. Alternatively, I could get a 2080 for "free" - but I'd just feel bad purchasing early 2017 performance in late 2018. There is just no feel good option out there right now, even for idiots like me who are willing to play Nvidia's game.

        • drfish
        • 11 months ago

        “Fish called himself an idiot again!” *everyone upvotes furiously*

          • drfish
          • 11 months ago

          Oh, come on! 😛

            • Arbiter Odie
            • 11 months ago

            You’re on the front page, brace for further upvotes hahaha

            • K-L-Waster
            • 11 months ago

            And with one more post we can have you make a clean sweep!

            • drfish
            • 11 months ago

            It wasn’t meant to be this time, but someday… Someday…

          • gecko575
          • 11 months ago

          I upvoted, and I don’t even know what’s going on.

          *insert “I’m doing my part meme here*

            • drfish
            • 11 months ago

            [url<]https://techreport.com/discussion/31410/a-bridge-too-far-migrating-from-sandy-to-kaby-lake?post=1021402[/url<]

        • leor
        • 11 months ago

        I have a G-Sync monitor and an Nvidia Titan; I’ve been playing their game for a while now, but they went too far with this one.

      • Krogoth
      • 11 months ago

      The 2080 Ti and 2080 only make sense if you want 4K/HDR gaming, or want the best while budget is not a concern.

        • End User
        • 11 months ago

        You obviously don’t play VR games 😛 (yes, I know, it’s a niche). I’ll take all the GPU power I can get for VR thank you very much.

        I also game at 2560×1440 using a 144Hz G-Sync display and I feel the 2080 Ti is perfect for my needs.

          Maybe I’ll jump to 4K when the 3080 Ti comes out.

          • ptsant
          • 11 months ago

          It’s a bit overkill for 1440p, isn’t it? VR is another story, of course.

            • End User
            • 11 months ago

            Not in Battlefield V 🙂

        • ptsant
        • 11 months ago

        HDR should not increase GPU requirements. Maybe you meant VR?

      • End User
      • 11 months ago

      Reality bites.

        • drfish
        • 11 months ago

        Virtual reality bytes.

          • Neutronbeam
          • 11 months ago

          You are back to form–huzzah for you sir!

          • End User
          • 11 months ago

          iRacing in VR is magnificent. Put that in your pipe and smoke it.

          • Peter.Parker
          • 11 months ago

          Virtual reality *virtually* bites

      • Chrispy_
      • 11 months ago

      Not only are the cards woefully expensive, but RTX features are really underwhelming. Even if you ignore the price and performance drop, you’re just exchanging one type of “wrong” rendering for another type of “wrong” rendering.

      I see glaring issues with the RTX renders, such as diffuse reflections in glossy surfaces, and simulations of smooth water and surfaces when they should be rippling in the wind or distorted by bumps.

      Traditional ‘fake’ screen-space reflections and specular highlights can include bump mapping and distorted reflections on uneven surfaces, as well as simulating the effect of ripples/wind on the surface of water.

      RTX features poorly emulate a perfectly-smooth brushed metal look and it looks just as out of place when used in a game as the uncanny-valley artifacts of traditional ‘fake’ rendering (SSAO, SS reflections, specular highlights, blended static and dynamic shadowmaps).

        • BurntMyBacon
        • 11 months ago

        I will agree that this implementation leaves a bit to be desired. However, I’m not sure how much is due to RTX and how much is artistic interpretation.

        Reflection subtlety and contrast, for instance, are likely artistic interpretation. Where Jeff finds the subtlety of the RTX reflections appealing, having just walked past a few good puddles on the way in, I feel like real puddles produce somewhat harder-edged reflections than this implementation does. It’s almost as if they slightly smudged the camera lens and then reduced the saturation and/or contrast ever so slightly. That said, I still think the effect is more pleasing than the non-RTX reflections, as that effect is overly contrasted, leaving hard breaks between lights and darks that don’t exist in the image being reflected.

        Speckles and other noise artifacts are likely an issue inherent to RTX that may or may not be worked around. These do more to break the realism than the hard breaks between lights and darks do, because the nature of the artifact draws attention to itself.

        I’m not sure if the perfectly flat surface of the puddles is a side effect of RTX, the BFV engine, or just a design decision, but it is extremely obvious to me, particularly with leaves blowing everywhere. Perhaps it isn’t as obvious in the normal rendering mode because the moving leaves reflected in the puddles, however poorly, give the illusion of motion. With the decision not to reflect the leaves in the RTX rendering mode, though, the unwavering flat surface comes off as unnatural given the obvious presence of wind blowing the leaves around.

        What this demo does demonstrate effectively is that even the RTX 2080 Ti struggles to keep frame rates up in this scene. The current hybrid rendering mode in BFV is interesting but, for many, unconvincing. The extra work necessary to produce a more convincing render may not be doable until the next generation, and it certainly won’t be doable on mid-range hardware for a generation or two after that. I may look into RTX or a competing technology at that point, but until then, hybrid rendering will probably be relegated to extras and eye candy rather than baseline features (somewhat like PhysX).

    • thedosbox
    • 11 months ago

    [quote<]Despite the photorealism promised by ray tracing, RTX reflections aren't free of artifacts of their own.[/quote<]
    You should put up both videos without labelling which one is RTX and see if people can identify which is which :p

      • nanoflower
      • 11 months ago

      They can, but it’s going to take paying attention to really notice the difference. I’m not sure it would matter in the middle of a heated battle.

        • drfish
        • 11 months ago

        I made this comment on the last story, but once again I think chasing shadows would have more gameplay impact than reflections. Of course, good shadows, done traditionally, can already have a major impact. Check out [url=https://www.youtube.com/watch?v=TvJhszTU1G0&t=950<]this clip[/url<] from my favorite [i<]DayZ[/i<] streamer/tuber for an example. I can't believe people turn them off for performance boosts.

        • enixenigma
        • 11 months ago

        That’s pretty much where I’m at. I don’t think that Battlefield V (or any twitch shooter, really) is the best early showcase for this tech. I’m a BF junkie, and I know that I’d much rather have lower frametimes than prettier reflections when I’m playing.

          • ptsant
          • 11 months ago

          The BF series is quite reasonable in technical requirements. I am a fan of their engine. They manage to have both great visuals and good frametimes on relatively modest hardware.

          By the way, the best combination of smoothness/graphics was Doom 2016.

            • enixenigma
            • 11 months ago

            I agree. I’m a fan of Frostbite as well. I just wonder if the majority of players would feel that the gameplay-compromising hit to frametimes is justified by the better, more realistic reflections. I guess having the option is better than not having it?

            You didn’t like Wolfenstein 2?

            • ptsant
            • 11 months ago

            Haven’t yet bought WF 2. Waiting for a sale.

      • stefem
      • 11 months ago

      The difference in the reflections is plainly evident. I hadn’t seen the videos, but it’s night and day live; there’s no way anyone with a minimum of observational ability could fail to notice that they are different.
      I haven’t had much time to play, but the artifacts mentioned seem related to the leaves.

      Edit: Watched the videos. Even with YouTube compression, it’s pretty obvious which has RTX on and which doesn’t, but I may be a best-case example, as I know what to look for.

    • NTMBK
    • 11 months ago

    Given all the recent stories about reliability problems on the RTX cards, I don’t think it’s worth buying into this generation for these performance-killing effects. I’m going to stick with my Pascal card for a good long while.

      • thedosbox
      • 11 months ago

      You may want to read this before buying into the RTX outrage machine:

      [url<]https://www.gamersnexus.net/guides/3394-rtx-2080-ti-artifacting-failure-analysis-crashing-black-screens[/url<]
      TL;DR: many of the commonly cited theories around *why* the cards failed don't have much substance to them.

        • Action.de.Parsnip
        • 11 months ago

        TL;DR: the whys are disputed, but that it’s actually happening isn’t.

        [url<]https://hardforum.com/threads/rtx-2080-ti-fe-escapes-testing-by-dying-after-8-hours-h.1972149/[/url<]
        One poor sap on there has had three of them die.

      • Krogoth
      • 11 months ago

      These are QC issues from board partners, who went too skimpy on the VRMs on some 2080 Ti units. It’s the first-generation R9 290/GTX 1080 FE debacle all over again.

        • NTMBK
        • 11 months ago

        Except even Nvidia’s own Founders Edition cards are having artifacting problems. It’s not just the partner cards.

          • stefem
          • 11 months ago

          AIBs have said that the fault rate of 20-series cards is normal or even better than usual, and most of the reports come from cards with the reference PCB, whose analysis showed that high-quality components are employed.
          There are multiple causes behind the RTX 2080 Ti’s issues: the BSODs were caused by drivers rather than bad hardware, there were also problems with corrupted firmware that a reflash could fix, and NVIDIA said on its own forum that it had received a supply of bad components (ROM chips? RAM? some other part?). I think it’s the proverbial mountain that gave birth to a mouse; a bit of hysteria builds up when tech websites start reporting even single events (like the EVGA card that burned some components).

          • Krogoth
          • 11 months ago

          You know that they are rebranded PNY units?

            • psuedonymous
            • 11 months ago

            Only in the same way that iPhones are ‘rebranded Foxconn units’.

      • swaaye
      • 11 months ago

      I have had a 1070 and 1080 Ti need replacement. Both started to cause driver crashes. But that’s what warranties are for. They replaced the 1070 with a 1080 too!

    • ptsant
    • 11 months ago

    The game looks fantastic. RTX doesn’t add enormously to it. I was much more impressed by the leaves and the texture details than by the improvement in reflections.

    • jihadjoe
    • 11 months ago

    So the 2080 Ti seems to be a lock for RTX medium at 1080p. Even the 16.3-ms 99th-percentile frame time is bang on for 60 fps.
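
    For what it’s worth, the arithmetic behind that claim checks out. A quick back-of-envelope Python check (the 16.3-ms figure is the one cited above):

    [code<]
budget_ms = 1000.0 / 60.0   # 60 fps allows ~16.67 ms per frame
p99_ms = 16.3               # 99th-percentile frame time from the review
print(f"60 fps budget: {budget_ms:.2f} ms")              # 16.67 ms
print(f"16.3 ms equates to {1000.0 / p99_ms:.1f} fps")   # ~61.3 fps
print("within budget:", p99_ms <= budget_ms)             # True
    [/code<]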

    • Krogoth
    • 11 months ago

    Again, this reminds me of pixel and vertex shading back when they were introduced with the GeForce 3.

    The RTX 2xxx series is just trailblazing the tools while developers get a handle on them. RTX mode on 2xxx-series cards is only good for tech demos and die-hard videophiles. I honestly don’t expect any real use of it until second- and third-generation hardware.

      • ptsant
      • 11 months ago

      However, if the stagnation in performance per dollar persists, expect to pay $1500+ for hardware that can actually run RTX titles in the future. I hope there will be more competition soon.

      • BurntMyBacon
      • 11 months ago

      [quote="Krogoth"<]I honestly don't expect any real use of it until second- and third-generation hardware.[/quote<]
      I would not expect to see much traction until a usable option becomes available at mainstream price points. Until then, it'll probably be a lot like PhysX in the sense that games will not rely on or require it.

      • NovusBogus
      • 11 months ago

      Funny you should mention that; I was just thinking about the transition from texturing to shaders back in the day.

      As someone who both wants to buy a new GPU in the near future and has experience in graphics programming and various pro-level 3D modeling/rendering applications, I want to like what they’re doing here, but the price tag is way too high for something this experimental. Throw in the ongoing questions about performance and hardware build quality, and RTX v1.0 is really not very compelling. NV probably should have done this as a Titan-style halo product rather than sacrifice the x70/x80 customer base on the altar of technological hubris.

        • Spunjji
        • 11 months ago

        Now THAT is an idea that makes more sense.

    • derFunkenstein
    • 11 months ago

    Well that’s pretty cool. Nice work Jeff.

    Does the RTX hardware draw a lot of extra power while doing these ray-tracing effects?

      • BlackStar
      • 11 months ago

      It actually draws less, because the ray-tracing hardware bottlenecks the CUDA/stream processors.

        • derFunkenstein
        • 11 months ago

        That would be wild, because it means that even running full-out, the RTX hardware uses less power than whatever it’s blocking. I’d love to see the data.

      • Jeff Kampman
      • 11 months ago

      If the question is whether power draw increases with RTX on, the answer is no. Turing cards appear to dynamically re-allocate their power budget among functional units as needed depending on the workload, and in this case, it seems like 50% or so of it goes to the RT cores given the drop in traditional rasterization performance.

        • derFunkenstein
        • 11 months ago

        That was the question, thanks.

        • chuckula
        • 11 months ago

        And I think you just explained a major reason for the performance drop. It’s not that RTX automatically means slow; it’s that RTX requires down-tuning the rest of the chip.

          • derFunkenstein
          • 11 months ago

          It might be semantics, but what I want to know now is this: is the card down-clocking the rest of the chip to the point that everything else is too slow to keep up with the RTX unit, or is the RTX hardware holding the rest back? If the rest of the GPU is only making frames as fast as the ray-tracing hardware can process them, is it really down-tuning the rest of the chip?

            • Voldenuit
            • 11 months ago

            Since power consumption is down, my bet is on the RTX being the bottleneck.

            The rest of the chip is literally twiddling its thumbs on idle cycles while waiting on RTX.

            • derFunkenstein
            • 11 months ago

            That’s what I figure, too, and that’s why I asked that at the end.
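
            To make the bottleneck argument in this thread concrete, here is a toy Python model (the stage times are illustrative guesses, not measurements): if the slower of the raster and RT stages gates each frame, the shader cores sit partly idle whenever RT takes longer, which squares with the lower power draw Jeff observed.

            [code<]
def frame_stats(raster_ms, rt_ms):
    frame_ms = max(raster_ms, rt_ms)    # slowest stage gates the frame
    shader_util = raster_ms / frame_ms  # fraction of the frame the SMs work
    return frame_ms, shader_util

# RTX off: no RT stage, so the shader cores stay fully busy.
print(frame_stats(raster_ms=8.0, rt_ms=0.0))   # (8.0, 1.0)

# RTX on (hypothetical numbers): the RT stage takes longer, so the SMs
# idle ~40% of each frame, consistent with lower power at lower fps.
print(frame_stats(raster_ms=9.0, rt_ms=15.0))  # (15.0, 0.6)
            [/code<]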

    • End User
    • 11 months ago

    Thanks for testing @ 2560×1440.

    I was finally able to order an EVGA 11G-P4-2487-KR this past Friday direct from EVGA. I look forward to duplicating your experience whenever the card arrives.

      • Star Brood
      • 11 months ago

      That’s some heavy alphabet soup, yet none of those letters indicate anything of the current generation.

      Did a quick Google, and it’s a 2080 Ti. Whoever comes up with these product names is a lunatic.

        • End User
        • 11 months ago

        I was too lazy to type out EVGA GeForce RTX 2080 Ti FTW3 ULTRA GAMING.
