Nvidia’s RTX Super and AMD’s Radeon RX 5700 series graphics cards reviewed

Another look at Nvidia’s DLSS

The Aorus GeForce GTX 1080 Ti card used throughout this review is my own personal graphics card, one I’ve had for a while now. This review actually marks the first time I’ve had a GeForce RTX card in my own hands to play with. As a result, this is the first time I’ve gotten to spend much time fooling around with RTX—more on that in a bit—and it’s also the first time I’ve seen DLSS in action.

Nvidia likes to make dubious comparisons between TAA and DLSS. Source: NVIDIA

If you’re not familiar, then first, go read Jeff’s write-up on DLSS from last year. The AI-powered upscaling technology was created by Nvidia for … well, I’m not completely sure why it was created. If someone were to ask me for my personal opinion, I’d say it was probably an attempt to give some purpose to the tensor cores aboard Nvidia’s Turing-based GeForce GPUs. Let me elaborate on (or perhaps belabor) that point.

As Nvidia CEO Jensen Huang has stated, Nvidia is a “one-architecture company.” That means that the company develops and supports one architecture for all of its products at any given time. Nvidia now services a fairly wide range of markets with its graphics and compute products, but one market is by far both the most lucrative and the most demanding: high-performance computing (HPC).

Accordingly, Nvidia’s processors are largely designed with an eye toward what will best serve the HPC market and then adapted for other markets, like gaming graphics. Much of HPC these days is concerned with artificial intelligence and neural networks. As a result, the latest GeForce GPUs dedicate a significant amount of silicon (and thus, compute capability) to their AI-oriented tensor cores. These are not utilized in, or even useful for, typical gaming workloads.

It’s very likely, then, that someone at Nvidia was tasked with finding a way to make the tensor cores useful for games. This is pure conjecture on my part, but it’s not difficult to visualize the short mental hop from offline AI-powered image upscaling to what DLSS ultimately is: the same AI-powered upscaling, done in real time.

When Nvidia first introduced DLSS, the company described two versions of the technology: DLSS 2X and the “standard” DLSS. DLSS 2X is a lot closer to what its creators seem to have envisioned when they came up with the name: the game is rendered at its native resolution, and then the pre-trained neural network upscales those frames to a higher resolution.

Source: NVIDIA

That’s still upscaling, not super-sampling, but it’s a heck of a lot closer to super-sampling than anything extant DLSS implementations actually do. When you enable DLSS in Final Fantasy XV or Monster Hunter World, the game actually drops the rendering resolution from what you have selected—without telling you it is doing this—before applying the DLSS filter. To be clear, when we say “4K DLSS,” we’re not talking about a 3840×2160 image that has been upscaled—we’re talking about a lower-resolution image that has been upscaled to 3840×2160.
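
Some quick back-of-the-envelope math helps illustrate what that means for the GPU’s workload. The internal rendering resolution below is an assumption on my part, since the games never disclose it; but if “4K DLSS” renders internally at something like 2560×1440 before upscaling, the card is only shading around 44% of the pixels of a true 3840×2160 frame.

```python
# Back-of-the-envelope pixel math for "4K DLSS."
# The internal render resolution here is an assumption; the game doesn't report it.
native_4k = 3840 * 2160          # 8,294,400 pixels shaded per frame at true 4K
assumed_internal = 2560 * 1440   # 3,686,400 pixels if the game drops to 1440p internally
print(f"Share of native-4K pixels actually rendered: {assumed_internal / native_4k:.0%}")  # ~44%
```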

Jeff was frustrated last year by the fact that the only software he had with which to test DLSS was a couple of canned demos that didn’t play nice with our performance profiling tools. So, to see the performance impact of DLSS in a real game, I tested Monster Hunter World on the RTX 2070 Super card. I ran the game through the same benchmark as before at 2560×1440, at 3840×2160, and then at 3840×2160 with DLSS enabled.
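
For readers curious how runs like these get boiled down into the figures we quote, here is a minimal sketch, assuming a plain text log with one frame time in milliseconds per line and a hypothetical filename. It is illustrative only, not the actual capture tooling used for this review.

```python
# Minimal sketch: summarize a per-frame render-time log (one value in ms per line).
# Illustrative only; not the capture tooling actually used for this review.
import statistics

def summarize(path: str) -> None:
    with open(path) as f:
        frame_times = sorted(float(line) for line in f if line.strip())
    avg_fps = 1000.0 / statistics.mean(frame_times)
    p99 = frame_times[int(0.99 * (len(frame_times) - 1))]        # 99th-percentile frame time
    beyond_16_7 = sum(t - 16.7 for t in frame_times if t > 16.7)  # "time spent beyond 16.7 ms"
    print(f"avg FPS: {avg_fps:.1f}, 99th-percentile frame time: {p99:.1f} ms, "
          f"time beyond 16.7 ms: {beyond_16_7:.0f} ms")

# summarize("mhw_4k_dlss_frametimes.txt")  # hypothetical log file
```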

Certainly, setting DLSS to “On” improves the game’s performance. Make no bones about it: 4K with DLSS runs much better than without. The problem I have is that, as discussed above, 4K with DLSS isn’t really 4K. In fact, I feel the proper comparison is really to the lower resolution. Considered that way, DLSS actually has a seriously deleterious effect on performance.

I don’t know what the base resolution of the DLSS’d image is—because the game doesn’t tell me—but “4K with DLSS” certainly runs a lot worse than 2560×1440 without it. The problem is, at least to my eyes, it doesn’t look that much better, either. Presented below are three images taken in Monster Hunter World’s Research Base, a richly-detailed area with lots of complicated geometry.

(2560 x 1440 resolution, click for full-size)

(3840 x 2160 resolution, click for full-size)

(3840 x 2160 with DLSS, click for full-size)

I enthusiastically encourage you to download these images and display each one full-screen in a photo viewer like Irfanview or XnView, blowing up the smaller image as necessary. You don’t have to use a 4K monitor for this, but obviously that’s the best option. Looking carefully at the full-resolution shots, you can clearly see how the DLSS image has less aliasing than the 2560×1440 shot, yet it also muddies detail in distant areas even more than the lower-resolution image. There’s no comparison to be made to the native 4K image.
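
If you’d like to put even a crude number on the differences, something like the snippet below will do it. The filenames are hypothetical stand-ins for the screenshots above, and a plain per-pixel difference is a blunt instrument that says nothing about aliasing or shimmer, so treat this strictly as a rough sanity check.

```python
# Crude numerical comparison of two screenshots (hypothetical filenames).
# Requires Pillow (pip install Pillow); no substitute for eyeballing the shots yourself.
from PIL import Image, ImageChops
import statistics

native = Image.open("mhw_research_base_4k_native.png").convert("L")  # grayscale simplifies the math
dlss = Image.open("mhw_research_base_4k_dlss.png").convert("L")
dlss = dlss.resize(native.size)  # both should already be 3840x2160

diff = ImageChops.difference(native, dlss)  # per-pixel absolute difference
print(f"Mean absolute difference: {statistics.mean(diff.getdata()):.2f} / 255")
```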

Furthermore, DLSS looks bizarre in motion. I didn’t produce a video because the encoding on YouTube or another service would surely crush the critical details, but others have described the effect as being “like an oil painting,” and I find myself intuitively agreeing with this assessment. In motion, DLSS adds a strange “shimmering” that makes details shift and swirl on static surfaces. It’s subtle, and I might not even have noticed it if I hadn’t been looking for it, but either way, the final product looks nothing like “real” native 4K rendering.

I have a lot of reservations about DLSS. I don’t approve of the way it is marketed or the way it is implemented. The drop in resolution is evident to the user, yet not at all communicated by the software. Worse, I simply don’t think the effect is convincing, at least in Monster Hunter World. I do think that the technology is fascinating, and I applaud Nvidia’s ingenuity, but I don’t think it achieves what Nvidia wanted. I’ll probably just stick to the lower resolution without DLSS and enjoy more consistent performance.


Comments

Mr Bill:

Zak, excellent review. You always have interesting observations for each game and how the cards deal with the game. That in-the-second commentary is what brings the nerds to the yard.

willmore:

Did you crush that 5700? Or is that really how the shroud looks?

willmore:

Sorry, it was the XT model that looks crushed.

K-L-Waster:

That is in fact what the reference shroud looks like.

(As if peeps didn’t have enough reasons to hold out for 3rd party coolers…)

willmore:

I wouldn’t call it ugly, but I really don’t like the look.

Kretschmer:

What this review tells me is that my 1080Ti held up really, really well. I got 2070 Super performance for two extra years at a $150 premium.

Sure, no ray tracing, but I’d rather buy into that tech when it becomes better developed.

plonk420:

thank you SOOOO much for this review! Time Spent Beyond x ms / average frametime now is the first thing i look at with reviews

LoneWolf15:

One note I don’t think I recall being brought up (the lack of index drop-down in the pages listed makes me loath to go back and check): Everything I’ve seen says that if Navi is your card, do not buy a reference design. Some reference blowers are quite good (nVidia Pascal was; I was very happy with the noise level on two 1070 Founders Edition cards at full load, and a single one was whisper quiet). AMD’s blower is not. It is both loud and a poor cooler; Sapphire’s own dual-fan Pulse 5700XT runs twenty degrees cooler. Twenty degrees Celsius…

LoneWolf15:

I should add, this is not meant to denigrate AMD’s Navi. Just AMD’s reference cooler design.

I think the Sapphire cards (specifically, the Pulse 5700XT or Pulse 5700), at ten bucks more than the reference design, are competitive and worth looking into. But they may be hard to find at the moment.

https://www.newegg.com/sapphire-radeon-rx-5700-xt-100416p8gl/p/N82E16814202349?Description=sapphire%20pulse%205700xt&cm_re=sapphire_pulse_5700xt-_-14-202-349-_-Product

https://www.newegg.com/sapphire-radeon-rx-5700-100417p8gl/p/N82E16814202350?Description=sapphire%20pulse%205700&cm_re=sapphire_pulse_5700-_-14-202-350-_-Product

anotherengineer:

“For gamers like myself who use myriad monitors—I’m currently using five—”

got a matrox card? 😉

when are the $235 cards coming out?!?!

Mr Bill:

I have a Matrox G650. It’s a fabulous office card for multi-monitor but far too slow for gaming.

StuG:

I feel like a lot more cards should have been included in the conclusions graph (given that was already charted on previous reviews) so we could see where a larger range of cards would fall (even if the dots were marked as previous reviews or something).

Captain Ned:

If those cards were not benchmarked on the same rigs as used for this review, the comparison would not be univariate. Multivariate is what proper reviews try to avoid.

Yomom:

So sad that great content like this has to get fakked by this horrible horrible generic template.

DPete27:

Nvidia’s new Ultra Low Latency setting description may shed some light on AMD’s implementation:
https://www.nvidia.com/en-us/geforce/news/gamescom-2019-game-ready-driver/

By reducing maximum pre-rendered frames to 1, you go to just-in-time frame scheduling… Sure, that would improve latency, but it would also leave you susceptible to frame-time spikes if a frame takes a little longer than expected. I suspect that using VRR can reduce this effect, but it would still be interesting to test. How many pre-rendered frames is optimal?

DPete27:

Also of note:
“in DX12 and Vulkan titles, the game decides when to queue the frame”
would be nice to include in your review write-up.

Jesse:

I usually set mine to 2 globally in the Nvidia control panel – enough for double-buffered vsync if I want it, and much less latency in game engines that pre-render like 5 frames by default.

Jason Deford:

So… Are the Radeon Vega cards an evolutionary dead end?

I was disappointed to see the Nvidia 1080 ti card in testing, but not the Radeon RX Vega 64. If you’re making a generational comparison including the Nvidia 10-series and its follow-ons, I’d think you should include the Radeon Vega-series in comparison to the RX 5700-series. The RX 580 card shown in the comparison isn’t in the same price/performance range as the newer cards being benched.

Krogoth:

Vega will continue on as general compute solutions while RDNA will focus more on graphical prowess.

StuG:

This was exactly what I thought as well. No VII/64/56?

tfp:

I was wondering the same. That said, a check on Newegg shows that VII availability is very limited. Is AMD running into production issues with the VII?

Krogoth:

Radeon VII was a stopgap solution until Navi was ready. It was a way to clear out excessive Vega 20 stock that ate too much power for ISV customers.

Navi already bests Vega 20 at gaming performance when memory bandwidth isn’t a factor.

LoneWolf15:

Yup. Radeon VII owners are sadly being left high and dry.

It was a lousy buy even for the most die-hard AMD fans, and its short market time is pretty disappointing for anyone who bought one.

jihadjoe:

My guess is AMD doesn’t really want to make any more of the VII than is necessary. It’s relatively cheap for something that uses such expensive components and is built on an expensive process.

They’d rather 90% of those chips go into the MI50 accelerators.

Krogoth:

Yep, the Radeon VII is a much better general compute/content creation card than a gaming card. There’s nothing close to it at its price point; you have to spend a lot more if you want that performance in either market.

It was a steal for general compute hobbyists, like the Kepler-based Titans were back in the day.

Krogoth:

It is likely that Zak simply doesn’t have any Vega hardware on hand, and his test system is different from previous Vega benches, making an apples-to-apples comparison difficult at best.

If you want a ballpark figure just take 5700XT results and reduce them by like ~2-10% to get Vega 64 stock/tuned Vega 56 results.

Colton Westrate (Editor):

^ This. Zak was working with what he had, or in some cases, what he could borrow for a couple days.

Jason DeFord:

“If you want a ballpark figure just take 5700XT results and reduce them by like ~2-10% to get Vega 64 stock/tuned Vega 56 results.” I think you’re over-simplifying the comparison. I still think seeing the Vega GPUs on the ‘scatter charts’ would be valuable data points. In addition, there is the price dimension that needs to be accounted for. Right now, an ASRock Phantom Gaming X Radeon RX Vega 56 can be found for US$270, while an ASRock Radeon RX 5700 goes for US$330 @ TheEgg. Saving ~20% of GPU cost for a difference of “~2-10%” in performance is worth considering.…

Krogoth:

Vega 56 is a bit of a wildcard because it is highly dependent on how well you can undervolt the unit, unlike the 5700 and 5700XT, which can operate at their performance levels without too much hassle. Vega 64 only pulls ahead if you are brave enough to overclock/undervolt to its limits and are willing to tolerate the power consumption.

Vega units are decisively better if you care more about general compute performance.

Oliv:

Completely agree, especially since the model used was the 4GB version. One of these things is not like the other.

juzz86:

Oh Zak, you’ve done it again mate. I know any of our staff who were tasked with carrying the site’s major drawcard articles would give it every bit of justice you had – as you all do with your own posts. But to see what we all crave seeing on the site hold the same format, same prose, same detail as it always has – means an awful lot to a sentimental fella like me. [Site] Formatting and [staff] introductory niggles around the ownership change aside, I’m heartened to see the stalwart content keep coming (Fish, Bruno, Ben, Josh, Nath)…

unknown-error:

The “all-white” background is going to take a lot more getting used to. On my desktop, sorry to say, it looks really amateurish. Since there is no drop-down menu with the relevant page titles, it would help us a lot if you put the “Page listing:” at the bottom of each page, so we can skip to pages that interest us. What I do now is open the page with the “Page listing:” in one tab and open the pages I like in another tab.

The review itself is great as usual, and thanks a lot for that.

John:

On some pages you mention “Jeff’s write-up” and link to another article, but that article has a different author name, without “Jeff” anywhere. You should correct it to avoid confusing readers (yes, I know Jeff wanted his name removed from TR). Also, it is disappointing you did not use some other DX12 games like Metro Exodus. It is a perfect game to test performance with RTX enabled and also useful for comparing GPUs without RTX effects. It is also disappointing that you did not select some popular MMORPGs for benchmarks, something like Black Desert Online or FFXIV…

Fonbu:

Thank you, Tech Report staff, for making this possible.
Many of us, I am sure, have waited for this review. It was smart of the Tech Report to wait for all the GeForce Super cards and the new Radeons to arrive and showcase them against each other; that was the most productive choice.
I like how all the new driver features of the products were showcased.
Was this Zak’s first major video card review? It was well done.

Sam:

Adding a page title listing to the content of the first page would help a lot, since we lost the convenient dropdown box after the refresh. Some people just want to see benchmarks for a certain game and don’t want to click through 14 pages to find that.

chuckula:

Thanks for the review!

As for the product, well, let’s say that 7nm has allowed AMD to avoid some of the worst issues with previous cards essentially being overclocked out of the box. But given how much guff Nvidia has gotten for dedicating silicon to RTX, it’s also pretty telling that their 16nm parts (the 2060 Super in particular) are still competitive even when you never turn RTX on and even when you look at power consumption.

I think this calls for some market disruption by a third party (and of course I mean S3!)

Krogoth:

No, a dark shadow from the distant past will emerge from its slumber…

Bitboys

chuckula:

Bitboys??!!

Oy!!

LoneWolf15:

I upvoted a Chuckula post. Demons must be shivering in hell as we speak.

Waco:

It makes me wonder just how much of Nvidia is currently propped up by datacenter sales – their die sizes compared to AMD are monstrous.

Also, Nvidia is at 12 nm, not 16.

chuckula:

Die sizes are irrelevant since AMD is clearly paying a fortune for 7nm silicon or else they would have launched these chips for a small fraction of the price of the RTX parts to grab market share. Furthermore, TSMC’s “12nm” is the 16nm process with a couple of tweaks and a marketing name change.

As for the data center, you should have paid attention to Nvidia’s most recent earnings beat, where the data center was actually down a good bit but overall results beat the street, and, unlike AMD, Nvidia just raised its outlook for the rest of 2019.

Neutronbeam:

That is one hell of a review, Zak! Excellent work; well done!

Krogoth:

It was worth the wait.

Waco:

It’s good to see essentially price parity on the XT / 2060 Super. That’s good news for anyone in the $400-and-below market. The standard 5700 looks to stand on its own pretty handily between the 1660 Ti and 2060 Super; it’s slightly better value than either if you look at 99th-percentile FPS per dollar.

Frenchy2k1:

TR left out the RTX 2060, which will continue at $350 and is in direct competition with the AMD 5700.
The other “SUPER” cards supersede the previous models (the RTX 2070 SUPER completely replaces the original 2070), but the original 2060 will continue.
It seems TR could not source one and hence did not include it in the graph.
