Nvidia’s RTX Super and AMD’s Radeon RX 5700 series graphics cards reviewed

Doom (Vulkan)

I can’t wait, because id Software is about to drop Doom Eternal, the sequel to 2016’s surprisingly faithful Doom reboot. Unfortunately, it’s not here yet, so we have to make do with the original for benchmarking. This game is unbelievably well-optimized, and it shows in our results. We tested by running a set path through the beginning of the “Foundry” stage in Arcade Mode on Ultra-Nightmare difficulty using the following settings. As this test involves actual combat, there is a bit of variability in the results, but it’s also representative of real-world game performance.

(click for full settings)

Doom is the only game in our testing set that uses the Vulkan API. As we’ve seen before, Nvidia’s Turing architecture absolutely loves Vulkan. The RTX 2080 Super turns in a handy win over the big Pascal chip in the GTX 1080 Ti, and the RTX 2070 Super is nipping at its heels despite being smaller and weaker on paper. If ever you needed evidence to prove to someone that Turing is a lot more than “Pascal-plus-RTX,” here it is.

For their part, the Radeon RX 5700 cards turn in a very solid performance. I suspect this game is tightly tuned for the older GCN architecture, and while Navi can run those code paths just fine, doing so takes little to no advantage of RDNA’s enhancements. As a result, we suspect Navi could do even better in this engine with a little tuning. Do take note of those 99th-percentile frame times, and check out the extremely skinny frame-time plots for both Navi cards. Not only is their performance competitive, it’s extremely consistent, too.

All of the faster cards in this test put up astonishing performances. Here’s hoping that Doom Eternal is much more demanding than its predecessor, because Doom ’16 is quickly losing its utility as a benchmark. Even the humble GTX 1660 Ti put up an average frame rate over 60 FPS in our 4K test. The old Polaris-based RX 580 does particularly poorly here, but I suspect it’s being limited by its 4GB of on-board memory, as I mentioned before. Dropping down to 2560×1440 makes the game eminently playable even on that card.

These “time spent beyond X” graphs are meant to show “badness,” defined as those instances where animation becomes less fluid. The formulas behind these graphs simply total up the amount of time each graphics card spends beyond certain frame-time thresholds, each with an important implication for gaming smoothness. To fully appreciate this data, recall that our graphics card tests generally consist of one-minute test runs, and that 1000 ms is one second.

The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure that if you’re not rendering any faster than 20 FPS, even for a moment, then you’re likely to perceive a slowdown. A frame interval of 33 ms corresponds to 30 FPS, or a 30-Hz refresh rate. Go below that with vsync on and you’re into the bad voodoo of quantization slowdowns. 16.7 ms corresponds to 60 FPS, the golden mark that we’d like to achieve or surpass for each and every frame.

In less-demanding titles, or when testing powerful hardware as we are today, it’s useful to look at our strictest graphs. 11.11 ms matches up to 90 FPS, a common refresh rate for ultra-wide gaming monitors. Approximately 8.3 ms is the frame interval at 120 FPS, the lower end of what we’d consider a high-refresh-rate monitor, while most high-refresh-rate displays now run at 144 Hz, giving a frame interval of 6.94 ms.
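If you want to reproduce this metric from your own frame-time logs, here is a minimal sketch in Python. It is not our actual tooling; the function name and sample data are hypothetical, and it assumes per-frame render times recorded in milliseconds, with each over-threshold frame contributing only the portion of its time that exceeds the threshold.

```python
# A minimal sketch (not our actual tooling) of the "time spent beyond X" metric.
# Assumes a list of per-frame render times in milliseconds for a one-minute run.

# Thresholds used in these charts; 1000 ms / threshold gives the matching FPS:
# 50 ms -> 20 FPS, 33.3 ms -> 30 FPS, 16.7 ms -> 60 FPS,
# 11.11 ms -> 90 FPS, 8.3 ms -> 120 FPS, 6.94 ms -> 144 FPS.
THRESHOLDS_MS = [50.0, 33.3, 16.7, 11.11, 8.3, 6.94]

def time_spent_beyond(frame_times_ms, threshold_ms):
    """Total milliseconds spent past the threshold across the whole run.

    Each frame that takes longer than the threshold contributes only the
    portion of its render time that exceeds the threshold.
    """
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

if __name__ == "__main__":
    # Hypothetical run: mostly ~7-ms frames with a few spikes thrown in.
    frame_times = [7.0] * 8000 + [18.0, 25.0, 55.0]
    for threshold in THRESHOLDS_MS:
        print(f"Time spent beyond {threshold:5.2f} ms: "
              f"{time_spent_beyond(frame_times, threshold):8.2f} ms")
```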

On this title, we didn’t even bother including the 33- or 50-ms charts because there was nothing to show. In fact, only a few cards had any time beyond 16.7 ms (under 60 FPS) and really only one card stayed there for long: the aged RX 580 4GB. You have to drop down to 11.11 ms to start seeing any of the newer cards struggle. Incredibly, the RTX 2080 Super spends less than two seconds under 120 FPS in our test run. That’s sort of absurd.

Comments (19 threads, 36 replies)
Mr Bill

Zak, excellent review. You always have interesting observations for each game and how the cards deal with the game. That in-the-second commentary is what brings the nerds to the yard.

willmore

Did you crush that 5700? Or is that really how the shroud looks?

willmore

Sorry, it was the XT model that looks crushed.

K-L-Waster

That is in fact what the reference shroud looks like.

(As if peeps didn’t have enough reasons to hold out for 3rd party coolers…)

willmore

I wouldn’t call it ugly, but I really don’t like the look.

Kretschmer

What this review tells me is that my 1080Ti held up really, really well. I got 2070 Super performance for two extra years at a $150 premium.

Sure no ray tracing, but I’d rather buy into that tech when it becomes better developed.

plonk420

thank you SOOOO much for this review! Time Spent Beyond x ms / average frametime now is the first thing i look at with reviews

LoneWolf15

One note I don’t think I recall being brought up (the lack of index drop-down in the pages listed makes me loath to go back and check): Everything I’ve seen says that if Navi is your card, do not buy a reference design. Some reference blowers are quite good (nVidia Pascal was; I was very happy with the noise level on two 1070 Founders Edition cards at full load, and a single one was whisper quiet). AMD’s blower is not. It is both loud and a poor cooler; Sapphire’s own dual-fan Pulse 5700XT runs twenty degrees cooler. Twenty degrees Celsius…

LoneWolf15

I should add, this is not meant to denigrate AMD’s Navi. Just AMD’s reference cooler design.

I think the Sapphire cards (specifically, the Pulse 5700XT or Pulse 5700), at ten bucks more than the reference design, are competitive and worth looking into. But they may be hard to find at the moment.

https://www.newegg.com/sapphire-radeon-rx-5700-xt-100416p8gl/p/N82E16814202349?Description=sapphire%20pulse%205700xt&cm_re=sapphire_pulse_5700xt-_-14-202-349-_-Product

https://www.newegg.com/sapphire-radeon-rx-5700-100417p8gl/p/N82E16814202350?Description=sapphire%20pulse%205700&cm_re=sapphire_pulse_5700-_-14-202-350-_-Product

anotherengineer

“For gamers like myself who use myriad monitors—I’m currently using five—”

got a matrox card? 😉

when are the $235 cards coming out?!?!

Mr Bill

I have a Matrox G650. It’s a fabulous office card for multi-monitor but far too slow for gaming.

StuG

I feel like a lot more cards should have been included in the conclusions graph (given that was already charted on previous reviews) so we could see where a larger range of cards would fall (even if the dots were marked as previous reviews or something).

Captain Ned

If those cards were not benchmarked on the same rigs as used for this review, the comparison would not be univariate. Multivariate is what proper reviews try to avoid.

Yomom

So sad that great content like this has to get fakked by this horrible horrible generic template.

DPete27

Nvidia’s new Ultra Low Latency setting description may shed some light on AMD’s implementation:
https://www.nvidia.com/en-us/geforce/news/gamescom-2019-game-ready-driver/

By reducing maximum pre-rendered frames to 1, you go to just-in-time frame scheduling… Sure, that would improve latency, but it would also leave you susceptible to frame-time spikes if a frame takes a little longer than expected. I suspect that using VRR can reduce this effect, but it would still be interesting to test. How many pre-rendered frames is optimal?

DPete27

Also of note:
“in DX12 and Vulkan titles, the game decides when to queue the frame”
would be nice to include in your review write-up.

Jesse

I usually set mine to 2 globally in the Nvidia control panel – enough for double-buffered vsync if I want it, and much less latency in game engines that pre-render like 5 frames by default.

Jason Deford

So… Are the Radeon Vega cards an evolutionary dead end?

I was disappointed to see the Nvidia 1080 ti card in testing, but not the Radeon RX Vega 64. If you’re making a generational comparison including the Nvidia 10-series and its follow-ons, I’d think you should include the Radeon Vega-series in comparison to the RX 5700-series. The RX 580 card shown in the comparison isn’t in the same price/performance range as the newer cards being benched.

Krogoth

Vega will continue on as general compute solutions while RDNA will focus more on graphical prowess.

StuG

This was exactly what I thought as well. No VII/64/56?

tfp

I was wondering the same. That said, a check on Newegg shows that VII availability is very limited. Is AMD running into production issues with the VII?

Krogoth

Radeon VII was a stopgap solution until Navi was ready. It was a way to clear out excessive Vega 20 stock that ate too much power for ISV customers.

Navi already bests Vega 20 at gaming performance when memory bandwidth isn’t a factor.

LoneWolf15

Yup. Radeon VII owners are sadly being left high and dry.

It was a lousy buy even for the most die-hard AMD fans, and its short market time is pretty disappointing for anyone who bought one.

jihadjoe

My guess is AMD doesn’t really want to make any more of the VII than is necessary. It’s relatively cheap for something that uses such expensive components and is built on an expensive process.

They’d rather 90% of those chips go into the MI50 accelerators.

Krogoth

Yep, the Radeon VII is a much better general compute/content creation card than a gaming card. There’s nothing close to it at its price point; you have to spend a lot more if you want that performance in either market.

It was a steal for general-compute hobbyists, like the Kepler-based Titans were back in the day.

Krogoth

It is likely that Zak simply doesn’t have any Vega hardware on hand, and his test system is different from previous Vega benches, making an apples-to-apples comparison difficult at best.

If you want a ballpark figure just take 5700XT results and reduce them by like ~2-10% to get Vega 64 stock/tuned Vega 56 results.

Colton Westrate (Editor)

^ This. Zak was working with what he had, or in some cases, what he could borrow for a couple days.

Jason DeFord

“If you want a ballpark figure just take 5700XT results and reduce them by like ~2-10% to get Vega 64 stock/tuned Vega 56 results.” I think you’re over-simplifying the comparison. I still think seeing the Vega GPUs on the ‘scatter charts’ would be valuable data points. In addition, there is the price dimension that needs to be accounted for. Right now, an ASRock Phantom Gaming X Radeon RX Vega 56 can be found for US$270, while an ASRock Radeon RX 5700 goes for US$330 @ TheEgg. Saving ~20% of GPU cost for a difference of “~2-10%” in performance is worth considering…

Krogoth

Vega 56 is a bit of a wildcard because it is highly dependent on how well you can undervolt the unit, unlike the 5700 and 5700XT, which can operate at their performance levels without too much hassle. Vega 64 only pulls ahead if you are brave enough to overclock/undervolt to its limits and are willing to tolerate the power consumption.

Vega units are decisively better if you care more about general compute performance.

Oliv

Completely agree, especially since the model used was the 4GB version. One of these things is not like the other.

juzz86

Oh Zak, you’ve done it again mate. I know any of our staff who were tasked with carrying the site’s major drawcard articles would give it every bit of justice you had – as you all do with your own posts. But to see what we all crave seeing on the site hold the same format, same prose, same detail as it always has – means an awful lot to a sentimental fella like me. [Site] Formatting and [staff] introductory niggles around the ownership change aside, I’m heartened to see the stalwart content keep coming (Fish, Bruno, Ben, Josh, Nath)…

unknown-error

The “all-white” background is going to take a lot more getting used to. On my desktop, sorry to say, it looks really amateurish. Since there is no drop-down menu with the relevant page titles, it would help us a lot if you put the “Page listing:” at the bottom of each page, so we can skip to pages that interest us. What I do now is open the page with the “Page listing:” in one tab and open the pages I like in another tab.

The review itself is great as usual, and thanks a lot for that.

John

On some pages you mention “Jeff’s write-up” and link to another article, but that article has a different author name, without “Jeff” anywhere. You should correct it to not confuse the readers (yes, I know Jeff wanted his name removed from TR). Also, it is disappointing you did not use some other DX12 games like Metro Exodus. It is a perfect game to test performance with RTX enabled and also useful for comparing GPUs without RTX effects. It is also disappointing that you did not select some popular MMORPGs for benchmark, something like Black Desert Online or FFXIV…

Fonbu

Thank you, Tech Report staff, for making this possible.
Many of us, I am sure, have waited for this review. And it was smart of the Tech Report to wait for all the GeForce Super cards and the new Radeons to arrive and showcase them against each other, that being the most productive choice.
I like how all the new driver features of the products were showcased.
Was this Zak’s first major video card review? It was well done.

Sam

Adding a page title listing to the content of the first page would help a lot, since we lost the convenient dropdown box after the refresh. Some people just want to see benchmarks for a certain game and don’t want to click through 14 pages to find that.

chuckula

Thanks for the review!

As for the product, well, let’s say that 7nm has allowed AMD to avoid some of the worst issues with previous cards essentially being overclocked out of the box. But given how much guff Nvidia has gotten for dedicating silicon to RTX, it’s also pretty telling that their 16nm parts (the 2060 Super in particular) are still competitive even when you never turn RTX on and even when you look at power consumption.

I think this calls for some market disruption by a third party (and of course I mean S3!)

Krogoth

No, a dark shadow from the distant past will emerge from its slumber…

Bitboys

chuckula

Bitboys??!!

Oy!!

LoneWolf15

I upvoted a Chuckula post. Demons must be shivering in hell as we speak.

Waco

It makes me wonder just how much of Nvidia is currently propped up by datacenter sales – their die sizes compared to AMD are monstrous.

Also, Nvidia is at 12 nm, not 16.

chuckula

Die sizes are irrelevant since AMD is clearly paying a fortune for 7nm silicon or else they would have launched these chips for a small fraction of the price of the RTX parts to grab market share. Furthermore, TSMC’s “12nm” is the 16nm process with a couple of tweaks and a marketing name change.

As for the data center, you should have paid attention to Nvidia’s most recent earnings beat, where the data center was actually down a good bit but overall results beat the street and, unlike AMD, Nvidia just raised its outlook for the rest of 2019.

Neutronbeam

That is one hell of a review, Zak! Excellent work; well done!

Krogoth

It was worth the wait.

Waco

It’s good to see essentially price parity on the XT / 2060 Super. That’s good news for anyone in the $400-and-below market. The standard 5700 looks to stand on its own pretty handily between the 1660 Ti and 2060 Super; it’s a slightly better value than either if you look at 99th-percentile FPS per dollar.

Frenchy2k1

TR left out the RTX 2060, which will continue at $350 and is in direct competition with AMD’s 5700.
The other “SUPER” cards supersede the previous models (the RTX 2070 SUPER completely replaces the original 2070), but the original 2060 will continue.
It seems TR wasn’t sent one to review and hence did not include it in the graph.
