Revisiting the Radeon VII and RTX 2080 at 2560×1440

Our initial review of AMD’s brand-new Radeon VII graphics card relied on a punishing combo of a 4K resolution and high-dynamic-range output to bring our field of graphics cards to its knees, and folks, let me tell you: It is a glorious thing to experience smooth 4K gameplay with HDR on an OLED TV over a few days’ worth of gaming with the latest titles.

Our 4K results offer a robust idea of the relative performance offered by today’s graphics cards for the dollar, but I understand that some gamers really want cold, hard numbers for how many FPS they can expect from a given card and title at their preferred resolution. I’ll also concede that I do most of my gaming on 2560×1440, 144-Hz screens with variable-refresh-rate guts to begin with. While those panels might not provide the in-your-face color and brightness of an OLED TV, few things can, and I’ll just as happily take my gaming in high-refresh-rate, tear-free flavor as I will at a higher resolution and lower frame rates.

Certainly the most-requested missing piece from our Radeon VII review was 2560×1440 results, and it just so happened that we had a slightly older data set full of 2560×1440 captures from our recent review of Asus’ Strix RTX 2070 (which is well worth reading in its own right if you’re a 2560×1440 gamer). I don’t usually like commingling data sets produced with older and newer drivers, but given the distinct leap in performance from RTX 2070-class graphics cards to the RTX 2080 and Radeon VII, the single-digit performance differences afforded by most graphics driver updates probably won’t mess up our relative standings too much. The competition between the Radeon VII and the RTX 2080 is a lot closer on paper, though, so I retested the RTX 2080 with Nvidia’s latest drivers to keep things fair. Let’s keep this short and sweet.

Our testing methods

If you’re new to The Tech Report, we don’t benchmark games like most other sites on the web. Instead of throwing out a simple FPS average—a number that tells us only the broadest strokes of what it’s like to play a game on a particular graphics card—we go much deeper. We capture the amount of time it takes the graphics card to render each and every frame of animation before slicing and dicing those numbers with our own custom-built tools. We call this method Inside the Second, and we think it’s the industry standard for quantifying graphics performance. Accept no substitutes.

What’s more, we don’t typically rely on canned in-game benchmarks—routines that may not be representative of performance in actual gameplay—to gather our test data. Instead of clicking a button and getting a potentially misleading result from those pre-baked benches, we go through the laborious work of seeking out test scenarios that are typical of what one might actually encounter in a game. Thanks to our use of manual data-collection tools, we can go pretty much anywhere and test pretty much anything we want in a given title.

Most of the frame-time data you’ll see on the following pages were captured with OCAT, a software utility that uses data from the Event Tracing for Windows (ETW) API to tell us when critical events happen in the graphics pipeline. We perform each test run at least three times and take the median of those runs where applicable to arrive at a final result. Where OCAT didn’t suit our needs, we relied on the PresentMon utility.
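If you’d like to slice captures of your own in the same spirit, here’s a minimal sketch of the basic math in Python. It assumes an OCAT/PresentMon-style CSV with an MsBetweenPresents column; the column name, the file names, and the simple percentile math are assumptions and may not match our in-house tools exactly.

import csv
import statistics

def load_frame_times(path):
    # Per-frame render times in milliseconds from a capture CSV.
    with open(path, newline="") as f:
        return [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

def summarize(frame_times_ms):
    # Average FPS and 99th-percentile frame time for one test run.
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    p99_ms = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms))]
    return avg_fps, p99_ms

# Median of three runs, as described above (file names are hypothetical).
runs = [summarize(load_frame_times(f"run{i}.csv")) for i in (1, 2, 3)]
print("avg FPS:", statistics.median(r[0] for r in runs))
print("99th-percentile frame time (ms):", statistics.median(r[1] for r in runs))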

As ever, we did our best to deliver clean benchmark numbers. Our test system was configured like so:

Processor: Intel Core i9-9900K
Motherboard: MSI Z370 Gaming Pro Carbon
Chipset: Intel Z370
Memory size: 16 GB (2x 8 GB)
Memory type: G.Skill Flare X DDR4-3200
Memory timings: 14-14-14-34 2T
Storage: Samsung 960 Pro 512 GB NVMe SSD (OS), Corsair Force LE 960 GB SATA SSD (games)
Power supply: Seasonic Prime Platinum 1000 W
Operating system: Windows 10 Pro version 1809
Graphics card                                  Boost clock (specified)   Graphics driver version
EVGA GeForce GTX 1070 SC2 Gaming               1784 MHz                  GeForce Game Ready 417.35
Nvidia GeForce GTX 1070 Ti Founders Edition    1683 MHz                  GeForce Game Ready 417.35
Nvidia GeForce GTX 1080 Founders Edition       1733 MHz                  GeForce Game Ready 417.35
Nvidia GeForce GTX 1080 Ti Founders Edition    1582 MHz                  GeForce Game Ready 417.35
Gigabyte GeForce RTX 2070 Gaming OC 8G         1725 MHz                  GeForce Game Ready 417.35
Asus ROG Strix GeForce RTX 2070 O8G Gaming     1815 MHz                  GeForce Game Ready 417.35
Nvidia GeForce RTX 2080 Founders Edition       1800 MHz                  GeForce Game Ready 418.81
AMD Radeon RX Vega 56                          1471 MHz                  Radeon Software Adrenalin 2019 Edition 19.1.1
AMD Radeon RX Vega 64                          1546 MHz                  Radeon Software Adrenalin 2019 Edition 19.1.1
AMD Radeon VII                                 1750 MHz                  Radeon Software Adrenalin 2019 Edition 19.2.1 for AMD Radeon VII

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests. We tested each graphics card at a resolution of 2560×1440 and 144 Hz, unless otherwise noted.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Monster Hunter: World


Monster Hunter: World preyed upon the Radeon VII at 4K, but dropping the resolution back to 2560×1440 allows the Radeon to go from borderline unplayable frame rates to a perfectly enjoyable experience. The Radeon VII’s frame rates come in just short of 60 FPS on average, and frame times remain consistently low for AMD’s 7 nm baby across the board. The RTX 2080 comes a lot closer to delivering a near-perfect 60-FPS experience at 2560×1440, though, as evidenced not only by its average frame rate but also its 99th-percentile frame time.


These “time spent beyond X” graphs are meant to show “badness,” those instances where animation may be less than fluid—or at least less than perfect. The formulas behind these graphs add up the amount of time our graphics card spends beyond certain frame-time thresholds, each with an important implication for gaming smoothness. To fully appreciate this data, recall that our graphics card tests all consist of one-minute test runs and that 1000 ms equals one second.

The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you’re not rendering any faster than 20 FPS, even for a moment, then you’re likely to perceive a slowdown. 33 ms correlates to 30 FPS, or a 30-Hz refresh rate. Drop below that frame rate with vsync on, and you’re into the bad voodoo of quantization slowdowns. Also, 16.7 ms correlates to 60 FPS, that golden mark that we’d like to achieve (or surpass) for each and every frame.

In less demanding or better-optimized titles, it’s useful to look at our stricter thresholds: 11.1 ms corresponds to 90 FPS, and 8.3 ms corresponds to 120 FPS, the lower end of what we’d consider a high-refresh-rate monitor. We’ve recently begun including an even more demanding 6.94-ms mark that corresponds to the 144-Hz maximum rate typical of today’s high-refresh-rate gaming displays.
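For the curious, here’s a minimal sketch of that accounting in Python. It reuses the per-frame times from the earlier sketch and assumes the metric sums only the portion of each frame time past the cutoff; the file name is hypothetical.

# Frame-time thresholds and the frame rates they correspond to.
THRESHOLDS_MS = {
    "50 ms (20 FPS)": 50.0,
    "33.3 ms (30 FPS)": 33.3,
    "16.7 ms (60 FPS)": 16.7,
    "11.1 ms (90 FPS)": 11.1,
    "8.3 ms (120 FPS)": 8.3,
    "6.94 ms (144 Hz)": 6.94,
}

def time_spent_beyond(frame_times_ms, threshold_ms):
    # Total time (ms) spent working past the threshold, summed over offending frames.
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

frame_times_ms = load_frame_times("run1.csv")  # helper from the earlier sketch

for label, threshold in THRESHOLDS_MS.items():
    seconds = time_spent_beyond(frame_times_ms, threshold) / 1000.0
    print(f"time spent beyond {label}: {seconds:.2f} s")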

Our time-spent-beyond graphs let us put a point on just how close the RTX 2080 comes to delivering a perfectly smooth ride in this title. The RTX 2080 spends under two-tenths of a second on frames that take longer than 16.7 ms to render, while the Radeon VII spends nearly six seconds on such frames. Dropping back to 2560×1440 certainly renders Monster Hunter: World quite playable on the Radeon VII, and indeed, it’s the best AMD card around for playing this title at that resolution. Still, the Radeon VII isn’t beating out the GTX 1080 Ti for delivered smoothness in this title, much less the RTX 2080.

 

Hitman 2

Hitman 2 may not have the DirectX 12 rendering path of its predecessor, but it can still put the hurt on any modern graphics card. We cranked image quality settings to the max to fully flesh out the game’s world of assassination.


Despite the lack of a DirectX 12 rendering path in this title, the Radeon VII tracks the RTX 2080 well once we take HDR and 4K out of the picture. The AMD card’s average frame rate only slightly tails that of the RTX 2080, and the two cards’ 99th-percentile frame times are nearly identical.


Neither the Radeon VII nor the RTX 2080 puts any time up on the time-spent-beyond-16.7 ms board, so we have to flip over to our 11.1 ms and 8.3 ms thresholds to see any daylight between them. Even then, the Radeon VII spends just a hair longer on frames that take longer than 11.1 ms to finish than the RTX 2080 does, and just about a second longer than its Turing rival to wrap up frames that need more than 8.3 ms to come out of the oven. Either of these cards will make high-refresh-rate 2560×1440 gamers happy.

 

Far Cry 5


Despite flying Ryzen and Radeon colors in its splash screen, Far Cry 5 doesn’t hand a total victory to the red team. The differences in average frame rates between the GTX 1080 Ti, Radeon VII, and RTX 2080 are going to be near-invisible in this title, though, and the Radeon VII impressively turns in the best 99th-percentile frame time of the bunch.


At 16.7 ms and 11.1 ms, the Radeon VII and RTX 2080 put only the barest slivers of time on the board thanks to a few intermittent frame-time spikes. To really tease out differences between these cards, we have to check out the 8.3 ms threshold. There, the Radeon VII turns in a slight lead over the GTX 1080 Ti and spends only two-tenths of a second longer past 8.3 ms than the RTX 2080 does. We’ll call it a wash.

 

Forza Horizon 4

Forza Horizon 4 drops drivers into a lovingly rendered rendition of the English countryside. Consequently, it can be rather demanding on graphics hardware if you put the quality-settings pedal to the metal. That’s just what we did to see how our graphics cards perform in this title.


We know from our 4K testing that using MSAA on the Radeon VII can put a greater strain on its resources than the pixel-shader-powered FXAA does. Any hope that dialing back the resolution would ease that strain for the Radeon VII in this title gets dashed by our average-FPS and 99th-percentile frame-time results, though. It seems like we’re still running into some kind of ROP bottleneck.


The Radeon VII’s struggles with MSAA in Forza Horizon 4 become painfully evident in our time-spent-beyond-11.1 ms graph. The Radeon VII spends nearly six and a half seconds longer than the RTX 2080 does on frames that take longer than 11.1 ms to finish, and that results in noticeably rougher gameplay.

Since we know MSAA is a punishing technique for removing jaggies on the Radeon VII, I also did a quick test with FXAA enabled to see whether it turns any tables.

Choosing FXAA as our anti-aliasing method doesn’t change the standings between the RTX 2080 and the Radeon VII, but it does make life easier for both cards. Neither pixel-pusher puts up more than a second on tough frames that need longer than 11.1 ms to finish. Still, the Radeon VII can’t unseat the RTX 2080 even with its massive shader array doing the AA work, and at the 8.3 ms mark, the GeForce card spends about four fewer seconds of the 84-second built-in benchmark chewing on tough frames. That’s an improvement one can feel.

 

Assassin’s Creed Odyssey


The Radeon VII can pull alongside the GTX 1080 Ti in Assassin’s Creed Odyssey, another title that flies Ryzen and Radeon colors on its title screen. Our plot of 99th-percentile frame times puts the Vega 20 card closer to midpack for delivered gaming smoothness, though. The RTX 2080 remains our overall winner.


The Radeon VII’s spikiness in our frame-time graphs doesn’t cause it to put any time on the board at our most concerning thresholds for delivered smoothness, but it ultimately loses out to the RTX 2080 by spending just short of two seconds longer putting the finishing touches on frames that need longer than 16.7 ms to render. Even the two-year-old GTX 1080 Ti provides a smoother gaming experience by this measure.

 

Gears of War 4


I had high hopes for the Radeon VII in Gears of War 4 at 2560×1440 thanks to its DirectX 12 API and console roots. Even so, for some reason this Unreal Engine game really, really seems to like the Turing architecture at its most punishing settings. The RTX 2080 opens a wide lead on its Radeon competitor here, and it cuts 2 ms off the Radeon VII’s 99th-percentile frame time. That’s impressive delivered smoothness from the green team in this title.


The Radeon VII puts only half a second on the board for frames that take longer than 16.7 ms to finish, but a flip over to the 11.1 ms threshold puts a big exclamation point on Turing’s superiority in this title. The RTX 2080 cuts 6.4 seconds off the time that the Radeon VII puts on the board here, and again, that’s an improvement in gaming smoothness you can feel.

 

Battlefield V

On top of being the marquee title for Nvidia’s RTX effects so far, Battlefield V boasts a cutting-edge DirectX 12 renderer as part of EA’s Frostbite engine.


At 2560×1440, Battlefield V remains the sole win for the Radeon VII, and only then by a hair. Still, the Radeon outpaces the GeForce in both average FPS and delivered smoothness. Let’s see just how meaningful that result is with our time-spent-beyond graphs.


We have to click all the way over to the 8.3 ms threshold to see any time on the board from the Radeon VII or RTX 2080, and here, the Radeon VII shaves two seconds off of the RTX 2080’s time spent finishing tough frames. That’s a much-needed win for the Vega 20 card.

 

Conclusions

As always, we’ve created plots of price versus performance by taking the geometric mean of each card’s average FPS and 99th-percentile frame times in all of the games we tested and plotting that value against each card’s suggested price (if stock is no longer available) or against its retail price on Newegg (if stock is available). To make our higher-is-better approach work, we’ve converted the geometric mean of 99th-percentile frame times into frames per second.
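Here’s a minimal sketch of that scoring in Python. The helper names and the per-game numbers below are hypothetical and cover only three games for illustration; the real plots use our full data set.

import math

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

def overall_scores(per_game_results):
    # per_game_results maps game name -> (average FPS, 99th-percentile frame time in ms).
    avg_fps = geometric_mean([fps for fps, _ in per_game_results.values()])
    # Convert 99th-percentile frame times to FPS so that higher is better, then average.
    p99_fps = geometric_mean([1000.0 / p99_ms for _, p99_ms in per_game_results.values()])
    return avg_fps, p99_fps

# Hypothetical numbers, for illustration only.
card = {"Hitman 2": (118.0, 11.2), "Far Cry 5": (107.0, 12.6), "Forza Horizon 4": (96.0, 15.1)}
print(overall_scores(card))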


I hate to say I told you so, but it’s true: Our relative standings in 4K testing do generally hold when we lower resolution and remove HDR from the picture. The GeForce RTX 2080 Founders Edition still outpaces the Radeon VII at 2560×1440 and offers a smoother, quieter, less power-hungry gaming experience while doing it. Our results with Nvidia’s Founders Edition card might not make the RTX 2080 look like the best value, but custom-cooled 2080s are widely available for just a few bucks over Nvidia’s $699.99 suggested price for partner cards. The Gigabyte RTX 2080 Gaming OC 8G we have in the TR labs is both faster and quieter than the already fast and quiet Founders Edition. Given that the Gigabyte card is just $10 more, that’s bad news for the Radeon VII.

In turn, nothing I wrote in my original conclusion regarding the Radeon VII changes with a different resolution in play. AMD’s 7-nm baby may be a superior sub-$1000 card for those who need a broad range of computing power or large pools of VRAM that you don’t get on consumer Nvidia cards, but gamers will be happier with Turing options at the moment. That’s especially true at 2560×1440, where the Radeon VII’s monster pool of RAM likely won’t get exercised as hard as it is at 4K by games alone. Unless your demands for graphics cards extend beyond pure pixel-pushing, the RTX 2080 remains today’s best dollar-for-dollar choice for fluid and smooth gameplay at 2560×1440.

Comments closed
    • BorgOvermind
    • 8 months ago

    I do not agree with the conclusion of this article.

    The price difference is higher than the performance difference on average (on what was tested in the article). The difference is not justified only for one obscure car game.

    • danny e.
    • 8 months ago

    edit. too political. forgot about the end of the video.

    • Kretschmer
    • 8 months ago

    Two thoughts:
    (1) My 1080Ti has aged quite well, almost two years out.
    (2) The Vega GPUs aged more like “prison wine” than “fine wine.” Especially for those who paid the crypto-era premium.

      • K-L-Waster
      • 8 months ago

      I concur about point 1 — I’ve had the same experience with my 1080 TI.

      Not as convinced about point 2 — most people who report having Vegas haven’t had bad things to say about them that I recall.

        • designerfx
        • 8 months ago

        Not only this, but every card was crazy inflated. Not just vega GPU’s. So, I wouldn’t agree that something magically aged better or worse. If anything, you can still get a vega 64 today below the price of a 1080ti.

          • jihadjoe
          • 8 months ago

          lol It’s supposed to be below the price of a 1080ti!

      • BurntMyBacon
      • 8 months ago

      (1) Yes. My experience agrees with yours. My 1080Ti has aged surprisingly well.

      (2) Not with you here. Granted I didn’t get my Vega64 until later last year when the prices were dropping like mad, but relative to my 1080(not Ti) it seems to perform better than most of the original benchmarks showed. In a relative sense, I find the Vega64 beating the 1080 more often than not where early on IIRC it was the other way around. Granted the systems aren’t exactly equal, but I’m not sure how much the CPU plays into it in most of the games I’m comparing. The 1080 is paired with a Haswell era i7-4790K. The Vega64 is paired with a 1st gen Ryzen R7-1700X. The 1080Ti is sitting on a Skylake era i7-6700K.

      It seems to me that one of a few scenarios has occurred:
      (1) The Vega64 has in fact aged better than the 1080Ti, but it is hard to see given they are simply in different performance categories.

      (2) The 1080Ti has aged better than Vega64 which in turn has aged better than the 1080. This would mean that two cards from the same company, with the same drivers, both at the high end, and of the same base architecture aged significantly differently. Given that they are still two different pieces of silicon, it is possible if not extremely likely.

      (3) A little of both. I’m imagining that the 1080Ti has aged pretty similarly to a Vega64 and notably, but not vastly better than the 1080.

      That all said, my 1080 still performs admirably. Performance hasn’t suddenly regressed or anything “suspicious”.

      • Concupiscence
      • 8 months ago

      On #1, yeah, Nvidia’s recent parts have been solid. I sold my GTX 1070 Ti when a friend gave me a pair of GTX Titan X’s, but there wasn’t anything that card failed to do in my experience. Barring RTX unexpectedly becoming a must-have, I don’t foresee a 1080 Ti becoming an unattractive investment for a long time.

      I can’t really agree on #2. If you’re using Linux, recent kernel versions and improvements to Mesa and Vulkan have made them *very* capable, and their compute performance for OpenCL apps via ROCm is very high. And even in Windows they offer solid DirectX 12 and Vulkan support, and the DX11 speed isn’t the kind of thing you’d hide your head in the sand about. I wish they ran cooler, but that’s not something that can be patched out.

    • maroon1
    • 8 months ago

    RTX 2080 still winning in performance, power consumption, noise level

    People are criticizing ray tracing and DLSS. But even without these features, RTX 2080 is better. And it came out 5 months before the Radeon 7 did

      • Chrispy_
      • 8 months ago

      Now that Nvidia supports freesync there is very little point in buying an AMD GPU.

      As you said; Slower, hungrier, noisier, and lacking features that Nvidia has (regardless of how useful those features are). Radeon VII needs to come down in price to be even vaguely desirable against the already rather overpriced RTX models. It’s a lot to pay for GCN architecture that is stuck in 2012.

      Possibly Radeon VII is better as a workstation compute card, but nobody is testing it for that purpose since AMD have shouted that it’s a gaming card and bundle it with games.

    • wingless
    • 8 months ago

    I have a feeling that AMD’s performance issues are 100% driver related. The hardware is probably as fast or faster than the 2080, but the drivers make it lose. AMD’s driver team needs to have some investment put in them to recruit talent.

      • Spunjji
      • 8 months ago

      I don’t think that’s true. All indications are that GCN has some fundamental architectural limitations that prevent the hardware resources from being as well-utilised as anything Nvidia have put out since Maxwell.

        • Krogoth
        • 8 months ago

        It is geometry-limited at the pipeline’s front end. It can only do four triangles per pass since GCN 1.0. Nvidia moved way past this with Maxwell and newer.

        GCN’s main strength has been shading operations. AMD RTG attempted to harness its excessive shading prowess via software (primitive shaders, NGG+). It only made traction in some professional-tier 3D applications. AMD RTG eventually gave up (not enough fiscal resources) and left it to third-party developers to exploit which is highly unlikely due to Nvidia’s massive clout and mindshare.

          • Spunjji
          • 8 months ago

          It’s nice to get an occasional view on what the hell happened to Primitive Shaders. I find it funny that AMD have continued ATi’s track record of building in theoretically useful geometry-spewing widgets that never get used by developers.

    • rUmX
    • 8 months ago

    Don’t know if this was posted yet or not, but TR did do such a resolution and refresh rate poll about 2 years ago. We could use a new one for 2019.

    https://techreport.com/news/31542/poll-what-the-resolution-and-refresh-rate-of-your-gaming-monitor

    • euricog
    • 8 months ago

    That scatter plot is really missing a 2060, as it's selling pretty close to Vega 56's pricing (checked on EU zone). It suddenly became an option for me as it now supports my Freesync monitor and belongs to a much more realistic/sensible price range (even though the tier it belongs to is still more expensive than in previous generations).

    I understand that TR can't review every single GPU class for our gratuitous pleasure, but was wondering anyway if there are plans to review it?

      • euricog
      • 8 months ago

      bump

    • ermo
    • 8 months ago

    I just got my second Vega 64 to put in my “new” 32GB i7-4790k box. The first was bought new and this one was bought used at a decent price.

    The whole shebang is going to be hooked up to a 32″ 4K monitor (sadly no FreeSync) and the goal is to undervolt the GPU cores to 1050-1100 mV and underclock until the boost clocks are stable (currently I’m at -1%) while undervolting the RAM to 1000 mV and clocking it to 1080 MHz.

    We will see where I need to set the power limit, but the goal is to not exceed +15%, which is the stock Turbo setting limit. From experience, this will give me +10% performance in aggregate for the same power envelope.

    I’ve got my eye on a Pimax 5K+ headset (2x 2560×1440 @ 90Hz) for use with racing and flight sims. The reason I’m going with AMD is that I want to support their OSS driver strategy on Linux.

    In the end, I’ll have 16 GB of HBM2 RAM and 8192 shader cores and it’ll have cost about the same as a VII. It’ll also use substantially more power, but there you go — can’t have it all.

      • enixenigma
      • 8 months ago

      Are they reference cards? What are your observed GPU clocks during gaming using those settings?

      My reference V64 can normally hold around 1620MHz at 1020mV P6 voltage, but it also has a huge aftermarket cooler. I was just curious what you’ve seen with your cards.

        • ermo
        • 8 months ago

        One is reference (the oldest one, which I bought new at then current 1080 prices 18 months ago) and the other is a 4 months old PowerColor Red Devil with a triple slot, triple fan cooler that I got at a nice markdown due to the recent Vega 64 price cuts.

        I haven’t got around to actually gaming with the system yet, but the reference cooler Sapphire card looked like it was running at ~1450 MHz during the Unigine Superposition 4K-optimized benchmark as the secondary card (it used to run at 1530-50 as the primary card).

        The primary card with the bigger cooler ran at 1530-1550 MHz. I used the Unigine Valley CrossFireX profile. My score was 6550-ish from a single card and 11600-ish with two cards as I recall (I’m not on my Windows box right now, so can’t verify).

        The system is a 4790k@4.6 GHz turbo with DDR3-2133.

      • TheMonkeyKing
      • 8 months ago

      Geez, a second Vega? I’d like to have your paycheck…

        • ermo
        • 8 months ago

        Nah, I saved up for it.

    • chuckula
    • 8 months ago

    For all the whinging about resolutions… I just setup muh RX-560 running my 2560×1440 display at 144 Hz.

    Easiest plug-n-play setup ever thanks to Linux working great with both Intel & AMD open source drivers.

    Oh, and hardware acceleration of HEVC-encoded 4K video @ 60 Hz is working great too thanks to VAAPI.

    • maxxcool
    • 8 months ago

    “I hate to say I told you so” .. 🙂 did you *really* hate saying it /wink/

    • LostCat
    • 8 months ago

    Thanks Jeff! (I’m trying to stay out of the….whatever it is in the comments.)

      • Mr Bill
      • 8 months ago

      Discussing a “Resolution” you might say.

        • LostCat
        • 8 months ago

        I know we have some weird people here and I’m one of em but sometimes it’s just like….what?

    • rnalsation
    • 8 months ago

    Thank you for the 1440 follow-up. But when are we going to see new year’s resolution testing? Tens if not hundreds of millions of people worldwide use new year’s resolution, I just don’t see how 4k or 1440 even compare.

      • Wirko
      • 8 months ago

      The short half-life of new year’s resolutions makes testing very difficult, and it becomes next to impossible by February.

        • rnalsation
        • 8 months ago

        It seems like the true test of the new year’s resolution would be some time after the new year with incremental data points gathered throughout the year. Almost like the SSD endurance test.

          • K-L-Waster
          • 8 months ago

          Meh, a straight vertical line dropping to 0 by Jan 3rd isn’t very interesting…

    • ronch
    • 8 months ago

    I bet it’ll do better at 640×480.

    • tacitust
    • 8 months ago

    I’m not sure AMD cares that much about the shortfall in performance at this point. No doubt they knew before they announced the card it was going to be tough to match the RTX 2080 in terms of performance, noise, and power.

    I suspect, having left Nvidia owning the flagship tier for so long, they felt increasing pressure to get something out there to prove to their partners (console makers, card makers, etc.) they are still competing and capable of putting out a solution in the high performance space.

    With Navi still months away, and only targeting mid-tier solutions in the first round of products (according to a leaked roadmap), it could be another year before they have another bite at that cherry, and there’s no guarantee it will match what Nvidia has up their sleeve to counter it.

    So we should not undervalue the presence of the Radeon VII to AMD, even if they don’t make much money out of it. As a stop-gap, it’s worth more to them than the actual dollars they make from the sales.

      • Freon
      • 8 months ago

      > mid-tier

      > $699

      Oof.

        • RAGEPRO
        • 8 months ago

        Tacitust said Navi is targeting mid-tier solutions at first, not that the Radeon VII is mid-tier.

        • Anonymous Coward
        • 8 months ago

        Like a phone!

      • Kretschmer
      • 8 months ago

      I mean, AMD is unlikely to sell enough defective workstation GPUs to move the needle. This is just about keeping their mindshare alive.

    • bjm
    • 8 months ago

    These cards were not using the same drivers. You have to use the AMD drivers with the RTX 2080 to put these on equal footing. If you do this, you will see that the Radeon VII runs better on the AMD drivers than the RTX 2080 does. Nice trick TR, you don’t fool me.

      • chuckula
      • 8 months ago

      You shill!
      We need to be fair to everyone and only use the Intel drivers for both cards!

      I guarantee we’ll get the most consistent performance possible!
      As an added bonus, the review will be finished in record time since the benchmarks will complete in record time!

    • cegras
    • 8 months ago

    I think 99th percentile 8.33 ms is more relevant than 4K gaming, especially with the advent of Gsync, Freesync, and Free-G-Sync.

      • Waco
      • 8 months ago

      All three of those technologies reduce the impact of lower framerates as well.

        • cegras
        • 8 months ago

        Kinda. My old RX480 on BF1 all low settings achieved about 100 FPS average, but regularly dipped in stressful scenes. Even within the Freesync range, the low frame rate severely held me back. I would say that *-sync technologies help to promote high refresh gaming, but it’s really sustaining high fps which is the game changer.

          • Waco
          • 8 months ago

          Absolutely – but for tear-free low framerate compensation, all three work wonders in perceived smoothness. A new monitor is on my very short list after seeing how fluid even 30-40 FPS can look on a *sync monitor.

    • thedosbox
    • 8 months ago

    Still no measurement of Feline Purrs Second?

    • moose17145
    • 8 months ago

    I was mainly curious if dropping the resolution down to 2560×1440 would smooth out some of the frame spikes we saw at 4K. And in Far Cry 5, when comparing 4K to 2K, it does indeed seem to go a long ways to smoothing out the frame time plot.

    Obviously that is not the case in every game (and obviously not every game had spikiness), but it was something that I was curious about none the less, and now my questions have been answered.

    Thank You Very Much Jeff! I very much appreciate it!

    And I cannot help but think that if the Vega VII were priced about $100 or $200 lower that it would actually be putting quite a bit of hurt onto NVidia. But that $700 MSRP makes it quite a bit more of a hard sell. That being said, GamersNexus’ teardown and analysis of the card did reveal a rather robust and nice Power Delivery system that accounts for a decent chunk of the cards price (typical of reference AMD cards), and I am sure that Vega VII GPU and 4 HBM stacks cannot be cheap either. So there may very well just not be enough overhead for it to be feasible for them to sell it $100 dollars cheaper (let alone 200 cheaper). But if there is some headroom in there, it might actually do them some good to sell it a bit cheaper and get its gaming performance notched into a friendlier price bracket.

      • K-L-Waster
      • 8 months ago

      > That being said, GamersNexus' teardown and analysis of the card did reveal a rather robust and nice Power Delivery system that accounts for a decent chunk of the cards price (typical of reference AMD cards), and I am sure that Vega VII GPU and 4 HBM stacks cannot be cheap either. So there may very well just not be enough overhead for it to be feasible for them to sell it $100 dollars cheaper (let alone 200 cheaper).

      That's the consequence of repurposing a professional workstation card -- some of the components have an inherent price point that's higher than consumer cards normally have. AMD is likely pricing it where it is because to price it lower would end up costing them more than they can justify.

      There are already some stories that they're selling the card at a loss--not sure if those are true or just Interwebz rumorz, but even if they aren't true there probably isn't a lot of margin at this price point given the quantity of HBM and the power delivery system.

        • enixenigma
        • 8 months ago

        To be fair, AMD’s reference cards have been well-built for some time, barring fan/cooling choices. Power delivery, in particular, has been very good. If anything, the additional complexity from the wider memory interface and the increased memory in general, as well as the fact that it is a 7-nm chip, are the main cost drivers.

          • moose17145
          • 8 months ago

          From what it was sounding like, each Vega VII has over 100 dollars just in power regulation components alone. That’s a pretty significant chunk of the 700 dollar MSRP.

          I agree that the 7nm process, as well as the HBM Memory (that entire die+HBM package as a whole) are a seriously significant chunk of the expense of the card (likely the biggest expense by a far and wide margin), but even still… 100+ bucks just in VRM components is a pretty hefty and non-insignificant chunk of change.

          I was mainly alluding to the fact that (performance aside) the reference board seems to have a nice build quality and should last quite a while. So those purchasing this card should at least take comfort in knowing they (likely) have a well-built product that will last quite a while. But that also means there likely isn’t enough of a margin in the reference design to make it much cheaper. Which I am not sure how to feel about that. On the one hand I feel like they need to get the price of this card down 100 or 200 dollars to place it into friendlier territory. On the other hand I dislike the idea of a down-specced board just to get the price down because I do prefer quality products (who doesn’t?).

          In the GamersNexus video they show how AMD already removed 4 phases just to get the costs down as is (along with removing the BIOS switch). But man oh man if they had left all 14 power phases fully enabled on this card… how sweet would that have been? Sigh… the give and take of the real world…

          Maybe I am just trying to find the positives of this card because the comments section is so riddled with negativity over it. Is its gaming performance on par with the competition? Ehhh yea I will concede that could be better and that even despite NVidia’s pricing of the RTX cards they are sitting prettier than the Vega VII… but at least for those buying it, they seem to be getting something that was built with some seriously quality power delivery components.

      • psuedonymous
      • 8 months ago

      > That being said, GamersNexus' teardown and analysis of the card did reveal a rather robust and nice Power Delivery system that accounts for a decent chunk of the cards price

      The price of some extra beefy FETs and nicer power controller ICs is absolutely miniscule in comparison to the GPU die and packaging. Even with design factored in, it's a single-digit dollars premium over 'bare basic' VRM. The RTX series also have similarly overspecced VRM, also due to the workstation heritage (same as the Titan V).

      > typical of reference AMD cards

      *coughRX480cough*

      • ColeLT1
      • 8 months ago

      “And I cannot help but think that if the Vega VII were priced about $100 or $200 lower that it would actually be putting quite a bit of hurt onto NVidia.”

      I would agree with you there, except since they have so few to sell, may as well maximize profit as long as they are selling out.

        • moose17145
        • 8 months ago

        I agree. But that is typical of any new graphics card launch for the first few weeks / month.

        I was thinking long term once the initial “Oh a new shiney!” sales wear off.

          • ColeLT1
          • 8 months ago

          I’m under the impression that these are all a limited run of failed Radeon instinct MI50/60 and are not quite mass produced like typical graphics cards.

      • DPete27
      • 8 months ago

      $350 in HBM alone from what I’ve heard.

        • Spunjji
        • 8 months ago

        Great Scott!

          • K-L-Waster
          • 8 months ago

          Not sure what Damage has to do with HBM, but ok…

    • Ikepuska
    • 8 months ago

    Being able to see both sides, I chose not to comment on the 4k vs 1440 argument earlier.

    However, having said that I wish to point out something, that applies to me and how I read reviews, which may be idiosyncratic or may be more common than I know. I read reviews for two reasons. The first is to identify, within a given budget, which card is best from a $/perf perspective. The other is that I like to use reviews as part of my whole system analysis. I often use TRs frame time analysis to decide if *I* could live with a given level of performance in a given scenario. It's terrific for that, because I often find that a given card may be 'marginal' up or down, and I can choose to live with that for savings put towards other nice to haves in a build depending on the type of games I'm into lately.

    I also build computers for friends and family, and they all have very different sensitivities to issues and the frametime graphs help me decide the best trade off for them. So for those scenarios I'm not looking strictly at the scatter plot, or relative rankings, I'm looking at the absolute performance of a given card for certain indicative games so that I can make a decision. Whether that makes me an oddball, or if others think the way I do I don't know.

    ETA: And with those 2cents put in, I'd like to thank Jeff for another terrific writeup!

    • Sputnik7
    • 8 months ago

    Perhaps we could have a new poll on the TR main page asking at what resolution and refresh rate do people game?

    I’m part of the 2K (1440p) / 144hZ/ G-sync gang, and have no interest in upgrading to 4k until I can get the same experience for only *marginally* more money (distance-to-screen vs. screen size vs. resolution arguments aside)

      • K-L-Waster
      • 8 months ago

      It does sound like a great front page poll topic.

        • Sputnik7
        • 8 months ago

        I don’t think it has to be extensive either:

        # of vertical pixels only

        1080p / 60hz
        1080p / >60hz
        1080p / g-sync
        1080p/ free-sync

        1440p (2k) / 60hz
        1440p (2k)/ >60hz
        1440p (2k)/ g-sync
        1440p (2k)/ free-sync

        2160p (4k) / 60hz
        2160p (4k) / > 60hz
        2160p (4k) / g-sync
        2160p (4k) / free-sync

        I would avoid HDR/ultra-wide, or put them as an “other” category

      • f0d
      • 8 months ago

      something like this?
      https://techreport.com/news/31542/poll-what-the-resolution-and-refresh-rate-of-your-gaming-monitor

      im part of the 1080p 144hz horde, i dont notice as much difference going up in resolution as much as i do refresh rate and i like to keep my pc budget low while being able to keep my fps high

        • Sputnik7
        • 8 months ago

        Yeah! I’d love to see a repeat of it to see how it’s changed over time, or if it has changed at all.

        We’re looking at a new generation of GPUs, as well as a proliferation of high-refresh-rate monitors and syncing tech.

          • f0d
          • 8 months ago

          i have a sneaking suspicion that it wouldnt have changed much
          most people keep their monitors for much longer than any other part of their computer and one year isnt a long time as far as monitors are concerned

          imo eye candy types usually keep pushing their resolution and people that prefer fps will keep pushing more hz

          id personally like to see how many people have or would be thinking about joining the ultrawide side of the table to be honest

          edit: oh shii thats almost 2 years…. time flies.!

            • Sputnik7
            • 8 months ago

            And it’s been a pretty busy 2 years, I’m curious as to how many people have changed their monitors thanks to various technology proliferations.

            • JustAnEngineer
            • 8 months ago

            At 1440p144, you can have both image quality and high refresh rate.

            • f0d
            • 8 months ago

            I actually prefer ultrawide screens now and would still rather 1080p ultrawide than going up to 1440p
            My budget can’t handle a 1440 ultrawide graphics card that can keep up the minimum frame times that I want

            That said even if I could afford a 2080/r7 I would still just stay at 1080p – it’s enough image quality for me

            • JustAnEngineer
            • 8 months ago

            31½” flat 2560×1440 VA panel with 48-144Hz FreeSync range and very low input lag: $350 today (https://www.newegg.com/Product/Product.aspx?Item=0JC-0081-00019).

            • f0d
            • 8 months ago

            yet the graphics card to drive it would be much more expensive than the monitor every time i buy a graphics card

            like most of us here i build many computers for other people and have seen a fair few 1440 and 4k monitors and i really dont see the justification of the added expense (of the graphics card every time) for something that (for me) makes very little difference when playing games

            if you can justify it then good for you 🙂

            different people prefer different things

            • K-L-Waster
            • 8 months ago

            How powerful a GPU do you need to drive 3840×1080 at high refresh rates? How much more powerful does it need to be to do 2560×1440 at high refresh rates?

            (I’m thinking they’d be pretty similar.)

            • Voldenuit
            • 8 months ago

            f0d says he’s gaming at 2560×1080. 2560×1440 is 33% more pixels and 3840×1080 is 50% more pixels (the more common 3440×1440 is +79%). Jumping to 4K would be 3x the number of pixels.
            I think any of those jumps are significant when performance deltas between GPU tiers is typically in the 20-40% per tier. Any of those jumps would require upgrading the GPU by one or more tiers to maintain current performance.

            EDIT: Granted, performance doesn’t scale linearly with resolution unless a game is fillrate bound, which doesn’t happen anymore. But the resolution isn’t a bad baseline for worst-case performance penalty.

            • f0d
            • 8 months ago

            also 2560×1440 isnt ultrawide is it?
            for that id need 3440×1440 or 3840×1600

            theres no way im going back to 16:9

            • JustAnEngineer
            • 8 months ago

            There’s no way I’m going back to only 1080 vertical pixels.

            • Mr Bill
            • 8 months ago

            Agree, I paid $350 for my 24″ BenQ 241VW rocking 1920×1200 and its still going strong since 2007.

            • Spunjji
            • 8 months ago

            Tree-fiddy well spent

      • Ifalna
      • 8 months ago

      I’m firmly entrenched in the 60Hz camp. Playing mostly slow games (MMOs) I didn’t notice anything when forcing 120Hz @ 1080p on my new 55″ TV.

      So I usually game @ 1440p or 4K if my GPU can handle it.

      @ 55″ you can definitely see an effect due to resolution but the upscaler does a nice job with 1440p material. Having 55″ of desktop space is pretty insane though. 😀

      • Sunburn74
      • 8 months ago

      Honestly still stuck at 1920×1080. Not really sure the extra pixels really make a difference

        • Waco
        • 8 months ago

        My advice – don’t try it. Once you do, the additional clarity in everything is pretty hard to jump back from.

          • f0d
          • 8 months ago

          for some people
          i have tried 4k tv’s and monitors and i just dont see the point of it personally

          im not going to be like doomguy and tell everyone its useless for everyone but just like most things it will have different impacts for different people
          just like how i could never go back to a 16:9 monitor and others dont see the point of 21:9 its different for different people

    • tipoo
    • 8 months ago

    baby MI50, doo doo doo doo,

      • Usacomp2k3
      • 8 months ago

      My daughter is obsessed with that. Amazon prime has a show pinkfang (sp?) that is very overplayed in our household.

    • DPete27
    • 8 months ago

    Jeff,
    I hate to be “that guy” in asking for more, but could you test even a couple titles at “High” instead of “Ultra” and report on the frame rate differences? Could just be a single GPU for all I care.

    Obviously, it’s in reviewers’ best interest to test all games on ULTRA, but it seems finding High settings data from reliable sources is a bit harder to come by.

      • synthtel2
      • 8 months ago

      I’ll second that. It’s easy to extrapolate to different resolutions, but when I play at medium and every review on the internet tends towards ultra, it’s tough to glean anything other than the relative stack-up from them (and even that could potentially shift quite a bit more than from resolution changes).

        • Spunjji
        • 8 months ago

        As someone who very rarely pushes a game’s settings above “high”, this appeals to me. Bearing in mind that I agree with you, though, I’ll offer a small counter-argument:

        The differences between each publisher’s definition of “medium”, “high” and “ultra” settings can make it utterly impossible to look at the results from one game and draw broader conclusions about how other games will perform with “similar” settings. Increases in resolution, on the other hand, tend to affect most games in a similar way (with some exceptions).

          • synthtel2
          • 8 months ago

          There will be a lot of variance from game to game, but the performance constraints vary in a pretty consistent way. At some particular level, it’ll all be about cramming as much eye candy as possible into a strict resolution/framerate target on consoles. At the level reviews tend to test:

          * A lot of the budget is likely to be eaten up by effects that don’t have to be as efficient. Image quality per development time is probably at least as important, but image quality per render time is a lot less important. If a game is slow on ultra and not slow on high, ultra has to be very slow indeed to provoke serious complaints, and it’s good for fancy screenshots regardless.

          * Games don’t have to be as consistent in the performance they do deliver. PC gamers can just turn settings down if they don’t like it, at least in theory, and many of us don’t use vsync, so the consequences for going a bit over budget are a lot less severe.

          * The hardware targets are different. A console would look pretty different from what the typical dude who habitually maxes everything owns even if you scaled everything to the same general power level.

          * Some parts of rendering just scale a lot better than others. If you’re doing old-school deferred shading with a fat G-buffer, odds are good that you’re generating that G-buffer the same way at all settings; nobody really wants to wrangle having two paths through that, and if it puts some harsh limits on performance at low settings, players will just have to deal with it. OTOH, something has gone terribly wrong if shadows in a modern game fail to scale. Review GPUs spend a lot more time on shadows and a lot less time on G-buffer generation than mine do.

          I expect some patterns might emerge pretty quickly, even if a lot of games don’t follow them.

    • Usacomp2k3
    • 8 months ago

    This puts the Radeon in a much more competitive light than the other review did.

      • chuckula
      • 8 months ago

      It definitely didn’t hurt the Vega 64!

    • enixenigma
    • 8 months ago

    Reading any of a multitude of other (objective) reviews on the web would’ve shown that changing the resolution (EDIT: to 1440; 1080 would be a CPU bottleneck wasteland) wasn’t exactly going to turn the competitive landscape on its head.

    That being said, I know that I mentally consider TR to be the final word for performance analysis. Thanks for going the extra mile to appease a small but very loud minority of the readership.

      • euricog
      • 8 months ago

      I agree it’s cool that TR puts the extra effort in, but don’t share the notion that sub-4K users are a minority. Has this metric been recently polled?

      Gaming in 4K makes zero sense to me (note: personal opinion). Need to spend so much more on GPU power to have less framerate with a barely noticeable increase in image quality. Seriously, how close do people shove their faces into the screen (while gaming) to even notice the pixel grid from 1440p upwards?

      Having used 4K screens at work (30″) and owning a 34″ ultra-wide (1440p), there’s no way in hell that I would choose a 4K over my ultra-wide. Less resource hungry, more immersive and excellent for productivity (emulates a side-by-side dual screen setup – without the annoying frame in-between).

      Again, this is my personal take, but it would be interesting to see where’s the majority of users at.

        • enixenigma
        • 8 months ago

        I did not mean to suggest that the majority of gamers play at 4k. That is obviously not true. (EDIT: I still haven’t made it to 2K myself) When I said ‘small but very vocal minority’, I more meant the people that were having a fit over the fact that 4k was the only resolution tested. Jeff explained his reasoning for doing so (to best highlight the performance differences between the cards tested), and I feel that it is an understandable position. Given limited testing time, I think he made the right choice. CPU bottlenecks can play a role in lower resolutions when dealing with ‘high-end’ cards. Is everyone going to agree? Of course not.

        More data is always better, and it isn’t wrong to desire or even request more data. I just think that there are better ways of going about it than some did.

          • Srsly_Bro
          • 8 months ago

          This guy gets it. So much of this site is outrage and tribalism. Best way to get down votes is to disagree with an author of a post. It’s like an antifa party on here, ready to be outraged at anyone who disagrees.

          Disagree with something

          Oh, you didn’t like it??

          No, I just wanted more resolutions yesterday.

          So you don’t like it. Wtf is wrong with it. Everyone likes it.

          I just went to see the resolution I and 80% of gamers use.

          $&#$& you for not liking this review. It’s perfect in every way.

          That’s the discourse on this site. Tribalism and absence of rational thought in many instances.

          Snowflake and outrage culture on a tech site.

            • YellaChicken
            • 8 months ago

            > This guy gets it. So much of this site is outrage and tribalism

            Pretty sure that's not what enigma said.

            > Snowflake and outrage culture on a tech site.

            Gotta love the irony of someone being so upset by downvotes that he goes round calling everyone else a snowflake.

            • K-L-Waster
            • 8 months ago

            Funny how those who protest the loudest about their right to free speech are also the first to resort to name calling as soon as anyone disagrees with them.

            • Srsly_Bro
            • 8 months ago

            Lol I’m not upset bro. Intrigued, actually. Disappointed, mostly.

            • Anonymous Coward
            • 8 months ago

            I’M NOT ANGRY, YOU’RE ANGRY.

            • K-L-Waster
            • 8 months ago

            STOP BEING ANGRY WHEN YOU’RE TELLING ME YOU’RE NOT ANGRY!!!!!!1!1!

            • Redocbew
            • 8 months ago

            I DON’T UNDERSTAND WHY IT NEEDS TO BE LIKE THIS.

            • auxy
            • 8 months ago

            I don’t know why I’m wasting my time with an obvious troll who should have been banned ages ago, but-

            You realize it’s possible to disagree with criticism without being a fanboy, right? (。-`ω-)

            You can say: “I want more resolutions” and someone else can say “no, I don’t think that’s necessary because [it won’t change the outcome /there’s no need for the reviewer to waste their time on that]” and that doesn’t make them a fanboy or “snowflake.”

            You get downvoted a lot because you say stupid and inflammatory things. I have never seen any comment chain on this entire site in all the years I have been here that proceeded in the way you’re describing, or even remotely close to it.

            For the record, I am even more disgusted at outrage culture than you are, and a vocal proponent of free speech. You’re barking up the wrong tree, though. There is no such “outrage and tribalism” here aside from a very few outliers such as yourself. If anything you’re the one engaging in “outrage culture” here!

          • euricog
          • 8 months ago

          Gotcha, my bad! I definitely misunderstood you.

          (wow, I took a while to return here and the thread quickly escalated!)

        • chuckula
        • 8 months ago

        The vast majority of gamers don’t use 4K and while 2K has good usage, it’s not a majority (or plurality) either. I’d put 1080p as the closest thing to a majority stake in resolutions.

        HOWEVER: Just because that's true doesn't mean TR was wrong to focus on 4K in a review of AMD's top-line $700 GPU. In this niche, 4K performance makes sense. I didn't see anybody criticizing AMD's own marketing department when it only produced hard numbers at 4K resolution settings either.

          • K-L-Waster
          • 8 months ago

          Analogy time: the vast majority of people don’t use or need DSLR cameras.

          A ridiculous number of photos these days are selfies.

          This does not mean that the most appropriate way to comparison test DSLRs is to see which one makes the best selfies…

            • Voldenuit
            • 8 months ago

            However, it would be just as ridiculous to review a phone and not test its selfie capabilities.

            Even though I don’t take selfies, a large proportion of people do, and their interests are relevant to the review.

            • Redocbew
            • 8 months ago

            I can’t decide if an attempt at creating an objective methodology for taking selfies would be amusing or just really depressing.

            • K-L-Waster
            • 8 months ago

            True enough. So, to continue the analogy would we agree on phone==mid-range GPUs, and DSLR==high-end GPUs?

        • shaq_mobile
        • 8 months ago

        I agree. I switched to 4k and my girlfriend left me. I think the only reasonable outcome is to use 2k as benchmark res.

        According to the almighty steam survey, 60% of gamers play at 1080p, 4% play on the correct resolution, and 1.5% are on 4k.

        There are more people on 1280×1024 than 4k. I think we all know what that says about 4k.

          • enixenigma
          • 8 months ago

          Who is buying a $700 GPU to game at 1280×1024??

            • shaq_mobile
            • 8 months ago

            A man with nothing left to lose.

            Liam Neeson is…
            Man on 5:4

            • Srsly_Bro
            • 8 months ago

            Has anyone seen the movie Ponyo? There is a trend of Liam Neeson losing his daughters. Started with Ponyo and then the Taken series, unless there are prior movies with the same.

            • K-L-Waster
            • 8 months ago

            Losing one daughter is a tragedy. Losing two sounds more like carelessness.

            Losing lots, well… this is starting to sound like an insurance scam.

            • enixenigma
            • 8 months ago

            I love this community. Never change, guys.

            • Redocbew
            • 8 months ago

            He did say that he’s got a very specific skill set.

        • JustAnEngineer
        • 8 months ago

        > Gaming in 4K makes zero sense to me. Seriously, how close do people shove their faces into the screen (while gaming) to even notice the pixel grid from 1440p upwards?

        You just need a larger display. 🙂

          • Krogoth
          • 8 months ago

          You need a 4K HDTV (40-50+ inches) or a projector, though, and even then you make a trade-off for refresh rate.

          Massive, high framerate 4K panels don’t exist yet.

            • Waco
            • 8 months ago

            Many 120 Hz 4K panels were shown off at CES this year. 🙂

            • Krogoth
            • 8 months ago

            Prototypes/demonstration units != something you can buy off the shelves today.

            • Waco
            • 8 months ago

            I’m just saying they exist – they’ll be for sale soon. I know I’m eyeing a large 4K 120Hz Freesync 2 display.

            • JustAnEngineer
            • 8 months ago

            The most extreme example arrives February 24, for folks into Big F Gaming:
            [url<]https://www.amazon.com/dp/B07M888GH8/[/url<]

            • derFunkenstein
            • 8 months ago

            That thing is super cool

          • Spunjji
          • 8 months ago

          Or keener eyes..!

          I don’t get the whole “my eyes can’t tell the difference” line of argumentation. You never see people with protanomaly showing up exclaiming that rendering in full colour is a waste of GPU power because “you can’t see the difference between red and green”.

      • blitzy
      • 8 months ago

      And yet there’s a fairly noticeable difference between the 1440P and 4K performance of this card as compared to the rest of the field: the performance delta is greater at 4K, and the card is more competitive at 1440P. Goes to show that the 1440P data is worth covering, as most people probably game nearer to 1440P. And if choosing a card to buy, it makes sense to look at how it will realistically perform, rather than how it performs theoretically.

      So while there are bottlenecks as would be expected, it was still useful to see, as it does impact the overall competitiveness of the cards.

    • not@home
    • 8 months ago

    When I was reading the original review, I thought 1440 would be a more relevant resolution to most TR readers, but figured it would probably scale to the same results. So, I did not comment on it. Well, here is proof of that.

    • Krogoth
    • 8 months ago

    It changes nothing. The scaling is almost the same as 4K gaming. 2070 and 2080 are still better buys for 2560×1440/2560×1600 gaming. 1080Ti holds its ground despite being almost two years old.

    Vega 64/56 are better buys if you want to stick with red team and are at least obtainable. Radeon VII is just a stop-gap until Navi finally hits public channels.

      • Voldenuit
      • 8 months ago

      Recent news reports that the [url=https://www.techpowerup.com/252476/amd-radeon-vii-has-no-uefi-support<]Radeon VII is not UEFI compatible[/url<], causing some Windows Secure Boot machines to lose their activation. This really sounds like it was rushed to market: not only the chip, but the card itself was originally intended for the MI50 and MI60 products targeting Linux servers and workstations.

        • chuckula
        • 8 months ago

        Ouch! That issue — not some game benchmark win/loss — needs to be addressed with the highest priority.

          • enixenigma
          • 8 months ago

          [url=https://www.fudzilla.com/news/graphics/48126-amd-radeon-vii-to-get-one-click-uefi-ready-bios-update<]It is.[/url<]

          • Voldenuit
          • 8 months ago

          I hear AMD is working on a BIOS update to add UEFI.

          EDIT: Ninja’ed.

        • Krogoth
        • 8 months ago

        Doesn’t surprise me at all. They are probably unsold Instinct MI50s that got rebranded with new stickers and packaging. AMD and its card partners didn’t even bother to flash them at the factory with a proper UEFI-ready BIOS.

      • dragontamer5788
      • 8 months ago

      [quote<]Vega 64/56 are better buys if you want to stick with red team and are at least obtainable.[/quote<]

      It's true of NVidia too, though. The 2060 is simply a better buy on the Green Team than the 70 / 80 / 80 Ti models.

      Vega 56 seems mostly sold out right now... EDIT: or at least, not at good prices. I'm seeing decent prices on Vega 64 this week. Recommendations are very much on a "market price" basis, and they change very quickly.

    • chuckula
    • 8 months ago

    Thanks [again!] Jeff.

    A few points:

    1. This doesn’t change any major conclusions.
    2. I guess the VII is a little better at 2K relative to the 1080Ti compared to the original 4K results.
    3. While the price-performance scatter plot is accurate, it’s certainly worth noting that there are less expensive (and potentially better performing) RTX-2080 variants on sale now, so it’s not a requirement to spend $100 more on the RTX 2080.

      • derFunkenstein
      • 8 months ago

      #3 gets its own paragraph in the conclusions right below the graph. 😉

      • gerryg
      • 8 months ago

      The price will change for R-VIIs too. The RTX has had a little more time on the market, but the R-VII will undoubtedly change in price as well. The price/performance plot would literally change every day (or at least every week) from here into eternity if it were kept up to date. It is only useful to talk about performance and what you would be willing to pay compared to the next-closest products as they are currently priced. And even performance will change periodically as both companies improve their drivers and developers learn to tune things for new hardware.

      At the time of the review, the price/performance plot clearly shows a linear relationship, and if market forces play out I suspect R-VII will, in general, remain in between 2070 and 2080 in both performance and price throughout the year.

      This doesn’t even take into account the value-adds like game bundles, helper apps, ray-tracing, extra memory (for future use… ray-tracing? VR?), total cost of VRR/LFC with a monitor, etc. Jeff & co. try to explain that at times and people should take it into account, but sometimes they fall prey to that thinking as well. It’s easy to try to boil everything down to a single number or relationship between two numbers and assume nothing else factors in and nothing will ever change.

    • derFunkenstein
    • 8 months ago

    Great, now Doom64Guy thinks if he complains loudly enough, TR will write an article just for him. 😆

      • chuckula
      • 8 months ago

      People think I’m annoying but I have nowhere near that level of power!

        • Srsly_Bro
        • 8 months ago

        I brought gtaV back. You just gotta take a different approach.

      • drfish
      • 8 months ago

      Remember today, and be ready to share this tale in the future. We act now so we don’t have to act again later.

        • chuckula
        • 8 months ago

        SO SAY WE ALL!

      • Srsly_Bro
      • 8 months ago

      Tbf, I made a few comments when gtaV was taken out of the tested games while it was still the best-selling game. The game was eventually added back. Jeff is smart enough to see the logical connection.

      TR is the only site that I know of that only tests one resolution, and one that few game at. Don’t pick on the guy; he has a point in all of his ranting.

      Wouldn’t it make sense to give a review on a resolution the target audience uses and can relate to in tests? Even he (Jeff) admitted to gaming more at 1440P. This site is becoming like UC Berkeley, only choosing speech that everyone agrees with.

        • derFunkenstein
        • 8 months ago

        I think the existence of this article disproves your ridiculous accusation at the end. In fact, the article only bolsters Jeff’s stance: work the hardware as hard as you can to tease out the differences. All this one did was prove that lower resolutions don’t change the relative standings.

          • Voldenuit
          • 8 months ago

          The standings may not change (although I’m sure some of the deltas did). However, a prospective buyer is more interested in whether a card is fast enough to play a given set of games than they are in relative standings, and the 1440p results provide that for an additional set of users in addition to the original 4K numbers.

          I’ll be frank: all the cards tested were slower than I expected. I suppose that’s the penalty of Ultra graphics. And those framerates are fine for the single-player games tested.

            • derFunkenstein
            • 8 months ago

            Sure, more data is always better, but as you pointed out, there are already more nits to pick because everything was run at Ultra settings. As long as TR focuses on the much-harder-to-parse frame-time data using manual benchmarks (which is what the site should absolutely do), we’re going to have to live with the fact that one resolution and one setting gets done per game. I can read Anandtech or Tom’s or whatever if I want a million resolutions in a canned benchmark.

          • Srsly_Bro
          • 8 months ago

          I just said people want to see what they use and can relate to. I didn’t say the difference would be meaningful or give entirely different results.

            • Redocbew
            • 8 months ago

            So you think the needs, wants, hopes, and dreams of all of us should come before the inner details of the hardware? That’s an odd position to take when you hang out on a tech news site, but I guess you’re really just tough on the outside and all warm and fuzzy inside.

            • derFunkenstein
            • 8 months ago

            But you did it in a pretty moronic fashion. Let me remind you of your conclusion:

            [quote<]This site is becoming like UC Berkeley, only choosing speech that everyone agrees with.[/quote<]

            • Srsly_Bro
            • 8 months ago

            Lol. It’s true. Speech critical of anything TR is met with down votes. The idea of other resolutions or modifications to testing is met with ridicule. If it’s not here, it’s on the way.

            • derFunkenstein
            • 8 months ago

            Because that’s the same thing as a government agency limiting free speech. The hyperbole is just too much. Srsly.

            • Voldenuit
            • 8 months ago

            I love fake internet points as much as the next person, but there’s always going to be a piling on effect for both up- and down-votes, it’s just the way ppl are on the web. /shrugs.

            EDIT: Some sites will hide posts that get beyond a certain number of downvotes, and they may have to do that to curtail trolls and misinformation. Fortunately TR doesn’t do that, and hasn’t needed to.

            • derFunkenstein
            • 8 months ago

            You took out the “freeze peach” thing and I was going to upvote it.

            • Voldenuit
            • 8 months ago

            I took out the freeze peach thing because I missed him ranting about UC Berkeley, so decided it would be inaccurate of me to mention it.

            • Srsly_Bro
            • 8 months ago

            The act is the discussion, not the legality. You can do better than that. The act was a result of everyone thinking the same and rejecting opposing thought. Kind of like this discussion.

            • derFunkenstein
            • 8 months ago

            We can know you’re wrong and still welcome discussion on other topics. 🙂

            • Srsly_Bro
            • 8 months ago

            I just want my resolutions, man!!

            • Ninjitsu
            • 8 months ago

            Not to mention, it misunderstands criticism as censorship.

          • BurntMyBacon
          • 8 months ago

          [quote=”derFunkenstein”<]All this one did was prove that lower resolutions don't change relative standings.[/quote<]

          Maybe I'm just crazy, but I seem to recall that the GTX 1080Ti beat the Radeon VII in the original tests: [url<]https://techreport.com/review/34453/amd-radeon-vii-graphics-card-reviewed/11[/url<]

          The gap between the Vega64 and RTX 2070 also narrowed significantly at the lower resolution, suggesting that if the GTX 1080 had been tested at 4K in the original article, it might have beaten the Vega64 where it lost at 2560x1440. Can't say for sure, as it wasn't tested, but that is kind of the point. Certainly the relative standing of the Radeon VII didn't change with respect to the RTX 2080, but other cards' standings did. I'll let others debate whether it is worth the effort, but there does seem to be at least some merit to adding 2560x1440 to the standard test suite.

          I actually expected the Radeon VII would do a little worse at lower resolutions relative to its nVidia counterparts, as I would normally expect a video card with both a RAM capacity and bandwidth advantage to excel at higher resolutions. That said, it shouldn't really come as a surprise that games don't really make use of the extra capacity yet. I suppose the Vega architecture just isn't capable of utilizing the extra bandwidth in gaming scenarios.

          • blitzy
          • 8 months ago

          but @ 1440P this card appears more competitive than @ 4K, so surely that does provide a worthwhile insight, if most gamers aren’t actually going to be running at 4K

          Personally I’m not that interested in this particular card, but it’s good to get a couple of quick 1440P benches thrown into the mix when comparing cards, so you can see how they will actually perform. (rather than theoretically at 4K)

            • Redocbew
            • 8 months ago

            More data = good

            I just hope you’re not thinking that comparing 1440p to 4K is like comparing a test of a real game to 3DMark. That really is comparing “actual” vs. “theoretical,” whereas the same test at different resolutions just gives different points on the same curve.

            • blitzy
            • 8 months ago

            Maybe that’s a bastardization of the word theoretical, but in essence what I am saying is that the majority of gamers won’t be using 4K in practice, and from their perspective performance at 4K is largely irrelevant, if not theoretical. And as shown by these benches, cards don’t necessarily scale linearly, so 4K performance is not necessarily representative of 1440P or 1080P results. In practical terms it’s generally close enough to make loose comparisons, but as we see here it makes the performance disparity between the cards look larger than it actually would be at 1080P or 1440P.

            -4K gaming requires a top tier gaming card which a majority of gamers don’t have, and even then by choosing to game at 4K you’re severely reducing your frame rate for a marginal quality improvement
            -4K monitors are prohibitively expensive, and are often aimed at productivity rather than gaming
            -There are very few 4K displays available with a refresh rate greater than 60hz, and the 4K gaming displays with >60hz are about 4 times as expensive as a 60hz panel.
            -Most gamers would prioritize a refresh rate greater than 60hz over 4K
            -Gamers generally tend to use monitors in the 24-30″ range, which are arguably too small to need 4K resolution. 4K gaming is more relevant on TVs, e.g. gaming in a living room on a 50″+ screen, though generally that’s counterintuitive for keyboard-and-mouse players. Also, gaming consoles probably won’t drive a 4K display at a very good refresh rate, and/or will upsample to achieve 4K.

            One could argue, what relevance is 4K performance for those who won’t be using that resolution, and have no desire to (non optimal pixel density for gaming)? In a sense that information is theoretical to that category of gamers (a general majority)

            Are there good reasons to be interested in 4K? Absolutely.
            – VR
            – Gaming on large displays, if for some reason your preference is a 50″+ 4K >60hz TV

            The bottom line, desktop gaming @4K in 2019? It’s becoming more prevalent, however is still a small niche. Probably the majority of people doing it are gaming on productivity displays that double as their gaming display, and there will be a select few who have gaming monitors at 4K, or are using VR. That’s great, and it’s good to have data that supports that use case. It’s also good to have data that supports the more common use cases, which I support, and appreciate being covered.

            • Redocbew
            • 8 months ago

            Great Wall of Text Batman!

            I can’t fault someone who would rather not guess when attempting to figure out the performance of new hardware, but I’m not sure that’s really what’s happening here. It’s never going to be a one-to-one comparison against the review machine and yours regardless of resolution.

            I’ll admit that I stopped reading right after I saw “majority of gamers”. Seeing the old poll on resolutions linked led me to reading posts I wrote a year or so ago which might as well be exactly the same thing I would write now. I mean no disrespect, but I’d rather not do that.

          • cegras
          • 8 months ago

          Relative data, as the common justification for 4k testing, is not as useful as you say it is. It does not answer how well the data scales in absolute numbers for those of us trying to find the best bang/buck card for lower resolutions. Sure, a 2080 will handle any resolution at any detail level, but there’s a real risk of making the wrong choice by buying a Vega, 2060, 1070, based only on 4K resolution data, and buying a tier too low or too high. It’s much better to have the actual, relevant data.

        • auxy
        • 8 months ago

        See, in this post nobody is downvoting you for criticizing TR’s methodology, which is fair. They’re downvoting you for the insulting comparison. But you’re playing the victim in the comments below saying you got downvoted for criticizing the methodology. You’ve no intellectual or moral high ground to stand on. As usual.

        • Spunjji
        • 8 months ago

        “This site is becoming like UC Berkeley”

        As in, repeatedly attacked by people who haven’t really understood the issues at stake but do have an axe to grind about “free speech”?

        • danny e.
        • 8 months ago

        this.

        The irony is that all the downvotes prove your point.

      • YellaChicken
      • 8 months ago

      To be fair it wasn’t just for him, others asked for it too. It’s the way he went about it that bugged me.

        • Srsly_Bro
        • 8 months ago

        Your feelings get hurt often, bro?

          • YellaChicken
          • 8 months ago

          Not as often as yours sweetheart but keep trying.

      • DoomGuy64
      • 8 months ago

      I just think 4k is a stupid resolution for gaming in general, not that I wanted to specifically see THIS card benchmarked at 1440p. The benchmarks aren’t going to vary that much, but it is a statistically more common and realistic resolution.

      If anything, I would have liked to see some compute benchmarks, because the Radeon VII isn’t a better gaming card than the 2080, and people are most likely buying the Radeon VII for both work and gaming. The extra ram apparently helps a lot in video editing.

      Maybe if people had an honest discussion with me about it, instead of knee-jerk dog-piling, you wouldn’t have such a ridiculous opinion, which is completely distorted from what I was actually trying to say.

        • chuckula
        • 8 months ago

        Next time “thank you” would suffice.

          • DoomGuy64
          • 8 months ago

          No. I would have been happy with one or two 1440p results for reference and variety. I never called for a full 1440p run, and was only pointing out that 4k is a ridiculous resolution that most people don’t care to game at. Stop acting like I wanted a full 1440p review. That’s your own distorted take.

          What I wanted to see was a variety of tests that would have showcased results for what people actually want from this card. If you solely wanted to see game results, well, I guess the review did its job, but otherwise it didn’t cover what else you can do with it.

            • K-L-Waster
            • 8 months ago

            Ummm, sorry, but what you said was:

            [quote<]Not to mention, this whole 4k only shtick is getting old, and I no longer care to read reviews for scenarios that I don't game at. This is catering to the smallest percent of the smallest percent of PC gamers, and makes these reviews worthless.[/quote<]

            If that's not calling for a full 1440p run, what *is* it calling for, exactly?

            • DoomGuy64
            • 8 months ago

            All I was saying there is that a dedicated 4k benchmark is only relevant to the minority of 4k users, and pointless to anyone who doesn’t game at that resolution. It was simply out of touch with the non 4k gamers.

            What I’m calling for is to not cater singularly to such a minority, and to add some variety. It doesn’t have to be a LOT, just throw us a frickin bone for pete’s sake. Really, a few things would be fine. The 4k benchmarks have a purpose, but so do 1440p and compute tests. 4k results by themselves limit relevance. But by all means, turn the site into a dedicated 4k site, and watch the traffic drop off to the Steam-survey level of 4k users. At this point, I’m fine with either. Maybe TR is in a position to not need 1440p readers, and is trying to push everyone into 4k for some sort of 4k future for the betterment of PC gaming. I just don’t think that’s very realistic, while a varied approach makes more sense.

            • Waco
            • 8 months ago

            I guess we should disregard the fact that the Steam surveys also say that cards like these (this price class) are an extreme rarity?

            I’m happy to have more data (I love it) but the way you went about justifying your case was sideways at best.

            • K-L-Waster
            • 8 months ago

            A more varied approach absolutely makes sense. Having both sets of numbers is great.

            In fact I don’t believe there would have been a problem if you’d started out with:

            [quote<]The 4k benchmarks have a purpose, but so does 1440p, and compute tests. 4k results by themselves limit relevance. [/quote<]

            Starting out with "4K gaming isn't even a thing" though is a good way to start a fight. (Which it did.)

            • YellaChicken
            • 8 months ago

            This post would have been an absolutely fine opener on the other thread (maybe without the suggestion of losing readers). It’s spot on. But it’s not what you actually said.

            This post would have been an absolutely fine opener on the other thread (maybe without the suggestion of losing readers). It’s spot on. But it’s not what you actually said.

            [quote<]Linus Tech Tips on why 4k is dumb, we should be gaming at 1440p, and he's a heavy Nvidia fan. The vast majority of PC gamers agree with him too, IMO. 4k is not a proper gaming resolution yet, and we would rather have 1440p 144hz. Not to mention 4k+AA is clearly ROP limiting the AMD cards, and is a waste of everyone's time.[/quote<]

            What you did there was make an assertion that 4k gamers are dumb (whether you meant to or not), and you presumed to speak for all gamers when you kept saying 'we'. However, that's not where it started going wrong for you. That happened when Chuckie made a joke out of your last line and you began to lose your s**t at ppl. Chuckula makes jokes, but to be fair, they often have an underlying point whether you like them or not.

            If you or Bro think you're victims because you were 'piled on', go back and have a look at the other threads on that page where ppl wanted lower-res results. They weren't met with the same response, for a reason.

            • kuraegomon
            • 8 months ago

            Sigh. You do realize that TR will test mainstream cards that are [i<]aimed[/i<] at 1440p (or 1080p) gaming markets at the appropriate resolution, right? AMD [i<]marketed[/i<] the Radeon VII as a 4K gaming card, intended to compete with a 4K gaming card (the RTX 2080). Putting aside your personal feelings about 4K gaming, the purpose of a review is to determine what a product's capabilities are, and whether those capabilities match the manufacturer's claims. When you remember what the point of a review is, it's pretty clear what the appropriate resolution to test [i<]this[/i<] product at was.

            Please check any graphics card review from TR, ever, and tell me what the tested resolution was. I promise you that in every case, it was the highest resolution that card was being marketed as being suited for. Which is precisely as it should have been.

            I'm happy to provide constructive criticism of TR whenever it's warranted, but it wasn't warranted here. You, and srsly_bro, and co. are just wrong on this one.

            Pro-tip (with tongue firmly in cheek here): when Chuck, and Funk, and Krogoth can all agree on something remotely technical, and you're in dissent... you're probably wrong 😛

            • Ninjitsu
            • 8 months ago

            To be mildly fair, AMD and Nvidia have been saying “this card is for 4K” since 2013 or thereabouts.

        • Srsly_Bro
        • 8 months ago

        They are doing knee-jerk dog-piling instead. TR isn’t about discussion; it’s about one viewpoint, or go somewhere else. This discussion really shows that. Nothing you said was controversial except asking for a resolution far more people use. It’s like Reddit came to TR, or it always was.

          • Waco
          • 8 months ago

          People react to the incredibly abrasive way that he posts. The intent doesn’t matter much if you’re being a jerk and complaining about everything.

            • Srsly_Bro
            • 8 months ago

            The word you’re looking for is snowflake. Toughen up. He’s not your mom. He doesn’t have to talk to you or anyone with kid gloves to prevent you or them from seeking their safe spaces. Everyone is so sensitive smh.

            Listen to the message and understand his frustration with lazy half-baked reviews. Find another major tech site that only uses one resolution. He cares about this site and wants to see data that will help more people. This in turn helps the site. Many of the apologists on this site will not accept that TR isn’t perfect. That attitude will hurt the site more.

            Cyril left. Scott left. Jeff is hanging it up. Keep thinking of ways to limit the target audience and people from clicking links to this site and there won’t be many more people to leave.

            Smh

            • Waco
            • 8 months ago

            Nobody said anything remotely close to what you’re insinuating. Nobody is offended, nobody needs safe spaces. All we ask is common courtesy and clear rational discourse.

            If he’s unwilling to offer, then this is the response he can expect. Simple. The way you say something matters almost as much as what you say.

            • Redocbew
            • 8 months ago

            Yeah, there are intelligent reasons not to be a fan of native rendering in 4k. Checkerboard and tile-based rendering have been shown to provide results at much lower computational cost without sacrificing too much in image quality when compared to poking at every single pixel. That would have been a very different discussion than “4k is worthless and stupid”.
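
            A rough sketch of the savings, assuming the common checkerboard scheme that shades roughly half the pixels each frame and reconstructs the rest from the previous frame:

            Native 4K: 3840 × 2160 = 8,294,400 shaded pixels per frame
            Checkerboarded 4K: 8,294,400 / 2 = 4,147,200 shaded pixels per frame
            Native 1440p, for comparison: 2560 × 1440 = 3,686,400 shaded pixels per frame

            The per-frame shading cost of checkerboarded 4K lands much closer to native 1440p than to native 4K, which is the appeal.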

            • derFunkenstein
            • 8 months ago

            It’d take us back to the days of the CRT when there weren’t pixels, just lines. Paint to the screen at a certain number of samples per line. They’re “pixels” but they’re not necessarily square. That’s a great way to conserve rendering power, as you noted.

            • auxy
            • 8 months ago

            Reacting to an abrasive attitude with more abrasiveness is not being “fragile” so calling these people “snowflakes” is moronic and inflammatory. It’s not a matter of being “tough”, it’s a matter of being mature adults who can speak about things openly and with confidence. DoomGuy64 doesn’t express himself very well and yah, maybe people could react a little better, but that’s hardly on them. And you look like a fool for defending him.

            Anyone can see that TR is dying, but it’s hardly alone in the world of online journalism. Maybe you heard of the “learn to code” kerfuffle recently? Advertisers aren’t paying and affiliate links are paying single percentage points instead of the double-digit numbers they once did. The point is that even if this site got ten times the traffic it does, it wouldn’t matter unless those people were subscribing. (Disclaimer: I don’t have any particular special knowledge of the state of TR, I just know how the online review site business has been going the last few years.)

            Your “he cares about the site and wants to see data that will help more people” would be more convincing if either of you were subscribers, which is the most direct way to help the site. It’s a stupid point anyway; the 4K data tells the story. As proved by this article, nothing changed. And nothing will ever change by testing graphics cards at a lower resolution; all you’re doing is shifting the load back over onto the CPU.

            You should appreciate what you’re getting for free instead of complaining that it isn’t enough. You and he both are choosing beggars of the worst kind. Your posts are bad and I hope you feel bad.

            • derFunkenstein
            • 8 months ago

            [quote<]Anyone can see that TR is dying[/quote<]

            This is my chief concern about TR and has been for the better part of a year. No secret that everyone's blocking ads and the site hasn't successfully transitioned to YouTube or Twitch or whatever. There hasn't even been a podcast for like two years. There are two sources of income: ads and subs, and that's it.

            So yeah, I'm going to cut Jeff a break for only testing one resolution and one setting. They're literally doing everything they can and they're doing it well, and these jagoffs sit behind their AdBlocked browsers and cry foul about not getting breakfast in bed. They can eat shit for all I care.

            • anotherengineer
            • 8 months ago

            I don’t adblock TR. I wish I could block chu&*#la’s trolling though, but only his trolling since he does have good posts also. Is there a troll blocker??

            I do visit the site a lot less though, mainly cause I don’t really game anymore, and don’t play any games newer than 7 years old.

            This site basically caters to gaming and gamers (heavy focus on CPU & GPU with a dash of SSD) which I suppose is good since they are always upgrading.

            Perhaps some diversification reviewing things like CAD, solidworks, F@H, transcoding, etc. etc. might be a way to help traffic, since they are heavy CPU and GPU tasks also?

            I guess that’s something for a different thread.

            edit -3 in one shot…… I wonder who that could be?? 🙂

            • derFunkenstein
            • 8 months ago

            It’d be nice to see some computational stuff with GPUs, I agree. Not sure how to fix that. Are there community-driven benchmarks for those things? At one time TR had CPU F@H tests, but the way that work units shift over time made that sort of thing not terribly useful.

            I’ve been a fan of the various ways that TR has expanded CPU reviews over the years. DAWBench is a relevant and fun inclusion, for example. It’d be great to figure out a way to apply that to graphics cards.

            edit: by “everyone” I meant the figurative “everyone” – it’s a trend, no doubt. I whitelist TR because Adam has done an amazing job throughout the years of blocking ads that do annoying things.

            • Mr Bill
            • 8 months ago

            I took NoScript and Adaware out of Firefox and since then have only enabled blocking trackers and third-party trackers. Apparently that has a broad definition, because it blocks ads from TR. I switch back and forth with Chrome pretty often, and I do see ads with Chrome. So, I will browse TR with Chrome if that helps even a bit.

            • anotherengineer
            • 8 months ago

            Indeed. Or perhaps handbrake cpu vs gpu or other encoding/decoding software with a few different formats. CPUs and GPUs can be used for a lot more things than games.

            • derFunkenstein
            • 8 months ago

            You need more than quantitative benchmarks for that. You’d also need an image quality comparison, because the two methods aren’t using the same algorithm. I’m all for that, btw, I just think it’s a little harder than just running it as a benchmark. We used to get IQ comparisons and dotted diff results that showed what was different, and then everyone started to basically have the same IQ and it was not important anymore. In video compression/transcoding, I think it would be again.

            • DoomGuy64
            • 8 months ago

            I think TR can still break into youtube if they put more videos up. Record some of the benchmarking process, talk a bit, then post the article with the video. The video production doesn’t have to be movie quality, since people would watch it for the review itself. They already have most of the resources to do it, and it wouldn’t hurt anything to record work they are already doing for the article. Youtube doesn’t cost anything to host, so that is a plus, ad revenue is more consistent, and it would even pull youtube viewers to the website.

            Podcasting is maybe harder creative wise, since you have to constantly come up with ideas and guests to talk about, while you can always throw in a random youtube video that does all of the above without committing to a single concept or schedule.

            What would really be interesting, and perhaps a lot harder to do, is having a video for each page of the review, which would be even more interactive. Dunno what would work best, but adding some video can’t hurt.

            • Waco
            • 8 months ago

            I don’t know about anyone else, but I won’t watch video reviews. Period.

            • DoomGuy64
            • 8 months ago

            Are you insinuating that I said to drop all written reviews and only do video? Do you understand that video is useful whether or not you view it? TR has done videos in the past, and I found them very informative and entertaining.

            Having a video included with a review doesn’t hurt [i<]anything[/i<], it only increases ad revenue and marketshare. It's an OPTIONAL feature that you don't have to participate in, while plenty of other people will still watch and enjoy.

            • Waco
            • 8 months ago

            No, I simply said I don’t watch video reviews.

            • Yan
            • 8 months ago

            I remember an article [url=https://arstechnica.com/information-technology/2015/10/online-advertisers-admit-they-messed-up-promise-lighter-ads/<]elsewhere[/url<] where the Interactive Advertising Bureau admitted that Internet advertisers went overboard: "ad blocking is a crucial wakeup call to brands and all that serve them about their abuse of consumers' good will". I sometimes wonder whether the widespread use of ad-blockers will cause Internet journalism to decline and traditional paper journalism to come back.

            • cegras
            • 8 months ago

            I don’t know if TR is dying or not, but it certainly is not the gold standard. I personally do not think TR has kept up with the times, nor has it done anything recently to differentiate itself from its competitors. Unique analysis (inside-the-second) is immediately copied and it cannot provide the sheer amount of data others can. It does not have anyone on staff that has specialized knowledge.

            When I built a new PC, I consulted almost every source except for TR:
            – Techpowerup for 1080p + high refresh GPU data
            – Russian website that exhaustively ran every CPU/GPU combination [url<]https://gamegpu.com/action-/-fps-/-tps/battlefield-v-test-gpu-cpu-2018[/url<]
            – HardwareUnboxed/Techspot that tested CPUs on BFV
            – Anandbench
            – Buildzoid for motherboard VRM analysis
            – JonnyGuru / others for PSU testing
            – Reddit for PCIE NVME ssds

            What TR provides nowadays is basically the bare minimum for tech journalism. Perhaps it has a flair for writing, but that doesn't really matter these days when GamersNexus gets away with atrocious writing, formatting, visualization, and rambling youtube videos but is clearly the next review darling. TR has not kept up with the times and still uses the same clunky ways to visualize the data 5 different ways when a simple box-and-whisker plot would suffice.

            I do credit TR for recommending the i5-9600K. I keep TR off my adblock, but I stopped subscribing years ago.

            • GrimDanfango
            • 8 months ago

            Nobody is under any obligation not to act like a prick, but neither is anyone else under any obligation not to tell that person they’re acting like a prick. That isn’t being a “snowflake” or being overly sensitive. At most, it’s perhaps a tad reactive, but hey, most people aren’t Gandhi.

            Neither is it shutting down discussion… presenting something for discussion in a deliberately obnoxious way undermines the point right off the bat – you’re shutting down your own discussion. People will be happy to discuss that same point presented in a reasonable manner.

            Don’t confuse being told you’re a prick with being censored.

            (Disclaimer, I’m being deliberately contentious to make a point… but I’m not specifically intending to insult anyone. Still, if I should edit this post to be less insulting, please let me know :-P)

            • kuraegomon
            • 8 months ago

            Dude – did you seriously just call a TR [i<]GPU[/i<] review lazy and half-baked?!? Are you bleeping insane? If you're going to utter something as incredibly asinine as that then please go somewhere else. Please.

            • Spunjji
            • 8 months ago

            Every area of human existence has rules of discourse – even 4chan. Trying to broaden the term “safe space” to encompass “anywhere with rules of discourse that I disagree with because I think they’re too soft” serves only to demonstrate how little you understand about both safe spaces and broader society.

            Telling people to “toughen up” when you’re the one very obviously presenting as oversensitive is what’s really killing you here, though. Fix that approach and you might find the downvotes mysteriously disappearing. It would also help if you stopped trying to jam inaccurate alt-right talking points into unrelated topics.

        • danny e.
        • 8 months ago

        So much reaction to some valid points being made. I have a 1080TI and 2x 4K monitors and NEVER game at 4k… so far.
        2560×1440 all the time for me.

        People who are arguing 4K is so great are not paying attention to the benchmarks.
        You’re averaging under 60fps in almost all the games @4k, except with $1K-plus cards.

        Edit: reworded the last two lines, clarifying “except 1K plus cards”

          • Spunjji
          • 8 months ago

          Valid points, you say?

          “I just think 4k is a stupid resolution for gaming in general” – this is a quote from the comment you’re replying to. It’s not even close to being a valid point, and it’s the one getting the most discussion.

          “People who are arguing 4K is so great are not paying attention to the benchmarks.”
          That’s a strawman. I haven’t seen anyone here arguing that it’s “so great” – just that testing at 4K ensures you’re entirely GPU limited and therefore (as borne out here by the further testing) is suitable for the overall ranking of high-end cards.

          The benchmarks will only ever tell you a little about what’s possible. I’ve gamed very happily at 4K in a few titles and I’ve never owned anything more powerful than a GTX 980.

            • danny e.
            • 8 months ago

            I’m sure you can game happily at 15fps if you like to play solitaire.

            • Spunjji
            • 8 months ago

            *shrug*

            You have all the necessary kit that will prove me correct here – all it requires is a few moments in the graphics settings screen of your favourite game to move a few sliders from “Ultra” to “High”. Your kit, your choice 🙂

            • danny e.
            • 8 months ago

            [url<]https://youtu.be/ehvz3iN8pp4[/url<]

            • danny e.
            • 8 months ago

            haha, hilarious that this got downvoted. Odds are the person doing so clicked, saw the title, got their panties in a bunch, and downvoted.

            watch it. learn something.

            • danny e.
            • 8 months ago

            Waco says: “I don’t know about anyone else, but I won’t watch video reviews. Period.”
            ————-

            ah, that explains it.

            Waco translation:

            “I don’t value others opinions or insight if it takes any effort on my part to learn the facts.
            I know what I feel is right and I relish my own ignorance”

            Edit: yeah, I know. I’m just trying to get more downvotes to prove a point. Small things amuse me. I like watching morons’ behavior. It’s like going to a zoo.

            • Waco
            • 8 months ago

            I gamed at 4K for quite a while on the equivalent of a GTX 980 (it was a Titan Xm). 45+ FPS is perfectly acceptable for many types of games and it’s easy to hit 60+ if you’re not a stickler about “ULTRA EVERYTHING”.

            Please show up with something other than a shitty attitude and an opinion next time.

          • Waco
          • 8 months ago

          I really hope you aren’t running 2560×1440 on your 4K monitors…

            • danny e.
            • 8 months ago

            I do love the attitudes in here. Ignore the benchmarks. Ignore common sense. Ignore that your 4K monitors are 60Hz. Ignore that your graphics card can’t drive more than 50fps at 4k.

            keep exposing your ignorance and claiming victory.
            yeah. way to show em!

            • Waco
            • 8 months ago

            Nobody is ignoring benchmarks or common sense. Perhaps you’d like to present an actual argument?

            My HTPC has a GTX 780 in it. It drives a 4K 70″ TV. It can easily exceed 60 FPS in many many games.

            Perhaps I’m just old, but I recall the “SVGA is stupid” and “beyond 1024×768 is stupid” and “1920×1080 is stupid” eras. It’s all a pointless argument.

            You’re not new to TR. Facts matter here, not opinions. You know this.

            • danny e.
            • 8 months ago

            Seriously, you’re like Jussie. It’s obvious you’re lying.

            Look at the benchmarks TR ran in the latest reviews. Your 780 cannot do close to 60fps @4k in any of them unless you’re turning down the detail to crap.
            It’s foolishness when the benchmarks you claim you care about are right in front of your face.

            And if you don’t like/play those games, just say “I like solitaire”
            Everyone has different tastes. Some like 3d shooters, some like solitaire.
            Some like running @4k 15fps and some like 1440p @60+ fps.
            Knocking Doom Guy for liking the latter is just petty.

            edit: had to add Jussie joke. relevant.

            • Waco
            • 8 months ago

            Thanks for continuing to prove my point.

            • danny e.
            • 8 months ago

            Wait, what?? You have a point?

            I’d love to hear it.

            From what I gather, you don’t play any of the games in the TR benchmarks, yet you rail against those that do play those or similar games.
            Tell me where I’m wrong. All of your words can and will be used against you.

            • Redocbew
            • 8 months ago

            Thank you for another post which makes me hope I didn’t sound like you 20 years ago.

            • danny e.
            • 8 months ago

            haha 20 years ago? I ‘tinking I’m older than you ‘tink.

            I was originally trying to get you guys to actually make reasonable arguments based on the same reviews we’re all looking at. But since that failed, I’m just messing with you all now. Like tossing peanuts to the zoo monkeys. It’s all fun.


            Edit: if you care at all about history. Look up the communist purge in Indonesia. I was there tossing out Obama during that. Yup, I’m that old.

            • ermo
            • 8 months ago

            Obama?

            • Spunjji
            • 8 months ago

            Not sure if that’s a typo or he’s nuts, but he seems to be implying that he was complicit in partisan mass-killings.

            • ermo
            • 8 months ago

            I’m assuming “Obama” is an ill-informed reference to “communists” in this instance?

            As for the other part, it might be covert US intelligence/military work to lessen communist influence in South-East Asia in general?

            Either way, it sounds pretty ugly.

            • Waco
            • 8 months ago

            Where aren’t you wrong?

            You’ve been on TR long enough to know that BS like what you’re spewing doesn’t fly. I have a Radeon VII in my desktop for most gaming. My HTPC has a 780 for light gaming at 4K in older games. Both can sustain > 60 FPS in the games I’m playing on each.

            Please, go off on another rant. It’s slightly entertaining, if a bit pathetic.

            • danny e.
            • 8 months ago

            Alright, let’s do this thing.

            [quote<] Where aren't you wrong[/quote<]

            Anywhere.

            [quote<] Please, go off on another rant.[/quote<]

            Get off my lawn!

            • DoomGuy64
            • 8 months ago

            I had a 780 before I went AMD so I know you’re leaving out details like resolution, settings, or the age of the games on purpose.
            [quote<]3840x2160 at 30Hz or 4096x2160 at 24Hz supported over HDMI. 4096x2160 (including 3840x2160) at 60Hz supported over Displayport.[/quote<]

            Also, I know that unless you have a DisplayPort connection to your TV (how many TVs support DP?), you're physically not capable of running 4k 60 FPS over HDMI. That's a fact.

            Nvidia dropped all decent game optimization for Kepler after Maxwell, which is why so many people were angry with Nvidia when the 960 started beating the 780 in newer games, especially when the 780 had superior hardware specs. You're not running any modern games at 4k 60 FPS over HDMI, and I had trouble with that card getting a decent framerate past 1080p on any Maxwell+ games, let alone making that claim today and trying to pass it off as legit.

            The only way you're getting 4k 60 FPS is with Xbox 360-era games using reduced settings, and over DisplayPort (DP on a TV?). Which is still questionable. That card [i<]doesn't even have enough VRAM[/i<] to run 4k in games, outside of titles that have low VRAM requirements. I would have believed you if you said a 980, but a 780? Nah, dude. You're clearly lying, or at bare minimum lying by omission.
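
            For context, a rough back-of-the-envelope on the HDMI limit being argued here, assuming standard CTA 4K60 timing and 8-bit-per-channel color:

            4K60 RGB/4:4:4: pixel clock ≈ 594 MHz
            HDMI 1.4 TMDS clock limit: 340 MHz, which is why the spec sheet above lists 3840x2160 at only 30Hz over HDMI
            YCbCr 4:2:0 carries half the data per pixel, so the effective clock drops to roughly 594 / 2 = 297 MHz, which fits under the 340 MHz cap

            That is the trick Kepler-era cards use to push 4K60 over HDMI 1.4, at the cost of chroma resolution.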

            • Waco
            • 8 months ago

            HDMI, 60 Hz, chroma subsampling. Accusing me of lying is cute, though.

            I never said modern AAA games at high settings, just that it’s 100% doable on older hardware with a huge number of games with good framerates. Jumping to conclusions doesn’t help your stance.

            • danny e.
            • 8 months ago

            Ok, Jussie.

            • DoomGuy64
            • 8 months ago

            What?

            You can bypass the HDMI limit with chroma subsampling? First I’ve heard of this hack, and obviously this is all part of your lying-by-omission shtick. Your “4k 60 FPS” 780 only works under specific conditions that you deliberately did not mention until after being called out on it. That’s called lying by omission, which still qualifies you as a manipulative liar.

            I did some searching on the claim, which did not turn up a lot, but there was a Tom’s article on it.
            [url<]https://www.tomshardware.com/news/nvidia-kepler-4k-hdmi-1.4,27117.html[/url<] [quote<]Four times less colour information[/quote<] This was not part of the original capabilities, and was later patched in with a driver update. I'd also like to know what games you are running on this card. Because that's part of the whole 4k 60fps 780 fraud you are pushing. So far we know: * Chroma subsampling hack to bypass HDMI spec, at cost of image quality. Games? Unknown. Game Settings? Unknown. Actual framerate of said games? Unknown. Either way, this is a poor way to play games. There is a loss of image quality, game selection is undoubtedly limited, and the hardware is severely outdated. Only you could submit yourself to such hackery and think it was worth it. None of the games you can play on that card have the visual fidelity to justify 4k, and there is a loss of color. This isn't a justification of 4k, it's a poor justification of insanity, and some sort of 4k obsessive disorder. Kookoo for cocoa puffs Kraazy.

            • Waco
            • 8 months ago

            Wow. What’s up your ass?

            I have no reason to lie about anything. Clearly you’re pretty invested in your “4K is stupid” mantra, but please, step back and realize how insane you sound right now. Chroma subsampling is set by default when connecting via HDMI for 4K sets on both Nvidia and AMD GPUs (I just swapped my Titan Xm for a Radeon VII on my desktop with a Samsung 40″ 4K set). It’s unnoticeable unless you’re looking for specific patterns that highlight the difference. The Blu-ray spec is 4:2:0, AMD/Nvidia default to 4:2:2 at 60 Hz at least on the sets I’ve tested with.

            • danny e.
            • 8 months ago

            You have no reason to lie, yet you do it... or option B is that you just don’t realize you’re wrong. I’d say the former, but considering you don’t like to inform yourself by watching any videos of people who would teach you some things, it actually is likely B in your case.
            Ignorance is not much of a step up from lying, though.

            What games do you play that allow you to play >60fps?
            What is your actual fps in those games?
            What 4K monitor are you using that allows >60fps?

            some of the many questions you won’t answer due to being exposed.
            There are very few monitors that do >60Hz.
            And there are basically no modern 3D games that will do >60fps on your GPU

            • Waco
            • 8 months ago

            To be honest I’m only entertaining this thread because I keep getting pinged about it via email.

            There’s no lying nor any willful ignorance here from my side of the conversation. Your lack of knowledge isn’t my problem.

            You’ll probably have an aneurysm but I also ran a GTX 750 on that same HTPC, running 4K 60 Hz, and gamed on it as well. I gave that card to a coworker with an old 2600K and popped in the 780 as a replacement.

            • danny e.
            • 8 months ago

            Other people’s lies don’t give me aneurysms. I’m only entertaining this thread because I was figuring, much like Jussie, you’ll eventually fess up.

            Like I said earlier, if you’re playing solitaire, sure you can use a 750 or 780. If you’re playing any modern, 3D games, you can not – unless again, as I said, you turn the detail down to crap.

            ALL of TRs reviews have shown this. On one hand, you’re trying to defend TR by saying how accurate their reviews are. On the other hand, you’re denying the very content of those reviews.

            The 780 did manage to do 50fps+ at [b<]1440p[/b<], though. So, you have that going for you... except you keep claiming it can do 60 @4k, which it cannot - again, unless you turn the detail down to mud, which defeats any good reason to run at 4k.
            [url<]https://techreport.com/review/25611/nvidia-geforce-gtx-780-ti-graphics-card-reviewed/5[/url<]

            Just go back to the current review again and look at the cards that did 60fps+ in every game. There was one. So, once again, if you like playing your games at a stuttery 40-50fps, feel free to keep believing in that 4k. Or keep playing those unnamed games that do fine at 4k.

            I used to play Crysis @30fps, so I can certainly appreciate that there are those out there who don't mind the stutter because they don't know any better or can't afford to upgrade. However, pretending that reality isn't reality is just silly.

            • Waco
            • 8 months ago

            I’m amazed at how much effort you’re putting into disproving something I never said.

            • DoomGuy64
            • 8 months ago

            The Kepler 60hz 4k mode is reported as 4:2:0, not 4:2:2, and according to the available articles it makes text look horrible and introduces color banding. Probably not a problem with brightly colored games, but I could see it being a noticeable issue with games like Dead Space.

            IMO, 4k is definitely stupid for PC gaming. LTT aren’t the only guys saying it. You pretty much need a large TV to notice the improvement. DLSS shows that 1440p upsampled to 4k is acceptable for desktop users, proving the pixel density of 4k desktop monitors is overkill. Consoles use upsampling to get 4k, and people aren’t complaining there either. You are then mostly limited to 60hz, which is terrible for FPS. Then you have the issue of TVs having worse response times than gaming panels, and TVs are more VA-based than desktop monitors’ IPS or TN, and VA suffers from more blur, especially @ 60hz. It’s also incredibly GPU- and VRAM-taxing, making $700 cards a requirement to play modern titles.

            If I had a 4k panel, which I have no incentive to buy, but if I did, I would be using a RTX card with DLSS upsampling. That said, I still wouldn’t touch the spec until high refresh panels are available without a major premium, and I had a card capable of upsampling. None of my games need that level of pixel fidelity, and I benefit more from higher refresh rates.

            So while I can see the purpose of 4k benchmarks, I don’t accept the resolution as a realistic setting that gamers need, because there are too many trade-offs, and the noticeable improvement is marginal.

            • Waco
            • 8 months ago

            You’ve clearly never seen it in action. I thought 4K was dumb till I tried it, then it was impossible to go back to my 85 Hz 2560×1440 monitor.

            I’m anxiously awaiting the 120 Hz Freesync 2 panels displayed at CES this year.

            I do find the irony delicious that you think DLSS is great but chroma subsampling is terrible.

            • DoomGuy64
            • 8 months ago

            I don’t need to. Why? It’s like this: 3dfx/quake1. 320×200 software rendering was terrible. 800×600 was pretty good. Now imagine it’s the ’90s, and we have people saying we need geforce 256 and 1600×1200 to run Quake1, and the costs are ridiculous, while the visual improvements are limited. This is what you are doing.

            I can see the difference between 320 software and 800×600 glide. I can’t see geforce 256 being relevant for quake or glide games, because the textures and game fidelity don’t require it. Glide games were limited to 256 pixel textures.

            Only once games reached a higher visual fidelity was newer hardware required, and it took games like UT2003+ and free anisotropic filtering to push that newer hardware. You could see that 800×600 wasn’t enough with those newer games.

            There is no “seen it in action” when there is no detail level that needs it, especially when “action” also refers to MOVEMENT, which is terrible on 60hz VA panels with reduced color settings. We can also use DSR/VSR to supersample games on lower resolutions to reduce aliasing, so mere aliasing is not an argument either. It’s only when games start using 4k textures that 4k resolutions will become necessary. There is no “seen it in action”, when there is no actual extra fidelity to see.

            • Waco
            • 8 months ago

            Keep convincing yourself.

            EDIT: Wow. I just realized you are now saying that 4K DSR on 2560×1440 is better than 4K native. Not only is it a non-integer multiple, it’s literally the same amount of GPU horsepower.
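
            The rough arithmetic behind that point, assuming DSR renders internally at the selected resolution and then downscales to the panel:

            3840 × 2160 = 8,294,400 pixels rendered per frame
            2560 × 1440 = 3,686,400 pixels on the panel
            8,294,400 / 3,686,400 = 2.25× the shading work, whether the output is a native 4K panel or a 1440p panel fed by 4K DSR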

            • DoomGuy64
            • 8 months ago

            I’m not saying it is better, YOU ARE. I’m saying that there is no extra detail that can be gained from 4k, until texture size matches the resolution. The only benefit 4k gives right now is reduced aliasing, which can be done with MSAA, FXAA, MLAA, DSR, ETC.

            Here’s another point. You are 4k gaming on a 780 OF ALL THINGS, which has limited VRAM and can’t run modern games at that resolution. THERE IS NO EXTRA FIDELITY YOU ARE GETTING WITH THAT CARD @ 4k, other than reduced aliasing.

            You are essentially running quake 1 in 1600x1200 and claiming [i<]it's so much better[/i<], when these older games' source material was not designed for 4k. That is an outright lie. You don't need 1600x1200 in quake1 for quality, and a lower resolution would be perfectly acceptable. Not to mention vastly superior in competitive play, where you need the framerate.

            If you really wanted extra fidelity, you'd be gaming on a 1080/2080 Ti, and only playing the most modern titles that support 8GB of textures. Any game that runs on a 3-4GB card maxed out will not see any benefit from 4k, OTHER THAN ALIASING.

            • Waco
            • 8 months ago

            I have a Radeon VII for primary gaming.

            Regardless, you don’t need 4K textures for a game to benefit from higher resolution. I’m not even sure what your argument is anymore, given your professed ignorance of all things 4K.

            RE:2 at my current settings uses 14.5 GB of VRAM. It’s beautiful.

            • Redocbew
            • 8 months ago

            I wonder which is more common, bitmapped or procedural textures? I would guess that items (weapons, NPCs, and so on) are probably bitmapped, while environmental stuff (sky, grass, walls) could likely use procedural textures, but that’s just a wild-assed guess.

            • Waco
            • 8 months ago

            I would assume procedural for environments makes a lot of sense.
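
As an aside for readers following this tangent: the distinction these two posts are circling is that a bitmapped texture is stored pixel data, while a procedural texture is computed from a formula at whatever resolution you ask for. Here is a minimal sketch of that difference; the checkerboard and the function names are purely illustrative, not anything from a shipping engine.

```python
# Bitmapped vs. procedural textures, in miniature: the procedural version
# can be evaluated at any resolution, while the bitmapped version can only
# repeat the pixels it already stores.
import numpy as np

def procedural_checker(width, height, tiles=8):
    """Generate a checkerboard texture at any resolution from a formula."""
    y, x = np.mgrid[0:height, 0:width]
    cell_w = width / tiles
    cell_h = height / tiles
    checker = ((x // cell_w).astype(int) + (y // cell_h).astype(int)) % 2
    return (checker * 255).astype(np.uint8)

def sample_bitmapped(texture, width, height):
    """Resample a stored bitmap with nearest-neighbor lookups.
    No new detail appears beyond what the source pixels contain."""
    src_h, src_w = texture.shape
    ys = np.arange(height) * src_h // height
    xs = np.arange(width) * src_w // width
    return texture[np.ix_(ys, xs)]

if __name__ == "__main__":
    # A 256x256 "bitmapped" source, like an old Glide-era texture.
    source = procedural_checker(256, 256)
    crisp = procedural_checker(2048, 2048)        # stays sharp at any size
    stretched = sample_bitmapped(source, 2048, 2048)  # just repeats pixels
    print(crisp.shape, stretched.shape)
```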

            • DoomGuy64
            • 8 months ago

            Exactly; you have proven my point. Now run RE:2 at those same settings on the 780. Not possible. The trade-offs you would have to make would be worse than running a lower resolution with higher settings, and that’s assuming the card can run RE:2 well at all.

            Everything you said about RE:2 on a Radeon VII proves my point about 4K. You have to be playing a game that uses 8 GB or more, on $700+ hardware, to actually take advantage of 4K. Everything below that is pointless.

            • Waco
            • 8 months ago

            It’s like you’re having an argument with yourself. I never claimed anything you seem to be railing against.

            • DoomGuy64
            • 8 months ago

            “I never claimed anything you seem to be railing against.”

            “My HTPC has a GTX 780 in it. It drives a 4K 70" TV. It can easily exceed 60 FPS in many many games.”

            Yes, you did. It seems to be part of your shtick to make a ridiculous claim, backpedal with a gotcha, then deny it when you get called out.

            Step one: ridiculous statement. (The 780 can do 4K; it’s so awesome.)
            Step two: backpedal with a gotcha. (Chroma subsampling / never mention games / deny that it is pointless.)
            Step three: whoops, my argument fell through; I never said that. (Oh well, RE:2 is awesome on my Radeon VII at 4K.)

            What is this, troll jujitsu? Just don’t make ridiculous gotcha claims in the first place. I would have agreed with your RE:2 statement if that’s what you had started with, because there are a few instances where 4K makes sense. I just don’t think there is a point to running old games or old hardware at 4K, because the pixel density isn’t justified by the texture quality. That, and 4K isn’t mainstream enough (hardware cost / refresh rate / lack of titles) to be worth upgrading over 1440p. It’s going to take at least another year for 4K to fix its downsides and become mainstream.

            • Waco
            • 8 months ago

            I didn’t say I played modern games on my HTPC with the settings cranked up. Believe it or not, older and indie games are still awesome, fun, and run well on older hardware.

            So yeah. Keep attacking that imaginary position you put me in. 🙂

            • DoomGuy64
            • 8 months ago

            “I didn’t say”

            Exactly. Your whole pattern is lying by omission. You deliberately didn’t mention it, for the sole purpose of making a gotcha retort. Rinse, repeat. What are you, a broken record?

            I don’t care that you run old games at 4K. I think it’s pointless. The quality gain is nonexistent, since the source material doesn’t match the resolution. You know what would work just as well? A VIDEO UPSAMPLER. You could use TV upsampling, or something like the mCable, to upsample old games to 4K without any performance loss. The source material isn’t good enough to need true 4K rendering.

            Nobody needs native 4K rendering until games have native 4K graphics. You are essentially rendering NES Mario 3 at 4K when that game would be better served by any number of upsampling methods. Old games like Quake don’t need high resolutions, and each resolution bump gives increasingly diminishing returns. 360-era games are the same: you can get a bit more detail out of 1440p, but the games were designed for 1080p, and 4K is outright pointless.

            The only real reason to run Quake 1, 2, or 3 at 4K is that Windows doesn’t support resolution changes independent of the desktop, and can’t upsample or scale low-resolution games on its own, so rendering below native breaks the desktop GUI. That’s not proof that running old games at 4K is good; that’s proof that Windows is broken, and you are working around an OS that lacks any insight into backwards compatibility. At least programs like DOSBox know what you really need and offer scalers like hq3x, 2xSaI, Super2xSaI, and SuperEagle. Older Windows games need the same treatment, but it isn’t a native feature.

            If I were you, I wouldn’t bother running 4K on that HTPC; I’d stick an mCable on it. The upsampling would look better than native 4K rendering. https://www.marseilleinc.com/gaming-edition/
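
Setting the argument aside, the scalers being name-checked here all start from the same primitive: expand each low-resolution source pixel into a block of output pixels, with smarter algorithms then filtering the result. A minimal sketch of that baseline integer nearest-neighbor step is below; it is illustrative only, and the function name is ours rather than anything from DOSBox or the mCable.

```python
# Minimal sketch of integer nearest-neighbor upscaling, the baseline step
# that fancier scalers (hq3x, 2xSaI, etc.) refine: every source pixel is
# repeated into a factor x factor block of output pixels.
import numpy as np

def nearest_neighbor_upscale(frame, factor=3):
    """Repeat each pixel 'factor' times along both axes (e.g. 256x224 -> 768x672)."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

if __name__ == "__main__":
    # A stand-in for an NES-resolution frame (256x224, RGB).
    rng = np.random.default_rng(0)
    nes_frame = rng.integers(0, 256, size=(224, 256, 3), dtype=np.uint8)
    big = nearest_neighbor_upscale(nes_frame, factor=3)
    print(big.shape)  # (672, 768, 3)
```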

            • Waco
            • 8 months ago

            I’m still surprised how much vehemence you’re channeling with this anti-4K tirade.

            4K resolution is a clear quality bump. It’s cheaper to drive in older games and still looks amazing. If you can’t accept that, fine, but accusing people of lying out of your own ignorance is not exactly a strong position.

            Upsampling is ugly by comparison. So is DLSS, even compared to reduced-chroma 4K.

            Your opinion that it’s stupid has been noted. Clearly many don’t agree with you. Done?

            • DoomGuy64
            • 8 months ago

            Upscaling is superior, depending on the algorithm you are using. DLSS looks fine for the most part, but Nvidia clearly hasn’t worked out how to render fine lines, and should maybe combine it with TAA. It also doesn’t help that they are making it a per-game feature on new games that look better at 4K. DLSS is better suited to older games with less detail.

            Mario 3 is the ultimate proof that upscalers are superior. Running it untouched on a 4K monitor would look like pixelated garbage, while a scaler would vastly improve the image.

            If anyone here is “ignorant,” it’s you, for denying the fact that you gain no additional image quality when running old games at 4K. Not to mention that it breaks the UI and text in most of them.

            What we really need is a native Windows upscaler for old games, not 4K hacked into games that don’t need it.

            • Waco
            • 8 months ago

            I gain nothing, that’s why I (and others) do it. Brilliant. Have fun creating stupid strawmen with someone else.

            • Spunjji
            • 8 months ago

            “I’m saying that there is no extra detail that can be gained from 4k, until texture size matches the resolution.”

            This part of your point is really not true. I’ve done the testing myself, in innumerable different titles and with a lot of different hardware.

            You’re correct that the benefit is variable. The original Dawn of War games, for example, look like a mess of smeared textures at any resolution; the extra pixels at 4K give you crisper lines, slower frame rates, aaand that’s it.

            Then there are newer titles. I’ve enjoyed a few evenings messing around in an F2P game called Crossout. It runs very nicely indeed (50-60fps) at 4K on my 980M with basically everything turned up, and the clarity difference between 2.5K and 4K native is very readily apparent. It’s most obvious in things like textures being rendered in the middle-distance, but smaller geometric details on objects closer to the viewport also see a benefit. Also telegraph wires and cables, wow, they have been dying for more resolution for decades now.

            Fallout 4 is another example I can call to mind that sees similar clarity benefits across a number of different in-game items. It’s not universal, and sure, if you run up face-first to some textures, you’ll quickly see them as the limitation rather than your rendering resolution.

            You’re not wrong about many of the things you’re saying; where you’re wrong is in thinking that these caveats apply more universally than they actually do. Again, the balance is there to be discussed, but that doesn’t make one side “right.” If I had a 2.5K 144 Hz monitor I’d happily take the hit in detail for the extra frame rate; I don’t, so I only drop resolution in games where 4K is too much to ask of my ageing hardware.

            • Redocbew
            • 8 months ago

            That’s generally how compression works, but I’m sure the lost bits appreciate someone noticing their passing. You can join the most ardent of the audiophiles by insisting that a lossless algorithm doesn’t just sound better, but puts you on a path to righteousness that shall not be denied.

            More seriously… really?

            • derFunkenstein
            • 8 months ago

            “the lost bits”

            One of my all-time favorite movies. That show had it all: Kiefer Sutherland. Both Coreys. The ’80s were a great time.

            • Spunjji
            • 8 months ago

            The “attitudes in here” are ones you’d expect to find on a hardware enthusiast site. We try things out for ourselves, we tweak, we experiment. We change the settings in games to meet our own preferred balance between eye-candy and frame-rate. We inform our purchases with benchmarks but we don’t define our entire understanding of the world by them. We push our hardware to its limits because, if you don’t, you might as well buy a console.

            We don’t just watch a Linus video and then decide that “4K gaming is dumb”. We certainly don’t listen to someone pushing that view when their opening statement declares that they have no relevant experience. By your own admission you have chosen to pick somebody else’s “correct” idea about how to play games without actually testing it for yourself, and have then inexplicably set about harassing anyone who disagrees. No wonder you don’t like it here.

            I’ve noticed that your posts on this topic have some obvious patterns. You create false binaries, build a straw man and then stridently insist that your opponent defend it, and engage in blindingly obvious projection (“keep exposing your ignorance and claiming victory” indeed). I’m going to assume that all this nonsense, along with your lack of argumentative consistency and apparent inability to perceive your own hypocrisy, is the result of a calculated internet hard-man persona. If that’s the case then I’m afraid you’re nothing special – Chuckula does that sort of trolling far better than you ever could.

            If it’s not the case, well, oh dear – it must be difficult getting by in life with little more than a spam sandwich wedged between your ears.
