AMD’s Radeon RX 590 graphics card reviewed

Morning, folks. I’m fighting intermittent power outages thanks to an ice storm that hit the area around the TR labs last night, but that hasn’t stopped me from collecting and digesting data on AMD’s latest graphics card: the Radeon RX 590. The company didn’t share a ton of details about this card with us, so I’ll keep this short. The RX 590 uses the same basic Polaris GPU design that’s powered the RX 480 and got a boost in the RX 580. This time, the performance improvements come courtesy of a move to GlobalFoundries’ 12LP process, an improved version of the 14-nm FinFET technology that has underpinned AMD’s CPUs and GPUs for some time now.

The XFX RX 590 Fatboy card (and yes, that is its name) that we’ve had the privilege of playing with over the past few days carries a 1600-MHz boost clock range, up a fair bit compared to the roughly 1400-MHz range that RX 580 partner cards could boast of. We’ll be adding more to this article as we can, but all of our test data is present and accounted for. If you’d rather not page through reams of frame-time data, you can skip ahead to the conclusion at leisure.

Our testing methods

If you’re new to The Tech Report, we don’t benchmark games like most other sites on the web. Instead of throwing out a simple FPS average—a number that tells us only the broadest strokes of what it’s like to play a game on a particular graphics card—we go much deeper. We capture the amount of time it takes the graphics card to render each and every frame of animation before slicing and dicing those numbers with our own custom-built tools. We call this method Inside the Second, and we think it’s the gold standard for quantifying graphics performance. Accept no substitutes.

What’s more, we don’t rely on canned in-game benchmarks—routines that may not be representative of performance in actual gameplay—to gather our test data. Instead of clicking a button and getting a potentially misleading result from those pre-baked benches, we go through the laborious work of seeking out test scenarios that are typical of what one might actually encounter in a game. Thanks to our use of manual data-collection tools, we can go pretty much anywhere and test pretty much anything we want in a given title.

Most of the frame-time data you’ll see on the following pages were captured with OCAT, a software utility that uses data from the Event Tracing for Windows (ETW) API to tell us when critical events happen in the graphics pipeline. We perform each test run at least three times and take the median of those runs where applicable to arrive at a final result. Where OCAT didn’t suit our needs, we relied on the PresentMon utility.
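
For the curious, what follows is a minimal sketch of the sort of number crunching involved, not our actual in-house tooling; the CSV layout and the ms_between_presents column name are assumptions modeled on the kind of per-frame logs that OCAT/PresentMon-style capture utilities emit.

```python
# frame_stats.py: a toy illustration of "Inside the Second"-style number crunching.
# Not TR's actual tooling; the CSV layout below is an assumption modeled on
# the per-frame logs that OCAT/PresentMon-style capture utilities produce.
import csv
import statistics

def load_frame_times(path):
    """Return per-frame render times in milliseconds from a CSV capture."""
    with open(path, newline="") as f:
        return [float(row["ms_between_presents"]) for row in csv.DictReader(f)]

def summarize(frame_times_ms):
    """Boil one test run down to average FPS, the 99th-percentile frame time,
    and time spent past 16.7 ms (the 60-FPS threshold)."""
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds
    # 99% of frames were delivered at least this quickly.
    p99_ms = statistics.quantiles(frame_times_ms, n=100)[98]
    time_past_60fps = sum(t - 16.7 for t in frame_times_ms if t > 16.7)
    return {
        "avg_fps": avg_fps,
        "p99_frame_time_ms": p99_ms,
        "p99_fps": 1000.0 / p99_ms,  # the figure our value scatters plot against price
        "ms_spent_beyond_16.7_ms": time_past_60fps,
    }

if __name__ == "__main__":
    # Hypothetical capture file name; we take the median of at least three such runs.
    stats = summarize(load_frame_times("rx590_sottr_run1.csv"))
    for key, value in stats.items():
        print(f"{key}: {value:.2f}")
```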

As ever, we did our best to deliver clean benchmark numbers. Our test system was configured like so:

Processor Intel Core i9-9900K
Motherboard Gigabyte Z390 Aorus Master
Chipset Intel Z390
Memory size 16 GB (2x 8 GB)
Memory type G.Skill Flare X DDR4-3200
Memory timings 14-14-14-34 2T
Storage Samsung 960 Pro 512 GB NVMe SSD (OS), Corsair Force LE 960 GB SATA SSD (games)
Power supply Seasonic Prime Platinum 1000 W
OS Windows 10 Pro with April 2018 Update

Thanks to Intel, Corsair, G.Skill, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and EVGA supplied the graphics cards for testing, as well. Have a gander at our fine Gigabyte Z390 Aorus Master motherboard before it got buried beneath a pile of graphics cards and a CPU cooler:

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests. We tested each graphics card at a resolution of 1920×1080 and 60 Hz, unless otherwise noted.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Shadow of the Tomb Raider

Far Cry 5

Wolfenstein II: The New Colossus

The Witcher 3

Gears of War 4

Dota 2

A quick look at power consumption and noise levels

From the preceding pages, we know that the RX 590 can generally deliver higher average frame rates and lower 99th-percentile frame times than its arch-rival GeForce GTX 1060 6 GB. The question behind those numbers is just how much power the RX 590 has to burn to get there. To find out, we fired up Doom‘s Vulkan renderer, a notoriously hard-hitting load for any graphics card, and had a look at the opening vista of the game’s Foundry level. 

Perhaps the best thing that can be said about the RX 590’s contribution to a system’s power draw is that it’s not that much higher than that of the RX 580 8 GB, itself a card that was pushed hard on the voltage-and-frequency-scaling curve to deliver competitive performance against the GTX 1060 6 GB. Even so, the 12LP Polaris chip is consuming GTX 1070 Ti power to deliver its only-somewhat-superior-to-a-GTX-1060-6-GB performance levels. Performance-per-watt continues to be a challenge for Radeon graphics cards, and any gamer who wants to add one of these cards to a system may find they need a more powerful PSU than they would with a comparable GeForce.

For its part, however, XFX has done a fine job of quietly dissipating the extra 100 W or so that the RX 590 draws and converts into heat compared to our single-fan GTX 1060 6 GB. One thing our noise results don’t capture is that the Fatboy card has a piercing coil whine that’s easily audible above the soft sound of its fans. Coil whine is a common sound to hear from a graphics card, but even with that in mind, the XFX card’s song is lower-pitched than usual and sounds almost like the Radeon R9 Fury X’s liquid-cooling apparatus in operation. Those with sensitive ears may not find that sonic signature to their liking.

 

Conclusions


Our value scatters boil down the story of the RX 590 quite well. On the back of its clock-speed boost alone, the RX 590 capably steps into the wide gap left by the RX 580 8 GB and the GTX 1060 6 GB in today’s midrange graphics-card market. Perhaps more critically, the RX 590 is the first midrange card we’ve reviewed that clears the 60-FPS mark for 99th-percentile FPS in our value scatter. For 1920×1080 gaming at high or ultra settings, the RX 590 proves both swift and smooth. Can’t ask for much more than that.

For the first time in a long time for a Radeon launch, the move to GlobalFoundries’ 12LP process has apparently yielded enough performance headroom that AMD’s engineers didn’t feel the need to hurl this card over the shoulder of the voltage-and-frequency-scaling curve (or at least not any more so than it already was). With our XFX RX 590 on our test bench, I observed less than 20 watts’ extra draw on our power meter versus a hopped-up RX 580 8 GB. AMD is still nowhere close to matching Nvidia for performance per watt, but it’s nice to know that the RX 590 shouldn’t be much more demanding of power supplies or custom coolers to deliver its extra performance.

For all of the RX 590’s virtues, I wish AMD had found another $30 or so to trim from its suggested price tag. If the RX 590 had landed at $250 or so, it would be a total, 100%, no-brainer pick versus Nvidia’s GeForce GTX 1060 6 GB and a long-demanded shot in the arm for midrange graphics performance. Pricing for the RX 580 might have had to shift downward accordingly, but if AMD really wanted to get serious about upending the midrange graphics-card market, selling RX 580 8 GB cards for around $200 wouldn’t be a bad move. In fact, $200 is close to the going rate for entry-level RX 580 8 GBs right now, making the RX 590’s asking price pretty hard to swallow.

If the $280 asking price for the RX 590 persists, I fear buyers will be tempted to step all the way up to high-end graphics cards with superior performance potential. Our scatter plots above use suggested prices as a fixed point of reference, but real-world prices for high-end cards are quite a bit different right now.

Savvy shoppers can keep an eye out for a discounted RX Vega 56, GTX 1070, or GTX 1070 Ti for less than $100 more than the RX 590 seems set to command. The most affordable GTX 1070s on Newegg right now fall in the $340 to $350 range, and the most deeply discounted GTX 1070 Ti is just $360. ASRock is even selling a reference RX Vega 56 for $350 as I write this.

Those discounts will likely shift around as we move through the holiday season, but if they’re typical of graphics-card prices for the near future, patient folks who have some flexibility in their budgets may find that stepping up to a high-end graphics card just isn’t that hard on the wallet, and that’s bad news for the RX 590 at the moment.

All that said, discounts don’t last forever, and old stock of high-end graphics cards may not, either. If you’re shopping for a graphics card in the under-$300 range and were already considering spending $250 or $260 on a GTX 1060 6 GB or Radeon RX 580, I would spend the extra $20 or $30 on an RX 590 without hesitation. It shouldn’t come as a shock that this card comes TR Recommended.

Comments closed
    • DavidC1
    • 11 months ago

    Does it have to be explained every time why the frequency and voltage targets are conservatively set by the manufacturers?

    AMD, even with just 20% or so market share, is selling millions of these cards. 1% of cards not working properly or stably might not sound like much, until you realize that 1% means 30,000 units and potentially 30,000 affected customers, who are going to potentially RMA or return their purchase.

    Also, testing over a longer period of time, say 5-7 years, might show that certain undervolting results in instabilities under limited and niche conditions. Again, “limited” and “niche” are not really small numbers when you are selling millions of units.

    Worse than being somewhat worse than your competitor is being in the spotlight of the press talking about high return rates and complaints of games crashing.

    Nvidia, with its much better finances, is also advantaged here. They can afford to spend more on testing equipment and simulations to better predict how their card will behave in reality, so they might be able to clock it a little bit higher or lower its voltage a little bit more.

    • Rza79
    • 11 months ago

    [url<]https://nl.hardware.info/reviews/8851/19/amd-radeon-rx-590-review-geen-pensioen-voor-polaris-overklokken-en-undervolten[/url<] Undervolting results in this Dutch review. In an effort to see how much 12LP improves upon 14nm, they downclock an RX 590 to RX 580 clocks and decrease the voltage too. They go from 197 W to 146 W. In the same review, a GTX 1060 consumes 115 W. Sadly, they didn’t also decrease their RX 580’s voltage to make it a fair comparison.

    • Ninjitsu
    • 11 months ago

    Price looks okay vs the 1060 6GB, but it’s the price gap w.r.t the 580 that needs to be closed. Of course, matching the 1060’s price would have been a stronger move. And yeah, power consumption is high…

    • christos_thski
    • 11 months ago

    So if AMD is releasing a new mid-range part today, are we to assume that we’re still some ways off from midrange Navi?

    The RX 590 wouldn’t seem like a bad part at all if it weren’t for the power consumption and the price.

    As is, however, I’d rather go for either an RX 580 8GB or a 1060 6GB if I were looking for a midrange card. The price is simply not right.

    nVidia has royally screwed the pooch this gen around, and it would be a damn shame if AMD doesn’t utilize the chance to finally compete. We’re 20 or so months away from Intel’s discrete GPU, too. AMD doesn’t have all the time in the world.

      • Goty
      • 11 months ago

      I think the expected release date for Navi is still 2H 2019.

    • ronch
    • 11 months ago

    I’ll still wait until AMD comes up with a much more energy efficient design. Some folks may not care. Well, I care about my power bill and I feel better getting energy efficient products.

      • synthtel2
      • 11 months ago

      My 480 at 945mV is pretty much on par with the green team in perf/W. It’s tougher to recommend to someone who can’t do those tweaks if efficiency is a high priority, but I’m sure you can do them.

        • Srsly_Bro
        • 11 months ago

        Nice comment, bro. Under-volting is of interest to me with my computer on 24/7 except to update. My 2700k has had >99% uptime since 2012 and still going strong.

          • synthtel2
          • 11 months ago

          Unless you’ve got it crunching numbers 24/7, undervolting isn’t the place to look for that. The idle states don’t care how you’ve got it tuned under load.

        • ptsant
        • 11 months ago

        You have a very good sample. Best I have seen with my 480 is 975mV but need 1025-1050mV to be really stable. Can hit 580 clocks (1330+) at 1050mV, which is not bad.

          • Ruiner
          • 11 months ago

          Nice. 6 pin or 8 pin card?

            • ptsant
            • 11 months ago

            8-pin (Sapphire Nitro 480)

          • synthtel2
          • 11 months ago

          945 doesn’t give stock clocks, and probably doesn’t on most cards. I’m running 1160; 1190-1200 would probably be good if I wanted to push it, and it crashes once in a while at 1210. I think that makes it a pretty average sample (edit: that’s for 4**, I don’t know about 5**).

          It definitely isn’t quite going to keep up with a 1060 6G, but it also uses less power than one.

            • DPete27
            • 11 months ago

            I do 1235MHz/940mV….

            • Spunjji
            • 11 months ago

            Now that is a nice chip.

        • ronch
        • 11 months ago

        I actually have my FX-8350 undervolted to about 94% of its stock voltage. Thing is though, the chips from the other team can probably be undervolted as well.

          • Voldenuit
          • 11 months ago

          Yep, I’m running the 1080 Max Q in my laptop at ~780* mV and the 1080 Ti in my desktop at about 915 mV. The laptop GPU especially is noticeably cooler and quieter under load after the undervolt.

          *EDIT: I’m going to have to check the voltage on the MaxQ when I get home, but I remember it seemed an unreasonably low number at the time.

          • synthtel2
          • 11 months ago

          They can, but they don’t make it as easy and doing so hurts performance more (they’re closer to optimal to start with and on a flatter V/F curve).

    • Unknown-Error
    • 11 months ago

    You know there was a time when the AMD CPU charts were like the AMD GPU charts today. What a turn-around.

      • synthtel2
      • 11 months ago

      Like OG Zen era if OG Zen were trying to run 4.0 / 1.4V all-core?

      The tuning and launch price here are questionable, but the tech is fundamentally sound, much unlike Bulldozer.

      • DPete27
      • 11 months ago

      They still are for gaming

        • ptsant
        • 11 months ago

        Benchmarking 1080p with high-end GPUs is nice for illuminating the differences but not that relevant. Most users will be GPU-limited in real life and will not be playing 1080p at 200fps. (also, differences between 120 fps and 150 fps are not usually visually meaningful)

        The benchmark differences in GPUs translate much closer to real-life differences in gaming performance, in my opinion.

    • jihadjoe
    • 11 months ago

    The price/performance scatterplot is interesting. AMD clearly owns the midrange, but Nvidia has the high-end.

    • NovusBogus
    • 11 months ago

    It’s certainly not an embarrassment, but the power draw kills it for me. Twice as much as the competition translates to a fair amount of money over a 2-3 year period.

      • Demetri
      • 11 months ago

      The most surprising thing to me on the power draw graph is the 570. Within 20W of the 1060 3GB and comparable performance. So Polaris can be pretty efficient when they’re not trying to wring every last clock cycle out of it. I imagine a 12nm version would be pretty much equivalent in perf per watt if they didn’t touch the clocks.

        • NovusBogus
        • 11 months ago

        Yeah, the lower orders of Polaris are actually pretty reasonable. AMD’s decision to focus on the midrange cards certainly makes sense given what they have to work with.

      • Freon
      • 11 months ago

      The 97 W difference between a faster RX 590 and a 1060 6 GB, at 3 hr/day (21 hr/week) for an entire year and a prevailing rate of $0.12/kWh, works out to $12.75 per year. But the 590 is faster than the 1060 6 GB, so that’s not entirely fair. Compared to the 1070 it is $7.10/yr, which won’t pay for the 1070’s higher price after three years by any stretch.

      I really don’t think it’s that big of a deal. Variable sync cost is a MUCH bigger deal.
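
For anyone who wants to plug in their own numbers, here is a minimal sketch of the running-cost arithmetic in the comments above and below; the wattage gap, hours of use, and electricity rates are illustrative assumptions, not measured figures.

```python
# Back-of-the-envelope running-cost difference between two graphics cards.
# All inputs are assumptions; swap in your own wattage gap, hours, and rates.
def annual_cost_delta(extra_watts, hours_per_day, price_per_kwh):
    """Extra electricity cost per year for a card drawing `extra_watts` more under load."""
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * price_per_kwh

# Freon's example: 97 W extra, 3 hours of gaming per day, $0.12/kWh.
print(f"US example: ${annual_cost_delta(97, 3, 0.12):.2f} per year")

# Chrispy_'s example: ~100 W extra, ~1200 h per year, EUR 0.20/kWh.
print(f"EU example: EUR {annual_cost_delta(100, 1200 / 365, 0.20):.2f} per year")
```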

        • Chrispy_
        • 11 months ago

        Closer to €0.20 per kWh in Europe, which means that 100 W extra at 1200 h of annual usage is about €25 a year, so €75 for a typical three-year useful lifetime. That more than covers the difference between an RX 590 and a 1070, which [i<]still[/i<] draws less power than an RX 590 despite being significantly faster.

          • Rza79
          • 11 months ago

          1200h means 3.2 hours of gaming every day of the year. Isn’t that a bit excessive? I know some people do that, but that’s surely not the norm. Non-gaming power usage is similar to nVidia’s.
          I would say 500h a year of gaming is a better average for a normal (not in high school) user, and in Europe that would be 10 Euro.

        • f0d
        • 11 months ago

        i cry when i see the price of power in (im guessing) america

      • DPete27
      • 11 months ago

      Depends on how much you game. At $0.14/kWh it’s 14 cents extra per 10 hours of gaming.
      And that’s before the trivially easy task of undervolting at stock clocks, which would likely save 30 W or more by my estimation.

        • Freon
        • 11 months ago

        There’s a reason AMD doesn’t set the voltage lower from the factory…

          • jihadjoe
          • 11 months ago

          And clearly it’s because Radeon Technologies Group is full of idiots who do not care to have their GPUs run quietly or efficiently!

            • Star Brood
            • 11 months ago

            When I bought my 7850 it destroyed the Fermi generation before it by every metric. That was a great time. Now nVidia are the perf/w leaders. Hopefully whatever Navi is or whatever they choose the next actual GPU generation to be is a massive leap in the right direction.

    • albundy
    • 11 months ago

    Did they at least improve the VCE HEVC hardware encoder?

      • ptsant
      • 11 months ago

      They haven’t changed a single transistor, so I don’t see how it could be any different. I thought it worked OK.

        • Action.de.Parsnip
        • 11 months ago

        Strictly speaking they changed every single one.

          • ptsant
          • 11 months ago

          In size, not function.

    • Jeff Kampman
    • 11 months ago

    I’ve added a brief section on noise and power consumption to the review. Enjoy!

    • moose17145
    • 11 months ago

    I also see that right now Newegg has an ASRock Vega 64 for $410 USD, and Sapphire for $440.

    They are reference designs / coolers… but still that is not a bad price…

    • Jeff Kampman
    • 11 months ago

    Apologies for the state of this article this far on, but after doing all-nighters for both a CPU review and a graphics-card review in the same week, my body decided it was time to pull a Windows 10 and do a forced shutdown for a while. I’ll be resuming work on it shortly.

      • synthtel2
      • 11 months ago

      Good things are worth waiting for. Take care of yourself!

      • derFunkenstein
      • 11 months ago

      Don’t kill yourself over it, man. It’s just hardware. It’s not going anywhere. 🙂

      • Chrispy_
      • 11 months ago

      It’s the weekend, and it’s cold. That sounds like the perfect opportunity for a duvet day to me 🙂

        • derFunkenstein
        • 11 months ago

        I always get duvet and bidet confused.

          • K-L-Waster
          • 11 months ago

          Hopefully you’re getting the words mixed up, not the objects… 0_0

            • derFunkenstein
            • 11 months ago

            Well I was trying very hard not to picture somebody sitting on the can.

    • Mat3
    • 11 months ago

    So they didn’t increase memory speeds? Polaris always seemed somewhat bandwidth constrained already.

    • ermo
    • 11 months ago

    Makes you wonder whether AMD will at some point release 12nm LP RX Vega 56/64 SKUs and reserve the 7nm process for the server segment CPUs and GPUs in the near term?

    • Chrispy_
    • 11 months ago

    Thanks for the numbers Jeff, looking forward to the power consumption figures. Did you do any undervolting or testing at reduced power limits?

    If other sites are anything to go by, AMD have shot themselves in the foot [i<]yet again[/i<] by overvolting the 12LP process to the absolute limit just to eke out another few MHz. Apparently, 12LP requires significantly less voltage than 14nm at the same clocks, and a 1400MHz GPU at 150W would genuinely make it more appealing than a 1550MHz card that I assume runs hotter than the claimed 225W TDP (AMD have understated their TDPs for all desktop Polaris designs so far).

    The extra power requirement means much larger/more expensive coolers and power supplies, effectively handing the SFF market and potential sales to owners of prebuilt PCs to Nvidia. Not only is the large cooler undesirable for compatibility reasons, it’s also more expensive to produce, package, store, and distribute. All of these large heavy coolers mandate even further collateral cost, because the card will need reinforcing to support the weight.

    When I look at the most popular variants of the mainstream cards, it seems to be the ones with 5-star (or egg) reviews that cite cool and quiet running. The cheaper cards offering the best performance/$ are usually lower-scoring [i<]and[/i<] there are fewer reviews, indicating that sales aren’t anywhere near as good. It’s clear to me that [b<]people are prepared to pay extra for a cool, quiet, and efficient card[/b<] that doesn’t cause all the fans in their case to rev up and drown out their game sound. I guess that’s because a lot of people game with speakers rather than headphones, but don’t want to hear their PC making a racket. That’s my preference at least, and I don’t think I’m alone.

    I’m just wondering when AMD will work that out, and whether they realise that poor marketshare on the desktop isn’t just about performance/$. The product has to be tolerable to live with, too.

      • chuckula
      • 11 months ago

      If every AMD card could be massively undervolted with no negative effects somebody at AMD would have done it by now. The logical answer is that while some AMD cards can be undervolted, plenty of others (probably at least a third and maybe more) can’t and AMD wants to keep yields reasonable.

        • DPete27
        • 11 months ago

        A 50mV undervolt across the range of frequencies on the GloFo 14nm process was pretty standard, regardless of chip lottery. It seems AMD found the best freq/voltage curve and applied a +50mV offset, most likely as a safety margin for stability. That may not seem like much, but I’m usually able to cut ~20% power usage by undervolting Polaris cards.

        Of course, it’s yet to be determined how the 12nm cards will behave, but my guess is that they’ll also share the same +50mV offset as their predecessors.

        Also the tipping point on the GloFo 14nm process is around ~945mV. Would be interesting to see where their 12nm process bends.

          • Chrispy_
          • 11 months ago

          Early reports say a little higher, about 25-50mv – but the good news is that this tipping point occurs a good 150-200MHz faster than it does with 14nm.

          I assume that means we’ll start to see 1.4GHz@1.00V as a common underclock target for the 590.

          Time will tell!

            • MOSFET
            • 11 months ago

            I really, really don’t think many people underclock/undervolt video cards / GPUs. (I realize the two guys above me in the subthread do undervolt.)

            The 580 8GB looks decent on the scatter plot, especially when combined with the new <$200 Newegg (USD) sale prices.

            • synthtel2
            • 11 months ago

            Add me to the list. I don’t even mind heat/noise/power under load that much, I just care about the tiny performance difference even less.

            Undervolting AMD cards is really, really easy. If you can install Windows on a computer you built, you can undervolt.

            • ptsant
            • 11 months ago

            Undervolting with AMD Wattman is relatively painless. Not saying that you should, but since it’s less dangerous than overclocking, I don’t see the harm in trying.

        • tay
        • 11 months ago

        Yes. My RX 470 didn’t like being undervolted one bit. It was a shit chip, which you could tell by running some software that would report back chip quality (forget which one).

        • Chrispy_
        • 11 months ago

        No, that’s not how Polaris works, at all. Every Polaris card has multiple performance states, each of which has voltages to match the clocks. The higher the clock state, the higher the voltage. AMD exposes this freely in their driver, and even the worst of the yields that manage to make it into a production card will operate about 150MHz slower for 150mV less.

        That’s not yields, and that’s not undervolting, that’s the default behaviour of all Polaris cards using stock clocks and voltages, just at lower performance states. Power consumption is proportional to the [i<]square[/i<] of the voltage, so a relatively small drop in voltage causes a really big drop in power consumption.

        As has been said many a time by many a reviewer, AMD always push their desktop GPUs well outside the sweet spot on the clock/voltage curve. If Intel sold a Coffee Lake i7-8700K+ that was 10% faster than the standard 8700K but needed a new board with a 250W socket, people would call them out for it. The sweet spot for Coffee Lake seems to be 3.8-4.3GHz depending on yields, and the bad yields become low-end i5 models like the 8400. Intel know that TDP and efficiency matter, which is why they don’t set the stock voltage/all-core boost/TDP any higher.

        My issue is that the 12LP process is the silver bullet for AMD’s non-trivial performance/Watt problem. Initial reports say that it’s 20-25% more efficient than GF 14nm at the 1.4GHz range, before we even attempt to undervolt it. Rather than using the new 12LP process to fix their most serious problem, AMD have just turned everything up to eleven and put their fingers in their ears 🙁
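
To put rough numbers on that square-of-the-voltage point, here is a minimal sketch of the usual first-order dynamic-power approximation (P scales with f·V²). The example clocks and voltages are illustrative assumptions, not measurements of any particular Polaris card, and the approximation ignores static/leakage power.

```python
# First-order dynamic-power model: P scales roughly with frequency * voltage^2.
# Example voltages/clocks below are illustrative, not measurements of any real card,
# and static/leakage power is ignored.
def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Dynamic power of a (clock, voltage) point relative to a reference point."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

REF_CLOCK, REF_VOLTS = 1545, 1.15  # hypothetical stock operating point

undervolt = relative_power(1545, 1.05, REF_CLOCK, REF_VOLTS)    # same clock, -100 mV
lower_state = relative_power(1400, 1.00, REF_CLOCK, REF_VOLTS)  # lower performance state

print(f"Undervolt at stock clocks: ~{(1 - undervolt) * 100:.0f}% less dynamic power")
print(f"1400 MHz at 1.00 V:        ~{(1 - lower_state) * 100:.0f}% less dynamic power")
```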

        • Concupiscence
        • 11 months ago

        Bingo. This had to affect the AMD FX series chips too; while I had good luck in undervolting, it’s entirely possible that I just had good luck. Nobody at AMD *wanted* to sell CPUs that took 50% more power to run than the Intel chips they unevenly competed against.

      • K-L-Waster
      • 11 months ago

      “Cool and Quiet” doesn’t fit with the marketing message of “Etreeeeem Gamerrr Overclocking Dominashunnnn!!!!!” though.

        • DPete27
        • 11 months ago

        “[s<]Fast[/s<] Mediocre & Loud"

        • Chrispy_
        • 11 months ago

        No but if you look at those independent customer review sites used by many online retailers, you’ll see that the cheap noisy cards don’t get anywhere near as many reviews as the more expensive quiet ones.

        Clearly, the marketing message is distanced from reality – in which a quiet cooler seems to be the one common factor in models that sell well.

          • K-L-Waster
          • 11 months ago

          I was of course being somewhat facetious…

          To put it another way, they sell cards clocked like this for the same reason they style the things like Decepticons and load RGB LEDs on every exposed millimeter. They think people want consumer PC gear to look like something a 12 year old with ADHD would design.

      • Goty
      • 11 months ago

      A lot of the issue of heat and noise is down to cooler design. Evidently the Sapphire variant that TechPowerUp tested is quieter than the best 1060 models they’ve tested, comes with a factory OC, and is priced right at MSRP. The only concern at that point is power consumption, which is nothing new for the 480/580 cards.

      • Ruiner
      • 11 months ago

      Seconded. I’d be curious to see power usage for an undervolted 590 at 480 clocks.
      They are shooting for 1060 6gb performance and prices, it seems. Such a beast would have to be priced much lower but might make for a single slot compact card.

      • synthtel2
      • 11 months ago

      100% agreed. This could have competed solidly with a 1060 6G on both performance and power consumption, if AMD thought it worthwhile.

        • Goty
        • 11 months ago

        The 580 is already faster and cheaper than the 1060 6GB; would competing on power consumption really sell a lot of cards? I don’t think it would, personally.

          • synthtel2
          • 11 months ago

          I would agree, but power consumption is pretty much the only point Nvidia is actually winning in that range. If I’m to assume Nvidia plays fair, then power consumption has to by elimination be the big reason GP106 outsells Polaris *0 10:1.

          (I don’t think Nvidia plays fair or even remembers what playing fair would mean, for the record.)

            • Goty
            • 11 months ago

            I’d be surprised if the vast majority of people using midrange GPUs have even the slightest idea how much power their GPUs consume. It’s all about brand reputation and NVIDIA wins that battle for any number of reasons, deserved or not.

            • synthtel2
            • 11 months ago

            Sure, but that brand reputation doesn’t spring out of thin air. How many even here think that Polaris’ power consumption reflects fundamental problems with GCN? How many even here think that perf/W differences of this magnitude have more to do with architecture than tuning? I see these sentiments regularly in this comment section, and I’m sure they’re only more common elsewhere.

            Nvidia does plenty of dirty and aggressive PR, but they’ve still got to have something they’re winning at for it to work. They’re no strangers to manufacturing reasons (Crysis 2 tessellation, Witcher 3 HairWorks, etc), but I’m sure they’re not going to complain about AMD repeatedly throwing them this particular softball.

            • Goty
            • 11 months ago

            *EDIT* Sorry. Sleepy post. Basically said the exact same thing as above.

            • Spunjji
            • 11 months ago

            You’re right that brand reputation doesn’t spring out of thin air, but it’s actually quite difficult to pin down where it really does come from. Most reasons I see given for a brand preference for Nvidia (individual products aside) come down to the usual “better drivers” and “highest quality products” that doesn’t really have any basis in reality, given how both sides have had on-and-off again problems in both areas. When I do see technical reasons given it tends to look a lot like post-hoc reasoning as justification for an expensive purchase.

            The most telling part in all of it is how the notion of “quality” shifts around all the time. In the 200 and 400 series days, a lot of “enthusiasts” claimed to not care at all about power draw in favour of absolute performance leadership to the tune of 5-10fps. Now I’m suddenly seeing people on other sites banging on about the extra cost of a higher-wattage PSU, about fan noise, about all sorts of stuff that didn’t matter when it was Nvidia’s problem. The same things applied to comparisons between AMD and Intel CPUs in the P4 days.

            I’m not inclined to think it’s a conspiracy, I’m just not inclined to think there’s much of a rigorous, scientific mindset behind it either.

            • synthtel2
            • 11 months ago

            The notion of quality shifting around all the time is actually one of the things that makes me think foul play is more likely – across all of PC hardware, >50% of that seems to be about GPUs and in Nvidia’s favor. Everywhere else, there are the occasional obvious fanboys and most of the rest seem to be able to give fairly consistent reasons for buying what they buy, if not always reasons TR readers would find sufficient. For Nvidia in particular, popular perception always does whatever it takes to say they can do no wrong, even on the occasions it hasn’t taken much knowledge to see that they were doing wrong.

            Of course there are plenty of legitimate and fair factors that contribute to this and Nvidia is on top of every one of them, but I don’t think they could possibly be enough, and Nvidia has already been caught doing plenty of underhanded things including hiring forum shills. If they didn’t have a history, I’d be happy to give them the benefit of the doubt, but they do and I’m not.

            • Voldenuit
            • 11 months ago

            nvidia doesn’t get a free pass on hot chips. Remember the 5700/5800 Ultra debacle, and Fermi being christened “Thermi”?

            Mind you, AMD has had their share of hot chips, including the X1900 XTX and HD 2900 XT, and those weren’t well regarded either.

            • Krogoth
            • 11 months ago

            Big Pascal is a toasty beast when fully loaded. However it is overlooked because it has performance to back it.

            • Voldenuit
            • 11 months ago

            Euh. My 1080Ti runs at 60-65C with an overclock and liquid metal. Even before the liquid metal, it wouldn’t thermal throttle. And its power consumption is less than Vega 64, with more performance.

            • auxy
            • 11 months ago

            [quote<]Most reasons I see given for a brand preference for Nvidia (individual products aside) come down to the usual "better drivers" and "highest quality products" that doesn't really have any basis in reality, given how both sides have had on-and-off again problems in both areas.[/quote<]

            “Both companies have had driver issues” does not equate to “their driver quality is the same”, not even close. (・へ・)

            AMD’s Radeon drivers are still significantly worse. Weird bad performance in games that run fine on lesser NV hardware, broken features, stability problems, compatibility problems, you name it. And those are just the major issues! I LOVE Radeons, but they are simply and clearly a budget option next to the premium GeForce brand.

            • Krogoth
            • 11 months ago

            They have both suffered from stupid, stupid software and hardware issues over the years. Only die-hard fanboys pretend that their favorite brand doesn’t suffer from them.

            • Krogoth
            • 11 months ago

            No, it is because of brand name.

            Nvidia is simply a far more powerful brand and has far stronger mindshare than AMD RTG in gaming circles.

            Tahiti simply couldn’t put a dent into Kepler despite being superior at every tier at the time. The following Maxwell gave Nvidia a decisive advantage in every metric.

      • Mr Bill
      • 11 months ago

      I think you’re being a bit hard on the power numbers. Power is just a touch above the 1070 and 1070 Ti, and is 66-76 W below (as it should be) the 1080 and 1080 Ti.

    • chuckula
    • 11 months ago

    At first I was like.. this won’t destroy Turing!

    Then I remembered: This will destroy Arctic Sound!

    Thanks AMD!

      • Krogoth
      • 11 months ago

      It needs more glue……

        • chuckula
        • 11 months ago

        Don’t worry grasshopper.
        What you don’t realize is that Navi is the ancient Sumerian name for “Gorilla Glue”!

        16 chipper Navi….. CONFIRMED!

          • chuckula
          • 11 months ago

          Pro tip: when forced against your will to use an IPhone while traveling, remember that the RDF doesn’t recognize chiplet and changes it to chipper automatically.

          #WoodChipper4Life

            • Krogoth
            • 11 months ago

            Apples grow on trees and iPhone auto-corrects chiplet to chipper.

            Coincidence? I think not.

    • derFunkenstein
    • 11 months ago

    [quote<]If you'd rather not page through reams of frame-time data, you can skip ahead to the conclusion at leisure.[/quote<] Who is coming to TR and NOT at least flipping through every freakin' graph? It's, like, what the site is known for. 😆

    • Techonomics
    • 11 months ago

    “For 1920×1080 gaming at high or ultra settings, the RX 590 proves both swift and smooth.”

    I could have missed something, but it wasn’t immediately obvious what resolution you tested the cards at, starting on the first page. The above quote is from the conclusions. Of course, I deduced it was 1080p since I’m familiar with your graphs (being a long-time TR reader), plus given the cards being tested.

    Just an FYI. 🙂

      • Voldenuit
      • 11 months ago

      This.

      I wish TR would put the tested resolution in the title of every graph.

      Leaving it off is as bad as not labelling axes…

        • DPete27
        • 11 months ago

        They typically include a screenshot of the Video Settings for each game at the top of the respective page before the performance graph. There’s also usually WORDS on the performance metrics pages. This article is a work in progress, be patient.

          • MOSFET
          • 11 months ago

          Going by past reviews, the Settings screenshots don’t always have the resolution. Love TR, but I agree here.

            • Voldenuit
            • 11 months ago

            Also, every game’s settings page looks different, so it takes extra time to decipher, and since it’s usually a screen-grab, it’s also not information you can Ctrl+F to find.

      • DPete27
      • 11 months ago

      Since the GTX1070 produced 63fps in Witcher 3 on Ultra at 1440p and 93 in this article, you may be right.

      ….Either that, or Jeff isn’t testing at Ultra settings anymore (please let this be the case)

        • Voldenuit
        • 11 months ago

        If long time readers are having trouble figuring out exactly what is being tested and how, maybe it’s a good idea to add nine ***-dang characters to the top of each graph? e.g. 2560×1440, 1920×1080.

          • techguy
          • 11 months ago

          I agree. Labeling graphs is a good idea.

    • DPete27
    • 11 months ago

    According to anandtech, the RX590 needs only 1.156V to hit those clocks, whereas the RX580 needs 1.163V at 200MHz less.

    Color me impressed and surprised.

    OTOH:
    [quote<]Polaris 30 is well past the optimal point on the voltage curve with the clocks at hand.[/quote<]

      • ptsant
      • 11 months ago

      Meanwhile, my RX480 needs only 1.05V to run at 1266MHz. And it’s not even a “good” sample. The 590 can probably be undervolted to 0.95V-1.0V at 1400MHz. Jeff, willing to try this?

        • DPete27
        • 11 months ago

        Mine is 1305MHz/1020mV but that’s as low as I can go for a 24hr+ stress test. It gets mad if I take away even 10mV more.

    • Demetri
    • 11 months ago

    They’re offering a pretty nice game bundle starting today. DMC5, Resident Evil 2 Remake, and The Division 2. If you get a 590 or Vega you get all 3, or pick 2 with a 570 or 580.

    I agree, the price is too high, especially with discounted Vegas on the high end starting at $350, and 580s on the lower end for $200 or less.

      • Voldenuit
      • 11 months ago

      I’m expecting a fire sale on RTX cards, with ripple down effects through the supply chain, because after the BF V RTX benchmark fiasco, nobody’s going to be crazy enough to pay $500 for a RTX 2070 that can’t do RTX, or $800 for a 2080 to play at 1920×1080.

        • Krogoth
        • 11 months ago

        Fire sale for existing Turing SKUs wouldn’t happen until Pascal stock dries up and Nvidia introduces lesser Turing SKUs in their place.

        • NovusBogus
        • 11 months ago

        I was rather looking forward to picking up a Turing x70 that outperforms the 1080/1080ti at equal or lower power draw for less money like how next-gen GPUs normally work. Instead they’re offering something that roughly matches the 1080 for more money and more power consumption, bundled with the promise of some highly experimental early-adopter features that might be proven out around the time the card is going obsolete. Mega snore.

        As it stands I’m probably going to get a Zotac 1080 mini during the holiday sale season and fly over this whole mess.

      • stdRaichu
      • 11 months ago

      After seeing the reviews of the 590, and also seeing that Vegas were now selling for sane prices in the UK, I bought a Vega 56 (which I plan to underclock since I don’t need top-tier 3D performance but I can certainly use the extra VRAM) and was quite surprised to be sent the games doofer a few minutes after completing the purchase (it wasn’t listed on the product page).

      Shame the games are all windows-only though…!

        • Freon
        • 11 months ago

        There have been great deals on the Vega 56/64 in the US as well. I don’t think they’re bad options. You’ll save $200 or more on the Freesync version of your favorite Gsync gaming monitor while you’re at it.

        TR’s aggregate shows a slight edge for the 1070 over Vega 56, but other sites using a different mix of games show it leaning slightly in the other direction. They’re close enough to call it a draw IMO.

    • Krogoth
    • 11 months ago

    I give it 1 Krogoth. It is too little, too late.

    Nvidia is going to be unleashing its “1060 Ti” (basically, 3/4 of a 1080), which will easily match or best the 590 at the same price point, while lesser Turing SKUs are coming down the pipe in 2019.

    If AMD RTG had a “590” a year ago to go along with the Vega launch, it would have made more of an impact. These days, consumers in the mid-tier have already settled for a 1060 6GiB or 580. They are going to be waiting for lesser Turing SKUs and Navi to come out.

      • ptsant
      • 11 months ago

      True, nobody is going to “upgrade” from a 580 (not even a 480, most likely) or 1060 to a 590.

      This card is an upgrade for 460/560/470/570, 1050[Ti] and slower cards.

        • Krogoth
        • 11 months ago

          Those buyers are probably waiting for lesser Turing or Navi before pulling the trigger, unless their current card happens to bite the dust.

          • ptsant
          • 11 months ago

          As an owner of a RX480, I am hoping for Black Friday deals on Vega 56. Still, not very likely.

      • Anonymous Coward
      • 11 months ago

      Well, it’s also a low-effort product by AMD that will (one imagines) be possible to produce at competitive prices. I expect to see them get aggressive with the pricing on this silicon at some point. The most interesting thing to me is that they put this at the top of the 5xx line at this late stage.

      • willmore
      • 11 months ago

      As someone who has had a 580 since October 2017 (before prices went crazy), I have to agree. I don’t see much market value in adding in this card at this price point.

      What I suspect, and which Jeff mentions in the introduction, is that this is being made to shift silicon usage to the contractually obligated GF process as more higher-end production moves to TSMC 7nm.

      An article on that would be very informative. 🙂

    • ptsant
    • 11 months ago

    Thanks for the review, I was impatient to read it!

    I generally agree with your conclusions. At $280 it makes more sense to step down to the 580 (often at $200 or so) or step up to a Vega 56 on sale. However, the lower power use, compared with the 580, is probably worth a small $$ premium. Also, if the past is any indication, I hope we will quickly see cards selling below MSRP, closer to $240, which would be an ideal price for a midrange card that can cut through any title at 1080p.

    PS OC potential may also tilt the equation a bit
    PS2 Some AIB cards may come with faster GDDR5 (apparently 8.5 Gbps)

    —-

    Update: I peeked at the power numbers at AnandTech. Unfortunately, it seems that the 590 consumes even more power than the 580 and a full 70 W more than the RX 480 playing BF1. Very disappointing and certainly not justified by the level of performance (for ex: +70 W for +8 fps in BF1). Am waiting for your power consumption estimates.

      • Krogoth
      • 11 months ago

      It is not really that much of a shock. GCN simply cannot match Nvidia GPUs at power efficiency. It is foolhardy to think that 12nm process would have made a difference.

        • ptsant
        • 11 months ago

        Well, here the performance/W didn’t move at all. It should have improved, even a little.

        Some are saying that the actual chip design remained the same and that they could have used some sort of “optimized” 12nm design (for the same logic, ie the same transistors) but they didn’t. If I understand that correctly, they didn’t even adapt GCN to 12nm, but simply reprinted.

      • Mr Bill
      • 11 months ago

      [quote<]With our XFX RX 590 on our test bench, I observed less than 20 watts' extra draw on our power meter versus a hopped-up RX 580 8 GB. [/quote<]

        • ptsant
        • 11 months ago

        Which was already ridiculously power-hungry compared with the RX 480…

      • Freon
      • 11 months ago

      Yup, 40% higher price for 5-8% more performance is definitely a tough sell given current street prices. It’s just not hard to find a 580 for $199 these days by pressing F5 on /r/buildapcsales

      I guess time will tell what happens with pricing.

      • meerkt
      • 11 months ago

      Another reference point: a few weeks ago Amazon was selling a 2 fan 1070 for $290.

        • ptsant
        • 11 months ago

        I agree, the 1070 is definitely the superior card but the cost of G-sync is more significant in the low/mid end than in the high-end (2080 etc).

          • meerkt
          • 11 months ago

          Once G-sync runs out of steam, I secretly expect Nvidia to enable VESA adaptive sync in the last few GeForce generations.

            • Krogoth
            • 11 months ago

            They already do support VESA’s adaptive sync on mobile flavors of post-Kepler GPUs. It is just a matter of enabling it in firmware/drivers for desktop chips.

            • Spunjji
            • 11 months ago

            I fully expect them not to; the capability has been there for years now and yet they stoically refuse. I hope you’re right, though!

            • Krogoth
            • 11 months ago

            Nvidia wants to set up a vertical monopoly on the gaming market. They still feel that G-Sync 1.0 in its current state can pull it off.

        • albundy
        • 11 months ago

        yikes! thats way too much for me! wait for an ebay 15% off code. they seem to be coming at the end of every month. i bought a used EVGA dual fan 1070 for $200, and paid $170 after the code.

    • derFunkenstein
    • 11 months ago

    That’s quite a nice performance uptick over the RX 580. It doesn’t catch the 1070 but it clearly slams the door on the 6GB GTX 1060 (itself retailing for [url=https://www.newegg.com/Product/Product.aspx?Item=9SIA6ZP56T2897<]around $280-290 for some variants[/url<], so I don't have an issue with the price). Hopefully it doesn't come at too much of an increase in power draw.

    The [url=https://www.notebookcheck.net/XFX-Radeon-RX-590-Fatboy-graphics-card-details-revealed.361121.0.html<]picture of the card on Videocardz[/url<] appears to show an 8-pin and a 6-pin PCIe connector.

    edit: all the dual-fan EVGA GTX 1060 6GB cards on Newegg push up over $290 to $320 or so.

      • derFunkenstein
      • 11 months ago

      Oof.

      [url<]https://www.anandtech.com/show/13570/the-amd-radeon-rx-590-review/15[/url<] The stock 590 draws 30W more than the 580 while playing Battlefield 1. That's a full 110W more (at the wall, so probably 95-100W actual) than the 1060 6GB. Temps and noise are decent, at least.

        • Jeff Kampman
        • 11 months ago

        I tested system power draw using [i<]Doom[/i<]'s Vulkan renderer and observed 290 W at the wall for the RX 590 and 190 W for the GTX 1060 6 GB. I'll get that graph in later, but the move to 12LP does not change the fact that AMD is way behind on perf/watt.

          • derFunkenstein
          • 11 months ago

          Thanks, Jeff!

      • FuturePastNow
      • 11 months ago

      I guess my hope is that this results in a mini price war that reduces the cost of the 1060 6GB and maybe also the 570 and 580.

      • anotherengineer
      • 11 months ago

      more value added possibly?

      [url<]https://www.techpowerup.com/249610/amd-launches-raise-the-game-fully-loaded-bundle-offers-up-to-three-free-games-if-you-buy-a-radeon-rx[/url<]

        • derFunkenstein
        • 11 months ago

        Yeah, definitely. It’s quite a deal.
