AMD’s Radeon RX Vega 64 and RX Vega 56 graphics cards reviewed

AMD’s Vega for gamers is finally here. The Radeon RX Vega 64 and RX Vega 56 mark AMD’s return to the high end of the graphics-card world. After a long, long stretch wherein 2015’s Radeon R9 Fury X and R9 Fury were asked to hold the fort against Nvidia’s Pascal onslaught, the company is relieving them with what promises to be competition for Nvidia’s long-dominant GeForce GTX 1080 and GeForce GTX 1070.

The Vega 10 GPU that’s riding in on those cards is a massive and massively complex piece of silicon. It packs 12.5 billion transistors into a 486 mm² die fabricated on GlobalFoundries’ 14nm LPP FinFET process. (For comparison, the Nvidia GP102 chip aboard the GTX 1080 Ti and friends is a similarly massive 12 billion transistors on a 471 mm² die.) The chip’s compute resources are organized into 64 “Next-gen Compute Units,” or NCUs. Each of these hosts 64 stream processors, for a total of 4096. The full Vega 10 chip has 256 texture units and 64 ROPs, too.

While the basic organization of Vega 10 may sound similar to Fiji before it, the similarities largely end at those broad outlines. I would love to explore Vega’s many changes and capabilities in more depth, but when you have two days and change to review two brand-new graphics cards in the wake of testing and writing for a CPU review, some things have to be left on the cutting-room floor, and a deep dive into Vega’s new talents is one of them. We’ve known broadly what the new bits of Vega would be since January, however, so my architecture introduction is as good a place as any to start if you need to catch up. I’ll be trying to add more information to this article as time goes on, but AMD should have a white paper available soon with full architectural details if you want to know much, much more.

The Radeon RX Vega 64 and RX Vega 56

The implementations of the Vega 10 GPU that will be available to consumers were revealed at SIGGRAPH a couple weeks ago, but to recap, AMD will be selling the fully-enabled Vega 10 GPU aboard the Radeon RX Vega 64 in both air- and liquid-cooled varieties. The Radeon RX Vega 56, on the other hand, loses eight NCUs to the world’s tiniest chainsaw, and it’ll only be available as an air-cooled card in its reference form. Curiously, AMD left all 64 ROPs intact on both the RX Vega 64 and RX Vega 56, meaning the cuts to the 56 may hurt less than they otherwise might.

| | GPU base clock (MHz) | GPU boost clock (MHz) | ROP pixels/clock | Texels filtered/clock | Shader processors | Memory path (bits) | Memory bandwidth | Memory size | Board power |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GTX 970 | 1050 | 1178 | 56 | 104 | 1664 | 224+32 | 224 GB/s | 3.5+0.5 GB | 145W |
| GTX 980 | 1126 | 1216 | 64 | 128 | 2048 | 256 | 224 GB/s | 4 GB | 165W |
| GTX 980 Ti | 1002 | 1075 | 96 | 176 | 2816 | 384 | 336 GB/s | 6 GB | 250W |
| Titan X (Maxwell) | 1002 | 1075 | 96 | 192 | 3072 | 384 | 336 GB/s | 12 GB | 250W |
| GTX 1070 | 1506 | 1683 | 64 | 120 | 1920 | 256 | 259 GB/s | 8 GB | 150W |
| GTX 1080 | 1607 | 1733 | 64 | 160 | 2560 | 256 | 320 GB/s | 8 GB | 180W |
| GTX 1080 Ti | 1480 | 1582 | 88 | 224 | 3584 | 352 | 484 GB/s | 11 GB | 250W |
| Titan Xp | 1480? | 1582 | 96 | 240 | 3840 | 384 | 547 GB/s | 12 GB | 250W |
| R9 Fury X | 1050 | n/a | 64 | 256 | 4096 | 4096 | 512 GB/s | 4 GB | 275W |
| Radeon RX Vega 64 (air-cooled) | 1247 | 1546 | 64 | 256 | 4096 | 2048 | 484 GB/s | 8 GB | 295W |
| Radeon RX Vega 64 (liquid-cooled) | 1406 | 1677 | 64 | 256 | 4096 | 2048 | 484 GB/s | 8 GB | 345W |
| Radeon RX Vega 56 | 1156 | 1471 | 64 | 224 | 3584 | 2048 | 410 GB/s | 8 GB | 210W |

Both cards also ship with 8GB of HBM2 memory on board. That memory communicates with the Vega 10 chip across a 2048-bit bus. AMD doesn’t disclose as much, but on the RX Vega 64 cards, that HBM2 runs at an effective rate of 1890 MT/s, and on the RX Vega 56, it runs at about 1600 MT/s.
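
For those keeping score at home, those bandwidth figures fall straight out of the bus width and the effective transfer rate. Here's a quick back-of-the-envelope check, sketched in Python using the transfer rates quoted above:

```python
# Back-of-the-envelope check of Vega 10's HBM2 bandwidth figures.
BUS_WIDTH_BITS = 2048  # two stacks of HBM2, 1024 bits each

def peak_bandwidth_gb_s(transfer_rate_mt_s):
    """Peak bandwidth in GB/s for a given effective transfer rate in MT/s."""
    bytes_per_transfer = BUS_WIDTH_BITS / 8
    return bytes_per_transfer * transfer_rate_mt_s / 1000

print(peak_bandwidth_gb_s(1890))  # RX Vega 64: ~484 GB/s
print(peak_bandwidth_gb_s(1600))  # RX Vega 56: ~410 GB/s
```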

| | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Peak rasterization rate (Gtris/s) | Peak shader arithmetic rate (tflops) | Memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- |
| Asus R9 290X | 67 | 185/92 | 4.2 | 5.9 | 346 |
| Radeon R9 295 X2 | 130 | 358/179 | 8.1 | 11.3 | 640 |
| Radeon R9 Fury X | 67 | 269/134 | 4.2 | 8.6 | 512 |
| GeForce GTX 780 Ti | 37 | 223/223 | 4.6 | 5.3 | 336 |
| Gigabyte GTX 980 | 85 | 170/170 | 5.3 | 5.4 | 224 |
| GeForce GTX 980 Ti | 95 | 189/189 | 6.5 | 6.1 | 336 |
| GeForce Titan X | 103 | 206/206 | 6.5 | 6.6 | 336 |
| GeForce GTX 1070 | 108 | 202/202 | 5.0 | 7.0 | 259 |
| GeForce GTX 1080 | 111 | 277/277 | 6.9 | 8.9 | 320 |
| GeForce GTX 1080 Ti | 139 | 354/354 | 9.5 | 11.3 | 484 |
| GeForce Titan X (Pascal) | 147 | 343/343 | 9.2 | 11.0 | 480 |
| Radeon RX Vega 64 (air) | 99 | 396/198 | 6.2 | 12.7 | 484 |
| Radeon RX Vega 64 (liquid) | 107 | 429/215 | 6.7 | 13.7 | 484 |
| Radeon RX Vega 56 | 94 | 330/165 | 5.9 | 10.5 | 410 |

We didn’t have time (yet) ahead of publication to run our fancy Beyond3D test suite, so here are some potential peak rates for the RX Vega family. Compared to the GTX 1080, the RX Vega 64s trail slightly in pixel fill and peak rasterization rates, but they bring a prodigious array of shader and texturing power to the table, along with much higher memory bandwidth. The RX Vega 56 similarly trails and trounces the GTX 1070 in these theoretical peak rates. Our tests will tease out whether those theoretical victories translate into real performance shortly.

AMD’s reference design for both the air-cooled RX Vega 64 and RX Vega 56 uses the same black shroud with a blower-style fan exhausting air directly out of the rear of the card. While this design may bear some resemblance to the Radeon RX 480 before it, the similarities are only skin-deep. The shroud on this card is mostly metal and features a full metal backplate, as well. The Radeon logo on the side of the card lights up with red LEDs when the card is on.

One neat little touch from the R9 Fury X returns on board the RX Vegas: the “GPU Tach.” This line of red LEDs (or blue, if you flip a DIP switch) will show you your GPU’s occupancy, presuming you can see it from your desk or other site of sitting. The LEDs can also be turned off.

The Radeon RX Vega 56 will list for $399, or $20 more than the GTX 1070’s original $379 suggested price. The RX Vega 64 air-cooled card will sticker for $499, or the same as the GTX 1080’s most recent suggested price.  Both cards are also available as limited editions and as parts of “Radeon Packs,” which we described in more detail in our RX Vega reveal. We’ve been talking about RX Vega for ages, so now it’s time to shut up and share performance numbers.

 

Our testing methods

Most of the numbers you’ll see on the following pages were captured with OCAT, a software utility that uses data from the Event Tracing for Windows API to tell us when critical events happen in the graphics pipeline. We run each test at least three times and take the median of those runs, where applicable, to arrive at a final result.
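
For the curious, here's a minimal sketch of the sort of reduction we perform on that data. It assumes a PresentMon-style CSV from OCAT with a per-frame MsBetweenPresents column; the file name is just a placeholder:

```python
# Minimal sketch: reduce a frame-time capture to an average-FPS figure and a
# 99th-percentile frame time. Assumes a PresentMon-style CSV with a
# "MsBetweenPresents" column holding per-frame times in milliseconds.
import csv
import statistics

def summarize_run(path):
    with open(path, newline="") as f:
        frame_times_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    p99_ms = statistics.quantiles(frame_times_ms, n=100)[98]  # 99th-percentile frame time
    return avg_fps, p99_ms

print(summarize_run("ocat-capture.csv"))  # hypothetical file name
```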

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

| Component | Details |
| --- | --- |
| Processor | Core i7-7700K |
| Motherboard | Gigabyte Aorus GA-Z270X-Gaming 8 |
| Chipset | Intel Z270 |
| Memory size | 16GB (2 DIMMs) |
| Memory type | G.Skill Trident Z DDR4-3600 (run at DDR4-3200) |
| Memory timings | 15-15-15-35 2T |
| Hard drive | Samsung 960 EVO 500GB, Kingston HyperX 480GB, 2x Corsair Neutron XT 480GB |
| Power supply | Seasonic Prime Platinum 1000W |
| OS | Windows 10 Pro with Creators Update |

 

| | Driver revision | GPU base clock (MHz) | GPU boost clock (MHz) | Memory clock (MHz) | Memory size (MB) |
| --- | --- | --- | --- | --- | --- |
| Radeon RX 580 | Radeon Software 17.7.2 | | 1411 | 2000 | 8192 |
| Radeon RX Vega 56 | Radeon Software beta | 1156 | 1471 | 1600 | 8192 |
| Radeon RX Vega 64 | Radeon Software beta | 1247 | 1546 | 1890 | 8192 |
| EVGA GeForce GTX 1070 SC2 | GeForce 378.78 | 1594 | 1784 | 2002 | 8192 |
| GeForce GTX 1080 Founders Edition | GeForce 378.78 | 1607 | 1733 | 2500 | 8192 |

Thanks to Intel, Corsair, Kingston, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and EVGA supplied the graphics cards for testing, as well. Behold our Gigabyte Aorus Z270X-Gaming 8 motherboard in all its glory:

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests. We tested each graphics card at a resolution of 2560×1440 and 144 Hz, unless otherwise noted.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Doom (Vulkan)
Doom‘s Vulkan renderer is a staple of our graphics card reviews and a favorite of AMD tech demos. Let’s see if the RX Vegas have made a deal with the devil for the performance lead at maximum settings and a resolution of 2560×1440.


Doom‘s Vulkan renderer is always a bright spot for Radeons, and the Vega cards don’t disappoint out of the gate. The RX Vega 56 outperforms even the GTX 1080, and the RX Vega 64 is 11% faster still in average frame rates. Both Radeons and the GeForce GTX 1080 deliver impeccable 99th-percentile frame times, as well. The GTX 1070’s 16.1-ms 99th-percentile result is still good, but in a fast-paced game like Doom, you’ll notice the frame-rate drops that lead to such a figure.


Our “time-spent-beyond-X” graphs can be a bit tricky to interpret, so bear with us for just a moment before you go rocketing off to the conclusion. We set a number of crucial thresholds (or bins) in our data-processing tools—50 ms, 33.3 ms, 16.7 ms, 8.3 ms, and 6.94 ms—and determine how long the graphics card spent on frames that took longer than those times to render. Any time over the limit ends up aggregated in the graphs above. Those thresholds correspond to instantaneous frame rates of 20 FPS, 30 FPS, 60 FPS, 120 FPS, and 144 FPS, and “time spent beyond X” means time spent beneath those respective frame rates. We usually talk about these results as a proportion of the one-minute test runs we use to collect our data.

If even a handful of milliseconds make it into our 50-ms bucket, we know that the system is struggling to run a game smoothly, and it’s likely that the end user will notice severe roughness in their gameplay experience. Too much time spent on frames that take more than 33.3 ms to render means that a system running with traditional vsync on will start running into equally ugly hitches and stutters. Ideally, we want to see a system spend as little time as possible past 16.7 ms rendering frames, and too much time spent past 8.3 ms or 6.94 ms is starting to become an important consideration for gamers with high-refresh-rate monitors and powerful graphics cards.
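
In code form, the aggregation is straightforward. This is a sketch of the calculation described above, not our exact tooling:

```python
# "Time spent beyond X": for each threshold, sum up only the time *over* the
# limit for every frame that took longer than the threshold to render.
THRESHOLDS_MS = [50.0, 33.3, 16.7, 8.3, 6.94]  # 20/30/60/120/144 FPS

def time_spent_beyond(frame_times_ms, threshold_ms):
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# Example: a 25-ms frame contributes 8.3 ms at the 16.7-ms threshold and
# nothing at the 33.3-ms threshold.
frames = [12.0, 25.0, 14.0, 40.0]
for threshold in THRESHOLDS_MS:
    print(threshold, time_spent_beyond(frames, threshold))
```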

With a fast-running game like Doom, it makes the most sense to start our analysis at the 8.3-ms mark. Recall that any time spent past this point means the frame rate will drop below 120 FPS. Here, the RX Vegas prove their mettle. The RX Vega 64 spends just about three-and-a-half seconds of our one-minute test run below 120 FPS, while the RX Vega 56 spends five seconds working on tough frames that similarly dip its instantaneous frame rate. The GTX 1080 delivers a similarly respectable result, but the GTX 1070 spends a whopping one-quarter of our test run working hard.

 

Hitman (DirectX 12)


The RX Vega 64 doesn’t repeat its commanding performance in Doom here, but a tie with the GTX 1080 is still a fine result. The RX Vega 56 is slightly faster and slightly smoother than the GTX 1070, as well.


The Vegas and GeForces deliver practically perfect performances at 50 ms, 33.3 ms, and 16.7 ms, so we can say they all offer fine gaming experiences. Looking at the more demanding time-spent-beyond-8.3-ms measure, though, the RX Vega 64 and GTX 1080 are about as close to tied as it gets in these matters. In a win for the Radeon camp, the GTX 1070 spends more than two seconds longer than the RX Vega 64 does on these tough frames.

 

Rise of the Tomb Raider (DirectX 12)


Rise of the Tomb Raider‘s DX12 mode results in another tight matchup for our contestants. The RX Vega 64 can’t quite catch the GTX 1080, but the Vega 56 delivers a slightly more fluid experience than the GTX 1070 while turning in virtually the same 99th-percentile frame time.


Those 99th-percentile frame times don’t tell the whole story for the RX Vega 56 and the GTX 1070, though. For that, we have to turn to our time-spent-beyond-16.7-ms results. Here, the Vega 56 spends nearly a second less on tough frames compared to the GTX 1070. Both the GTX 1080 and the RX Vega 64 spend only imperceptible amounts of time on tricky frames, as well, although the GTX 1080 maintains its slight edge.

 

The Witcher 3


Chalk up another good showing for the Vegas here. The RX Vega 64 only slightly lags the GTX 1080 in both average frame rate and 99th-percentile frame time, while the RX Vega 56 and the GTX 1070 are as dead-even as it gets.


Even better, all of the Vega cards and the GeForces spend just a handful of milliseconds past the crucial 16.7-ms mark. Save for the RX 580, all of these cards will deliver a smooth, high-fidelity Witcher 3 experience at 2560×1440.

 

Deus Ex: Mankind Divided (DX11)


The RX Vegas post a fine showing in DXMD. The RX Vega 64 trades punches with the GTX 1080, while the RX Vega 56 narrowly beats out the GTX 1070 in both its average frame rate and 99th-percentile frame time.


All of our cards spend plenty of time working on frames past 8.3 ms, so we’ll stick with the standard 16.7-ms mark here. Some minor hiccups at 50 ms and 33 ms aside, the RX Vega 64 shares company with the GTX 1080 in delivering only a handful of frames that took more than 16.7 ms to render. The RX Vega 56 also provides a slightly smoother experience than the GTX 1070 here. Even so, I’m surprised Radeons continue to have those noticeable stutters in DXMD, since they’ve been present for nearly a year now. Perhaps future driver updates will help smooth out the Vegas’ performance for good here.

 

Gears of War 4
Gears of War 4 is one of Microsoft’s first-party DirectX 12-only efforts, and it has plenty of PC-friendly graphics settings to make even the most powerful graphics card sweat. We dialed in the game’s Ultra preset and went to work.


I was hoping Gears of War 4 would provide even footing for our graphics cards to strut their stuff. Things didn’t quite play out that way. The RX Vega 56 trails the GeForce GTX 1070, and the RX Vega 64 just matches the lesser GP104 card. Seems Nvidia’s DX12 driver update earlier this year was no joke.


None of the cards save the RX 580 spend a noticeable amount of time past the 16.7-ms mark on tough frames, so it makes more sense to observe what happens past 8.3 ms. Here, none of the cards are standouts, but the GTX 1080 does spend three seconds less working hard than the RX Vega 64 and GTX 1070 do. The RX Vega 56 spends about six seconds longer than its fully-enabled counterpart on those tough frames, an unusually large gap given the cards’ relatively similar clocks and resource provisions. We may need to investigate further.

 

Grand Theft Auto V

For a change of pace, I cued up our usual GTA V test run at 4K using maximum settings for everything save MSAA.


I had hoped moving to 4K with GTA V would help show the virtues of the Vega cards a bit better versus the competition, but that turned out not to be the case. The GTX 1080 is the only card of this bunch that delivers what I would deem a playable 4K experience on a traditional 4K monitor. What may be interesting to some who are flirting with 4K is that both RX Vegas deliver 99th-percentile frame times that will keep a typical 4K FreeSync monitor happy. FreeSync doesn’t help GTA V feel as fluid as it does on the GTX 1080, but the smooth motion both cards delivered with FreeSync enabled in my informal testing was a far sight better than the alternative.


Although I’ve been ruined by the GTX 1080 Ti for 4K gaming, the GTX 1080 still delivers a surprisingly playable 4K, 60-FPS experience with GTA V. It spends just over a second working on tough frames that would drop instantaneous frame rates below 60 FPS. Nothing else in this company even comes close, but FreeSync does help make the RX Vegas more appealing if you’re mulling the jump to 4K. None of the Vegas (or GeForces) spend even a millisecond past 33.3 ms, which augurs well for a fine 4K FreeSync experience.

 

Watch Dogs 2


For some reason, Watch Dogs 2 simply does not play well on the Vega cards. The GTX 1080 is way out in front of the pack here, and the RX Vega 64 can’t beat the GTX 1070, let alone the GTX 1080. The Radeons exhibit slightly more competitive 99th-percentile frame times here, but the gameplay experiences they deliver still feel less smooth and fluid than that of the GeForces.


Given the average frame rate ballpark above, the most relevant “time spent beyond” metric is our 16.7 ms threshold. Both Vega cards spend over twice as long working on tough frames here as do the GeForces,  and the RX 580 is thoroughly outclassed.

 

Six TDPs are better than one

A long-running pursuit among Radeon fans has been to find out just how far they can undervolt their cards with minimal loss of performance. Whether deservedly or not, AMD cards have developed a reputation for being pushed to the ragged edge of the voltage-and-frequency-scaling curve in pursuit of the highest possible performance. We observed as much with the move from the RX 480 to the RX 580, whose slightly higher performance required 48W more power for the privilege.

Perhaps because of that reputation, AMD is giving builders the option to easily explore more efficient points on that curve. The company is specifying six separate TDPs for each RX Vega card. Three options for this configurable TDP will be available in Radeon Wattman at any given time: “Power Saver,” “Balanced,” and “Turbo.” “Balanced” is the default mode, and the one we used in our main gaming tests. A two-position BIOS switch on the card will drop the base TDPs of each of these settings even further. (Yes, there is some overlap.) While the most devoted undervolters will still likely want to resort to manual tweaking, AMD is making the practice both officially sanctioned and more accessible to folks who just want to toy around with increasing performance-per-watt without a major time investment.

I decided to test both the power consumption and noise levels of these cards while also exploring the performance implications of the three TDPs available from Wattman in the BIOS switch’s default position. To do so, I ran through our Hitman test run and logged performance at each setting.



In Hitman, the RX Vega 64’s “Turbo” mode offers no additional performance. “Power Saver,” meanwhile, has only a negligible impact on both average frame rates and 99th-percentile frame times. The RX Vega 56 gets a bit of additional headroom from “Turbo,” and its performance slightly decreases in “Power Saver” mode, as we might expect. Of course, we’re most interested in the changes in power consumption each mode offers. To measure this result, I used my trusty Watts Up power meter while looking at a static but complex scene in our Hitman test run.

So that’s something. The RX Vega 64 maintains most of its performance while shedding a whole 86W from our test system’s power draw. Just goes to show how far AMD is pushing the Vega 64 to extract the last 1% of its performance potential, we suppose. Let’s see how these numbers stack up against the rest of our test cards.

Across the board, the numbers aren’t good for the Radeons. The RX 580 is drawing as much power under our Hitman load as the GeForce GTX 1080, and the numbers only go up from there. The RX Vega 56’s default “Balanced” mode makes our test system ask 91W more from the wall than the GTX 1070, and the RX Vega 64’s default “Balanced” mode makes our system yank an astonishing 136W more than the GTX 1080 does. The Vega architecture seems to offer a major increase in performance per watt compared to current Polaris chips, but in absolute terms, AMD’s latest is still far behind Pascal GPUs in efficiency. At least in the case of the Vega 64, “Power saver” mode might be a prudent thing to try for folks willing to give up a tiny sliver of performance for a rather big slice of system power draw saved.

 

Conclusions

Before we issue a verdict on the RX Vega 56 and RX Vega 64, it’s time once again to sum up our results in our famous value scatter plots. To produce these charts, we take the geometric mean of the average FPS figures for each game we tested. We also convert the geometric mean of the 99th-percentile frame times we collected for each card into a 99th-percentile FPS figure so that our higher-is-better system works. Where practical, we used retail or e-tail pricing for each card tested.
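
Concretely, the two indices behind each point on the scatter plots are built something like this (a sketch under the assumptions just described):

```python
# Sketch of the value scatter-plot indices: a geometric mean of per-game
# average FPS, plus the geometric mean of per-game 99th-percentile frame
# times converted into an FPS figure so that higher is better on both axes.
from math import prod

def geomean(values):
    return prod(values) ** (1 / len(values))

def scatter_indices(avg_fps_per_game, p99_frame_times_ms_per_game):
    fps_index = geomean(avg_fps_per_game)
    p99_fps_index = 1000 / geomean(p99_frame_times_ms_per_game)
    return fps_index, p99_fps_index

# Hypothetical per-game results for one card:
print(scatter_indices([120.3, 88.5, 95.1], [11.2, 15.6, 13.9]))
```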


If you’ve been following our Vega coverage, you’ve probably had a good idea of how these graphs would shake out for some time now. The only question was how close the race would be. The answer: pretty close. In our testing so far, the RX Vega 64 trails the GTX 1080 by about 11.5% in our all-important 99th-percentile frame-time metric, but its performance potential in our tests (as measured by average FPS) only lags the GeForce by about 6%.

Those gaps are a bit of a letdown after the long wait for Vega, but they don’t seem insurmountable. AMD has proven it can lower 99th-percentile frame times with driver updates in the past. Vega marks the biggest change to AMD’s GCN tech in years, and the company may need some more time to fully tune its software for the best performance from the product. Vega is also launching into a gaming market that has been dominated by the GTX 1070 and GTX 1080 for over a year. Game developers may find some new tweaks and optimizations are necessary to get the best performance from the Vega GPU, as well.

Another reason for the performance deficit between the RX Vega 64 and the GTX 1080 in our initial standings is, I think, a lot simpler. The RX Vega 64 reference card seems to be running on the ragged edge of its voltage-and-frequency-scaling curve, and its factory fan profile doesn’t allow the GPU to run at its peak clock speeds for extended periods, if at all. Our card was plenty happy to overclock with its blower cranked and a bunch of fans blowing on it, but the eyebrow-raising power draw and physically painful noise levels that ensued showed why AMD isn’t pushing Vega over the shoulder of the voltage-and-frequency-scaling curve and into its ear.

Even at stock speeds, power consumption is the bane of the Vega GPU. Our system power draw with the RX Vega 64 installed exceeded that of even the GeForce GTX 1080 Ti, at 408W for the RX Vega 64 and about 350W for the GTX 1080 Ti, and it peaked at over 500W for the Vega chip once we informally explored overclocking. For a more apples-to-apples comparison, installing the GTX 1080 Founders Edition caused our system to draw about 272W. Those extra watts mean more expensive power supplies, more robust cooling fans, potentially higher noise levels, and a need for better climate control in one’s gaming den. The Power Saver Wattman profile goes a long way toward taming the RX Vega 64 for next to no performance cost, but there is no denying that performance-per-watt remains a challenge for AMD’s architects.

I’m curious, then, how the RX Vega 64 performs in its liquid-cooled guise. AMD itself showed that liquid cooling can be good for power-dense and power-hungry graphics cards with the R9 Fury X. Keeping hot-tempered GPUs way cool can help reduce wasted energy and could potentially open up more performance headroom. This time around, though, a liquid-cooled Vega 64 is a $200 upcharge over the base RX Vega 64 as part of a Radeon Pack. $699 happens to be the same price as the sticker for Nvidia’s GTX 1080 Ti Founders Edition, and given the huge delta in performance between those two products, I can’t imagine any but the most ardent AMD fans will pick the liquid-cooled Vega. Even with the enticement of packed-in games, accessible FreeSync monitors and heavy discounts on other hardware, leaving that much performance on the table is rough going. Perhaps we’ll see affordable aftermarket liquid-cooling solutions that can help bridge the gap.

The RX Vega 56 is a happier story for AMD. As was the case with the R9 Fury versus the R9 Fury X, losing eight of the full Vega 10 chip’s compute units to the world’s tiniest chainsaw just doesn’t hurt the Vega 56 that much. Our indices of 99th-percentile frame times and average frames per second put the Vega 56 dead-on with a hot-clocked GTX 1070, and only 10% behind the Vega 64 in our FPS index. The 56 does draw much more power than a GTX 1070 in our test system, but not in the eye-popping way of the Vega 64.

The RX Vega 56 looks especially nice in light of the FreeSync monitor selection these days. One can get a 144-Hz IPS gaming display with a 30-Hz-to-144-Hz FreeSync range for just $550 now, compared to about $800 for a comparable G-Sync display. That $300 could go a long way toward more powerful components elsewhere in a system, or it could stay in one’s pocket. It’s not a stretch any longer to say that a variable-refresh-rate  display is the way to game, and the RX Vega duo completes a puzzle for FreeSync that’s been unfinished for a long time. I could go on and on about how revelatory VRR tech still is for the gaming experience, and I’m happy to see it become potentially more accessible again for high-end gaming. Assuming you can tolerate the higher heat and noise levels of the Vega 56 compared to the Pascal competition, its FreeSync support makes it a strong contender for the entry-level high-end graphics card to get.

Comments closed
    • Lordhawkwind
    • 2 years ago

    Unfortunately I was able to purchase a Vega 64 AC on release date and what a mistake that is turning out to be! I got it for just £450 which I thought was a bargain as I have a high end freesync monitor. I installed it yesterday and all I’m getting is black screens and restarts whenever I try to load a game. Having read a lot of reviews they were saying that a 550/600w PSU will do the job. Sorry no it won’t as I’ve got a 700w PSU that handled my Fury PRO and big overclock on my 7700K easily that is now brought to its knees by what I can only call this abomination of a graphics card.

    I’ve put the 7700K to default and use wattman power saver to absolutely no avail. All I can say is how did this card ever make it to retail. It takes them two whole years since Fury arrived to produce this abomination. Amazing. A 750w PSU to run this card and 1000w to run the LCS is just OMG. Have RTG really lost the plot and think consumers won’t mind the massively insane power draw to just equal a card 15 months old?? Well I’ve ordered a 750w PSU and if that doesn’t do the job I’ve contacted the retailer I bought it from and they have a 14 day no quibble refund if you don’t like it. Nvidia 1080ti here I come!

      • UberGerbil
      • 2 years ago

      You’re sure it’s an inadequate PSU and not some other problem?

        • Rza79
        • 2 years ago

        My thoughts too. A too weak PSU would only give restarts but you have the occasional black screen which would indicate some other issue. Could very well just be a compatibility issue.
        The way you’re freaking out, you’ll never figure out the actual problem.

      • Gastec
      • 2 years ago

      Seems to me you don’t really have a money problem. So just keep ordering computer components 🙂

      • Chrispy_
      • 2 years ago

      You’re barking up completely the wrong tree.

      Your Fury Pro likely pulled similar power (275W to 295W), and although Vega is more power hungry than Nvidia’s current lineup, the difference isn’t even significant, because both AMD and Nvidia have released several cards requiring 300+ watts across multiple generations, and people were running two or even three of them in some cases.

      No, you have another problem.

      Perhaps your existing PSU is dying and the extra 6% total power draw is enough to tip it over the edge, but if that was the case your PC would have been flaky as hell beforehand. My guess is that a BIOS update and possibly some PCIe settings on your board might be needed, or worse than that, your board has a problem providing adequate juice to the slot. AMD are unlikely to have overstepped the boundary here since a previous mistake with Polaris drawing about 90W of the allowed 75W from the slot cost them their SIG approval certificate and required some post-release hotfixing to work around. It’s unlikely they’ll be making that mistake again and nobody has mentioned it in reviews like they did with Polaris.

      • blahsaysblah
      • 2 years ago

      You’re not going to get any sympathy with a rant like that. Did you put the Fury back in to see that everything is OK? You never spelled out your system. What if you are literally at your PSU’s limit already? You haven’t even mentioned any attempts at troubleshooting.

      There isn’t even a point if you haven’t at least used a spare disk (disconnect the other disks) to do a fresh install of Win10 (no need to activate) and just the Vega 64 drivers.

    • stefem
    • 2 years ago

    Hey Jeff, why use MSRP for AMD and street prices (on Newegg you can find them for even less than you specified) in the perf/$ chart?
    At first I thought: “well, he can’t know Vega street prices yet,” but then why is the RX 580 listed at $250 on that graph?

    Now I command you to sell me immediately 3 (new of course) RX580 at $250 then buy from me 3 GTX 1070 and 3 GTX 1080 at your suggested price! 😉

    Also a GTX 1080 Ti (as reference) and Beyond3D GPU suite tests would have been greatly appreciated.

    P.S.
    I would have happily waited a few days for such an addition.

      • erwendigo
      • 2 years ago

      Well, where is Scott now? Yes, he is working at AMD. That’s the answer, because nobody can tell us that taking MSRP for AMD and fake prices for Nvidia cards is fair (sorry, but I searched prices on Newegg and found that the 1070 and 1080 are cheaper than the Tech Report’s prices, in some cases by a BIG MARGIN, like the 1080 costing around $500 on Newegg).

      Because Jeff has to know the market, and that the higher prices on retail cards are all about cryptomining, and with a new AMD card… hell, these inflated prices are going to be worse than with Nvidia cards.

      And hell, the 1080 isn’t affected by the crypto craze, and is very easy to find for $500.

    • psuedonymous
    • 2 years ago

    There are rumblings from UK retailers (https://www.overclock3d.net/news/gpu_displays/amd_s_rx_64_launch_pricing_was_only_for_early_sales/1) that the ‘MSRP’ is a limited-time-only launch offer, and the actual MSRP will be increasing in short order.

    • euricog
    • 2 years ago

    If this is true:
    http://wccftech.com/amds-rx-vega-64s-499-price-tag-was-a-launch-only-introductory-offer/
    ...which seems likely by looking at current prices, then TR will have to update the scatter plot for the review.

      • cynan
      • 2 years ago

      Moot. The cards will sell for what they can sell for. If supply continues to be limited due to HBM2 and/or it turns out that Vegas are in high demand by miners, these won’t sell at the current MSRP. Regardless of whether it is AMD or retailers pocketing the extra. Personally, I think I’d prefer AMD to be pocketing this, considering how much fewer of these they must be selling relative to the GTX 1070/80/80 Tis and how much more these cards must cost to make (especially compared to the 1070/80s).

      When the supply stabilizes – hopefully in a couple of weeks (and a bit longer for Vega 56 by the looks of things?) – then, and only then, will it make sense for the price/performance chart to be updated with any sort of meaningful values.

      • stefem
      • 2 years ago

      They should update it anyway. OK, he couldn’t know the RX Vega street price, but the RX 580 is listed at $250 while a GTX 1070 is listed at $470 and a GTX 1080 at $550 (which should be lower, looking at Newegg), so it’s basically AMD MSRP against Nvidia street prices that aren’t even the lowest, which clearly doesn’t give you a good picture of reality.

    • Convert
    • 2 years ago

    For whatever reason I was actually surprised Vega performs as well as it does.

    As of today it’s not really that bad of a card. The bad news is it appears AMD is about a generation and a half behind Nvidia and the gap is only going to get wider.

    Thanks for the review Jeff!

    • UberGerbil
    • 2 years ago

      From reading the whitepaper (http://radeon.com/_downloads/vega-whitepaper-11.6.17.pdf) (thanks to tipoo for linking it in another comment here) it looks like -- as usual for a new architecture -- there are a number of areas where additional performance could be extracted (or so AMD claims) but require explicit coding in the driver, the game, or both. I have no doubt we’ll see driver updates that provide (some) additional performance in selected titles; it’s less clear if we’ll see patches to the games themselves (how good is AMD’s developer relations these days -- is there anybody left?) I’m also amused to see “a set of eight instructions to accelerate memory address generation and hashing functions (commonly used in cryptographic processing and cryptocurrency mining)” -- kudos for knowing who butters your bread these days.

      • tipoo
      • 2 years ago

      Yeah, reading it I’m expecting Vega to age pretty gracefully…But this is the classic problem with AMD. We don’t know what things will be like in 3 years, so we base our buying decisions off performance today, in launch reviews.

      Next generation geometry culling isn’t even enabled yet. Also as you mentioned, some features like that require developers to do added steps too, and for that they need marketshare, and for marketshare they need performance…Yet another hard cycle.

    • freebird
    • 2 years ago

    Well, that didn’t take too long… for the “buy & resell on eBay for profit” sellers to start popping up… I thought I checked this morning, but maybe it was last night and didn’t see any.

    Now there are a WHOLE boatload of Vegas on eBay, only 1 or 2 on Amazon…

    A few bids are still below $500, but most are in the $650-$750 range, and Buy Now pricing is ranging from $829 to $1100.

    Everyone wants a slice of the GPU Profit train pie these days…
    AMD just gets the crumbs for all the hard work?
    Maybe AMD should’ve just sold the 1st month release at $1000 and then put in massive price drop just to soak the ebay leeches… 😉 but then the crying on the review sites would’ve been massive.

    • LoneWolf15
    • 2 years ago

    The performance is nice, though not cutting edge. Sadly, I don’t think it’s nice enough to justify that power envelope, especially if you’re looking at the Vega 56 vs a GTX 1070.

    It still seems like a game of catch-up. I’d really love to see AMD change the game to leapfrog instead.

    • DeadOfKnight
    • 2 years ago

    So the Radeon 56 is a good alternative to the GTX 1070…a year later. At least the rumors seem to indicate that Nvidia will be merciful enough to not launch volta anytime soon…

    • rudimentary_lathe
    • 2 years ago

    The performance is actually better than I was expecting. But that power draw. How can they have taken so long to put a product out that doesn’t even address the power draw problem of GCN? If anything, the problem is made worse with these cards!

    I’d be interested to learn what happens when you undervolt and/or underclock. If it’s anything like a Polaris card – one of which I own – then you can undervolt it by 50-100mV from stock and cut down the wattage by 10-20% easy. If you’re willing to lose 10% off the clocks, you can realistically shave ~30% from the wattage.

    This card is good enough for those that want 4K or 1440p using open standards (e.g. FreeSync) – so long as they can live with that power draw.

    Edit: Just saw Jeff already looked at undervolting on the last page. +1 for that. Looks pretty good, but as you say, still much worse than the Nvidia cards.

      • stefem
      • 2 years ago

      Yea, but NVIDIA cards can be undervolted too yielding a 20%-30% (on Pascal) power reduction without degrading performance.

        • Krogoth
        • 2 years ago

        It does downgrade performance where it actually matters (when GPUs are stressed).

        You can’t have your cake and eat it too. It is all about trade-offs. Are you willing to sacrifice raw performance for more sane power consumption (a.k.a. reduced noise levels)?

          • stefem
          • 2 years ago

          No, what I see is a small reduction (2-3 fps) in avg fps but with better frame-rate consistency; some workloads may be negatively affected, but that holds true for AMD GPUs as well.
          I’m practically forcing GPU Boost to do its magic with the lowest voltage possible.

      • Gastec
      • 2 years ago

      “Those” that want 1440p… Or 4K. Don’t we all want that? Or are you trying to say 1080p is “good enough” and do you plan to stay at that resolution for the rest of your life?
      Like those lost souls who said not so long ago that they prefer 19″ CRT monitors because Counter-Strike or whatever.

    • Nictron
    • 2 years ago

    Cautiously disappointed but only because I am a Nvidia user with GSync.

    The benefit to the market is that you can now get FreeSync and comparable performance for quite a bit less than what it would cost you to go with a similar performing Nvidia GSync setup.

    So not all bad but not all good either!

    • Kougar
    • 2 years ago

    Holy smokes… I thought Anandtech’s power difference of 150w between Vega and a 1080 was excessive, but in your graph it’s an even higher 176 watts at full power.

    What setting does AMD default to for these cards, balanced or turbo?

    • HERETIC
    • 2 years ago

    WHO IS THIS CARD FOR???

    I think we were all hoping for a little more from Vega, and a little less power usage.
    After reading this-
    https://techgage.com/article/a-look-at-amds-radeon-rx-vega-64-workstation-compute-performance/
    Part of Rob’s conclusion sums it up perfectly-
    “All in all, this is extremely impressive. If you’re a workstation user wanting a GPU that will give you good performance in both workstation and gaming workloads, the RX Vega 64 is, surprisingly enough, a pretty attractive choice…”
    Vega combined with Threadripper-hmmmmmmmm.

      • Gastec
      • 2 years ago

      It’s for people who don’t pay much for electricity or are too rich to care.
      Also for people who don’t stay in the same room with their computers.

        • Krogoth
        • 2 years ago

        Electricity has never been an issue with hardcore gamers that typically buy high-end GPUs. It is the noise level that the extra power consumption (more heat) brings that is the potential issue.

      • anubis44
      • 2 years ago

      Well, since I despise nVidia for their cheap-assed tactics, trying to kill performance with excessive tessellation, the GTX970 memory fiasco, the bump-gate problem that killed the GeForce graphics chip on my Toshiba Tecra M3, price gouging with G-Sync, etc. etc. etc., I’m going to be grabbing at least one Vega 56 once the custom cooled cards come out to replace my trusty R9 290, which is going into my HTPC for big-screen gaming.

      On top of everything, Vega has Rapid Packed Math capability, which Pascal doesn’t have and which will almost certainly boost games even further. Yes, Pascal runs cooler, but it’s also missing features that Vega has. Pascal is already obsolete, in my view, so why would I buy it over the newer arch with more features?

      So that makes for one example of ‘who this card is for’. As for power consumption, I live in Quebec, Canada, where electricity is $0.046 per kWh, so I could care less about 40W-50W.

        • MOSFET
        • 2 years ago

        I don’t disagree with paragraph 1, but I disagree that Pascal is outdated. If it weren’t for Nvidia’s blistering pace of innovation, Maxwell and even Kepler would still be very relevant, and are for many people.

        • Krogoth
        • 2 years ago

        Pascal is hardly obsolete. It is killing Vega in performance and power efficiency as far as gaming is concerned. Vega only matches Pascal at professional graphical stuff (only Frontier SKUs or better).

          • Mr Bill
          • 2 years ago

          In a sense, Vega is a ‘server’ chip doing double duty as a gaming chip. NVIDIA can afford to strip out all those power hungry FP compute lines to make a gaming chip. AMD has to make do with one design for both.

      • Krogoth
      • 2 years ago

      It is for gamers who want to game at 1440P with Freesync where Fury and RX 480/580 don’t cut it.

      It is a small niche at best.

      • Mr Bill
      • 2 years ago

      That was an interesting read. The compute performance looks pretty compelling.

    • CuttinHobo
    • 2 years ago

    I bet Krogoth is so Vega’d right now.

    • 223 Fan
    • 2 years ago

    Vega? Sounds like they made a ’73 Pinto.

    • NovusBogus
    • 2 years ago

    Decent performance for the price, but dat power draw. It’s like the R9 290 all over again: big, hot, and likely to be badly outclassed in a few months by something with half the upkeep cost. Too bad Polaris didn’t scale better.

    • Delta9
    • 2 years ago

    Curious to see if there is any extra baked-in hardware that is not fully utilized yet. The primitive shaders and some of the extra DX12/Vulkan optimizations may bring forth some better performance. That and a respin on a refined version of the same process. I’m not an AMD fanboy, I am a huge fan of competition making hardware cheaper. Ryzen got Intel to stop dragging ass and we have quad-core i3s and 6-core i5s incoming. If Vega had more teeth out of the box it could have driven Nvidia to cut prices or jack up performance. As of now, the countdown to Navi begins.

      • K-L-Waster
      • 2 years ago

      “As of now, the countdown to Navi begins.”

      This is the story with AMD far too frequently, isn’t it. As soon as their new hotly anticipated product line comes out and it isn’t quite what was hoped for, the “wait until the next one comes out” refrain starts -- usually on launch day. RyZen was a relatively successful launch as AMD products go, but even it had a “just wait until RyZen2” refrain in regards to the gaming performance. Poor Vega seems to be obsolete before launch day is over.

      • dyrdak
      • 2 years ago

      Judging by the power usage, there’s got to be a whole lot baked in that will show up sooner or later (and this comes from a red team member). AMD seems to be using brute-force solutions where Nvidia plays a more elaborate game, and wins. A simple comparison between Vega 56 and 64 proves that the design does not scale.

    • Voldenuit
    • 2 years ago

    Hi Jeff, I saw you mentioned noise testing but I don’t see any objective or subjective noise level results in the text or graphs?

    “I decided to test both the power consumption and noise levels of these cards while also exploring the performance implications of the three TDPs available from Wattman in the BIOS switch’s default position.”

      • Jeff Kampman
      • 2 years ago

      It got left on the notepad this morning, I’ll try and graph it and discuss it as soon as I’m able.

    • Shobai
    • 2 years ago

    Am thoroughly enjoying the review, Jeff. Thanks!

    On page “Six TDPs are better than one”, in the “Hitman – Time spent beyond 16.7ms” graph, one of the “RX Vega 64 (Balanced)” should probably be “(Turbo)”.

    • Jigar
    • 2 years ago

    How did the AMD engineers explain this disaster to their bosses? No seriously, either they are just so smart that we should all learn from them, or at least I need to learn from them.

      • tipoo
      • 2 years ago

      “Nvidia spends 3 billion on R&D on one architecture and we…don’t” might be a fair one, alas.

        • Jigar
        • 2 years ago

        Yes that should be one point.

      • PrincipalSkinner
      • 2 years ago

      Both of them said “Whoops!”.

      • Krogoth
      • 2 years ago

      It isn’t a disaster at all.

      Vega performs as expected. The problem is that gaming performance isn’t that important anymore for big chip designs. Current Vega is clearly designed for prosumers and general compute first. It is basically AMD RTG’s GF100. Vega Frontier/Instinct are quite competitive with their Pascal Quadro/Tesla counterparts.

      It is akin to Nvidia releasing a GP100 SKU for gaming and it ending up underwhelming for what it can do on paper, but Nvidia learned its lesson from the GF100 debacle. They just distill their big chip designs (axing some of the general compute stuff and using GDDR5/GDDR5X instead of HBM2) to make gaming-tier products, a.k.a. GP102 and GP104.

      AMD simply doesn’t have the capital to allocate to two separate high-end lines at once. They just went after the more lucrative markets with their big chip design. Polaris is sufficient for the mid-tier stuff (where most of the $$$ is in the gaming market).

      The only people who are disappointed are those who are expecting a 1080Ti killer.

        • Freon
        • 2 years ago

        Yeah AMD seems to be homologating their pro and consumer dies to save on R&D efforts. They only make a small handful now. Zen’s 8 core die recycled into more than a dozen products, RX line (only one or maybe two dies?), and Vega (one) all appear to be one-and-done spins. NV and Intel have further refinement of core designs into many different products, but given their production numbers saving a few slivers of silicon is a lot in production savings down the road.

        It seems the GPU side is suffering a bit more due to the compromises, but I also think Nvidia has been far less complacent than Intel.

        • ColeLT1
        • 2 years ago

        This would be true, if it was not 12 months late. Disappointing at the least.

          • K-L-Waster
          • 2 years ago

          Basically matching the 1080 a year later and with a bloated power consumption budget makes this at best a me-too product.

          Only people who are going to be happy are those who refuse to buy NVidia ‘cus it’s against their religion.

      • jts888
      • 2 years ago

      I think Vega silicon is essentially what was expected by everyone in AMD for over a year.

      What’s happened is that (again) lackluster launch drivers will take a year or more to mature and recover the 20% or whatever that’s missing now, and executives/sales/marketing, knowing this, just cranked the voltage and frequency to the absolute edge in order to get final launch performance somewhere near parity with the 1080. Vega narrowly missed its HBM2 bandwidth targets and definitely has more raw FLOPs than the competition. What it lacks is geometry throughput (nobody is using the new pipeline yet) and likely any software taking advantage of the newly coherent RBEs or the HBCC.

      As far as I can tell, Radeon hardware has actually caught up with Pascal finally, but AMD just can’t keep up with Nvidia’s multi-billion driver efforts.

      • southrncomfortjm
      • 2 years ago

      The cards work and deliver top tier performance. That doesn’t seem like a disaster. Does it fall short of delivering all we could have wanted? Yeah, but that doesn’t make it a bad product.

      Looking at the whole product lineup from GPU to monitors, etc, Vega may make a lot of sense to certain people, especially if they already have a Freesync monitor, or are patiently waiting for a Freesync ready TV.

    • ermo
    • 2 years ago

    The Vega whitepaper is interesting and I understand why AMD chose the “designed for the future” tagline for Vega 10.

    Sadly, if you are getting a Vega today, you’ll likely be playing mature DX11 games on a platform that was specifically designed to NOT blindly adhere to the DX11 way of doing things. And as the numbers show, this is reflected in the performance you are getting RIGHT NOW.

    For future Vulkanized-out-of-the-gate titles, I expect the Doom scenario to be par for the course. Whether that’ll be enough to compete with Volta is anyone’s guess.

    The fact that Phoronix’s testing shows that Vega w/OSS drivers (RadeonSI Mesa GL implementation) will beat NVidia’s proprietary drivers on a 1080Ti in select games is quite frankly astounding. Whether this is a true sign of what Vega has on tap or just an anomaly is not known at this point.

    Verdict: Wait and see. Power draw needs to come down, which may require new steppings. Drivers will need to improve and ISVs may need to make an effort to re-tune their graphics engines to make better use of Vega’s new features.

    EDIT: Oh and good job on the review, Jeff!
    EDIT2: … aaand re-subscribed. You earned it Jeff & Co. =)

    • Mat3
    • 2 years ago

    The power saver profile should be the default! That’s how the card should simply be. Who are the idiots at AMD who think ~5% more performance is worth 100+ watts? Reviews are slamming the power consumption, heat and noise of these cards. Same story with the 580. Complete morons!

      • chuckula
      • 2 years ago

      I tend to agree that while Vega is never going to be an efficiency champion it would look a whole lot better running in powersaver mode by default.

      On top of that, mining tends to be power efficiency sensitive and I’d be very curious to see if powersaver mode on Vega makes it much more attractive to miners compared to some crazy out-of-the-box overclock power consumption mode for the same card.

      • RdVi
      • 2 years ago

      Not only this, but overclocking looks terrible. If AMD just let the AIBs sell the overclocked models and/or let users do it themselves, the day-one reviews simply won’t show the increased power consumption that comes with it. Later reviews of factory OC AIB cards will show it, but few people base their opinions of an architecture and AMD/Nvidia themselves on those AIB-specific reviews.

    • the
    • 2 years ago

    Typo on the first page in the table. Memory bus width of the Fury X is 4096 bit wide, not 1024.

    • Mikael33
    • 2 years ago

    I don’t get it…
    I didn’t expect Vega to slaughter the 1080 Ti; the whole blind-test gaming thing [L]impocp was telling was obviously done because AMD knew they didn’t have the raw performance to beat Nvidia. But I expected the Vega 64 to at least beat the 1080 FE across the board. I only hope drivers can improve performance significantly on Vega, as it’s seeming like another R600/2900XT situation, where after a delay AMD can only manage to match Nvidia’s second-best card while drawing a ton more power. The power draw difference is almost academic, as presumably most people buying high-end GPUs have beefy PSUs, but the fact that they need significantly more power to match Nvidia’s performance while delivering their high-end solution over a year later is a bit depressing. AMD’s CPU division did really well with Zen; it’s a damn shame RTG can’t deliver the same level of performance.

      • Action.de.Parsnip
      • 2 years ago

      Resources

    • flip-mode
    • 2 years ago

    This is not one of AMD’s finer moments. I don’t even game and I’m disappointed. Damn.

      • tipoo
      • 2 years ago

      Yeah, for all the not gaming I do I still root for a hyper competitive market, and have followed it since I figured out what GPU powered my bondi blue iMac G3.

    • odizzido
    • 2 years ago

    Vega 56 looks interesting. The fact that it supports open standards would probably be enough for me to get it over a 1070.

    edit——-
    Would have been interested in seeing some testing with a ryzen powered system. Maybe a follow up review?

    • Action.de.Parsnip
    • 2 years ago

    And thus with Vega comes the realisation that AMD don’t have the money to do Zen and a brand-new GPU arch at the same time. It’s GCN 1.X yet yet yet again. Eventually we were going to see that something had to give to get Zen and the 4K consoles done.

    I think what’s not often appreciated is that after Kepler Nvidia made 2 lines: a stripped down gaming line and a fully featured compute line. Because they have the money and they can. AMD didn’t, don’t and couldn’t. One line is dense, power efficient and clocks very highly, one line clocks like AMD products and consumes power like AMD products. What AMD has is a fully featured compute line primarily bought by gamers, that looks wheezy and sweaty next to the 1800mhz boost clock 250w tdp 1080ti. In reality it is made to face the 1300mhz 250w tdp P100, and play games…

      • LocalCitizen
      • 2 years ago

      I don’t get it, which is cheaper to do?

      AMD way: a new arch Polaris + redo an old arch Fiji (the vast majority of the 3.8 billion new transistors are spent on designing the chip to clock much higher than Fiji)

      NV way: one new arch Pascal and disable bits and piece for different classes.

        • ludi
        • 2 years ago

        That’s the point: AMD had to divide their limited resources between the new Zen CPU architecture and the Vega GPU architecture, so on the GPU side they took the ideas they already had and enhanced them. Vega is a success but TR’s power analysis shows it to be running at redline in fifth gear in order to compete with Nvidia’s high-end, which seems to still have some margin left in the design.

          • LocalCitizen
          • 2 years ago

          So Polaris development was completely free, so they just threw it in, right?
          Having to develop 2 architectures to compete against 1 doesn’t seem to be a money saver. Is there a plan to merge the two eventually?

      • thx1138r
      • 2 years ago

      I do not believe this is about money. It’s about the technology bet that AMD made. Years ago, they saw that GDDR5 style memory was going to run out of road for high-end GPUs and they decided to jump ship to HBM before that happened.
      They got the timing wrong.

    • Unknown-Error
    • 2 years ago

    What the hell happened? Late, power-hungry, and it struggles to out-perform an old 1080?! And notice Jeff deliberately left out the 1080 Ti. AMD’s CPU division pleasantly surprised us with Threadripper’s power consumption, but the GPU division? Seriously, WTF? If this trend continues then I am seriously worried about Raven Ridge in the mobile APU segment, since power consumption plays a huge role and the RR GPU is Vega-based. 😮 😮

      • derFunkenstein
      • 2 years ago

      The 1080Ti is 50% more expensive than Vega’s MSRP. It’d be an embarrassment for AMD to consume more power than GP102, but they’re really not competing.

        • Pancake
        • 2 years ago

        Problem for AMD is that Vega costs MORE to make. Equivalent die size and more expensive HBM2. And to rub salt into wounds more expensive cooling and power supply. As you say, AMD are really not competing.

          • derFunkenstein
          • 2 years ago

          All those things are true beyond just the price point comparison I was making

          • ptsant
          • 2 years ago

          I have thought a lot about this and there are only two possible explanations:
          a. AMD wanted a card for AI/DL and compute/CAD, secondarily they are trying to sell some to maintain some face in the “high end” (good scenario).
          b. The choice of HBM2 was a huge mistake and should never have been made. They should/could have made a big Polaris with GDDR5(X) instead, preserving the margins (bad scenario).

          If (b) is true, some people should be fired, although the very successful support of Vega for linux/AI does seem to indicate (a) is possibly true.

        • mph_Ragnarok
        • 2 years ago

        The price difference is a RESULT of the mismatch.

      • swaaye
      • 2 years ago

      It’s like all AMD high-end GPU launches after 5870 or so. Pushed outside of its happy power consumption zone because it wouldn’t be adequately performance competitive otherwise.

      Hopefully their APUs will decimate Intel’s IGP as usual and won’t need to be pushed hard. I keep wondering if Intel will bring in EDRAM in more models…

        • cynan
        • 2 years ago

        My HD 7970 would like to disagree. When it first launched, stock load clock was 925 MHz. There was tons of extra thermal room – so much so in fact, that they respun the same product into the GHz Edition – which then became the standard. And even then, hitting 1100 MHz was a cakewalk, without too much additional power if you were not extremely unlucky with the silicon lottery.

        If you weren’t afraid to push the thermals, and power consumption didn’t bother you, a 30% overclock was not unheard of for an original edition HD 7970 with a good 3rd party cooler (ie, Gigabyte Windforce).

          • DPete27
          • 2 years ago

          That’s bc the 7970 came out (Jan 2012) before the GTX680 (March 2012). Hence why AMD later (June 2012) turned the screws to stay on top of the GTX680. At least back then, they were still on the same perf/watt curve as Nvidia, even with the 7970 GHz edition.

      • brucethemoose
      • 2 years ago

      From Anandtech:

      [quote<]Talking to AMD’s engineers, what especially surprised me is where the bulk of those transistors went; the single largest consumer of the additional 3.9B transistors was spent on designing the chip to clock much higher than Fiji. Vega 10 can reach 1.7GHz, whereas Fiji couldn’t do much more than 1.05GHz. Additional transistors are needed to add pipeline stages at various points or build in latency hiding mechanisms, as electrons can only move so far on a single clock cycle; this is something we’ve seen in NVIDIA’s Pascal, not to mention countless CPU designs. Still, what it means is that those 3.9B transistors are serving a very important performance purpose: allowing AMD to clock the card high enough to see significant performance gains over Fiji. [/quote<]

      So it's literally the Pentium 4 Extreme of GPUs.

        • the
        • 2 years ago

        Additional pipeline stages in a GPU aren't necessarily as bad as they are in a more general-purpose CPU, though. While the hardware is fully capable of branching, branches are still something to avoid given GPUs' generally poor branching performance. Even then, GPUs are moving enough data to process something else while a branch is being worked out, so overall throughput isn't significantly impacted; it is easy to switch to a different thread if one is stalled. However, GPUs need large caches to be able to keep things flowing in this scenario. AMD doubled the amount of cache in the chip, which is no small feat.

        Considering where this chip lands on the performance/watt curve, in hindsight increasing cache sizes further may have been wiser than increasing pipeline depth even if it cost AMD some clock speed. This is a new foundation for a new generation of cards (Navi and beyond) so this may play out differently under a new process where there is more room to increase clock speeds.

      • NovusBogus
      • 2 years ago

      It’s pretty clear to me that Vega is just an overclocked Polaris, with all the tradeoffs that OCing entails. Sad times.

        • tsk
        • 2 years ago

        Actually it’s an overclocked Fiji.

        • ptsant
        • 2 years ago

        Not true at all.

        There are very different compute capabilities, including transparent addressing of memory space over PCIe, secure boot and virtualization, primitive shaders, the draw-stream binning rasterizer, and packed fp16/int8 (especially useful for deep learning). Most of these mean very little for gaming, but Vega is much better for pro/compute/AI than Polaris.

    • tipoo
    • 2 years ago

    Scott tweeted out an architecture whitepaper btw, not sure if it’s hyperlinked already

    [url<]http://radeon.com/_downloads/vega-whitepaper-11.6.17.pdf[/url<]

      • Mr Bill
      • 2 years ago

      An AMD APU combining Ryzen or Threadripper with Vega via Infinity Fabric could be very potent. A very interesting read, thanks for the link.

        • derFunkenstein
        • 2 years ago

        An APU with Vega and some on-package memory would be so sweet (although I doubt that extra memory is coming along). The undervolting results give me hope that they could get something like a “Vega 16” in an APU with a reasonable thermal envelope.

        • tipoo
        • 2 years ago

        It’s surprising that there still isn’t an APU as powerful as the PS4s GPU. Wouldn’t have minded a Skull Canyon NUC sized AMD APU with at least that performance.

        I guess with their revenue, between APUs, CPUs, and GPUs, one tentpole is always going to suffer.

        • UberGerbil
        • 2 years ago

        Yeah, that’s kind of the dream — basically the HBM “cache” is also system memory, with one Zen die and a cut-down Vega riding along. Some have questioned whether you’d end up bandwidth-limited in such a design, but even if you were the results would almost certainly be the most potent “IGP” available. The real question in my mind, especially given the Vega results here, is whether you’d be thermally-limited, and if it could be shoe-horned into a laptop that could be in any way described as “thin” and/or “light.” (Then again, two Zen cores + a sawn-off Vega on HBM would make for the centerpiece of a pretty interesting next-gen console).

    • bjm
    • 2 years ago

    Glad to see that both Phoronix and TechReport got cards to review in time for the launch. The biggest takeaway for me with Vega is mainline Linux support, which is awesome for a high end card. Nvidia still holds the top-tier performance crown, but Vega is really tempting, especially if you don’t want to deal with any proprietary drivers.

      • DragonDaddyBear
      • 2 years ago

      Totally agree, up until the end. From a pure "I'm lazy" perspective, I'd much rather have clean proprietary drivers than go through what Michael did to get the open drivers working. Eventually, though, it will be plug'n'play, and that's pretty awesome.

        • bjm
        • 2 years ago

        Ah, very true, it does require a little pain at the moment. Once a stable software stack starts hitting the main distro repos, it’ll be a breeze to get going. I suppose it’s akin to any new software release in the Linux world, you’ll have to wait until your distro gets it all packaged nice and easy for you.

    • raddude9
    • 2 years ago
    • DrDominodog51
    • 2 years ago

    Meh.

    According to other reviews, Vega 64 at stock speeds only scores about 2,000 more graphics points in Fire Strike than my OCed Fury.

    I might consider getting a Vega 56 when the price goes down to ~250 dollars in a year or so.

    • K-L-Waster
    • 2 years ago

    It’s nice that Vega is out, but really I can’t see replacing either the 980 TI in my main rig or the 970 in my living room rig with either of these.

    Sticking it out until Volta I guess…

    • Takeshi7
    • 2 years ago

    No mining benchmarks? Everyone knows that’s what these cards will be bought for.

      • DPete27
      • 2 years ago

      [url=http://www.tomshardware.com/reviews/amd-radeon-rx-vega-64,5173-16.html<]30 MH/s ETH[/url<]

      Probably not worth the extra $ over Polaris for a 25% improvement.

        • freebird
        • 2 years ago

        Guess I’m gonna keep my R290s @ 29.5MH chugging along for a while more…

        We’ll have to wait and see if Rapid Math or the new instructions in Vega can be taken advantage of by mining software in the coming weeks though.

          • Srsly_Bro
          • 2 years ago

          A 1050 Ti does 11 MH/s at 65 W.

            • freebird
            • 2 years ago

            Yeah, and I have 2 1060s doing 23.5 MH/s and probably running at 75-80 W, so what is your point? The 290s were bought 2 years ago and earn at least $1.50-1.75 a day after electricity costs…

            Also have 2 1070s doing 31.5-32.5 MH/s, probably sucking down about 150 W, but those don't do FreeSync, and I'm not throwing my BenQ XL2730 in the trash just because the R290s pull more "juice", which is why I'm waiting on RX Vega[super<]56[/super<]; probably a ROG edition, maybe even two of them.

        • tipoo
        • 2 years ago

        I assume that’s not yet using the new ISA instructions specific to mining?

        [url<]http://images.anandtech.com/doci/11717/siggraph_vega_architecture_17.png[/url<]

        That might amp it up a fair bit more.

      • Krogoth
      • 2 years ago

      Not worth it for mining. ETH is close to hitting the difficulty wall. Vega 56 is the only SKU that might be worthwhile for mining other crypto-currencies.

        • freebird
        • 2 years ago

        The Ethereum dev team is probably going to change the ETH difficulty ramp and put off the changeover to PoS until the end of 2018 or thereabouts, from what I just read recently…

        [url<]https://www.reddit.com/r/EtherMining/comments/6t56a0/important_information_from_todays_ethereum_dev/[/url<]

      • freebird
      • 2 years ago

      Here you go…don’t say I never did anything for ya… 😉

      [url<]http://www.legitreviews.com/amd-radeon-rx-vega-64-vega-56-ethereum-mining-performance_197049[/url<]

      • southrncomfortjm
      • 2 years ago

      20-25% faster hashing at double the power draw of a 480… no go.

        • freebird
        • 2 years ago

        RX Vega[super<]56[/super<] isn't bad, but still not on par (power-wise) with the GTX 1070 for Ethereum mining... I haven't seen any results at NiceHash on how good Vegas are with other cryptocurrencies, but as others have mentioned, there are 40 new instructions, some for hashing/cryptocurrencies, listed in their slide, so there may yet be more to the mining story for Vega.

        • ptsant
        • 2 years ago

        There is no optimized VEGA-specific code yet. Plus it would only make sense to run Vega at the powersave profile and probably undervolt.

        I believe Vega can match and maybe slightly surpass the hashrate/W of Polaris with appropriate tuning. But this will not happen overnight.

          • freebird
          • 2 years ago

          Apparently, Vega 56 with its HBM2 overclocked to 1900 MHz is pretty good with the new AMD mining driver.

          [url<]https://hothardware.com/ContentImages/NewsItem/41855/content/Radeon-RX-Vega-Blockchain-Driver-Ethereum-Hashrate.png[/url<]

          But GTX 1060s with 6GB are a better deal; they are capable of 23.5 MH/s at less than 100 W.

    • Duct Tape Dude
    • 2 years ago

    Jeff, this reads like a 12 page advertorial–for a gold subscription.

    Well done on obtaining review samples and knocking out a TR-quality review in time for launch.

    • Chrispy_
    • 2 years ago

    Nice quick review Jeff – great to see you managed an article at launch despite the Threadripper fiasco and only two days to play with.

    Based on power usage and performance, it’s inferior to Polaris in every visible way even if you completely ignore how GP104 annihilates it in several games and at all times in the performance/dollar metrics.

    Compared to Polaris it's offering 35-70% more performance, despite having about 80% more resources running 15% faster. As if that weren't disappointing enough already, the power consumption at those clocks is ridiculous. HBM is supposed to be lower-power than GDDR5, and yet the average ~50% performance gain of Vega 64 seems to be costing nearly double the power draw of even Polaris 20 (which itself was a huge step down in power efficiency over Polaris 10).

    So, I'm curious to see what Vega [b<]*is*[/b<] good at. I'm hoping further benchmarks can investigate why it sucks so hard! We know it can number-crunch like a boss on GPGPU work, but it does seem like completely the wrong product for gaming in every conceivable way – a significant step back from Polaris and unable to make much of an argument against the much smaller GP104.

    Today in the UK, in-stock Vega 64 cards are being sold for 1080 Ti prices, and the GP102 in those 1080 Ti cards [b<]utterly outclasses[/b<] RX Vega for gaming.

      • Anonymous Coward
      • 2 years ago

      I would have liked more analysis along these lines in the article.

      It's good to see AMD show solid throughput numbers, but where is AMD standing architecture-wise? Where does this point for Raven Ridge and mobile?

        • Voldenuit
        • 2 years ago

        [quote<]It's good to see AMD show solid throughput numbers, but where is AMD standing architecture-wise? Where does this point for Raven Ridge and mobile?[/quote<]

        Vega 56 gets about 30% more performance than an RX 580 for ~14% more power using the Power Saver profile. I think Raven Ridge is looking promising.

        Vega 64 is just being run past the ragged edge of voltage/clocks in order to match the 1080.

          • Chrispy_
          • 2 years ago

          True, but comparing Vega 56’s power-saving profile to the RX 580’s horrific factory-overclock isn’t something you can do. The power saver profile is likely a clockspeed reduction and undervolt, whilst Polaris 20 is a voltage/clock/power-limit increase.

          Vega 56 gets about 40% more performance than an RX 480 running at 1200MHz/1000mV, but whilst Vega 56 is still pulling down ~225W from the wall in power-saving mode (extrapolating from the known power draw of the other three cards tested), the RX 480 in homebrew “power-saving” mode pulls down about half of that.

          Vega’s architectural advantages just aren’t any use in modern games. I’ve read some other reviews from sites that had more time with the cards, and they note that Vega is a very forward-thinking architecture. I suspect in three years from now, we’ll look back on Vega cards and see that they’re still holding up well despite their age. The problem is that they’re pretty lousy right now in terms of performance/die-area, performance/Watt, and performance/$ compared to Polaris and GP10x.

            • Voldenuit
            • 2 years ago

            Don’t we have 3 Bn transistors on Vega 56/64 just to get the clock rate bumped up?

            Raven Ridge won’t be needing those, so the IGP might be more space-efficient than Vega 56/64, depending on how many resources AMD is able to spend on optimizing the die. If they get rid of those, it’s going to be more power efficient too, as it won’t need to power ‘dark silicon’ that’s doing nothing directly useful for the graphics pipeline.

            If, as you say, the Vega architecture is more forward-looking, then it sounds like a good match for an APU line that can be produced for the next 5-7 years, with minor tweaks to the CPU and GPU portions as needed.

            • Chrispy_
            • 2 years ago

            It’s kind of sad that this dark silicon related solely to clock speed optimizations seems to have [b<]completely failed[/b<]. Vega64's base clock is actually lower than the base clock of an RX580 and it would appear that unlike Polaris 20, Vega can't stay at boost clock under load. 3Bn transistors of power-guzzling, cost-increasing, fan-noise-producing failure.

            • Mr Bill
            • 2 years ago

            [quote<]Don't we have 3 Bn transistors on Vega 56/64 just to get the clock rate bumped up?[/quote<]

            Yeah, and all those pipelines, when slightly biased on, make for more power draw at every clock speed. But I wonder, what about the floating-point architecture, which is why bitcoin miners love these cards? Is doing the same computations with an FP-intensive architecture more power hungry than Nvidia's design that de-emphasizes FP?

      • NovusBogus
      • 2 years ago

      I happened to be at Microcenter today, and they had posted a page from someone else’s review showing that a Vega based ETH mining rig is comparable to Fury X but at much lower cost and power use. So that’s one thing it does well, or at least more efficiently than that which came before. Interestingly, they also mentioned that Nvidia has requested rationing of 1070 sales so the miners don’t take the whole stock.

    • DPete27
    • 2 years ago

    A 1440p Ultra card. Priced too high. Not doing any favors for the burgeoning FreeSync monitor market. I guess Nvidia can sit comfortably on GSync for another couple years at least.

      • cynan
      • 2 years ago

      Instead, the sad irony is that AMD is using FreeSync as an excuse for an under-performing product, when I think the overall market reception will be more in line with your assessment.

      Then again, you can hardly fault AMD given how much less money they are poised to make off of Vega compared to Pascal (at least for the gaming SKUs), and the added insult that Nvidia gets royalties for every G-Sync monitor sold.

    • Fonbu
    • 2 years ago

    Manic Monday!

    The frame times seem decent on Vega, for the most part. I realize they are not really 4K-oriented GPUs, but it would be interesting to see 4K benchmarks. Another review noted that Vega loses a fair amount of steam when traditional MSAA is activated.

    Anyway, at this point AMD Vega will mostly appeal to the brand loyalists, and they will buy this option for gaming. Factor in the talk about miners buying them up, too, once mining benchmarks have been done.

    Something else to think about is that monolithic dies are almost reaching their apex. The cost associated with producing such a chip is huge. With many smaller dies, in theory, you get lower defect rates.

    • MaxTheLimit
    • 2 years ago

    The RX Vega 56 seems interesting. I'm back and forth between the 56 and the 1070. I don't see any sign of the 56 being listed yet. If the price of the 1070 comes back down, I'll probably go that way. If – and that's a big if – the 56 is listed soon for a lower price, I'd go with that one.

    The power draw doesn't bother me much. If I do decide on the 56, I might wait a short while for some better, and quieter, aftermarket coolers.

    I've been patient, and I'm not loyal to either side. Whichever one gives me the best performance per dollar will be getting my money. I don't have much interest in the 1080 or the 64. They are a big jump in price for a small performance boost.

      • DoomGuy64
      • 2 years ago

      Yeah, I’m probably going to wait for aftermarket Vega 56, as that will likely be the best option to match my freesync monitor. Powerdraw on the 56 doesn’t seem too bad, performance is a little disappointing, but as Doom shows it can really ramp up with proper optimization. Hopefully driver improvements and game optimization will catch up to the hardware.

      • Ryhadar
      • 2 years ago

      Vega 56 is rumored to be released on 8/28, last I heard.

        • MaxTheLimit
        • 2 years ago

        I can wait. What’s 2 weeks? I’m interested to see how the two cards stack up in terms of ACTUAL price when they are released. Will the 1070 see a drop down closer to the ‘price drop’ price?

    • ronch
    • 2 years ago

    So, a bit slower than a 1080 and draws 50% more power. That’s… not unexpected but nonetheless a bit disappointing.

    I wonder what AMD is missing here. What is Jensen’s secret sauce?

      • freebird
      • 2 years ago

      A Polaris with a few more compute units and 64 ROPs vs. the current 32 could've been interesting… I always wondered why they limited Polaris to just 32 ROPs when the GTX 1070/1080 have 64…

    • kvndoom
    • 2 years ago

    TR should do coin mining benchmarks too, since most of these cards will get bought out for that reason anyway. 😛

      • chuckula
      • 2 years ago

      You’re actually not wrong to say that.

        • Firestarter
        • 2 years ago

        it’d be useful to predict the actual price that these cards will go for, instead of the MRSP. That would make the scatter plot more useful as well, given that the mining performance of the other cards is already factored in their retail price

        • raddude9
        • 2 years ago
      • jihadjoe
      • 2 years ago

      I’m thinking of getting one if it can mine BCC.

      • morphine
      • 2 years ago

      AMD really should start pricing its cards in Bitcoin / Ether instead of USD. =)

      • ptsant
      • 2 years ago

      On other sites, the RX Vega 64 has been shown to get 31 MH/s, which is quite respectable and better than both the 1070 and the 1080 (the 1070 is actually faster than the 1080).

      However, it appears that Vega has mining-specific (or at least, helpful) instructions that are not found in previous GCN cores and are currently not used. I suspect that with some tuning Vega should improve significantly.

      Using the current GCN assembly kernels, the RX 580 remains more efficient and cost-effective (at MSRP or close to it).

      • PrincipalSkinner
      • 2 years ago

      I suggested this ages ago. Someone replied saying those benchmarks are done on some other websites.

      • Krogoth
      • 2 years ago

      I really don’t think so. Mining in itself is a fool’s errand for the economically illiterate a.k.a the majority of those doing it.

      Vega 64 has too much upfront hardware cost ($699+ USD, and $1,000+ USD for bundle deals) and eats too much power. You are better off getting an RX 480/470 or a 1070 for mining despite their massive mark-ups.

        • southrncomfortjm
        • 2 years ago

        Right. Mining normally only makes sense if you already have the hardware ready to go and start early in the craze. I don’t see how anyone who bought 4-5 Polaris cards at $300-$500 made anywhere near enough money to cover their costs in the last 6 months.

        I’ve been mining on my RX480 for about 2 months and have made about $85, which is about 40% of what I paid for the card back in December 2016. Assuming 4 more months of steady mining and I may about pay for the card that I got for below MSRP at the time, before the craze.

          • jihadjoe
          • 2 years ago

          IMO it’d have been better just to have bought BTC as a speculator. No hassles of setting up mining rigs, figuring out electricity costs etc., and value jumped from $2000 to $4000+ in the last few weeks.

      • southrncomfortjm
      • 2 years ago

      It's too easy to get a very good idea of what these cards can do mining-wise just from simple searches. The Vegas do about 30-35 MH/s compared to the 25-26 I can get on my RX 480. Nothing mind-blowing, and it means it would take about a year to recoup the cost of the GPU, assuming 24/7 mining and factoring in power costs.
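
      If anyone wants to run the payback math themselves, here's a toy sketch (Python). The hashrates, power draws, and especially the revenue-per-MH/s figure are placeholders in the ballpark being discussed, not measured numbers, and they drift with difficulty and coin price:

          # Toy mining-payback estimate. All inputs are placeholder assumptions;
          # plug in current prices, difficulty-adjusted revenue, and your power rate.
          def payback_days(card_cost_usd, mh_per_s, card_watts,
                           usd_per_mh_day=0.08, usd_per_kwh=0.12):
              revenue = mh_per_s * usd_per_mh_day                  # gross $/day
              power_cost = card_watts / 1000 * 24 * usd_per_kwh    # $/day, 24/7 mining
              net = revenue - power_cost
              return card_cost_usd / net if net > 0 else float("inf")

          print("RX 480 (assumed $250, 25 MH/s, 150 W):", round(payback_days(250, 25, 150)), "days")
          print("Vega 64 (assumed $600, 33 MH/s, 300 W):", round(payback_days(600, 33, 300)), "days")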

    • Voldenuit
    • 2 years ago

    Is the FPS/$ chart going by ASP instead of MSRP? The review states that the MSRPs of the 1070 and 1080 are $379 and $499 respectively, but the 99th percentile chart has them at $475 and $550.

    Granted prices have been inflated lately due to mining, but comparing MSRP for one series to ASP for another isn’t exactly apples to apples.

      • homerdog
      • 2 years ago

      [quote<]comparing MSRP for one series to ASP for another isn't exactly apples to apples.[/quote<]

      More like it's complete BS. Hope Jeff corrects this.

    • Duct Tape Dude
    • 2 years ago

    Great Scott,
    who art at AMD,
    Damage be thy name.

    Thy Vega be shown,
    thy work be done,
    in GPUs as they are in APUs.

    Give us this day our latest benchmarks,
    and forgive us our fanbois,
    as we forgive those who only measure FPS.

    And lead us inside the second,
    but deliver us from closed standards,
    Amen.

      • Neutronbeam
      • 2 years ago

      All hail the Wassoning!

        • Growler
        • 2 years ago

        Here we come a-Wassoning
        With no cards of green,
        Here we come benchmarking
        Results to be seen.
        Love and joy come to you,
        And to you your Vega, too,
        And Su bless you, and send you
        Happy new gear,
        And Su send you happy new gear.

    • MileageMayVary
    • 2 years ago

    Guessing they can close most of the performance gap, but I doubt there's much that can be done about that power usage.

    Does AMD not have a contract with TSMC any more? I thought they renegotiated their GF contract to be less GF-bound?

    • deruberhanyok
    • 2 years ago

    Jeff thanks for working your fingers to the bone to get this out so quickly!

    Performance falls about where I expected it might after all the various rumours and preview talks – top-end SKU matches a 1080 and the cut-down one matches a 1070. That’s a little disappointing given it’s over a year since the Pascal launch. At the same time, if the prices are competitive (which, I guess we’ll have to wait for the mining craze to give out before any video cards are reasonably priced), it means there’s a viable alternate option, and that’s pretty cool.

    The one thing that’s just kind of mind-boggling is the power draw / performance. Having built-in selectable power profiles is AWESOME and I love that they’re doing that. Neat trick with the BIOS switch too. But seeing how little of a performance drop there is using “power saver” vs. the actual decrease in power used really makes me question why they thought the extra few percent was worth it.

    I know in end-use the power draw of a GPU doesn’t really matter for most of us – it’s one video card in your house, being used only when you’re actually gaming (not surfing TR), so yes, practically speaking it doesn’t make a difference. If you’re running a mining farm it might, but at that point, well, you deserve whatever problems you get because it’s your fault GPUs are all overpriced right now.

    But the optics of that power draw are a killer. It gives this (not incorrect) perception that AMD cards are hot and loud and pushed to their absolute limit, and having the ability to knock it down by over 80W with barely a difference in performance really makes me wonder why they didn't just DEFAULT to the lower value and offer some "built-in one-click overclocking" instead.

    Eh, regardless, the tech looks like it will serve as a good base for future product generations and I have a feeling it will look more attractive with future refinements (and again, if price stays competitive). I’d really hoped that the RX 580 would be half of Vega 64 instead of rebadged Polaris 10, but maybe next gen.

      • DPete27
      • 2 years ago

      See how nicely that Vega 56 lines up against the GTX1070 in the performance summary…..yeah, that’s why 3fps matters to AMD.

    • Ryhadar
    • 2 years ago

    Been refreshing Newegg since 8:55am this morning. Searching for "RX Vega" and using their power search didn't even turn up any results except for the Frontier Editions until about 9:08am. With prices at $699 for nearly all of the cards (bundled with the "free" games), I was holding out for the limited-edition silver as I watched the black shrouds disappear. Never saw the silvers come in stock.

    I can justify spending a little more on a limited edition run, but no way am I spending $100 more on a reference card and two games I may not even like.

      • southrncomfortjm
      • 2 years ago

      Not really sure why AMD even releases reference cards. They are invariably far inferior to 3rd party options and make their initial benchmark numbers look bad due to high noise, high temps, and resulting throttling. Just seems like a waste.

        • Voldenuit
        • 2 years ago

        The reference blowers from AMD and Nvidia tend to be better than add-in card makers' custom blowers. The AIB board makers devote more effort to open coolers, because those have better absolute performance, but some users prefer the blowers, especially people with small or cramped cases who don't want to dump waste heat back into the chassis.

        Also, having reference boards is attractive for users who want to install an aftermarket water cooling loop, as these are the parts that get the first custom waterblocks.

        • LoneWolf15
        • 2 years ago

        AMD’s reference cards used to have better voltage regulation. The quality of MOSFETs, VRMs, etc. is guaranteed and consistent. And when you started getting non-reference cards, you might find that to get the price down, someone cut out one power phase, or did other things that would make the card less expensive, at the cost of being able to overclock at all.

        It used to always (read: Radeon 4xxx/5xxx/6xxx times) be preferable to buy a reference card, because then you knew for sure what you were getting. That mentality has changed as vendors have tried to make improved cards rather than cut them down, but it could always go the other way again. Also, AMD's reference coolers exhaust air out of the case, and that's not always true of others even if they have more fans.

    • End User
    • 2 years ago

    I’m going to wait for Gigabyte to release their dual fan cooler edition before I buy a Vega 64.

    • geniekid
    • 2 years ago

    Hopefully the 64 Liquid will follow the same path as the Fury X with respect to price. There were a couple of months when the Fury X was sub-$350 which was an incredible deal at the time.

    • tipoo
    • 2 years ago

    There’s a cryptocurrency specific ISA addition re:the press slides, eh. People wanting it for graphics may hate it, but this may see huge demand from miners.

    Also interesting to have numbers on FP16 gains, looks like in the 20 percents. I.e not a magic doubling as some would say.

    Vega 56 seems well positioned for now. 1070 performance at a decently lower price. Question is if Nvidia can/will drop that price on a whim with enough margin (with a smaller die in theory they could, but AMD is probably getting low margins on these). Vega 64 is a far less clear value prospect, in one way it’s similar to the 1070 vs 1080, but with Nvidia you’re actually getting the best, which 64 can’t claim.

    • NTMBK
    • 2 years ago

    Good value performance, but holy cow, that power consumption. I thought stacked memory was meant to bring power consumption under control, not make things worse.

      • chuckula
      • 2 years ago

      The stacked memory isn’t making things worse, it just can’t fix the other issues in the architecture.

      It’s kind of like how having a squirt gun at a warehouse fire technically doesn’t make the fire worse, it just can’t really help put it out.

        • NTMBK
        • 2 years ago

        Hey, don’t just blame the architecture. GlobalFloundering deserves some of the blame too.

      • tipoo
      • 2 years ago

      It’s despite HBM, not because of it. AMD is pushing things well past their optimal frequency/voltage spot to get performance in the same ballpark.

      [redacted, page 11], Polaris doesn't do too badly at perf/watt at its peak efficiency point; the problem is that's not up to Nvidia performance, so out of desperation AMD has to push it past where I assume it was designed to run.

      See 35W Radeon Pro 560 in the rMBP. It’s slower than the 1050, but not so much so that 35W vs 50W isn’t impressive.

        • TwistedKestrel
        • 2 years ago

        Did you see page 11? AMD more or less put the low hanging fruit in the box

        • Jeff Kampman
        • 2 years ago

        You don’t have to wonder about the underclock, I did it right here: [url<]https://techreport.com/review/32391/amd-radeon-rx-vega-64-and-rx-vega-56-graphics-cards-reviewed/11[/url<]

          • tipoo
          • 2 years ago

          Somehow, I saw that, and still posted my comment.
          *mutters something about coffee*

    • deathBOB
    • 2 years ago

    So do overclocked 1070s and 1080s just kill these things? Maybe a 56 can overclock but there is clearly no headroom on the 64.

      • cynan
      • 2 years ago

      While Vega is an abysmal overclocker, 1070s and 1080s don't tend to commit all that much additional carnage when overclocked. With GPU Boost 3.0, FE cards approach around 1900 MHz stock. With some tweaking, you get up to around 2050 MHz. Springing for a card with a better aftermarket cooler gets you around 2100 MHz, maybe a handful more if you're lucky. Voltage-wise, Pascal is pretty locked down, and this is reflected in the limited overclocking.

    • maroon1
    • 2 years ago

    People who waited for Vega should hit their heads on the wall for making one of the worst decisions.

    Vega will no doubt cost much more than the official price because of mining. Look at the current price of the RX 580 on Newegg. It costs around 350 to 400 dollars depending on the brand.

    It's a bad time to buy a new GPU because of the price increases.

    People who bought a GTX 1080 or GTX 1070 a few months ago, before the prices went up, made the best decision. Waiting for the more power-hungry Vega was just a waste of time.

      • shank15217
      • 2 years ago

      Hyperbole much? If Vega becomes a mining card kudos to AMD for selling every VEGA they make.

        • Krogoth
        • 2 years ago

        Unless another crypto-currency becomes the next ETH/BTC in the near future, Vega 64 is a sub-optimal choice for mining. The low-hanging fruit for ETH is all gone and it is rapidly approaching the difficulty wall.

          • freebird
          • 2 years ago

          Apparently not so…
          [url<]https://www.reddit.com/r/EtherMining/comments/6t56a0/important_information_from_todays_ethereum_dev/[/url<]

          Profitability will drop, but not completely, and maybe some less-efficient miners will drop out when Sept 24, 2017 rolls around.

      • The Egg
      • 2 years ago

      So the people who saved their money should hit their heads on a wall, while those who immediately paid full MSRP made the “best decision”, even though the mining boom was completely unforeseen and couldn’t possibly have factored into their decision.

      Got it.

      • brucek2
      • 2 years ago

      It’s not just a “few months.” The 1080 was available 14 months ago. Or, if you just discovered gaming this year, the 1080ti was available 5 months ago. The 1070s came just a couple months later IIRC.

      Take away the time element and I think AMD has the easily superior value, factoring in the significantly less expensive FreeSync monitors. I’d have gone AMD if they were both available in the same window.

      But you just can’t ignore being 14 months late to the party. That’s an eternity in tech. If you’re going to be getting stuff that much later than everyone else, the discount actually needs to be a lot steeper than it is – think of what a 14 month old game on Steam sells for compared to a new one, or a movie, or a computer hardware in general.

      • tipoo
      • 2 years ago

      Sometimes wait and see pans out, sometimes it does not, waiting is hardly the worst decision someone could make in tech.

      • Gastec
      • 2 years ago

      What about me? I have been “waiting” to upgrade my system for years now, I’m on a GTX 670 here. What am I to do? Please enlighten me 🙂

    • Anovoca
    • 2 years ago

    Thanks for the review, Jeff. We will see you back here in 12 hours after you catch some sleep for the first time since last weekend, go outside for some fresh air, then maybe get an IV to refuel after a week's diet of beer and Red Bull.

    As a side note, you mentioned wanting to revisit several topics more in-depth at a later time. I think I speak for many of us when I say, we would be more than happy to consume this information in podcast form and let you give your fingers some rest.

    • Bauxite
    • 2 years ago

    $200 overpriced, a very shiny and late turd. Hype and fanboyism will sell that for a little bit, but reality needs to come down on the price hard.

    The same Newegg email linking to a bunch of sold-out $600 air-cooled cards has a $750 1080 Ti. The non-limited editions were the same.

    There was one token $500 model, the cynical side of me wonders if AMD paid the difference just so their marketing people wouldn’t have a complete and total lie.

      • derFunkenstein
      • 2 years ago

      I have a feeling that the launch price isn’t AMD’s doing, but rather resellers cranking it up. You’re seeing token $500 models most likely to fulfill some sort of contractual obligation to AMD, but the rest is just money lining Newegg’s and Amazon’s and eBay resellers’ pockets.

      • tipoo
      • 2 years ago

      The 64 is. The 56 is priced aggressively for 1070 performance.

        • Bauxite
        • 2 years ago

        What price, the fantasy price or what it will actually be gouged for with low supply? Meanwhile 1070s will be available and cheaper.

          • derFunkenstein
          • 2 years ago

          Uh, GTX 1070s are being gouged, too. $449+ on Newegg right now. That’s $100+ over Nvidia’s suggested retail. We never did see many hit that $350 mark after the price drop was announced.

            • freebird
            • 2 years ago

            I said this before and I'll say it again: sometimes it is even cheaper to buy direct from evga.com.

            Still over-priced, but in the current market; it is what it is…price is relative to the desire of the buyer…

            [url<]https://www.evga.com/products/productlist.aspx?type=0&family=GeForce+10+Series+Family&chipset=GTX+1070[/url<] $469

            [url<]https://www.newegg.com/Product/Product.aspx?Item=N82E16814487320[/url<] $499

            • derFunkenstein
            • 2 years ago

            I said they’re all $449+. $469 qualifies as $449+, amirite?

            • freebird
            • 2 years ago

            Correct me if I’m WRONG, but I DID NOT say anything about the $449+ prices that you stated, did I?

            I’m just recommending to ANYONE looking for cards that the BEST PRICE can be found OTHER than NEWEGG… AMIRITE???? Hell, I even bought an EVGA 1070 for $449 on ebay FROM newegg while newegg was had it listed on their own website for $479 last month. My gripe is with newegg, so I don’t what your problem is…

            Everyone can get gored or gouged less by shopping at the OEM site… which was MY point and it pretty much goes for anything being bought up my miners. I saved $30 or $40 on a EVGA 1300w G2 PSU direct from EVGA (with 10% discount EVGA gave me for using their Power supply estimator) instead of going to my old reliable of newegg.com…found Provantage also had better prices on some identical PSUs than newegg.

            • derFunkenstein
            • 2 years ago

            That’s great but even the manufacturers are taking advantage of this situation (meaning, they’re also happy to gouge buyers). I mean, we’re apparently willing to pay those prices so they might as well. Only fault is our own.

            • freebird
            • 2 years ago

            Once upon a time at University,
            I heard some crazy idea in economics class about supply and demand.

            I demanded a non-crazed professor, so I could buy a Video card at the price I wanted, not what they were asking/demanding.

            The University told me I could have a “safe space”, but they didn’t sell video cards there. 🙁

            😀

            • derFunkenstein
            • 2 years ago

            Exactly. Since we're willing to pay it, they're willing to charge it.

          • tipoo
          • 2 years ago

          Well, I can only go off MSRPs at this point, can't I? Which price was the "$200 overpriced" referencing, then?

    • Neutronbeam
    • 2 years ago

    Hey gerbs, why am I not seeing some love thrown at the Kampman for working his little test bench to the bone to get us an awesome review ASAP after being dissed by AMD and getting a sample after everyone else (and their cats) got theirs?

    Jeff, great, thorough work–you set the bar high for detailed, thorough, clearly articulated technical reviews–and there’s nobody out there better.

    Thanks for this and for all the hard work you do for the site and for us readers.

    EDIT: Added “and” and took out comma and added “do” and added “o” to “thorough”–posting way too fast.

    • ptsant
    • 2 years ago

    Do the TDP profiles simply enforce a power limit (with default freq/volt) or do they actually modify the freq/volt curves?

    The power limit slider already gives an option to tune the available power envelope, so it would be trivial to just use this functionality for TDP profiles.

    The answer should be obvious from the Radeon Wattman trace.
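
    For the Linux folks, one way to poke at this yourself is via the amdgpu sysfs interface. A quick sketch (Python) that dumps the power cap and the DPM state tables, assuming the standard amdgpu paths and that the card is card0 (adjust as needed):

        # Dump what a Wattman/TDP profile might actually be touching via amdgpu sysfs.
        # Paths are the usual amdgpu ones; card index and hwmon number vary by system.
        import glob

        card = "/sys/class/drm/card0/device"

        def read(path):
            with open(path) as f:
                return f.read().strip()

        # Power cap is reported in microwatts. If only this differs between profiles,
        # they are plain power limits on top of the default frequency/voltage curve.
        for cap in glob.glob(card + "/hwmon/hwmon*/power1_cap"):
            print("power cap:", int(read(cap)) / 1_000_000, "W")

        # If the available GPU/memory DPM states change instead, the profile is
        # swapping frequency (and implicitly voltage) states, not just capping power.
        print("sclk states:\n" + read(card + "/pp_dpm_sclk"))
        print("mclk states:\n" + read(card + "/pp_dpm_mclk"))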

    • brucethemoose
    • 2 years ago

    One thing to remember is that these have almost no overclocking headroom. A 1070 or 1080 Ti will have a good 8% perf increase from a factory OC, and another 8% from manual tweaking on top of that.

    Another is that the 1070 is over $100 above MSRP in the price/perf graph. You can get a used 980 Ti for $320, and it'll match Vega 56 while using less power (which in itself is rather interesting).

      • DPete27
      • 2 years ago

      Your OC numbers seem a little off. Even Pascal FE cards OC themselves considerably higher than their written specs. And reviews are reporting on those numbers, not the advertised frequency.

        That said, your comment about Vega having effectively zero OC headroom is spot on. PCPer couldn't even get an extra 1% out of their Vega 64 Liquid.

        • brucethemoose
        • 2 years ago

        I meant % over stock FPS, not necessarily clocks.

        Yeah, 64 is clearly at some kind of voltage wall already.

      • Voldenuit
      • 2 years ago

      [quote<]One thing to remember is that these have almost no overclocking headroom.[/quote<]

      PCPer [url=https://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-RX-Vega-Review-Vega-64-Vega-64-Liquid-Vega-56-Tested/Clocks-Power-<]tried overclocking the 64 Liquid[/url<], and were only able to eke 15 MHz (!) out of it. In their words:

      [quote<]Starting at the flagship level with the RX Vega 64 Liquid, I was only able to squeeze another 15 MHz out of the card. In fact, even increasing the clock speed by 1% in the Wattman slider would result in a crash or a black screen, even with the temperature maxed out at 70C and the power target slider moved to +50%.[/quote<]

      It did manage to gulp down 440 W of card power for that privilege, though -_-. Maybe they got a bum copy, but you'd think AMD would be putting their best chips in their most expensive SKU.

        • chuckula
        • 2 years ago

        Given the results of TR’s review and the other results from around the web, I would suggest that AMD really emphasize the reduced-power modes since they at least cut the excess power & heat while having a minimal impact on performance.

        An overclocker’s dream Vega is not.

        • cynan
        • 2 years ago

        Probably hit a current supply wall. How is 440W with only 2 8-pin PCIe connectors even possible without modding anyway?

          • Voldenuit
          • 2 years ago

          You are correct that a card with two 8-pin PEG connectors would be rated for 375 W of power draw (2x150W + 1x75W). But most good PSUs have no problem exceeding the PCIe power spec on the PEG lines; what you don't want to do is exceed the motherboard's limit, as that puts unnecessary load on the motherboard circuitry (although the same argument could be made that high-quality mobos should be over-designed and able to cope with said load).

          At the 345W TDP spec of the 64 Liquid, there isn't a whole lot of headroom to begin with.
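
          To put rough numbers on that (a sketch in Python; the spec limits are the standard PCIe/PEG ones, the 440 W figure is PCPer's, and the even split across connectors is an assumption):

              # Back-of-the-envelope PCIe power budget for a 2x 8-pin card.
              SLOT_W = 75          # PCIe x16 slot limit per spec
              EIGHT_PIN_W = 150    # per 8-pin PEG connector per spec

              rated_budget = SLOT_W + 2 * EIGHT_PIN_W   # 375 W
              observed = 440                            # W, PCPer's overclocked 64 Liquid

              print("over the rated budget by", observed - rated_budget, "W")

              # Assuming the slot stays at its 75 W limit and the rest splits evenly:
              per_connector = (observed - SLOT_W) / 2
              print("per 8-pin connector:", per_connector, "W (rated for", EIGHT_PIN_W, "W)")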

      • freebird
      • 2 years ago

      Most of the 980 Tis I see on eBay are $350-$500… a few are listed under $300, but the bidding isn't over on those either…

      Besides that, going forward DX12 games will be the norm, which is where Vega performs best, and (hopefully) driver updates will improve its current performance.

      So personally, I'd go for a GTX 1070/1080 or Vega 56/64 depending on whether you prefer G-Sync/FreeSync.

      I'll probably look for a Vega 56 with a custom cooler or liquid cooling, since it will probably clock as high as the 64, but that's another 2 WHOLE WEEKS away at least; probably early Sept., as Asus stated for their ROG versions.

    • chuckula
    • 2 years ago

    Good news: Vega 64 is listed on Newegg: [url<]https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=vega&N=-1&isNodeId=1[/url<]

    Bad news: All of them are out of stock.*

    Worse news: The air-cooled versions of the RX 64 (not the high-end versions) are listed at $600, so forget about some big discount over the GTX 1080.

    * Note: By way of contrast, Threadrippers were fully in stock on launch day -- [b<]and[/b<] were listed at the regular MSRP -- and were actually available a little before the review NDA lifted.

      • DPete27
      • 2 years ago

      With the recent upswing in BTC/ETH price, the crypto craze is still in effect.

        • erwendigo
        • 2 years ago

        But the ETH difficulty has gone up so much that, in reality, the cost of mining for a given economic return is greater now than before.

        It's nonsense. The cryptomining craze is headed for another crash, like the many that have happened before. Hell, ETH mining had a crash a month ago.

        When speculators find a new altcoin to use as an "easy way" to get BTC, ETH will crash for good, and that's only relevant IF the difficulty doesn't grow to levels that make GPU cryptomining pointless (cost of GPUs and power consumption greater than the value of the coins mined), a point I'm not sure has been reached yet.

      • raddude9
      • 2 years ago
        • RAGEPRO
        • 2 years ago

        You guys should really take this to a private message.

          • derFunkenstein
          • 2 years ago

          preach it brotha

          • NovusBogus
          • 2 years ago

          But sir, that’s not a very Internet friendly approach. The tube gods demand public drama!

        • chuckula
        • 2 years ago

        [quote<]Just to clarify, GCC does not "experiences random crashes when running on RyZen". A few users have reported crashes (seg-faults) on large gcc compiles, it's not something that I or, it seems, most Ryzen gcc users have encountered. [/quote<]

        That's the typical combination of arrogance & ignorance I expect from you, and that was a pretty insulting statement towards the people who [b<]actually[/b<] do real work with GCC and who obviously knew a hell of a lot more about what was going on than you did.

        Incidentally, this statement is even worse:

        [quote<]The workaround of disabling ASLR is mentioned a number of times in the AMD forums link you posted:[/quote<]

        So first, you tell everybody to turn off a vital security feature, ASLR, because now all of a sudden it's Linux's fault that AMD had an erratum in RyZen? DAYUM, that is not only insulting to people who actually understand software, but it literally was wrong advice that also encouraged people to open up new attack paths on their RyZen systems.

        No apologies from me whatsoever, and I'm getting pretty tired of your fact-free empty insults.

          • Jeff Kampman
          • 2 years ago

          stahp

            • chuckula
            • 2 years ago

            I have no problem whatsoever not making any out of the blue attacks on Raddude9 if he agrees to stop attacking perfectly legitimate posts in unrelated articles.

            If you want factual evidence that I'm operating in good faith here, just look at my posts in the official story that revealed these bugs: [url<]https://techreport.com/news/32362/amd-confirms-linux-performance-marginality-problem-on-ryzen[/url<]

            Not one personal attack on Raddude9, even though he was proven to be wrong.

            • bjm
            • 2 years ago

            stahp already.

            • southrncomfortjm
            • 2 years ago

            No no, keep going. This is *really* important.

            When’s the slap fight start?

          • raddude9
          • 2 years ago
          • Action.de.Parsnip
          • 2 years ago

          While it's nice to swagger around like you know what you're talking about, which some likewise get pretty tired of, may I point you to the realworldtech.com forum where Linus *himself* comments. Make a sentence out of these words yourself educate go.

            • chuckula
            • 2 years ago

            What, you mean this one where Torvalds says one of the excuses floating around for the bug isn’t correct:

            [url<]http://www.realworldtech.com/forum/?threadid=170454&curpostid=170628[/url<]

            Or how about this one, where he mentions nothing about ASLR whatsoever but does suggest that the bug -- which AMD hasn't actually tracked down yet -- could be in the TLB logic:

            [url<]http://www.realworldtech.com/forum/?threadid=170454&curpostid=170518[/url<]

            Nope. Not seeing anything where the founder of Linux went around saying that there's nothing wrong whatsoever in RyZen and that you should just turn off ASLR since it's broken.

            • raddude9
            • 2 years ago
            • cygnus1
            • 2 years ago

            Getting out of hand with the language guys…

            • derFunkenstein
            • 2 years ago

            It’s not guys getting out of hand with language, it’s guy. Say what you want about chuckula, he doesn’t have to get his point across with profanities.

            • cygnus1
            • 2 years ago

            Either one of them can stop though….

            • derFunkenstein
            • 2 years ago

            Totally agreed.

            • cegras
            • 2 years ago

            I don’t see the difference between douche and fancy ways of calling someone stupid, and profanities.

            • derFunkenstein
            • 2 years ago

            Fair. He was at least supporting his argument, but everyone could de-escalate.

            • raddude9
            • 2 years ago
            • derFunkenstein
            • 2 years ago

            I would prefer one of you guys be the bigger person and just stop attacking. But if you both just absolutely refuse, then I’d prefer you at least get creative with your insults. /popcorn

            • Chrispy_
            • 2 years ago

            Oooh, all his comments are gone. Is that a banhammer?
            I shouldn’t have spent so long getting my popcorn :'(

            • derFunkenstein
            • 2 years ago

            One of them had an “edited by Jeff” indicator but now they all show edited by raddude9 himself.

            Nobody actually deleted anything, because deleted posts have huge text that says Post Deleted.

            • DPete27
            • 2 years ago

            Ugh, grow up. It’s not like Chuckula killed your cat or anything.

      • shank15217
      • 2 years ago

      They were just listed; they show "out of stock" because they were never in stock. Vega 56, the one that's actually going to be worth buying, is only available 2 weeks later.

      • freebird
      • 2 years ago

      Yeah and what cryptos can you mine on TR? NiceHash doesn’t have a rating for it, but they support CPU mining.

        • chuckula
        • 2 years ago

        There was a reason I wanted TR to review cryptomining on the i9 parts to emphasize how much they suck at coin mining.

        It was a defensive policy to protect the i9 supply.

          • freebird
          • 2 years ago

          Well in that case all the surplus FX-9590 are “safe” from “exploitation” also… whew, I was worried for a second…

          • freebird
          • 2 years ago

          And the lower end i9s with only 24 pcie lanes are protected from using more than 2 Vegas thereby protecting it from catching fire… what do you call an overclocked i9 18 core with 3 OCed Vegas?

          Fire Starter!!!

          ;D

    • ptsant
    • 2 years ago

    There are no surprises when it comes to performance: practically equivalent to the 1070 and 1080. Power consumption is also not a surprise, and the "Power Saver" profile is the one the card was supposed to run at. Drivers also appear to have been decently tuned, and the 99th-percentile FPS results are good. No hiccups there.

    The only real surprise for me is the price, especially of the RX 56. If it drops below MSRP (depending on the cryptomining situation), it will be a tempting card for 1440p.

    Overall, this is a less spectacular failure than what I expected.

      • chuckula
      • 2 years ago

      [quote<]Overall, this is a less spectacular failure than what I expected.[/quote<] THANKS FOR YOUR SUPPORT! -- AMD marketing

        • derFunkenstein
        • 2 years ago

        [quote<]Overall, this is...spectacular...[/quote<] --back of the box quote

          • Arbiter Odie
          • 2 years ago

          You’re hired!

          Sincerely,
          AMD’s Marketing Department

        • raddude9
        • 2 years ago
      • cynan
      • 2 years ago

      I agree that this launch is no big win for gamers. However, though the two are related, it’s more the lackluster performance that accounts for this than the price.

      The only thing more tragic than the underwhelming competitiveness of the price compared to prior AMD gaming GPU launches would be to see lower MSRPs and have AMD sell the little inventory they seem to have at a loss (given how expensive Vega must be with the die size and HBM2) to miners.

      If and when AMD has inventory that doesn’t sell, they’ll be able to cut the price.

      • the
      • 2 years ago

      I would say performance was a bit of a letdown overall. I was expecting the Vega 64 to slide in between the GTX 1080 and GTX 1080 Ti, but right now it is pretty much on par with a vanilla GTX 1080. I suspect driver updates can get a bit more performance out of the card, but it shouldn't change the overall ranking.

      Power Saver should have been the default, with a more aggressive low-power profile for those willing to give up a bit of performance. Seriously, shipping the current Power Saver profile as the default, given its performance and power consumption, would have saved AMD some embarrassment here. The power consumption by default is just embarrassingly bad given its performance against a card that launched 14 months earlier and consumes less energy.

      I strongly suspect that AMD originally wanted to charge more for these cards but realized where they'd fall and priced accordingly. These chips are not cheap to make, so AMD needed premium performance to justify the premium manufacturing techniques (interposer, HBM, etc.). Knocking it down a notch in the pricing hierarchy eats into the margins AMD needed. For end consumers, though, this isn't a good thing. I just wonder how long there will be supply issues, as AMD shouldn't be in such a rush.

        • ptsant
        • 2 years ago

        Depending on which site you read, the RX 64 sits slightly above or slightly below the 1080. For marketing purposes you can count this as a “token” victory, which is exactly what I predicted/expected. I never expected the RX 64 to be significantly faster (the way the 1080 Ti is faster) than 1080.

        I agree about the supply issues you mention. If AMD can keep the supply of Instinct ($$$$) and Vega FE ($$) cards based on the same chip, then that won’t be much of a problem, financially speaking. Anyway, at least for the beginning I don’t expect miners to snatch the cards up so there should be some left over for gamers.

        To be clear, Vega does have an embarrassing amount of compute power (raw TFLOPS), 2:1 fp16:fp32 throughput (which Polaris doesn't have), additional instructions (some for mining), and really, really good open-source support (see the Phoronix article for that). At best, the platform should age better than the 1080, given the physical presence of more powerful hardware.

    • Anovoca
    • 2 years ago

    [quote<]I decided to test both the power consumption and noise levels of these cards while also exploring the performance implications of the three TDPs available from Wattman in the BIOS switch's default position.[/quote<]

    Are these noise-level charts still forthcoming? I don't remember seeing them anywhere in the review. I am curious how the stock blowers compare to the FE.

    • USAFTW
    • 2 years ago

    [url=https://www.youtube.com/watch?v=_O1hM-k3aUY/<]In a word...[/url<]

    • willyolioleo
    • 2 years ago

    and nothing but the packs are available… no standalone regular edition vegas to be seen

      • derFunkenstein
      • 2 years ago

      Likely a way to combat miners, but also some tie-in benefit for AMD.

    • USAFTW
    • 2 years ago

    Sooo, with the exception of Doom, it’s significantly slower than (or occasionally tying) a 1080 but needs more juice, and for the same price.
    More than a year later, why even bother?
    On the other hand, seeing the frametime graphs from both AMD and Nvidia as smooth as they are is quite pleasing.
    Edit: I’m glad Jeff decided to skip the Beyond3D suite for now. Thanks for the timely article. Hopefully AMD gets their schedule right next time.

    • DPete27
    • 2 years ago

    Jeff. What was actually affected by changing from Turbo – Balanced – Power Saver modes?

    I assume clock speeds drop from Balanced to Power Saver (but by how much?); meanwhile, the extra 40W for effectively zero improvement has me wondering what changed between Balanced and Turbo.

      • DPete27
      • 2 years ago

      Also, from my perusal of review articles thus far, it seems TR is the only one that did any exploration into the power modes. Kudos for that. It could be beneficial to TR to expand on this section since they’re currently the sole reference to this topic. Could result in more site traffic.

      [Add] Still would like to see how far you could manually push voltages down on Vega 56 at stock clocks and/or the clock speed of "power saving mode". (wink)

      Looks like PCPer did some OCing and got less than 1% on their 64 Liquid, and about 7% on their Vega 56, albeit for MUCH higher power draw. AMD has pushed Vega even closer to its breaking point than the 500-series Polaris cards. Goes to show that when you're in this echelon of the frequency/voltage curve, just a small sacrifice in frequency (and performance) can produce HUGE power savings. This could shine a brighter light on Vega than most sites are working with. I mean heck, when you can save 85W for a 3fps hit (and that's without manual tweaking)… no-brainer.
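
      A quick sketch (Python) of why that looks like a no-brainer, using the deltas mentioned here (+40 W for roughly nothing going to Turbo, -85 W for ~3 fps going to Power Saver); the Balanced baseline is a placeholder, so substitute measured numbers:

          # Perf-per-watt comparison of the three profiles. Only the deltas come from
          # the discussion above; the Balanced baseline is a placeholder, not review data.
          balanced_fps, balanced_watts = 60.0, 300.0

          profiles = {
              "Power Saver": (balanced_fps - 3, balanced_watts - 85),
              "Balanced":    (balanced_fps,     balanced_watts),
              "Turbo":       (balanced_fps,     balanced_watts + 40),
          }

          for name, (fps, watts) in profiles.items():
              print(f"{name:12s} {fps:5.1f} fps  {watts:5.0f} W  {fps / watts:.3f} fps/W")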

      • ludi
      • 2 years ago

      Backing down from the hairy limit of frequency probably allows for a fair bit of voltage reduction. That’s where the big savings would come from.
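
      (A rough sketch of why that works: dynamic power scales roughly with frequency times voltage squared, so stepping back from the top of the voltage/frequency curve pays off disproportionately. The clock and voltage figures below are illustrative guesses, not measured Vega values.)

      ```python
      # Back-of-the-envelope look at why a small clock sacrifice can yield big
      # power savings: dynamic power scales roughly as frequency * voltage^2,
      # and backing off the top of the V/f curve lets voltage drop as well.
      # The numbers below are hypothetical, for illustration only.

      def relative_dynamic_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
          """Dynamic power relative to a reference operating point, P ~ f * V^2."""
          return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

      # A modest step down from a hypothetical peak point (1546 MHz at 1.20 V)
      # to 1450 MHz at 1.05 V cuts dynamic power by roughly 28%.
      print(relative_dynamic_power(1450, 1.05, 1546, 1.20))  # ~0.72
      ```

      (Leakage and total board power don’t scale quite the same way, but the shape of that curve is why an 85W-for-3fps trade like the one described above is plausible.)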

      • Rza79
      • 2 years ago

      TPU tested the power modes and the results are very promising.

      Vega64 PwrSave <2% slower than GTX1080 while only consuming 48W more.
      This is huge progress from AMD. Too bad they went way beyond the sweet spot for that 2% extra performance.

      [url<]https://tpucdn.com/reviews/AMD/Radeon_RX_Vega_64/images/perfrel_3840_2160.png[/url<] [url<]https://tpucdn.com/reviews/AMD/Radeon_RX_Vega_64/images/power_average.png[/url<]

        • DPete27
        • 2 years ago

        Ah, nice. Thanks for pointing that out. I like that TPU included the power profiles in [url=https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/32.html<]their perf/watt graphs[/url<] even though it doesn't show them well against Polaris. I thought Vega was supposed to be sooooo much more efficient.
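
        (Putting rough numbers on that: using the “<2% slower, 48W more” figures quoted above and the GTX 1080’s 180W board-power rating from the review’s spec table, a quick perf-per-watt comparison looks like this. Both inputs are nominal or approximate rather than measured draws, so treat the result as ballpark only.)

        ```python
        # Ballpark perf/watt comparison: Vega 64 in Power Save mode vs. GTX 1080.
        # Inputs: "~2% slower while drawing ~48W more" (quoted above) and the
        # 1080's 180W board-power spec; both are approximations, not measurements.

        gtx1080_perf, gtx1080_power = 1.00, 180.0   # normalized performance, watts
        vega_ps_perf = 0.98                          # ~2% slower than the 1080
        vega_ps_power = gtx1080_power + 48.0         # ~228 W

        ratio = (vega_ps_perf / vega_ps_power) / (gtx1080_perf / gtx1080_power)
        print(f"Vega 64 Power Save: ~{ratio:.0%} of the GTX 1080's perf/W")  # ~77%
        ```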

      • Mat3
      • 2 years ago

      Vega would look so much better if they had just gone with the power saver mode as the default. Almost the same performance as a 1080, for about the same difference in power as the 480 vs. the 1060. All AMD has done is fuel their reputation for falling further behind Nvidia when it’s actually not nearly as bad as it looks. It’s not like they’re fighting for the top spot, it’s for a distant second! Just truly an idiotic decision.

        • Rza79
        • 2 years ago

        Some guy on TPU forum made this table:
        [url<]https://www.techpowerup.com/forums/threads/amd-radeon-rx-vega-64-8-gb.235978/page-6#post-3709973[/url<] Looking at the second table, it makes no sense to have gone with the setting they chose as the default. It's just ludicrous.

          • Waco
          • 2 years ago

          WOW. That’s…not sane.

        • Mr Bill
        • 2 years ago

        And Bonus! Overclockers could get 2-3% and feel there was headroom?

    • derFunkenstein
    • 2 years ago

    [s<]On the undervolting page: [quote<] AMD is giving builders the option to . The company is specifying six separate TDPs for each RX Vega card[/quote<][/s<] Looks like it's fixed. Edit: now that I've finished it, great job on the review. They're underwhelming, but when a product is as late as Vega I think it's almost to be expected.

    • chuckula
    • 2 years ago

    Linux benchmarks are available [url=http://www.phoronix.com/scan.php?page=article&item=rx-vega-linux1&num=1<]too[/url<]. Props to AMD for actually getting review samples out.

      • ptsant
      • 2 years ago

      Not just available, but with stellar [b<]open source[/b<] support. I never expected to see this, but it should facilitate debugging and alleviate the nasty bugs arising from separate code trees if the code is mainlined in the 4.15 or 4.16 kernel.

    • Waco
    • 2 years ago

    While I’m glad they almost caught up to Nvidia…that power draw reminds me of every AMD release when they were behind the green giant in performance.

    I don’t particularly care assuming it stays quiet, but dropping almost 1/3 of the TDP for a very slight performance deficit really highlights the problem.

      • HisDivineOrder
      • 2 years ago

      Yeah, the thing is Nvidia’s going to release a new product in the not-so-distant future and sail away from AMD yet again. It’s almost like asking one small company to compete in two different categories against two companies, each specializing in one category, is just too much for the small company to ever do.

      If they’re rocking CPUs, they’re late with their GPUs. If they’re rocking their GPUs, they’re late with their CPUs. Both cost money, both lose money, and they wind up not staying ahead.

      The best thing AMD could do is spin what’s left of ATI off and focus on CPUs.

        • shank15217
        • 2 years ago

        Volta is a compute chip; Nvidia has no reason to cannibalize their Pascal GPUs for an incrementally faster architecture. Expect a tweak or a process refresh, but expecting Volta to roll down to consumers before 2019 is wishful thinking.

        • cynan
        • 2 years ago

        Nvidia already “sailed away” from Vega in March with the 1080 Ti. Vega has got to cost as much as the 1080 Ti to make with its die size and HBM2, but obviously they can’t charge as much for it.

        I’m hoping that Vega is some sort of compromise-by-necessity (in light of HBM2 delays, the focus on RyZen, etc.) stepping stone to Navi. If their Infinity Fabric magic doesn’t pay off in spades for them with Navi – perhaps most importantly by drastically reducing power consumption through multiple smaller dies and a smaller process – then I don’t know what AMD is going to do.

        Navi could very well turn the tables on Volta, similarly to how RyZen gave the tried-and-tested Core architecture a run for its money, given that, as far as I know, high-end Volta will be using a traditional single large die while Navi will be using multiple smaller ones. So maybe by the time gaming Navi comes out 1.5 years from now – a full year after gaming Volta – AMD will be back in the competition.

        So don’t worry AMD gaming fans, you’ve waited a year already for Vega, what’s one and a half more?

    • chuckula
    • 2 years ago

    So after all the Capsaicin you can handle, here we are.

    The overall news for AMD is actually good, because even though these cards aren’t exactly what most people on TR would call “impressive,” AMD will still likely sell as many as they can make, for two reasons:

    1. It’s blatantly obvious that even with some product delays, AMD is still having supply issues, so they aren’t flooding the market with these cards. The RX 580 & friends clearly remain the mass-market cards in AMD’s inventory.

    2. For all their faults in gaming, the early rumors are that these cards mine well, so miners will be more than happy to buy AMD’s products, and money from a miner is just as valid as money from a gamer.

      • USAFTW
      • 2 years ago

      Either their marketing department works in an entirely different company and is completely detached from engineers’ input, or they’re utterly incompetent.
      Endless dog and pony shows, “Poor Volta”, and this is the end result. Sucks to be that guy who waited so long for these.

        • chuckula
        • 2 years ago

        Being condescending and snarky towards the competition certainly plays well in fanboy circles but it doesn’t work so great when you can’t back it up with real performance.

          • raddude9
          • 2 years ago
            • Jeff Kampman
            • 2 years ago

            Since you don’t know when to quit, enjoy a day on the sidelines.

            • Tirk
            • 2 years ago

            I hope your comment means that you put both of them on the sidelines.

    • derFunkenstein
    • 2 years ago

    Disappointed that [url=https://twitter.com/jkampman_tr/status/896937044685066240<]this benchmark[/url<] didn't make it into the review.

      • bhtooefr
      • 2 years ago

      But does it have LMNOPRAM?

        • derFunkenstein
        • 2 years ago

        Sorry, Apple wins that one. All of their PCs have ‘PRAM.

      • freebird
      • 2 years ago

      That graph is [b<]INCORRECT![/b<] The AMD RX series is actually listed as RX Vega[super<]64[/super<] [url<]https://gaming.radeon.com/en-us/rxvega/[/url<] which, if you count Vega as 4 letters, = 4[super<]64[/super<], which is quite a bit larger than 1070 or 1080... 😀 And V is actually 5 in Roman numerals, but that doesn’t make much difference at this point anyhow... unless it can run 5 EGA displays, but those are quite dated...

        • derFunkenstein
        • 2 years ago

        I think that’s a typo. Those numbers on the Radeon.com page are so elevated as to be on a whole other line.

          • freebird
          • 2 years ago

          Srsly_Bro? You don’t think they can type the number on the same line in their own advertisement? For all 3 cards? Don’t be a “doubting” derFunkenstein… it’s not your style.

          BTW, it’s also printed as a superscript in the Products dropdown listing for RX Vega and in this AMD doc.
          [url<]https://www.hardocp.com/image/MTUwMjcwNDgwMWp5bnJvMGFsdGtfMl8xNF9sLmpwZw==[/url<] I think it comes down to typing RX Vega 64 being much easier than typing RX Vega[super<]64[/super<] all the time without any cut & paste.

            • derFunkenstein
            • 2 years ago

            TR’s CMS will screw up the copy-pasta. It pastes in the text with a white background, so if you’re using the blue theme it’ll highlight pasted words.

            • freebird
            • 2 years ago

            I was referring to AMD’s PR dept. & their web site/document management with typing Vega[super<]64[/super<], but that's also useful information and applies here as well.

            • derFunkenstein
            • 2 years ago

            Ah, right. My bad.

        • chuckula
        • 2 years ago

        I like the page and especially the video.
        At 2:01 you see longtime Friend-of-TR, Roy Taylor.

      • southrncomfortjm
      • 2 years ago

      There’s no legend. Is higher better or is lower better on this graph?

        • derFunkenstein
        • 2 years ago

        This is why it’s important that Jeff add this back to the benchmark suite and provide analysis.

        • Wirko
        • 2 years ago

        The next review will come with a 299 vs. 399 points graph, and I’m sure there will be a legend.

      • CScottG
      • 2 years ago

      ..because some knob would take it at face-value.

    • chuckula
    • 2 years ago

    A Tale of Two Product Launches by Chuckula Dickens:

    [quote<]It was the ThreadRipper of times, it was the Vega of times. It was the age of ThreadRipper wisdom, it was the age of Vega foolishness. It was the epoch of fanboy belief, it was the epoch of fanboy incredulity. It was the season of Cinebench Light, it was the season of Inside-the-Second Darkness. It was the spring of Infinity Fabric hope, it was the winter of HBM2 despair. We had every R9 1950X before us, we had no Rx 64s before us. We were all going direct to ThreadRipper Heaven, we were all going direct the other way to Vega. In short, the period was so far like the present period, that some of its noisiest fanboys insisted on its being received, for good or for evil, in the superlative degree of comparison only.[/quote<]

      • drfish
      • 2 years ago

      Man, and all I’ve been noodling with was replacing ‘[url=http://www.plyrics.com/lyrics/aquabats/thesharkfighter.html<]shark fighter[/url<]' with Threadripper...

      • Arbiter Odie
      • 2 years ago

      I regret that I have but three upvotes to give

        • chuckula
        • 2 years ago

        It is a far better post that I upthumb now than I have ever upthumbed before.

          • freebird
          • 2 years ago

          The Tale of 2 Chuckulas…a pretty good read I hear…

          at least we know Chuckula didn’t sleep through English Literature class… 😉

          (I see someone couldn’t quite “accept” my upvote of your previous comment, putting you at +1… what a “downer”)

            • raddude9
            • 2 years ago
