AMD’s Radeon RX Vega 64 and RX Vega 56 graphics cards revealed

Gamers have endured a long wait for the Radeon RX Vega, but the wait is over, or at least nearly. Over the past couple of days, I’ve been learning about how AMD plans to re-enter the high-end graphics card market with its next-generation graphics architecture. The company revealed most of the details of its Ryzen Threadripper CPUs to us ahead of SIGGRAPH, as well, if you’d rather catch up with that news first.

GPU | Base clock (MHz) | Boost clock (MHz) | ROP pixels/clock | Texels filtered/clock | Shader processors | Memory path (bits) | Memory bandwidth | Memory size | Board power
GTX 970 | 1050 | 1178 | 56 | 104 | 1664 | 224+32 | 224 GB/s | 3.5+0.5GB | 145W
GTX 980 | 1126 | 1216 | 64 | 128 | 2048 | 256 | 224 GB/s | 4GB | 165W
GTX 980 Ti | 1002 | 1075 | 96 | 176 | 2816 | 384 | 336 GB/s | 6GB | 250W
Titan X (Maxwell) | 1002 | 1075 | 96 | 192 | 3072 | 384 | 336 GB/s | 12GB | 250W
GTX 1080 | 1607 | 1733 | 64 | 160 | 2560 | 256 | 320 GB/s | 8GB | 180W
GTX 1080 Ti | 1480 | 1582 | 88 | 224 | 3584 | 352 | 484 GB/s | 11GB | 250W
Titan Xp | 1480? | 1582 | 96 | 240 | 3840 | 384 | 547 GB/s | 12GB | 250W
R9 Fury X | 1050 | -- | 64 | 256 | 4096 | 1024 | 512 GB/s | 4GB | 275W
Radeon RX Vega 64 (air-cooled) | 1247 | 1546 | 64 | 256 | 4096 | 2048 | 484 GB/s | 8GB | 295W
Radeon RX Vega 64 (liquid-cooled) | 1406 | 1677 | 64 | 256 | 4096 | 2048 | 484 GB/s | 8GB | 345W
Radeon RX Vega 56 | 1156 | 1471 | 64 | 224 | 3584 | 2048 | 410 GB/s | 8GB | 210W

The high-level details of the Vega architecture have been known to us for some time, but the implementation of that architecture on Radeon RX gaming cards has remained a mystery until now.

AMD will be releasing the Radeon RX Vega with two different GPU configurations across three products. The fully-enabled Vega 10 GPU will find a home in the Radeon RX Vega 64 Liquid-Cooled Edition and the Radeon RX Vega 64. Both cards will get a GPU with 4096 stream processors, 256 texturing units, 64 ROPs, and 8GB of HBM2 RAM offering 484 GB/s of memory bandwidth.
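
As a sanity check on that memory figure, the quoted bandwidth follows directly from the 2048-bit HBM2 interface in the table above. Here's a minimal sketch in Python using only AMD's published numbers; the per-pin data rate is derived for illustration, not a figure AMD quoted:

    # Back-of-the-envelope check on the RX Vega 64's memory bandwidth.
    # Only the 2048-bit bus width and the 484 GB/s figure come from AMD's specs;
    # the per-pin data rate below is derived, not an official number.
    bus_width_bits = 2048      # two HBM2 stacks at 1024 bits each
    bandwidth_gb_s = 484       # GB/s, per the table above

    # bandwidth (GB/s) = per-pin rate (Gbps) * bus width (bits) / 8
    per_pin_gbps = bandwidth_gb_s * 8 / bus_width_bits
    print(f"Implied HBM2 data rate: {per_pin_gbps:.2f} Gbps per pin")  # ~1.89 Gbps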

The RX Vega 64 Liquid-Cooled Edition will be the highest-performance Vega card at launch, with a typical boost clock of 1677 MHz, a base clock of 1406 MHz, and a board power of 345W. It’ll offer peak single-precision performance of 13.7 TFLOPS and peak half-precision performance of 27.5 TFLOPS.

The air-cooled RX Vega 64 will offer a typical boost clock of 1546 MHz, a base clock of 1247 MHz, and a board power of 295W. Those figures are good for 12.66 TFLOPS of peak single-precision performance and 25.3 TFLOPS of half-precision throughput. Both RX Vega 64 cards are positioned to compete with Nvidia’s GeForce GTX 1080.

The swanky aluminum-bedecked cards you see above are both limited editions, and AMD claims that label is genuine. At least for the air-cooled card, once the stock is sold through, the only way to get a reference air-cooled Vega will be with the black shroud you’ll see below.

The most interesting RX Vega graphics card may be the previously-unknown RX Vega 56. As its name implies, the Vega 10 GPU on this card has 56 of its 64 compute units enabled, for 3584 stream processors in total. Interestingly, it’ll still have all 64 of its ROPs, but it’ll ship with only 224 texturing units enabled. This card will have a typical boost clock of 1471 MHz, a base clock of 1156 MHz, and somewhat lower memory clocks that result in peak bandwidth of 410 GB/s. AMD claims it’ll be good for 10.5 TFLOPS of peak single-precision throughput and 21 TFLOPS of half-precision throughput. With a board power of 210W, it’s positioned to compete with Nvidia’s GeForce GTX 1070.
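
Those TFLOPS claims fall straight out of the shader counts and boost clocks above. Here's a minimal sketch in Python that reproduces them, assuming the usual two FLOPs per stream processor per clock for fused multiply-adds, with Vega's double-rate packed math doubling the FP16 figure:

    # Reproduce AMD's peak-throughput claims from the published specs.
    # Peak FP32 = stream processors * 2 FLOPs/clock (FMA) * boost clock;
    # double-rate packed math doubles that figure for FP16.
    cards = {
        "RX Vega 64 Liquid-Cooled": (4096, 1677e6),
        "RX Vega 64 (air-cooled)":  (4096, 1546e6),
        "RX Vega 56":               (3584, 1471e6),
    }

    for name, (shaders, boost_hz) in cards.items():
        fp32 = shaders * 2 * boost_hz / 1e12
        print(f"{name}: {fp32:.2f} TFLOPS FP32, {fp32 * 2:.1f} TFLOPS FP16")
    # Matches AMD's claimed 13.7/27.5, 12.66/25.3, and 10.5/21 TFLOPS to within rounding.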

We learned a wealth of new architecture details regarding the Vega GPU at this event, although the short window between the presentation of that information and the NDA lift for this article means that I’ll be holding off on a deep dive until our full Vega review. The CliffsNotes version is that some of the Vega architecture’s performance-boosting features, like double-rate packed math, the draw-stream binning rasterizer, support for primitive shaders, and the High Bandwidth Cache Controller, are going to require driver optimizations or developer targeting (or both) before they run at their best. Early performance numbers for Vega from Frontier Edition cards didn’t include any gains from the DSBR, for example, and that feature will be enabled for the first time with the Radeon RX Vega release drivers.

 

Previewing performance

AMD took an unusual tack in discussing the potential performance of Radeon RX Vega cards. Instead of focusing on average frame rates, the company argued for the 99th-percentile frame rates (derived from 99th-percentile frame times) that the RX Vega 64 can produce and how those frame rates match up with FreeSync monitors.

The first case AMD presents is a premium 3440×1440 FreeSync display with a 48-to-100-Hz refresh-rate range. In this scenario, the company claims the RX Vega 64’s 99th-percentile frame rates will generally stay within the lower portion of the display’s FreeSync range, meaning gamers shouldn’t experience the tearing and general unpleasantness of un-synced operation below that range.

AMD tested the six games it used for this scenario using a mixture of high and ultra presets, so the numbers it generated should be reasonably representative of real-world performance.

AMD also presented 99th-percentile FPS numbers for a Radeon RX Vega 64 paired with a 4K FreeSync display with a refresh-rate range of 40 to 60 Hz. In all cases, the Vega card delivered 99th-percentile frame rates that would keep its performance within FreeSync range when paired with such a monitor. In two of the games presented—Battlefield 1 and Call of Duty: Infinite Warfare—the RX Vega 64 delivered markedly better 99th-percentile frame times.
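
The arithmetic behind this framing is straightforward: a 99th-percentile frame time converts directly into a 99th-percentile frame rate, and the question is simply whether that rate lands inside the monitor's variable-refresh window. A minimal sketch in Python; the 18-ms frame time below is a hypothetical placeholder, since AMD didn't share its raw frame-time data:

    # Convert a 99th-percentile frame time into FPS and check it against a
    # FreeSync monitor's variable-refresh window. The frame time used here is
    # a made-up placeholder; AMD did not publish its underlying frame times.
    def in_vrr_window(frame_time_ms, window_low_hz, window_high_hz):
        fps = 1000.0 / frame_time_ms
        return window_low_hz <= fps <= window_high_hz, fps

    # A hypothetical 18-ms 99th-percentile frame time against the 48-100 Hz
    # 3440x1440 panel and the 40-60 Hz 4K panel from AMD's scenarios.
    for low, high in ((48, 100), (40, 60)):
        ok, fps = in_vrr_window(18.0, low, high)
        print(f"{fps:.1f} FPS vs. {low}-{high} Hz window: {'inside' if ok else 'outside'}")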

We still need to see average frame rates and frame-time data of our own to gauge the fluidity of the gaming experience from the RX Vega 64, but combined with the recently-introduced Radeon Enhanced Sync and a relatively affordable FreeSync monitor, it seems as though Vega’s smoothness could be competitive with or better than Nvidia’s GeForce GTX 1080. I have a pair of FreeSync monitors at home waiting for just such a high-performance card to do them justice, so I’m eager to give the RX Vega 64 a shot.

Packing it up

The biggest remaining question for many about RX Vega cards is pricing, and the answer to that question is both simple and complex. The simple answer is that the Radeon RX Vega 56 reference card will start at $399, and that the reference air-cooled Radeon RX Vega 64 will start at $499. The Radeon RX Vega 64 Liquid-Cooled Edition will only be available as part of a Radeon Pack for $699.

Radeon Pack is a marketing move that makes buying more AMD or AMD-friendly hardware alongside one’s Vega card more attractive. AMD really wants Vega buyers to take full advantage of its ecosystem, so buyers of an RX Vega Radeon Pack will get a one-time opportunity to take advantage of discounts on FreeSync monitors and Ryzen CPU-and-motherboard combos at the point of purchase. AMD will direct buyers to dedicated pages at partner retailers so that they’re fully aware of the discounts available to them before they press “Place Order” and miss out on those opportunities forever.

The Radeon RX Vega 64 will come in two packs: a Black Pack and an Aqua Pack. The Black Pack starts with an air-cooled Radeon RX Vega 64 graphics card in either limited-edition or standard black livery for $599. USA buyers will then receive free copies of Prey and Wolfenstein II no matter what. If they so choose, they can add a Samsung CF791 34″ ultrawide monitor with a FreeSync range of 48-100 Hz and get $200 off its list price. They can also add an eligible Ryzen CPU and motherboard combo to their order and get $100 off that combo. The Aqua Pack will be the only way to get the RX Vega 64 Liquid-Cooled Edition at launch. This $699 package offers the same benefits as the Black Pack for monitors and CPU-motherboard combos.

AMD will also offer the Radeon RX Vega 56 as part of a Radeon Pack. The Radeon Red Pack will run $499, and for that money, buyers will still get Prey and Wolfenstein II and all the other potential trimmings of Vega 64 packs. The company says that it plans to widen the Radeon Pack program to include more FreeSync monitors in the future, so buyers could potentially get discounts on more displays with time—although it seems unlikely that those who want a Radeon RX Vega 64 limited-edition air-cooled card will be able to wait around for that widening of eligibility to occur.
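
To put rough numbers on the pack math: the $100 premium over the standalone card buys the two games plus eligibility for up to $300 in hardware discounts, provided the buyer actually wants that hardware. A quick sketch in Python; the card prices and discount amounts are AMD's, while the monitor and CPU-and-motherboard prices are placeholder assumptions for illustration:

    # Rough Black Pack arithmetic. Card prices and discount amounts come from
    # AMD's announced pricing; the monitor and platform prices are assumptions
    # used purely for illustration.
    CARD_STANDALONE = 499     # reference air-cooled RX Vega 64
    BLACK_PACK      = 599     # same card purchased as a Radeon Black Pack
    MONITOR_PRICE   = 950     # assumed Samsung CF791 price
    PLATFORM_PRICE  = 550     # assumed eligible Ryzen CPU + motherboard combo

    pack_total = BLACK_PACK + (MONITOR_PRICE - 200) + (PLATFORM_PRICE - 100)
    separate   = CARD_STANDALONE + MONITOR_PRICE + PLATFORM_PRICE
    print(f"Pack route: ${pack_total}, bought separately: ${separate}")
    print(f"Savings: ${separate - pack_total}, plus the two bundled games")  # $200 here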

I find it curious that the company isn’t being more aggressive about Vega pricing, given that the claimed performance figures for the RX Vega family are no better than those of Nvidia’s GP104-powered GTX 1070 and GTX 1080, cards that launched over a year ago. AMD readily admits that it’s still playing catch-up in the high-end market, but it points out that of the estimated million graphics cards that sell for more than $350 each quarter, it now has two options for buyers to choose from where previously it had zero. That’s an important step, even if it does seem like Nvidia’s next-generation graphics cards could be arriving at any time.

In my conversations with employees, AMD further defended its pricing decisions by arguing along the lines that Vega is an architecture for the future and that Pascal isn’t. Whatever one might think of that angle, staking any large part of Vega’s value on the idea that there are future performance gains coming from future software seems like putting an IOU in the box with every Vega card, and I think it’s probably safer to appreciate any such gains when they come.

Right now, the Radeon RX Vega family seems ready to do the job that AMD wanted it to do: re-establish a foothold in the high-end graphics market as it exists today. Gamers will have a chance to see for themselves whether Vega is a bright new star for the Radeon Technologies Group come August 14. Add-in board partners will also have a chance to show what they can do with Vega, and those cards will arrive later on. Stay tuned for our full review.

Comments closed
    • Mr Bill
    • 2 years ago

    I wonder if the NDA for the AMD video cards bears any relation to the total solar eclipse on August 21st?

      • Voldenuit
      • 2 years ago

      They're hoping that the Day of Dark Sun will disable the powers of the Flame Benders.

      • JustAnEngineer
      • 2 years ago

      Launch is at 9 AM EDT on 8/14.

    • Gastec
    • 2 years ago

    295W, 345W !? These are obviously video cards only meant for North America with their 10-12cents/kWh. In Europe we pay 2-3 times more than that. And in China where they make all these devices they pay 8cents/kWh. So guess what finger I’m showing them now!

      • DoomGuy64
      • 2 years ago

      [url<]https://youtu.be/fBeeGHozSY0?t=331[/url<] Even if you pay 2-3 times more than US, that's max power draw spent gaming, not at desktop. You'd have to add up how much you spend gaming to get the actual cost difference, and chances are it isn't that relevant. Especially in the US. It's still way less than running household appliances like a clothes dryer, and if you can afford to wash your clothes, you can afford the video card. AMD WILL COST YOU A BAJILLION DOLLARS MORE AND GIVE YOU TUMOR. AUO AUO AUO GET TO DA CHOPPER. Yeah, no.

        • Gastec
        • 2 years ago

        What is it with you Unitedstatians that you can’t stand a good rant when you see one? Don’t you see that we are trying to improve the room heaters…I mean advanced micro devices with our constructive critiques?

          • JustAnEngineer
          • 2 years ago

          Check this out:
          [url<]https://www.eia.gov/electricity/data/browser/#/topic/7?agg=0,1&geo=vvvvvvvvvvvvo&endsec=vg&freq=M&start=200101&end=201412&ctype=linechart&ltype=pin&rtype=s&pin=&rse=0&maptype=0[/url<]

    • freebird
    • 2 years ago

    Nobody has posted about the rumors that it may be scarce at launch due to being a good mining platform…

    Pick your favorite rumor website from this list…
    [url<]https://www.google.com/search?q=rx+vega+ethereum+mining&ie=utf-8&oe=utf-8[/url<]

    • ronch
    • 2 years ago

    AMD desperately needs a new architecture that’s more efficient and which they can deploy from top to bottom. You know, like what they’re doing with Ryzen. The fundamental GCN architecture is just really tired at this point. It’s been, what, more than 5 years since it came out.

      • renz496
      • 2 years ago

      But it seems AMD is very reluctant to move away from it, because if it did, the “grand master plan” it laid out a few years ago would go to waste.

        • Kretschmer
        • 2 years ago

        I don’t think it’s inflexibility. They’re too resource-starved to design something new.

      • Krogoth
      • 2 years ago

      Nvidia hasn't done anything radical since Fermi/Kepler, either. They have been riding on fab tech and architectural refinements so far, like AMD. Nvidia was just dealt a better hand this round.

        • ronch
        • 2 years ago

        Like Intel, Nvidia has a superior architecture and so they can sit on it for a while, much to our dismay. AMD simply can’t sit on Bulldozer so they created Zen. They need to do the same thing with their GPU division but yeah, I guess they had to focus on Zen first.

          • shank15217
          • 2 years ago

          I don't think Intel has a superior architecture to Ryzen; you are comparing their 7th-gen Core to AMD's 1st-gen Zen. AMD nearly caught up with Intel with just one release. Wait till they fine-tune Zen 2 and Zen 3, then we can talk.

            • DPete27
            • 2 years ago

            1) Zen isn't AMD's first CPU architecture.
            2) It's always easier to follow in someone else's footsteps.

            • K-L-Waster
            • 2 years ago

            I hope you're not expecting Zen 2 to get the same kind of IPC improvement that Zen 1 had over the FX series.

            The next generations will have improvements, but they're more likely to be in the range of 5-10% rather than 40+%.

      • cynan
      • 2 years ago

      Honestly, it's still too early to tell why high-end Vega isn't up to snuff at this point (i.e., whether it has something to do with the architecture itself, drivers, the implementation of HBM2, all of the above, etc.).

      I think AMD, given its resources, would be crazy to throw GCN out the window at this point with Navi around the corner, where the use of smaller dies could hopefully remedy the power efficiency situation at least somewhat. And the approach seems to be more or less working for Ryzen…

        • ronch
        • 2 years ago

        I’m not saying they should throw out GCN right now because… they simply can’t. Yes their resources are stretched. What I’m saying is they need a new architecture. They just really need it.

          • ermo
          • 2 years ago

          For its intended target (GPU Compute), GCN is good. Consequently, GCN is good in terms of the CPU+GPU convergence and integration strategy AMD keeps pursuing.

          And since Vulkan (nee Mantle) is moving towards merging compute (OpenCL) and graphics, GCN is *still* the right basic architecture.

          We don’t (yet) know how much of the current power draw situation is process related and how much of the perf/watt deficit is lack of hardware optimization and/or driver optimization.

          IF the performance increases by +15-20% (compared to PC Perspective’s numbers) once AMD sorts out its drivers to use the new NCU features properly (IIRC this is what happened with GCN 1.0 over time), THEN the value proposition will look very different.

          But that’s a pretty big IF.

    • ET3D
    • 2 years ago

    I’m so glad we didn’t end up with an RX Vega XTX, as has been rumoured.

    • jts888
    • 2 years ago

    The only thing that really interests me at this point about Vega is the SR-IOV capacity, but it [i<]really[/i<] interests me. If it can allow me to relegate Windows to a sandboxed, minimal overhead game launcher, I'll buy one rather quickly. I use one big monitor, so current IOMMU pass-through techniques with separate GPUs and displays are still too clunky for my tastes, but a full-performance virtual GPU opens a lot of potential opportunities.

      • ptsant
      • 2 years ago

      You will probably need a MB and CPU with the appropriate support. I’m afraid it won’t just plug and work on any setup, but I might be mistaken.

    • USAFTW
    • 2 years ago

    I know I like to rail against G-SYNC and its premium, but aren't AMD users who have invested in FreeSync (which isn't all that free either) effectively locked into their platform, having to make do with whatever GPU AMD throws their way? If there is a 4K 144Hz FreeSync display, how could one take good advantage of it on something like an RX Vega, with the possible exception of Source games and Overwatch?
    Ugh. I wish HDMI 2.1 or a new DP revision would make this mess go away already.

      • BobbinThreadbare
      • 2 years ago

      Freesync is just a brand name on a VESA standard. Anyone can implement it.

        • cygnus1
        • 2 years ago

        Anyone can, but only AMD has. You’d think Intel would do it but they haven’t, and obviously nVidia won’t until G-Sync isn’t making them any money. So until others implement it in a compatible way, does it really matter if it’s a standard?

          • the
          • 2 years ago

          Intel said they would and the expectation is that they’d implement it alongside a DP 1.3/1.4 based display controller. That would be Coffee Lake/Cannon Lake per their roadmap right now.

          • travbrad
          • 2 years ago

          Yep right now you are effectively locking yourself into either AMD or Nvidia whichever one you choose if you care a lot about variable refresh rates. Intel says they will support it on future CPU/iGPUs, but their performance is so bad in the first place that I can’t see that being a good gaming experience regardless of the display tech they support.

          Freesync is the cheaper option, but yeah anyone with a 1440p monitor was sort of stuck with less than ideal framerates since Polaris wasn’t great for 1440p (neither is the Nvidia competition in that price bracket, but Nvidia had 3-4 cards above that)

          If other people feel the same way as me (unlikely :P) there shouldn’t be much vendor lock-in anyway though. I’m very Krogoth’d about VRR after trying it myself. It only seems to help at framerates so low that I wouldn’t want to play anyway. If Vega is a better card at the price when I get my next card, I’ll just plug it into my G-sync monitor and forego the G-sync functionality entirely.

            • f0d
            • 2 years ago

            [quote<]I'm very Krogoth'd about VRR after trying it myself. It only seems to help at framerates so low that I wouldn't want to play anyway[/quote<] same here, i cant tell if its on or off at around 100hz and being at 100hz with roughly 100fps to match it is a way better experience than anything around 60hz with or without VRR after getting a freesync monitor my overall impression was "meh"

          • WiseInvestor
          • 2 years ago

          The Xbox One X has FreeSync. It has been confirmed.

            • K-L-Waster
            • 2 years ago

            Which is also effectively AMD…

            • cygnus1
            • 2 years ago

            Great, and so has every other PC OEM selling systems with AMD GPU’s…

          • urvile
          • 2 years ago

          Yeah, it does, because it means screen manufacturers don't have to pay a licensing fee. That's the difference between industry standards and proprietary tech. Apple does it too with some of their networking protocols. It generates another revenue stream.

      • cynan
      • 2 years ago

      This is a short-term perspective, a sentiment that would never have arisen had RX Vega been neck and neck with a 1080 Ti. If/when AMD has a GPU that's competitive with Nvidia's high-end offering (Navi?), then obviously FreeSync becomes way more appealing than G-Sync in a very objective sense. Because, all else the same, it will literally be the best performance for less money. The onus is, and has always been, on Nvidia, because they're the ones who charge extra for their variable-refresh monitor tech.

      I think that sentiments of “AMD has an underwhelming high end gaming product in Vega” along with “monitors generally survive multiple GPU upgrade cycles” are being unfairly conflated.

    • Kretschmer
    • 2 years ago

    I look forward to the TR review of these cards. Based on AMD’s past benchmarks, I fully expect the Vega 64 to perform like a 1070+.

    • ermo
    • 2 years ago

    If AMD’s internal psychometric research suggests that the optimal gaming experience hinges on pairing a FreeSync display with drivers which ensure that frame-time variability is as low as possible (at the expense of maximum rendered frames per second) then fair play to them.

    I wanted what is now known as the RX Vega 64. However, at this point, I’m firmly on the fence until proper drivers land and are reviewed with a FreeSync monitor.

    • cygnus1
    • 2 years ago

    So, if the Vega 56 pack with the Samsung monitor comes out to a decent total price (taking the monitor discount off regular retail pricing, not necessarily MSRP), I'm going to strongly consider eBaying my 1070, especially since I got it for under $350 and they're selling for that and more no problem.

    Been waiting on decently priced G-Sync monitors or for nVidia to cave and support Freesync, but I don’t think either of those are going to happen before the end of this decade at this point.

      • Kretschmer
      • 2 years ago

      Watch out; the CF791 is defective.

        • cygnus1
        • 2 years ago

        I’m kind of hoping for more choice for the monitor discount. I have seen some complaints about the C34F791, but it seems pretty solid for the most part. Problem is actual retail prices are already under $200 below MSRP. If they only let it be discounted from MSRP, then it could end up being MORE expensive than buying the card and monitor separately.

        EDIT: Also, ideally the monitor discount selection would include monitors that support FreeSync 2.

    • DPete27
    • 2 years ago

    Jeff: [b<]PLEASE DO SOME UNDERVOLTING TESTS[/b<] in your review.

      • Chrispy_
      • 2 years ago

      bump.

      • USAFTW
      • 2 years ago

      Agreed. And since undervolting on AMD GPUs is a known commodity at this point, I would like to know if undervolting is also possible on Nvidia GPUs and what kind of reduction in power consumption can be expected. Although Pascal GPUs are already very efficient, it would be interesting to see if they can be pushed to even lower levels.

        • Voldenuit
        • 2 years ago

        [quote<] Agreed. And since undervolting on AMD GPUs is a known commodity at this point, I would like to know if undervolting is also possible on Nvidia GPUs and what kind of reduction in power consumption can be expected.[/quote<] At the moment, msi afterburner only has overvolting options on nvidia GPUs, and I'm not aware of any OEM tweaking tools that have undervolting on nv GPUs, either (they could be out there, but the Gigi Aorus software is so bad I never use it, so haven't looked too deeply). Gamersnexus did some undervolting tests on their Vega FE, and found that some voltage settings were stable in some games but not others, so I don't think undervolting is going to be a panacea for RX Vega users, either.

        • Air
        • 2 years ago

        On Pascal you can do it using GPU Boost 3.0. Overclock it to max as you normally do, then limit the clocks to the original max clock (set all voltage clocks after this point to the same value, so it will always boost to the lowest available voltage). If Afterburner's "GPU power" measurements are to be trusted, on my card it leads to a 20-25% reduction in power. It's not possible to lower idle and base clock voltages.

        (I use it without limiting, so i get an imperceptible performance improvement at the cost of a lot more power, just so i can see 2050 MHz instead of 1950 in the on-screen display. Gotta pass that 2k barrier…)

        • DPete27
        • 2 years ago

        I have an RX480 so [url=https://techreport.com/forums/viewtopic.php?f=4&t=119219<]I know how undervolting works there.[/url<] I do like how AMD WattMan exposes frequency states that are fully customizable along with a voltage setting for each state. I don't have experience with Pascal, but I believe you're basically stuck with adjustment of two sliders; a core clock offset slider, and a power limit slider. Which to me sounds like you can only limit frequency, and rely on GPU Boost3.0 to auto-assign the most efficient voltage for a given frequency. (That's obviously something that AMD cards are extremely bad at, hence undervolting) So I hesitate to call that undervolting, since it's more like under-clocking....unless there's something more to it than that.

          • USAFTW
          • 2 years ago

          That's one of my favorite things about AMD's new driver interface. It's something I would definitely use on any GPU.
          Right now I get by with my 780 Ti at +125MHz core and a 70% power limit, but better control is always appreciated 😉

      • Jeff Kampman
      • 2 years ago

      You undervolting guys are going to [i<]love[/i<] Vega.

        • Mikael33
        • 2 years ago

        Undervolters dream? 😉

        • chµck
        • 2 years ago

        ok mr freddy foreshadowing

        • Kretschmer
        • 2 years ago

        Why is AMD gear typically at a higher voltage than needed? Is it because they need to use every chip for economic reasons, or is voltage testing too inefficient for mass-produced cards?

          • DPete27
          • 2 years ago

          Clearly their [url=https://techreport.com/review/30328/amd-radeon-rx-480-graphics-card-reviewed/2<]adaptive voltage and frequency scaling (AVFS)[/url<] is rubbish.

          • Air
          • 2 years ago

          I think pretty much all GPUs and CPUs are, to a degree. I think it's just more frequently done on AMD GPUs because the power and temperatures are higher in comparison to nVidia, while overclocking is not so worth it. Plus, doing it on nVidia is a little more complicated.

          Notebook owners seem to always do that, though, on both their Intel CPUs and nVidia GPUs. It really makes a big difference in this case.

        • DPete27
        • 2 years ago

        Not sure what this means. I’d LOVE not having to make manual voltage adjustments.

        • tipoo
        • 2 years ago

        Another one pushed well past its peak efficiency zone, eh?

        • Mr Bill
        • 2 years ago

        3.9 billion additional transistors to pipeline some sections and allow higher clocking… But, under the control of adaptive voltage and frequency scaling (AVFS). Maybe the range of scaling allows for scaling down sections the same way clocks are scaled up and down in the 7890 APU’s.

    • brucethemoose
    • 2 years ago

    I don’t really care how Vega will perform at stock… I care how it performs with an average overclock. That’s why I got my 7950 over a 770: it can go from 850 to 1150 without a sweat.

    Vega, on the other hand, already seems to be clocked within an inch of its life.

      • DPete27
      • 2 years ago

      That’s why it’s got such a high TDP. I’d bet it will fall perfectly in line with Polaris on a Frequency/Voltage curve. They’re pushing Vega even higher than Polaris 20 just to be competitive on performance.

    • Voldenuit
    • 2 years ago

    Ouch, selling Vega 56 @ $399 has got to be hurting AMD. It’s about the same size as GP102 (484 vs 471 mm^2) *and* has an interposer, which GP102 doesn’t, *and* has expensive HBM2, although the smaller amount of RAM (8 GB vs 11) might mitigate the cost.

    It’s likely that Vega 56 is at the voltage/MHz inflection point for the chip, as clocking an extra ~75 MHz ramps up power from 210W on the ’56 to 295W on the ’64.

    Also, while the monitor bundle is nice, it’d probably lock you into buying the monitor at MSRP before discount. If you can find the same or similar monitor cheaper than MSRP, the actual discount can become vanishingly small.

      • brucethemoose
      • 2 years ago

      Which begs the question:

      WHY is Vega so much slower than a similarly-sized GP102? What the heck is all that die area being used for?

        • renz496
        • 2 years ago

        for massive FP16 performance? HBCC?

        • Voldenuit
        • 2 years ago

        Yeah, Vega is substantially larger than GP104 for the same (graphics) performance compared to Polaris vs GP106.

        It’s also only about 20% smaller than the Fury X despite having the same number of shaders and being built on a smaller process (14 nm vs 28).

        It’s gotta be the compute features, internal caches, and possibly the HBCC (not necessarily transistors, wide internal ring buses can take up a lot of space on the die).

        But if it’s hiding any secret sauce compute features that would justify the die size, AMD failed to talk them up for their prosumer launch (Vega FE).

        • ImSpartacus
        • 2 years ago

        Clocks.

        From Anandtech:

        [quote<]Talking to AMD’s engineers, what especially surprised me is where the bulk of those transistors went; the single largest consumer of the additional 3.9B transistors was spent on designing the chip to clock much higher than Fiji. Vega 10 can reach 1.7GHz, whereas Fiji couldn’t do much more than 1.05GHz. Additional transistors are needed to add pipeline stages at various points or build in latency hiding mechanisms, as electrons can only move so far on a single clock cycle; this is something we’ve seen in NVIDIA’s Pascal, not to mention countless CPU designs. Still, what it means is that those 3.9B transistors are serving a very important performance purpose: allowing AMD to clock the card high enough to see significant performance gains over Fiji.[/quote<] [url<]http://www.anandtech.com/show/11680/radeon-rx-vega-unveiled-amd-announecs-499-rx-vega-64-399-rx-vega-56-launching-in-august/3[/url<]

          • Voldenuit
          • 2 years ago

          Wow, so adding transistors to clock up instead of execution units?

          It’s like no one at AMD had heard of Netburst…

            • derFunkenstein
            • 2 years ago

            Or bulldozer…

            • USAFTW
            • 2 years ago

            It seems like they were jealous of the high clock speeds that Maxwell and Pascal GPUs could hit and wanted some o’ that…

          • 0x800300AF
          • 2 years ago

          A good deal also goes to the approximately 45MB of on-die memory.

          • freebird
          • 2 years ago

          And they are probably planning ahead, with 7nm production being available for Vega 20 or Navi by 2H2018 (according to GF), which should drastically reduce die size along with some combination of less power or more frequency…

        • 0x800300AF
        • 2 years ago

        Vega more than doubled Fiji's on-die mem/cache to over 45MB, more than 4x Hawaii.

      • _ppi
      • 2 years ago

      Because compared to nVidia cards, they overemphasize compute at the relative expense of geometry/texture capabilities. See, among others, the synthetics comparison between Polaris and Pascal on this site (not 3DMark, but texels, geometry, etc.)

      See also Radeon 480/580 vs GeForce 1060 gaming performance (which is in the same ballpark, 1060 may be considered faster) vs. TFLOPs difference. Vega vs. 1080(Ti) is the same.

      • fuicharles
      • 2 years ago

      At least they don’t need to pay any penalty charges to GlobalFoundries

      • DoomGuy64
      • 2 years ago

      That's like saying Nvidia selling 780s made from 780 Tis and 470s made from 480s was hurting them. The chuckula hivemind bias is strong in these forums.

      Reality is, Vega 56 doesn’t hurt AMD. Not every Vega can be a Vega 64, and selling Vega 56 makes more money than throwing away a mostly good chip. When you get down to actual costs of mass producing hardware, the cost is much lower than you are insinuating. Vega 56 still makes money, it just doesn’t make what Vega 64 does.

      If you really want to get into profit, the watercooled Vega is massively overpriced, and one of the youtube overclockers said people were better off buying Vega 56 with a custom watercooler and overclocking it. Provided the HBM isn’t underclocked and voltage locked.

      As for your monitor point, freesync is already on average $200 cheaper than gsync, and an extra $100 is even more incentive to go that route for cost savings, MSRP or not. The discounts are really good perks for people building a new rig, and will most likely cause value builders to choose all AMD for a gaming rig.

    • Billstevens
    • 2 years ago

    Wonder if they will come out with a custom-cooled version with clocks close to the liquid-cooled card's. It's hard to get excited about a card that draws 100 more watts than the Fury X…

    • maxxcool
    • 2 years ago

    hmmm .. starting to get a very ‘meh’ feeling about all this.

      • Goty
      • 2 years ago

      What do you mean “starting to?”

    • Kretschmer
    • 2 years ago

    Paying an extra $100 for $420 in discounts doesn’t add up… What am I missing?

      • mnemonick
      • 2 years ago

      …that you also have to buy the ($950) monitor, CPU and mobo [i<]at the same time[/i<]?

      • lordcheeto
      • 2 years ago

      Makes it less attractive for miners and more attractive for new gaming builds.

    • Kougar
    • 2 years ago

    AMD said the HD 2900 XT's issues and lack of performance could be fixed with driver and game optimizations too; of course, that was ignoring the reality of the situation.

    [quote<]The Radeon RX Vega 64 Liquid-Cooled Edition will only be available as part of a Radeon Pack for $699.[/quote<] Thank you AMD for making those hot-clocked, watercooled 1080 Ti cards at $800 such a good deal. $100 to upgrade from Vega's "1080" to 1080 Ti performance with considerably less power consumption is a win regardless of how one looks at it. Vega certainly isn't in as bad a shape as the 2900 generation, but AMD's decision making apparently hasn't changed much after all.

      • Lord.Blue
      • 2 years ago

      It was noted that this package setup might be meant to keep bitcoin miners, rather than gamers, from gobbling up all the cards on the market.

        • Kougar
        • 2 years ago

        I buy that excuse about as much as I’d buy Vega. Prey is already selling for 40% off from multiple sites anyway.

    • USAFTW
    • 2 years ago

    That is a rather confusing way to gauge performance. And the fact that AMD has gone with the 980 Ti and 1080 as yardsticks seems to suggest the early Frontier Edition tests weren't far off. So much for the brand new architecture and the 2X performance/watt AMD talked about before.
    Unless you already own a FreeSync monitor and this is the best AMD has to offer, these things need a haircut to justify their existence. Really sad; it looks like the prospects of AMD's competitiveness have been thrown out the window. This makes Fiji look like an amazingly efficient GPU.

    • renz496
    • 2 years ago

    [quote<]In my conversations with employees, AMD further defended its pricing decisions by arguing along the lines that Vega is an architecture for the future and that Pascal isn't.[/quote<] To me that might have been a sound strategy if they had launched Vega alongside Nvidia's GTX 1080/1070 last year, and most consumers only look at the end performance, not what advantage card X will have over card Y in the future. Also, consumer Volta will probably be here in less than a year and might well contain architectural improvements similar to Vega's.

      • squngy
      • 2 years ago

      I agree.

      And you have to assume that the people who pay this kind of money for GPUs aren’t just going to sit on an old GPU if there is something significantly better available.

      • chuckula
      • 2 years ago

      I tend to agree especially with Vega launching a solid 14 months (generously) after Pascal.

      It’s not “future proof” when you are claiming that your brand new late-2017 architecture will maybe beat your competitor’s smaller and more energy efficient mid-2016 parts in 2018 (maybe).

      • stefem
      • 2 years ago

      [quote<]Also, consumer Volta will probably be here in less than a year and might well contain architectural improvements similar to Vega's.[/quote<] <joke>lol, looking at the results I hope not!</joke> The (all hypothetical) performance improvement that Vega cards may get will happen because the release drivers weren't well optimized (whether from lack of competence, lack of resources, or unnecessarily complex hardware) and not because "Vega is an architecture for the future and that Pascal isn't." As a matter of fact, the tile-based rasterizer and packed FP16 math, almost the entire set of new features introduced by Vega, were previously introduced by NVIDIA with Kepler and Maxwell years ago...

        • tipoo
        • 2 years ago

        TBR yes, FP16 /works/ on Maxwell/pascal but doesn’t benefit through the full pipeline, it’s there for compatibility

        [url<]http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5[/url<]

      • tipoo
      • 2 years ago

      It worries me that that’s still an internalized strategy for AMD. Aiming for the future while Nvidia and Intel target today is a plan that has burned them time and time again. Performance in today’s titles is what matters when people make buying decisions. That it may age more gracefully in 4 years, I believe it, but I’d weigh things more heavily on what I do today than what I may do in 4 years.

    • Zizy
    • 2 years ago

    Essentially, the selling point of this card is “hey, we have tons of FreeSync monitors, don't bother with the expensive, limited selection of G-Sync.” Because based on these numbers I just don't see much value in the GPU itself, except for those few data points where the 1080 struggles.

      • JustAnEngineer
      • 2 years ago

      I believe that AMD should emphasize the value of FreeSync even [b<]more[/b<] than they do. If a FreeSync monitor is already $175 cheaper than an otherwise-identical monitor with NVidia's expensive proprietary G-Sync and then AMD offers an additional discount to get the monitor in a pack together with a Radeon Vega graphics card... that's a significant value. I'm hoping that they expand the monitor deals to include the [url=http://www.samsung.com/us/computing/monitors/gaming/32--chg70-gaming-monitor-with-quantum-dot-lc32hg70qqnxza/?redir=*C32HG70*<]Samsung C32HG70[/url<].

        • Kougar
        • 2 years ago

        I don’t think AMD is thinking that logically right now.

        • BobbinThreadbare
        • 2 years ago

        This would make more sense in mid range with say a $300 GPU and a $500 monitor instead of the prices of the Vegas and the Samsung monitor.

        • Gasaraki
        • 2 years ago

        But you have to pay $100 more for the pack…

          • lordcheeto
          • 2 years ago

          Makes it less attractive for miners. Might get some of their cards in the hands of gamers.

      • JosiahBradley
      • 2 years ago

      Sadly I have a freesync monitor paired with a 1080ti because Vega never got released. I don't even miss freesync 'cause I just play at 144fps now. They make it sound like you have to have adaptive sync to use a monitor. Sure, it's nice if you can't hit your target refresh, but vsync is awesome too.

        • urvile
        • 2 years ago

        How's that working for you? Because if Vega disappoints in benchmarks, I think I will do the same thing. Although given what Threadripper has done to the CPU market, who knows?

      • Freon
      • 2 years ago

      The Gsync vs. Freesync cost issue is definitely valid.

      I think most people tend to keep monitors for much longer than video cards which amortizes that cost a bit.

      When I bought my XB270HU two years ago, Freesync tech and options were still behind. The MG279Q did not have feature parity with the XB270HU and PG279Q. The $250 premium for something I plan(ned) on keeping for 5-8 years was worth it.

        • K-L-Waster
        • 2 years ago

        Yes — if you keep the monitor for 5 years the G-Sync premium is ~$50 a year. And if you keep it longer that of course drops. So yes, it costs more, but not enough to be life changing in a product you are likely to continue using for a much longer time than any graphics card you connect to it.

        • DoomGuy64
        • 2 years ago

        See, I think that’s a cop out since VRR is not compatible with ULMB. You use one or the other, not both. If you solely want ULMB, there are monitors that support that feature that will work on AMD.

        It is nice to have one monitor that does it all, but if you are buying a VRR monitor specifically for VRR, ULMB is not something you will be using a whole lot of anyway. Maybe for CS:GO or something, but every time you switch modes that requires manually reconfiguring your monitor and refresh settings. Not something I would care to do, since it would get tedious without automatic profiles.

        The nice thing about the MG279 freesync monitor is that the range is reconfigurable via CRU, and you can eliminate the lower ranges to force LFC to double the refresh rate of anything below, which reduces blur of those lower ranges to that of the forced higher range. Gsync doesn’t do that, afaik. Freesync monitors can be a trade off if they don’t support ULMB, but if you are more interested in VRR they are worth it.

          • Voldenuit
          • 2 years ago

          [quote<]It is nice to have one monitor that does it all, but if you are buying a VRR monitor specifically for VRR, ULMB is not something you will be using a whole lot of anyway. Maybe for CS:GO or something, but every time you switch modes that requires manually reconfiguring your monitor and refresh settings. Not something I would care to do, since it would get tedious without automatic profiles.[/quote<] Are you talking about ULMB and nvidia? Because I have my 1080Ti + PG278Q set up so that it automatically switches to 120 Hz+ULMB for Overwatch and G-Sync for everything else. Perhaps some monitors are less user-friendly about it?

            • DoomGuy64
            • 2 years ago

            There isn’t a lot of information about it floating around, so that’s been my guess more than anything. The only thing I know for sure is that both features are currently incompatible with each other at the same time. If your monitor allows profiles, that’s the first I’ve heard about it, as it’s not been greatly advertised that it can do that.

            My MG279’s freesync mode is an option set by the monitor and control panel. You turn freesync on in the monitor, then turn it on in the control panel. If Nvidia’s gsync allows automatic mode switching in software, then I would give it points for ease of use for sure.

            Still, I ultimately bought my monitor for VRR and lower cost. Being able to switch automatically is a nice perk, but Nvidia needs to better advertise that their monitors can do that. I might have considered it a bit more if that was well known.

            Ultimately the best scenario is a monitor that does both at the same time, but I don’t know if that’ll ever happen.

            • Voldenuit
            • 2 years ago

            ULMB and VRR won’t work at the same time, this is because if frame rates are variable, you either have to:
            a. guess how strongly to strobe the backlight while waiting for the next frame, resulting in possibly varying brightness
            or
            b. delay displaying frames until you have a buffer of pending frames to display, resulting in lag.

            Neither of which is desirable, so nvidia and the monitor vendors made their modes exclusive.

            You *can*, however, switch between VRR and/or ULMB on a per-application basis. Simply use nvidia control panel to configure a specific behaviour for any given program, and the game (and monitor) will switch to said profile when launched.

            [s<]You can also manually switch between the two on the fly using on-monitor controls*, or by changing the default mode in nv control panel.[/s<] EDIT: My bad, it looks like the physical toggles on the monitor will only toggle ULMB on or off, not switch between ULMB or VRR. You *can*, however, toggle between fixed refresh or ULMB, which can be useful if you want to turn ULMB on for watching movies from the desktop.

      • Kretschmer
      • 2 years ago

      A lot of that FreeSync selection is questionable, though. Either implementation issues, no LFC, or some other pitfall. Read reviews carefully.

      Like that bundled CF791 has a FreeSync range of 80-100. If you fall below 80FPS, it can flicker. Yay.

    • christos_thski
    • 2 years ago

    Judging from those performance previews, the R9 Furies that were available at 260-300 dollars some months ago were good purchases even with Vega releasing… Vega simply doesn't seem all that important an improvement (unless the full-blown model comes down to 400 bucks?).

    • Shobai
    • 2 years ago

    The paragraph beginning with
    [quote<]The RX Vega 64 Liquid-Cooled Edition [/quote<] has consecutive sentences starting with "This card". The paragraph before the "Packing it up" sub heading: [quote<]ownto [/quote<]

    • Unknown-Error
    • 2 years ago

    So the RX Vega 64 is a year late, offers roughly GTX 1080 performance, but uses 100W more? WTF?

    I read in another forum that RTG actually stands for “[b<]R[/b<]eally [b<]T[/b<]errible [b<]G[/b<]raphics"

      • Krogoth
      • 2 years ago

      Vega silicon is far better at general compute than GP102 and GP104 silicon. AMD doesn't have the fiscal resources to develop a dedicated high-end GPU design, nor do they really need one. This isn't the 1990s-2000s anymore.

      Nvidia is also moving away from having “big” dedicated GPU designs. They are focusing more on general compute, since that's where the big money is. They just need to develop “mid-size” and “value-tier” GPU designs that yield enough performance to satisfy the majority of the PC gaming audience. They can use their massive mindshare to take care of the rest.

        • stefem
        • 2 years ago

        If previous architectures are an indication, Vega won't be superior in “general compute” to GP102 (excluding packed math, which requires precise conditions and applies to only a fraction of workloads). The only thing GP102 lacks is DP performance, which Vega lacks too.

    • DancinJack
    • 2 years ago

    I’d obviously take those numbers with a big bowl of salt until someone has cards in hand for reviews. According to TR’s own testing in Doom (Vulkan), the 1070 (which for our purposes equates to a 980 Ti) shows better average FPS and frametimes than the Fury X. Not quite sure how AMD shows the Fury X beating both the 980 Ti AND the 1080.

    [url<]https://techreport.com/review/31562/nvidia-geforce-gtx-1080-ti-graphics-card-reviewed/5[/url<] Same goes for Deus Ex. [url<]https://techreport.com/review/31562/nvidia-geforce-gtx-1080-ti-graphics-card-reviewed/7[/url<] And Hitman (DX12) [url<]https://techreport.com/review/31562/nvidia-geforce-gtx-1080-ti-graphics-card-reviewed/10[/url<]

      • BobbinThreadbare
      • 2 years ago

      I think this may be due to driver updates since March. Unless a Fury X is slower than an RX 470

      [url<]https://techreport.com/review/31754/amd-radeon-rx-580-and-radeon-rx-570-graphics-cards-reviewed/5[/url<]

        • DancinJack
        • 2 years ago

        [quote<]We tested each graphics card at a resolution of 3840x2160 and 60 Hz, unless otherwise noted.[/quote<] ^This is from the review I posted. Looks like the one you just posted was at 2560x1440. I'm sure there have been driver updates that have made the Fury X faster, I just very much doubt AMD's numbers on these slides until independently verified.

          • BobbinThreadbare
          • 2 years ago

          Good catch on the resolution.

          • urvile
          • 2 years ago

          I doubt numbers from any manufacturer until independently verified.

      • Froz
      • 2 years ago

      “Not quite sure how AMD shows the Fury X beating both the 980 Ti AND the 1080.”

      Except they don't. There are numbers in the little bubbles, and the bubbles within a group are not arranged according to performance. Also, I haven't ever seen a benchmark made by a manufacturer that didn't use some tricks to make its product look better. The slide mentions endnotes; we can only imagine they have more details about the testing.

      • Freon
      • 2 years ago

      Yup got quite the laugh at them showing the Fury X beating the 980 Ti. Nice try, but I think most of us know better.

      Another round of “anisotropic 0x” shenanigans probably.

        • DoomGuy64
        • 2 years ago

        Maybe, but historically it sounds accurate as Hawaii started handily beating Kepler after maxwell and driver updates.

        Fury X beating the 980 Ti might well be possible, especially in newer titles that are better optimized in the driver. Maxwell optimization is lower on Nvidia’s priority list than Pascal. Very possible, but I would also question what settings were used, as the 980 Ti has a lot more ROPs than Fury. AMD probably picked the best case scenario of settings and games that run better.

      • stefem
      • 2 years ago

      They also chose the lower refresh-rate limits of the displays, 48Hz and 40Hz, as the fps thresholds for a “poor experience,” forgetting to mention that's a limitation only FreeSync has, as every G-Sync monitor goes down to 0 fps without tearing.

        • BobbinThreadbare
        • 2 years ago

        I thought Freesync 2 was supposed to solve this.

        • _ppi
        • 2 years ago

        G-Sync does it via the very same method as FreeSync*; it is just transparent to the user, and the minimum framerate is not disclosed.

        And frankly speaking, as the owner of a 4K 60Hz G-Sync monitor, I can testify that G-Sync does not work flawlessly either (I am looking at you, Witcher 3).

        *Basically, when a frame has not come for a long time, it forces a refresh. See PCPer's analysis using an oscilloscope.

          • stefem
          • 2 years ago

          No, they handle frames below the panel's minimum refresh in a different way; the fact that there are games that have problems with it is not pertinent to what I was saying.

          Look what PCPer writes about those benchmarks
          [quote<] If you are using a display with a 48-100 Hz variable refresh rate, going below 48 FPS means you are outside the VRR range (which to be clear, is only true on FreeSync monitors, not G-Sync)[/quote<]

            • _ppi
            • 2 years ago

            No, they handle it in the same way.

            The difference between G-Sync and FreeSync is mostly in the scaler chip. The monitor MUST be refreshed; there is no way around that. See PCPer's oscilloscope testing of early FreeSync vs. G-Sync monitors – there it is apparent that when a G-Sync monitor has been waiting a long time for a refresh, it will force one.

            Whether the forced refresh is initiated by the scaler chip (probably G-Sync) or by the GPU/driver (FreeSync) is, from an end-user standpoint, pretty much irrelevant. The result is the same.

            The only reason you never hear about minimum frame rates and LFC on G-Sync monitors is that nVidia was apparently smart enough from the outset not to allow “LFC-incapable” G-Sync monitors, i.e. as a consumer you do not have to worry about it.

            Tbh I do not get the advertised 48-100 fps window, because with FreeSync 2, Vega is capable of LFC even on this panel. Would it be different if it were 45-100? Or what if someone released a monitor with a 10-10000 fps range? I hope AMD would not advertise that as long as you have 10 fps, you are fine 😉.
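
            To make the LFC behavior described above concrete, here is a rough sketch of the general idea in Python (an illustration only, not any vendor's actual driver logic): when the frame rate drops below the panel's minimum refresh, each frame is repeated enough times that the effective refresh rate lands back inside the variable-refresh window.

                # Rough sketch of low-framerate compensation (LFC): below the panel's
                # minimum refresh rate, repeat each frame so the effective refresh rate
                # falls back inside the variable-refresh window. Illustration only.
                import math

                def lfc_refresh(fps, vrr_min, vrr_max):
                    if fps >= vrr_min:
                        return fps, 1                    # already inside the window
                    repeats = math.ceil(vrr_min / fps)   # show each frame this many times
                    return min(fps * repeats, vrr_max), repeats

                print(lfc_refresh(30, 48, 100))   # (60, 2): each frame displayed twice
                print(lfc_refresh(55, 48, 100))   # (55, 1): no compensation needed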

          • urvile
          • 2 years ago

          Things like G-sync and freesync never work as advertised. Same as SLI and Crossfire. Too many variables. I would put money on G-Sync being based on the same standard as Freesync but with proprietary tweaks.

          There is a point where it becomes impractical to reinvent the wheel. Given that the core functionality would by necessity have to work in the same way. Ah well. Hopefully the threadripper I pre-ordered lives up to expectations. Which it will. Otherwise intel wouldn’t have crapped themselves.

            • Voldenuit
            • 2 years ago

            Having had the buttery smooth goodness of G-Sync for over a year now, I can’t go back, Jerry, I can’t go back!

            • urvile
            • 2 years ago

            OK. I will take your word for it. Anecdotal though it may be. I have had nothing but trouble with SLI and crossfire setups especially when games don’t support multiple GPUs.

            I have a freesync monitor but I get the feeling I am going to end up with a gtx 1080/1080ti. I have to wait and see what the real world vega performance is.

            • Voldenuit
            • 2 years ago

            Oh, I’ve never claimed that SLI/CF were good or worth it. But G-Sync has been a breeze for me, even with ‘problem’ games like Witcher 3.

            • urvile
            • 2 years ago

            Worth the money for you then. I actually don’t know if freesync is any good because I am still running fury X (I have 2 but you know crossfire) with a 3440×1440 screen. Waiting for vega….waiting for vega….. and now it looks like it’s going to be a let down.

            Will I get over it? With a 1080 ti probably.

            • urvile
            • 2 years ago

            So I ordered a threadripper 1920X, MSI X399 Gaming Pro Carbon motherboard and 64GB of hyperX fury DDR4-2666mhz.

            What am I going to do? Put a vega in that? It’s not looking likely.

            • Voldenuit
            • 2 years ago

            [quote<]What am I going to do? Put a vega in that? It's not looking likely.[/quote<] [url<]http://www.pcgamer.com/amd-radeon-pro-ssg-pairs-vega-with-2tb-of-memory/[/url<]

        • DoomGuy64
        • 2 years ago

        LFC fixed that like how long ago? 30 fps is just a “poor experience” because 30 fps is ALWAYS a poor experience.

    • odizzido
    • 2 years ago

    I seem to recall from back in the day, when I was paying a lot more attention to benchmarks, that ATI/AMD cards aged better than Nvidia ones. Not sure if it's generally true anymore (if I am even remembering correctly), but it certainly was for the Fury.

    The hardware bundling looks interesting as well. I am not really in the market for a new GPU….or really CPU I guess…..but if I were I would be taking a look at it for sure. Seems like a good idea.

      • Pettytheft
      • 2 years ago

      Fury didn’t age well or improve much but the 7970-390’s have aged quite well.

        • DoomGuy64
        • 2 years ago

        Yeah, I can definitely vouch for the 390. I bought a Fury only because of a fire sale for under $250, but still have my 390. Probably will sell the Fury for $300 (lol miners), and use the 390 until I get Vega.

      • squngy
      • 2 years ago

      AMD cards in general tend to age better, it is true, but in this segment I think most people simply sell their old cards and buy new ones as soon as something better arrives.

      What point is there in buying a worse card now even if it will perform better in the future, if you’re just going to sell it and buy something else before then.

        • Veerappan
        • 2 years ago

        I buy my cards to usually last 3-5 years. Driver support and improvements definitely factor into my purchase decisions.

        But that being said, most of my video card purchases have been capped in the $250 range, so I’m not necessarily the target demographic for the high-end Vega.

          • Kretschmer
          • 2 years ago

          AMD might not be making desktop GPUs in 3-5 years…

            • freebird
            • 2 years ago

            because all their GPUs will be bought by Crypto-miners… you mean, right? 😀

      • Kretschmer
      • 2 years ago

      They do improve slightly with time, but it’s not worth changing your buying decision.

      [url<]https://www.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine/7[/url<]

      • Airmantharp
      • 2 years ago

      The amount AMD cards improve over time is inversely proportional to the quality of their drivers upon release…

      Typically, AMD overbuilds hardware and underbuilds software. If an AMD card is a good value at time of purchase, then this can work out (assuming drivers are solid for the games you want to play/work you want to do), but otherwise, it’s not worth considering.

    • Krogoth
    • 2 years ago

    It looks like it will have roughly 5-10% more performance than current 1080 SKUs, but that won’t stop Nvidia from releasing an “1180 SKU” (a full GP104 with faster GDDR5X chips), which should allow it to at least match the RX Vega 64 (Air).

    Those who were expecting a 1080 Ti killer are going to be disappointed. I’m personally eyeing the RX Vega 64 (Air) since I have a FreeSync monitor on hand.

    Volta is going to be another minor bump when it comes out, though. Nvidia has already pushed the clockspeed magic of the 16nm process to its limits with current Pascal chips.

      • tay
      • 2 years ago

      Vega is 5-10% less than a 1080 unless you’re talking about the liquid-cooled card with its 345W board power. This is going to be a brutal launch: 14 months too late, 30% too much power.

        • Topinio
        • 2 years ago

        ++

        Not to mention the stupid bundles: while I appreciate some people might want them, I don’t need a new monitor or motherboard or CPU, tyvm, so tying the cards to these isn’t great.

        And every high-end video card I ever bought came with more than 2 games included, and was not £100 more because of that.

    • ronch
    • 2 years ago

    [quote<]AMD took an unusual tack in discussing the potential performance of Radeon RX Vega cards. Instead of focusing on average frame rates, the company argued for the 99th-percentile frame rates[/quote<] Hmmm.... I wonder why... 😉

      • nanoflower
      • 2 years ago

      I’m sure the Wassonater had nothing to do with it. 🙂

    • Ryhadar
    • 2 years ago

    [quote=”Article”<]Early performance numbers for Vega from Frontier Edition cards didn't include any gains from the DSBR, for example, and that feature will be enabled for the first time with the Radeon RX Vega release drivers.[/quote<]
    This was interesting, by the way. Folks on reddit have been speculating this was the case since Frontier Edition dropped. Personally, I thought it was wishful thinking, but this should definitely help performance out a bit.

    Anyway, thanks for the great coverage as always, TR.

    • tay
    • 2 years ago

    The Samsung CF791 isn’t a good add-in. Have they fixed the flickering issues? Never mind the black -> purple during high-contrast motion scenes, the text aliasing issues, the you-should-feel-bad 1500R curve. Waste.
    [url<]https://www.reddit.com/r/ultrawidemasterrace/comments/6548qb/any_update_on_samsung_cf791_flickering/[/url<]

    The card looks like you get 1080 performance for 1080 Ti power dissipation, which honestly isn't too bad.

      • ImSpartacus
      • 2 years ago

      Not to mention its 48-100 Hz refresh range wouldn’t have qualified for LFC’s 2.5:1 range requirement if AMD had not just changed LFC to work with 2:1-range monitors. Feels kinda sketchy to me.
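
      To put numbers on that range requirement: a quick Python sketch, assuming LFC eligibility boils down to a max/min refresh-rate ratio check (the lfc_supported helper below is purely illustrative, not AMD's actual driver logic).

      # LFC repeats refreshes when the game's frame rate drops below the monitor's
      # minimum VRR rate, which only works if max_hz / min_hz leaves enough headroom
      # to fit the multiplied rate back inside the supported range.
      def lfc_supported(min_hz: float, max_hz: float, required_ratio: float) -> bool:
          """Return True if the VRR range is wide enough for LFC at the given ratio."""
          return max_hz / min_hz >= required_ratio

      # The CF791's quoted 48-100 Hz range: 100 / 48 is roughly 2.08.
      print(lfc_supported(48, 100, 2.5))  # False -> misses the older 2.5:1 requirement
      print(lfc_supported(48, 100, 2.0))  # True  -> passes the relaxed 2:1 requirement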

      • Kretschmer
      • 2 years ago

      The 1080Ti is 250W. This is 350W. That’s a huge difference. And it’s 1080 performance per AMD, which means…1070 and change.

    • freebird
    • 2 years ago

    Can it run Crysis in 8K, and what are its mining hash rates… 😀

    • DancinJack
    • 2 years ago

    lol comparing Vega to a 980 Ti. Which launched……OVER two years ago.

      • DancinJack
      • 2 years ago

      Oh, and we’re still weeks away from seeing an actual card in stores.

      Sad!

      • DrDominodog51
      • 2 years ago

      They should have used a 1070 instead.

      I mentally read 1070 instead of 980 Ti anyway since they’re pretty similar in performance.

        • DancinJack
        • 2 years ago

        That’s exactly the point. There was quite literally no reason to use the 980 Ti except they wanted to give people the idea that they’re comparing to Nvidia’s top top top.

          • DrDominodog51
          • 2 years ago

          Isn’t that their marketing team’s job?

            • DancinJack
            • 2 years ago

            For sure. I just think it’s dumb. I imagine most people that are looking at these slides know that the 1070 exists and where it is priced, and maybe even its power consumption.

      • Krogoth
      • 2 years ago

      There’s a fairly sizable userbase still running Maxwell-era GPUs, though.

        • DancinJack
        • 2 years ago

        Yeah I’m sure that’s the reason they did it…

        • JustAnEngineer
        • 2 years ago

        /me waves.
        I’m looking for what comes next after my GeForce GTX 980 Ti. I’ve got my eye on a FreeSync 2 144 Hz monitor to replace my ten-year-old UltraSharp 3007WFP and fifteen-year-old UltraSharp 2001FP.

      • Kretschmer
      • 2 years ago

      I felt like this, too, but it sort of makes sense if you’re targeting 980Ti owners who need upgrades.

        • stefem
        • 2 years ago

        Lol, whoever upgrades to Vega from a 980 Ti would see more of a power consumption increase than a performance improvement.

          • brucethemoose
          • 2 years ago

          Heck, with the 980 Ti’s OC headroom, I bet it can match Vega.

          • Kretschmer
          • 2 years ago

          I mean, anyone who takes AMD marketing seriously? AMD-skewed buyers who nabbed a 980Ti when Fury flopped? They’re out there…

            • stefem
            • 2 years ago

            You misunderstood :). I wasn’t asking a rhetorical question like “who would consider it a great deal to upgrade to Vega from a GTX 980 Ti?” I was just noting that, in this scenario, a user would see a greater increase in power consumption than in performance.

    • the
    • 2 years ago

    [quote<]Early performance numbers for Vega from Frontier Edition cards didn't include any gains from the DSBR, for example, and that feature will be enabled for the first time with the Radeon RX Vega release drivers.[/quote<]
    Interesting that this can be disabled. nVidia did something similar when they went from the GTX 700 series to the GTX 900 series. I'd expect similar gains here after clocks and ALU changes are accounted for. [b<][i<]IIF[/i<][/b<] AMD can get similar gains, it'll be competitive. Right now that is a very big if, considering the experience with the Frontier Edition.

      • stefem
      • 2 years ago

      I don’t understand the reference to the GTX 700 and GTX 900 series cards, but what many don’t know when they talk about the tile-based rasterizer is that NVIDIA enables it only where it improves efficiency, as in some cases it could hurt performance.

        • the
        • 2 years ago

        The GTX 900 series is where nVidia added [url=http://www.realworldtech.com/tile-based-rasterization-nvidia-gpus/<]tile based rasterization[/url<]. That accounts for a good chunk of the performance difference between those two particular generations of cards.

          • stefem
          • 2 years ago

          That is just speculation; we don’t know how much it improves performance, and as I said, NVIDIA stated that they leverage that feature only “when it’s worth it,” as it’s not always beneficial. There are other differences between Kepler and Maxwell, like the reworked SM structure and work distribution, scheduling, cache bandwidth and capacity, memory bandwidth through lossless colour compression… It’s a long list.

            • synthtel2
            • 2 years ago

            I’d guess they’re just disabling it on forward renderers. Tiled rasterization’s benefit is bimodal (or more), not a bell curve.
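
            For anyone curious what the tile-based approach actually does, here is a toy Python sketch of the general binning idea only, not NVIDIA’s or AMD’s actual implementation: triangles are sorted into screen-space tiles up front so each tile’s color and depth traffic can stay on-chip instead of repeatedly hitting external memory.

            from collections import defaultdict

            TILE = 16  # tile size in pixels (illustrative)

            def bin_triangles(triangles, width, height):
                """Map each triangle (a list of (x, y) vertices) to the tiles its bounding box touches."""
                bins = defaultdict(list)
                for tri_id, verts in enumerate(triangles):
                    xs = [x for x, _ in verts]
                    ys = [y for _, y in verts]
                    tx0, tx1 = max(0, min(xs)) // TILE, min(width - 1, max(xs)) // TILE
                    ty0, ty1 = max(0, min(ys)) // TILE, min(height - 1, max(ys)) // TILE
                    for ty in range(ty0, ty1 + 1):
                        for tx in range(tx0, tx1 + 1):
                            bins[(tx, ty)].append(tri_id)
                return bins

            # Rasterization then walks the scene tile by tile, touching each tile's
            # pixels once per binned batch instead of once per triangle.
            bins = bin_triangles([[(5, 5), (40, 8), (20, 35)]], 64, 64)
            print(sorted(bins.keys()))  # tiles (0, 0) through (2, 2) for this triangle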

    • EzioAs
    • 2 years ago

    [url=https://media.giphy.com/media/3oEhmVyGy0CRM3lOH6/giphy.gif<]All aboard...[/url<]

      • tay
      • 2 years ago

      “When I die, I’m going to have AMD lower my casket, so they can let me down one last time!”

        • CampinCarl
        • 2 years ago

        Man I thought that fans of the Cleveland Browns were the only group disappointed enough to use this joke.

        • cygnus1
        • 2 years ago

        You won the internets with that one

      • Pancake
      • 2 years ago

      Oh ho ho! Brutal!

    • Waco
    • 2 years ago

    I like that they’re launching something new…but the slow build-up is ruining consumer expectations and excitement.

      • chuckula
      • 2 years ago

      Stop using the word “slow” in relation to Vega!!
      — AMD marketing

        • raddude9
        • 2 years ago

    • Ryhadar
    • 2 years ago

    [quote=”Article”<]Gamers will have a chance to see for themselves whether Vega is a bright new star for the Radeon Technologies Group come August 14.[/quote<]
    Surprise, surprise… more waiting on AMD.
