AMD’s next graphics cards will be called Radeon RX Vega

Radeon Technologies Group honcho Raja Koduri says AMD's Capsaicin event at GDC has become a tradition at this point. In keeping with that tradition, the company used the event to announce the name of its Vega-powered Radeons. Those cards will drop the numeric nomenclature of their predecessors and instead be called "Radeon RX Vega."

Koduri actually talked quite a bit about Vega at the presentation, but we already had most of the information that he shared. We did see a couple of interesting demos, however. The company demonstrated the benefits of Vega's High-Bandwidth Cache Controller, promising that the feature doubles minimum framerates in Deus Ex: Mankind Divided versus "traditional VRAM mode." Koduri also mentioned that the packed-math compute capabilities of Vega (another way to say "doing FP16 on FP32 units") are "like having twice as many shader units" for certain calculations. AMD demonstrated Vega rendering twice as many hair strands on a character using those packed-math capabilities.
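The packed-math idea is easy to picture in code. The sketch below is purely illustrative: it uses CUDA's half2 intrinsics because that toolchain exposes the same two-FP16-values-per-32-bit-register concept, while Vega itself would be programmed through GCN ISA, OpenCL, or Vulkan, and the kernel name here is our own invention.

```cuda
#include <cuda_fp16.h>

// Illustration of packed FP16 ("two FP16 ops per 32-bit lane"), not AMD's API:
// each __half2 holds two 16-bit floats in one 32-bit register, and a single
// __hfma2 instruction performs a fused multiply-add on both of them at once.
__global__ void saxpy_packed(int n, __half2 a, const __half2 *x, __half2 *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        y[i] = __hfma2(a, x[i], y[i]);  // two FP16 FMAs per thread per instruction
    }
}
```

That one-instruction-for-two-values behavior is what the "twice as many shader units" line refers to.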

We've been looking forward to AMD's re-entry into the high-end graphics processor world nearly as eagerly as we've been awaiting the company's re-entry into the high-end CPU market, and Radeon RX Vega cards will seemingly carry that torch. Koduri didn't clarify whether there would in fact be multiple processors based on the Vega design or simply a single chip, though. Earlier rumors have seeded the idea that there might be a "Vega 10" and "Vega 11" as with the company's Polaris GPUs. We'll just have to wait and see.

Comments closed
    • tacitust
    • 4 years ago

    Whelp, just forked over $150 (after rebate) for an RX 480 a couple of hours ago, so I guess it’ll be a while before I get a Vega. Mind you, the RX 480 is more than enough to play all my Steam catalog of games at 1080p, full quality, 60fps, so I think I’m good. 🙂

    • albundy
    • 4 years ago

    Radeon RX…no prescription necessary.

    • ultima_trev
    • 4 years ago

    Hopefully the 4+ SKUs that are derived from Vega 10 and Vega 11 aren’t all called RX Vega. I personally hope they’ll do a throwback to the ATi days:

    RX Vega All-In-Wonder (Nano tier)
    RX Vega Rage
    RX Vega Rage XT
    RX Vega Rage XTX

      • Kretschmer
      • 4 years ago

      No. Noooooooooo.

      • BurntMyBacon
      • 4 years ago

      I hear they plan to release a model with an S suffix later in a similar vein to the Ti models from nVidia. As to when this RX VegaS materializes: it’s a gamble.

    • jensend
    • 4 years ago

    Didn’t notice before that AMD was doing double throughput FP16. That’ll be a big deal for many things – notably neural nets. NV has drastically gimped consumer card FP16 (1/64th FP32 throughput) to try to keep AI customers buying Tesla units.
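A back-of-the-envelope comparison of the peak rates cited here (the 1/64 and 2x figures come from the comment above and from AMD's packed-math claim; they are theoretical ratios, not benchmarks):

```cuda
#include <cstdio>

int main()
{
    // FP16 rate relative to the same chip's FP32 rate, per the figures above.
    const double consumer_pascal_fp16 = 1.0 / 64.0;  // deliberately throttled
    const double vega_packed_fp16     = 2.0;         // two FP16 ops per FP32 lane
    // At equal FP32 throughput, that works out to roughly a 128x gap in peak FP16.
    std::printf("peak FP16 ratio: %.0fx\n", vega_packed_fp16 / consumer_pascal_fp16);
}
```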

    • tsk
    • 4 years ago

    “We’ve been looking forward to AMD’s re-entry into the high-end graphics processor world nearly as eagerly as we’ve been awaiting the company’s re-entry into the high-end CPU market”

    Slight overstatement, but okay, we’re excited.

      • JustAnEngineer
      • 4 years ago

      I am much more excited about Vega than I am about Ryzen. I do not expect Ryzen to be much faster for gaming nor to offer different gaming-related capabilities than my existing Skylake CPU provides. I do expect that a new Radeon RX Vega graphics card will offer FreeSync, which my current GeForce GTX graphics card can’t do (or can’t do with NVidia’s current drivers).

        • sreams
        • 4 years ago

        “I do not expect Ryzen to be much faster for gaming nor to offer different gaming-related capabilities than my existing Skylake CPU provides.”

        It looks like Ryzen will have a massive impact on pricing overall, so everybody should be excited about it.

          • Klimax
          • 4 years ago

          Not likely. (Unless AMD wants to commit suicide…)

      • NovusBogus
      • 4 years ago

      Maybe Steve Ballmer’s been doing some consulting work lately.

      EFFICIENCY EFFICIENCY EFFICIENCY. EFFICIENCY EFFICIENCY EFFICIENCY EFFICIENCY! EFFICIENCY EFFICIENCY EFFICIENCY….

    • Kretschmer
    • 4 years ago

    Give me a date and a set of reputable benchmarks; anything else is just noise.

    I ended up going with a GTX 1070 instead of waiting six more months for similar performance. I’ll be curious to see how Vega stacks up against Pascal. It’s a shame that the different adaptive sync standards lock us in to a particular vendor.

      • Magic Hate Ball
      • 4 years ago

      It’s a shame that Nvidia is pouting in their corner with proprietary tech instead of going mainstream you mean?

        • tay
        • 4 years ago

        I don’t think it’s that simple. I own an RX 470 and have been looking to get a FreeSync monitor with a good refresh range. Well, there aren’t that many IPS panels with a good FreeSync range, and there is often flickering on them even with their crappy ranges. For example, the new Samsung C34F791: [url<]https://www.reddit.com/r/ultrawidemasterrace/comments/5nae5l/c34f791_owners_fairly_certain_the_flickering_isnt/[/url<]

        G-Sync doesn't suffer from any of this, plus it has ULMB to boot. It is a much better solution, albeit one that will cost $150 more. nGreedia etc.

          • DoomGuy64
          • 4 years ago

          Shill/troll confirmed. The [url=https://www.newegg.com/Product/Product.aspx?Item=N82E16824236466<]ASUS MG279Q[/url<] was one of the first IPS FreeSync monitors to be released, and it has none of the issues you're complaining about. That's a problem specific to that Samsung monitor, not an issue with FreeSync as a whole.

          Don't bring up ULMB if you are even vaguely serious about adaptive sync either, as ULMB is not compatible with either brand's adaptive technology at the same time. Anyone who wants ULMB can just buy a ULMB monitor, as ULMB is not vendor locked. FreeSync also has a ULMB-like feature in LFC, which doubles the refresh rate at lower framerates. FreeSync users can, for example, turn 50Hz into 100Hz, which lessens blur, simply by adjusting their FreeSync range. G-Sync cannot do this, afaik. +1 for FreeSync.

          As for panel quality and individual issues? Don't be lazy, and properly research what you're buying. That goes for every product on the market, as no brand is immune to issues.
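For anyone unfamiliar with LFC, here is a minimal sketch of the frame-multiplication idea described above. It is not AMD's driver code; the range numbers and the `lfc_multiplier` helper are hypothetical, and real LFC needs the panel's maximum refresh to be at least twice its minimum.

```cuda
// Hypothetical sketch of Low Framerate Compensation (LFC), host-side C++ only.
struct FreesyncRange { double min_hz, max_hz; };  // e.g. a range tightened to 80-144 Hz

// How many times to scan out each frame so the effective refresh rate
// lands back inside the panel's variable-refresh window.
int lfc_multiplier(double content_fps, FreesyncRange r)
{
    if (content_fps >= r.min_hz) return 1;     // already in range
    int m = 2;
    while (content_fps * m < r.min_hz) ++m;    // repeat until we're back above the floor
    return m;                                  // assumes max_hz >= 2 * min_hz, as LFC requires
}
// Example: 50 fps content with an 80-144 Hz range -> multiplier of 2,
// so the panel refreshes at 100 Hz (the "50Hz into 100Hz" trick above).
```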

            • Voldenuit
            • 4 years ago

            [quote<]FreeSync also has a ULMB-like feature in LFC, which doubles the refresh rate at lower framerates. FreeSync users can, for example, turn 50Hz into 100Hz, which lessens blur, simply by adjusting their FreeSync range. G-Sync cannot do this, afaik. +1 for FreeSync.[/quote<]

            That is not ULMB. ULMB uses strobing to reduce perceived blur* by creating black frames between discrete refreshes. Doubling the framerate doesn't achieve the same effect.

            * EDIT: worth noting that the majority of perceived blur (I've read whitepapers stating around 70%) is due to the sample-and-hold behavior of LCD panels, and in the future we may see blur reduction from different panel technologies such as AMOLED.

            • DoomGuy64
            • 4 years ago

            [quote<]* EDIT: worth noting that the majority of perceived blur (I've read whitepapers stating around 70%) is due to the sample-and-hold behavior of LCD panels, and in the future we may see blur reduction from different panel technologies such as AMOLED.[/quote<]

            So you get the concept, yet take my statement out of context. I never said LFC *WAS* ULMB, only that, when taken advantage of, it decreases blur in exactly the way you described above. Doubling the refresh rate at lower ranges cuts the length of sample-and-hold in half. So instead of getting the blur of 50Hz, you can get the sample-and-hold rate of 100Hz. I have an MG279Q and I can vouch for this working, and so can anyone else who has a FreeSync monitor capable of hitting those ranges.

            Edit: You know, if monitor manufacturers were smart, they could copy this function into ULMB and give you automatic adaptive ULMB when your refresh rate dips below 72Hz. However, considering that LFC is a driver-based feature, adaptive ULMB would likely also require driver support.

            • Voldenuit
            • 4 years ago

            [quote<]Doubling the refresh rate at lower ranges cuts the length of sample-and-hold in half. So instead of getting the blur of 50Hz, you can get the sample-and-hold rate of 100Hz.[/quote<]

            Unfortunately, doubling the refresh of a static image would do nothing to reduce sample-and-hold blur. The idea behind black frame insertion is that the brain will interpolate motion between two discrete flashes if it's not being shown a 'confusing' static image while motion is supposed to be happening. Showing the same 30 fps image twice to make it 60 fps does not reduce perceived blur. LFC does, however, allow VRR technologies to scale down to lower framerates than they otherwise could, by refreshing the pixel so that the panel's pixel decay time is not the limiting factor.

            EDIT: I think you're conflating two very different technologies here. cf:

            [quote="DoomGuy64"<]Edit: You know, if monitor manufacturers were smart, they could copy this function into ULMB and give you automatic adaptive ULMB when your refresh rate dips below 72Hz. However, considering that LFC is a driver-based feature, adaptive ULMB would likely also require driver support.[/quote<]

            ULMB and VRR (and hence LFC) don't currently coexist. The reason is that if frame times are variable, it becomes difficult to predict how brightly you have to strobe the backlight to achieve constant illumination, as the next frame could arrive in half the time or twice the time of the last frame, or anywhere in between. You could conceivably get around this by using a rolling average and living with flickering or varying brightness. Or you could buffer frames and display them with a one-frame delay, but that would be unacceptable in gaming monitors, where low latency is desirable (good gaming monitors have 5ms or less total lag these days, and monitors with 16ms or more of lag are often considered less than desirable for competitive gaming).

            On top of that, ULMB and VRR address very different problems. VRR improves smoothness; ULMB reduces (perceived) blur. So turning on ULMB when framerates drop would not (significantly) improve smoothness, and turning it off when they rise would (paradoxically) make your image look blurrier as framerates rise. Right now, ULMB is prized in competitive, fast-paced games, where being able to identify silhouettes quickly helps with target acquisition and identification. The gamers who use it know they are trading off smoothness for sharpness, and it has a noticeable effect on their winrate.
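The rolling-average workaround mentioned above might look roughly like this in principle. This is a hypothetical host-side sketch, not anything a real scaler runs (strobing is handled in monitor firmware), and `StrobeController` and its 8-sample window are invented for illustration.

```cuda
#include <deque>
#include <numeric>

// Hypothetical illustration of why ULMB-style strobing fights VRR: the backlight
// pulse width must be a fixed fraction of the *next* frame interval to keep
// average brightness steady, but with VRR that interval isn't known in advance.
class StrobeController {
public:
    explicit StrobeController(double duty) : duty_(duty) {}

    // Called once per completed frame with its measured interval (ms).
    // Returns the pulse width (ms) for the upcoming frame, predicted
    // from a rolling average of recent intervals.
    double next_pulse_ms(double last_interval_ms)
    {
        history_.push_back(last_interval_ms);
        if (history_.size() > 8) history_.pop_front();
        double avg = std::accumulate(history_.begin(), history_.end(), 0.0)
                     / history_.size();
        return duty_ * avg;  // if the next frame arrives early or late, brightness flickers
    }

private:
    double duty_;                 // fraction of the frame the backlight is lit, e.g. 0.25
    std::deque<double> history_;  // recent frame intervals, ms
};
```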

            • DoomGuy64
            • 4 years ago

            [quote<]doubling the refresh of a static image would do nothing to reduce sample-and-hold blur.[/quote<]

            That's not how refresh rate works. You already admitted this in your earlier post, and Blur Busters says the same thing.

            edit: lol @ static image. Phrasing. I get the point, but bad choice of words. 144Hz has less blur than 60Hz, so forcing 60Hz content to run at 120Hz will reduce blur. I'm not saying it eliminates blur, but it unquestionably reduces blur, as anyone who's used a 144Hz monitor can vouch for, and so does Blur Busters:

            [quote<]120 Hz Refresh rate: Each refresh is displayed continuously for a full 1/120 second (8.3ms). This creates 50% less motion blur.[/quote<]

            That's straight from Blur Busters, so any other claim is false. Even if you are getting a lower framerate, 120Hz still has 50% less blur than 60Hz. Forcing 60fps VRR to refresh @ 120Hz will have 50% less blur than 60fps VRR @ 60Hz. Also, after reading Blur Busters for the quote, I read that combining ULMB with VRR is being worked on, so it is clearly possible. Just not with today's monitors.

            Edit: Additional point: IPS/VA panels are the main culprits for blur; TN is not affected nearly as much. That said, I have found my 144Hz-modified MG279Q to be a good compromise and a well-rounded monitor. I don't think any of this arguing is about actual capability, though, as cherry-picking a junk Samsung panel to mischaracterize an entire technology is hardly sane or reasonable. Nor is ULMB a valid argument for VRR when it is currently not compatible. Only a fanboy, troll, or paid shill would bring up such shady arguments.

            • Voldenuit
            • 4 years ago

            Well, you specifically mentioned combining LFC with ULMB, and LFC *does* repeat frames so my use of ‘static image’ was intentional. If your game fps has dropped to 30 fps and your monitor is repeating the image at 60 Hz using LFC, ULMB/Lightboost/backlight strobing will do pretty much nothing, since there is no movement between the refreshes for the visual center of your brain to interpolate from.

            • DoomGuy64
            • 4 years ago

            Even if you are repeating frames, the period of time between refreshes has been halved.

            60 FPS is 60 FPS. However, if you double your refresh rate, you get 60 FPS refreshed every 8.3ms instead of every 16.7ms. Blur is inevitably reduced by this. I’m not referring to ULMB, just the standard response time of a normal panel. Doubling a 60Hz refresh rate with LFC reduces blur to the level of 120Hz. Maybe you need to try it for yourself, but it does work in reducing blur with VRR.
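For reference, the numbers both sides are using come straight from the sample-and-hold persistence formula; a tiny worked example follows. It only shows how long each refresh stays on screen, and does not by itself settle whether repeating the same frame reduces *perceived* motion blur.

```cuda
#include <cstdio>

// Sample-and-hold persistence: each refresh stays on screen for the full
// refresh period, so persistence (ms) = 1000 / refresh rate (Hz).
static double persistence_ms(double refresh_hz) { return 1000.0 / refresh_hz; }

int main()
{
    std::printf("60 Hz:  %.1f ms per refresh\n", persistence_ms(60.0));   // ~16.7 ms
    std::printf("120 Hz: %.1f ms per refresh\n", persistence_ms(120.0));  // ~8.3 ms
}
```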

            • Kretschmer
            • 4 years ago

            I own (currently trying to sell) an MG279Q. It’s pretty good, but arbitrarily limited to 90Hz in FreeSync mode out of the box. G-Sync is certainly more rigorous with vendors.

            Also ULMB and LFC have literally nothing to do with each other.

            • DoomGuy64
            • 4 years ago

            That’s an artificial limitation, probably a result of it being one of the first FreeSync monitors available, before LFC existed. You can simply change it with CRU.

            [url<]https://www.newegg.com/Product/Product.aspx?Item=0JC-0081-00002[/url<] Here's a newer IPS panel that hits the higher ranges ootb, and it's cheaper. Just a question of whether or not you want a name brand.

            • DPete27
            • 4 years ago

            There are a good number of modern games in which even the top-tier RX 480 would have a hard time achieving 90fps at the MG279Q’s 1440p resolution. Obviously there are less-demanding games that can run far above that even on an RX 460, but you have to ask yourself: is >90Hz REALLY that critical (different people, different opinions), or is it an epeen spec race?

            • Voldenuit
            • 4 years ago

            [quote<] is >90Hz REALLY that critical (different people, different opinions), or is it an epeen spec race?[/quote<]

            As someone who's been using a 144 Hz G-Sync monitor for the past year, I can definitely tell the difference. A driver update from a few months ago dropped my Overwatch framerate from 120 fps to 109 fps, and it was noticeable (this has since been fixed). Anything under 100 in an FPS feels 'slow' to me now.

            • DoomGuy64
            • 4 years ago

            I don’t know why people are still on about 90 FPS with the MG279Q. I have one, and am running freesync @ 144hz. Just download CRU. Such a worthless non-argument.

        • jihadjoe
        • 4 years ago

        As it is now, both standards are effectively proprietary. The only way to use FreeSync is with an AMD card, and the only way to use G-Sync is with Nvidia. What’s the value in FreeSync being open when there aren’t any other GPU manufacturers to make use of it?

          • synthtel2
          • 4 years ago

          Intel makes noise about supporting Freesync, and Nvidia might someday (if they can’t get away with not supporting it any longer). G-Sync isn’t going anywhere though. A good monitor should outlast multiple video cards. If you buy Freesync, you have at least a lot more potential options for future video card upgrades.

            • Voldenuit
            • 4 years ago

            Intel’s been making those noises since Skylake, with nothing to show for it.

            Meanwhile, if I want a monitor with ULMB, G-sync is my only option (although you can buy an old 3DVision/Lightboost monitor for cheap, but I wouldn’t recommend it as panel quality has improved significantly over the years).

            Monitors *used* to last 5-10 years, but with the introduction of VRR, 4K and HDR, we’re in the transitional phase where rapid spec and price improvements will render current monitors obsolete.

            • synthtel2
            • 4 years ago

            It does depend on your use case somewhat. Still, considering the rate of improvement of monitors (still not that fast), I would find it odd if someone wanted to drop money on a good monitor without intending to keep it a while (outside of a few pro or semi-pro uses).

            • RAGEPRO
            • 4 years ago

            Samsung CFG70 has MPRT, which is a superior version of ULMB. I have one. It’s the tits. 🙂

            • Voldenuit
            • 4 years ago

            Good point, and I believe there are some benqs that have a similar feature. But the Samsung is 1080p, and at the time I was shopping, it wasn’t out, plus I wanted 1440p.

            • RAGEPRO
            • 4 years ago

            Yeah, I feel you on the resolution. I only have a 290X so maintaining >100 FPS in 1440p is a little bit of a tall order in most games. I’m happy enough with 1080p + SMAA anyway. 🙂

            • psuedonymous
            • 4 years ago

            [quote<]"Samsung CFG70 has MPRT, which is a superior version of ULMB."[/quote<] Oh wow, that's an impressive case of marketing wankery on the part of Samsung. [url=https://news.samsung.com/global/interview-how-samsung-achieved-a-1ms-response-time-in-the-cfg70-curved-gaming-monitor<]Looking at their description of MPRT[/url<], it could be summed up as "we couldn't get Global Refresh working on our panel, so we turned segmented refreshing from a bug into a feature!". The 'Those aren't bullet holes, those are speed holes!' of the panel updating world.

            • RAGEPRO
            • 4 years ago

            Uh, sorry, but you’re a little bit mistaken. This is not about the panel refresh, but about the backlight. You may want to [url=http://www.blurbusters.com/faq/scanningbacklight/<]start here[/url<] and do a little research.

            • DPete27
            • 4 years ago

            Didn’t Intel’s contract with Nvidia for IGP stuff end, and didn’t they switch to AMD? I’d put good money that’s why Intel hasn’t supported FreeSync yet. (If) they’re with AMD now, I’d expect those noises to come true relatively soon.

            [Add] [url=http://www.fudzilla.com/news/processors/42806-intel-cpu-with-amd-igpu-coming-this-year<]link[/url<]

          • sreams
          • 4 years ago

          “What’s the value in Freesync…”

          No royalty fees. The only reason nobody other than AMD supports Freesync is because nVidia is effectively the only other game in town and they refuse to give up the extra $ by supporting an open standard. Can’t really blame AMD for that.

            • JustAnEngineer
            • 4 years ago

            The value is saving the nearly $200 per monitor fee for proprietary G-Sync. At the low end, adding G-Sync actually doubles the cost of the monitor compared to a similar monitor without G-Sync. FreeSync is nearly free.

            • ColeLT1
            • 4 years ago

            I waited until the monitor was on (perpetual amazon) sale, got a <$500 dell 27in Gsync 1440p monitor. It IS a TN panel, but looks better than my old IPS Dell 2407WFP sitting next to it.

            I was comparing at the time to the Acer XG270HU (TN also) because of the freesync range of 30-144Hz.

            The price difference was less than $40. Just shop around a bit and look for the monitor you want to be on sale. For $40, I went with gsync.

            • DPete27
            • 4 years ago

            You compared a deeply discounted GSync monitor to a regular price FreeSync one? Ok.

            • ColeLT1
            • 4 years ago

            “Deeply discounted” yet it ran at that price for a year on amazon/newegg. Practically nothing electronic sells at MSRP after the introduction.

            • JustAnEngineer
            • 4 years ago

            Of course it makes sense to buy something when it is on sale. When you do an apples-to-apples comparison, monitors with G-Sync are usually about $180 more expensive than similar monitors with FreeSync (when both are at regular price or both are on sale).
            The AOC G2260VWQ6 is the cheapest FreeSync monitor available, at $115 from Newegg.

            The AOC G2460P[b<]F[/b<] is a 24" 144 Hz 1080p FreeSync monitor for $210 from Amazon. The otherwise-identical AOC G2460P[b<]G[/b<] is the cheapest G-Sync monitor available, for $400 from Newegg.

    • Flying Fox
    • 4 years ago

    So is this even a paper launch? Or does it amount to a glorified PowerPoint slideshow? OK, the official name is announced. Big deal…

      • Freon
      • 4 years ago

      The whole thing was a complete non event.

        • Glix
        • 4 years ago

        Free <Yeah! *cheers* > T-shirts.

        *crickets*

        *one guy clapping*

      • chuckula
      • 4 years ago

      We’re getting a paper launch with too much hype later tonight from Nvidia.

      This is a pre-paper launch name announcement.

    • Neutronbeam
    • 4 years ago

    Of course they weren’t going to give away a lot of information yet, Zak. Everybody knows that what happens in Vega stays in Vega.

      • chuckula
      • 4 years ago

      In that case, it’ll be another 25 years until we get the information.

      Unless AMD’s secret feature for Vega is a wormhole.

        • BurntMyBacon
        • 4 years ago

        Sure we might see the information in 25 years, but how long until it gets here? I’d hate to pay the shipping fees on that one.

    • Neutronbeam
    • 4 years ago

    Prescription Vega? Since AMD is already Ryzen, maybe it should be RX Viagra.

      • UberGerbil
      • 4 years ago

      [url=https://youtu.be/BROS4TUg-WU<]Obligatory[/url<]

      • ImSpartacus
      • 4 years ago

      I expect a lot of corny jokes on this topic, but this one wasn’t so bad. I cracked a smile.

      You win this time.

    • bhappy
    • 4 years ago

    Who would have guessed AMD’s new card would be called RX Vega? Glad they did a whole presentation just for that news.

      • EndlessWaves
      • 4 years ago

      Not too surprising after they did the same thing with Fury last generation.

      Whether we’ll see Vega Nano, Vega and Vega X remains to be seen though.

        • derFunkenstein
        • 4 years ago

        The Fury GPU had a codename of Fiji. This is a new thing that’s super annoying, along with Quadro GP100. Are you talking about the GPU, the architecture, or the graphics card? Who knows?
