AMD demonstrates 7-nm Radeon Instinct card with 32 GB HBM2

As part of AMD's Computex press conference, new Radeon Technologies Group senior vice president of engineering David Wang took the stage to demonstrate the company's first seven-nanometer Radeon Instinct graphics card. The Vega-architecture GPU on this product is meant for server and workstation computing. It'll have 32 GB of HBM2 RAM layered across four stacks of memory—increasingly a prerequisite for the kinds of large data sets businesses want to chew through for machine learning applications and more.

Wang showed the chip performing a job in the Cinema 4D application using AMD's Radeon ProRender software to perform near-real-time ray tracing. Wang's demo partner was able to zoom in and out of the motorcycle model seen above and watch as the render progressed from a noisy smear to a refined model with detailed reflections. The oohs and ahs of a ray-traced image won't be news to anybody, but the fact that the chip was powered on and functioning in a live demo is an important confirmation that seven-nanometer products are on track for the company.

On top of the demo, Wang noted that AMD wants to introduce a new graphics product every year through 2020, whether with a new process—as in the case of seven-nanometer Vega—or with new architectures like Navi and the as-yet-unnamed seven-nanometer-plus product set to arrive a couple years out.

AMD CEO Lisa Su noted that seven-nanometer Vega is sampling to customers now, and the company plans to launch the professional- and server-grade card in the second half of this year. Gamers shouldn't feel entirely left out, though. Su said the company plans to bring seven-nanometer graphics chips to gamers at some point in the future, but didn't offer any more details beyond that tantalizing prospect. Given how quiet AMD has been on the consumer graphics front for 2018, however, a seven-nanometer gaming card is most likely a prospect for next year.

Comments closed
    • ronch
    • 1 year ago

    So how do they plan to brand Vega built on 7nm to differentiate it from 14nm Vega? Lemme guess.. 7ega?

    • ronch
    • 1 year ago

    I just want them to come up with an architecture that’s as efficient as Nvidia’s. IIRC Nvidia pulled ahead in terms of efficiency back in 2014 with Maxwell in the form of the 750 Ti, and since then AMD has been behind the efficiency curve as Nvidia built pretty much everything on that architecture and improved on it. AMD’s only recourse was HBM and a process shrink to 16/14nm with the RX 480, but even so those products weren’t enough to convince us that AMD had efficiency pinned down. Once Nvidia had those advantages too, they again reminded us that AMD has an architecture that’s been missing that elusive secret sauce.

    • LoneWolf15
    • 1 year ago

    You no mess with David Wang.

    [url=https://en.wikiquote.org/wiki/Shadow_Warrior<]historical reference[/url<]

    • Unknown-Error
    • 1 year ago

    Let’s hope AMD can pull a Ryzen with their graphics, but CPU-wise, fixing the Bulldozer fiasco was one thing; competing with nVidia’s graphics advancements is another.

    • renz496
    • 1 year ago

    So how does this Vega 20 compare to the GV100, especially in the FP64 metric?

      • chuckula
      • 1 year ago

      Unless AMD went out of its way to redesign Vega’s architecture for FP64 computations*, you don’t want to know the answer.

      But in fairness to AMD, this chip is not intended to be in the same weight class as the GV100.

      * And all the noise about “AI” and “Deep learning” from the press conference means the answer is almost certainly no because those workloads intentionally *reduce* the precision instead of increasing it.

        • renz496
        • 1 year ago

        So AMD has no replacement for their aging Hawaii for FP64?

          • chuckula
          • 1 year ago

          If AMD wants an FP64 part to compete with the segment of the Tesla market that addresses 64-bit calculations, then the product AMD is showing off is clearly not that competitor.

          They can either make another chip that is the competitor or just continue to ignore that particular market niche, which is definitely what they were doing prior to this demonstration too.

            • Spunjji
            • 1 year ago

            I’d definitely suspect the latter. They’re spread thinly as it is; evidently it’s simpler to design for graphics-appropriate precision and double the throughput for low-precision workloads than it is to scale the design up and provide high-performance high-precision operations.

            Makes sense to me. Nvidia has that market sewn up already, so why spend limited resources fighting their way back into it when they can focus elsewhere?

    • DPete27
    • 1 year ago

    So nothing < Vega for the mainstream until late(?) 2019 then? Cool. Thanks

      • derFunkenstein
      • 1 year ago

      As long as miners are buying up everything they produce, there’s really no impetus. And then as soon as they release something better, miners will drive up those prices because miners want something better, too. It’ll be more profitable all over again.

    • etana
    • 1 year ago

    [quote<]Wang noted that AMD wants to introduce a new graphics product every year through 2020, whether with a new process—as in the case of seven-nanometer Vega—or with new architectures like Navi and the as-yet-unnamed seven-nanometer-plus product set to arrive a couple years out.[/quote<]

    He seems to have left off "Or a new increment on the first digit of a previous product name."

    • chuckula
    • 1 year ago

    Why no love for the Vega 56 Nano TR?

      • Kretschmer
      • 1 year ago

      If it’s anything like the Fury Nano, seven will be sold in total at a 50% premium.

        • chuckula
        • 1 year ago

        To one miner!

        • MrJP
        • 1 year ago

        I’ve got one. Not sure where the other six went.

    • chuckula
    • 1 year ago

    That’s not the absolute best picture but using the published HBM2 stack size information of 7.75 mm × 11.87 mm, I estimate the chip has an area of a little over 300 mm^2.

    Raj claimed that regular Vega had a die size of about 484 mm^2.
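
    A minimal sketch of that kind of ruler math, in case anyone wants to redo the estimate from a sharper photo. The HBM2 stack footprint is the published figure quoted above; the pixel measurements below are hypothetical placeholders, not readings from the actual demo image:

    ```python
    # Estimate the GPU die area using the published HBM2 stack footprint as a ruler.
    # Stack dimensions are the published figures; the pixel values are hypothetical
    # placeholders, not measurements taken from the real photo.
    stack_w_mm, stack_h_mm = 7.75, 11.87    # published HBM2 stack footprint
    stack_w_px, stack_h_px = 100.0, 153.0   # hypothetical: stack size in the photo
    die_w_px, die_h_px = 210.0, 230.0       # hypothetical: die size in the photo

    # Average the mm-per-pixel scale from both stack axes to reduce measurement error.
    mm_per_px = (stack_w_mm / stack_w_px + stack_h_mm / stack_h_px) / 2
    die_area_mm2 = (die_w_px * mm_per_px) * (die_h_px * mm_per_px)
    print(f"Estimated die area: {die_area_mm2:.0f} mm^2")  # ~290 mm^2 with these placeholders
    ```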

      • Waco
      • 1 year ago

      I would guess it isn’t a straight die shrink since it has a much wider memory bus.

        • ptsant
        • 1 year ago

        Doesn’t it also have some new instructions?

          • chuckula
          • 1 year ago

          The presentation focused heavily on AI so I would assume there’s additional hardware for that type of processing.

    • NoOne ButMe
    • 1 year ago

    Remarkably large die size, given the doubling of density that 7nm generally brings over 14/16nm technologies from GloFo-Samsung/TSMC.

    Or lots of new features? Infinity Fabric on Ryzen is about 10% overhead. Scaled up to raise bandwidth between GPUs?

      • K-L-Waster
      • 1 year ago

      Well, it *is* a Pro chip, and those tend to be ginormous. The GV100 isn’t exactly svelte either.

      I expect a gamer part on the same process would be a lot more compact.

        • NoOne ButMe
        • 1 year ago

        Yes, but Vega 7nm is mostly a shrink of Vega 10.
        One would expect about half the die size, as AMD claimed 7nm brings double the density, which is less than the foundries claim; the same goes for power. Performance splits the difference with TSMC and the foundry companies’ claims.

          • K-L-Waster
          • 1 year ago

          It’s also got 32 GB of HBM2, which is likely contributing to the size of the total package.

        • the
        • 1 year ago

        The GV100 is literally the largest mass production chip ever released at 814 mm^2. True madness in terms of its size.

      • blastdoor
      • 1 year ago

      Impressive to see such a large die on a brand new process. All Intel can manage on their 10nm process is a toy dual core CPU.

        • stefem
        • 1 year ago

        The fact that we didn’t see anything bigger on 10nm from Intel doesn’t mean they didn’t have samples of larger dies internally. Even this 7nm Vega is not in volume production and probably still has negligible yield for now.

          • blastdoor
          • 1 year ago

          Nah, if they had it, they’d hook it up to an LN2 cooler and demo it running at 6 GHz.

      • stefem
      • 1 year ago

      7nm offers a 70% max increase in density compared to 16nm, but they aren’t forced to pursue the max-density road.

        • NoOne ButMe
        • 1 year ago

        It offers up to a 70% area reduction, or about 3x the density, according to TSMC.

        AMD claimed 2x density and showed about a 25-35% density increase.
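
        For reference, a quick sketch of how an area-reduction percentage maps onto a density multiplier, using the figures quoted in this thread:

        ```python
        # Convert a quoted "X% area reduction" into the equivalent density multiplier:
        # if area shrinks to (1 - X) of the original, the same logic packs in
        # 1 / (1 - X) times as densely.
        def density_multiplier(area_reduction: float) -> float:
            return 1.0 / (1.0 - area_reduction)

        print(density_multiplier(0.70))  # ~3.3x -- TSMC's headline claim for 7nm vs. 16nm
        print(density_multiplier(0.50))  # 2.0x  -- AMD's claimed doubling for 7nm Vega
        ```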

          • stefem
          • 1 year ago

          No, the max reduction in size TSMC achieved for a simple SRAM bit cell is 2.6x, but you can’t scale everything by the same factor; there’s routing and other stuff. A whole SRAM macro was just 0.34x (or 1/3, or 70%) smaller than on 16nm.

      • synthtel2
      • 1 year ago

      Reports from way back were that this die would be fast at FP64, which could explain a lot of that.

    • ptsant
    • 1 year ago

    Just sell it already, we need competition.

      • Tom Yum
      • 1 year ago

      I’d say that they are waiting for yields to improve enough to make consumer products profitable, especially given the immaturity of the 7nm process. Selling them now would be financial suicide.

        • DPete27
        • 1 year ago

        In today’s market? I dunno, scrape up the yields that pass and sell them at high MSRP (important so AMD gets the $). Release more into the stream as yields improve.
        It’s looking less likely that Nvidia Turing will launch in July, but AMD is gonna need something in the market to not fade to irrelevance when it does release in August/September.

      • NoOne ButMe
      • 1 year ago

      Vega 20 probably 100% for the pro market…

        • ptsant
        • 1 year ago

        True, but it will trickle down.

      • Kretschmer
      • 1 year ago

      You have competition. You’re competing with crypto-miners for GPUs and losing.

        • ptsant
        • 1 year ago

        That’s not how it’s supposed to work. The GPUs should compete for my money.

          • K-L-Waster
          • 1 year ago

          That’s not how capitalism works.

          Companies exist to maximize the return for investors. Full stop. Some of them make products, some provide services, but the *only* reason they do so is to return capital to the company’s owners.

          If multiple companies competing happens to produce price pressure that results in lower prices for consumers, that’s a happy coincidence. But it is a huge mistake to assume that any company — AMD included — exists for the purposes of making it less expensive for consumers to buy things.

            • blastdoor
            • 1 year ago

            I think Karl Marx would agree entirely with that description of capitalism.

            • K-L-Waster
            • 1 year ago

            So would Adam Smith.

            • chuckula
            • 1 year ago

            So would Tim Cook!

            • blastdoor
            • 1 year ago

            Not really. He had a more optimistic view. He was focused on the benefits of labor specialization and competition.

            He did not believe that “greed” (a.k.a. rent-seeking behavior) was good. Rather, he believed greed could be harnessed for positive social outcomes.

            • ptsant
            • 1 year ago

            And capitalism exists because it allows consumers to get stuff. Companies are not natural entities. We INVENTED the notion of a company because it facilitates the production of stuff in an efficient manner.

            Now, while I appreciate the effort you put into your lecture, I still think that a situation in which technical progress stagnates and prices increase is clearly not what we expect from capitalism. Just because you have a rational (but rather obvious, sorry) explanation of why this situation occurred doesn’t make it any more desirable.

            • K-L-Waster
            • 1 year ago

            Evidently you’re under the impression that society as a whole set up a bunch of independent companies and directed them to compete with each other so consumers would have choice?

            ‘Cus that ain’t what happened. Not even close.

            Each *individual company* — Intel, AMD, Nvidia, Apple, the whole lot of ’em — was founded because the founders thought they could make money that way. Not one of them was motivated by what would help consumers. Their own profit was and remains their only motivation.

            Attempting to overlay a “greater good” filter on top of this is always going to lead to disappointment.

            None of us have to like it or consider it desirable — but until someone forms a non-profit technology development and distribution entity, it’s all we’ve got. (Pro-tip: don’t hold your breath waiting for that one…)

            • Kretschmer
            • 1 year ago

            [quote<]but until someone forms a non-profit technology development and distribution entity, it's all we've got.[/quote<] LinGPU on the desktop? 😀

            • ptsant
            • 1 year ago

            There were plans for an open-source GPU but it probably was (is?) a spectacular failure. The patent minefield is probably impossible to navigate.

            • ptsant
            • 1 year ago

            You don’t need to repeat your point, which is pretty obvious but tangential. I don’t disagree with the fact that companies need to make money. I invest in companies and receive dividends, so I understand that quite clearly.

            What I said is that in this specific instance, we, the consumers, don’t get healthy competition for several reasons. It has happened before and will happen again. And I’m not claiming that I have a solution. But I’m not going to pretend that the current situation in the PC market is ideal. It sucks, and the fact that companies still make a lot of money doesn’t change that.

            • blastdoor
            • 1 year ago

            Each individual company — the whole lot of ’em — was not founded in a vacuum. They were founded within the context of a system of laws and regulations which — I hate to break it to you — were actually created by people who, at least in part, care about the “greater good.”

            Anti-trust laws, corporate taxes, health and safety regulations, etc. — all of these things were created before any of the companies you mention, and they were created to curtail the downsides of capitalism. In essence, they were created because Marx’s critique of capitalism was pretty much on the money. His ideas for what to do about it were somewhat lacking (to put it mildly).

            We do not have to just sit back and accept corporate rent-seeking behavior because “that’s capitalism, dummies!” There are things that can be done — and have been done for many decades now — to curtail the worst side effects.

            • K-L-Waster
            • 1 year ago

            If they’re actually doing something illegal, yes.

            What I was responding to 3+ levels up, though, was the idea that AMD is obligated to release a consumer level version of this GPU in the very near term solely to make sure that consumers have desirable choices — the “I want a cheap powerful GPU now ‘cus we deserve competition!!!1!” argument.

            None of the things you mentioned obligates anyone to release a particular product at a particular price point by a particular date.

            Now, if hypothetically Nvidia tried to prevent this GPU from being released, then by all means go after them with Anti-Trust. And rightly so in that scenario.

            My point is “we need competition” does not and should not determine when — or if — AMD goes to market with this or anything else.
