GDDR5X enters mass production ahead of schedule

In a blog post yesterday about the GeForce GTX 1080 launch, Micron confirmed something we've suspected since Friday: GDDR5X memory is now in mass production. That news beats the projected "summer" timeline for GDDR5X mass production we reported on earlier this year.

In the company's announcement, Kris Kido, Micron's global Director of Graphics Memory Business, primarily talks about the Pascal GPU and what we all know it can do. (If you're behind, check out our coverage of Pascal from the last couple of days.) Deep in the blog post, Kido says, "Today, I am happy to announce that GDDR5X, the fastest discrete memory component in the world, has already entered mass production." Moving GDDR5X into mass production this quickly is good news for Micron, which faces stiff competition in the graphics-memory space.

High Bandwidth Memory offers 1024 Gbps (128 GB/s) of potential bandwidth per stack, and HBM2 purportedly doubles that figure. The 10-Gbps per-pin data rate of Micron's current GDDR5X packages is certainly impressive, though, and the new design is much easier to implement than HBM's complicated through-silicon-via construction. Micron is currently sampling GDDR5X in 11-Gbps and 12-Gbps flavors, too, which would yield up to 576 GB/s on a 384-bit bus. We'd expect to see Micron's finest in more upcoming products from AMD and Nvidia soon.
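For reference, those bandwidth figures fall out of simple arithmetic: the per-pin data rate times the bus width, divided by eight to convert bits to bytes. Here's a minimal sketch of the math (our illustration, not Micron's):

```python
def gddr_bandwidth_gbs(per_pin_gbps: float, bus_width_bits: int) -> float:
    """Aggregate bandwidth in GB/s: per-pin data rate (Gbps) times the
    number of data pins, divided by 8 to convert bits to bytes."""
    return per_pin_gbps * bus_width_bits / 8

print(gddr_bandwidth_gbs(10, 384))  # 480.0 GB/s for today's 10-Gbps parts
print(gddr_bandwidth_gbs(12, 384))  # 576.0 GB/s, the figure quoted above
```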

Comments closed
    • Anomymous Gerbil
    • 3 years ago

    As with the previous article on GDDR5X, “Gbps” isn’t the right metric; presumably you mean something like “10G transfers per second per pin”.

      • auxy
      • 3 years ago

      It’s only the “per-pin” part that’s being left off, and it’s left off because these packages are rarely used singly. Ergo, when discussing performance, they’re really just using the per-pin rate in place of clock rate because of the quad-pumped nature of GDDR5, since the ACTUAL bandwidth depends on the implementation.

      It’s more accurate than saying 10GHz.
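      To make the quad-pumping concrete with illustrative numbers (these are an example, not any particular part's datasheet): GDDR5 moves four bits per pin per command-clock cycle, so the advertised rate is four times the clock, not the clock itself.

```python
# GDDR5 is quad-pumped: four bits transfer per pin per command-clock
# cycle, so per-pin data rate = 4x the clock (illustrative numbers).
command_clock_ghz = 2.5
per_pin_gbps = command_clock_ghz * 4
print(per_pin_gbps)  # 10.0 -> "10 Gbps per pin," not "10 GHz"
```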

        • Anomymous Gerbil
        • 3 years ago

        Indeed, but as a tech site they should take the trouble to write accurate descriptions.

    • orik
    • 3 years ago

    How come we never saw any APUs from AMD with GDDR5 soldered on the board rather than DDR3/4? I always thought that would be awesome. Maybe with GDDR5X, or maybe we need to wait for HBM.

      • auxy
      • 3 years ago

      GDDR5 has insanely high latency, which makes it rather unsuited for use as a CPU’s main memory. (´・ω・`)

        • NTMBK
        • 3 years ago

        Works fine for the PS4!

          • Ninjitsu
          • 3 years ago

          Would suck for a general-purpose PC, though. I assume games on the PS4 are tuned around the high latency.

        • tipoo
        • 3 years ago

        The GDDR5 latency thing is both true and untrue. In terms of cycles, latency is far higher than DDR3/4. However, as it’s also clocked much higher, latency in actual *time* ends up not being miles off, sometimes pretty close.

        Also if you were designing an APU around GDDR5 I’d imagine you’d be designing it to hide those latencies as well with caches, and something like the 20GB/s CPU-GPU direct bus as the PS4 has. It also has hUMA going for it so you don’t have to keep sending things back and forth for GPU compute.
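        A quick back-of-the-envelope conversion makes the cycles-versus-time point concrete. The timings below are illustrative stand-ins, not datasheet values:

```python
def latency_ns(cycles: int, clock_mhz: float) -> float:
    """Convert a latency measured in clock cycles to nanoseconds."""
    return cycles / clock_mhz * 1000

# Illustrative timings only -- real CAS latencies vary by part:
print(latency_ns(11, 800))   # DDR3-1600 at CL11            -> ~13.8 ns
print(latency_ns(20, 1750))  # 7-Gbps GDDR5 at a much
                             # higher CL of 20               -> ~11.4 ns
```

        More cycles, but at a faster clock, so the wall-clock gap shrinks.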

      • the
      • 3 years ago

      Likely due to memory capacity issues. AMD’s SoCs support 128-bit-wide buses, which would top out at 8 GB max with GDDR5. Even then, that requires high-capacity GDDR5 chips, which were released relatively recently.
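      That ceiling is just chip count times density. A rough sketch, assuming 32-bit GDDR5 chips run in clamshell mode (my assumption, not stated above):

```python
def max_capacity_gb(bus_bits: int, chip_bits: int, chip_gbit: int,
                    clamshell: bool = True) -> float:
    """Chips the bus can host times per-chip density (Gbit -> GB);
    clamshell mode doubles the chip count on the same bus."""
    chips = bus_bits // chip_bits
    if clamshell:
        chips *= 2
    return chips * chip_gbit / 8

print(max_capacity_gb(128, 32, 8))  # 8.0 GB on a 128-bit bus
```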

        • tipoo
        • 3 years ago

        Current ones do, but it’s not like one designed for GDDR5 couldn’t change that. The semi-custom PS4 APU is 256-bit, for instance.

    • Milo Burke
    • 3 years ago

    The Emperor will be pleased.

    [quote<]GDDR5X enters mass production ahead of schedule[/quote<]

      • the
      • 3 years ago

      What does Krogoth say?

        • Milo Burke
        • 3 years ago

        I modded one of these doohickeys to include an option for Krogoth: [url<]https://s-media-cache-ak0.pinimg.com/736x/b8/59/4a/b8594ae672559f6d7eb4c427f8a13b38.jpg[/url<] That way I don't have to ask him directly.

          • chuckula
          • 3 years ago

          We could also replace all the floaty things in a Magic 8-ball with Krogothisms to guarantee the Krogothic answer.

            • UberGerbil
            • 3 years ago

            You realize there’s just one floaty thing with multiple faces, right?

            • chuckula
            • 3 years ago

            No, there’s one floaty thing for each of my fragmented personalities.

            • UberGerbil
            • 3 years ago

            And half of them are trolls?

            • Ninjitsu
            • 3 years ago

            Spin the wheel to find out!

          • UberGerbil
          • 3 years ago

          Which animal did you pick to replace with [url=http://www.oocities.org/timessquare/2409/krogoth.gif<]this[/url<]?

        • albundy
        • 3 years ago

        Where’s my sandwich?

        • Jigar
        • 3 years ago

        Pretty sure, he is not impressed.

          • Tirk
          • 3 years ago

          It’s a perplexing issue, considering that even with 10-Gbps GDDR5X on the 1080, the card will still have less memory bandwidth than the 980 Ti due to its 256-bit bus (1080 = 320 GB/s, 980 Ti = 336 GB/s). Hopefully they’ve increased Pascal’s memory-usage efficiency to compensate. I await TR’s review to see whether this becomes a factor.

          I think most of the hype for GDDR5X was for speeds greater than the 10 Gbps being offered, and I think even Micron is surprised it was picked up rather than waiting for the faster revisions, hence the sudden announcement of the ramp-up.
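          The same per-pin-rate-times-bus-width arithmetic from earlier bears those numbers out; a quick sketch, with rates and widths taken from the cards' public spec sheets:

```python
# Per-pin rate (Gbps) and bus width (bits) for each card:
cards = {"GTX 1080 (GDDR5X)": (10, 256), "GTX 980 Ti (GDDR5)": (7, 384)}
for name, (gbps, bus) in cards.items():
    print(f"{name}: {gbps * bus / 8:.0f} GB/s")
# GTX 1080 (GDDR5X): 320 GB/s
# GTX 980 Ti (GDDR5): 336 GB/s
```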

    • alloyD
    • 3 years ago

    [quote=”Zak Killian”<]and HBM2 purportedly...[/quote<] good work!

    • tipoo
    • 3 years ago

    So we’re not letting Nvidia’s “G5X” rebrand happen, right?

    The first time, I thought it was a flub; then I saw it in the slide deck, along with plain “G5”.

      • ImSpartacus
      • 3 years ago

      “Stop trying to make G5X happen. It’s not going to happen.”

        • Waco
        • 3 years ago

        Wow, a Mean Girls quote. I’m sad I recognized it. 🙁

          • tipoo
          • 3 years ago

          Sad? It’s a great movie!

          “What are marijuana tablets?”

            • Waco
            • 3 years ago

            I’ll admit to having never seen it, but know the quote anyway. 😛

    • chuckula
    • 3 years ago

    That’s the thing about GDDR5X: It lacks HBM’s positive aspects, which is bad in the long run since 5 years from now most GPUs will probably be using stacked memory descendants of HBM.

    However, GDDR5X also lacks HBM’s negative aspects: It’s cheap to produce, you can make it in quantity right now without too much heartburn, and it has minimal manufacturing complexity increases for the AIB makers.

    So in 5 years will anybody want a GDDR5X card for anything other than a low-end part? Nope. However, for the next 2 or maybe 3 years GDDR5X will still make sense for the majority of consumer-grade GPUs that aren’t super high end.

      • UnfriendlyFire
      • 3 years ago

      I suppose GDDR5X would be more of a mid-range option for GPUs, and a cheap stop-gap solution while HBM improves or becomes cheaper as well.

      I wonder how long DDR3 VRAM is going to stick around?

        • NTMBK
        • 3 years ago

        I hope it will be replaced by DDR4 in this generation. Or the whole category of low end GPUs will be replaced by integrated graphics.

        • Srsly_Bro
        • 3 years ago

          If you consider the GTX 1080 mid-range, that is.

          • sweatshopking
          • 3 years ago

          IT IS MIDRANGE.

            • Srsly_Bro
            • 3 years ago

            You’re being emotional too, as expected, bro. 🙂

            I didn’t make any claim, only intended to add further clarification to the statement.

          • ImSpartacus
          • 3 years ago

            It’s their halo card right now, but yes, it’s ultimately an upper-mid-range GPU. The G*#04 GPUs are upper mid-range while the G*#06 GPUs are lower mid-range. “Big Pascal” has yet to rear its head on the consumer market.

            • Srsly_Bro
            • 3 years ago

            You’re being presumptuous. I didn’t make any claim regarding the 1080. Stop being emotional.

        • Vhalidictes
        • 3 years ago

        It’s also the case that some GPUs simply aren’t bandwidth-limited, especially midrange and entry-level cards. GDDR5X might be more than enough for most cards for the next few years.

          • Ninjitsu
          • 3 years ago

          Yup, seems to be architecture dependent too.

      • maxxcool
      • 3 years ago

      VHS vs. BETAMAX... Let’s hope BETAMAX wins this time.

      On the other hand, we need a no-holds-barred throwdown of the BEST GDDR5X vs. the BEST HBM2 solution to see the cost-benefit ratio and judge for ourselves the ‘need’ for HBM2 overall.

        • UberGerbil
        • 3 years ago

        It’s not VHS vs Beta, though: there are no network effects. It makes no difference to you if your friend has already bought a graphics card with HBM. Back in the day, it made all the difference who you could trade tapes with.

        Also, porn is equally available for all graphics cards regardless of the memory tech they use.

          • maxxcool
          • 3 years ago

          I’m leaning more into the issue of GDDR5X being cheap and EASY to mass-produce, where HBM is expensive and complicated even though it is superior ‘on paper.’

          Edit: added the superior comment.

            • UberGerbil
            • 3 years ago

            Just about everything in tech starts out as expensive and complicated, and then gets cheap and commonplace. Almost everything in your PC began as an exotic, server-grade only technology. SSDs (and fast HDs before that), multi-core superscalar processors, discrete GPUs, high-speed serial buses… you name it, virtually everything debuted as a tech that was superior on paper but too expensive and complicated for consumers. Right until it wasn’t.

            • maxxcool
            • 3 years ago

            Fair point. But not any time soon.

        • Kretschmer
        • 3 years ago

        This isn’t a “similar solution with some tradeoffs” like the home video format war or the HD DVD format war. HBM2 is light-years ahead of GDDR5X but is segmented to the top end. Once HBM2 (3?) is affordable, it will displace GDDR5X. GDDR5X would never “win”.

          • maxxcool
          • 3 years ago

          I will reserve the right to disagree. HBM# will never fully displace PCB-mounted RAM. Budget and mid-range mainstream cards will remain firmly PCB-bound, not just for cost reasons but for manufacturing complexity.

        • Deanjo
        • 3 years ago

        [quote<]VHS vs BETAMAX .. [/quote<] You mean it is dependent on which one the pr0n industry supports?

      • xeridea
      • 3 years ago

      Wouldn’t HBM be easier for AIB makers? The memory is already on package and card size can be a lot smaller, with simpler cooling.

        • chuckula
        • 3 years ago

        It might be “easier” as long as the AIBs have to buy 100% pre-made modules from the GPU maker or some middleman like Hynix or Samsung. That also brings higher costs with minimal savings since they still need a PCB and still have to mount the module to the PCB.

        However, I wouldn’t be shocked to see some of the bigger AIB makers eventually spend the money to gain the ability to buy the GPU, memory, and interposer separately so they can take advantage of spot prices on the memory market. That’s a big expense, but it may be worth it in the long run since it allows for more flexibility.

          • xeridea
          • 3 years ago

          Yeah, I was just under the impression that working with HBM was complex enough that it would have to be done by the vendor, because it’s hard even for them. It could change in the future, I guess.

    • NTMBK
    • 3 years ago

    Good stuff. Bring on the new GPUs!
