In the lab: AMD’s Radeon VII graphics card

AMD pulled some surprises out of its hat at CES this year, not just by showing off one of its impressively efficient Zen 2 CPUs but also by unveiling the first consumer graphics card made on a 7-nm process. The aptly named Radeon VII has the GeForce RTX 2080 in its sights, according to AMD's own benchmarks, and it's got plenty of big numbers to go with its big britches.

The Radeon VII boasts 3840 shader ALUs spread across 60 Vega compute units, 16 GB of HBM2 RAM running at 2 Gb/s per pin, and a 4096-bit-wide memory bus capable of delivering a theoretical terabyte per second of memory bandwidth. Those considerable resources come together to power AMD's long-awaited contender in the truly high-end, under-$1000 graphics market established by the GTX 1080 Ti. The move to TSMC's 7-nm FinFET process has allowed AMD to crank core clock speeds to a peak of 1800 MHz, too, compared to a peak of 1546 MHz on the RX Vega 64.
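Those headline figures hang together arithmetically. Here's a quick back-of-the-envelope check (a sketch only; the 64 ALUs per compute unit and two FP32 operations per ALU per clock are standard GCN characteristics we're assuming, not fresh disclosures from AMD):

```python
# Back-of-the-envelope check on the Radeon VII's headline numbers.
# Assumes the usual GCN layout of 64 shader ALUs per compute unit and
# two FP32 operations (one fused multiply-add) per ALU per clock.

compute_units = 60
alus_per_cu = 64
shader_alus = compute_units * alus_per_cu                     # 3840 ALUs

bus_width_bits = 4096
per_pin_rate_gbps = 2.0                                       # HBM2 at 2 Gb/s per pin
bandwidth_gb_per_s = bus_width_bits * per_pin_rate_gbps / 8   # 1024 GB/s, i.e. ~1 TB/s

peak_clock_ghz = 1.8
peak_fp32_tflops = shader_alus * 2 * peak_clock_ghz / 1000    # ~13.8 TFLOPS

print(f"{shader_alus} ALUs, {bandwidth_gb_per_s:.0f} GB/s, {peak_fp32_tflops:.1f} TFLOPS")
```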

The Vega 20 GPU

Along with the card's encouraging performance potential, CEO Lisa Su noted during her CES keynote that the Radeon VII will deliver "25% higher performance at the same power." If we take that to mean "the same power as the RX Vega 64," it's no surprise that the company has opted for a triple-fan, open-style cooler rather than the harsh-sounding blower that shipped aboard the RX Vega duo. The Radeon VII also needs two eight-pin power connectors to do its thing, just like its RX Vega predecessors. The only partner cards announced for the Radeon VII so far are rebadged versions of AMD's reference design, so it remains to be seen whether custom-cooled cards will eventually emerge.

Stay tuned for our full review of the Radeon VII. This card will launch Thursday, February 7 for $699.

Comments closed
    • Action.de.Parsnip
    • 7 months ago

    Deleted

    • ronch
    • 7 months ago

    How about they make a Ruby Edition? And a Fixer Edition? I miss those characters.

    • sweatshopking
    • 7 months ago

    WILL THIS MAKE MY ITUNES FASTER

      • NTMBK
      • 7 months ago

      Not as fast as the 2080

    • tipoo
    • 7 months ago

    Waiting for the next generation they teased instead

    [url]https://i.redd.it/06b7nu7s7re21.png[/url]

    • DPete27
    • 7 months ago

    Jeff is back!?
    Is this a side gig Jeff?

      • NTMBK
      • 7 months ago

      Looks like he didn’t get the Intel CEO gig

      • tipoo
      • 7 months ago

      Sick prank bro!

      • Inkling
      • 7 months ago

      Thing is, you know all that scripting Bruno was working on for reviewing GPUs and CPUs? We told Jeff that it was to automate some of the benchmarking, but it was actually to automate and replace Jeff. Once he had produced a critical mass of review content, we were able to build AI to replicate him.

      It’s so good that if I hadn’t told you you would’ve never known that it wasn’t actually Jeff doing these articles. But since TR is always honest with our readers, I felt like we had to inform you.

      In honor of Jeff’s years of hard work, we’ll continue to produce these under his name.

      😉

        • DPete27
        • 7 months ago

        Love it!

      • thedosbox
      • 7 months ago

      I’m disappointed in the lack of cat if this turns out to be his final TR appearance.

        • K-L-Waster
        • 7 months ago

        Sheesh, at least wait for the review 😛

          • thedosbox
          • 7 months ago

          The cat did not make an appearance 🙁

    • Stochastic
    • 7 months ago

    Last I heard AMD is still targeting 2019 for the launch of Navi. That kills any excitement I might have for the Radeon VII.

      • dragontamer5788
      • 7 months ago

      It seems incredibly unlikely that AMD will release Navi at the same performance point as Radeon VII.

      Seems like Navi will be a Vega64 replacement (and below).

      • tipoo
      • 7 months ago

      Word seems to be that Navi will launch as a midrange part, more about bringing 1080 performance to under 300 dollars than replacing the VII.

      • willyolioleo
      • 7 months ago

      I expect the 2019 Navi will be midrange, probably close to identical to whatever they’re releasing for the next-gen consoles.

      A higher-end Navi probably won’t arrive until 2020.

        • psuedonymous
        • 7 months ago

        I’d expect Navi to be the midrange counterpart to Vega 20, as Polaris was to Vega and Tonga/Hawaii was to Fury/Fiji.

          • enixenigma
          • 7 months ago

          Agreed. I’d expect Navi to compete in the 2060/2070 range.

    • Mr Bill
    • 7 months ago

    I knew that Polaris and Vega are named after current and future pole stars. But I just realized that the Phenom (with its Thuban ([url=https://starchild.gsfc.nasa.gov/docs/StarChild/questions/question64.html]Alpha Draconis[/url]) core) is also named for a former pole star (circa 3000 BC). AMD has been using star names for longer than I realized.

      • Mr Bill
      • 7 months ago

      Navi is named after Gamma Cephei – also known as Errai. It was nicknamed Navi by astronaut Virgil Ivan “Gus” Grissom because they were using it as a navigational point, and also as a play on words, because Ivan can be rearranged to Navi. It will be the closest star to the pole in about 4000 years.

        • K-L-Waster
        • 7 months ago

        Will this make the upcoming GPU 4000 years ahead of its time?

        • drwho
        • 7 months ago

        ivan ….Navi LOL

          • Srsly_Bro
          • 7 months ago

          Drwho…. ohw[ei]rd[o] LOL

      • Wirko
      • 7 months ago

      Poor Australians, will it ever be their turn?

        • Mr Bill
        • 7 months ago

        You can use the [url=https://earthsky.org/favorite-star-patterns/how-to-use-southern-cross-to-find-south-celestial-pole]Southern Cross and a compass to find the South Celestial Pole[/url]. But no useful star is nearby. Apparently you can also use [url=https://en.wikipedia.org/wiki/Crux]Crux and your right fist[/url] to find the south pole. It’s a long time until there is a southern pole star: [url=https://earthsky.org/tonight/sirius-future-south-pole-star]Delta Velorum and Sirius[/url] will become south pole stars in the years 9250 and 66270, respectively. Hmmm, Radeon Crux, it could be a contender.

          • Krogoth
          • 7 months ago

          Technically, the southern celestial pole lies within the constellation Octans, and Sigma Octantis is the southern pole star.

          But all of the stars in Octans are too faint to see with the naked eye outside of ideal conditions. The much brighter Crux is the practical navigation tool, since its long axis points fairly close to the south celestial pole.

          • JustAnEngineer
          • 7 months ago

          Relying on that compass may be a problem if you’re looking for something near the opposite end of the globe.
          [url]https://www.npr.org/2019/02/04/691471616/as-magnetic-north-pole-zooms-toward-siberia-scientists-update-world-magnetic-mod[/url]

    • Voldenuit
    • 7 months ago

    Can we confirm that it has 64 ROPs?

    Early rumors on the card quoted 128 ROPs, which was taken by many people to be an error in reporting.

      • chuckula
      • 7 months ago

      The 128-ROP figure was an error on AnandTech’s part, and they have issued a correction that it’s 64, like everyone expected.

        • tipoo
        • 7 months ago

        I will be interested in seeing how much difference 1TB/s makes when it has the same ROP count and architecture as the Vega 64. Might be more for the baby compute card side than anything.

    • chuckula
    • 7 months ago

    Why is it that I’m afraid this review will end up like the Super Bowl?

    Everyone outside of New England hates Ngreedia, but they still manage to “win” (kinda sorta) over the younger Rams, even though you really think both teams deserve to lose by the time the whole thing is over.

      • Thrashdog
      • 7 months ago

      We really wanted that [s]Saints[/s] Matrox vs. [s]Chiefs[/s] 3dFX matchup, but this is what we get instead, huh?

        • K-L-Waster
        • 7 months ago

        Someone will soon post to boycott [s]the NFL[/s] GPUs and [s]watch NCAA[/s] get an XBOX any moment now....

        • alloyD
        • 7 months ago

        Wouldn’t be much of a game. 3dFX would use their Voodoo to Glide right into a victory.

          • chuckula
          • 7 months ago

          Matr[s]i[/s][u]o[/u]x would still win: The scoreboard is in 2D.

            • Growler
            • 7 months ago

            What if you used a Matrox card for 2D and two Voodoo2s in SLI? Then you get the best of both worlds.

    • ermo
    • 7 months ago

    “Does it come in black?”

      • K-L-Waster
      • 7 months ago

      Yes, but the IPS glow makes it look light grey…

        • Wirko
        • 7 months ago

        That’s VA glow. IPS glow is red. AMD used both shades of black this time.

    • gru5354
    • 7 months ago

    To all those who have been having a nice day ridiculing the Radeon VII’s 300 W TDP: the 2080 is rated at 215 W.
    That 85 W higher draw, with twice the memory that happens to need _some_ wattage, is truly laughable <facepalm/>

      • NTMBK
      • 7 months ago

      It’s a 7nm card fighting a 12nm card, with the (ostensibly power-saving) HBM2 against GDDR6, and it [i]still[/i] comes out far less efficient.

      • derFunkenstein
      • 7 months ago

      85W less power and, you know, much higher performance. Probably.

        • derFunkenstein
        • 7 months ago

        my goodness I wrote the exact opposite of what I meant.

    • Chrispy_
    • 7 months ago

    I’m looking forward to this.

    The triple-fan solution worries me somewhat, though. AMD haven’t crammed in any more transistors with the 7nm die-shrink, so they have just increased clocks. The real question is whether they have ruined power consumption in their chase to match or beat a 2080.

    • maroon1
    • 7 months ago

    Predictions:

    It is going to be slightly slower than RTX 2080 on average and consume much more power than RTX 2080 Ti

      • RAGEPRO
      • 7 months ago

      Much more power than a 2080, yeah. 2080 Ti? Nah. Should be around the same methinks. Remember, this really is a next-gen manufacturing process. Even taking into account the relative inefficiency of GCN compared to Pascal and Turing, it has a big advantage there.

      • Chrispy_
      • 7 months ago

      AMD didn’t specify which version of the Vega 64 they were comparing power consumption to.

      Vega 64 Air = 295 W
      Vega 64 LC = 345 W

      Two 8-pin connectors means a peak draw of 375 W is permitted for the Radeon VII; the real question is why it needs such a massive cooler.
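      For reference, that 375 W ceiling is just the sum of the standard PCIe power budgets (a quick sketch; 75 W from the slot and 150 W per eight-pin connector, per the PCIe CEM spec):

      ```python
      # Peak board power permitted by the Radeon VII's power delivery,
      # per the standard PCIe CEM limits (75 W slot, 150 W per 8-pin connector).

      slot_w = 75
      eight_pin_w = 150
      connectors = 2

      permitted_peak_w = slot_w + connectors * eight_pin_w   # 375 W
      print(f"Permitted peak draw: {permitted_peak_w} W")
      print(f"Headroom over Vega 64 air (295 W): {permitted_peak_w - 295} W")
      print(f"Headroom over Vega 64 LC (345 W): {permitted_peak_w - 345} W")
      ```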

    • Waco
    • 7 months ago

    Is the review embargo going to lift prior to them going up for sale (hopefully you can say)?

    I’m on the fence since I need a new GPU anyway, but there’s essentially zero chance I’ll buy one on launch day without at least a little trusted review sauce prior.

    • K-L-Waster
    • 7 months ago

    At first glance I thought it had no video outs. Took me a moment to see them huddled close to the PCB on the first pic.

    Other than that, well, we’ll see what the review has to say.

    • Srsly_Bro
    • 7 months ago

    We just got a few inches of snow in Seattle. Can I shovel snow with this?

      • chuckula
      • 7 months ago

      With the triple fan cooler, you can blow the snow away in every dimension.

        • Srsly_Bro
        • 7 months ago

        And with the heat generated it will evaporate the water?

          • K-L-Waster
          • 7 months ago

          Dynamically Generated Vapour Chamber Cooling (DGVCC) — only on Vega VII!!

        • sreams
        • 7 months ago

        Maybe in three… But not in every dimension.

      • Jigar
      • 7 months ago

      Srsly_bro? A graphics card is not a shovel. 😛

      • auxy
      • 7 months ago

      You’re in SEATTLE? That explains so much. ( *´艸｀)

        • UberGerbil
        • 7 months ago

        Care to elaborate?

          • Wonders
          • 7 months ago

          Seattle, Washington.

            • derFunkenstein
            • 7 months ago

            I think he wants elaboration on what the location explains. IIRC, UberGerbil is near Seattle, too.

          • derFunkenstein
          • 7 months ago

          It just hasn’t been the same since Frasier Crane signed off.

          • Srsly_Bro
          • 7 months ago

          I’m not even sure what that means. Seattle varies so much it could mean anything, even a compliment. Thanks, auxy.

            • cynan
            • 7 months ago

            Do you think at least some of this variation comes from the social fragmentation derived from the town basically being comprised of a concentrated cluster of Seattleite communities?

            • UberGerbil
            • 7 months ago

            At least some of it comes from most of them having moved here from everywhere else in the past 8 or so years.

      • tipoo
      • 7 months ago

      As a professional Canadian…

      …No.

        • derFunkenstein
        • 7 months ago

        Somebody pays you to be Canadian?

          • tipoo
          • 7 months ago

          Among other reasons, yes

            • derFunkenstein
            • 7 months ago

            Shit i live in the wrong country

            • chuckula
            • 7 months ago

            Just move to Alaska!

            They’ll pay you (seriously, they will, check Wikipedia), the weather is just as bad as Canada….. AND YOU NEVER HAVE TO SAY YOU’RE SORRY!!

            • ludi
            • 7 months ago

            They won’t pay you right away. Emigrants to Alaska have to go through some hoops to establish permanent residence first. Not personal experience, but so I am told by relatives who lived there at one point.

            • UberGerbil
            • 7 months ago

            …BUT YOU SHOULD!!

    • Gadoran
    • 7 months ago

    …"Impressively efficient Zen 2"… nice, but too bad nobody has tested one of these yet. So where is the proof?
    Looking at this card, I don’t see many advantages to shifting to 7 nm. "Half the power at the same clock speed" is a bold claim; in reality they have only raised the clock by about 15% at the same power, with higher manufacturing costs.
    Nvidia does a lot better by radically changing the architecture on a less expensive node, with far better results.

      • Srsly_Bro
      • 7 months ago

      If you paid attention to the news you’d know. If you are grossly ignorant and just want to be in disbelief and argue, keep doing you, bro.

        • Gadoran
        • 7 months ago

        If you really think this product is the right choice to finally gain some momentum over Nvidia, then speaking of levels of ignorance, it’s a good battle between us.
        This low-volume card makes no sense, and it can’t hide the absence of a real AMD offering at the high end.

          • MOSFET
          • 7 months ago

          It’s just a choice in a market. That’s all it is. You’re correct in many ways: it’s not going to gain momentum [b]over[/b] Nvidia, it will be low volume, and it does not hide the absence of a high-end gaming card from AMD. On all three points, you are correct.

          I don’t understand your disdain for this product, though. Just don’t buy it. A casual market observer wouldn’t care either way, and would likely be happy for a product release that’s decidedly not low-end. An Nvidia fan really wouldn’t care either way; this is no harm, as you point out. An AMD fan should be happy there is a GPU product release at all, especially one that’s decidedly not low-end. It gives them something to sell to keep them afloat.

          Therefore, I don’t understand your disdain.

            • Anonymous Coward
            • 7 months ago

            The internet helps bring out our inner chimpanzee. MUST DESTROY THE OTHER GUYS. VERY IMPORTANT.

            • K-L-Waster
            • 7 months ago

            To paraphrase Genghis Khan: “It is not enough that I bought a good product, every other product must be seen as horrible by everyone on Earth.”

            • Anonymous Coward
            • 7 months ago

            I wonder whether we westerners will become (as a whole) more robust against the low-brow trolling, click-baiting, attention-whoring, and general display of degeneracy that the internet brings to each of our screens. Will the generation that grows up with all this be slaves to manipulation by businesses and governments?

            • JustAnEngineer
            • 7 months ago

            It has already happened.

      • enixenigma
      • 7 months ago

      Rebranded prosumer chip (MI50) versus chip designed from the ground up for gaming. Shouldn’t be a surprise that one is better than the other for gaming.

        • ptsant
        • 7 months ago

        Radeon Instinct isn’t a “prosumer” chip; it’s a hardcore server product that doesn’t even come with a fan. Of course, it’s based on Vega. But all the improvements have gone into fast compute and DL, not into running games.

          • enixenigma
          • 7 months ago

          You are correct about the Instinct line. For whatever reason, I mixed this line up with the Radeon Pro in my head. Everything else you said only reinforces my point, however.

    • 1sh
    • 7 months ago

    Vega 64 is a pretty solid buy for $400. It’s $50 more than the RTX 2060 and $100 cheaper than the RTX 2070 while offering better performance in many cases.
    Of course you give up some heat and power efficiency, but you can undervolt and get stock performance with lower power consumption.

    That kinda makes the Radeon VII a bad buy for $699 if it’s only delivering 25% more performance…
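    Rough math on that (a sketch, taking the $400 street price and AMD’s claimed ~25% uplift at face value):

    ```python
    # Perf-per-dollar comparison, assuming a $400 Vega 64 as the baseline
    # and taking AMD's claimed ~25% performance uplift at face value.

    vega64_price, radeon_vii_price = 400, 699
    radeon_vii_uplift = 1.25   # relative performance vs. Vega 64

    perf_per_dollar_ratio = radeon_vii_uplift / (radeon_vii_price / vega64_price)
    print(f"Radeon VII perf/$ relative to Vega 64: {perf_per_dollar_ratio:.2f}x")   # ~0.72x
    ```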

      • jihadjoe
      • 7 months ago

      [quote]while offering better performance in many cases.[/quote]
      [url=https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/33.html]lol[/url]

        • 1sh
        • 7 months ago

        I should have said some cases, if not very similar performance…
        [url]https://youtu.be/K5vbHxAFrmI[/url]

      • dragontamer5788
      • 7 months ago

      Hmm, I still consider Vega64 to be a workstation card, not really a gaming card (It can game, but it seems like NVidia ones are slightly better at it). For the few things that have good OpenCL support, Vega64 usually outperforms the RTX equivalent.

      The problem is that a lot of programs are CUDA-only. But there are plenty of programs (ex: Blender, [url=https://www.pugetsystems.com/pic_disp.php?id=50957&]Da Vinci[/url] ([url=https://www.pugetsystems.com/pic_disp.php?id=50960&width=800&height=800]Test Summary[/url])) where the Vega 64 keeps up with the 2070. The main issue is finding those programs; there are definitely more programs that work on CUDA.

      With that being said, Blender, Da Vinci Resolve, and Magix Vegas are the workstation programs I personally use, so the Vega 64 definitely makes sense for my use cases.

    • ronch
    • 7 months ago

    Just in time. With such power draw this is perfect for the crazy cold winter in the USA.

    Well done, AMD!!

    • ronch
    • 7 months ago

    Did I miss Radeon I to VI? Sorry I must’ve dozed off.

      • NTMBK
      • 7 months ago

      Episode VII: The Geforce Awakens

      • DragonDaddyBear
      • 7 months ago

      It’s Vega II (Roman numerals), or VII. So, it’s kinda punny, like Epyc.

      • derFunkenstein
      • 7 months ago

      This is more of a countdown, because the VII is for nanometers, I think.

        • chuckula
        • 7 months ago

        Gimme a Radeon III then.

          • Krogoth
          • 7 months ago

          UNLIMITED GLUE!

      • Voldenuit
      • 7 months ago

      Well, the Radeon 580 is DLXXX in Roman numerals. So if you get the XFX RX 580, you could get an…
      XFX RX DLXXX VIII GB.

        • Chrispy_
        • 7 months ago

          Not interested until they release the [i]XXX Black[/i] edition. I'd buy an XFX RX DLXXX VII XXX just to crash the checkout machines at Micro Center.

          • K-L-Waster
          • 7 months ago

          That product name is going to need an NSFW flag….

      • Jigar
      • 7 months ago

      It’s like Star Wars; we will go in reverse after some episodes.

        • Voldenuit
        • 7 months ago

        So, you’re saying, AMD will release a [i]rouge one[/i]?

          • Pwnstar
          • 7 months ago

          Isn’t red their color?

    • Platedslicer
    • 7 months ago

    Bit depressing that even with smaller transistors, HBM, higher power draw, and no ray tracing shenanigans, AMD isn’t even aiming at Nvidia’s top card. Performance is great but I don’t like sweating while I play, plus there’s that lingering durability concern. The one size fits all design really doesn’t help them to compete on the perf/W front, but with Nvidia’s poor RTX sales and a recession on the horizon, I can’t really see AMD putting the resources into changing that picture.

    I guess over the next few years at least NV is going to find themselves in a situation similar to Intel’s in AMD’s Bulldozer days – competing mainly with their last gen products. Heck, that’s what’s happening even now.

      • Krogoth
      • 7 months ago

      These cards are just Vega 20-based Instinct/FirePro rejects that ISV/enterprise customers didn’t want. They are placeholders until Navi comes around. They are going to end up being a collector’s item of “interesting technology that didn’t quite make it.”

        • Platedslicer
        • 7 months ago

        Isn’t that sort of what I said though?…

        [quote]The one size fits all design really doesn’t help them to compete on the perf/W front[/quote]

        AMD started in 2011 with GCN by going with a single architecture for server, prosumer, and consumer cards, and even then there was a marked difference in efficiency between the 7970 and 680. For a while now they’ve been reusing even the specific chips. It may make some sense from a value perspective for a company that can’t afford multiple designs for different markets, but it sure doesn’t make for a tight product.

        And Navi’s conspicuous absence from CES doesn’t really inspire much confidence that there will be changes in this scenario.

        • enixenigma
        • 7 months ago

        True, but it remains to be seen what AMD’s next high-end chip/architecture will look like. AMD has been playing catch-up to Nvidia on the efficiency front for years and, while I am hopeful, there’s no guarantee that they will be able to come close enough to compete at the high end in this next cycle. We don’t even know when their real high-end successor will see the light of day.

      • Action.de.Parsnip
      • 7 months ago

      It’s a die-shrunk Vega 56; what more are you expecting?

    • enixenigma
    • 7 months ago

    It’s Jeff!!! Hi Jeff!!!

    Anywho, I’m interested in seeing if this card will have similar undervolting chops to the V64. AMD really needs to stop blowing out voltage on their cards.

      • Spunjji
      • 7 months ago

      Agreed here. The problem is, if they clocked this back to a happier spot on its efficiency curve then they’d be left with something that sits in an odd spot between the 2070 and 2080, potentially leaving it unable to command the margins necessary for the product to make any sense in the first place. (I’m guessing about margins of course as we don’t know their yields, but the manufacturing costs for a GPU this size on a bleeding-edge process paired with 16GB of HBM2 can’t be pretty.)

      Personally I’m happy for them to play that game as long as they leave me the freedom to scale things back myself. 🙂

        • techguy
        • 7 months ago

        Except that the Radeon VII will fall precisely into the performance range you cite, with the clockspeeds it is shipping at. Lowering the clockspeed would place it outside that range.

        If you really think AMD’s cherry-picked benchmarks are representative of what you’re going to see on review day then I’ve got some cheap oceanside real estate for sale in Arizona, if you’re interested…

        • enixenigma
        • 7 months ago

        Anecdotal evidence, but I was able to cut down on power draw significantly at the same clocks on my reference RX 480 and V64 (both with aftermarket heatsinks, mind) via undervolting. Much of the research I did at the time suggested that this was not difficult to achieve with most cards. If AMD continues to push high volts to get more working chips, it’ll paint the card in a bad light.

          • DPete27
          • 7 months ago

          I was able to cut my RX480 power draw @ 1305MHz from 130W to 105W by whacking off ~50mV. That seems to be a pretty common offset they chose for Polaris. Seems they tested for the upper bound, then added 50mV to cover leaky chips. Their power/voltage algorithm (can’t remember the catch phrase they used) isn’t nearly as smart as they said it was.
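          Rough math on why 50 mV moves the needle that much (a sketch assuming the textbook dynamic-power relation P ∝ f·V² and a hypothetical ~1.15 V stock voltage for the top DPM state; the measured drop is bigger than the estimate because leakage and boost behavior improve too):

          ```python
          # Sketch of dynamic-power scaling for a 50 mV undervolt at a fixed 1305 MHz.
          # Assumes P_dyn scales with f * V^2 and a hypothetical 1.15 V stock voltage;
          # real-world savings are larger because leakage current also drops with voltage.

          stock_v = 1.15
          undervolt_v = stock_v - 0.050
          measured_stock_w = 130.0

          dynamic_ratio = (undervolt_v / stock_v) ** 2     # ~0.92
          predicted_w = measured_stock_w * dynamic_ratio   # ~119 W vs. ~105 W measured

          print(f"Predicted draw from V^2 scaling alone: {predicted_w:.0f} W (measured: ~105 W)")
          ```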

            • gerryg
            • 7 months ago

            So glad we can undervolt! Will help when I put it in my barely-cooled Pentium PC. And side benefit, it will last 20 or more years! 😉

        • DPete27
        • 7 months ago

        Rumor has it the HBM2 alone could cost as much as $350, with a total BoM as high as $650, though we don’t know exactly what volume discounts AMD is getting.

          • thx1138r
          • 7 months ago

          That rumor doesn’t really pass the smell test; you can’t buy HBM “alone,” as it were, since it’s part of the package.

            • K-L-Waster
            • 7 months ago

            Yes, but AMD needs to source it (I forget if they buy it from Hynix or Samsung…)

      • jihadjoe
      • 7 months ago

      Chips like [url=https://old.reddit.com/r/Amd/comments/aminfd/dont_know_if_i_should_keep_pushing_my_overclock/]what this guy got[/url] are the reason AMD needs to keep blowing out voltage.

      Sad thing is, when someone with a bad chip reports an experience contrary to the expected narrative (such as being unable to undervolt), the community of "AMD fans" immediately shoots them down and calls them idiots. This particular thread was heavily downvoted until it got more visibility.

        • enixenigma
        • 7 months ago

        That’s a shame, really. In my mind, such a chip shouldn’t have been sold.

        The whole point of blowing out voltage is so that AMD can sell chips with marginal viability such as the example here. I’m not saying that I have a clear or easy answer by any means, but AMD needs to find a way to increase yields so that they aren’t needing to push volts way high in order to produce an acceptable number of chips. I’m sure it’s easier said than done, though.

          • Anonymous Coward
          • 7 months ago

          I wonder why they don’t have a better system to bin the chips according to their potential. It wouldn’t even have to be a huge difference, sell in batches that have similar voltage requirements and clock limits, let the card makers handle the finer points of marketing them. If they have batches that run fine at lower voltage, they should make sure that’s how they are sold, kind of like automatic overclocking applied to efficiency.

            • Eversor
            • 7 months ago

            This may be an artifact of the mining craze. It is also possible they are binning them for Data Center applications and home users get the worst of the bunch.

            • Anonymous Coward
            • 7 months ago

            I’d like to see them (for example) use the 570’s (or future equivalent) as a dumping ground for the high-voltage-requirement 580’s to the extent that the lesser part defaulted to a higher voltage than the better part. (With factory overclocks probably undoing that for many actual products, but not all.)

    • Krogoth
    • 7 months ago

    This card will outpace Elmer’s glue production.

      • jihadjoe
      • 7 months ago

      There’s a Lake Elmer.
      Jim Keller is at Intel working on “integration”.
      Code name for Intel’s next chip confirmed!

      • chuckula
      • 7 months ago

      Nonsense!

      If Radeon VII doesn’t destroy the RTX 2080 then we can blame it all on the [i]lack of glue![/i] I mean, there’s no 14nm IO chip anywhere in this thing!

      • gmskking
      • 7 months ago

      Did you see the tear down? No glue.
