In the lab: AMD’s Radeon VII graphics card

AMD pulled some surprises out of its hat at CES this year, not just by showing off one of its impressively efficient Zen 2 CPUs but also by unveiling the first consumer graphics card made on a 7-nm process. The aptly named Radeon VII has the GeForce RTX 2080 in its sights, according to AMD's own benchmarks, and it's got plenty of big numbers to go with its big britches.

The Radeon VII boasts 3840 shader ALUs spread across 60 Vega compute units, 16 GB of HBM2 RAM running at 2 Gb/s per pin, and a 4096-bit-wide memory bus capable of delivering a theoretical terabyte per second of memory bandwidth. Those considerable resources come together to power AMD's long-awaited contender in the truly high-end, under-$1000 graphics market established by the GTX 1080 Ti. The move to TSMC's 7-nm FinFET process has allowed AMD to crank core clock speeds to a peak of 1800 MHz, too, compared to a peak of 1546 MHz on the RX Vega 64.
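
That terabyte-per-second figure falls straight out of the bus width and per-pin data rate. Here's a quick back-of-the-envelope check in Python, purely for illustration:

    # Back-of-the-envelope check of the quoted memory bandwidth
    bus_width_bits = 4096        # four HBM2 stacks at 1024 bits each
    per_pin_rate_gbps = 2.0      # 2 Gb/s per pin, as quoted above

    bandwidth_gb_per_s = bus_width_bits * per_pin_rate_gbps / 8   # bits -> bytes
    print(f"{bandwidth_gb_per_s:.0f} GB/s")   # 1024 GB/s -- the "theoretical terabyte per second"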

The Vega 20 GPU

The Radeon VII's performance potential isn't the only encouraging sign: CEO Lisa Su noted during her CES keynote that the card will deliver "25% higher performance at the same power." If we take that to mean "the same power as the RX Vega 64," it's no surprise that the company has opted for a triple-fan, open-style cooler rather than the harsh-sounding blower that shipped aboard the RX Vega duo. The Radeon VII also needs two eight-pin power connectors to do its thing, just like its RX Vega predecessors. The only partner cards we've seen announced for the Radeon VII so far are rebadges of this reference design, so it remains to be seen whether custom-cooled cards will eventually emerge.

Stay tuned for our full review of the Radeon VII. This card will launch Thursday, February 7 for $699.

Comments closed
    • UberGerbil
    • 4 years ago

    …BUT YOU SHOULD!!

    • UberGerbil
    • 4 years ago

    At least some of it comes from most of them having moved here from everywhere else in the past 8 or so years.

    • Action.de.Parsnip
    • 4 years ago

    Deleted

    • cynan
    • 4 years ago

    Do you think at least some of this variation comes from the social fragmentation derived from the town basically being comprised of a concentrated cluster of Seattleite communities?

    • thedosbox
    • 4 years ago

    The cat did not make an appearance 🙁

    • JustAnEngineer
    • 4 years ago

    It has already happened.

    • Srsly_Bro
    • 4 years ago

    I’m not even sure what that means. Seattle varies so much it could mean anything, even a compliment. Thanks, auxy.

    • ludi
    • 4 years ago

    They won’t pay you right away. Emigrants to Alaska have to go through some hoops to establish permanent residence first. Not personal experience, but so I am told by relatives who lived there at one point.

    • chuckula
    • 4 years ago

    Just move to Alaska!

    They’ll pay you (seriously, they will, check Wikipedia), the weather is just as bad as Canada….. AND YOU NEVER HAVE TO SAY YOU’RE SORRY!!

    • derFunkenstein
    • 4 years ago

    Shit i live in the wrong country

    • tipoo
    • 4 years ago

    Among other reasons, yes

    • derFunkenstein
    • 4 years ago

    Somebody pays you to be Canadian?

    • enixenigma
    • 4 years ago

    You are correct about the Instinct line. For whatever reason, I mixed this line up with the Radeon Pro in my head. Everything else you said only reinforces my point, however.

    • derFunkenstein
    • 4 years ago

    It just hasn’t been the same since Frasier Crane signed off.

    • derFunkenstein
    • 4 years ago

    I think he wants elaboration on what the location explains. IIRC, UberGerbil is near Seattle, too.

    • Anonymous Coward
    • 4 years ago

    I wonder if us westerners will become (as a whole) more robust against the low-brow trolling, click-baiting, attention-whoring and general display of degeneracy that the internet brings to each of our screens. Will the generation that grows up with all this be slaves to manipulation by businesses and governments?

    • K-L-Waster
    • 4 years ago

    Yes, but AMD needs to source it (I forget if they buy it from Hynix or Samsung…)

    • K-L-Waster
    • 4 years ago

    To paraphrase Genghis Khan: “It is not enough that I bought a good product, every other product must be seen as horrible by everyone on Earth.”

    • Anonymous Coward
    • 4 years ago

    The internet helps bring out our inner chimpanzee. MUST DESTROY THE OTHER GUYS. VERY IMPORTANT.

    • thx1138r
    • 4 years ago

    That rumor doesn’t really pass the smell test: you can’t buy HBM "alone," as it were – it’s part of the package.

    • ptsant
    • 4 years ago

    Radeon Instinct isn’t a "prosumer" chip; it’s a hardcore server product that doesn’t even come with a fan. Of course, it’s based on Vega, but all the improvements have gone into fast compute and DL, not running games.

    • Wonders
    • 4 years ago

    Seattle, Washington.

    • Anonymous Coward
    • 4 years ago

    I’d like to see them (for example) use the 570’s (or future equivalent) as a dumping ground for the high-voltage-requirement 580’s to the extent that the lesser part defaulted to a higher voltage than the better part. (With factory overclocks probably undoing that for many actual products, but not all.)

    • MOSFET
    • 4 years ago

    It’s just a choice in a market. That’s all it is. You’re correct in many ways – it’s not going to gain momentum [b<]over[/b<] Nvidia, it will be low volume, and it does not hide the absence of a high-end gaming card from AMD. On all three points, you are correct.

    I don’t understand your disdain for this product, though. Just don’t buy it. A casual market observer wouldn’t care either way, and would likely be happy for a product release that’s decidedly not low-end. An Nvidia fan really wouldn’t care either way – this is no harm, as you point out. An AMD fan should be happy there is a GPU product release at all, especially one that’s decidedly not low-end; it gives AMD something to sell to keep the company afloat. Therefore, I don’t understand your disdain.

    • ronch
    • 4 years ago

    How about they make a Ruby Edition? And a Fixer Edition? I miss those characters.

    • UberGerbil
    • 4 years ago

    Care to elaborate?

    • JustAnEngineer
    • 4 years ago

    Relying on that compass may be a problem if you’re looking for something near the opposite end of the globe.
    [url<]https://www.npr.org/2019/02/04/691471616/as-magnetic-north-pole-zooms-toward-siberia-scientists-update-world-magnetic-mod[/url<]

    • Krogoth
    • 4 years ago

    Technically, the southern celestial pole is within the Octans constellation and Sigma Octantis is the southern pole star.

    But all of the stars in the Octans constellation are too faint to see with the naked eye outside of ideal conditions. The much brighter Crux constellation is a practical tool for navigation, since its long axis points fairly close to true south.

    • Mr Bill
    • 4 years ago

    You can use the [url=https://earthsky.org/favorite-star-patterns/how-to-use-southern-cross-to-find-south-celestial-pole<]Southern Cross and a compass to find the South Celestial Pole[/url<]. But no useful star is nearby. Apparently you can also use the [url=https://en.wikipedia.org/wiki/Crux<]Crux and your right fist[/url<] to find the south celestial pole. It’s a long time until there is a southern pole star. [url=https://earthsky.org/tonight/sirius-future-south-pole-star<]Delta Velorum and Sirius[/url<] will become south pole stars in the years 9250 and 66270, respectively. Hmmm, Radeon Crux, it could be a contender.

    • Eversor
    • 4 years ago

    This may be an artifact of the mining craze. It is also possible they are binning them for Data Center applications and home users get the worst of the bunch.

    • gmskking
    • 4 years ago

    Did you see the tear down? No glue.

    • enixenigma
    • 4 years ago

    Agreed. I’d expect Navi to compete in the 2060/2070 range.

    • NTMBK
    • 4 years ago

    Not as fast as the 2080

    • Anonymous Coward
    • 4 years ago

    I wonder why they don’t have a better system to bin the chips according to their potential. It wouldn’t even have to be a huge difference, sell in batches that have similar voltage requirements and clock limits, let the card makers handle the finer points of marketing them. If they have batches that run fine at lower voltage, they should make sure that’s how they are sold, kind of like automatic overclocking applied to efficiency.
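
    A toy sketch of the kind of voltage binning being described here, with entirely made-up chip data and thresholds (nothing below reflects AMD's actual test flow):

        # Toy illustration: group chips by the minimum stable voltage they need
        # at a target clock, then ship each bin with an appropriate default voltage.
        # All values are invented for the sake of the example.
        chips = [
            {"id": 1, "vmin_at_1800mhz": 0.98},
            {"id": 2, "vmin_at_1800mhz": 1.07},
            {"id": 3, "vmin_at_1800mhz": 1.02},
            {"id": 4, "vmin_at_1800mhz": 1.12},
        ]

        bins = {"efficient": [], "standard": [], "leaky": []}
        for chip in chips:
            v = chip["vmin_at_1800mhz"]
            if v <= 1.00:
                bins["efficient"].append(chip["id"])   # could ship at a lower default voltage
            elif v <= 1.08:
                bins["standard"].append(chip["id"])
            else:
                bins["leaky"].append(chip["id"])       # needs the conservative high default

        print(bins)   # {'efficient': [1], 'standard': [2, 3], 'leaky': [4]}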

    • sweatshopking
    • 4 years ago

    WILL THIS MAKE MY ITUNES FASTER

    • psuedonymous
    • 4 years ago

    I’d expect Navi to be the midrange counterpart to Vega 20, as Polaris was to Vega and Tonga/Hawaii was to Fury/Fiji.

    • tipoo
    • 4 years ago

    Waiting for the next generation they teased instead

    [url<]https://i.redd.it/06b7nu7s7re21.png[/url<]

    • enixenigma
    • 4 years ago

    That’s a shame, really. In my mind, such a chip shouldn’t have been sold.

    The whole point of blowing out voltage is so that AMD can sell chips with marginal viability such as the example here. I’m not saying that I have a clear or easy answer by any means, but AMD needs to find a way to increase yields so that they aren’t needing to push volts way high in order to produce an acceptable number of chips. I’m sure it’s easier said than done, though.

    • tipoo
    • 4 years ago

    As a professional Canadian…

    …No.

    • Gadoran
    • 4 years ago

    If you really think this product is the right choice for AMD to finally gain some momentum over Nvidia, then when it comes to levels of ignorance it’s a close battle between us.
    This low-volume card makes no sense, and it can’t hide the absence of a real AMD offering at the high end.

    • K-L-Waster
    • 4 years ago

    Dynamically Generated Vapour Chamber Cooling (DGVCC) — only on Vega VII!!

    • K-L-Waster
    • 4 years ago

    That product name is going to need an NSFW flag….

    • Chrispy_
    • 4 years ago

    Not interested until they release the [i<]XXX Black[/i<] edition. I'd buy an XFX RX DLXXX VII XXX just to crash the checkout machines at Micro Center.

    • Pwnstar
    • 4 years ago

    Isn’t red their color?

    • auxy
    • 4 years ago

    You’re in SEATTLE? That explains so much. ( *´艸｀)

    • Voldenuit
    • 4 years ago

    So, you’re saying, AMD will release a [i<]rouge one[/i<]?

    • Jigar
    • 4 years ago

    Srsly_bro ? Graphic card is not Shovel. 😛

    • Jigar
    • 4 years ago

    It’s like Star Wars; we’ll go in reverse after some episodes.

    • DPete27
    • 4 years ago

    Love it!

    • jihadjoe
    • 4 years ago

    Chips like [url=https://old.reddit.com/r/Amd/comments/aminfd/dont_know_if_i_should_keep_pushing_my_overclock/<]what this guy got[/url<] are the reason AMD needs to keep blowing out voltage. Sad thing is when someone with a bad chip reports an experience contrary to the expected narrative (such as that they're unable to undervolt) the community of "AMD fans" immediately shoot them down and call them idiots. This particular thread was heavily downvoted until it got more visibility.

    • gerryg
    • 4 years ago

    So glad we can undervolt! Will help when I put it in my barely-cooled Pentium PC. And side benefit, it will last 20 or more years! 😉

    • Voldenuit
    • 4 years ago

    Well, the Radeon 580 is DLXXX in Roman numerals. So if you get the XFX RX 580, you could get an…
    XFX RX DLXXX VIII GB.

    • Srsly_Bro
    • 4 years ago

    Drwho…. ohw[ei]rd[o] LOL

    • drwho
    • 4 years ago

    ivan ….Navi LOL

    • willyolioleo
    • 4 years ago

    i expect the 2019 Navi will be midrange, probably close to identical to whatever they’re releasing for the next-gen consoles.

    A higher-end Navi probably won’t arrive until 2020.

    • Wirko
    • 4 years ago

    Poor Australians, will it ever be their turn?

    • Wirko
    • 4 years ago

    That’s VA glow. IPS glow is red. AMD used both shades of black this time.

    • K-L-Waster
    • 4 years ago

    Will this make the upcoming GPU 4000 years ahead of its time?

    • K-L-Waster
    • 4 years ago

    Sheesh, at least wait for the review 😛

    • Mr Bill
    • 4 years ago

    Navi is named after Gamma Cephei – also known as Errai. It was nicknamed Navi by astronaut Virgil Ivan β€œGus” Grissom because they were using it as a navigational point and also a play on words because Ivan can be rearranged to Navi. It will be the closest star to the pole in about 4000 years.

    • thedosbox
    • 4 years ago

    I’m disappointed in the lack of cat if this turns out to be his final TR appearance.

    • Growler
    • 4 years ago

    What if you used a Matrox card for 2D and two Voodoo2s in SLI? Then you get the best of both worlds.

    • Inkling
    • 4 years ago

    Thing is, you know all that scripting Bruno was working on for reviewing GPUs and CPUs? We told Jeff that it was to automate some of the benchmarking, but it was actually to automate and replace Jeff. Once he had produced a critical mass of review content, we were able to build AI to replicate him.

    It’s so good that if I hadn’t told you, you would never have known that it wasn’t actually Jeff doing these articles. But since TR is always honest with our readers, I felt like we had to inform you.

    In honor of Jeff’s years of hard work, we’ll continue to produce these under his name.

    😉

    • K-L-Waster
    • 4 years ago

    Yes, but the IPS glow makes it look light grey…

    • tipoo
    • 4 years ago

    I will be interested in seeing how much difference 1TB/s makes when it has the same ROP count and architecture as the Vega 64. Might be more for the baby compute card side than anything.

    • tipoo
    • 4 years ago

    Sick prank bro!

    • DPete27
    • 4 years ago

    I was able to cut my RX480 power draw @ 1305MHz from 130W to 105W by whacking off ~50mV. That seems to be a pretty common offset they chose for Polaris. Seems they tested for the upper bound, then added 50mV to cover leaky chips. Their power/voltage algorithm (can’t remember the catch phrase they used) isn’t nearly as smart as they said it was.
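
    For context, switching power scales roughly with f·V², so even a 50 mV drop is worth real watts. A rough sketch, where the 1.15 V baseline is an assumed Polaris-like figure rather than anything from the numbers above:

        # First-order estimate of savings from a 50 mV undervolt at constant clocks.
        # Dynamic power ~ C * f * V^2; only V changes here.
        # The 1.15 V stock voltage is an assumption, not measured data.
        v_stock = 1.150            # volts (assumed)
        v_under = 1.100            # volts, i.e. roughly -50 mV
        p_stock = 130.0            # watts, as reported above

        p_dynamic_est = p_stock * (v_under / v_stock) ** 2
        print(f"~{p_dynamic_est:.0f} W")   # ~119 W from switching power alone; the reported
                                           # 105 W suggests leakage and fan power drop as well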

    • NTMBK
    • 4 years ago

    Looks like he didn’t get the Intel CEO gig

    • DPete27
    • 4 years ago

    Rumor has it, the HBM2 could cost as much as $350 alone. Total BoM as high as $650. Not knowing exactly what volume discounts AMD is getting though.

    • DPete27
    • 4 years ago

    Jeff is back!?
    Is this a side gig Jeff?

    • tipoo
    • 4 years ago

    Word seems to be that Navi will be launching as a mid range part, more bringing 1080 performance to under 300 dollars than replacing the VII.

    • chuckula
    • 4 years ago

    Matr[s<]i[/s<][u<]o[/u<]x would still win: The scoreboard is in 2D.

    • dragontamer5788
    • 4 years ago

    It seems incredibly unlikely that AMD will release Navi at the same performance point as Radeon VII.

    Seems like Navi will be a Vega64 replacement (and below).

    • Stochastic
    • 4 years ago

    Last I heard AMD is still targeting 2019 for the launch of Navi. That kills any excitement I might have for the Radeon VII.

    • alloyD
    • 4 years ago

    Wouldn’t be much of a game. 3dFX would use their Voodoo to Glide right into a victory.

    • Action.de.Parsnip
    • 4 years ago

    It’s a die shrunk vega 56, what more are you expecting

    • Mr Bill
    • 4 years ago

    I knew that Polaris and Vega are named after current and future pole stars. But I just realized that the Phenom (Thuban ([url=https://starchild.gsfc.nasa.gov/docs/StarChild/questions/question64.html<]Alpha Draconis[/url<]) core) is also a former pole star (circa 3000BC). AMD has been using star names for longer than I realized.

    • K-L-Waster
    • 4 years ago

    Someone will soon post to boycott [s<]the NFL[/s<] GPUs and [s<]watch NCAA[/s<] get an XBOX any moment now....

    • Krogoth
    • 4 years ago

    UNLIMITED GLUE!

    • Thrashdog
    • 4 years ago

    We really wanted that [s<]Saints[/s<]Matrox vs [s<]Chiefs[/s<]3dFX matchup, but this is what we get instead, huh?

    • chuckula
    • 4 years ago

    The 128-ROP figure was an error on AnandTech’s part, and they have issued a correction that it’s 64, like everyone expected.

    • Voldenuit
    • 4 years ago

    Can we confirm that it has 64 ROPs?

    Early rumors on the card quoted 128 ROPs, which was taken by many people to be an error in reporting.

    • enixenigma
    • 4 years ago

    Anecdotal evidence, but I was able to cut down on power draw significantly at the same clocks on my reference RX 480 and V64 (both with aftermarket heatsinks, mind) via undervolting. Much of the research I did at the time suggested that this was not difficult to achieve with most cards. If AMD continues to push high volts to get more working chips, it’ll paint the card in a bad light.

    • derFunkenstein
    • 4 years ago

    my goodness I wrote the exact opposite of what I meant.

    • chuckula
    • 4 years ago

    Why is it that I’m afraid this review will end up like the Super Bowl?

    Everyone outside of New England hates Ngreedia, but they still manage to "win" (kinda sorta) over the younger Rams, even though you’re left thinking both teams deserve to lose by the time the whole thing is over?

    • chuckula
    • 4 years ago

    Gimme a Radeon III then.

    • derFunkenstein
    • 4 years ago

    This is more of a countdown, because the VII is for nanometers, I think.

    • ermo
    • 4 years ago

    “Does it come in black?”

    • DragonDaddyBear
    • 4 years ago

    It’s Vega II (Roman numerals), or VII. So, it’s kinda punny, like Epyc.

    • derFunkenstein
    • 4 years ago

    85W less power and, you know, much higher performance. Probably.

    • NTMBK
    • 4 years ago

    It’s a 7nm card fighting a 12nm card, with the (ostensibly power-saving) HBM2 against GDDR6, and it [i<]still[/i<] comes out far less efficient.

    • sreams
    • 4 years ago

    Maybe in three… But not in every dimension.

    • Chrispy_
    • 4 years ago

    AMD didn’t specify which version of the Vega64 they were comparing power consumption to.

    Vega64 Air = 295W
    Vega64 LC = 345W

    Two 8-pin connectors mean a peak draw of 375W is permitted for the Radeon VII; the real question is why it needs such a massive cooler.
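
    For anyone wondering where that 375W ceiling comes from, it’s just the PCIe power-delivery limits added together; a trivial sketch:

        # PCIe power budget: 75 W from the x16 slot plus 150 W per 8-pin connector
        slot_w = 75
        eight_pin_w = 150
        connectors = 2

        board_limit_w = slot_w + connectors * eight_pin_w
        print(board_limit_w, "W")   # 375 W -- the spec ceiling referenced above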

    • RAGEPRO
    • 4 years ago

    Much more power than a 2080, yeah. 2080 Ti? Nah. Should be around the same methinks. Remember, this really is a next-gen manufacturing process. Even taking into account the relative inefficiency of GCN compared to Pascal and Turing, it has a big advantage there.

    • techguy
    • 4 years ago

    Except that the Radeon VII will fall precisely into the performance range you cite, with the clockspeeds it is shipping at. Lowering the clockspeed would place it outside that range.

    If you really think AMD’s cherry-picked benchmarks are representative of what you’re going to see on review day then I’ve got some cheap oceanside real estate for sale in Arizona, if you’re interested…

    • gru5354
    • 4 years ago

    To all those that have been having a nice day ridiculing the R VII’s 300W TDP – 2080 has 215W.
    That 85W higher wattage with twice the memory that happens to need _some_ wattage is truly laughable <facepalm/>

    • 1sh
    • 4 years ago

    I should have said some cases if not very similar performance…
    [url<]https://youtu.be/K5vbHxAFrmI[/url<]

    • Chrispy_
    • 4 years ago

    I’m looking forward to this.

    The triple-fan solution worries me somewhat, though. AMD haven’t crammed in any more transistors with the 7nm die-shrink, so they have just increased clocks. The real question is whether they have ruined power consumption in their chase to match or beat a 2080.

    • Srsly_Bro
    • 4 years ago

    And with the heat generated it will evaporate the water?

    • maroon1
    • 4 years ago

    Predictions:

    It is going to be slightly slower than RTX 2080 on average and consume much more power than RTX 2080 Ti

    • NTMBK
    • 4 years ago

    Episode VII: The Geforce Awakens

    • Spunjji
    • 4 years ago

    Agreed here. The problem is, if they clocked this back to a happier spot on its efficiency curve then they’d be left with something that sits in an odd spot between the 2070 and 2080, potentially leaving it unable to command the margins necessary for the product to make any sense in the first place. (I’m guessing about margins of course as we don’t know their yields, but the manufacturing costs for a GPU this size on a bleeding-edge process paired with 16GB of HBM2 can’t be pretty.)

    Personally I’m happy for them to play that game as long as they leave me the freedom to scale things back myself. 🙂

    • Waco
    • 4 years ago

    Is the review embargo going to lift prior to them going up for sale (hopefully you can say)?

    I’m on the fence since I need a new GPU anyway, but there’s essentially zero chance I’ll buy one on launch day without at least a little trusted review sauce prior.

    • dragontamer5788
    • 4 years ago

    Hmm, I still consider Vega64 to be a workstation card, not really a gaming card (It can game, but it seems like NVidia ones are slightly better at it). For the few things that have good OpenCL support, Vega64 usually outperforms the RTX equivalent.

    The problem is that a lot of programs are CUDA-only. But there are plenty of programs (ex: Blender, [url=https://www.pugetsystems.com/pic_disp.php?id=50957&<]Da Vinci[/url<] ([url=https://www.pugetsystems.com/pic_disp.php?id=50960&width=800&height=800<]Test Summary[/url<])) where the Vega64 keeps up with the 2070. The main issue is finding those programs. There are definitely more programs that work on CUDA. With that being said, Blender, Da Vinci Resolve, and Magix Vegas are the workstation programs I personally use, so the Vega64 definitely makes sense for my use cases.

    • chuckula
    • 4 years ago

    With the triple fan cooler, you can blow the snow away in every dimension.

    • chuckula
    • 4 years ago

    Nonsense!

    If Radeon VII doesn’t destroy the RTX 2080 then we can blame it all on the [i<]lack of glue![/i<] I mean, there's no 14nm IO chip anywhere in this thing!

    • jihadjoe
    • 4 years ago

    There’s a Lake Elmer.
    Jim Keller is at Intel working on “integration”.
    Code name for Intel’s next chip confirmed!

    • jihadjoe
    • 4 years ago

    [quote<]while offering better performance in many cases. [/quote<] [url=https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/33.html<]lol[/url<]

    • K-L-Waster
    • 4 years ago

    At first glance I thought it had no video outs. Took me a moment to see them huddled close to the PCB on the first pic.

    Other than that, well, we’ll see what the review has to say.

    • enixenigma
    • 4 years ago

    Rebranded prosumer chip (MI50) versus chip designed from the ground up for gaming. Shouldn’t be a surprise that one is better than the other for gaming.

    • Srsly_Bro
    • 4 years ago

    If you paid attention to the news you’d know. If you are grossly ignorant and just want to be in disbelief and argue, keep doing you, bro.

    • Srsly_Bro
    • 4 years ago

    We just got a few inches of snow in Seattle. Can I shovel snow with this?

    • enixenigma
    • 4 years ago

    True, but it remains to be seen what AMD’s next high-end chip/architecture will look like. AMD has been playing catch-up to Nvidia on the efficiency front for years and, while I am hopeful, there’s no guarantee that they will be able to come close enough to compete at the high end in this next cycle. We don’t even know when their real high-end successor will see the light of day.

    • Gadoran
    • 4 years ago

    … IMPRESSIVELY EFFICIENT ZEN 2… nice, but too bad nobody has tested one of these yet. So where is the proof?
    Looking at this card, I don’t see many advantages in shifting to 7 nm. "Half the power at the same clock speed" is a bold claim; in reality they have only raised the clock by about 15% at the same power, with higher manufacturing costs.
    Nvidia does a lot better by radically changing the architecture on a less expensive node, with far better results.

    • 1sh
    • 4 years ago

    Vega 64 is a pretty solid buy for $400: $50 more than the RTX 2060 and $100 cheaper than the RTX 2070, while offering better performance in many cases.
    Of course you sacrifice heat and power consumption, but you can undervolt and get the same stock performance with lower power draw.

    That kinda makes the Radeon VII a bad buy for $699 if it’s only delivering 25% more performance…

    • Platedslicer
    • 4 years ago

    Isn’t that sort of what I said though?…

    [quote<]The one size fits all design really doesn't help them to compete on the perf/W front[/quote<] AMD started in 2011 with GCN by going with a single architecture for server, prosumer and consumer cards, and even then there was a marked difference in efficiency between the 7970 and 680. For a while now they've been reusing even the specific chips. It may make some sense from a value perspective for a company that can't afford multiple designs for different markets, but it sure doesn't make for a tight product. And Navi's conspicuous absence from CES doesn't really inspire much confidence that there will be changes in this scenario.

    • ronch
    • 4 years ago

    Just in time. With such power draw this is perfect for the crazy cold winter in the USA.

    Well done, AMD!!

    • ronch
    • 4 years ago

    Did I miss Radeon I to VI? Sorry I must’ve dozed off.

    • Krogoth
    • 4 years ago

    These cards are just Vega20-based Instinct/FirePro rejects that ISV/enterprise customers didn’t want. They are placeholders until Navi comes around. They are going to end up being collector’s items of "interesting technology that didn’t quite make it."

    • Platedslicer
    • 4 years ago

    Bit depressing that even with smaller transistors, HBM, higher power draw, and no ray tracing shenanigans, AMD isn’t even aiming at Nvidia’s top card. Performance is great but I don’t like sweating while I play, plus there’s that lingering durability concern. The one size fits all design really doesn’t help them to compete on the perf/W front, but with Nvidia’s poor RTX sales and a recession on the horizon, I can’t really see AMD putting the resources into changing that picture.

    I guess over the next few years at least NV is going to find themselves in a situation similar to Intel’s in AMD’s Bulldozer days – competing mainly with their last gen products. Heck, that’s what’s happening even now.

    • enixenigma
    • 4 years ago

    It’s Jeff!!! Hi Jeff!!!

    Anywho, I’m interested in seeing if this card will have similar undervolting chops to the V64. AMD really needs to stop blowing out voltage on their cards.

    • Krogoth
    • 4 years ago

    This card will outpace Elmer’s glue production.
