AMD announces the Radeon VII graphics card: $699, February launch

Even though the company will celebrate its 50th anniversary this year, AMD's keynote today was its first ever at CES. During the show, AMD CEO Lisa Su announced the next high-performance Radeon graphics card: the Radeon VII.

AMD CEO Lisa Su holding a Radeon VII card.

As you could probably surmise from the name, the Radeon VII is manufactured on TSMC's 7-nm process. Dr. Su referred to the chip aboard the Radeon VII specifically as a "second-generation Vega" part. While we don't have explicit details of the Radeon VII's core configuration yet, Su said it has 60 compute units (giving it 3840 shader ALUs) running at a nominal frequency of 1.8 GHz. That GPU is hooked up to 16 GB of HBM2 memory that, according to AMD's announcement, gives the card 1 TB/sec of memory bandwidth. Doing the math, that means it probably has four 4-GB stacks of HBM2 memory running at 2 GT/sec.
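For the curious, here is a minimal sanity check of that math, assuming the standard 1024-bit HBM2 interface per stack (the stack count and data rate are our inference, not AMD-confirmed figures):

```python
# Back-of-the-envelope check of the quoted 1-TB/s figure.
# Assumes four HBM2 stacks, each with the standard 1024-bit (128-byte) interface.
stacks = 4                      # four 4-GB stacks = 16 GB total (inferred)
bytes_per_transfer = 1024 // 8  # 128 bytes per stack per transfer
data_rate_gt_s = 2.0            # 2 GT/s effective data rate (inferred)

bandwidth_gb_s = stacks * bytes_per_transfer * data_rate_gt_s
print(bandwidth_gb_s)           # 1024 GB/s, i.e. roughly the advertised 1 TB/s
```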

Among the slides that AMD showed behind Dr. Su were a few comparisons of the card's performance to the RX Vega 64 and to Nvidia's GeForce RTX 2080. In DirectX titles Battlefield V and Far Cry 5, AMD claims competitive performance against the green team's card, while in the Vulkan-powered Strange Brigade, the Radeon VII decisively pulls ahead.

AMD presented the first ever live gameplay demo of Capcom's Devil May Cry 5, running on a Radeon VII.

AMD showed off the card playing an early build of the upcoming action title Devil May Cry 5. Su said that the game was running at 4K (3840×2160) resolution with "max details," and the FRAPS counter onscreen clearly showed the title maintaining well over 60 FPS, even peaking above 100 FPS. Devil May Cry 5 runs on Capcom's in-house "RE Engine," the same engine used for Resident Evil 7 and the Resident Evil 2 remake; based on those games' hardware requirements, the game should be quite demanding. Unfortunately, there wasn't much going on gameplay-wise in the demo area.

You can't see it here, but the frame-rate counter in the top-right says 117 FPS.

AMD also used the Radeon VII as part of its demo for the third-generation desktop Ryzen parts, showing Forza Horizon 4 running consistently at over 100 FPS, again with max details. That demo was only running in 1920×1080 resolution, though.

The configuration and stated capabilities of the card give us the impression that it's likely to be a close relative of the Radeon Instinct MI50. This is purely speculation on our part, but we wouldn't be surprised if the Radeon VII supports PCI Express 4.0, given both that the Instinct MI50 does, and also that the Zen 2 and Epyc 2 processors that were also announced today will have PCIe 4.0 support. AMD didn't comment on any specific capabilities of the new card or its onboard GPU, so we don't know what—if any—of the Instinct MI50's high-performance compute capabilities the Radeon VII may retain.

AMD's fastest GPU ever will be available on February 7, both from the usual e-tailers and, apparently, directly from AMD. Better crack open the piggy bank, though, because single-card 4K AAA gaming doesn't come cheap: AMD expects the card to go for $699.

Comments closed
    • Voldenuit
    • 9 months ago

    Hm… according to Extremetech, the 128-ROP figure is a misreporting, and Vega VII has only 64 ROPs.

    [url<]https://www.extremetech.com/gaming/283649-the-amd-radeon-viis-core-configuration-has-been-misreported[/url<]

      • tipoo
      • 9 months ago

      Yeah dunno how that got that far, when MI50 has 64.

    • Phartindust
    • 9 months ago

    So if 60 CU = 2080 perf, then what will the full fat 64 do?

    • Bensam123
    • 9 months ago

    Yikes, the comments here. AMD gets kudos for actually making use of inventory they hadn't planned on selling to consumers after Nvidia shit the bed with RTX. Nvidia had such a huge hard-on for ray tracing, they basically didn't care about performance in normal conditions, so now AMD can capitalize on that. And as could be predicted, people with RTX cards turn ray tracing off for higher performance, smoother gameplay, and better frame rates (such as in BFV). It's so close to what we already have today it doesn't really matter. This card was never designed to be a consumer card; it is one because they can, because Nvidia gave them an opportunity to do so. This current release was supposed to be skipped.

    Anyway, there is a curious amount of hate here for 'it's too late to the party,' and that's the sole reason people shouldn't buy it. If it offers the same performance as a 1080 Ti or a 2080, why shouldn't people buy it if they're shopping around today? Assuming people are still buying video cards, it doesn't matter when it's released, as people are buying 'now'. More VRAM. Worse efficiency (which doesn't matter for a desktop).

    Better in some games, worse in others. Why does it need to be cheaper too? Why do people believe AMD should always sell something cheaper than Nvidia when it offers basically identical performance (assuming this card does, based off their benchmarks)? Also, MSRP for this card is $700, MSRP for a 2080 is $800. Yes you can get them cheaper now; you'll be able to get these cheaper as well.

    This is a pointless comment considering the sentiment here, but holy cow the comments section here has gotten crazy in recent years.

      • JustAnEngineer
      • 9 months ago

      Bless his heart. I actually feel pity for someone who dedicates so much of their life to spreading hate for a single company.

      • chuckula
      • 9 months ago

      The irony is that while the RTX cards are clearly an expensive beta-test for ray tracing, AMD is trying to do the exact same thing; they just aren't ready yet.

      So when AMD finally gets around to having a ray tracing solution we’ll be hearing a completely different story from you at that time.

      It’s fine to say that Ray Tracing is annoying and expensive right now (and maybe for a long time to come).

      It's shilling to say that Ray Tracing is annoying and expensive… only because Nvidia did it first, and then when AMD gets its version out, act as if AMD invented it and deserves an award.

      To wit:
      [quote<]AMD has ray tracing GPUs in development, too

      Nvidia is pushing its RTX ray tracing technology hard, both in terms of desktop graphics cards as well as the introduction of its first mobile RTX GPUs here at CES. But Su said that AMD also has its own ray tracing technology in development, though she was cagey when asked about details. Does AMD have an RTX 2070 competitor waiting in the wings?

      "I think ray tracing is an important technology, and it's something we're working on as well, both from a hardware and software standpoint," Su said. "The most important thing, and that's why we talk so much about the development community, is technology for technology's sake is okay, but technology done together with partners who are fully engaged is really important."[/quote<]

      [url<]https://www.pcworld.com/article/3332205/amd/amd-ceo-lisa-su-interview-ryzen-raytracing-radeon.html[/url<]

      That's B.S. corporate speak for: We're happy to call Nvidia's Ray Tracing crap because ours isn't ready yet. But if Nvidia was *really* dumb to do it we wouldn't be going on about how we're going to do it too... eventually.

      • K-L-Waster
      • 9 months ago

      You can actually thank AdoredTV for the nature of the comments — he had the AMD fanbois absolutely convinced this card was going to cost $300.

        • DoomGuy64
        • 9 months ago

        Citation needed. Adored usually has good sources, and this seems like an attack on his channel without evidence. Most likely, if anything was referenced to cost $300, it was Navi.

        I haven’t been following the Vega rumors much, but from what I did see, most people thought it was never coming out. Also, Vega VII isn’t a mid-range product, and we have yet to see a cheaper version, which could be brought down in price. That said, I don’t see why AMD would bother if Vega can’t use GDDR, or 8GB HBM is still too expensive. They could release a cheaper 56 card with 8GB, but there haven’t been any leaks on that afaik.

          • K-L-Waster
          • 9 months ago

          Here ya go…

          [url<]https://videocardz.com/79253/adoredtv-amd-to-introduce-radeon-rx-3000-series[/url<]

          Edit: besides, weren't you the one treating a $699 price as the depths of evil in this very article's comments?

            • chuckula
            • 9 months ago

            AdoredTV: Showing the true power of YouTube to spread misinformation to the masses.

            Oh, but in this one video he made AFTER the CES announcement he didn’t verbatim repeat the BS he spewed before the announcement… So he’s completely accurate because he agrees with the emotional inclinations of the AMD crowd!!

            • DoomGuy64
            • 9 months ago

            That's clearly Navi; it shows right in the screenshot. I don't know where you're getting Vega.

            • K-L-Waster
            • 9 months ago

            I’m getting “next card from AMD” — which is pretty much what everyone’s reacting to. They saw that, and expected that performance and that price to be what was announced at CES.

            And then when it wasn’t, the complainapocalypse started.

            • DoomGuy64
            • 9 months ago

            If you’re talking about the vibe AMD was projecting, don’t bring in a 3rd party like AdoredTV and blatantly lie about him.

            Otherwise, AMD pretty much dug their own grave on this one, but there are a number of things one can infer from this launch.

            *Vega 2 performs better than AMD expected, and nvidia’s pricing made it possible to bring out.
            *Navi is either delayed, or AMD doesn’t want to talk about it.
            *Vega 2 probably performs better than Navi, or there’s something related going on.

            So, while AMD might be happy about Vega2, the average consumer is not, because we were misled. This all stems from AMD’s lack of communication of what they are doing. First, Vega 2 wasn’t coming out, and now it is. So what’s going on?

            Nobody knows. Except here’s Radeon VII for $699. Great for the PCMR tards that don’t think a company exists without a halo product, but everyone else who buys mid-range is like WTF. It’s AMD’s Diablo mobile moment.

            Overall, the Halo people are to blame for Radeon VII. Maybe one of these days I'll start my own review site and deliberately never review any Halo products, because that's what needs to happen for mid-range to be taken seriously.

            • DoomGuy64
            • 9 months ago

            Yup. Navi was delayed. Guess who’s leaking it?
            [url<]https://www.youtube.com/watch?v=c8EONokJTdU[/url<]

    • anotherengineer
    • 9 months ago

    Is Chuckula actually Jensen?

    [url<]https://www.techpowerup.com/251400/nvidia-ceo-jensen-huang-on-radeon-vii-underwhelming-the-performance-is-lousy-freesync-doesnt-work[/url<]

      • chuckula
      • 9 months ago

      No you shill.
      I’m actually Raj.
      Intel shill remember???

        • anotherengineer
        • 9 months ago

        lol shill

        disclaimer
        I do not work in tech industry
        I do not own a single stock or share in any company
        if they all (intel, amd, nvidia) bankrupt tomorrow………….meh

          • chuckula
          • 9 months ago

          Disclaimer: The most recent set of stocks I owned in any of these companies was AMD, which I was smart enough to sell when it was still over $30/share and made a nice profit.

          Thanks AMD.

      • Krogoth
      • 9 months ago

      I think Jensen is bitter that he didn’t call AMD’s bluff and unlocked his G-Sync card too early in the game. He thought that AMD would tease and announce Navi which would threaten Nvidia’s current stance in the mid-range and value markets.

    • Aranarth
    • 9 months ago

    From my point of view this is a good thing…
    Sure it does not leapfrog nvidia but that is not the point.

    The point is they have revved up old tech on a new process and it is close to Nvidia's top end.

    If they pull the same thing they did with Ryzen in the CPU arena again in the GPU arena Nvidia should be worried.

    The way I see it, AMD went from miles behind Intel, to 20% behind, to almost equal, to (maybe) 20% ahead with Ryzen 1, 2, and then 3. We just have to wait for benchmarks.

    Now with Vega they were behind; with VII they are sort of even or slightly behind (pending benchmarks); the next major redesign should be the leapfrog.

    Grab a beer and sit back and enjoy the show. This race is far from over folks and AMD just got started firing on all cylinders… Whether they crash and burn at the next turn or execute with aplomb remains to be seen. Either way, this is gonna be FUN!

      • enixenigma
      • 9 months ago

      Good point. They are playing catch up, similar to how they are with Intel. AMD has definitely made great strides on the CPU front. They may not compete at the top end, but they are at least in the picture with a die shrink of Vega. Hopefully they can significantly improve power efficiency with Navi and beyond, as that is where they are really struggling.

      • Pancake
      • 9 months ago

      Of course, everyone seems to be conveniently forgetting the fact NVidia also has access to 7nm fabs. Their next generation of products will also have a big performance/power consumption improvement.

      Given that NVidia's manufacturing orders are very much larger (many times over) than AMD's, and that Apple and all the other phone manufacturers are competing for the same capacity, the interesting thing to watch will be whether AMD can keep up with the big dogs and not be supply constrained.

        • Voldenuit
        • 9 months ago

        The die size reduction from 14 nm to 7 nm for Vega has been less than I expected (495 mm^2 -> 331 mm^2).

        I’ve also heard that the cost per transistor of 7nm is higher than on 14 nm, which may explain the cut-down to 60 CUs for yields. Obviously, the cost will go down with time, but sometimes it’s not great to be the first to a new node.
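        For context, here is a rough sketch of that comparison, treating the node names as if they were literal feature sizes (they are not, and Vega 20 also adds hardware Vega 10 lacks, so this is purely illustrative):

        ```python
        # Observed Vega die shrink vs. naive node-name scaling.
        # "14 nm" and "7 nm" are marketing labels, so the naive figure is only a reference point.
        vega10_area = 495.0   # mm^2, Vega 10 on GloFo 14 nm
        vega20_area = 331.0   # mm^2, Vega 20 on TSMC 7 nm

        observed = 1 - vega20_area / vega10_area   # ~0.33, a ~33% smaller die
        naive = 1 - (7 / 14) ** 2                  # 0.75 if node names scaled literally

        print(f"observed: {observed:.0%}, naive scaling: {naive:.0%}")
        ```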

    • albundy
    • 9 months ago

    wow, that's mighty affordable. hopefully the 2 people that buy it will let us know if it's any good.

    • rinshun
    • 9 months ago

    Disappointed: I wanted a 150~250USD card.

      • albundy
      • 9 months ago

      get a used rx 580 for $100 or less and you’ll be set for a few years. i’m not giving either company my money for new cards after what they pulled in 2018. never forget, never forgive.

        • K-L-Waster
        • 9 months ago

        Sorry, did someone *promise* you would get high performance for bargain basement prices? (And by “someone” I don’t mean AdoredTV.)

        I fail to see why people are viewing the price levels as some sort of personal betrayal.

      • K-L-Waster
      • 9 months ago

      Hey, *everyone wants* a 2080TI beater that costs under $200.

      Reality, however, is not obligated to play along.

        • chuckula
        • 9 months ago

        Except for Apple.

          • K-L-Waster
          • 9 months ago

          I’m sure Apple wants their component cost to be sub $100. For which they will then charge $2500 of course…

    • caconym
    • 9 months ago

    It’s a shame there aren’t more OpenCL-based renderers, because this thing otherwise sounds like an artist’s dream. 16 GB!

      • tipoo
      • 9 months ago

      Yeah, CUDA is a hard choice limiter for many. AMD tries to wedge in there with more OpenCL/Vulkan performance per dollar.

    • Thresher
    • 9 months ago

    I expect that this price will come down pretty quickly.

    The RTX parts are ridiculously priced, but there are rumors that nVidia will reveal GTX 11X0 cards that will essentially be the RTX cards with no ray tracing units. If that happens, I expect they would slot in at least $100 less than the RTX cards, possibly more. If that is the case, that would put them at $699 or less for the same or slightly better performance.

    If this does happen, you’ll see some price cutting pretty quickly.

    • Mat3
    • 9 months ago

    Three generations now, each on a new process node, but the number of compute units (64) has not changed!

      • chuckula
      • 9 months ago

      I think Matthew McConaughey [url=https://www.youtube.com/watch?v=bnlv1wd13fY<]said it best.[/url<]

      • moose17145
      • 9 months ago

      As I understood it, wasn't Vega quite a bit more sensitive to HBM/memory speeds than most other architectures? Could have sworn I saw several articles about how people were seeing more performance gains just by overclocking the HBM alone than by messing with the GPU core speeds.

      Could very well be that, thanks to the new process, they were able to crank the GPU itself a bit higher (likely along with generational tweaks to make the overall architecture more efficient), and that, combined with better HBM and possibly more stacks of it, gave them a non-trivial performance gain (even while leaving the number of compute units the same).

      I could very well be wrong though.

    • moose17145
    • 9 months ago

    Holy crud there is a lot of negativity in the comments here over a product that we do not even have actual benchmark numbers for yet.

    I mean yea, I am let down by the price as well and likely won’t be buying one… but yeesh… some of you need to calm down.

    Even if it’s late compared to NVidia’s release, if AMD can get a full product line up that is largely competitive with NVidia in terms of price / performance, then I will be happy. Because NVidia desperately needs an actual competitor again.

    As for the power consumption (which I might point out none of us know what the power consumption of this looks like yet)… you guys seriously blow that HUGELY out of proportion. Unless you are running a data center full of these things it is pretty unlikely you are going to actually notice the increased power consumption. Unless you are literally gaming and keeping this thing under load like 12 hours a day, every day. In which case you are likely a 14 year old who still lives at home, has no job, and likely is not paying the electrical bill anyways.

      • chuckula
      • 9 months ago

      [quote<]As for the power consumption (which I might point out none of us know what the power consumption of this looks like yet)... you guys seriously blow that HUGELY out of proportion. Unless you are running a data center full of these things it is pretty unlikely you are going to actually notice the increased power consumption.[/quote<]

      Careful Icarus! Half of Lisa Su's RyZen 2 spiel is that the 9900K is a disaster because of a 50 watt power delta in RyZen 2's favor!

      Having said that, there's good news for this Vega: She held up a standard heatsink-fan combo. Not a giant exotic liquid cooler. That's actually a very good sign.

        • moose17145
        • 9 months ago

        I understand having to play to a product's strong suit, so it is smart of her to play that card (that IS her job as CEO, to paint her company's products in the best light possible), but I would also argue that for CPUs power consumption is more important than on a gaming graphics card, since there is actually a chance that 1,000 of these things would end up in an office environment where the cumulative additional power draw would start to matter.

        But for home use… I honestly have never built a home desktop where power consumption is even a consideration. Every time I have built a desktop I have built it to just be as fast as possible within my budget at the time, power consumption be damned.

        I guess that is where I am not seeing why people in the comments are making such a fuss about it. Are you all running 100 of these in your homes under peak load 24/7 or something, such that 100 extra watts is that big of a deal?

        Now granted, I turn my desktop(s) off when I am not using it, which means it is usually only powered up during the weekend and for the most part stays shut down during the week days. So a couple hundred extra watts really will not affect the power bill in any meaningful way given how long the system is actually in active use. If I left it powered up 24/7 then I would care more… but even then I would be more concerned about idle power consumption than load power consumption, and from what I have seen, the Radeons seem to be very much on par with NVidia when it comes to idle power consumption.

        But yes I agree that seeing the card with a standard air cooler is a good sign none the less. I am not saying that having an efficient chip is not important, as it very much is, but just that it seems like people are making it out to be a larger issue than it really is. As long as the price to performance is there, I do not really see the problem. (again data centers full of the things is an entirely different argument).

      • hiki
      • 9 months ago

      High power consumption may require a new PSU, raising the cost of the card by hundreds of dollars.

        • Redocbew
        • 9 months ago

        If you’re a numpty who buys a 1500 watt PSU, then yeah that may be. Your solution there is don’t be a numpty. 🙂

        • moose17145
        • 9 months ago

        I am going to make the assumption that if you are buying a graphics card in the 700+ dollar price bracket, then you likely already have a power supply capable of handling it, or were already planning on purchasing a quality power supply of appropriate size for said components.

        I am also going to argue that if you are buying a 700+ dollar graphics card, you likely do not care about how much power it consumes as long as it is the fastest thing on the market (perhaps only to be out-performed by the even more lol-tastic 1500 dollar graphics card).

        Edited to correct some grammar.

      • Redocbew
      • 9 months ago

      The more years I spend keeping track of these things the less I pay attention to launch events, or rather the reaction to launch events. I’ll keep an eye on things just so I know what’s available the next time I need to buy hardware, but I’m not really that interested in whatever superlatives a given CEO has to offer about their newest widgets.

      • NovusBogus
      • 9 months ago

      Internets gonna internet. The age of diminishing GPU returns is going to yield some top quality entertainment in the next few years.

      GPU company: Hey Internet, we just made a pretty sweet $250 graphics card. It handles normal person display resolutions flawlessly and is reasonably power efficient.

      Internet: RABBLE RABBLE RABBLE, enough of this mainstream peasantry! We demand maximum performance!

      GPU company: Okay guys, you asked for it, we deliver. It wasn’t easy and our engineers had to think outside the box with some pretty wild features, but here come those big performance gains again.

      Internet: RABBLE RABBLE RABBLE, this pricing is outrageous! We demand all the things for none of the money!

      Repeat.

        • hiki
        • 9 months ago

        GPU company: Hey Internet, we just made the same price for the same performance.

        Rich Internet: I pass. I already bought that performance for that price a long time ago.

        GPU company: Hey poor Internet, you didn't buy this performance at this price when it was available.

        Poor Internet: No. I can't afford this price for this performance. If I could, I would have bought it a long time ago.

      • Pancake
      • 9 months ago

      You say “none of us know what the power consumption of this looks like yet”. But the enormous display Lisa Su is standing in front of states “25% more performance at the same power”. That’s a marketing slide so we can assume AT LEAST 300W like the hot and bothered Vega64. Disgusting.

      Power consumption matters a lot to me for a number of reasons. Firstly, it’s a sign of good engineering being able to extract the most work per unit of energy. Good engineering should be rewarded and praised. Praise be, NVidia. Praise be.

      Secondly, some people care about the environment and their impact on this world. About tackling climate change. Minimising energy consumption is a big part of this, and my household power consumption is about 1/3 of the city average. And this is a nice big house, not some pokey apartment. I've ordered a solar panel array, and when it's installed I'll actually be pushing 4x as much electricity back into the grid as I use, i.e. I'll be a net power producer. Because it makes me feel good to be doing the right thing.

        • moose17145
        • 9 months ago

        Then what you are arguing makes even less sense given what price bracket this is in. It would be similar to arguing that a Ferrari sucks because it gets 10 miles per gallon while the Bugatti is obviously far superior because it gets 11. It is an irrelevant argument because you didn't buy either one of those supercars for their mileage (and if you did, then you are seriously doing it wrong). Similarly, the top-end halo products have rarely, if ever, been the sweet spot for power efficiency. Both the 2080 and whatever it is competing against are going to be power hogs. If you really care about it that much then you shouldn't be looking at anything higher than 1060-level performance or maybe iGPUs.

          • Pancake
          • 9 months ago

          These aren’t cars. You don’t sit in them, enjoy their luxurious materials, their handling or even the sound. You buy these products, stick ’em in your box and that’s about it. What qualitative differences you’ll notice come from frame rates, frame times etc – the good stuff TR and other review sites do.

          But energy consumption sure is important. My first career was designing graphics cards for PCs (back in the ’80s – early ’90s) and I simply don’t respect inefficient design. It grates me.

          The 2080 is actually pretty efficient for what it does. But to get a chance to play with that ray tracing goodness and tensor madness I’m looking at 2060. That’s the sweet spot for me.

            • Srsly_Bro
            • 9 months ago

            Snowflake squad got you again. They get hurt really easily.

            The small minds look at cars and see fuel consumption but don’t understand the many reasons why cars are not even comparable.

            • Pancake
            • 9 months ago

            I’m not trying to bait or troll moosey. The quality of debate on this here august website is best maintained by providing quality information and commentary.

            Fact is, I have an NVidia card at the moment. Before that it was AMD. Before that NVidia. And so on. I have no allegiance to a particular manufacturer. I don’t have posters of Jen Hsun Huang or Lisa Su plastered in my bedroom. I have a discrete graphics card to play games. That’s what I enjoy.

            Not like cars. Oh no. I love my old Ford truck dearly. Clean it lovingly, maintain it, repair it etc. I am really proud of my wheels. Chicks like to be in it. Really not the same as graphics cards…

            • moose17145
            • 9 months ago

            If we take the 2080's reference TDP of 225 watts and we assume the Vega VII at 300 watts, as mentioned, I am just not seeing that big of a deal in a 75-watt gap for home use unless the thing is effectively under load for the majority of its life. Certainly not for people to be making as big a deal over it as they are. I understand your stance of wanting the more efficient design if all else is equal, but I still think we should wait for actual review numbers to come out before we declare something a success or DOA.
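            As a rough illustration of why that gap looks small for home use (the hours and electricity rate below are assumptions, not figures from the comment):

            ```python
            # Rough annual electricity cost of a 75 W power gap under assumed usage.
            extra_watts = 75
            hours_per_week = 20      # assumed hours of gaming under load
            rate_per_kwh = 0.13      # assumed electricity price in $/kWh

            extra_kwh_per_year = extra_watts / 1000 * hours_per_week * 52
            cost_per_year = extra_kwh_per_year * rate_per_kwh
            print(f"{extra_kwh_per_year:.0f} kWh/year, about ${cost_per_year:.0f}/year")  # ~78 kWh, ~$10
            ```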

          • Srsly_Bro
          • 9 months ago

          I have a 1080Ti FTW3 and I very much care about power consumption. You operate on the fallacy that if a person can afford x, he doesn’t care about y. That is not the case and once again you’re wrong on logic alone.

          Just because someone is able to afford something, it doesn’t also mean he doesn’t care about associated costs. You don’t get money by burning it all up.

          If a person wants the performance of a Vega 7 or RTX 2080, under what rules must he have a total disregard for power consumption? Why should he not care? That doesn't make sense. Even your comparison was flawed. You have a small mind. Factual statement, not ad hominem.

          You are exactly like the dumb horoscope chicks who say, “Oh, he’s a Pisces so he must be this.” Again, it’s factual, not an attack.

          Snowflake squad, do your worst.

            • Redocbew
            • 9 months ago

            I upvote you because I want to be part of the Complaining About Random People on the Internet squad. But wait… If I’m all about the complaining, then there shouldn’t be any upvotes involved at all.

            In that case I upvote you for spite. SPITE SPITE SPITE!

            • Srsly_Bro
            • 9 months ago

            Lol. I'm complaining about false equivalency and other fallacy people. Check out all the down votes when nothing was really said. Pancake was just saying he would rather have a more efficient GPU with similar performance, and moose said that because he spends so much on a GPU, he actually should not care about power consumption. I was trying to help him understand but he is about feel-good vibes and not facts.

            • Redocbew
            • 9 months ago

            Building a power efficient PC on principle is all well and good, but if someone builds it that way and then they think “I’m doing it this way, and everyone else should also” I hope they don’t expect most people to listen.

        • Srsly_Bro
        • 9 months ago

        You’ve upset the snowflakes and you were given -6. Have a +1 back. I’ve given moose a few good tauntings in the past. He frankly never learns.

          • moose17145
          • 9 months ago

          You have? When? Honestly, I am not remembering any…

        • Krogoth
        • 9 months ago

        Power consumption for performance GPUs is only an issue if you're looking for low noise levels and easy thermal management. It has little to no bearing on the electric bill. You've got bigger problems if the power consumption of a performance GPU is causing budgetary issues.

        Besides, on the topic of energy consumption, residential use is small fry compared to industrial and transportation usage.

      • pogsnet1
      • 9 months ago

      Yeah, running one card, the TDP is barely noticeable. FreeSync plus this card is an awesome option; besides being cheaper, there are things AMD has an advantage in, like… secret :p

    • ColeLT1
    • 9 months ago

    They got rid of the Vega's blower cooler; mine was too loud.

    Also, this is kinda funny
    [url<]https://i.imgur.com/0HYaKrM.gifv[/url<]

      • Krogoth
      • 9 months ago

      Vega’s blower cooler is pretty darn effective though. It is kinda hard to manage ~300W of heat with the surface area of a large PCIe 16x PCB card without sacrificing noise levels. You need water-cooling of some kind or even more exotic solutions if you want near-silence.

        • ColeLT1
        • 9 months ago

        Was effective for sure, just could hear them from my den. Had to keep those HBM temps down, they hated heat.

      • chuckula
      • 9 months ago

      It’s nice to see that AMD cares about the mini-ITX crowd!

        • ColeLT1
        • 9 months ago

        Some say a bitchin fast 3d 2000 hides under there.

          • Krogoth
          • 9 months ago

          Resurrected by the darkest of meme magics.

      • moose17145
      • 9 months ago

      Ha!

      Reminds me of the much famed, and also much unreleased 3DFX Voodoo 5 6000

        • Prestige Worldwide
        • 9 months ago

        The Voodoo 5 6000 had nothing on the Voodoo 5 9000

        [url<]https://www.vogons.org/viewtopic.php?f=46&t=59442[/url<]

    • leor
    • 9 months ago

    Nvidia set the table here, and AMD is just showing up. The price of the highest-end card has literally doubled in the last few generations; until people stop paying these prices, this is the new normal.

    I can picture the strategy meeting that created this card: "Wait, people are paying $1,200 for a GPU with NO performance benchmarks??? How much would we need to sell an MI50 for to turn a profit?"

      • K-L-Waster
      • 9 months ago

      Sooo you’re saying “AMD is charging more than we thought they would: it’s ALL NVIDIA’S FAULT!”

        • leor
        • 9 months ago

        That’s a bit of a juvenile way to look at it, I’d prefer to say Nvidia created the environment for a product like this to exist. If Nvidia doesn’t jack up the prices, AMD can’t take MI50, turn it into a consumer card, and make a profit.

          • K-L-Waster
          • 9 months ago

          With that much HBM onboard they probably can’t make a profit at lower price points.

          Which would mean the options are this card at high prices or… nothing at all.

            • leor
            • 9 months ago

            Exactly what I’m getting at, if the RTX matched the pricing of the 10XX products, this card could never exist.

            • sreams
            • 9 months ago

            I would expect that lower price points won’t include this much memory.

    • christos_thski
    • 9 months ago

    So we're back to waiting for Intel's discrete GPU for the much-needed reshuffle of GPU price points back to rational levels.

    This is a highly disappointing launch, no two ways about it. AMD has delivered another underwhelming graphics card, and at nVidia’s justifiably maligned price ranges, at that.

    Having said that, I think a lot of you are TOO disappointed. PC gaming is stronger than it has been at any point since the early 2000s. AMD seems, by all accounts, to still be on track for releasing competitive CPUs. The NAND mafia-cartel is taking a break from colluding, in light of recent antimonopoly investigations, and RAM prices are coming down. SSDs are fabulously affordable. And that god-awful monitor panel segmentation into G-Sync and FreeSync is finally becoming a thing of the past.

    Additionally, we don't have to completely write AMD off on the graphics front, and for the first time in many, many years, we may have THREE high-end graphics makers about this time next year.

    For my part, I can’t wait to be able to build a banging AMD cpu-intel dGPU system in a couple of years. That should confuse the fanboys 😛

    Sure, Radeon VII is a crappy underwhelming GPU. But it’s not the end of the world. It’s still an infinitely better time to be a PC enthusiast than it was just 12 months ago. Chin up!

      • Srsly_Bro
      • 9 months ago

      You make some very good points. I wish others on this site were at least 1/4th as rational, realistic, and mature as you. The Vega 7 may seem like a lot of money, but no one is entitled to it if they can't afford it. I'm not going to the Porsche dealer to protest their high prices as if I'm entitled to a new 911 Turbo for under $50k. The same happens with the immature PC crowd here, because they all feel entitled to things they cannot afford. It's a sign of the times and it's evident on this site. The down thumbs will only further prove my point when they get hurt after reading my post.

      If a person can’t afford it, tough. Someone else can, or prices would come down. No one is entitled to anything, except to pay taxes and die.

      So much entitlement SMH.

        • Krogoth
        • 9 months ago

        It is more that the PC gaming experience has been slowly homogenizing. You really don't need high-end hardware to get the same experience for the majority of titles. There isn't nearly as much compelling content as there used to be, and most of it is just console ports.

        The climbing paywall for high-end and enthusiast-tier hardware will just accelerate the decline of traditional PC market.

          • Srsly_Bro
          • 9 months ago

          Nicely put. Thanks for the response.

          • Mr Bill
          • 9 months ago

          We have reached a point where there is no Crysis.

            • Krogoth
            • 9 months ago

            It is because there’s still nothing that can run it. 😉

          • NovusBogus
          • 9 months ago

          That is a good way of looking at it. PC gaming is alive and well, but it’s going the way of niche titles like Rimworld and Pathfinder that don’t push the envelope on eye candy. Why pay thousands to play console games with console interfaces and console difficulty scaling on PC when you can just buy a Playstation and be done with it? In many ways the indie renaissance itself has been driven by AAA studios’ tendency to place graphics and marketing above everything else.

    • PrincipalSkinner
    • 9 months ago

    I guess there’s good news to be had from this.
    If AMD priced it at $700 then they must be confident in it even after RTX release.
    But my hopes of building an affordable Ryzen PC for 4K gaming seem to have evaporated.

      • Voldenuit
      • 9 months ago

      >If AMD priced it at $700 then they must be confident in it even after RTX release.

      I think this has blown past rose-tinted glasses and into retinal haemorrhage.

        • sreams
        • 9 months ago

        AMD doesn’t have any history of pricing their products out of the market that I am aware of. If past AMD pricing is anything to go by, they see this card as a competitor for 2080.

          • K-L-Waster
          • 9 months ago

          Either that or with that much HBM onboard they can’t be profitable at a lower price.

          • Voldenuit
          • 9 months ago

          >AMD doesn’t have any history of pricing their products out of the market that I am aware of.

          Radeon 5870: $80 price jump from previous gen.
          Radeon 5870 Eyefinity: $100 price jump from base 5870.
          Radeon 7990: AMD’s first $1000 card. $300 price jump compared to 6990.
          Radeon R9 295X2: Anyone remember this one? The 7990 was AMD's first thousand-dollar card, so AMD tested the waters with a $1500 card.

          Bottom line: AMD, like everyone else, charges what they think the market will bear.

            • sreams
            • 9 months ago

            I’m referring specifically to pricing as it compares to the rest of the market. So yes… What the market will bear. Some are suggesting that this new card is priced above what the market will bear because AMD can’t afford to sell it for less. I don’t think AMD has any history of pricing any product above what the market will bear, so until we see otherwise I doubt that’s the case.

            • Voldenuit
            • 9 months ago

            Well, what the market will bear also depends on what the competition is like.

            Before AMD released the 4870 and 4850, nvidia was able to charge $100 more for Tesla than what the market would bear once the 48xx series was available.

            I think a lot of people were hoping Vega VII would do the same to Turing, but the design choices for Vega (HBM2, 4096-bit bus, interposer) don't give AMD the price elasticity to provide better value.

      • Krogoth
      • 9 months ago

      They priced it at $699 because that is the lowest price they can hit without selling at tight margins, which is bad for high-end SKUs.

        • Topinio
        • 9 months ago

        Yeah, and they must have had some slack in the MI50 market which gave them the stock.

        NVIDIA gave them the possibility, though, by putting down the 2080 at this price — if the 2080 had been $600 there wouldn’t have been a profitable niche for AMD to fit a MI50-based gaming GPU into.

    • djayjp
    • 9 months ago

    Nice PS5 gpu. We’ll have to wait for fall 2020 though (to reduce the power and cost).

    • Billstevens
    • 9 months ago

    It felt like for a good run we weren't seeing much price inflation in video cards. The 780, 980, and 1080 all kind of sat near the same price, with the top tier starting at around $550. The 780 may have actually been $650.

    The only hope for avoiding this video card price inflation from getting locked in was for AMD to release a competitive card that undercut the price of Nvidia. And they just decided not to do that by matching the 2080 launch price with a roughly equivalent card. So we may as well get used to the fact that price to performance in video cards has been static since the 1080 dropped on us.

    The ideal price for the Vega VII would have been $599 to reignite a price war. That is clearly not happening at the high end now.

    Hopefully with the next generation we get back to increased performance for less money…. but the price for 1080 performance hasn’t budged more than by maybe $50 since the 1080 launch… 🙁 That by itself is pretty depressing.

    I don’t know maybe I am seeing the numbers with tunnel vision. But all of the 2070 class cards and above feel like a slap in the face for those of us expecting a discount on 1080 class performance after what, 2 years?

      • Laykun
      • 9 months ago

      It’s a massive leap in performance at that price range if you consider tensor workflows and DXR 😉

        • Krogoth
        • 9 months ago

        Tensors are kinda pointless outside of general compute workloads. RTX mode is just techdemo-tier at this time.

    • Mr Bill
    • 9 months ago

    [quote<]AMD CEO Lisa Su holding a Radeon VII card.[/quote<] "Winter is coming" CONFIRMED!

    • tipoo
    • 9 months ago

    So: maybe a touch faster than 2080 performance for about the same 700 dollars, five months later, but having to get there by being a full fabrication node ahead and without any of the silicon Nvidia is spending on RTX. That is, I guess, an okay, slightly-better-value play, but not exactly awe-inspiring, and even a little worrying. Oh, and on top of all that, the power consumption appears to be higher, judging by the connectors.

    The benchmarks are no doubt picked to be favorable, so the primary draw would seem to be the extra HBM2 memory, might be an interesting card for researchers but few titles are hurting on 8GB for gamers yet. Seems like the Frontier again, or what some Titans have been.

    Then again this is only die shrunk Vega, Navi will be the interesting one to see as the new architecture.

      • Billstevens
      • 9 months ago

      I guess we have to pin our hopes on Navi that we finally get budget 1080 level performance for under $300.

      Right now a lot of cards are hovering around 1080 level performance.
      -2060
      -Vega64
      -2070
      -1070 Ti

      It seems like you still have to pay at least $400 to guarantee that level of performance.

      • Hsldn
      • 9 months ago

      With AMD it’s always the next GPU that’s going to be interesting. I’m fed up with them really. I’m going to buy a 2070 or 2060.

      At least they should have given us a price advantage. Why would you buy this instead of a 2080 at the same price? Give me one reason.

        • tipoo
        • 9 months ago

        I wish it were Navi launching instead of this but here we are. This card being so ho-hum is why I brought it up, not forgiving it at all

    • TurtlePerson2
    • 9 months ago

    I don’t really get it. The slide behind the CEO shows that the benchmarks are the same in mainstream games as the competition from nVidia and it’s coming out at the same price, just 5 months later.

    What’s the reason for someone to buy this card? If you liked that price/performance ratio, then you already bought the green team card a while back. My guess is that this thing will draw more power than the green card, so there’s not really a compelling reason to buy, based on their own promotional slide…

      • dragontamer5788
      • 9 months ago

      Do you think 8GB will be enough for the 4k games that you are playing?

      I think its enough for maybe this year. But what about next year? Texture data grows every generation, and I do think that the lifespan of 8GB (at 4k resolution / 4k textures) is at risk. 16GB is overkill, but would at least put one issue to rest in my mind.

      Otherwise, NVidia's 2080 looks pretty good. RTX for ray tracing (even if limited) and DLSS actually seem like cool features. It's just that the 8GB of RAM is worrisome in my mind.

        • Srsly_Bro
        • 9 months ago

        COD BO4 claims to use 9.3GB of VRAM with maxed everything minus motion blur at 1440p. I’d hate to have only 8GB unless I was on 1080p. I’m not sure if the game allocates or actually uses that much video memory.

          • shaq_mobile
          • 9 months ago

          Really? That’s so much for a game that looks like it uses primary color for textures. I have a hunch it’s not very optimized. I guess there’s not much of a push for them to use atlas textures or anything. Then again, I guess it doesn’t matter when people have your game on auto purchase. Cod is like the star wars of gaming. I feel like we keep thinking back to the good old days and thinking “maybe, maybe this one will be different!”

          Cue dark helmet’s “fooled you!”

          Also, whose idea was motion blur? Isn't it like the Clippy of graphics settings?

          Now I’m depressed.

            • Srsly_Bro
            • 9 months ago

            BO4 is really good tho. The battle royale is very well done. The multiplayer is fast-paced and demanding. The guns are great and balanced, except for SMGs which rule most maps. I very often take the top spot in the TDM game mode and the handling is very fluid. I'm just fortunate I have enough VRAM to play at high settings. If you like CoD and haven't purchased one in a while, this one is worth it, but just get the cheaper battle pack for $40.

            • shaq_mobile
            • 9 months ago

            I tried it out and it felt a little too fast and… Short range for my tastes these days. They seem to do some sort of movement animation prediction (project the player ahead of their position) which is pretty normal in some games, but it’s overwhelming in it. The bullets also felt very slow, or the scale felt odd.

            Edit: I will say that the gunplay did feel good. They’ve always done a good job with that!

            It was def better than the last one I played that was all listen servers and jetpacks. I'm just kind of over it. I think Modern Warfare 1 was the peak and the rest seem too iterative for something I was barely into.

            These days I'm more of a 'dad gamer'. Squad and Foxhole are closer to what I want, where I can choose roles to fit my mood and focus on support if I'm burnt out.

        • stefem
        • 9 months ago

        The framebuffer (no matter if 4K or even 8K) uses just a small portion of RAM; what eats RAM space is mostly textures, and not all of them are actually needed at any given time, they are just preloaded to be ready when needed.
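        A quick back-of-the-envelope figure makes the point (the buffer count here is an assumption for illustration):

        ```python
        # Approximate size of a 4K framebuffer, to show why textures, not the
        # framebuffer itself, dominate VRAM use.
        width, height = 3840, 2160
        bytes_per_pixel = 4        # 32-bit RGBA color
        buffers = 3                # e.g. triple buffering (assumed)

        framebuffer_mb = width * height * bytes_per_pixel * buffers / 1024**2
        print(f"{framebuffer_mb:.0f} MB")   # ~95 MB, well under 1% of a 16 GB card
        ```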

      • Billstevens
      • 9 months ago

      The power draw gap isn’t nearly as bad as it was between the 1080 and Vega 64. Mainly because it appears Nvidia’s cards aren’t as efficient.

      25% performance increase is impressive at the same power level on the same chip architecture. Unfortunately for AMD the Vega power draw was awful to begin with relative to Nvidia’s design.

      I agree that on paper AMD seems to be lacking any major differentiation in the gaming department. The extra, faster RAM and bandwidth might be a selling point over the 2080 in the non-gaming department. Then again, the 2080 has deep-learning hardware and has cobbled together some intriguing gaming features, even if they aren't particularly useful right now.

      • shaq_mobile
      • 9 months ago

      Isn’t that a good thing? Isn’t similar performance and same price range, with an underdog competing, a good recipe for a price war? I dunno. I’m probably bad at this.

      • sreams
      • 9 months ago

      “If you liked that price/performance ratio, then you already bought the green team card a while back.”

      So… nVidia should have discontinued the 2080 a few months ago because everybody who wanted one already bought one?

      • NovusBogus
      • 9 months ago

      Vast quantities of HBM come to mind. Word on the street is that if the fans spin at just the right speed in an open case they'll project an Ethereum logo on the wall.

      Also, AMD cards generally age better than NV because performance is less dependent on driver tricks. This card isn't for me, but if it delivers as promised it's hardly a bad product.

      • Krogoth
      • 9 months ago

      It is a “poor man’s” Instinct/FirePro that ate too much power for enterprise customers. It is a steal for general compute/3D artist crowd like Kepler-based Titans in their heyday.

    • Buzz78
    • 9 months ago

    I was hoping for news of a 12nm RX 570 equivalent (from RX 590 binning, perhaps?) to give me a moderate gaming card with low TDP.

      • dragontamer5788
      • 9 months ago

      Navi is expected to be low / midrange. But Navi was worryingly absent from this announcement.

      The rumor is Navi is aiming for $300 and less at 7nm.

        • Voldenuit
        • 9 months ago

        There’s a big gap between $300 and $699.

        (some) People are claiming 2070 performance for the $300 Navi, but if you can get 2070 performance for $300, why would anyone pay $699 to get 2080 performance?

          • Srsly_Bro
          • 9 months ago

          About tree 99.

          • dragontamer5788
          • 9 months ago

          Frankly, I see it as Navi will likely be released at a price-point higher than $300.

          Rumors are rumors, they are more about the hopes-and-dreams of the community rather than actual solid news.

          • FuturePastNow
          • 9 months ago

          Indeed, assuming the performance and price rumors are true, a pair of those Navi cards in Crossfire would be cheaper, faster, and use less power than one Radeon VII.

          (Personally, I suspect the Navi performance rumors are accurate but the price rumor is fantasy. We’ll see.)

        • auxy
        • 9 months ago

        Lisa literally mentioned the name once right at the very end of the presentation and that was the only sight of it…

          • Srsly_Bro
          • 9 months ago

          Dr. Su*

          Show some respect to the person who brought AMD back from an impending death, and helped guide it to the successful, functioning company it is today.

          Yes, referring to her as a doctor recognizes her accomplishments, academically and professionally, and is a sign of respect to someone who has earned it.

          It’s 2k19, auxy, get with it!

            • Redocbew
            • 9 months ago

            Seriously, bro?

            • K-L-Waster
            • 9 months ago

            You forgot to include “Now go write ‘I will not disrespect PHDs’ a 1,000 times and think about what you’ve done” at the end there…

            • Srsly_Bro
            • 9 months ago

            I made a few disclaimers. Go back and read what I said completely.

      • Prestige Worldwide
      • 9 months ago

      No

    • anotherengineer
    • 9 months ago

    I think this is just more misinformation from chuckula to troll the fanbois

      • Voldenuit
      • 9 months ago

      He does a very convincing Lisa Su impression. Somebody rescue her from the locked broom closet behind the keynote!

    • chuckula
    • 9 months ago

    I think the arguing over this card may be missing a potentially more important issue: to the best of my knowledge they never even name-dropped Navi, much less showed a demo with working silicon.

    Considering RyZen 2 was on full display and it’s not launching until this summer, I’m not sure what that means regarding Navi’s launch window.

      • Krogoth
      • 9 months ago

      The lack of a Navi iGPU in future products doesn’t bode well either.

      • anotherengineer
      • 9 months ago

      stealing your trolling 😛

      [url<]https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2060_Founders_Edition/[/url<]

    • Johnny Rotten
    • 9 months ago

    Sadly, an underwhelming, if not unexpected, announcement. If they were smart they would offer an 8-gig SKU to undercut the price by $100 or so.

    It's too bad they've elected to join Nvidia at this price/performance ratio instead of trying to push this performance level down a tier (which should be the case given this tier has been around for 2 years). Fact of the matter is that charging $700 for this tier of performance, two years later, is BAD whether you're Nvidia or AMD. If AMD wanted to finally cultivate mindshare, they would have had much more success (imo) by pushing this level of performance down to a new price segment… like $499, for example. This product will not get them out from deep underneath Nvidia's shadow…

      • shiznit
      • 9 months ago

      They couldn’t offer an 8GB SKU because 8GB doesn’t make sense for compute workloads and this is just a rebranded MI25.

        • dragontamer5788
        • 9 months ago

        Almost right.

        They can't offer [b<]less than 4 HBM2 stacks[/b<], and 4GB per stack is the smallest that HBM2 comes in.

        I've talked about this in the past: HBM2 has issues scaling: scaling downwards, that is. HBM2 is an amazing tool for achieving high performance at the high end of the market. But for offering the best price/performance at consumer prices (~$500 levels)... HBM2 has a few flaws.

        Then again, GDDR6 is looking quite expensive, with the 2080 only offering 8GB at this price range. So between the two technologies, it does seem that HBM2 is offering higher performance at lower prices.

          • enixenigma
          • 9 months ago

          Also almost right because this is a rebranded MI50, not a rebranded MI25.

    • DoomGuy64
    • 9 months ago

    $699? Well, this is what happens when we tolerate RTX pricing, which started with tolerating Titans.

    After years of PC gaming ever since DOS, this may very well be the generation when I finally give up the hobby and buy a console. This is ridiculous, and the same goes for the PCMR community (originally a joke, but now taken literally) that has consistently ignored the problem and enabled Nvidia to ruin PC gaming.

    Yeah, I can see why all the tech site people are retiring. They saw the writing on the wall, and instead of fighting the problem, which could have [url=https://youtu.be/otvIXprL2LU<]endangered their free samples[/url<], they simply ignored it.

    Enjoy the dead platform that you all killed, which also means these PC sites as well. Fewer users, less traffic, no business outside of subscription models. Consoles, and even Microsoft, didn't kill PC gaming; the community did.

    PC gaming is now priced out of the average consumer's reach, and while the elitists may not care now, there will soon be little reason for developers to bother with the PC other than low-effort console ports. This is the make-or-break point of PC gaming.

      • DancinJack
      • 9 months ago

      lol

      • Krogoth
      • 9 months ago

      IMO, the traditional PC gaming market has been on its way out for a while. Times are changing.

      Fewer and fewer people are bothering to deal with the hassle of setting up a top-of-the-line gaming PC to play forgettable AAA experiences. The major publishers are already feeling it. Just look at their Q4 2018 fiscal reports. Younger demographics are going to be opting for more casual and portable platforms for their gaming needs.

      It is going to be a repeat of the 1983 crash.

        • NTMBK
        • 9 months ago

        Yes, the fall in both Apple and Samsung phone sales is definitely indicative of everyone moving to mobile gaming 0_o

        Sales are down, stock market is down, it’s the start of a recession.

          • Krogoth
          • 9 months ago

          Yes, it is slowly but surely happening. Nintendo, the company responsible for the post-1983-crash gaming renaissance, knows this. The Switch is just the beginning of the trend.

          You don't need a tower system with a huge GPU to get a good gaming experience anymore. The halcyon days of the 1990s and 2000s are over.

            • anotherengineer
            • 9 months ago

            Ya, most games I have the best memories of were from that era

            Chrono Trigger, FF2, FF3 (US), Zelda, GoldenEye, Perfect Dark, Ocarina of Time, Gauntlet, Baldur's Gate: Dark Alliance, Mortal Kombat.

            I know, console games, but still great games.

            Spent many hours with buddies playing those games.

            Good times.

            • dragontamer5788
            • 9 months ago

            Nintendo has been putting out lower-end hardware since the GameCube and Wii. Nintendo’s main advantage is that its products are cheap enough that many people are willing to go Nintendo + Xbox or Nintendo + PS4.

            I don’t think gamers will ever switch off the treadmill of “bigger FLOPs is better”. These things are not only entertainment systems but also status symbols: gaming PCs are a way to prove how much of a gamer you are. It's consumerism, but I know a fair number of people who look at these systems that way.

            People aren’t going for “good gaming experiences”, they want a gaming experience that [b<]others can't afford[/b<]. See Virtual Reality, where the HTC Vive is over $1200 for the headset alone, if you want any proof. Alternative proof: $200+ ships in Star Citizen.

            • strangerguy
            • 9 months ago

            The rise of Youtubers, competitive gaming, and hardcore ZRGB-XTREME-GAMERZ marketing by HW vendors has definitely upped the e-peen factor among PC gamers. I for one am definitely not a fan.

            Speaking of Nintendo, I find their games a lot more interesting and compelling than the usual big-budget cross-platform games and the inevitable toxic MP playerbases that come with them.

            • Krogoth
            • 9 months ago

            That is a dying market in the long haul. Fewer and fewer PC gamers are willing to jump through the hoops of building and running a high-end system.

            • Srsly_Bro
            • 9 months ago

            Where did you get this information?

            • Krogoth
            • 9 months ago

            Just look around and it is self-evident.

            Online communities and LAN party turnouts have been slowly waning since the late 2000s. Younger demographics are content with consoles and OEM systems (mainly laptops) for their gaming needs. Game design is becoming more simplified in favor of accessibility. Developers and publishers have been locking down their stuff, which has been snuffing out the modding community. You really don’t need a high-end system for an enjoyable experience, or to keep upgrading your rig and its components on an annual basis.

            The traditional PC market that dominated the 1990s-2000s is becoming a thing of the past. It is only being kept alive by the old guard, who are slowly moving on to other interests. The 2020s will accelerate that trend.

            • Redocbew
            • 9 months ago

            And GPUs are doomed! Doomed I say!

            • chuckula
            • 9 months ago

            Intel is so screwed.

            You just thought they were going to have a crappy GPU in 2020.

            But it turns out… GPUS WON’T EVEN EXIST IN 2020!

            • K-L-Waster
            • 9 months ago

            So they can be vapourware in both 10 nm CPUs and GPUs at the same time!

            • chuckula
            • 9 months ago

            Vaporware in two major categories?

            That’s what we like to call a balanced product portfolio!!

      • Srsly_Bro
      • 9 months ago

      Tell me the MSRP of the RTX 2080.

      Tell me the MSRP of the Vega VII.

      Performance should be comparable, yet one is $150 cheaper.

      The entitlement generation is strong with you. Almost time for another rant. By age I live in that generation, but you are a member of the group.

      Tell me what AMD should have priced it at?

      399, 499, or free if you are offended by money as money=oppressive speech?

      This is what happens when people are willing to pay higher prices. Get out of your safe space where everything is rainbows.

      Idealism is only going to upset you and other idealists living in imaginary worlds.

      Do jobs and working offend you? Because that’s why people are willing to pay more, because they can.

      Done ranting for now. I await your reply about how evil jobs and money are.

        • NTMBK
        • 9 months ago

        RTX 2080 is $699, I’m confused.

          • Srsly_Bro
          • 9 months ago

          [url<]https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2080/[/url<] $799

            • dragontamer5788
            • 9 months ago

            [url<]https://www.newegg.com/Product/Product.aspx?Item=N82E16814487415[/url<] $699

            • DancinJack
            • 9 months ago

            To clear up this stupid mess: that’s only for the Founders Edition. The OEM versions start at $699, and there are some on Newegg right now.

            • Billstevens
            • 9 months ago

            That's the Founders Edition. The normal 2080 is $699….

          • rechicero
          • 9 months ago

          I don’t know the RTX 2080 MSRP right now, but as of September 21st it was $800 (just checked in the TechReport review). Did the MSRP fall to $699 since then, or are you confusing MSRP with the cheapest price you can find online? We can only compare MSRPs, as this card is not in shops yet… it will probably end up cheaper than $699.

      • torquer
      • 9 months ago

      huh?

      • brucek2
      • 9 months ago

      PC Gaming has never required a $699 card and certainly never a Titan. It makes no more sense to say these luxury high-end cards are killing gaming than it would to say that Porsche and Ferrari are killing driving.

        • Billstevens
        • 9 months ago

        But I really need 1440p games running at 200 fps 🙂

          • DoomGuy64
          • 9 months ago

          Which, frame rate aside, consoles can now do, and they have none of the fractured distributor issues or hardware price gouging that exist on PC.

        • bfar
        • 9 months ago

        In fairness, you could get the very best products they could make for less than $300 back in the early 2000s.

        That said, I’ve never enjoyed pc gaming more. There’s a dizzying range of content out there that runs beautifully on affordable hardware.

      • K-L-Waster
      • 9 months ago

      If you think it costs too much, maybe you shouldn’t buy one. (Funny, I think I said something similar after the 2080 launch…)

      • chuckula
      • 9 months ago

      I gotta give you some credit. At least you’ll rant about AMD too.

      A bunch of more popular people on here wouldn’t be able to hurl it at both sides like you do.

        • DoomGuy64
        • 9 months ago

        Now that Nvidia supports adaptive sync, there is less reason to diss them, other than their game sabotaging, price gouging, or drivers, which haven’t been that bad recently. All things considered, the RTX 2060 is pretty decent given what else is available.

        I do like AMD’s driver features, though, which makes switching for what amounts to a side-grade not worth it. Meh.

        My main complaint is the lack of an affordable mid-range. Neither Polaris nor Pascal is worth buying at this point, and there is no proper replacement. Mid-range today should support 1440p, and high-end should support 4K. But we have no modern mid-range at all from either side. The 2060 comes close, though.

      • NovusBogus
      • 9 months ago

      I know right!? It’s an absolute travesty how much the price has gone up from first-gen Vega’s headliner, the $699 liquid cooled version. DAMMIT LISA I WANT MY ZERO DOLLARS BACK!!!1!eleventeen!

      • cynan
      • 9 months ago

      Video cards used to be close to this expensive. For example, the launch MSRP of the 9800 XT was $499 in the early 2000s. That’s getting within spitting distance of $699 in today’s dollars – maybe closer to $649, but not too far off.

      Edit: Apparently $499 in 2003 (when the 9800 XT launched) is about $680 now…

        • moose17145
        • 9 months ago

        Hmm, good point. $678.27 by government calculations, assuming a $500 launch price in September of 2003.

        Launch time frame assumed using the publishing date of the below techreport review of the 9800XT, and inflation calculated using the US Government consumer price index website linked below the techreport article.

        [url<]https://techreport.com/review/5712/ati-radeon-9800-xt-graphics-card[/url<]
        [url<]https://data.bls.gov/cgi-bin/cpicalc.pl[/url<]

        Edit: Beat me to it. You edited your post while I was writing my reply lol
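        The adjustment itself is a one-liner; a minimal sketch, with the 2003-to-2019 CPI factor derived from the $678.27 figure quoted above rather than pulled live from the BLS calculator:

        # Rough inflation adjustment for the Radeon 9800 XT's launch price.
        launch_price_2003 = 499
        cpi_factor_2003_to_2019 = 678.27 / 500.0   # implied by the BLS figure above (~1.357)

        adjusted = launch_price_2003 * cpi_factor_2003_to_2019
        print(round(adjusted))   # ~677, in the same ballpark as the Radeon VII's $699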

    • maroon1
    • 9 months ago

    LOL, AMD stopped using Doom and Wolfenstein II for Vulkan tests. Instead they use Strange Brigade, a game that nobody plays (it's even less popular than Dragon Quest XI on Steam).

    1 FPS faster in two AAA games that are known to run better on AMD GPUs?! Sorry, but this does not look good for AMD. If AMD's own marketing slides have trouble showing it winning by more than 1 FPS, just imagine when independent reviewers test it. LOL

      • Krogoth
      • 9 months ago

      It is because the id Tech 6 engine works better on Turing than on GCN. The 2080 Ti is an absolute monster at it and is the only solution that can effortlessly handle 4K at 144 FPS under it.

    • enixenigma
    • 9 months ago

    It looks to be a consumer version of the Radeon Instinct MI50. At least AMD will be bringing [i<]something[/i<] to the table soon, even if it isn't competitive against the top end Nvidia cards.

    • USAFTW
    • 9 months ago

    What really makes it DOA, as far as I’m concerned, is the price. AMD needs to do better in terms of value, especially when its performance, launch time-frame (two years later than the competitor), and feature set (lack of DXR, etc.) are not helping.
    I know it’s probably not a big deal for some people, but it matters somewhat to at least have the option to switch on RTX and DLSS to see if they're something worth having.

      • Rand
      • 9 months ago

      The value proposition here rests pretty much entirely on the extra VRAM.
      Otherwise you’re left with an inferior feature set compared to the RTX 2080, higher power consumption, and similar performance in AMD’s presumably hand-picked benchmarks, which likely perform better than usual on the Radeon VII. I highly doubt more than 8GB of VRAM will be that big of a benefit in the near future.

      I struggle to find anything appealing in it at all.

        • Pancake
        • 9 months ago

        The price will soon drop to match market demand, as has been the case for Ryzen CPUs. GPU pricing has been artificially inflated by mining demand, but now that that's no longer the case they’ll be sold on their merits as graphics cards. And just about no one is gaming with AMD:

        [url=https://store.steampowered.com/hwsurvey/videocard/<]Steam Hardware Survey[/url<], where you can see AMD's RX 480/580 wonder twins with a usage share of 1.3% compared to 15.4% for the GTX 1060. And nothing else of note from AMD on that list.

        A lot of sour pusses criticise NVidia for the ray tracing and tensor core features. But as a graphics programmer and tech enthusiast, I have to say I find it totally fascinating tech. Unless AMD can catch up to NVidia on those features, none of their products are remotely interesting - at an intellectual level.

          • Krogoth
          • 9 months ago

          Until there’s a killer app that utilizes the hardware, they will remain nothing more than tech-demo-tier features. This has always been the case for nearly every new hardware feature set.

          The majority of gamers get hardware for the here and now, not for the “what-ifs” of tomorrow.

      • Krogoth
      • 9 months ago

      It is a Vega 20 FirePro/Frontier/Instinct reject. It is meant for general compute, not gaming performance. AMD RTG is just trying to sell off subpar inventory to those who have a hate boner for Nvidia.

      The 2060 and 2070 rival the Radeon VII at lower price points. The FreeSync card is now gone.

        • USAFTW
        • 9 months ago

        You make a good point on Freesync. It’s no longer an AMD-exclusive selling point.

          • dragontamer5788
          • 9 months ago

          Actually, AMD still supports FreeSync over HDMI.

          It's a bit niche, but I have a co-worker with a FreeSync TV who is pissed off that NVidia won't support it over HDMI.

            • anotherengineer
            • 9 months ago

            Tell him to get an AMD card then, problem solved 😉

          • tipoo
          • 9 months ago

          Well hot dog, I missed some major news apparently!

          • Billstevens
          • 9 months ago

          On 12 monitors… I am hopeful many more will work well once they launch the feature on the 15th or 16th, but I would temper your enthusiasm until we see how well G-Sync works on monitors that didn’t pass their test, or didn’t get tested.

          There are something like 500 FreeSync-capable displays right now, and Nvidia claims they tested around 400. So in theory they're at 80% coverage. Probably more, because some of those 500 are not PC displays.

            • Voldenuit
            • 9 months ago

            >monitors that didn’t pass their test

            (A couple of Nvidia employees in suits, dark glasses and hats walk into Acer HQ)
            Receptionist: Welcome to Acer. How can I help you?
            Nvidia: Say, that’s a nice monitor you’ve got there. It’d be a shame…
            (knocks monitor off desk)
            Nvidia: … if something were to happen to it.

      • chuckula
      • 9 months ago

      If Nvidia deserves to be demonized for releasing $700 cards, then AMD should be held to the same standard.

      Especially when this $700 Radeon isn’t going to be curb-stomping older $700 Nvidia cards.

        • USAFTW
        • 9 months ago

        The difference is we’ve come to expect Nvidia to charge more, which makes sense most of the time since their products are generally superior.
        AMD’s whole approach to pricing since the 4000 series was “better value for nearly the same level of performance”, but AMD hasn't been playing that value card for years now. As long as their products are not clearly and measurably better than Nvidia’s at some point, they can’t charge the same price as what Nvidia put out two years before.
        All things considered, it doesn’t make sense to buy a Vega 56 over a 1070 Ti, a Vega 64 over a GTX 1080, or a Vega VII over a GTX 1080 Ti, let alone an RTX 2080.
        So yes, it’s not the same AMD Radeon anymore.

        • Goty
        • 9 months ago

        I still think NVIDIA shoulders more of the blame. If the 2080 was $500 instead of $700, there’s no way this card would be priced this high.

          • Srsly_Bro
          • 9 months ago

          Don’t hate the player, hate the game. I’ve always stood by that. Execs have fiduciary responsibilities, and those don't change just because a gamer complains that he can’t afford something.

          Prices are set by dollars, not tears.

          How is Nvidia wrong or at fault for selling GPUs at prices that meet revenue goals?

          Snowflake mindset is spreading into everything.

          Nvidia doesn’t owe you or me anything.

          I’ll also drop a #entitlementgeneration on you.

            • Goty
            • 9 months ago

            Nice ad hominem, bro. Might want to work on those critical reading skills, though.

            I never said I felt I was entitled to it or NVIDIA owed me a thing. I never said NVIDIA was wrong. My entire argument is that Vega VII is only coming in at this ridiculous price because it has existing competition at that price and performance, and NVIDIA is the one that set that price point.

            Good try, though? Your trophy is in the mail.

            • Srsly_Bro
            • 9 months ago

            The people who decided Nvidia's asking price was worth their dollars are the ones who set the price. Nvidia doesn’t just pick a price from a random number generator and slap it on the card. Companies use financial models to determine the price that meets revenue goals. You may prefer a lower price, but you aren’t entitled to one, nor does Nvidia owe you anything.

            If a person is in the market for a card at that price, he isn’t as price-sensitive as someone looking for an RX 570. That person is most likely using a FreeSync monitor and not a $100 22″ TN panel. If the person is vendor-locked, the price isn’t bad considering the alternatives.

            • Goty
            • 9 months ago

            [quote<]The people who decided Nvidia's asking price was worth their dollars are the ones who set the price. Nvidia doesn’t just pick a price from a random number generator and slap it on the card. Companies use financial models to determine the price that meets revenue goals. You may prefer a lower price, but you aren’t entitled to one, nor does Nvidia owe you anything.[/quote<] That's a meaningless distinction for the consumer, and since I'm guessing none of us sits on NVIDIA's board, I'm also guessing that "consumer" describes pretty much all of us.

          • chuckula
          • 9 months ago

          If the Radeon VII was priced at $500 and not $700 there’s no way the RTX-2080 would be priced that high.

          Six of one, half a dozen of the other.

            • Goty
            • 9 months ago

            Assuming it’s actually competitive with the 2080, yes, of course. NVIDIA would feel pressure to drop prices in the face of competition with a better price/performance ratio. That’s how the market works.

            The issue here is the same one AMD faced with Vega 56/64. It’s an expensive card to produce (even more so now with four HBM stacks and made on 7 nm), so they likely don’t have the room to price it much lower and still make money on it. Wasn’t it thought that AMD was losing money or at best just breaking even on every Vega 56/64 sold?

        • tay
        • 9 months ago

        Isn’t it worse in every way? What am I missing?

      • Srsly_Bro
      • 9 months ago

      Performance is about as fast as the RTX 2080, for less money.

      Nobody uses DXR, and by the time they do, cards fast enough to run DXR meaningfully will be out.

      You don’t even know whether DXR/DLSS is worth having, by your own admission, yet you knock AMD for not having it.

      As far as I’m concerned, you’re delusional and don’t even know what you want, except the urge to give uninformed, baseless opinions.

      Are you ever happy, bro?

        • maroon1
        • 9 months ago

        Less money?!

        The RTX 2080 already costs $699 on Newegg (and comes with two games):
        [url<]https://www.newegg.com/Product/Product.aspx?Item=N82E16814487415&Description=rtx%202080&cm_re=rtx_2080-_-14-487-415-_-Product[/url<]

        • USAFTW
        • 9 months ago

        There are RTX 2080 cards available on Newegg for $699.
        [url<]https://www.newegg.com/Product/ProductList.aspx?Description=rtx%202080&Submit=ENE[/url<]

        Regardless of how good and compelling RTX and DLSS are, they are SOMETHING. Something is better than nothing. Remind me again what AMD is offering here that would make me consider this over an RTX 2080 at the same price?

          • dragontamer5788
          • 9 months ago

          16 GB of VRAM. The 8GB of VRAM on the 2080 will fill up as more games move to 4K.

          This HBM2 thing looked stupid initially… and then we all learned that GDDR6 is 70% more expensive to make than GDDR5. All of a sudden, AMD is offering more RAM (superior RAM, too: HBM2) at cheaper prices than the competition.

            • dragontamer5788
            • 9 months ago

            I’m curious about the downvotes.

            It's more or less well documented that 4K games are already at 6GB or higher. See this NVidia thread for example: [url<]https://www.reddit.com/r/nvidia/comments/9g0l0b/is_8gb_vram_enough/e62y3s3/[/url<]

            A big benefit of the 2080 Ti was future-proofing your investment with 11GB of VRAM. 8GB of VRAM is usable [b<]today[/b<], but there is definite worry that it won't be enough as 4K games get more, higher-resolution textures.

            Vega II goes way off the rails here: 16GB of VRAM is absurd overkill as far as I can tell. But it's the smallest that 4 stacks of HBM2 can offer, so that's what AMD gets for choosing HBM2. Still, 16GB of VRAM would put to rest any worries about VRAM in any video game for the next 3 or 4 years.

            • stefem
            • 9 months ago

            Ask yourself: how big is a 4K framebuffer? The answer will make clear why your comment got downvoted.

            • cegras
            • 9 months ago

            4K = 3840 x 2160 x 3 (RGB) x 8 bytes (floating point, why not) = 190 MB, x 3 for vsync/future rendering ~ 600 MB?

            • Voldenuit
            • 9 months ago

            8-bit or 10-bit rather than bytes. You don’t want floating point values for color information to display.

            • cegras
            • 9 months ago

            Is it all integer, even when doing things like AA / other post-processing?

            I was giving the worst-case scenario, and even that is only ~200 MB per frame.
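            For what it's worth, the back-of-the-envelope numbers come out tiny next to 8-16GB of VRAM under either assumption; a minimal sketch, where the buffer count and pixel formats are illustrative choices rather than what any particular engine actually allocates:

            # Rough 4K framebuffer sizes for a few pixel formats, assuming triple buffering.
            width, height, buffers = 3840, 2160, 3
            formats = {
                "8-bit RGBA (4 bytes/px)": 4,
                "FP16 RGBA (8 bytes/px)": 8,
                "FP32 RGBA (16 bytes/px)": 16,
            }
            for name, bytes_per_px in formats.items():
                mib = width * height * bytes_per_px * buffers / (1024 ** 2)
                print(name, "->", round(mib), "MiB")   # ~95, ~190 and ~380 MiB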

            • dragontamer5788
            • 9 months ago

            Ask yourself: how big are [b<]multiple[/b<] 4K textures, shadow maps, and high-quality meshes for a 4K world? [url=https://overclock3d.net/gfx/articles/2016/12/01120628914l.jpg<]Answer: 9.2GB of VRAM usage for Watch Dogs 2 at max settings @4K.[/url<]

            Now, I've been told in other discussions that VRAM usage is a poor metric, and I recognize that. Nonetheless, you might be surprised to know that GPUs use more than just 3840 x 2160 x 4 bytes per pixel. We are certainly at the point where 4GB, maybe even 6GB, is far too low for modern "max settings 4K" games. (If you're fine with "High" settings, you might still be OK.)

            I would expect that games of the future will be leveraging the 11GB of the 1080 Ti and 2080 Ti, and the 16GB of Vega II. As such, I expect games to blow past 8GB of VRAM. Maybe a game that uses 8K cloth / bumpmap textures (procedurally generated, but they still have to be stored in VRAM), maybe a 20000x10000 HDRI skymap. There are a lot of ways to eat up all the VRAM in a video game... and high-end gamers (owners of a 1080 Ti or 2080 Ti) want an experience that others can't enjoy.

            • chuckula
            • 9 months ago

            The frame buffer is trivial but it’s also not what eats up GPU RAM.

            Ask yourself this: If even a 4K frame buffer is so tiny [and it is], why do we even need more than say… 2GB in a high-end GPU?

            • anotherengineer
            • 9 months ago

            [url<]https://www.techpowerup.com/reviews/Performance_Analysis/The_Witcher_3/3.html[/url<]

            Looks like Witcher 3 uses less than 2GB at 4K resolution. So why do we need more than 2GB for gaming? I don't know. Poor coding? Poor ports? E-peen? Why not, since RAM is plentiful? Or is Witcher 3 an isolated example of low GPU RAM requirements?

            • chuckula
            • 9 months ago

            OK! Tell that to AMD and they can cut down the RAM while cutting their manufacturing cost on the Radeon VII so they can sell it cheap and destroy Nvidia!

            • anotherengineer
            • 9 months ago

            meh

            fanbois don’t care even if it’s free; they will buy from whatever camp they’re in love with regardless of the competition's price.

            I personally upgrade GPUs about every 5-6 years and aim for $250 CAD tops. The way things are going, 5-6 years from now integrated graphics will probably be enough for me.

        • Voldenuit
        • 9 months ago

        [quote<]Performance is about as fast as the RTX 2080, for less money.[/quote<] What is this less money you speak of? Last I checked, $699 was not cheaper than $699.

          • K-L-Waster
          • 9 months ago

          Some folks have it in their heads that the *only* valid price comparison is against FE cards direct from NVidia’s web store. ‘Cus apparently no one buys OEM cards or something.

            • rechicero
            • 9 months ago

            You compare MSRP with MSRP and OEM with OEM. But when one card has both and the other has only an MSRP, I guess there is only one way: MSRP vs MSRP. Anyway, this is a hollow debate; we just need to wait until after launch to check the real price.

      • K-L-Waster
      • 9 months ago

      To be fair, it was always unrealistic to expect AMD to deliver a card that matches the performance of NVidia’s latest GPUs AND uses HBM AND is priced at RX 580 / 590 levels.

      The fans, of course, have the idea stuck in their heads that because it’s AMD it will *always* be at a low price no matter what.

      • WhatMeWorry
      • 9 months ago

      They can always lower their prices at any time. They have hemorrhaged hundreds of millions of dollars over the years.

    • Goty
    • 9 months ago

    16 GB of HBM2 probably makes the 70% price premium of GDDR6 look reasonable…

    • chuckula
    • 9 months ago

    Come on AMD, this is 2019.

    I was expecting Radeon: Episode IX.

      • Srsly_Bro
      • 9 months ago

      Be patient.

        • Chrispy_
        • 9 months ago

        Ya, the new architecture and high-volume 7nm stuff from TSMC is coming later in the year.

        This is just something to ensure that AMD's graphics department doesn’t come to CES empty-handed. It’s quite obviously just a relabelled Radeon Instinct MI50. For whatever reason most server solutions use the MI25, so I guess AMD has excess inventory?

          • dragontamer5788
          • 9 months ago

          [quote<]For whatever reason most server solutions use the MI25, so I guess AMD has excess inventory?[/quote<]

          Apple and Samsung have cut their orders by 10% IIRC (due to weaker demand for iPhones). I assume this also applies to Qualcomm (but I haven't seen any Qualcomm news yet). In any case, the cell-phone market has slowed down dramatically for 2019, which means TSMC's 7nm node needs something to do. AMD probably bought up some extra 7nm wafers at relatively low prices, low enough to make this product feasible.

          • Srsly_Bro
          • 9 months ago

          Perhaps. You're probably right on the inventory. Regardless, how people can feel that a card which performs similarly to an RTX 2080, for less money, is somehow a bad thing is something I don't have the capacity to understand.

          Rumors say Nvidia will be out with 7nm in 2020. That’s what I’ll be looking at.

    • psuedonymous
    • 9 months ago

    How do AMD’s numbers compare to Vega 64 clocked to the same speed?

      • Goty
      • 9 months ago

      I don’t think a Vega 64 could hit the same speeds anyway, but things are further complicated by the fact that AMD doubled the ROP count and gave this card significantly more memory bandwidth than the Vega 64 has.

      The interesting thing is that they cut four CUs and only boosted the clocks about 15% and they’re getting this much more performance out of the exact same architecture. Those must have been some pretty serious bottlenecks they were dealing with in the Vega 56/64.
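      For context, the raw shader-throughput math behind that observation looks roughly like this (the Radeon VII figures come from the article; Vega 64's ~1.55 GHz boost clock is an assumption on my part):

      # Peak FP32 throughput = CUs x 64 ALUs per CU x 2 ops per clock x clock (GHz), in TFLOPS.
      alus_per_cu, ops_per_clock = 64, 2
      vega64_tflops = 64 * alus_per_cu * ops_per_clock * 1.55 / 1000       # ~12.7
      radeon_vii_tflops = 60 * alus_per_cu * ops_per_clock * 1.80 / 1000   # ~13.8
      print(round(vega64_tflops, 1), round(radeon_vii_tflops, 1))

      On paper that is only a single-digit-percentage gain in raw throughput, so any larger performance jump would have to come from the extra memory bandwidth and ROPs rather than the shader array.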

        • DancinJack
        • 9 months ago

        They also juiced it up with a lot more transistors. Guessing those are being used to clock it to holy high hell. Which, obviously, isn’t as much of an issue on 7nm.

          • Goty
          • 9 months ago

          The same Anandtech link I posted below says 700M more transistors than Vega 64, but there are also two more HBM memory channels to account for.

            • DancinJack
            • 9 months ago

            Also fewer CUs.

            I wasn’t sure whether the HBM stacks were included in that number. /shrug

            • Goty
            • 9 months ago

            Fewer CUs, but you have to assume all 64 are physically there and they’ve just disabled four. They should still count in the transistor budget, right?

            I don’t think the HBM stacks themselves are counted, but the memory controllers account for some of it, I’m sure.

            • DancinJack
            • 9 months ago

            I honestly don’t know on either of those. It doesn’t matter a whole ton I guess. We definitely know 7nm TSMC is more dense.

            • Phartindust
            • 9 months ago

            Hmm, could it be we might see the full 64 CUs in a consumer card? Would it be worth it, though? I mean, yeah, it would be faster, but enough to catch the 2080 Ti? Could be fun 🙂

        • RAGEPRO
        • 9 months ago

        Do you have a source on the doubled ROP count?

          • Goty
          • 9 months ago

          Sorry, should have linked to it in the original post: [url<]https://www.anandtech.com/show/13832/amd-radeon-vii-high-end-7nm-february-7th-for-699[/url<]

            • RAGEPRO
            • 9 months ago

            Ah, thanks. Looks like they got some information from AMD directly, heh.

        • Voldenuit
        • 9 months ago

        This sounds more like a rebadged Radeon Instinct with some cut-down cores rather than anything new.

        As for doubling the ROPs, it was probably to accommodate the doubled memory bus rather than the other way around.

          • Goty
          • 9 months ago

          [quote<]This sounds more like a rebadged Radeon Instinct with some cut-down cores rather than anything new.[/quote<] Isn't that all Vega 56/64 are?

            • Voldenuit
            • 9 months ago

            The Instinct MI50 and MI60 were 7nm and had a 4096-bit bus, and 64 CUs. They were originally intended for compute and deep learning.

            • Goty
            • 9 months ago

            And the MI25 was 14nm, had a 2048 bit bus and 64CUs. It was also intended for compute and deep learning. Your point?

      • Krogoth
      • 9 months ago

      It’ll be about the same. Vega 20 is pretty much a die shrink of Vega 10 with minor tweaks (only useful for general compute).

      It is pretty much Hawaii on ultra ’roids. It is an unattractive buy for gamers but an interesting proposition for the general-compute crowd.

        • USAFTW
        • 9 months ago

        Yeah, except on memory, where this has more than double the bandwidth. That ought to make a difference. 25% is perhaps a bit on the low side, however.

          • Krogoth
          • 9 months ago

          Vega 10 wasn’t memory-bandwidth limited in gaming workloads. That extra memory bandwidth is only good for general-compute stuff.

        • enixenigma
        • 9 months ago

        Vega 20 has double the ROPs of Vega 10. Whether that leads to a notable performance uptick remains to be seen.

          • USAFTW
          • 9 months ago

          I haven’t seen anything hinting at 128 ROPs. Vega 7nm that we saw in the MI60 before had 64. Could be wrong though.

            • DancinJack
            • 9 months ago

            [url<]https://www.anandtech.com/show/13832/amd-radeon-vii-high-end-7nm-february-7th-for-699[/url<]

            • Voldenuit
            • 9 months ago

            [url<]https://www.anandtech.com/show/13562/amd-announces-radeon-instinct-mi60-mi50-accelerators-powered-by-7nm-vega[/url<]

            [quote<]The GPU adds another pair of HBM2 memory controllers, giving it 4 in total.[/quote<]

            They don't call them ROPs in the article because the Instincts had no video-out, but they're essentially ROPs.
