Intel’s discrete graphics products will begin appearing in 2020

Ever since Intel added former Radeon honcho Raja Koduri to its bench of semiconductor talent, the company has been clear that it intends to make a renewed attempt to compete in the discrete graphics card market for both gamers and data centers alike.

New product development cycles in the industry begin years before shipping products appear, so it wasn't clear when the first fruits of Intel's newfound interest would arrive. Thanks to a report by analyst Ryan Shrout, we now know that Intel will begin shipping its next discrete graphics products in 2020. Intel itself confirmed the news in a separate tweet.

Shrout learned of Intel's plans during an analyst briefing conducted by Intel CEO Brian Krzanich and Navin Shenoy, executive vice president of the company's data center group. Shenoy acknowledged that the company will be introducing GPU products for both the data center and the client markets. We still don't know how broad a range of products the company intends to introduce when it does re-enter the market, but Intel has at least put AMD and Nvidia on notice for when they can expect fresh competition in the graphics space.

Comments closed
    • tipoo
    • 1 year ago

    Interesting piece here –

    >But the collaboration came at the expense of Radeon RX Vega and other in-development projects. Allegedly, Koduri saw up to a massive 2/3 of his engineering team devoted exclusively to Navi against his wishes, which resulted in a final RX Vega product Koduri was displeased with as resources and engineering hours were much lower than anticipated. As I mention in my companion report, the implication is that AMD CEO Dr. Lisa Su wanted to devote more energy into its semi-custom business than its desktop graphics division.

    When Raja left AMD, a lot of us assumed it was because of Vega's uncompetitive performance per watt. But instead it appears the resourcing was out of his hands, which casts his departure, and Vega's underwhelming showing, in a different light.

    https://www.forbes.com/sites/jasonevangelho/2018/06/12/sources-amd-created-navi-for-sonys-playstation-5-vega-suffered/#66c0fcf724fd

      • ronch
      • 1 year ago

      Thing is though, it’s funny how Koduri reacted when it’s not like everything before Vega was a resounding success. IIRC the last time AMD really was impressive hands down was when GCN came out in 2012.

    • hansmuff
    • 1 year ago

    I don’t think it’ll concern the gamer crowd that needs dedicated cards at all. Intel isn’t going to be competitive with gaming drivers. The cards will be for compute and the iGPUs may improve, but you won’t be playing AAA games in 4K with either.

    • ronch
    • 1 year ago

    TREASON!!! >:-((

    Seriously though, it would be cool to see what Intel can do here. Nvidia must be watching this really closely.

    • blastdoor
    • 1 year ago

    Even if Intel can design a competitive GPU, can they fab it on a competitive process? 2020 is not far away at all. Are they really going to be able to get their much-delayed 10nm process whipped into shape to produce GPUs by 2020? Or will this be done on 14nm++++++? Or do they have plans to make 10nm a runt process that is barely used and quickly move to 7nm?

    Or…. will they use TSMC to fab this thing?

      • chuckula
      • 1 year ago

      I think the better question to ask is: Will Intel even exist in 2020?

      Clearly the answer is no.

        • blastdoor
        • 1 year ago

        14nm++++
        CONFIRMED

          • ronch
          • 1 year ago

          Who are you to assume the responsibilities of Chuckula???? You will NOT CONFIRM ANYTHING, do you understand???

            • chuckula
            • 1 year ago

            You’re right of course. He’s a rank amateur.

            Watch how a professional does it:

            2020 Mac Pros with Apple Miracle ARM chips and Intel discrete GPUs [b][i][u]MEGA-CONFIRMED![/u][/i][/b]

            • Redocbew
            • 1 year ago

            You shall henceforth be known as Grand Confirmator.

            • ronch
            • 1 year ago

            That is good.

      • tipoo
      • 1 year ago

      All those years of wondering what a GPU made on Intel's world-leading fabs would be like, and when they finally do get into GPUs, it's when that fab lead has evaporated…

        • moose17145
        • 1 year ago

        https://en.wikipedia.org/wiki/Intel740

        How would a GPU made by Intel perform? Not well... not well at all...

          • chuckula
          • 1 year ago

          https://en.wikipedia.org/wiki/Bulldozer_(microarchitecture)

          How would a CPU made by AMD perform? Not well... not well at all...

          https://en.wikipedia.org/wiki/Apple_Newton

          How would a mobile device made by Apple perform? Not well... not well at all...

            • moose17145
            • 1 year ago

            https://en.wikipedia.org/wiki/Larrabee_(microarchitecture)

            Just sayin... Intel doesn't have a good track record with discrete GPUs that can actually compete against Nvidia and ATi/AMD...

            • blastdoor
            • 1 year ago

            But there are good reasons to think the future might not look like the past. GPUs today, especially in the “compute” market, are better aligned with Intel’s historical bailiwick than GPUs in the past. Also, when you combine time, money, and good hires you can do just about anything. Intel has had plenty of time and money. We don’t know for sure about the hires, but I tend to give them the benefit of the doubt on that.

            It's just a shame that they appear to be losing (or perhaps have lost?) their manufacturing edge. Now, they *could* get it back again, if they made a decision, say, 18 months ago to really ramp up spending on developing 7nm, and if they are willing to mostly write off their investment in 10nm, so 10nm is quickly replaced by 7nm in 2020. Intel is fully capable of doing that…. if they are willing to spend the money. That's the big question in my mind.

            • Redocbew
            • 1 year ago

            My completely unqualified opinion is that Intel isn't really "losing" anything when it comes to manufacturing; it's just getting harder to do process shrinks within the amount of time we're used to seeing them take. For a very long time now, if chip architects were stuck on some problem, we could just wait a few years for transistors to get smaller and chips to get faster, and there was a good chance that what was impossible before wasn't so impossible anymore. I think that's still the way it is now, but I don't think it'll stay that way forever.

          • tipoo
          • 1 year ago

          How would a good GPU on a bleeding-edge Intel fab perform? That was my question for most of the past, when they were 18 months to two years ahead of the world on fab tech.

          Now that they're finally making a modern GPU, they're no longer ahead of the world on fabs as of the 7nm generation.

            • moose17145
            • 1 year ago

            I don't think it would have mattered, to be honest. Intel has tried the whole "make a high-performance discrete GPU" thing a couple of times in the past, and they have always come up well short of what even ATi/AMD were capable of doing, even when they had a decent process advantage.

            Historically, this has not been a good segment of the market for Intel.

      • NoOne ButMe
      • 1 year ago

      10nm reticle limit size chip. Between 50 and 75% enabled for most SKUs.

      My wag.

    • chuckula
    • 1 year ago

    Speaking of GPUs, anybody expecting a multi-chip NaviRipper is in for a disappointment:

    https://www.pcgamesn.com/amd-navi-monolithic-gpu-design?tw=PCGN1

    Ironically an MCM setup with non-uniform memory access characteristics actually has fewer downsides for a GPU than it does for a CPU due to the differences in their workloads.

    • tipoo
    • 1 year ago

    Can’t see it as anything but a win for us consumers.

    If it’s great, and they charge a premium, that’s fine, and Nvidia and AMD have to respond.
    If they try to charge more than the performance is worth, the market prices them out and they have to reconsider.

    Excited to finally have new blood in the dedicated GPU game.

      • anotherengineer
      • 1 year ago

      But will it?

      Not sure. I think Intel would prefer to focus on AI/deep learning and other high-margin markets for GPUs, and maybe give the leftovers to the consumer market?

        • tipoo
        • 1 year ago

        Then it only marginally intersects but may keep Nvidia/AMD on their toes anyways.

        Again, hard for customers to lose, and potentially a few ways to win.

    • psuedonymous
    • 1 year ago

    [quote]Shenoy acknowledged that the company will be introducing GPU products for both the data center and the client markets.[/quote]

    And with that, we can [i]probably[/i] rule out the design being a Big Gen device. This leaves:

    - A completely clean-sheet architecture produced from paper to silicon at break-neck speed from a standing start.
    - The Ghost of Larrabee, AKA Xeon Phi (Knights Ferry and Knights Corner still have the texture units on-die). Xeon Phi is still in active development, and one HPC iteration (Knights Hill) was cancelled at the same time Raja was announced as being hired. Dust off the Larrabee work on "DX on x86", re-add the texture units - plus possibly some Tensor-like units pilfered from Nervana for dat Deep Learning market - and you have a GPU ready to go. Perf/watt was a problem for Larrabee, but it was also stuck back on 45nm using a chip from 2005.
    - Pimp out Nervana into a GPU by generalising the FP units (Flexpoint will require some wrangling to play nice with most shaders).
    - Smash both together, AKA the previously announced Knights Crest slated for release in 2020 (announced in 2016).

    ::EDIT:: Tom Forsyth is now moving back to Intel to work under Raja Koduri. Yes, [url=http://tomforsyth1000.github.io/blog.wiki.html#%5B%5BWhy%2520didn%2527t%2520Larrabee%2520fail%253F%5D%5D][i]that[/i] Tom Forsyth[/url]. The "Ghost of Larrabee" option is looking up...

      • tipoo
      • 1 year ago

      Hey Intel, this seems like time to remind you you still own the rights to Project Offset…

      https://www.youtube.com/watch?v=TWNokSt_DjA

    • Krogoth
    • 1 year ago

    This is the beginning of the end for mainstream discrete GPUs. This “discrete” unit is just a prototype for Intel’s next generation iGPU platform on their SoCs. Remember how i740 was really just Intel’s way of pushing AGP down the OEM channels?

    AMD RTG's Navi is all about going back to AMD RTG/ATI's roots: the OEM market. ATI was huge in the OEM scene before the Radeon 9xxx days via its budget and low-end discrete video cards. It is where they made the bulk of their revenue.

    Intel isn't going to allow AMD to outdo them in making more capable iGPU solutions. We will end up having iGPUs from both camps that can handle the needs of the majority of users/gamers (1080p, medium/high fidelity at 60-85FPS, 4K video playback).

    The reason for getting a discrete GPU evaporates for the masses, and demand destruction is inevitable. Nvidia has no x86 license, so they cannot make an SoC solution to directly compete with Intel and AMD RTG in this arena. That's precisely why they have been trying to move away from discrete cards as their bread and butter ever since Fermi. Remember that the bulk of the revenue in the discrete market comes from mainstream SKUs via sheer volume, not high-end SKUs.

    The 2020s are going to be the decade when discrete GPUs meet the same fate as discrete audio cards did back in the 2000s. Low-end, bargain-basement discrete cards (2D only with basic 3D function) are pretty much dead; they got killed when SoCs with iGPUs became ubiquitous. The next generation of iGPUs will eat away at the mainstream/mid-range stuff, which leaves only the high-end portion intact. The high end will continue to endure for hobbyists, videophiles and professionals.

      • chuckula
      • 1 year ago

      As long as the CPU in a standard desktop system that attracts the most attention on TR is a [s]twelve core Ryzen 2 -- oops I'm letting the cat out of the bag there![/s] eight core Ryzen, then there's going to be a healthy GPU market and I don't just mean ultra high-end gaming GPUs either.

      The Future is Fission.

      • Chrispy_
      • 1 year ago

      Nah, you’re missing the market divide.

      There are people who don’t care about 3D performance and want the money they pay for a CPU to be allocated to the CPU cores first and have enough IGP to meet their minimal software requirements.

      The other demographic is people spending 100-200% more on their GPU than their CPU, who want gigabytes of low-latency dedicated VRAM, are prepared to throw 200-400% more power at the GPU than the CPU, and who build their entire PC around the physical and environmental requirements of a dual-slot, dual-fan PCIe card. That card absolutely MUST be separate from the CPU, because the relevant lifecycle of the GPU is barely half that of the CPU, and nobody wants to throw away a perfectly good CPU just because the IGP component is no longer adequate.

      You’re basically falling into that “Apple: What’s a computer?” fallacy that implies an iPad is enough computer for anyone, when that only applies to the least demanding, lowest-profit subset of the market.

        • Krogoth
        • 1 year ago

        You really don’t understand the consequences of miniaturization?

        The point is that the masses no longer need discrete GPUs for their needs, and they make up the lion's share of discrete GPU sales via sheer volume.

        We have already seen this happen to discrete audio cards and digital cameras. Discrete GPUs are the next victim.

          • blastdoor
          • 1 year ago

          Moore’s Law was still in effect when discrete audio cards died off.

          • DPete27
          • 1 year ago

          It sounds like you think (proper) IGPs haven’t already been on the market since, say, 2010-2011 (Sandy Bridge / Llano) and haven’t already carved their market niche away from discrete GPUs.

          Why would Intel want to sell discrete GPUs if it’s predicted that dGPUs will go the way of the dodo?

          You can only pack so much into an economically sized processor die. For this reason alone, IGPs will never offer the same level of performance that a dedicated graphics card/chip can. Thermals are another reason. The upgrade cycle for gaming is another; until the GPU lifecycle slows down to match the CPU lifecycle, it's going to be more economical to have the two components separate.

          I’m not saying that the future improvements of IGP performance won’t continually chip away at the bottom line for dGPUs. Just look at how many people on Steam are gaming on an Intel IGP. They’re already pretty competent at 720p-1080p on less demanding games. I just don’t see an IGP running [insert demanding AAA game here] at 1440p, 4k, or 8k anytime soon.

          Last point, a sad one, PC gaming is a dying industry. Mobile and console gaming eats away at PC gaming marketshare every year. I was hopeful with the Steambox movement that we might see a resurgence of PC gaming, but until devs remove the artificial limitation of “no split-screen multiplayer on PC games” the ship will continue to sink. I could go on, but I’ll stop there.

          • Chrispy_
          • 1 year ago

          “The point is that the masses no longer need discrete GPUs for their needs and they make up the lion’s share of discrete GPU via sheer volume.”

          Pretty sure that the masses use laptops with Intel graphics. Gaming PCs and Gaming desktops carry such a premium and are so obviously marketed as “gaming” these days that you would have to be both illiterate and mentally disabled to buy a gaming product by accident.

        • blastdoor
        • 1 year ago

        Here’s a thought…. maybe there’s a sense in which you’re both right.

        Maybe Intel sells a CPU with an iGPU but also a GPU with an iCPU.

        Mind blown????

      • ludi
      • 1 year ago

      Eh? Intel already killed 80% of the dGPU market, they don’t have to launch a new dGPU line to continue that trend. If they’re investing in a dGPU product line at this stage it’s because they see a lucrative market for that remaining 20%. See also: Any Nvidia earnings report.

      As Chrispy just said, there is a small but high-value market for CAD/CAE and high-end parallelization tasks where a higher-end dGPU is the right tool for the job, and its lifecycle and power/cooling requirements are not compatible with CPU requirements.

      And yeah, we all would fully expect Intel to take the lessons learned from each dGPU development cycle and apply them to future iGPU iterations. But the idea that they’re just preparing for a wave of miniaturization that wipes out the entire dGPU market is silly — have you noticed that miniaturization ain’t what it used to be?

        • Krogoth
        • 1 year ago

        It is the mainstream units that are going away. Discrete GPUs are going to become niches catering to hobbyists and professionals, just like how digital cameras and discrete audio exist today.

      • freebird
      • 1 year ago

      I think you're right in the sense that mainstream discrete GPUs will unfortunately join other tech in the dustbin of history… but I think it will more likely be due to Amazon or MS setting up cloud GPU-assisted gaming. Who knows, maybe a future Xbox or PS(X) will have a limited GPU with cache and just the duty of drawing the image to the display, with all the grunt work done in a cloud/GPU farm. Latency is still an issue with that model, but simplicity and "good enuf" will win the mainstream. I don't look forward to it.

      • Redocbew
      • 1 year ago

      Krogoth 5 years ago: GPUs are doomed!

      Krogoth today: The end is nigh! GPUs are doomed! Doomed I say!

      GPUs just aren’t that impressive, I guess.

        • Krogoth
        • 1 year ago

        5 years ago, low-end and bargain-basement discrete units (2D output/basic 3D) were being destroyed by the emergence of basic SoC solutions from AMD and Intel. The only units that are still being produced today are overpriced professional-tier equipment (for extra heads).

        Going into the 2020s, we are going to see the same story happening to mid-range/mainstream SKUs.

        It doesn't take exceptional foresight to see this. Nvidia had seen the writing on the wall years ago. AMD RTG gave up the high-end consumer-tier discrete GPU market when it became clear they would never break Nvidia's mindshare in the near future and that capital in discrete GPUs is going to shift towards iGPUs. Intel is the 800-pound gorilla that will drive the demand destruction.

        Discrete GPUs will end-up only existing in niches.

          • Redocbew
          • 1 year ago

          I don’t expect to change your mind dude. I’m just curious for how many years you’ll be saying “this is the end, my friend” before you start to wonder.

            • moose17145
            • 1 year ago

            Kinda have to agree with Redocbew here… I have been visiting this site since they still had a dash in the name… and for a LONG time now, you have been saying how the end is near for one thing or another. That and stating how you are not impressed with anything…

            • DPete27
            • 1 year ago

            Hence the verb: “Krogothed”

      • frenchy2k1
      • 1 year ago

      You are entirely misreading both history and future trends.
      Intel did not create the i740 as a precursor to their integrated graphics. They took an honest shot at a discrete GPU and bombed spectacularly. Later, they had free silicon in their chipsets (chipset size was determined by the number of pins needed) and engineers used it for graphics, reusing the cores they already had.

      Intel's new foray into discrete GPUs is all about competing with Nvidia for the parallel-processing base, usable for deep learning and other processing.

      As stated, look at any nvidia report in the past 2 years. Discrete graphics are very lucrative and, even better, the same architecture can be reused for datacenter, workstation and parallel computing, products with even better margins.

      So, in a word, they are chasing after nvidia’s profits.

      The problem is that this is only the latest effort by Intel in that space. Larrabee was supposed to be a graphics chip too, and that did not end up as expected.
      We'll see how they fare this time…

        • Ninjitsu
        • 1 year ago

        Was about to say. Nvidia has been pumping out record earnings quarter after quarter. iirc even AMD saw profits in their discrete GPU business.

        The PC gaming market is seeing a resurgence (and given the way E3 2018 went, I think that will continue), and even the overall PC market decline has slowed recently.

    • ptsant
    • 1 year ago

    The guy wasn't particularly successful at AMD, so I don't see how he can pull it off at Intel, which has much less experience. I don't expect true competition for gaming. Probably a compute part for the lucrative AI/HPC markets, etc.

      • NoOne ButMe
      • 1 year ago

      Raja was shot in both feet.

      First came the cancellation of GPU projects under Rory Read, followed by many engineering resources being pulled away to focus on semi-custom.

      Both of these moves made sense for AMD as a whole company, but they did big damage to the GPU department.

      Hard to say if Raja did a good or bad job with the hand he was given. But given that his time at Apple was good, I would say he should do well at Intel. He won't be resource-starved like he was at AMD.

        • drwho
        • 1 year ago

        Wrong analogy... he would shoot *himself* in the feet, but others would probably hamstring him or tie his hands behind his back.

    • DavidC1
    • 1 year ago

    Not excited. Their last big thing was the Iris Pro venture, a product that kept all the disadvantages of both the discrete GPU and the iGPU. Three or four laptop models used the first Iris Pro, one used the second, and only Intel used the third, in their own NUCs.

    Considering their financial and R&D might, their results are very often mediocre. I'll reserve judgment for when it's OUT.

    • CScottG
    • 1 year ago

    -Please say it will support FreeSync (2, 3, etc.).

      • Klimax
      • 1 year ago

      Most likely. IIRC Intel either already supports or announced support for it. (And no access to G-Sync)

    • DancinJack
    • 1 year ago

    I’m actually pretty pumped for this. I am completely content with my GTX 1080, but I definitely want to see what Intel comes up with.

    • NoOne ButMe
    • 1 year ago

    Probably relatively high end offshoot of a Deep Learning part.

    Full line 2021, maybe not until 2022.

    My guess.

      • designerfx
      • 1 year ago

      That, and ray tracing. Guaranteed "look at our ray tracing!"

        • blastdoor
        • 1 year ago

        Yup…. seems like that almost has to be right.

    • leor
    • 1 year ago

    Ok so I know this isn't exactly appropriate, but does he look like an Indian version of the KFC Colonel to anyone else? 😛

    Makes me think of this, but add Intel: https://youtu.be/22SQnsmVOyI

      • bjm
      • 1 year ago

      LOL! He does.

    • bthylafh
    • 1 year ago

    Woo, the thing I’ve never been waiting for ever since my old i740-based card died.

      • Klimax
      • 1 year ago

      I got two Still Alive…

    • chuckula
    • 1 year ago

    Thanks Raj!

    Although if Intel actually gets these things out by 2020 and doesn't pull a Cannon Lake, then they clearly were already partway through the design process when he arrived. But having somebody with the force of will to get a product on the market is still huge.

    • not@home
    • 1 year ago

    And each Intel GPU will come with 2 years of driver updates for FREE!

      • chuckula
      • 1 year ago

      Two whole years?!?!?

        • Neutronbeam
        • 1 year ago

        1 nanometer GPUs CONFIRMED. (well, YOU weren’t saying it–yet)

          • chuckula
          • 1 year ago

          Metamaterials with negative refractive index values exist!
          Negative nanometers CONFIRMED!

            • Zizy
            • 1 year ago

            Meta-transistors? 😀

        • Goty
        • 1 year ago

        Yeah, but it’s actually just one update for that whole period.

          • chuckula
          • 1 year ago

          That’s what I call stability!

            • Goty
            • 1 year ago

            No new bugs! Chalk one up for QA!

      • Amiga500+
      • 1 year ago

      Ahh, but by the end of those two years will it be able to actually display AAA games without spurious on-screen artifacts?

      They might need 3 years of driver updates to get to that point… 😀

      More “srlsy” though – another competitor is never a bad thing. So it’d be hypocritical of me not to wish them well – even if they are a shower of dishonest ___ts.

        • RAGEPRO
        • 1 year ago

        You're getting downvotes here but you're completely right. While the IGP in my Core i7-8700K [i]can[/i] run a lot of technically-advanced games, almost none of them look right. Warframe in particular artifacts like crazy; it's an issue it doesn't have on the [url=https://techreport.com/review/33691/building-a-basic-gaming-pc-with-amd-ryzen-3-2200g]Ryzen 3 2200G[/url].

      • NoOne ButMe
      • 1 year ago

      Fine print:
      Please note, driver support for games will only apply to games with at least one million active users per month.
      Drivers are not promised to be updated until after 25 months from launch.

      • tipoo
      • 1 year ago

      Yeah that’s a big thing that will have to be seen, even if day 1 performance is good. They’re no longer updating Haswell GPU drivers, while GPUs much older than that do get driver releases.

      • ptsant
      • 1 year ago

      Will it also require a socket change every generation, like the CPUs?

    • MOSFET
    • 1 year ago

    Does he still gaze lovingly at his Ryzen system and its pretty LED ring?

    • Chaserx
    • 1 year ago

    Competition is a good thing and for now, Nvidia has very little.

      • Waco
      • 1 year ago

      You must be on an odd planet. AMD competes in every segment except for the ultra-high end.

        • Srsly_Bro
        • 1 year ago

        GTX 1050 Ti > RX 460
        GTX 1060 6GB > RX 580
        GTX 1070/Ti > Vega 56
        GTX 1080 > Vega 64
        GTX 1080 Ti and up… Lonely on top of the mountain.

        You must be in an odd alternate reality, bro.

          • Waco
          • 1 year ago

          Yeah, minus the part where you’re not arguing the same point.

          • synthtel2
          • 1 year ago

          Using up-to-the-hour pricing from Newegg (I’m in the market for one of these myself) and TR’s most recent review (1070 Ti):

          RX 580 – slightly faster than 1060 6G, same low-end pricing +/- $10
          Vega 56 – halfway between 1070 and 1070 Ti perf, making it ~$30 overpriced
          Vega 64 – halfway between 1070 Ti and 1080 perf, making it ~$90 overpriced

          Data is sparser for the RX 560 matchup, but it looks like it’s halfway between the 1050 and 1050 Ti in both perf and price.
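
          (For the curious, a rough sketch of how that "overpriced by" math works; the prices below are hypothetical placeholders, not the actual Newegg listings I pulled:)

          [code]
          # Rough sketch of the "overpriced relative to performance position" estimate.
          # Prices are hypothetical placeholders, NOT actual Newegg listings.
          def fair_price(lo_price, hi_price, position=0.5):
              """Interpolate a performance-justified price between two competing cards."""
              return lo_price + position * (hi_price - lo_price)

          gtx_1070, gtx_1070_ti = 400.0, 450.0   # placeholder street prices (USD)
          vega_56_street = 455.0                 # placeholder street price (USD)

          # Vega 56 sits roughly halfway between the 1070 and 1070 Ti in performance,
          # so its performance-justified price is the midpoint of their prices:
          justified = fair_price(gtx_1070, gtx_1070_ti)
          print(f"Overpriced by about ${vega_56_street - justified:.0f}")
          [/code]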

            • Waco
            • 1 year ago

            It’s great that prices are coming down – it’d be wonderful if cards for sale at MSRP weren’t something to be excited about.

            • Pancake
            • 1 year ago

            If power consumption matters to you I’d stay well away from the RX580 which draws 93W more when gaming (as per this here august website’s review) compared to the 1060.

            • synthtel2
            • 1 year ago

            That’s Nvidia’s real advantage, yes, but it isn’t going to sway me. There’s more to a card than perf/power/price, and AFAICT (having just traded my GTX 960 for an RX 460) AMD has a very solid win on the rest of it.

            • Waco
            • 1 year ago

            If high-wattage power supplies ever go away for cheap I might start caring. An extra 100-200 watts just doesn’t matter in most builds.

            • DPete27
            • 1 year ago

            True at stock, but AMD WattMan is a shining star compared to Nvidia’s lack thereof.
            My RX480 runs at 1305MHz and with my undervolt, uses almost the same amount of power as a GTX 1060.

          • Srsly_Bro
          • 1 year ago

          I got down-voted for reality. Nvidia has a lower cost of ownership with its significantly lower power consumption, and higher resale value due to the Nvidia name. The performance is also higher than similarly priced AMD options.

          Why would you get AMD for reasons other than irrational brand loyalty? Your El cheapo freestink monitor doesn’t count as a reason.

          AMD is not competing; it is only showing up to the party late, overweight, and unattractive to most.

          Until AMD hits the gym and gets a haircut and a wardrobe change, it's not looking good.

          I hope you dorks get it this time.

          Yours truly,
          Srsly_bro

            • Chrispy_
            • 1 year ago

            LOL, keep digging.

            You don't like VRR, and you agree that it's worth suffering the extremely limited selection of overpriced G-Sync monitors? Wake up and come join the rest of us in 2018. Even TVs have FreeSync now, and once you get used to it you won't want to go back to tearing and/or stuttering framerates.

            • Srsly_Bro
            • 1 year ago

            If AMD had a standardized refresh rate range, I'd be fine with it. The range is all over the place, and I posted AMD's database of VRR monitors in a previous post. You know what you're getting with G-Sync, and you don't need to hunt for the VRR range or use inferior GPUs.

            Seems like an easy choice imo.

            • DPete27
            • 1 year ago

            Statement: “If you buy a Ferrari, you know you’re getting a fast car”
            Counter: The Chevy Corvette ZR1 isn’t fast?

            • Srsly_Bro
            • 1 year ago

            Have you always struggled with comprehension and reasoning?

            Some VRR monitors have a range similar to G-Sync. Many don't. Go look at AMD's own monitor list.

            You know what you get with a ZR1.

            If you were to buy a car with 10 engine options and it wasn't made known which engine you were getting until you performed substantially more research than the competition required, your comparison would be more valid.

            If the vehicle with 10 engine options was called “car A” and people were always talking about how great car A is, you’d have no idea what engine the person had without investigating.

            I’m really concerned with how you worked out that initial comparison in your head and decided it was valid and relevant to the discussion. Public education and genetics have not been kind to you.

            G-Sync has one range.

            AMD's VRR implementation has many, many different ranges across monitors.

            Don't come at me with your casual knowledge and middle-school reasoning abilities.

            • DPete27
            • 1 year ago

            Haha. My initial comparison didn’t take nearly as long to formulate as your derogatory response.

            I was staying at the high level of car manufacturer (Chevy vs Ferrari) but it’s funny you mention 10 of “Car A” because there may be close to 10 different variations of the Corvette (not gonna count). But regardless of what moniker they put after the name (ZR1, C6, Z71, etc etc etc) I can squander a guess what the feature hierarchy is simply by looking at price. Same thing for FreeSync monitors (to a degree). I don’t expect my brother’s $200 27″ FreeSync monitor to outshine my $450 one.

            The problem with FreeSync isn’t the variety of VRR ranges, it’s the poor marketing. I completely agree with you that it should be required to clearly present the lower and upper limits of refresh ranges on ALL VRR monitors’ product pages (FreeSync and GSync alike). GSync monitors aren’t immune to this either. Most of them show the same specs as FreeSync monitors on their product pages: max refresh rate.

            • Shobai
            • 1 year ago

            [quote<]Public education and genetics have not been kind to you.[/quote<] You need to tone it down bro, srsly.

            • Chrispy_
            • 1 year ago

            It.
            Doesn’t.
            Matter.

            Freesync adds practically nothing to the cost of the monitor and is available in a huge range of sizes, panel types, form-factors, aspect ratios, and price brackets.

            Even the very worst-case scenario (48-60Hz Freesync range) is – for the grand old cost of NOTHING – significantly better than fixed refresh. But monitors with that limited range are few and far between.

            What you'll see more often is 40-75Hz (LG, and other common IPS panel options), 48-75Hz (AUO VA) or 48-144Hz (Samsung). Not only can most of the IPS panels be run at 30-80Hz using free utilities, but some of the cheapest options on the market are the no-name Korean imports and Monoprice/Qnix rebrands that will happily run at 90Hz+ and cost less than your garbage Acer/Benq office monitor.

            • Srsly_Bro
            • 1 year ago

            Or just get a non-trash option and have 30 to 144Hz support on all G-Sync displays.

            Do you refuse the steak and order a can of Chef Boyardee at a restaurant?

            Btw, I have a 2018 Dell 27″ 1080p 75Hz FreeSync for watching videos next to my S2716fg for gaming.

            • Redocbew
            • 1 year ago

            Dude, really. If you want to call someone out for arguing an unsupported position that’s fine, but where’s your research to support the value judgement you’re making here? It shouldn’t be too difficult to find clear and unambiguous evidence if the difference between these things is really so stark as you make it out to be.

            • travbrad
            • 1 year ago

            You’re seriously overestimating the extra cost of ownership from power consumption considering most people’s GPUs spend most of the time idling or completely turned off. I agree AMD’s cards aren’t currently at particularly competitive price points though, especially the Vega-based cards.

            As someone who bought a graphics card before the whole crypto mining craze, there aren’t any amazing values in graphics cards right now from any company. Nvidia is just slightly less ridiculous. Yay?

            • Chrispy_
            • 1 year ago

            People get upset about the difference between a 180W GPU and a 210W GPU.

            My super-efficient, A+ energy rated Bosch electric fan oven just spent 35 minutes cooking dinner for one lousy meal. At 1.2 kWh, that thing used more electricity to cook our lasagne than the energy consumption difference between a 1070Ti and Vega56 for 23.3 hours at peak load.

            I'm lucky if I have time to put in 23.3 hours of gaming [i]a month[/i]. So, your average AMD vs Nvidia power consumption argument boils down to this painfully obvious fact:

            [b]An AMD card costs less to run each month than a lasagna.[/b]

            Is that clear? Good.
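
            (The arithmetic behind that, for anyone who wants to check it; the wattage gap here is implied by the oven figure rather than quoted directly from the review:)

            [code]
            # Back-of-envelope check of the lasagna comparison, using the figures above.
            oven_energy_kwh = 1.2     # one 35-minute lasagna bake
            gaming_hours = 23.3       # peak-load gaming hours claimed to match that energy

            # Implied power-draw gap between the two cards (derived, not a measured number):
            gap_watts = oven_energy_kwh * 1000 / gaming_hours
            print(f"Implied GPU power gap: {gap_watts:.0f} W")  # ~51 W
            [/code]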

            • Srsly_Bro
            • 1 year ago

            It also costs less than a tank of gas, a gym membership, and running shoes. What's your point? AMD still draws substantially more power.

            Enough of the red herring. My belly is full. I can’t take anymore.

            • Chrispy_
            • 1 year ago

            Uh, my point is that you exceed the monthly power consumption difference between AMD and Nvidia GPUs with utterly trivial, yet essential, things multiple times a day.

            Did you shower today? Good. Also, depending on whether you showered for 5 minutes or 10 minutes, that’s just covered the extra cost of running AMD graphics cards for the next 6 weeks to 3 months.

            Your examples cite things that are regular purchases costing significant amounts of money on a monthly basis. Let's just call out your first example: the average US household spends [url=https://www.eia.gov/todayinenergy/detail.php?id=33232]~$1997 a year on gasoline[/url] for 2.5 people, and a GPU can be expected to have a lifecycle of around 2.5 years. So, in 2.5 years an AMD card might cost you $20 extra in power consumption, but [i]gasoline[/i] (your example) costs 100x that, PER PERSON. You're out by [b]two orders of magnitude[/b]. Srsly, bro?

            The next thing I'm expecting you to say is that you have a 600" willy.
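
            (Stepping back to the numbers for a second, here's that two-orders-of-magnitude arithmetic spelled out, using the EIA figure plus the 2.5-year lifecycle and ~$20 power premium assumed above:)

            [code]
            # Gasoline spending vs. the GPU power premium, per person, over one GPU lifecycle.
            household_gasoline_per_year = 1997.0   # USD (EIA figure, ~2.5 people per household)
            people_per_household = 2.5
            gpu_lifecycle_years = 2.5
            gpu_power_premium = 20.0               # USD of extra electricity assumed above

            gasoline_per_person = household_gasoline_per_year / people_per_household * gpu_lifecycle_years
            print(f"Gasoline per person over one GPU lifecycle: ${gasoline_per_person:.0f}")          # ~$2000
            print(f"Ratio to the GPU power premium: {gasoline_per_person / gpu_power_premium:.0f}x")  # ~100x
            [/code]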

            • JustAnEngineer
            • 1 year ago

            [quote="Chrispy_"]An AMD card costs less to run each month than a lasagna.[/quote]

            https://garfield.com/comic?keywords=lasagna

          • Sahrin
          • 1 year ago

          Vega 64 is faster than 1080. Vega 56 is faster than 1070.

        • Pancake
        • 1 year ago

        Sure, AMD may be “competing” but losing massively in every segment is not the definition of “competitiveness” as evidenced by the latest figures in the world-renowned and highly respected Steam Hardware Survey.

        1.79%

          • Waco
          • 1 year ago

          At similar performance points, AMD has cards for sale at similar prices. I’m not sure how else to analyze this.

            • Pancake
            • 1 year ago

            Competition = two or more competitors seeing who can win at a certain metric.

            For gaming that would be NVidia by a country mile (1.79%).

            For total units sold that would be NVidia by a country mile.

            For sheer massive profits in the bank as a war chest for R&D that would be NVidia by a country mile.

            • Waco
            • 1 year ago

            Market penetration has nothing to do with my original statement nor any follow-ups.

            • Pancake
            • 1 year ago

            It sure does. Let’s assume AMD and NVidia cards are exactly the same price, performance and power consumption in each segment. IF AMD were competitive their sales would be – all things being equal – in the ballpark of NVidia. But they’re not in the ballpark. Not even the same game. NVidia are thrashing AMD senseless. Total domination.

            1.79%

            What would suck even more for AMD is that in order to be “performance competitive” they have to build a significantly more complex product eg RX580 having a 256-bit memory bus vs 192-bit for 1060. So, to be even in the game they have to spend a lot more money and make a lot less profit than NVidia. And their R&D costs are amortised over a FAR smaller number of units sold. Oh, ho, ho ho. That gonna hurt bad.

            • Waco
            • 1 year ago

            You have a very different definition of “competitive” than the rest of the world. :shrug:

          • synthtel2
          • 1 year ago

          Latest says it’s 1.86%, or 4.77% of the 14/16nm dGPU market (except it’s surely even more than that due to stuff in the “other” category, as we discussed last time).

          Trying to make it look like AMD is losing 55-to-1 didn't get any less stupid since we last went over this, but if you're going to make a habit of it you could at least keep your numbers up to date, so we can have a nice tracker of AMD's progress in these comments. If you're doing that, you should probably also switch to the 4.77% number, since 14nm dGPU adoption will clearly keep growing over time and make your narrative look ill-formed. 😉

            • Pancake
            • 1 year ago

            Thanks for the update! What hasn’t changed is that the situation is absolutely dire for AMD in the PC gaming space! You could say they are absolutely “uncompetitive”.

        • jihadjoe
        • 1 year ago

        For GPUs though they’re competitive only because they marked down their prices.

        Vega 56/64 is a bigger die than 1080ti and makes use of very pricey HBM2, yet they have to sell it for 1070/1080 prices because that’s where the performance is at unless you’re mining crypto.

        It’s ok for the consumer, but AMD can’t really push Nvidia because their margins are slim as it is. This leaves Nvidia able to dictate prices, and AMD has to slot in their products in the brackets they are competitive in.

          • Waco
          • 1 year ago

          Sure, but we can buy cost-competitive GPUs from AMD at each segment except the top tier. That’s a win for us, assuming AMD doesn’t go out of business.

        • ronch
        • 1 year ago

        Their power efficiency is killing them, though. Even as an AMD fan, I simply have trouble recommending AMD these days. GCN is simply in need of a real successor, similar to what Ryzen did for AMD's CPU division. I mean, yeah, Vega is at the same level as the 1080 and 1070 Ti, but oh man, that power draw is excruciatingly atrocious! And of course that inefficiency carries over to the rest of the stack. Bottom line: AMD is fine if you don't care about energy efficiency.

          • Waco
          • 1 year ago

          I’m honestly having a hard time caring about power efficiency here. 100 watts is basically nothing – if people cared so deeply about power efficiency nobody would ever overclock.

            • ronch
            • 1 year ago

            If two products performed similarly and cost the same but one uses far more energy, like 60% more, than the other, which would you choose?

            • Waco
            • 1 year ago

            It depends on which company pissed me off most recently. 😛

      • Kretschmer
      • 1 year ago

      Yeah, ever since AMD switched from selling gaming GPUs to selling mining GPUs Nvidia has been lonely out there…

        • chuckula
        • 1 year ago

        In honor of North Korea being in the news I give you [url=https://www.youtube.com/watch?v=UEaKX9YYHiQ]THIS![/url]

      • Unknown-Error
      • 1 year ago

      Why is the above reasonable statement getting down-voted?

        • Srsly_Bro
        • 1 year ago

        Welcome to tech report.

        It’s likely someone was offended at one point by something you said and will down-vote anything you say in the future, no matter the content.

        Passive aggressiveness is at an all-time high on this site.

        It’s possible the reality we live in is not the one that person lives in.

        It’s also possible the person is just an idiot and there is nothing anyone can do to help.

        Feel free to provide your own explanations.

        • Pancake
        • 1 year ago

        There seem to be 3 types of opinions with respect to graphics card manufacturers on this website:

        Neutral, objective non-partisan voices such as myself.

        Relaxed and comfortable NVidia fans just very happy with the state of things and the way games are meant to be played. Probably pretty happy with their electricity bills too.

        Raging, angry AMD fans railing against the perceived injustices to their team. And they will mash that downvote button furiously to “stick it to the man”.

          • Waco
          • 1 year ago

          If I was a betting man I’d bet you get down voted for your attitude more than anything else. :shrug:

          • Srsly_Bro
          • 1 year ago

          Last observation upset some people.

        • DoomGuy64
        • 1 year ago

        Because it’s obvious trolling, and people are tired of it.

        Somebody mentioned 3 types of opinions. Here’s my take:

        Neutral: Rare. They get hit from trolls on both sides. Most have had enough of this fanboy trolling that has been left unchecked for far too long, and will downvote obvious troll posts.

        Nvidia: "Relaxed and comfortable Nvidia fans"? No. Nvidia fans are the most prolific and toxic fanboys on the internet, and they see no problem with gameworks, walled gardens, ridiculously gimped planned-obsolescence midrange or $2000 high-end consumer graphics cards. They make sport of trolling everyone else on the internet. However, the monopoly is starting to break up due to Nvidia eating its own base with bad business practices, and the toxic community is off-putting to the open-minded part of the community. It's kind of like ISIS on the internet. The only people not being constantly terrorized are their own members, which even the moderates find problematic, since they have to deal with it as well.

        AMD: Tired of Nvidia screwing up games with gameworks, toxic fanboys, vendor lock-in, and ridiculous pricing. Other than that, very open minded. There also are trolls posing as AMD fans, usually with such memorable quotes like, “I’m an AMD fan, but…”

        Either way, the era of Nvidia fanboys getting away with ridiculous levels of nonstop trolling is over, as even mainstream Nvidia fans are getting tired of it.

        My suggestion: Knock it off, and be more open minded, because anyone stuck in this tribal mindset will only hurt themselves by making decisions based on propaganda, which ends up causing a Stockholm response, especially when your video card cost more than it should. IMO, Nvidia’s price gouging is especially toxic with this mentality, because nobody wants to admit they were ripped off, so they viciously defend their preferences. That’s not healthy or sustainable behavior, and it needs to stop.

        https://youtu.be/ZyAOtQOu2YM?t=421
        https://youtu.be/EDCeLng44l0?t=491

          • Waco
          • 1 year ago

          I haven’t bought an AMD card (or hell, CPU) in quite a long time and even I’m tired of the anti-AMD FUD that gets spread by “knowledgeable” posters on this site. Their attitudes when called out just make it worse.

          • chuckula
          • 1 year ago

          [quote]My suggestion: Knock it off, and be more open minded, because anyone stuck in this tribal mindset will only hurt themselves by making decisions based on propaganda, which ends up causing a Stockholm response, especially when your video card cost more than it should. IMO, Nvidia's price gouging is especially toxic with this mentality, because nobody wants to admit they were ripped off, so they viciously defend their preferences. That's not healthy or sustainable behavior, and it needs to stop.[/quote]

          All of that. Except back at you.

          And I'll use this post against you as you spread FUD about how bad it is that Intel is entering the market and providing that holy "competition" word that gets thrown out like a religious mantra whenever we yet again have to hear about why AMD is the only company that should ever be complimented at all.

            • DoomGuy64
            • 1 year ago

            What? You’re the only one who constantly posts FUD about how Intel’s AVX 512 is the bee’s knees. GTFO with your hypocrisy. I don’t bring that stuff up either. You start it, and if anything I might write a reply to your instigation if I feel like it.

            I don’t care about that stuff. My post was solely about the trolls who are toxifying the community with their fanboyism.

            OOOH GSYNC IS BETTER WITH NO FACTUAL EVIDENCE.
            OOOH AMD HAS NO GOOD VIDEO CARDS BECAUSE THEY DON’T HAVE A TI COMPETITOR.
            OOOH I SUDDENLY CARE THAT MY SLI RIG IS POWER EFFICIENT. [spoiler]of course it's not 1060 SLI, because Nvidia disabled SLI on their reasonably priced cards.[/spoiler]

            All of those arguments are trolls, and claiming the Nvidia shills aren't constantly trolling is denying reality. Chaserx was clearly trolling with his post, and it's ridiculous that anyone would question why that statement is considered trolling. If you can't recognize it, then you've been drinking too much of the kool-aid.

      • bjm
      • 1 year ago

      This is very true, I don’t understand the down votes. AMD is only ahead when it comes to convenient mainlined Linux drivers. For everything else, Nvidia is ahead.

        • synthtel2
        • 1 year ago

        Freesync is a big one, IMO the Windows drivers are a lot nicer too (and allow a lot more tweaking to equalize the power efficiency gap, as a lot of the gap is just conservative tuning), and my recent experience is that AMD is far ahead in 99.9% frametimes on DX9 games and games that stay too under the radar to get much optimization help from the GPU vendors.

        On the Linux side:
        * Nvidia's got a raw performance advantage of a third or so on average (last I checked). AMDGPU and -PRO still have their inefficiencies.
        * AMD's got a very big latency advantage due to something messed up in the NV driver.
        * Native gaming is much more solid on NV, probably because they were the only serious performance option for so long (and everything therefore targets that driver) as much as anything else.
        * Gallium Nine is fantastic and AMD-only.
        * People dislike Nvidia+Linux because apparently the proprietary driver breaks a lot, but I've never once had that problem (on Arch).

        (I traded out my GTX 960 for an RX 460 a few weeks back. It’s theoretically a pretty substantial downgrade, but the only games I play that got seriously worse on the balance were either AAA or those that don’t get along well with AMDGPU, and going from native to Gallium Nine fixed most of the latter right back up. The biggest benefits were the Linux latency drop and that some UE3 games with incurable stutter problems in either OS are now 5x better.)

          • bjm
          • 1 year ago

          Ah, yes good point. It’s really too bad Nvidia isn’t picking it up, especially with TVs getting Freesync.

      • Zizy
      • 1 year ago

      You are right, although for the meat of the market 580 is a pretty nice card. Faster than 1060, better drivers, freesync. It loses in power draw (and perf/W), but for a typical desktop PC that doesn’t matter much. Ignoring mining-inflated prices, it is generally a better purchase than the 1060 for a typical gaming PC.

      The big problem is that AMD doesn’t have anything else to compete. 560 is OK-ish only because of freesync, and Vega is barely acceptable even including freesync.

        • Pancake
        • 1 year ago

        Ever so marginally faster at 99% frame times but losing in average FPS. So, call it a wash.

        But using 93W extra to get there. That sort of obscene inefficiency should not be rewarded. Depending where you live that 93W may be multiplied by the need to remove the heat using air conditioning. It’s also pretty disgusting to not care about energy use given the challenges we have as a world with climate change.

          • Kretschmer
          • 1 year ago

          When Intel or NVidia burned energy with past inefficient designs, those products were flops.

          When AMD does it, it’s “value” or “pretty much the same.”

            • DoomGuy64
            • 1 year ago

            The only Intel and Nvidia products that got decently criticized were the P4 (offset with benchmark skewing), and FX.

            Fermi was tolerated by Nvidia fans because it was the only hardware capable of decently running dx11 games with tessellation.

            I have no issue with that, and neither should anyone else. If the card does what you want, at the price you wanted to pay, a few watts should not invoke some insane, knee jerk, end of the world reaction. It’s merely a slight disappointment that isn’t going to affect your gaming habits, because we don’t live in a 3rd world nation that rations electricity usage.

            Buy some solar panels for your house if you’re that bothered by it, because that’s all this argument amounts to. Guilting people’s purchases based on some arbitrary environmental mentality, which never stopped people from buying Hummers either.

          • Demetri
          • 1 year ago

          Is Maxwell considered terribly inefficient nowadays? AMD’s current stuff is right in line with those parts on power draw. I understand that makes AMD a generation behind in that regard, but I wouldn’t call it “disgusting”. Either way, I don’t see it putting much of a dent in most people’s power bill.

          • 223 Fan
          • 1 year ago

          No FreeSync == No Sale.

          • Goty
          • 1 year ago

          [quote]obscene inefficiency[/quote]

          Meaning that, assuming I'm pulling 93 more watts 100% of the time I'm gaming (not realistic) and I game 4 hours a day (probably fairly typical for a heavy gamer), I'm going to pay less than $15 more a year to game with a 580 at my current electrical cost of $0.108/kWh than if I used a 1060.

          Oh, [i][b]the horror![/b][/i]
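
          (The math, for anyone checking; this assumes the full 93 W gap applies for every gaming hour, which is the worst case:)

          [code]
          # Worst-case annual cost of a 93 W power gap at the usage and rate quoted above.
          extra_watts = 93.0
          hours_per_day = 4.0
          price_per_kwh = 0.108   # USD

          extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
          print(f"{extra_kwh_per_year:.0f} kWh/year -> ${extra_kwh_per_year * price_per_kwh:.2f}/year")  # ~136 kWh, ~$14.66
          [/code]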

            • Srsly_Bro
            • 1 year ago

            The point is percentage wise the difference is huge. If you can get similar performance for the same price and have lower power consumption, why wouldn’t you?

            Do you have a legitimate need for an AMD-only feature?

            • Chrispy_
            • 1 year ago

            You're assuming that vendors like Asus, Gigabyte, MSI et al don't overvolt the Nvidia GPUs for a few extra MHz too. Read a few reviews of non-reference GTX cards and it's amazing how much those factory overclocks hurt Nvidia's performance/watt.

            Founder’s edition cards are indeed much more efficient than AMD’s product stack, but Founder’s edition cards also carry such a huge premium that they’re irrelevant. Once you add 20% for the vendor overclock and voltage boost, the difference that’s already trivial diminishes further.

            • Srsly_Bro
            • 1 year ago

            And over volt AMD and it’s horrendous. What’s your point, bro?

            • Chrispy_
            • 1 year ago

            Uh, what's [i]your[/i] point, seriously bro. Vendors don't overvolt AMD GPUs because AMD do that already and there's very little headroom left.

            You seem to be utterly missing the point; nobody is saying that AMD GPUs are more efficient than Nvidia GPUs, so what are you trying to say?

            [i]I[/i] am simply stating that the downclocked, bone-stock Founder's Edition Nvidia cards aren't representative of the cards that most people actually end up with. My MSI GTX 980, for just one example, pulls an additional 220W from the wall over my Intel HD 4600 graphics. How is that possible if the GTX 980 only uses 165W? Well, it's because I'm not running a Founder's Edition/reference-clocked GTX 980. Not only is that slower than the typical GTX 980, it was also notoriously hard to buy and at least 20% more expensive, so almost nobody with any braincells would have bought one, other than for the explicit reason of trying to maximise performance/watt, which makes the GTX 980 seem like a particularly stupid decision in the first place.

            • synthtel2
            • 1 year ago

            I just went from Maxwell at 140W nominal (2900 GFLOPS) to Polaris at 79W nominal (2100 GFLOPS) and on average my gaming experiences got better, despite those generations theoretically having similar power efficiency. The problem would seem to be Nvidia relying too much on hand-tuning specific games, meaning anything too old or out-of-the-way to receive that tuning is likely to be stuttery.

            • Srsly_Bro
            • 1 year ago

            GFLOPS aren't the only important element. My old HD 7950 has great double-precision compute, but it doesn't play modern games well.

            There are other factors to also consider.

            • synthtel2
            • 1 year ago

            I literally just said GFLOPS aren’t the only important element.

            If you prefer more numbers, we’re talking about…

            AMD: 128-bit * 7300 MHz RAM, 896 SPs and 16 ROPs * 1160 MHz core
            NV: 128-bit * 7800 MHz RAM, 1024 SPs and 32 ROPs * 1442 MHz core

            (… as seen in real-world usage, not just spec-sheet stuff here.)

            Point was, by all rights this should have been a pretty significant downgrade, and it turned out to be that way in surprisingly few cases.

            • Goty
            • 1 year ago

            The point is that I don’t care what the percentage is given that the impact is inconsequential. I don’t appreciate NVIDIA’s penchant for pushing proprietary technologies and their history of anticompetitive and anti-consumer practices, so if AMD has a competitive part in the market I’m looking to make a purchase in, I’ll tend to go that way, and a few cents a month worth of extra power usage (given my use case) isn’t exactly a deal breaker.

            (Just for the record, and to keep this at least semi-OT, the same thing goes for Intel as it does for NVIDIA. If Intel's dGPUs come out and offer tangible benefits over a competing AMD product, I'll probably go with one of their products. If it's just, "Well, AMD is x% worse in [metric that doesn't affect my daily use]," then I'll probably buy the AMD product.)

          • Zizy
          • 1 year ago

          AMD GPUs are indeed crazy inefficient and very shitty from an engineering point of view (the lipstick would need to be heavier than the pig, and it still wouldn't mask it). But it doesn't matter at the end of the day (or month) for most people. The only thing people really dislike about hungry GPUs is their noise, but the 580 is still quiet enough.

          A simple "global warming" calculation:
          1l of petrol has about 9.2 kWh of energy (and it isn't the energy itself but the CO2 that is heating the world).
          100 hours of gaming therefore equals 1l of petrol.
          So, if you play 3h/day on average, all the bad feeling of having inefficient AMD GPUs can be wiped out by cycling or walking ~5km to work or the store once a month.
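
          (Roughly, assuming the ~93 W power gap discussed elsewhere in this thread:)

          [code]
          # How many gaming hours it takes the GPU power gap to burn one litre of petrol's worth of energy.
          petrol_kwh_per_litre = 9.2
          power_gap_watts = 93.0   # assumed gap, from the review figure cited upthread

          hours_per_litre = petrol_kwh_per_litre / (power_gap_watts / 1000)
          print(f"~{hours_per_litre:.0f} hours of gaming per litre of petrol")  # ~99 hours
          [/code]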

    • Ninjitsu
    • 1 year ago

    Well, the writing has been on the wall for a long time. Hopefully Intel manages to rein in Nvidia without squeezing AMD out of the market (and hopefully by 2020 AMD is in a far stronger position overall).

      • Klimax
      • 1 year ago

      That would depend on their execution of Navi.
