AMD claws back some discrete-GPU market share

At the Polaris launch in Macau earlier this year, spicy food aficionado and Radeon Technologies Group leader Raja Koduri noted that one of his goals with the new product family was to retake a significant share of the discrete GPU market. Now, AMD has passed along some numbers from the latest PC graphics market report from Mercury Research that suggest the RTG is succeeding in that mission, and they're relatively rosy.

Going by Mercury Research's information, AMD increased its share of total discrete GPU sales to 34.2% of the market (by unit volume) in the second quarter of 2016, an increase of 4.8 percentage points from the previous quarter. AMD's desktop add-in-board sales took the most ground, gaining 7.7 points over the previous quarter to make up 29.9% of desktop discrete GPU sales.

According to Mercury's numbers, that performance also marks the fourth consecutive quarter of market share gains for the red team, and AMD expects things to continue to improve with the recent launch of Polaris. Perhaps most interestingly, the note that AMD sent along with Mercury's findings also confirms that the company's next high-end graphics part, codenamed Vega, will launch in 2017 despite earlier rumors to the contrary. Cards built with that chip could help things along even more.

Comments closed
    • kamikaziechameleon
    • 3 years ago

    Quite curious when we'll see more peripheral reviews. TechReport kinda led the charge on actual quality vs. expense in PC peripherals. I love those articles, and I'm sure they aren't overly expensive to produce.

    Idea: DIY mech keyboard piece. I heard those are all the rage these days.

    • OneShotOneKill
    • 3 years ago

    This could be one of the few instances where the market re-balance came at the cost of customers.

    Usually a company gains market share by releasing a revolutionary product; this time around, the over-hype and marketing did the job.

    To those who bought the 480s… sorry, you've been had!

      • 223 Fan
      • 3 years ago

      No freesync == no sale. Time will tell whether I’ve been had.

        • OneShotOneKill
        • 3 years ago

        Freesync is alive and well on older gen cards… nothing revolutionary here.

    • sweatshopking
    • 3 years ago

    I bought the first Nvidia GPU I've had since the 9800 this week. No 480s in stock meant I couldn't buy one even if I wanted to, which I did.

      • Meadows
      • 3 years ago

      GTX 1060 for you too, or did you aim higher?

        • sweatshopking
        • 3 years ago

        1060

          • EzioAs
          • 3 years ago

          I’m planning on getting one too at the end of the month. I’ll probably get the MSI 1060 GT, because it’s significantly cheaper than other 1060s and the cooler is still pretty impressive.

            • sweatshopking
            • 3 years ago

            I purchased the Gigabyte Windforce. Was 418 CAD when all was said and done. Crazy expensive.

            • EzioAs
            • 3 years ago

            I know. Even the MSI 1060 GT costs about twice the price I paid for my GTX 460 six years ago.

            • Beelzebubba9
            • 3 years ago

            You poor Canadians!

            That said, Montreal is super cheap now for your unruly neighbors down south. 😀

      • 223 Fan
      • 3 years ago

      I just got an email from Newegg notifying me that the Sapphire Nitro+ is back in stock. AMD’s marketshare just got a little boost.

        • sweatshopking
        • 3 years ago

        470 or 480?

          • 223 Fan
          • 3 years ago

          SAPPHIRE NITRO+ Radeon RX 480 100406NT+8GOCL 8GB 256-Bit GDDR5 PCI Express 3.0 x16 HDCP Ready Video Card. Just in time for Civ VI and Deus Ex.

            • sweatshopking
            • 3 years ago

            Never listed with Newegg.ca. That was my first choice for a card, but nobody in Canada sells it.

            • 223 Fan
            • 3 years ago

            Bummer. Sorry to hear that.

    • Hameedo
    • 3 years ago

    This is actually not true. For publicity reasons, AMD always markets Mercury's numbers, which are not accurate and are based on investor calls. Jon Peddie, on the other hand, is far less forgiving toward them, and also far more accurate.

    For example, it showed a Q1 market share split of 22% AMD and 77% NVIDIA.
    [url<]http://jonpeddie.com/publications/add-in-board-report[/url<]

    Let's wait for Jon's numbers before we make any unneeded celebrations. We all know how AMD likes to play with words in its marketing efforts.

      • Jeff Kampman
      • 3 years ago

      AMD cites Peddie’s latest numbers in its release, as well, something I’m not sure they’d do if it ran counter to the message they’re trying to send here. Either way, saying it’s “not true” seems premature.

        • Hameedo
        • 3 years ago

        It is a lie as well. Here are Mercury's numbers for Q2: AMD 22.8%, NVIDIA 77.2%.
        Mercury's numbers are private; the public doesn't see them. A guy on reddit pulled the PDF results and posted them:

        [url<]https://www.reddit.com/r/Amd/comments/4z11bm/mercury_research_q2_gpu_shipments_for_q2_2016/[/url<]

          • sweatshopking
          • 3 years ago

          wtf reddit

        • Hameedo
        • 3 years ago

        Jon Peddie's numbers are up:
        Discrete desktop now stands at 70% NV, 30% AMD, likely influenced by NV clearing stocks.
        [url<]http://jonpeddie.com/publications/add-in-board-report[/url<]

    • Voldenuit
    • 3 years ago

    [quote<]According to Mercury's numbers, that performance also marks the fourth consecutive quarter of market share gains for the red team, and AMD expects things to continue to improve with the recent launch of Polaris.[/quote<]

    From the Steam hardware survey (JUL 2016), the past consecutive months of AMD GPU installed base (this includes IGPs and laptop parts):

    JAN: 26.2%
    FEB: 26.0%
    MAR: 25.5%
    APR: 25.4%
    MAY: 25.5%
    JUN: 25.1%
    JUL: 24.7%

    Most likely doesn't cover Polaris yet, but I don't see the market share gains translating in the same way to installed base, at least among the Steam user community.

      • tay
      • 3 years ago

      Don’t have access to Steam at work, but how does this compare to nvidia’s trajectory?

        • Voldenuit
        • 3 years ago

        NVDA:
        JAN: 54.8
        FEB: 55.8
        MAR: 56.4
        APR: 56.6
        MAY: 56.6
        JUN: 56.7
        JUL: 57.0

      • selfnoise
      • 3 years ago

      Is this pointing to miners vs. gamers? People seem quick to say “oh, the miners bought up all the 480s and that’s why we can’t buy them” but maybe this is some fire to that smoke.

        • Voldenuit
        • 3 years ago

        For bitcoin, at least, mining transitioned to ASICs and FPGAs a long time ago.

        From what I understand, there are still some fringe (and very speculative) minor currencies that are profitable to mine on GPUs, but they're hardly as common as the initial bitcoin craze.

        Possible explanations:
        1. People are replacing old amd cards with newer amd cards (esp GCN 1.0 cards that don’t support DX12 and Freesync).
        2. The Steam data does not capture 480 timeline since JUL data may predate 480 launch and/or availability.
        3. Data on desktop discrete parts is being drowned out by laptop SKUs (unlikely to be overwhelming, esp since mobile pascal is just out and mobile polaris isn’t available at retail yet).
        4. People are buying a second 2xx or 3xx card to crossfire now that they are cheap.

          • Pettytheft
          • 3 years ago

          The Ethereum mining community is snatching them up. The demand for them is legit. I resold my R9 290 for more than I was expecting on eBay when I recently upgraded.

          [url<]https://forum.ethereum.org/categories/mining[/url<]

          Nearly every post is about AMD cards. Which is cool; when the bottom falls out, I can pick up another one on the cheap.

            • Firestarter
            • 3 years ago

            you mean I could offload my HD7950 to a miner for good money?

            • Concupiscence
            • 3 years ago

            Their double precision speed’s excellent, so that could definitely move.

            • OneShotOneKill
            • 3 years ago

            I never did, and never will, understand the value mining provides.

            Use energy, create heat, and destroy the planet, all under the pretext of providing an alternative to state-backed currencies.

            WTF?!?!

            • techguy
            • 3 years ago

            Overdramatic, much? Destroy the planet? Cereal?

            • OneShotOneKill
            • 3 years ago

            I am willing to bet that running an array of 7990s at max 24/7 produces a significant amount of heat and consumes an obscene amount of power. Yes, "killing the planet" is over-dramatic, but there is some truth to the pointless nature of mining.

            Maybe someone can explain to me the benefit of an algorithmically generated finite number of "hashes" vs. a pre-populated universe of serial numbers that can be used as a currency.

            But then again back to the topic, gamers are not buying what AMD is selling.

            • Voldenuit
            • 3 years ago

            [quote<]Maybe someone can explain to me the benefit of an algorithmically generated finite number of "hashes" vs. a pre-populated universe of serial numbers that can be used as a currency.[/quote<]

            The ideal behind cryptocurrency was to have a "democratic" (note the scare quotes) currency not backed by any government(s) that could be used for commerce. The "value" of the currency would be derived from mathematical scarcity and not from how much green the Fed chooses to print.

            Of course, it quickly devolved into a gold rush and criminal misuse, but that's human nature for you.

            I could have said I saw that coming, but that'd be a lie, because I didn't even think people would be foolish enough to buy into yet another artificial currency (as all currencies are artificial to some extent). I guess I underestimated the amount of greed involved and how quickly the criminal element latched on to BTC.
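
            For the curious, that "mathematical scarcity" comes from proof-of-work: finding an input whose hash falls below a target is hard on average, and each extra bit of difficulty doubles the expected work. A minimal, illustrative Python sketch (not any real coin's protocol; the difficulty value is arbitrary):

            ```python
            import hashlib
            import itertools

            def mine(block_data: str, difficulty_bits: int) -> int:
                """Search for a nonce whose SHA-256 digest of block_data + nonce
                falls below a target; each extra difficulty bit halves the target
                and doubles the expected work, creating the scarcity."""
                target = 2 ** (256 - difficulty_bits)
                for nonce in itertools.count():
                    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
                    if int.from_bytes(digest, "big") < target:
                        return nonce

            print(mine("example block", 20))  # ~2^20 (about a million) hashes on average
            ```

            GPUs (and later FPGAs and ASICs) just grind through candidate nonces faster, which is exactly where the power draw objected to above comes from.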

            • OneShotOneKill
            • 3 years ago

            Thanks! Back to the barter system we go. I will trade you TWO AMD 480s for your 1080. Accept?

            How about 3?

            • Voldenuit
            • 3 years ago

            The one true unit of desirable currency should be “xx hours of Netflix and Chill”.

            Unfortunately subject to massive inflationary spikes when Stranger Things S2 or Luke Cage come out.

      • Platedslicer
      • 3 years ago

      I regret to say I’ll be joining that statistic soon… just waiting for my 1070 to be delivered.

      Ye olde 7970 just isn't cutting it anymore, and I want something that will provide smooth gameplay for a couple of years to come. Polaris barely holds its own now; I can't imagine it will age very well.

      • Meadows
      • 3 years ago

      One might assume the research firm’s numbers come from GPUs shipped to date (to retailers or otherwise), not all of which immediately translate to installed base. Not to mention that availability was rather thin until the beginning of August or so, which isn’t even in the Steam results yet.

      • BobbinThreadbare
      • 3 years ago

      Not sure how much a single quarter of sales would affect total install base.

    • Leader952
    • 3 years ago

    [quote<]AMD increased its share of total discrete GPU sales to 34.2% of the market (by unit volume)[/quote<]

    Unit volume is pretty useless, especially if the volume is all low-priced and low-margin. High-dollar/high-margin sales is a much better measure.

    From Nvidia's recent earnings report, that high-dollar/high-margin segment really added the revenue (and profits) to their bottom line.

    With the release of the GTX 1060s (6GB/3GB), AMD's share in the low end will now also be contested.

      • the
      • 3 years ago

      The problem is that AMD currently doesn't have any enthusiast-class card that can compete with the GTX 1080 and the Pascal-based Titan X. Vega will need to come along for that to happen.

      The GTX 1060 will certainly contest the low end, but one thing to consider is that the RX 480 wasn't out for the full quarter to take back some market share. AMD also has the RX 470 and RX 460 out to take the rest of the low-margin market, whereas nVidia doesn't have a Pascal offering there yet. I would expect a GTX 1050 to come to market in the not-too-distant future.

      It's good to have the GPU market be exciting again.

      • kmm
      • 3 years ago

      This is an important point about margins and what exactly they're selling. Of course it's preferable to be selling the more profitable cards, and they've got to be envious of Nvidia's healthy enthusiast-segment sales of GTX 970s and so on, never mind all the expensive 10-series cards these days.

      But given the position AMD is in, I think they can be a little happy about moving some more units and not having their mindshare and fanbase deteriorate further.

      Let’s all hope they can get a lot out of Zen and keep up the competition in CPU and GPU development.

      • Pancake
      • 3 years ago

      It’s not entirely useless. Software developers will pay more attention to optimising and testing for AMD GPUs. But it’s gotta suck for AMD to look at their bigger rival laughing all the way to the bank.

      • beck2448
      • 3 years ago

      Exactly. Revenue share remained virtually unchanged. Next quarter we’ll see the results of volume Pascal and Polaris in the marketplace.

    • tipoo
    • 3 years ago

    I can only see this as a good thing for the market. Well done, well done.

    I hope it’s not in exchange for having decimated margins?

    • the
    • 3 years ago

    The Scott Wasson effect at work.

      • Neutronbeam
      • 3 years ago

      All hail the Wassoning!

      • tipoo
      • 3 years ago

      I really hope he scans these articles' comments every now and again and gets a good chuckle out of this, lol

        • Damage
        • 3 years ago

        If only my contributions were worth five points of market share every quarter.

          • tipoo
          • 3 years ago

          It’s just a Frame of mind, you have to Pace your contributions out

          • nexxcat
          • 3 years ago

          You should ask for a [i<]tiny[/i<] raise with the 5% market share increase this quarter as justification.

    • Beelzebubba9
    • 3 years ago

    While I’m glad for AMD, the massive performance and efficiency advantage Pascal has makes it hard to believe this generation will play out any differently than the last….

      • tipoo
      • 3 years ago

      It’s certainly an uphill battle, but so long as there are no overwatts, I think most gamers care more about performance per dollar than performance per watt.

      Though to keep market share, it may mean AMD has to give up margins.

        • Beelzebubba9
        • 3 years ago

        I agree that most gamers – myself included – don’t really care about peak power draw. Noise matters as does idle power draw, but I expect my system to produce a lot of heat under load.

        The problem is most modern computing is limited by power, so if nVidia can get 80% better performance out of the 1070 at the same ~150W power as AMD can with the RX 480, one can reasonably assume that at 250+ W the gap will be similar for Vega and GP100/GP102.

        This doesn’t mean AMD can’t sell compelling products for a compelling price, but if it takes a theoretically much more costly Vega + HBM2 to match the GP102 in performance, that bodes poorly for AMD’s bottom line and continued ability to compete. That has basically been the story with every AMD GPU in a post-Maxwell world.

          • DPete27
          • 3 years ago

          [quote<]if it takes a theoretically much more costly Vega + HBM2 to match the GP102 in performance that bodes poorly for AMD's bottom line and continued ability to compete.[/quote<]

          This is what I've been afraid of in the recent past. It's not that AMD isn't making profits on their graphics cards, it's that their profit margin is likely smaller than Nvidia's. More profit means more R&D budget for the next generation, and so on. Extrapolate that over just a few generations and AMD falls further and further behind. It's just not a business situation that will ever allow a company to be an industry leader.

          • Chrispy_
          • 3 years ago

          I think I’m in the minority but my gaming mancave (technically it’s my office where I do work) gets toasty warm quite easily and my HTPC is in a very quiet environment.

          I had a pair of 290X cards, both of which were faster for both gaming and work.
          I chose to eBay them both and replace them with “Founder’s Edition” GTX 970 cards. The office is literally half as warm when gaming and the HTPC is considerably quieter. Almost silent at idle.

          I’m staggered by how noisy some gamers’ PCs are, but I guess I also managed to put up with screaming Delta fans on my overclocked Celerons, Durons and AthlonXPs at one point in my life.

            • Beelzebubba9
            • 3 years ago

            I, too, had PCs you could hear boot up from across the house back in the day. Man, that was terrible.

            My gaming/HTPC sits in my living room, which is very open, so heat isn’t a problem but noise is. At this point, even while playing games, I have to put my ear up to the system itself to hear anything over the background noise of the city. I’ve been very pleased with it.

            Edit: The GTX 1070 I have is so amusingly efficient it doesn’t even spin up its fans for light gaming. I think the only time I hear it is on boot when it does its fan tests.

        • ronch
        • 3 years ago

        You’re forgetting the mobile segment. With a more power-hungry product you’re asking to drain the battery faster and heat things up a bit more. That’s a recipe for fewer design wins.

        This power issue may be less critical in the desktop segment but many people will still want to get the more efficient option. Why wouldn’t they if they could? And time and time again we’ve seen that people are willing to pay more to get better products. We see that in the CPU market. AMD is cheaper but they continue to lose market share. In the GPU market, Nvidia holds the majority share of the discrete GPU market even as AMD tries to undercut them and give you more performance for the money.

        People will pay for better products if they could, more than we realize.

          • tipoo
          • 3 years ago

          True, though most laptops with dedicated graphics also swap to an IGP when on battery, no? So it would only impact you if you game or do pro work on battery, which most people plug in for. Unless you mean an AMD APU from the outset, but until Zen APUs ship, they’re not doing so hot there on the CPU side anyway.

          Source: Have a mobility Firepro, swaps to Intel graphics on battery unless an app I set to force the dedicated graphics opens.

            • EndlessWaves
            • 3 years ago

            Yeah, battery life on a gaming laptop is irrelevant but the extra heat output is a big issue given the current desire for thin and light.

            • Voldenuit
            • 3 years ago

            If you’re in a TDP-constrained environment (like a laptop chassis), then I’d say load power consumption (and by extension, dissipation) becomes even more important.

            From the previews, the 10xxM parts are performing very well in laptop chassis. Without a major architectural change, I don’t see AMD keeping up with nvidia on mobile parts, unless there’s some massive chip binning going on for mobile.

            • tipoo
            • 3 years ago

            Yeah, also fair.

            • the
            • 3 years ago

            There is indeed chip binning specifically for mobile. The real question is how much that differs from the desktop product in terms of power and performance.

            • tipoo
            • 3 years ago

            This would have been decided months earlier if they were going to do it, but they could also go the Intel route, where they ship larger IGPs for mobile so they can still hit performance target X while clocking lower and drawing less net power, since die size increases don’t raise power draw nearly as much as clock speed and voltage do. So you can use a bigger die as a method to actually bring down power. It may eat into AMD’s margins if they go this way, though.

            [url<]http://www.anandtech.com/show/7085/the-2013-macbook-air-review-13inch/4[/url<]
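
            The arithmetic behind that wide-and-slow approach is the standard dynamic-power relation for CMOS logic; a rough worked comparison (the 1.2V figure is made up for illustration):

            ```latex
            % Dynamic switching power, with C the switched capacitance (roughly,
            % active die area), V the supply voltage, and f the clock:
            P \propto C V^2 f
            % Doubling the shader array at fixed clock and voltage:
            \frac{P_\mathrm{wide}}{P} = \frac{(2C)\,V^2 f}{C V^2 f} = 2
            % Doubling the clock instead, which also demands more voltage
            % (say 1.2V instead of 1.0V):
            \frac{P_\mathrm{fast}}{P} = \frac{C\,(1.2V)^2\,(2f)}{C V^2 f} \approx 2.9
            ```

            Both designs nominally double throughput, but the wide, low-clocked one burns roughly a third less power, which is the Intel big-IGP approach described above.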

            • Voldenuit
            • 3 years ago

            Oh, there’s definitely binning for mobile. Both AMD and Nvidia will be doing it. But AMD will be disadvantaged here due to both the process and the architecture.

            • the
            • 3 years ago

            I’m not entirely sure. AMD has been rather aggressive with voltage on their desktop parts to ensure stability/yields. My google-fu is weak today and I can’t find the forum thread here where a user significantly reduced the GPU core voltage on an R9 290X and was able to cut power consumption in the ~30% range, if my memory serves me correctly.

      • Chrispy_
      • 3 years ago

      Does it matter? AMD is competing on DX11 price and performance and handily beating Nvidia on DX12/Vulkan price and performance.

      AMD running hotter is hardly new; the real question is whether AMD are preparing a counter to Volta in 2018. In 2015, Maxwell kind of put the smackdown on GCN, so if Volta is the same architectural leap that Nvidia make but AMD don’t, it’ll be a repeat of the rinsing AMD took last year.

      I’m just hoping that marketshare gains are allowing AMD to invest more in architectural R&D. Most of the credit for the RX series goes to GloFo for the die shrink and clock boost because the architectural gains between GCN 1.2 (Tonga in the R9 380X) and Gen4 GCN (RX470) are empirically zero.

      I picked those cards because they have the same SP, TMU, ROP, and bus-width configurations, only the RX 470 is clocked about 20% faster (assuming a 1150-1200MHz average boost). Performance difference? [url=http://hwbench.com/vgas/radeon-rx-470-vs-radeon-r9-380x<]About 20%[/url<].

      RX series: 100% process improvements, 0% IPC improvements.

      Until someone proves otherwise to me, I'll presume it's because AMD bet on DX12 and Vulkan, which probably won't have a significant effect on mainstream games until just before Volta launches.

        • Beelzebubba9
        • 3 years ago

        It only matters because if AMD cannot produce a high-performing part that can match nVidia’s best, then they are locked out of the high-margin, high end of the market. They also lose out on the halo effect of having their products featured in all of the best and most expensive systems.

        Otherwise I agree with your post.

        • DPete27
        • 3 years ago

        AMD gets free architectural R&D through the community. First, Scott opened AMD's eyes to frame-time variance with his [url=https://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking<]Inside the Second research[/url<], something Nvidia claimed it had already been monitoring, at least in the design of its then-current architecture. Now [url=https://techreport.com/news/30458/david-kanter-shows-nvidia-gpus-use-tile-based-rasterization<]David Kanter has shown that Nvidia GPUs use tile-based rasterization[/url<], which is likely a large clue to why Nvidia can get so much more performance out of fewer resources than AMD.

        It's cheaper to be in second place. You only have to reverse-engineer the competition. Let them spend all the R&D money blazing the path.

          • lilbuddhaman
          • 3 years ago

          Sounds like Nvidia is the one getting free R&D, with AMD trying to be the trailblazer for DX12 performance. Nvidia’s next chip will have the efficiency as well as the superior design.

            • DPete27
            • 3 years ago

            How so? It’s been shown a bunch of times that Nvidia doesn’t see much gain from DX12 [async], mostly because async is meant to put idle resources to work. That rewards inefficient GPU design (for gaming). AMD has a less efficient design (hence why they need more resources & power to match Nvidia), hence they have more idle resources, hence they get bigger gains from DX12 async. It’s no wonder AMD pushed so hard for DX12; they had a vested interest.
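
            To put made-up numbers on that utilization argument, here's a toy Python model (the busy fractions are purely illustrative, not measured figures for either vendor):

            ```python
            def async_gain(busy_fraction: float, fill_rate: float = 1.0) -> float:
                """Toy model: if a GPU's shaders are only `busy_fraction` occupied
                under serial (DX11-style) submission, async compute can reclaim up
                to the idle remainder. Returns the relative speedup."""
                idle = 1.0 - busy_fraction
                return (busy_fraction + idle * fill_rate) / busy_fraction

            print(f"{async_gain(0.90):.2f}x")  # well-utilized design: ~1.11x speedup
            print(f"{async_gain(0.70):.2f}x")  # more idle units: ~1.43x speedup
            ```

            The less busy the shader array sits by default, the bigger the headline async gain.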

            (disclaimer) My next GPU will be an RX 480 as soon as they’re in stock. I may be pessimistic about AMD, but GSync alone is a dealbreaker for Nvidia. As long as they keep up that charade, I’m willing to overlook quite a few shortcomings from AMD.

            • Voldenuit
            • 3 years ago

            [quote<]I may be pessimistic about AMD, but GSync alone is a dealbreaker for Nvidia.[/quote<]

            Yeah, prices are high, but man do I love ULMB+Fast Sync (on fast games) and G-Sync (for slower-paced games).

            I also appreciate that G-Sync monitors (and panels) are curated for 30-144 Hz (or higher), whereas Freesync monitors are a minefield of 48-75Hz panels that an unwary user has to navigate around. If you want a good Freesync monitor (and they do exist), don't buy the cheapest one you can find.

            • Chrispy_
            • 3 years ago

            I have found that on the three 75Hz Freesync monitors I’ve encountered (two LGs and a Dell), the silly window can be changed quite easily.

            Whether that’s true for all screens, I’m not sure, but the LG 29UM67 we have on one of the vis stations for 21:9 testing came with 55-75 initially and was firmware-updated to 40-75. I then tested it using CRU (Custom Resolution Utility) and found that 32-80 was within range of the embedded display controller.

            So yeah, 55-75 is pathetic, but the IPS technology and 12-month-old display controller were perfectly capable of far, far more.

            • DPete27
            • 3 years ago

            Ugh, you had to remind me…. My brother just got an LG 29UM67P for $215 this past week. And I couldn’t because I have a baby on the way…

            BTW, how do you go about checking/updating the firmware on that monitor? I’ll notify my brother. Everything I’ve read about it said 40-75Hz FreeSync range.

            • Chrispy_
            • 3 years ago

            You don’t. You send it back to LG.
            In my case, it was within 28 days of purchase when the specs changed to the new revision.
            I sent it back when I realised it couldn’t do 48Hz natively in Unity, which was the sole reason it was bought.

            • drkskwlkr
            • 3 years ago

            It’s always ‘the next one’ with you people 🙂

        • AnotherReader
        • 3 years ago

        Computerbase.de showed a [url=http://www.computerbase.de/2016-08/amd-radeon-polaris-architektur-performance/2/#abschnitt_die_gpugenerationen_im_benchmark<]performance increase of 7% at the same clocks for the 470 compared to the 380X[/url<]. The Witcher 3 is 15% faster at the same clocks in the same article. On the other hand, the performance increase in Call of Duty: Black Ops III is non-existent.

          • Chrispy_
          • 3 years ago

          Yeah, it’s going to vary by game a bit, and websites are going to have slightly different results based on how they test and in which games.

          One thing I find doubtful about that Computerbase.de article is that they underclocked the 470, making it far more likely to hit that peak boost speed of 1040MHz for the whole test, whilst the Tonga card was overclocked.

          Without showing a GPU clock graph alongside those results, I have to assume the test is invalid, because no Radeon since Bonaire (GCN 1.1) has had completely fixed clocks – the clocks are variable and tied to power and thermal readings on a millisecond-by-millisecond basis.

          I am willing to bet the farm that the [i<]average speed[/i<] of the Tonga card during that test was far lower than the stated 1040MHz, whilst the RX 470 was likely maxed out at 1040MHz for the entire test. [b<]This is the default behaviour of Radeons, and Computerbase.de failed to even mention it in their testing methods.[/b<]
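
          A quick way to sanity-check that kind of claim, assuming you can export a per-millisecond clock log from a hardware monitor (the CSV column name and file here are hypothetical):

          ```python
          import csv

          def average_clock_mhz(log_path: str) -> float:
              """Average the GPU core clock over a benchmark run, given a CSV log
              with a 'gpu_clock_mhz' column (a hypothetical monitor-export format)."""
              with open(log_path, newline="") as f:
                  samples = [float(row["gpu_clock_mhz"]) for row in csv.DictReader(f)]
              return sum(samples) / len(samples)

          # If the average sits well below the stated 1040MHz, a "same clocks"
          # comparison is suspect.
          print(f"{average_clock_mhz('tonga_run.csv'):.0f} MHz")
          ```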

            • AnotherReader
            • 3 years ago

            I agree that testing at lower clocks would be preferable. Perhaps dropping to the 925 MHz of the original Tahiti would have been prudent. Nevertheless, the difference in Witcher 3, at least, suggests that AMD has seen improvements over the years.

            • Chrispy_
            • 3 years ago

            Yes. GCN 1.0 and 1.1 should fall behind Tonga and newer because of the memory compression.

            The games that show the most gain between generations are the same games that show improvements from more memory bandwidth. Computerbase.de didn’t even bother to match the memory speeds in their tests, as if my previous criticisms weren’t enough to screw up the results.

            • AnotherReader
            • 3 years ago

            The memory throughput was the same at 6.6 GT/s for the 380X and 470. The 280X had its memory underclocked to 4.6 GT/s.

            • Rza79
            • 3 years ago

            They are testing with a Sapphire Nitro Radeon R9 380X, which has a stock clock of 1040MHz and is perfectly capable of holding that speed (and much more). So what you’re saying doesn’t hold up. The test is valid.

        • Rza79
        • 3 years ago

        [quote<]Most of the credit for the RX series goes to GloFo for the die shrink and clock boost because the architectural gains between GCN 1.2 (Tonga in the R9 380X) and Gen4 GCN (RX470) are empirically zero.[/quote<]

        More than zero: 7% on average, ranging from a minimum of 1% up to a 15% gain over GCN 1.2.
        [url<]https://www.computerbase.de/2016-08/amd-radeon-polaris-architektur-performance/2/[/url<]
        [url<]https://techreport.com/forums/viewtopic.php?f=3&t=118379[/url<]

        Edit: I now see that AnotherReader already posted this.

      • EndlessWaves
      • 3 years ago

      Massive? It’s hardly that. It’s been fairly common to see that sort of difference between designs in the past, and it’s certainly less than some of the extremes we’ve seen. For example, five years ago your choice was between the 110W 6870 and the 150W GTX 470, and that was before nVidia cranked up the clocks and power consumption for the 500-series refresh.

        • Beelzebubba9
        • 3 years ago

        The RX 480 is roughly as power-efficient as the ~2-year-old, Maxwell 2-based GTX 970, and the GTX 1070 is ~80% faster at the same power draw on a similar 14nm FinFET process.

        I don’t see how that constitutes anything other than a massive performance and efficiency advantage for nVidia that stretches down to mobile and up to HPC/4K gaming.

          • AnotherReader
          • 3 years ago

          A small correction: the GTX 1080 is the one that is 80% faster than the 480.

            • Beelzebubba9
            • 3 years ago

            Thanks for the correction!

          • Kaleid
          • 3 years ago

          But compare the 1060 to the 480: the performance between them is quite close, and the difference is no more than 30-40 watts, hardly earth-shattering.

          Most of the difference seems to come from tile-based rendering, something AMD will probably implement too.

          But certainly the 480 is quite inefficient when overclocked.

      • shank15217
      • 3 years ago

      [AMD Good NEWS] —–> [TR Forums Negatometer] Beep Boop Beep Boo Beepp —-> [AMD Is Dead]

      Seriously give me a break..

        • Beelzebubba9
        • 3 years ago

        Tell me where I implied AMD is dead in my post.

      • cldmstrsn
      • 3 years ago

      We haven’t even seen Vega yet, though…

      • raddude9
      • 3 years ago

      I’ll give you performance OR efficiency, but not both. Besides, price is more important than either.

        • Beelzebubba9
        • 3 years ago

        How is Pascal not both faster and more efficient than Polaris 10? The GTX 1060 6GB is $10 more than the RX 480 and both faster and more power efficient.

          And value is more important than price. I figured this was self-evident.

          • raddude9
          • 3 years ago

          Overclock the 480 and the 1060 ceases to be more powerful.

            • Beelzebubba9
            • 3 years ago

            And what happens when you overclock the GTX 1060 past 2GHz, like pretty much every Pascal GPU will do without issue?

            • raddude9
            • 3 years ago

            Easy: use LN2 to overclock the 480 even more. Result: the performance is the same again.

            • Beelzebubba9
            • 3 years ago

            Man, you must own a huge amount of AMD stock for that to make any sense. 🙂

            • raddude9
            • 3 years ago

            I don’t own stock in any tech company any more.

            • AnotherReader
            • 3 years ago

            Pascal doesn’t have the overclocking headroom that Maxwell had. The Asus Strix 1060 had a [url=http://www.anandtech.com/show/10540/the-geforce-gtx-1060-founders-edition-asus-strix-review/17<]maximum performance boost of 13%[/url<] compared to the reference 1060 in Anandtech's review.

      • ptsant
      • 3 years ago

      The cheapest Pascal is more expensive than the most expensive Polaris.
      In a great part of the planet, the $300 price point is simply not relevant. Most sales occur below it.

      AMD did this to capture market share. They decided to go for the biggest segment, not the most profitable. We’ll see if it works…

        • Beelzebubba9
        • 3 years ago

        The cheapest Pascal is $199 (GTX 1060 3GB), so in the same cost segment as Polaris 10.

        But yeah, I think AMD’s strategy is a good one for now but we’ll see what it does for their bottom line.

      • beck2448
      • 3 years ago

      This article neglects to mention that revenue share remained virtually unchanged, which means Nvidia continues to make almost 100% of discrete GPU profits. Next quarter we will see the impact of the Pascal release. Should be interesting.
