JPR: Nvidia still leads in discrete GPU shipments

Those graphics market share numbers we looked at earlier this month are informative, but they paint a somewhat fuzzy picture by melding data for integrated and discrete graphics. What does the market for discrete graphics alone look like? Jon Peddie Research has just released a second batch of numbers that answers that very question.

Vendor      Q2 2010    Q1 2011    Q2 2011
AMD          41.4%      40.5%      40.6%
Nvidia       57.9%      59.1%      59.0%
Others        0.7%       0.4%       0.4%

By the look of it, Nvidia still has a sizable lead over AMD—and that lead has increased compared to last year. Sequentially, though, AMD managed to snatch a tenth of a percentage point away from Nvidia, leaving it with 40.6% of the market for discrete graphics processors.

Jon Peddie Research’s report also suggests the discrete GPU market as a whole is shrinking. Discrete graphics shipments slipped from 19.01 million units in the first quarter to 16.1 million in the second, and JPR expects total revenue from discrete graphics to be 33% lower this year than last. JPR blames increasingly speedy integrated graphics, specifically the GPUs built into AMD’s and Intel’s latest CPUs, for the shift.
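
For perspective, that works out to a sequential decline of roughly 15%: (19.01 − 16.1) / 19.01 ≈ 0.153, or about 2.9 million fewer units from one quarter to the next.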

Comments closed
    • Alexko
    • 10 years ago

    Are you sure this is for all discrete GPUs and not just discrete desktop GPUs? Lately, AMD was supposed to have a significant lead in discrete notebook GPUs.

      • quasi_accurate
      • 10 years ago

      I think this is just desktop. I’m fairly certain I’ve read more than one report that AMD sold more discrete mobile GPUs.

      • tviceman
      • 10 years ago

      Nvidia had 50% of notebook market share last quarter: http://www.digitimes.com/pda/a20110811PD220.html

    • dashbarron
    • 10 years ago

    Alright, this has bugged me long enough. Much like the ASUS debacle, how do you “appropriately” spell “Nvidia”? I’ve seen it spelled every way under the sun, and their own marketing doesn’t seem clear half the time.

    NVIDIA?
    nVidia?
    NVidia?
    nvidia?
    Suck0rz@TI?

      • Alexko
      • 10 years ago

      It used to be nVidia, but they spell it NVIDIA now. See their website, where it’s spelled NVIDIA everywhere: http://www.nvidia.com/page/home.html

    • dashbarron
    • 10 years ago

    I just bought a new Nvidia GPU because they back-pedaled and said Kepler won’t be out until 2012, sigh.

      • dashbarron
      • 10 years ago

      Negatives? https://techreport.com/discussions.x/19675

    • odizzido
    • 10 years ago

    Does JPR even play games? I really doubt that integrated graphics are killing sales of discrete ones.

    I can find plenty of things to blame for it. Personally for me though there are a few main reasons why I don’t upgrade as often as I used to:

    1) The vast majority of games have DRM that I don’t like so I don’t buy them

    2) Any games that do come out that I can buy run perfectly fine on my system

    3) Even if I did upgrade my GPU for STALKER, the performance increase over what I have already would be pretty small.

    GPUs have been sitting at pretty much the same speed for a really long time now, but that’s okay, because games’ graphical requirements have been doing the same thing for even longer.

      • kamikaziechameleon
      • 10 years ago

      I think demands have stagnated more than actual power.

        • DeadOfKnight
        • 10 years ago

        Yeah, it’s not the cards; it’s the games.

    • Goty
    • 10 years ago

    Ah, consumer ignorance at its finest.

    *gets immediately downvoted into oblivion*

      • indeego
      • 10 years ago

      meh, the GTX 465 is a great value @ $130, which is about the midpoint for many people.

      The big takeaway is a 33% drop in sales in one year. Extrapolate that integrated-graphics growth out ~5 years, and the discrete market has very troubling times ahead.

      • Farting Bob
      • 10 years ago

      Why? Nvidia is pretty much equal to AMD at all levels of the market. One card may lead its price competitor by 5 fps in certain games while losing to it by the same in others. Or one card might consume 5-10% more power under full load than its rival. But these differences are really quite small. If you pick a budget and get a current-generation card at that price point, you will be satisfied.

    • lilbuddhaman
    • 10 years ago

    Or could the shift be that those who need discrete graphics already have one that runs anything they throw at it?
    The Q3 and Q4 numbers will no doubt spike given the massive number of next-gen engine releases?
    What % change do we typically see from Q1 to Q2? Q2 to Q3? I tried to find the 2010 and 2009 numbers, but no luck.

      • tviceman
      • 10 years ago

      I think you’re partially right. While the discrete market is slowly shrinking, right now there is a lack of interest in purchasing a new discrete card because not many games are pushing the boundaries of today’s cards, AND today’s cards’ performance has basically been available since late 2009, when the HD 5870 came out.

        • esterhasz
        • 10 years ago

        While I’m not entirely decided on the matter, there may also be something like a point of “good enough” graphics beyond which increases in hardware performance and graphics prowess deliver diminishing returns for the gameplay experience while the cost for creating content and artworks explodes.

        I find that a game engine from ten years ago that scales to my monitor’s resolution and lets me add some AA can look entirely satisfactory. But then again, I nearly fainted when Wing Commander II came out…

          • lilbuddhaman
          • 10 years ago

          Wing Commander III on the 3DO. Mark Hamill’s best work.

            • CuttinHobo
            • 10 years ago

            I was under the impression the Star Wars Holiday Special was Mark Hamill’s best work!

            http://www.youtube.com/watch?v=bbF_ecnlyTk&feature=fvsr

    • riviera74
    • 10 years ago

    Two things: first, I had no idea that nVidia still has a sizable lead in discrete graphics. Second, if what JPR says is correct, then AMD may have the last laugh, since they actually have integrated graphics that clearly do not suck. That ATi purchase looks better every day. As for nVidia, if discrete graphics shrink too much, they could face extinction.

      • tviceman
      • 10 years ago

      Extinction? Have you heard of this little chip Nvidia has called Tegra? It is in phones and tablets, and very soon it will be in laptops too.

        • beck2448
        • 10 years ago

        Also, Nvidia owns the professional market with almost 90% share, as well as the GPGPU market; its chips are in half of the top ten supercomputers. These are both growing markets and very lucrative.

      • Game_boy
      • 10 years ago

      Nvidia saw this a long time ago. They worked on four ways out:

      1) Tegra, mobile phone and tablet SoCs
      2) Tesla, server and HPC
      3) Consumer level GPU computing as a use case for discrete to replace games
      4) Denver, x86 Fusion-type consumer CPUs

      1 and 2 have not been going well margin-wise; they hardly show up in quarterly financials.
      3 failed in its main objective because they didn’t open up the tools enough (to AMD) to build an install base such that independent devs wanted to make consumer GPU compute applications. The only ones that exist are ones Nvidia has sponsored. A failure of proprietary standards.
      4 was almost scrapped when Nvidia “lost” the Intel settlement by not getting an x86 license out of it. The rumoured plan was an ARM core with Transmeta x86 translation (remember they bought up most of Transmeta), but it was turned into just an ARM core as Denver.

        • HighTech4US2
        • 10 years ago

        Wow, you are wrong on so many levels; where do I start?

        1) Tegra has made a good profit over the last 2 quarters. And in the second half, growth will explode upwards as cell phones that use it are launched.

        4) You need to get off of char-lie’s t-i-t for your information.

        Nvidia never wanted an x86 CPU and said so every time they were asked about it. Those who quote that drivel wrongly believe that only x86 can serve computing forever. They are wrong.

        And try reading the quarterly earnings transcripts for information on what Nvidia is going to do, instead of getting your misinformation from the biggest Nvidia hater around.

        And Nvidia got exactly what they wanted from Intel: $1.5 billion and the right to produce their own custom-designed ARM processor (Project Denver) without fear of being sued by Intel.

          • Deanjo
          • 10 years ago

          Yup, another victim of Charlie’s classic bull. He is the only source of the x86 rumor, and he later said nVidia changed their mind, quoting his own BS make-believe story as proof that they were planning one. The guy has zero credibility. He is like a hunter firing at a cloud of ducks overhead while blindfolded and then saying “yup, that was the one I meant to hit.”

            • flip-mode
            • 10 years ago

            I and plenty of others think Charlie is pretty credible. The only people that have a real hard time with Charlie are the Nvidia fanboys. Other people either just don’t care or else have found that he has a pretty good record of credibility for someone who is trying to discover, access, and report on info that most tech companies don’t want to reveal.

            He was pretty spot-on with his Fermi reporting – that it would be hot, late, and partially disabled.
            He was pretty spot-on with his bumpgate reporting.

            For some reason, he has gotten pretty good access to some leaks at Nvidia. Perhaps due to someone at Nvidia being unhappy with their employer, or who knows what.

            Anyway, he is at least what he claims to be – semi-accurate – and in this day and age, even being what you claim to be is sometimes more than what people expect. But he’s always getting ragged on by Nvidia fanboys, and I often stick up for him, but it gets old.

            • tviceman
            • 10 years ago

            Charlie also said the GTX 580 would be a soft/paper launch and have no availability through 2010. Flat-out wrong.
            And he also said GF100 would only have 10,000 cards and would then be EOL. Flat-out wrong.
            He said GF104 was a “market tourniquet” and insinuated that it was actually not making much, if any, money. Since then, that has been Nvidia’s most popular chip, and Nvidia’s gross margins have risen nearly every quarter. So again, flat-out wrong.

            Charlie gets some things right. But his obvious hatred for Nvidia creates an extreme bias in his “reporting.”

            • Antimatter
            • 10 years ago

            Charlie wasn’t the only source for the x86 rumour.
            https://techreport.com/discussions.x/16516

      • kamikaziechameleon
      • 10 years ago

      I have purchased and used both companies’ products, and I would cite the #1 most un-talked-about but largest issue for me between brands: consumer-card workstation functionality. AMD GPUs are simply not good at light workstation work. They don’t handle Creative Suite well and definitely don’t like modelers. Meanwhile, the Nvidia consumer card not only handles the workload admirably but in many cases can be soft-modded to run real workstation drivers. A $130 Nvidia card handles workstation loads comparably to a $300 AMD card (I’m purely referencing consumer cards, not actual workstation cards).

      • Deanjo
      • 10 years ago

      “Do not suck”? Guess that depends on what you are comparing them to. They still suck when compared to a mid-range or better video card.

        • wierdo
        • 10 years ago

        They can play most modern games acceptably, and that’s good enough for integrated graphics, especially in reasonably priced notebooks.

        Any more than that – if that were even possible to begin with, considering the interconnect limitations and the lack of dedicated video memory – and it would eat way too much into the discrete video card market.

      • Farting Bob
      • 10 years ago

      Another positive for AMD is that its GPUs are in the Xbox 360 and Wii, confirmed to be in the Wii 2, and rumoured (with no reason to suggest otherwise) to be in the next Xbox (probably a year or two away). That is a whole heap of GPUs they will sell.
