Poll: Who will be the top discrete GPU maker in 2012?

Graphics are a hot topic among hardware makers lately. Intel is planning a new discrete graphics processor, both AMD and Intel want to squeeze GPU cores into their next-generation CPUs, and Nvidia claims spending on GPUs will increase dramatically in the coming years.

So, with three major players soon to be in the race, who do you think will be the top dog for discrete graphics (i.e. desktop graphics cards and notebook graphics modules) in four years? Will Nvidia retain its dominance, will Intel capture the lead, or will AMD manage to claw its way back to the top? You can share your thoughts by hitting our new poll, either below or on our front page.

In our previous poll, we asked you how often you browse the web on your mobile phone. Not too surprisingly, almost two thirds of TR readers who voted say they don’t surf on their phone at all. However, 11% said they do so every day, 8% do a few times each week, 2% do once a week, and another 3% do a few times a month. That’s almost a quarter of readers who surf on the go with some regularity.

Comments closed
    • sdack
    • 12 years ago

    I vote Nvidia.

    My thoughts:

    Intel would need to buy Nvidia to have something competitive, and unless that happens it will be Nvidia. Because the current experts in 3D graphics hardware appear to be working for Nvidia and AMD, there is little chance of Intel assembling an equally competitive team within the next 4 years.

    AMD has the ability but will be working a bit more on their CPUs and chipsets. While low on cash they cannot do as much R&D as they want to, and they will also focus more on the OEM market with low-cost and mid-range products. Discrete graphics could get subordinated to this strategy and receive less attention.

    Nvidia, while currently holding the high-end crown as well as aggressively targeting the mid-range market, will likely succeed with its new strategy over the next 4 years. Nvidia may lose some ground in chipsets due to AMD’s efforts at creating new ones (i.e. Spider) but will find it easier to gain ground in discrete graphics.

    That is my weather forecast for discrete graphics. May the best forecast win.

    • IntelFan
    • 12 years ago

    Hate to admit it, but DAMMIT is really getting its act together, and NVIDIA won’t stand a cat-in-hell’s chance of being anywhere other than eating AMD/ATI dust within a few quarters. Losing share? Errr, no. You’re looking at either 1) old data or 2) NVIDIA data 😉

    • nikitarrr
    • 12 years ago

    I vote AMD by far

    • Maximus_is
    • 12 years ago

    I voted for Nvidia. I figure that if the market does go to “Hybrid” graphics and AMD is still floundering about (I’m not against AMD, just a realist), Nvidia will just buy out AMD (But the ATI division will probably have to be bought out by someone else because of competition laws) and will enter that market with solutions better than Intel can provide. Of course, that means you probably wouldn’t be able to buy an Intel/Nvidia hybrid but that’s OK with me.

    • PRIME1
    • 12 years ago

    I think AMD is pretty much out of the high end market and I doubt Intel will launch anything capable of competing at the top.

    However, that does not mean that either of them could not sweep the mid to low end market. Right now though neither of them have been able to really even challenge NVIDIA. Intel has not had a discrete chip in years and it still seems a rumor whether or not they will again. AMD has fallen pretty far behind, not only in performance, but more importantly in making money on graphics cards. If AMDs struggles on both fronts continue they may have to sell off something and you can bet it won’t be their CPU division.

    • indeego
    • 12 years ago

    If I’m an nvidia fanboi simply by watching the market heavily lean in their direction, hit after hit after hit, then so be it.

    Intel has no history in the discrete market, and if you look at their historic acquisitions, they have not done well in bringing those to market (unlike Cisco, which is masterful at acquisitions and profiting from them).

    AMD is losing share, it has faltering leadership by any measure, and it is quickly dying without massive change. AMD turned ATI from a market leader that could set premium prices on its parts into another AMD, one that releases parts in reaction to competitors and goes for value. While being #2 or #3 isn’t bad, it is bad when your strategy depends on your other markets being #1.

      • 0g1
      • 12 years ago

      Yeah, I’ve got to agree with you here. Something about AMD’s restructuring has really hurt their R&D. They just don’t make high-performance parts any more.

        • charged3800z24
        • 12 years ago

        They are trying to get their footing back. They do have some neat stuff coming, as long as it really comes out.

    • Stefan
    • 12 years ago

    VIA. No, wrong: BitBoys – Glaze3D forever!

    • packfan_dave
    • 12 years ago

    I’d guess discrete to break down as follows (assuming Intel does, in fact, get into discrete graphics and produces something that’s competitive at least in the midrange and low-end discrete segments):

    Largest market share (both by units and dollars): nVidia
    2nd: AMD
    3rd: Intel

    But I think that integrated graphics (whether AMD Fusion-style on-CPU or the more conventional on-chipset) will become even more common, and Intel will expand its lead there (due to better integrated chipsets and as a side effect of AMD stumbling in the CPU business).

    • Xenolith
    • 12 years ago

    Since AMD and Intel will be revving up their hybrid markets, Nvidia will be the most focused on discrete graphics cards. Intel and AMD will be concentrating on the mid/lower markets, while Nvidia’s offerings will be more balanced.

    • cegras
    • 12 years ago

    Any vote cast will probably be due to fanboyism.

    I’m sure if this poll was conducted when ATI’s x1900 or x800 or 9800 series were out, everyone would’ve shat bricks and voted ATI.

    • Hattig
    • 12 years ago

    Whilst I voted nVidia I do think AMD/ATI have a good chance too. Their architecture is very scalable, and it will be on 32nm in 2012 at least. Right now they’re just bumping up the number of shader cores in their GPU, is it 480 in the next revision? Stick that through two die shrinks and you’ll have 1500-2000 shader cores (improved over the current design as well) running at a much faster speed.

    I do think that ATI is now executing quite well. They moved to 55nm very quickly for several different designs. Their drivers are good, whilst nVidia have been getting lambasted for their Vista efforts. The featureset is good. The price is good.

    • snowboard9
    • 12 years ago

    The top GPU maker in 2012 will be the same as the one in 2008 and for the last decade – Intel!

    Integrated graphics has a GPU in it. Sorry to burst your bubbles.

      • Hattig
      • 12 years ago

      “Who will be the top *discrete* GPU maker in 2012?”

        • Sikthskies
        • 12 years ago

        Sorry to burst your bubble there, snowboard 😉

          • snowboard9
          • 12 years ago

          This is the poll:
          *[

            • BenBasson
            • 12 years ago

            Poll Question:
            Who will be the top discrete graphics supplier in 2012?

            • echo_seven
            • 12 years ago

            See thread #36, Damage seems to have intended that the question be specifically about discrete GPUs.

    • Hattig
    • 12 years ago

    Intel will be on their second-generation Larrabee by then. I think the first generation will be plagued by poor drivers with unimplemented features, and horrible delays in getting them – this would match their behaviour on their integrated graphics. I don’t think it will outperform dedicated GPUs from nVidia or ATI/AMD, however, not even by the third generation, despite Intel’s process advantage.

    I suspect the most activity will be in integrating CPUs and GPUs, but this is not relevant for the question which was about discrete graphics cards (which will also handle physics and other suitable algorithms).

    • odizzido
    • 12 years ago

    It’s almost certainly going to be Nvidia, but I sure hope it’s someone else. There should be an option for that.

    • Fighterpilot
    • 12 years ago

    Being both an Intel and an ATI fan, I’d like to see them both kicking butt in 2012, but geez… betting against NVidia seems to be a sucker’s bet. They’ve barely put a foot wrong of late and appear to have a solid technology R&D division along with a pretty damn good marketing and public relations team.
    I wouldn’t bet against Samsung making a serious play for a spot in the graphics market either.

    • Perezoso
    • 12 years ago

    Where is Trident Troll when we need him?

    • Nitrodist
    • 12 years ago

    IBM dark horse vote.

    • bdwilcox
    • 12 years ago

    Bitboys Oy!

    • marvelous
    • 12 years ago

    Intel might creep up by 2012. They are currently the top OEM supplier.

      • Damage
      • 12 years ago

      Nope!

      • willyolio
      • 12 years ago

      the question’s about discrete cards.

    • MixedPower
    • 12 years ago

    S3 Graphics.

      • Price0331
      • 12 years ago

      No, just no. That’s all I have to say to this post.

    • Dirtdiver
    • 12 years ago

    How about an option for…..

    “The GPU and CPU have merged, and PC gaming died 2 years ago, so there is no GPU market” in 4 years :)

      • Meadows
      • 12 years ago

      That’s pretty unbelievable.

        • Dirtdiver
        • 12 years ago

        Really….

        Think about how things were 4 years ago. What kind of CPU and GPU did you have then?

        Looks like the most powerful GPU in the spring of 2004 was the 9800XT:
        http://www.xbitlabs.com/articles/video/display/ati-nvidia-roundup.html

        So let’s think about this....

        1.) AMD has been talking about merging the CPU/GPU.
        2.) Will AMD even be around in 4 years?
        3.) Intel has talked about a "computer on a chip" for a long time now. They just released 45nm CPUs, not to mention the Atom, and the new chipset they have coming out will have the memory bus on it.
        4.) Where will Intel be in 4 years, 25nm?
        5.) How small will hard drives be then? Will 90% of them be static RAM? My point being that the whole computer will be smaller, and the room for a dedicated GPU will be a luxury.
        6.) Notebooks have been outselling desktops for a few years now, and the trend is continuing. Dedicated GPU market share is probably 98% desktop. Fewer desktops, fewer GPUs. The only desktop growth you see is from corporate refreshes, and those desktops have integrated GPUs.
        7.) PC gaming is the reason for GPUs, minus the CAD/pro graphics market. Clearly 90+% of dedicated GPUs are there for games. The PC gaming market has been shrinking for something like 8 years. It paused in 2006; then in 2007, with a large influx of titles (Crysis, BioShock, etc.), it shrank again, capturing only something like $900 million of the $17 billion of games sold. Consoles, of course, had the other $16.1 billion.
        8.) Power. If you have not noticed, gas is $4 a gallon in the US, or very near that. Power as a whole is maxed out. VMware is making a killing virtualizing servers in large data centers to save power. My company just switched from HP servers to Dell servers (2900 series) because the Dell, after extensive testing, uses about 40% less power. GPUs suck power. Notebooks with integrated graphics and small, power-efficient CPUs are the wave of the future.
        9.) 4 years in the computer world is like 10 years in the auto industry. A lot can and will change in 10 years.
        10.) The head of NVIDIA is crying a river about Intel and the GPU market because he is bored. He sees his cash cow running dry.

        So could the "dedicated GPU" be gone in 4 years? Sure, why not? It’s very plausible.

      • 0g1
      • 12 years ago

      Yeah, nVidia may be the top GPU maker in 2012, but if 90% of people are buying integrated instead of discrete, nVidia hasn’t exactly made an accomplishment, have they? I can only see GPU sales stopping if a voxel-based renderer is used in most new games. That would require many simple x86 cores to accelerate all the ‘if’ statements. And even then, nVidia could make a card focused on ray-tracing calculations to support its traditional GPUs.

      Otherwise, traditional GPUs, whose processors are focused on accelerating certain floating-point instructions like MADD, will always be needed.

      • Mavrick88
      • 12 years ago

      In 4 years I see most of the market being PC games. Consoles are cool, but in 4 years, and this is only based on technology trends, I see A LOT more households owning PCs, and they will be the future of gaming. If anything, consoles WILL be PCs……

        • 0g1
        • 12 years ago

        I think Nintendo is really onto something. I hope their motion controls get even more involving. This is an area the PC will never be suited for.

    • [SDG]Mantis
    • 12 years ago

    Long way off for a prediction; 3.7 years is almost forever in computer-processor terms. For nVidia, this is about the gap from the 6-series to the present. That’s a huge jump in capability. For AMD, that’s the X-series to the present.

    In terms of processors, that length of time represents a look back to K8 domination over the Intel P4 space heaters.

    The question is whether nVidia can continue to execute and keep their current lead. The fact that K9 was, apparently, a dog and never came through for AMD gave Intel the chance to leap back into the processor game with a vengeance.

    For those who say AMD won’t be around, look back at twice that time gap, to the Athlon XP vs. early P4s. The performance gap then wasn’t that dissimilar to the current one. Intel is more wary of AMD now than it was then. But AMD isn’t out of the game — even if they should have stayed out of the discrete graphics market.

    In that time frame I see much better integrated graphics coming from all three camps. This could give Intel a huge edge in integrated graphics — since so many systems use their anemic solution by default — though they still have a long way to go to match nVidia (if nVidia ever gets the 8200 drivers working correctly) and the X3200 from AMD. An integrated solution that is good enough for desktop applications, home theater, and light gaming will be a winner.

    One real question is where monitor and TV resolutions are going. If we are still using 1080p in 2012 for major TVs, then there is a lot of reason to believe that graphics processors that can push 1080p to large flat panels with better response times might have an impact in the gaming area even for PCs (not just consoles). What point is there to discrete graphics if integrated can render well enough at the resolutions of the monitors dominating the market?

    • 0g1
    • 12 years ago

    I’m not sure Intel should be in this poll. How do we know that Larrabee will not just be an accelerator of sorts, like the PhysX card, except more general-purpose? Is there any proof that it contains rendering hardware or a display port? All I saw was a bunch of processors and a vector processor.

    And AMD, well, I’m not sure if they should be in this poll either. Apparently they’re going in the direction of integrating the GPU into the CPU: something like a small ~20MB framebuffer built onto the CPU and then a big external UMA. So I assume they will end up discontinuing their discrete graphics cards?

    So the only real discrete GPU maker would be nVidia. I think AMD’s and Intel’s solutions will be faster though, especially if a nice voxel based ray tracer is realized.

    • pluscard
    • 12 years ago

    AMD without question. No one has the capabilities of AMD at this point.

      • Krogoth
      • 12 years ago

      Too bad that Porkster and Shintel are not here.

      • 0g1
      • 12 years ago

      AMD hasn’t done anything useful since the Athlon 64 came out in 2003.

      • maroon1
      • 12 years ago

      Both Nvidia and Intel are going to kick AMD’s ass

        • Hattig
        • 12 years ago

        With S3’s new 4300E graphics chip, the future is unknown!

        Oh, wait…

    • ssidbroadcast
    • 12 years ago

    3dFx Voodoo chips are gonna make a comeback, I can feel it!

      • Krogoth
      • 12 years ago

      With TRU SLI!

      GLID3 3XTREME EDITION!

      • Steba
      • 12 years ago

      What do you mean, comeback?? They never left… my other PC still has its Voodoo 2!

        • Meadows
        • 12 years ago

        Because your PC obviously counts as the state of the entire market.

    • Steba
    • 12 years ago

    It really makes no difference because the end of the world is coming Dec 21, 2012, as predicted by Nostradamus, the Mayan calendar, and the Chinese I Ching…. C’mon guys, don’t you know this stuff?!

    It is a bit eerie that 3 different prophecies have come from 3 different parts of the world at different times, and they all point to the same date.

    …. But if we are alive, I say Nvidia. Maybe their new card at that time will actually be powerful enough to save the world.

      • Krogoth
      • 12 years ago

      Nothing is going to happen other than the beginning of a new lunar cycle in the Mayan calendar.

        • Steba
        • 12 years ago

        Isn’t there a new lunar cycle every month? Or do you mean the moon’s orbit will somehow be altered?

          • MadManOriginal
          • 12 years ago

          The Maya end-of-calendar date is the end of a large cycle, the largest cycle referred to in their calendar. In Western societies we don’t commonly recognize anything above a millennium (although higher numbers may have names); the largest Mayan cycle is tens of thousands of years or more. Wiki it if you want the details; I’m sure it’s there.

      • Kaleid
      • 12 years ago

      LOL. End of days… again? No one can predict that kind of stuff.
      One day the sun will swell and burn up all life on this planet, but that’s a long, long time away.

      • no51
      • 12 years ago

      So… what time zone is the world going to end at?

      • MrPeach
      • 12 years ago

      Nostradamus never mentioned any dates, so that’s crap. Not even considering that all his stuff was crap to begin with: the mental wanderings of a deranged mind.
      The Mayan date-counting system overflows at a particular date. Assuming it represents the end of the world is just more Y2K-style nonsense.
      And the I Ching predicting the end of the world? You’re just making that one up.

        • Steba
        • 12 years ago

        Seems to me you’re just one of those haters that likes to argue about everything in the hopes that you appear smart…

        2nd, I didn’t say I necessarily believe in the prediction; I was being sarcastic, but you must have been hating too much to notice, so here you go…

        Nostradamus video on 2012:
        http://video.google.com/videoplay?docid=7230541823083423179

        And look up Terence McKenna and "Time Wave Zero"… he used mathematics on the I Ching and arrived at the exact date of Dec 21, 2012. There is a five-part series on YouTube; look it up… thanks, but maybe next time… pwned…

    • Ashbringer
    • 12 years ago

    With the coming of CPU-GPU hybrid chips, my vote goes to none. By 2012 I would be surprised if Nvidia were still selling any add-on graphics cards in a market that’ll be dominated by Intel and AMD hybrid CPUs.

    Ray tracing will be here, and it’ll be the future. The only reason graphics cards will still sell is that the architecture change for hybrid CPUs will be so massive that, at first, they won’t be much faster than onboard video for DX9 and DX10 games. So it’ll be worth owning a graphics card until game developers begin to use ray tracing in games. Give or take 1-2 years, but certainly nothing like 2012.

    As a consumer you win. I personally wish for the graphics card market to die in a fire: the constant need to update drivers for game compatibility, and the outrageous prices of graphics cards. We’ve now got the option to buy and hook up “4” graphics cards. Great, another huge gap for onboard-video owners to sob at.

    A lot of companies will lose out with ray tracing. Consoles are still based on the old architecture that PCs use today, so porting games to use ray tracing is a big undertaking. Not to forget the huge loss for graphics card companies and anything tied to them.

    • alex666
    • 12 years ago

    No offense, but this is probably the most bone-headed poll I’ve ever seen at TR. Why not “Who is going to be the largest HDD manufacturer in 7 years and 3 months?” I’m watching the news right now, and here in the US home foreclosures are up, inflation is raging again, corporations are going bankrupt, and you’re asking about who might be the largest GPU manufacturer in r[

      • bthylafh
      • 12 years ago

      Go away, troll.

        • alex666
        • 12 years ago

        Troll? Wow, I’ve never been called that before. I expressed an opinion, and for that I’m a troll? And I want to add that I generally enjoy Cyril’s articles and have the utmost respect. That said, I still think this is a silly poll, not one of the better ideas for a poll I’ve seen here. And hey, if expressing our opinions makes us trolls, then I guess I’m a troll.

      • indeego
      • 12 years ago

      Politicsreport.com is that way ----->

        • Damage
        • 12 years ago

        Actually, it’s to the left. 😉

      • cegras
      • 12 years ago

      I don’t see what’s wrong with his comment.

      I was GOING to vote, but then I realized any vote I cast is probably due to fanboyism.

      If this vote were held back when the X1900 series was dominating, I’m sure a fair margin of the votes would be for ATI. So ….

      • Kharnellius
      • 12 years ago

      Because 4 years is almost half that. *shrugs*

    • Krogoth
    • 12 years ago

    Intel by a long shot if you are going to include integrated solutions. It is a lot less clear on the discrete front.

    Nvidia is afraid that Intel is going to marginalize their hold on the discrete market, because Intel is beginning to make integrated GPUs that do not fail at 3D graphics and are capable of satisfying any non-hardcore gamer.

    I just hope that DAMMIT can kick Nvidia in the butt, so we can see GPUs that greatly exceed G80-level performance.

      • Voldenuit
      • 12 years ago

      l[

        • Krogoth
        • 12 years ago

        Larrabee and its descendants are what I was referring to.

    • DrDillyBar
    • 12 years ago

    I threw my vote behind Intel on this one.
    ATI’s GPUs are going to be heavily integrated with AMD’s CPUs, which makes them not much of an option for me, and nVidia will focus solely on FPS and nifty software packages for their platforms to expose OC features to users (it’s what they think gamers want). Intel, on the other hand, is going to approach the problem from a fresh perspective. While all their offerings may not qualify as discrete, I think they will make a strong impression on the graphics market. As long as they deliver well, but we’re guessing 4 years here.
    Edit: As another thought, there’s been little speculation about how the Havok IP will be used in their products.

    • MadManOriginal
    • 12 years ago

    Top by what measure? I voted Intel to remain on top by market share, but who knows, they may get a really good thing going; an x86-ish GPU is interesting, and they certainly have the industry muscle to make it work.

    Otherwise I’m not going to make a guess, because that’s all it will be: fanboyism, as stated in the first post. 2012 is at least two full generations away (counting new architectures and their respins as one generation), or it could be as many as four generations.

      • Damage
      • 12 years ago

      Intel doesn’t even sell discrete GPUs. How could they “remain” on top?

      Besides, even if you count integrated graphics, they lead only in unit volumes, not sales dollars.

        • MadManOriginal
        • 12 years ago

        q[

          • Damage
          • 12 years ago

          g[

            • DrDillyBar
            • 12 years ago

            75,000x$500; 2,000,000x$18.75; Who’s Dominant?

            • Damage
            • 12 years ago

            ‘taint like that.

            The latest JPR numbers for the overall graphics market (not just discrete) put unit volume shares at 37% to Intel, 28% to Nvidia, and 23% to AMD (for Q4 2007).

            So Intel’s unit volume lead isn’t that large.

            Now, the traditional line has been that Intel throws in an IGP for about $5.00. That was the line some time ago, at least, but it has been eroding. Now, Nvidia contends Intel is essentially giving away the IGP, and as I said, I’m inclined to believe they’re onto something. So for Intel’s revenues per unit, we’re talking about something between zero and five bucks, probably closer to zero.

            On top of that, we know that many PCs are sold in so-called “double-attach” configurations, where an IGP is included onboard but remains unused thanks to the inclusion of a discrete graphics card, as well. Obviously, any double-attach scenario counts as a unit volume increase for Nvidia or AMD, but it also likely counts as a unit for Intel, despite the fact that the IGP remains unused. (The mere fact this situation exists suggests strongly that Intel’s IGP is essentially free, among other things.) Nvidia was jumping up and down about this problem with counting market share last week. I think they are right to protest.

            So…

            Zero times 2,000,000 equals.. bupkis! And when 700,000 are double-attached, it equals… less than bupkis?
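
            That arithmetic can be sketched directly. The unit counts and prices below are the hypothetical figures from this exchange, not real market data:

```python
# Back-of-the-envelope check of the figures tossed around in this subthread
# (75,000 discrete cards at $500 vs. 2,000,000 IGPs at $18.75 --
# illustrative numbers from the comments above, not real market data).

discrete_units, discrete_asp = 75_000, 500.00   # high-end discrete cards
igp_units, igp_asp = 2_000_000, 18.75           # integrated GPUs, nominal price

discrete_revenue = discrete_units * discrete_asp
igp_revenue = igp_units * igp_asp

# At $18.75 per IGP the two are a wash: both come to $37.5M.
# But if the IGP is effectively bundled for ~$0, as argued above,
# the unit-volume lead translates to roughly zero revenue.
igp_revenue_if_free = igp_units * 0.0

print(discrete_revenue)      # 37500000.0
print(igp_revenue)           # 37500000.0
print(igp_revenue_if_free)   # 0.0
```

            The point of the comparison: unit volume alone says nothing about dollar share once the per-unit price approaches zero.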

            • DrDillyBar
            • 12 years ago

            A good answer. 🙂

            • SPOOFE
            • 12 years ago

            In essence, if the allegations are accurate, it looks to me like Intel uses its IGP to help sell its related motherboard chipsets rather than using them as a separate product. You think that’d be an appropriate description?

            • Damage
            • 12 years ago

            I’d say the reverse is true. They are selling chipsets and shipping graphics bundled into the package for little to no cost.

            • dragmor
            • 12 years ago

            But isn’t this what Nvidia is moving to as well? Aren’t all new Nvidia chipsets going to include a GPU so their power-saving system can work?

            • Damage
            • 12 years ago

            Seems to be. So the picture may get even more complex, and Nvidia may win some additional unit volumes by doing that. Another reason, I’d say, why unit volumes aren’t necessarily the best indicator of market share.

    • FireGryphon
    • 12 years ago

    Remember that Intel is the oldest and strongest company there. I’m sure they can pull it off if they want to — and it looks like they do.

      • indeego
      • 12 years ago

      They also wanted to increase the performance of their integrated graphics with new drivers. But they haven’t really made a significant dent at all.

      • sdack
      • 12 years ago

      Intel will not succeed. The Itanium was a very ambitious project but swallowed huge sums and was never a big success. The Core 2 had to be developed outside the US and might never have seen the light of day; they would have kept the NetBurst architecture if AMD had not gained so much on them. You could say Intel got lucky with the Core 2. Not everything they took into their hands turned into gold, and not every strategy they had was a win. In other words: wisdom comes with age, but senility, too!

    • dragmor
    • 12 years ago

    Probably Intel although it disappoints me.

    AMD will be lucky to be around. I’m sure someone will own the IP and be producing cards, but another acquisition will mean more delays.

    Nvidia seems determined to annoy Intel, calling them out for a fight. Nvidia has more to lose than Intel.

    Intel has the money, the fabs, the R&D, the market position, etc. to remove the competition. The only question is whether they can get the drivers working. All it would take is Intel being willing to enter the market with low margins (undercut everyone until it’s not profitable for them) and a big, funded push to ray tracing. Couple that with Intel’s ability to deny Nvidia a chipset license.

    • bthylafh
    • 12 years ago

    Trident.

      • bdwilcox
      • 12 years ago

      I think you meant XGI. Trident Microsystems only makes integrated chips for graphics displays now.

      But XGI…there is a formula for success! Combine the graphics powerhouses of both SiS and Trident and the only thing that can come from that merger is pure gold.

    • Nitrodist
    • 12 years ago

    The way it’s been looking for the past couple of years, Nvidia is the dominant choice.

    • b4b2
    • 12 years ago

    Enter fanboys! 😮
