Radeons take back graphics card market share

Apart from price cuts, there hasn’t been much in the way of good news on the AMD processor front lately. As any enthusiast grappling with which card to buy will tell you, the Radeon maker is much more competitive in the graphics arena. Jon Peddie Research’s market share numbers concur: AMD took market share from Nvidia last quarter. Radeons made up 40.3% of discrete graphics shipments, up from 37.8% the quarter before. Meanwhile, GeForce cards saw their market share drop from 61.9% to 59.3%.

You’ll notice the numbers don’t add up to 100% exactly. S3 and Matrox are still shipping graphics cards, though even combined, they make up just 0.4% of the market.

The real action is between the big dogs. JPR credits the early introduction of the Radeon HD 7000 series, coupled with Nvidia’s initial Kepler supply issues, for the shift in market share between the two. If you look at the numbers from last year, the relative positions are similar. In the second quarter of 2011, AMD had 41.2% of the pie, while Nvidia’s market share was 58.4%.

As a whole, the graphics card market declined 7% from the same quarter last year. JPR won’t weigh in on whether that trend will continue, citing uncertainties associated with "world-wide economic conditions," but it does note that AMD’s A-series APUs are replacing budget add-in cards. Intel’s ever-improving HD Graphics implementations are likely lessening demand for low-end discrete cards, too. With desktop Trinity chips flexing faster integrated Radeons just around the corner, the trend would seem to be toward slow erosion from the bottom up. That’s not necessarily a bad thing for consumers, since it means the baseline quality of "free" integrated graphics is rising.

The JPR press release that hit our inbox isn’t online just yet. When the PR pops up, you should be able to find it here.

Comments closed
    • deinabog
    • 7 years ago

    I’m sure the sales of Radeon-based GPUs are helping keep AMD afloat, so this isn’t a bad thing. I picked up two GeForce GTX 670 cards (had to buy them from EVGA’s store since Newegg was fresh out) back in June and I’ve been more than satisfied with their performance.

    AMD, Nvidia: it’s all good.

    • jamsbong
    • 7 years ago

    I’m not sure how significant the gains or losses are, but it looks like both sides provide great solutions. ATI works well at high resolutions and is better at multi-display, OpenGL support (CAD), and double precision. NV is great at energy efficiency, CUDA, 3D display, and resale value. Both are great at gaming.
    So it depends on which features you prioritise, or you can have both.

    I feel that Nvidia is not pushing the graphics market anymore. They are more focused on the mobile business.

    • just brew it!
    • 7 years ago

    [quote<] S3 and Matrox are still shipping graphics cards, though even combined, they make up just 0.4% of the market.[/quote<] Well I knew Matrox was still hanging on in a few niche markets, but I'm a little surprised to see that S3 is still selling graphics cards at all!

    • Bensam123
    • 7 years ago

    It’s getting there… Still surprises me how there is such a huge gap between relatively similar solutions (10% in this case is a lot of people).

      • travbrad
      • 7 years ago

      It’s a lot smaller than the gap between AMD and Intel back when AMD didn’t just have a “similar” solution but a clearly superior one. Obviously part of that was underhanded tactics by Intel, but I also think brand-names mean a lot. Not many people have even heard of AMD, but quite a lot of people know the Intel name, even if they don’t know exactly what it is.

      I still hear a lot of less-knowledgeable PC gamers proudly declare they have “geforce graphics” (usually some terrible low-end card) because that “radeon stuff is crap”. “Radeon” graphics haven’t been bad in over a decade, but old reputations die hard I guess.

        • Bensam123
        • 7 years ago

        Yeah… I can’t speak about underhanded tactics, but definitely superior marketing helps a lot. Brand name too…

        The whole BD thing isn’t helping AMD’s Radeon lineup either. Even if it’s completely different, people still think one relates to the other, so they’ll criticize it as such.

    • sschaem
    • 7 years ago

    I’m starting to think R. Read was crying on the phone when he called Jim Keller…

    It’s also getting weird that Nvidia has more cash now than AMD is worth as a whole.

    The 7 series is a stellar product, so it’s weird to think a company can lose money with a state-of-the-art product while the company down the street is raking in money by the hundreds of millions with a very similar product. AMD put all that effort into compute, crushes Nvidia hardware in that regard, yet can’t capitalize on it… Even when the ATI hardware team delivers superb work, it’s sabotaged by AMD management.

    This is an accurate portrait of AMD executives getting their next loan.
    [url<]http://www.youtube.com/watch?v=mvZiqpjLZaA[/url<]

    • tootercomputer
    • 7 years ago

    I remember how the original 8800 series of video cards by Nvidia was such a breakthrough; it really was a significant bump over existing cards. I cannot say I’ve seen that kind of bump since then.

    In any case, I’m really glad to see AMD keeping video cards a vibrant and competitive market. I have an almost 3 year-old 5870 in my main system and it’s been such a terrific card. I’ve purchased both AMD and Nvidia over the years, been pleased and disappointed with both.

      • BehemothJackal
      • 7 years ago

      I would argue that when ATI/AMD released their 4000 series, it was just as big a jump forward. It was the tiny chip that could/would.

      From TR’s 4870 review “In practical terms, what all of this means is that the Radeon HD 4870, a $299 product, competes closely with the GeForce GTX 260, a $399 card based on a chip twice the size.” If you look at the review the 4870 beats the 260 in every benchmark and even beats the 280 in several benchmarks. Sounds like a pretty big leap to me.

      [url<]https://techreport.com/articles.x/14990/1[/url<]

    • Chrispy_
    • 7 years ago

    [b<]So, this is what happens if you don't have any current generation parts for less than $300? I am shocked. [/b<] People seem to forget that the 560Ti was more of a marketing spin on the GTX460 than a genuinely new product. The only significant difference is that yields were so bad for 1st-gen Fermi that it took six months to get fully functional parts out of the door, at which point some genius thought it would be good to add another hundred to all the model numbers for a "next-gen, minty-fresh feeling." With that in mind, the fast, efficient, quieter, cheaper, GCN-powered, compute-capable HD7770 sure makes the old GTX460-derivative seem like a dinosaur, and the 7850 is categorically better than GF114 for the same price point.

      • flip-mode
      • 7 years ago

      Really, though, what the heck is your point?

        • Chrispy_
        • 7 years ago

        I made my point [b<]in bold[/b<], because it was [i<]that[/i<] poignant...

          • flip-mode
          • 7 years ago

          Ah. I’m still not seeing how the HD 7770 makes the GTX 460-derivative look like a dinosaur. Isn’t the GTX 560 a good deal faster? Other than that, it’s not unusual for new cards to make old cards seem even more like old cards.

            • Chrispy_
            • 7 years ago

            The point of my whole post is that:

            [list=1<][*<]Nvidia doesn't have a current-gen product for the midrange[/*<][*<]Nvidia's last-gen product for the midrange was, itself, a rebranding of a previous-gen product[/*<][*<]current-gen AMD midrange parts have a power, price, or performance advantage (or a combination of those) because the architectural and process improvements mean that Nvidia can no longer compete with silicon from two generations ago.[/*<][/list<] [quote<]Isn't the GTX 560 a good deal faster?[/quote<]Not if you're looking at price: the 560Tis are around the same price as 7850s, which just beat them in almost everything.

            • flip-mode
            • 7 years ago

            The HD 7770 isn’t in the same league as the GTX 560. It’s wonky to compare them. The closer competitor is the GTX 550.

            • Chrispy_
            • 7 years ago

            I’m not sure if you’re genuinely misreading or just being deliberately obtuse. At what point do I directly compare the GTX560 or 560Ti (GF114) to an HD7770?

            The only direct comparison I make is between GF114 and 7850, because they are about the same price.

            If you’re going to suddenly pull the GTX550Ti into this argument, out of the blue, then my case still stands: previous-generation midrange Nvidia parts don’t cut it alongside current-gen AMD parts. Compared to a 550Ti, the HD7770:
            [list<][*<]Performs [url=https://techreport.com/articles.x/22473/9<]significantly better[/url<]. [/*<][*<]Is the same price [url=http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&Description=HD7770&bop=And&Order=PRICE&PageSize=20<]7770[/url<], [url=http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&Description=550Ti&bop=And&Order=PRICE&PageSize=20<]550Ti[/url<]. [/*<][*<]Uses [url=https://techreport.com/articles.x/22473/8<]much less power[/url<].[/*<][/list<] I don't see [i<]your[/i<] point; What are you getting at?

            • flip-mode
            • 7 years ago

            You said GTX 460-derivative in your first post, which is the family of GTX 560’s IMO.

            I’m not arguing with your point that last gen has a last-gen taste to it, but it’s a “duh” kinda thing to say. At the same time, last-gen cards can often provide much better bang for the buck. We’ve all seen how much the 28 nm chips have cost and how slow they’ve been to cheapen. Also, it’s hard to point fingers at the GTX 560 and GTX 550 when AMD is still mainlining the HD 6870 and HD 6850.

            • Malphas
            • 7 years ago

            The 7770 is somewhere between a 550 and a 560; it’s not that far off. To say it’s “not in the same league” is a bit of an exaggeration; we’re talking like a 5-10 fps difference between them in most benchmarks.

            • flip-mode
            • 7 years ago

            Yeah, isn’t that about all that separates one tier from the next these days?

      • Krogoth
      • 7 years ago

      Not exactly,

      The 560Ti (GF116) was a newer design than the 460s (GF106). The blocks were rearranged a bit, which is why the 560Ti was somewhat faster than the 460. Nvidia also binned the crap out of GF116 silicon, which is why there is a large range of 560 parts (288 CUDA cores => 448 CUDA cores).

        • Chrispy_
        • 7 years ago

        I’d love to know where you’re getting your info from. [b<]Everything[/b<] you said is wrong:[list=1<] [*<]560Ti is a GF114 product, nothing to do with GF116 [/*<][*<]The blocks were not rearranged at all, the only thing that happened were some transistor type changes to get better yields out of TSMC's 40nm process. Quoting Anand from [url=http://www.anandtech.com/show/4135/nvidias-geforce-gtx-560-ti-upsetting-the-250-market/<]here[/url<]; "[i<]GTX 560 Ti, in a nutshell, is a complete video card using the GF104 design[/i<]" and "[i<]GF114 is identical to GF104 in architecture and the number of functional units[/i<]" [/*<][*<]The 560Ti 448 was not a GF114 product either, that was a binned GF110 and therefore not relevant to these comments, being flagship and not [i<]'midrange'[/i<] silicon.[/*<][/list<] Why do I even have to explain this to you - do you just comment on this site without actually reading the articles?

    • Tristan
    • 7 years ago

    AMD is years behind NV. NV made scalar shaders with G80 in 2006, AMD with GCN less than a year ago. Besides that, AMD is small compared to NV, so these few percent are nothing.

      • sweatshopking
      • 7 years ago

      OH MAN! THANKS FOR THE POST! NOW I SEE THE TRUTH!!! AMD IS THE SUX!!! THANKS BRA!

      • TheEmrys
      • 7 years ago

      Don’t often (ever?) agree with SSK, but yeah… this is the sort of post that makes the intelligentsia roll its eyes.

        • cynan
        • 7 years ago

        What does this post have to do with [url=http://www.intelligentsiacoffee.com/<]coffee[/url<] again? Or are you simply describing the nervous eye twitch the post gives you due to its sheer audacity and transparency as being similar to the involuntary rotational ocular spasms sometimes encountered after too much caffeine?

      • derFunkenstein
      • 7 years ago

      WHAT?

      • rrr
      • 7 years ago

      0/10. Too obvious

    • HisDivineOrder
    • 7 years ago

    I think it is inevitable that as games continue to stagnate you’d see integrated GPUs continue to expand and consume the discrete GPU market. If rumors are to be believed, then the next console leap is more a mild skip and may bring consoles up to 1080p gaming. This is certainly progression from the sub-720p resolutions they scale from currently, but I think PC discrete GPUs will continue to outstrip them from the very beginning. This’ll probably lead to more of the same of what we’re used to now and none of the “console has the best graphics” talk for a month or two after launch like we used to have.

    With graphics seemingly plodding along, you can expect integrated GPUs to really grow in popularity if they continue to expand in performance as rapidly as they have thus far. Plus, integrated GPUs will eventually be fully integrated into the CPU so that they are not distinct parts inside it, which offers its own performance advantages over discrete, distinct add-on boards.

    I’m not sure what to think of these things. I just know they’re coming. At the very least, I wish I could use my integrated GPU for something other than taking up space. I kinda think AMD’s approach toward CPU design with the integrated GPU is right, even if their performance otherwise is crap. Having the integrated GPU set up to run the FPU parts of the CPU in the future seems like the right way to go if they can get others (coughIntelcough) to do the same and compel the industry to use that integrated GPU. Plus, I wish Intel would use Havok Physics and get some deal with PhysX to use that seemingly weak-ass integrated GPU they currently have to run physics, letting your video card focus on the important stuff. (Why doesn’t Nvidia get that PhysX could be so much bigger and better for gamers everywhere if they’d just let everyone use it? As an exclusive feature, it’s middling.)

    But in order to do that, we’d probably need Microsoft to get up off their hands and start pushing new versions of DirectX to compel standards for all these different device makers to work together. Microsoft’s too busy with this whole “Xbox” fad. 😉

      • sweatshopking
      • 7 years ago

      [quote<] "Right now we're working as always with a relatively small team on a next-generation project," Hakkinen said. "You always ask yourself: Can the new consoles really be that much better than the old ones? Be assured: They are. It is a quantum leap forward." [/quote<] [url<]http://www.gamespot.com/news/next-gen-consoles-a-quantum-leap-says-remedy-6393142[/url<]

        • yogibbear
        • 7 years ago

        Quantum leap is actually a pretty small leap 😉

          • sweatshopking
          • 7 years ago

          I’m aware of that, but that’s not how he meant it. My point is that HDO’s concern is apparently not valid.

          • Farting Bob
          • 7 years ago

          I just hope the next leap is the leap home.

      • sschaem
      • 7 years ago

      Many reasons why all this won’t happen any time soon, especially with a console refresh happening ‘around the corner’.

      Integrated graphics are decent at the highest end: HD 4000 or 6550D.
      The rest get hit badly at 1080p… and for people with higher-res displays or multiple displays, forget about it.

      Things like DDR3 being shared by a multicore CPU and a GPU are a problem.
      And in a true Fusion design the cache is also shared… and soon other CPU resources will be shared.
      Shared == more flexibility to balance workloads between shader types, including compute, but you don’t get more compute power overall.

      So what would make sense is for the APU to become a ‘GPU’ you can buy on a PCIe card
      (mostly to run graphics shaders),
      preferably with a decent type/amount of memory, etc.
      We just re-invented the discrete ‘GPU’.

      Unreal Engine 4 demos should give you hope.

      Integrated (or even fused) GPUs on CPUs are great for mobile device battery life,
      but I see a gap forming in the next 18 months that is going to boost discrete GPU sales.

    • tviceman
    • 7 years ago

    Go back to AMD’s and Nvidia’s conference calls for their Q2 earnings. AMD’s consumer GPU revenue was down 5% while Nvidia’s consumer GPU revenue was up 15%. It is a trade off either way it’s spun, but if making money is what matters most Nvidia still came out on top.

      • HighTech4US2
      • 7 years ago

      The article is only about AIBs (add-in boards) and completely ignores mobile GPUs.

      This article on TR is a prime example of bad journalism, as it comes to conclusions about trends for all GPUs while completely ignoring mobile GPUs. Raging Red Roosters link to this article and proclaim AMD’s gains while completely ignoring the thrashing AMD has taken in mobile.

      Come on, TR, where is the balance? I don’t see an article about Nvidia gaining in mobile GPUs.

      This one is more complete as it contains all GPUs: [url<]http://jonpeddie.com/press-releases/details/graphics-shipments-in-q2-increased-2.5-over-last-quarter-and-5.5-over-last-/[/url<] [quote<]AMD’s total shipments of heterogeneous GPU/CPUs, i.e., APUs dropped 13.8% in the desktop from Q1, and 6.7% in notebooks Nvidia’s desktop discrete shipments dropped 10.4% from last quarter; however, the company increased mobile discrete shipments 19.2% largely due to share gains on Ivy Bridge which included Ultrabooks[/quote<]

    • jjj
    • 7 years ago

    lol, what the hell are you smoking? Nvidia was up in Q2, AMD down in discrete. Maybe in desktop AMD did better, but since I can’t read what you’ve got, I have no idea if your data is wrong or you made a mistake.

    • HighTech4US2
    • 7 years ago

    Mercury Research Q2-2012 PC Graphics Report says:

    [quote<]discrete AMD 14,000 units, 42.9% share, -2.1% from Q1, -2.5% from Q2-2011 nvidia 18,600 untis, 57.1% share, +2.1% from Q1, +2.5% from Q2-2011 [url<]http://investorvillage.com/smbd.asp?mb=476&mn=244502&pt=msg&mid=12013727[/url<][/quote<] Nvidia's actual earning also show gains of 15% in GPU sales since last quarter: [quote<]Our consumer GPU business – which includes desktop, notebook, memory, and license revenue from our patent cross license agreement with Intel – was up 15.3 percent from the prior quarter, at $668.3 million. [url<]http://phx.corporate-ir.net/External.File?item=UGFyZW50SUQ9MTQ5Mzg3fENoaWxkSUQ9LTF8VHlwZT0z&t=1[/url<] [/quote<] JPR also says: [quote<]Nvidia’s desktop discrete shipments dropped 10.4% from last quarter; however, the company increased mobile discrete shipments 19.2% largely due to share gains on Ivy Bridge which included Ultrabooks. The company will no longer report IGP shipments. [url<]http://jonpeddie.com/press-releases/details/graphics-shipments-in-q2-increased-2.5-over-last-quarter-and-5.5-over-last-/[/url<] [/quote<] Summary Nvidia [b<]gained[/b<] in GPU revenue share since last quarter and the same quarter last year.

      • OU812
      • 7 years ago

      Wow, down marked for pointing out actual facts with links and quotes from industry leaders (JPR & Mercury Research).

      Downmarkers: “Facts? We don’t need no stinkin’ facts. If the post doesn’t spin our company in a positive way, we down-mark it.”

        • flip-mode
        • 7 years ago

        It’s not for the facts. It’s for the shilling.

        • clone
        • 7 years ago

        The article is about market share, not revenue, making it a silly post.

          • HighTech4US2
          • 7 years ago

          No, the article is about AIB (add-in-board) market share and completely ignores mobile GPUs, and thus distorts what is actually happening in overall GPU market share.

          I am still waiting for TR’s article on how Nvidia gained overall market share in discrete GPUs vs AMD but since that would actually take a little effort it looks like that won’t ever happen.

          So yes, I will post facts and the AMD shills like flip-mode and his devoted followers will down mark me because they love to ignore any facts that don’t agree with their agenda.

          Your reply about a silly post is a prime example.

            • clone
            • 7 years ago

            Yes, AMD gained AIB market share.

            That is what the article is about, making your post silly, and you get consistently down-voted because you don’t want to see anything but Nvidia, making you guilty of exactly what you accuse/cry others of being… a shill.

            Those are the facts, and your post is off-topic and silly.

            • Goty
            • 7 years ago

            Oh, be nice to HighTech, he isn’t a shill; shills get paid for their work, he’s just a blind loyalist!

            • clone
            • 7 years ago

            I thought I was being reasonably polite about it.

      • BestJinjo
      • 7 years ago

      This article is about market share, not revenue, earnings or unit shipments. Those #s already don’t make any sense since 42.9% + 57.1% = 100%, and yet Matrox and other small players still sell GPUs. So no, your data is irrelevant.

      GeForce.com is this way ——————————->>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>

    • Krogoth
    • 7 years ago

    I attribute this growth to the clearance sales on the HD 6xxx series and on some of the HD 7xxx series (mostly 7850s and 7950s). The HD 77xx is also snagging some sales in its tier, and its only direct competition is discontinued Nvidia cards.

    680s and 670s are selling, but Nvidia cannot get enough of them out to the market to satisfy demand. The 660 only just came out a week ago, and AMD has already engaged in a price war with it (7850s and 7870s got cuts).

      • flip-mode
      • 7 years ago

      Who needs Jon Peddie when there’s Krogoth. Krogoth always knows.

        • vaultboy101
        • 7 years ago

        unless Krogoth is Jon Peddie….

          • flip-mode
          • 7 years ago

          Unlikely. Peddie does actual research and analysis. Krogoth reads TR and says “just what I expected, […], [made up fact]”. I’ve been watching HD 6xxx prices, and I can tell you that Krogoth has made up yet another fact with his “clearance sales on HD 6xxx” remark.

            • Krogoth
            • 7 years ago

            It is not that difficult to figure out.

            Until the recent price cuts on the HD 78xx-79xx family, the HD 68xxs and 69xxs were a steal at their price points. For almost a year, Nvidia had nothing to go against them. You were either stuck with underwhelming 550s and severely crippled versions of the 560 (320 and 288 cores) that couldn’t compete against the 68xx for roughly the same price, or 560s (384s and 448s) that were slightly overpriced but could compete against the 68xx and 69xx performance-wise.

            The 570 was in an odd place (couldn’t beat the 7870 and 7950, ran hotter, yet was around the same price point) until its 660 replacement came along.

            Nvidia had a hell of a time getting the 670 and 680 out in volume for the first few months after launch. Remember how Nvidia was complaining about TSMC and their yield issues with the 28nm process?

            IIRC, it wasn’t that long ago that you picked up a 6870 and griped about the fact that it could keep up with the 7850 and 7870 at launch for a fraction of the cost!

            • derFunkenstein
            • 7 years ago

            But they aren’t “clearance priced” – they have been priced basically the same since before the 7000 series debuted.

            edit: ooh, price cuts of $20, around 7%.

            [url<]http://camelegg.com/products?sq=radeon+6950[/url<]

            • flip-mode
            • 7 years ago

            HD 6950 was going for as low as $210 before the 7000s arrived.

            • derFunkenstein
            • 7 years ago

            None of the active 6950s show that in their history, but I wouldn’t be surprised if that was a sale price.

            • flip-mode
            • 7 years ago

            So, if I change the definitions of words then you make sense. Daft.

          • chuckula
          • 7 years ago

          Or Batman…

        • Goty
        • 7 years ago

        [quote<]Krogoth always knows.[/quote<] Maybe that's why he's so rarely impressed?

      • Deanjo
      • 7 years ago

      670/680s have been pretty easy to get for a while now, even on Newegg.

        • flip-mode
        • 7 years ago

        Quit using real facts – it’s ridiculous!

          • Deanjo
          • 7 years ago

          Sorry.

        • derFunkenstein
        • 7 years ago

        According to the press release (which is now live), “last quarter” is Q2 of the 2012 calendar year, which ended in June, and the 670/680 were in tight supply.

      • Phishy714
      • 7 years ago

      [url<]http://panzo.org/wp-content/uploads/2011/08/tiny_turtle_not_amused.jpg[/url<]

      • sschaem
      • 7 years ago

      Have you seen any sites out of stock of the 680 recently?

      What is your statement based on?

      • Bensam123
      • 7 years ago

      Clearance sale?

      • BestJinjo
      • 7 years ago

      There was a breakdown someone was passing around, and less than 5% of all consumers buy discrete GPUs for $350+. That means AMD had the market all to itself for 6 months with HD7750/7770/7850/7870 cards. It’s actually amazing how loyal NV’s customer base is, since NV only lost 2.6% market share. Against those sub-$350 HD7000 series cards and HD6850/6870/6950 cards, NV had nothing worth buying except for GTX670/680 cards until August 16th. If AMD had been 6-8 months late like that, I would bet it would have lost 5-10% market share. Even now AMD is forced to deliver better price/performance for people to consider it. NV marketing FTW!

    • jrr
    • 7 years ago

    I recently switched from nvidia to ati for eyefinity – let me know when I can run five displays as a single desktop off an nvidia card =]

      • Chrispy_
      • 7 years ago

      They’ll probably get around to making one if they ever stop farting around with 3D Vision.

        • Deanjo
        • 7 years ago

        Ya, because people that use 5 displays are greater in number than 3D users.

          • jdaven
          • 7 years ago

          There are probably more bitcoin users than 3d users.

            • Deanjo
            • 7 years ago

            Did I say otherwise?

            • sweatshopking
            • 7 years ago

            OH MAN, LET’S FIGHT ABOUT HOW MANY PEOPLE USE 3D!!!

            • Farting Bob
            • 7 years ago

            THERE’S NO NEED TO SHOUT!!!

            • moose17145
            • 7 years ago

            LOUD NOISES!!!!!!

            • entropy13
            • 7 years ago

            HA! HA! HA!

          • Chrispy_
          • 7 years ago

          Did I say otherwise?

          Nvidia’s priority is 3D, their multi-display support comes second.
          AMD’s priority is multi-display, their 3D support comes second.

            • Deanjo
            • 7 years ago

            The term “farting around” implies you think they should drop 3D in favor of 5 screens. Their farting around with 3D support probably brings in more revenue than AMD’s 5-monitor support. They are going to concentrate on whatever brings them the most revenue.

          • Bensam123
          • 7 years ago

          You know, it’s funny… ’cause I think Eyefinity is a better experience than 3D. If only the 3D users knew.

            • BestJinjo
            • 7 years ago

            Guild Wars 2 looks excellent with 5 monitors on 4x HD7970 CF (80% GPU utilization too).

            [url<]http://www.youtube.com/watch?v=0nHKqbNX4fs[/url<]

            • Bensam123
            • 7 years ago

            Yeah, I’ve seen that video before. I would do a 1×3 setup if only game developers would put some time into it so there isn’t gawdawful warping on the edges.

            • l33t-g4m3r
            • 7 years ago

            And a combination of the two into a product such as the Oculus Rift would be superior to both.

    • odizzido
    • 7 years ago

    My economic conditions are fine, but I am still running a 5850. My source cites that I have little reason to upgrade. I imagine I am not alone in this.

      • forumics
      • 7 years ago

      The 5850 was a monster when it was released, and it is very much still a monster today.
      I’m running a 5850 as well, and I think it’ll last me another year at least!

        • flip-mode
        • 7 years ago

        5870 here. I’ve only played one game on the thing. I need to rectify that.

      • jdaven
      • 7 years ago

      A 7850 is about 25% faster than the 5850 according to Anand Bench and consumes about 15% less power. This could appeal to some, especially since prices are now below $200 after rebate.

        • flip-mode
        • 7 years ago

        Anyone willing to spend $200 for a 25% improvement over a 5850 has had opportunities to do so already. Spending $200 to save a few dimes a month on power costs is completely whack. Anyone still sitting on a 5850 is probably doing what odizzido is doing: Waiting for the moment where the 5850 is no longer good enough to play their games.
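
        As a rough sanity check on that “few dimes a month” figure, here is a minimal sketch of the arithmetic; the ~30 W power delta, three hours of daily gaming, and $0.12/kWh rate below are illustrative assumptions, not measurements.

        ```python
        # Back-of-the-envelope power-cost savings from a more efficient card.
        # All inputs are illustrative assumptions (a ~30 W delta under load,
        # $0.12 per kWh, 3 hours of gaming per day), not measured figures.

        def monthly_savings(watts_saved, hours_per_day, price_per_kwh, days=30):
            """Dollars saved per month by drawing fewer watts while gaming."""
            kwh_saved = watts_saved / 1000 * hours_per_day * days
            return kwh_saved * price_per_kwh

        print(f"${monthly_savings(30, 3, 0.12):.2f} saved per month")  # roughly $0.32
        ```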

          • travbrad
          • 7 years ago

          Yep, that’s the biggest reason I haven’t upgraded my GTX460 “SC”. If I spent the same amount I spent on the GTX460 (almost 2 years ago), I wouldn’t see huge performance gains on a 1080p monitor. Sure, I can’t max every setting in BF3 and play smoothly, but BF3 looks pretty damn good even at medium/high settings. Most games aren’t as demanding as BF3 either (and certainly not as good-looking).

          • chµck
          • 7 years ago

          but it’s fantastic for people who have older tech

      • Yeats
      • 7 years ago

      Yep, I still have a GTX 460, see no reason to upgrade.

        • derFunkenstein
        • 7 years ago

        I *want* to upgrade my GTX 460, but I probably won’t.

        I mean, I can’t turn on AA and VSync in Diablo III when I’m in the sewers or in really populated fights. Turning off Vsync keeps my framerates around 40-45 instead of locked at 30, and turning off AA lets me have Vsync on and locked at 60.

        The same is true of Starcraft II – when I’m on creep or showing pylon power radius with lots of pylons as Protoss, super-ultra-max settings with AA turned on (it’s been added to the game natively) is chuggy. Turning off AA or going down to mere “high” settings gets me back to smooth-ville.

        Still, that’s not enough to justify an upgrade.

      • Bensam123
      • 7 years ago

      There are and always will be people who run their technology into the ground before they upgrade. A few of my friends do it this way and then scoff when they have to spend $1000 on a new system and don’t have the money lying around.

        • jihadjoe
        • 7 years ago

        Guilty as charged.

        I tend to keep hardware until modern stuff is really close to unplayable, but then I spend a bit more than midrange when I do get around to upgrading. Not the top end, but usually the next part down that performs almost as well and costs significantly less.

        Recently replaced a Q6600 B3 and an 8800GT (pretty baller for the time, I’m sure you’ll agree) with an i7-3930k and a GTX670. Probably going to keep both for quite a while again.

      • internetsandman
      • 7 years ago

      I have a 6950 that I bought used off Craigslist, and I was coming off of a laptop with a GT 330M inside it. I was blown away by how easily it handled games at 2560×1440, and while the enthusiast in me wants to jump up to the 7000 series, the rational part of me tells me I probably won’t notice the improvement, and my wallet will just get smaller without anything else actually getting better.

    • yogibbear
    • 7 years ago

    Just wait for Apple to sue Nvidia and AMD over a patent for “a device that can display visual interpretations of data on a screen.”

      • ub3r
      • 7 years ago

      More like a patent over.. “A digital device”..

        • yogibbear
        • 7 years ago

        More like … “A device”

          • MadManOriginal
          • 7 years ago

          More like … “A”

            • yogibbear
            • 7 years ago

            I would be surprised if they didn’t already have that patent and weren’t already in the process of suing Sesame Street.

            • Sargent Duck
            • 7 years ago

            I’m sure Apple is hard at work finding a way to patent breathing.

            • Yeats
            • 7 years ago

            The irony, of course, being that Apple is attempting to asphyxiate competition.

            • entropy13
            • 7 years ago

            So “an” is safe then.

            • derFunkenstein
            • 7 years ago

            No, it’s a derivative work of “a”.

            • MadManOriginal
            • 7 years ago

            Yup, 50% of ‘an’ is clearly copied!

            • sweatshopking
            • 7 years ago

            YOU GUYS ARE JUST HATERS. APPLE IS DOING THE SAME AS THE REST OF THE INDUSTRY. DON’T HATE THE PLAYER, HATE THE GAME – NEELY 2012
            seriously though. everyone has patents. if you could use them for a BILLION DOLLARS, WOULDN’T YOU?

            • entropy13
            • 7 years ago

            Use what for a billion dollars?

            • sweatshopking
            • 7 years ago

            insert mom joke.

            • derFunkenstein
            • 7 years ago

            Insert mom.

            Insert.

            Insertinsert.

            Oh yeah, baby.

            • MadManOriginal
            • 7 years ago

            I’ve patented all-caps article comment posts. Please pay up.

            • Chrispy_
            • 7 years ago

            Your reply appeared in a rectangle with rounded corners.

            I can hide you from Apple, for a fee….

            • sweatshopking
            • 7 years ago

            you can’t patent that. it’s totally ridiculous. anyway i’ve got plenty of prior art to prove that I ACTUALLY INVENTED IT. not that the jury cares. it might just bog them down.

            • Chrispy_
            • 7 years ago

            One might even use the word ‘slavishly’ derivative.

          • ub3r
          • 7 years ago

          Actually no, sorry…
          They are suing the patent office for copying their idea of a “Patent”.

    • ub3r
    • 7 years ago

    You need to also remember that Radeons are favored for Bitcoin mining.
    And bitcoins have only existed for 1.5 years.
    And many people purchased Radeons for this sole purpose.

    Thus… not reducing NVIDIA’s sales, just increasing AMD’s.

      • anotherengineer
      • 7 years ago

      *searches bitcoining*

      [url<]http://foreverrising.wordpress.com/2011/06/15/what-is-bitcoin-and-what-is-bitcoin-mining/[/url<] [url<]http://bitcoin.org/bitcoin.pdf[/url<] Looks like a Japanese name, the things they come up with, I personally like this......... [url<]http://www.youtube.com/watch?v=BkHrTC_N-sg[/url<]

        • RickyTick
        • 7 years ago

        Loved the tire ski jumping. That was fun.

        • derFunkenstein
        • 7 years ago

        The circular logic made my head spin. The comments on the first link are particularly telling – “for something to have value it must be of use to someone else”. Total non-answer to that very poignant question.

      • Waco
      • 7 years ago

      Do you really think Bitcoins contributed anything more than a scant .0000001% of the market? 😛

        • entropy13
        • 7 years ago

        Considering bitcoin mining is now on the same level of importance as piracy (not the ones near Somalia) for the US gov’t, slightly above “national security” and “terrorism”…

          • lilbuddhaman
          • 7 years ago

          At this point of difficulty in the bitcoin mining game, it almost costs more in power consumption than what you get back in bitcoins. The whole market is weird though. Imagine if you were an early adopter and churned out a few hundred bitcoins for relatively little time/computing effort… or even just bought them a year ago when they were sub-$1.00. They now sit at ~$10 per coin. Quite crazy stuff.

          /incoherent rambling

            • ub3r
            • 7 years ago

            Silk Road didn’t exist back then.. 😀 😀

            • BestJinjo
            • 7 years ago

            HD7870/7950/7970 cards are still profitable, but probably not for long. Still, that won’t really impact the market, since bitcoin mining is too niche and even in enthusiast circles very few people are taking the time to set it up. If enthusiasts this generation had actually taken the time to set up bitcoin mining, almost no 670/680 cards would have sold, because with the HD7950/7970, the cards would have paid for themselves fully in 6-8 months. That tells you right there how niche bitcoin mining is, since it hardly impacted sales of GTX670/680 cards.

            What we can conclude from this market share data is that the Steam Hardware Survey is not really representative of the entire market. AMD cards show very poor market penetration on Steam, but the reality of global discrete GPU shipments shows otherwise.
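
            As context for the power-cost and payback claims in this sub-thread, here is a minimal sketch of the mining arithmetic; the hash rate, power draw, card price, network difficulty, exchange rate, and electricity cost used are assumed period-typical values, not sourced figures.

            ```python
            # Rough GPU bitcoin-mining payback estimate, circa 2012.
            # Every input below is an illustrative assumption, not a measurement.

            def btc_per_day(hash_rate_mhs, difficulty, block_reward=50.0):
                """Expected BTC mined per day at a given hash rate and network difficulty."""
                hashes_per_day = hash_rate_mhs * 1e6 * 86_400
                return hashes_per_day / (difficulty * 2**32) * block_reward

            def payback_days(card_cost, hash_rate_mhs, watts, difficulty,
                             btc_price, price_per_kwh, block_reward=50.0):
                """Days until mining income minus power cost covers the card price."""
                revenue = btc_per_day(hash_rate_mhs, difficulty, block_reward) * btc_price
                power_cost = watts / 1000 * 24 * price_per_kwh
                profit = revenue - power_cost
                return float("inf") if profit <= 0 else card_cost / profit

            # Assumed HD 7970-class figures: ~650 MH/s, ~250 W, $430 card,
            # ~2.5M difficulty, $10/BTC, $0.12/kWh, 50 BTC block reward.
            print(f"{payback_days(430, 650, 250, 2.5e6, 10.0, 0.12):.0f} days to break even")
            ```

            Under those assumed numbers the card pays for itself in roughly seven and a half months, the same ballpark as the 6-8 month estimate above; raise the difficulty or drop the exchange rate and the payback stretches out quickly.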
