AMD unveils 2011 GPU roadmap

In addition to the extensive CPU and APU information released yesterday, AMD also found the time to talk graphics at yesterday’s Financial Analyst Day. Matt Skynner, general manager of AMD’s graphics division, showed roadmaps for both desktop and mobile discrete GPUs.


What’s new in that desktop roadmap? AMD’s high-end Radeon HD 6990, code-named Antilles, is set for release next quarter. The Radeon HD 6970 and 6950 cards, both sporting the graphics chip code-named Cayman, are on track to launch by the end of the year.

It’s worth noting that the cards are positioned in this slide in order of performance. Apparently, only the very high end of the 6900 series will outdo the existing Radeon HD 5970. At the low end, Turks and Caicos will merely match, and in some cases fall short of, the performance found in today’s 5700-series Radeons.

On the mobile GPU front, Skynner said AMD intends to take a "two-prong" approach to each market segment—something clearly indicated in the chart above. Skynner explained that having high-end and low-end options in each performance class will let AMD raise its average selling prices. Interestingly, the above slide tells us the ultra-high-end Blackcomb chip will be a 40-nm part, but it offers no such information about the rest of the 2011 lineup. Perhaps the color coding hints that not all members of the Vancouver series will be based on the same fab process.

On a related note, while discussing AMD’s discrete mobile GPU efforts, Skynner claimed that AMD has enjoyed a jump in market share from 24% to 62% over the past couple of years.

Comments closed
    • jdaven
    • 9 years ago

    Fudzilla just posted this article about an hour ago.

    [http://www.fudzilla.com/graphics/item/20833-amd-cayman-pushed-back-to-december-13] They received a definitive launch day of December 13th for Cayman. Also, they received word that the 6970 will be the fastest single GPU. From the article: "Cayman will still be on retail/e-tail shops before Christmas, and if the performance is right, will end up to be the fastest single GPU card this year." We'll know for sure in about one month.

    • bimmerlovere39
    • 9 years ago

    Just noticed they called the 4700 series 55nm, which it wasn’t. Accuracy fail 😀

    • amirol
    • 9 years ago

    In RAW power 580 will play around 5770 not even more but in gameplay rates its good I wish a day to use all this high Tera Flops ..wowww earth quake no heart quake will happen.

      • TaBoVilla
      • 9 years ago

      but also moving stick around block of ice in boiling is not like monday is over for next week in watch by my lunch over old gpus.. =(

      man, I’ve always considered beach sand in pancake for market share to be honey in the mustard when it rains, and that happened twice this lcd!

        • sweatshopking
        • 9 years ago

        lol YOU WIN +1 INTERNET.

    • amirol
    • 9 years ago

    AMD s Heartquake shocked us….Go ahead AMD its your time to beat and eat the BEATEN APPLE.

      • dpaus
      • 9 years ago

      Are we allowed to say “beat and eat” on here??

      • khands
      • 9 years ago

      Just out of curiosity, how long have you studied English?

        • sweatshopking
        • 9 years ago

        1 HOUR!!!!!!!!!

    • Meadows
    • 9 years ago

    And at last, after all these years I fall victim to the reply issue as well. Navigate away accidentally, navigate “back”, the number of the comment you’re replying to is still there but the browser couldn’t care less anymore.

    • thermistor
    • 9 years ago

    #19…Can your console do 3-monitor Eyefinity?

    #1…Can the nVidia GTX 580?

    I know what my next card will be and why, just waiting for the holiday. I already have my 22″ samsung flanked on either side with 2 other 22″ samsungs…waiting breathlessly. I’ll let y’all know how it turns out.

      • NeelyCam
      • 9 years ago

      “#19…Can your console do 3-monitor Eyefinity?” Who cares? Until the bezels are gone, I sure don’t.

        • BobbinThreadbare
        • 9 years ago

        With 3 monitors, bezels shouldn’t be a big problem.

    • marvelous
    • 9 years ago

    AMD will probably want to put some distance between their 68×0 and 69×0.

    If the 6870 is $240,

    I’m guessing the 6950 will be upwards of $300, probably $329.

    The 6970: $399, if not a little more.

    • mcnabney
    • 9 years ago

    Hate to be a Negative Nellie, but in the land of console ports, why do any of these cards even matter? So you have 3-4 games per year that can really shine on them versus something like a Radeon 4700?

      • OneArmedScissor
      • 9 years ago

      *Cue “But…but…look at Metro 2033 benchmarks!” replies*

      • RMSe17
      • 9 years ago

      Maybe the industry will make games that have way better capabilities on the PC than on console. A very well-designed engine should be able to offer such a dynamic difference based on the hardware. Something that looks like next-gen Crysis on a PC while looking like a typical UE3 game on a console… At least I can always hope 🙁

        • Kurotetsu
        • 9 years ago

        The industry won’t do anything like that unless Nvidia or AMD (more likely Nvidia) pays them to. Just because the hardware is there doesn’t mean anyone will bother to use it (without proper motivation).

    • dpaus
    • 9 years ago

    The low end of the low-end is “Robson”??!!? Shoulda been “Gastown”

    • phez
    • 9 years ago

    What ever happened to 28nm?

      • khands
      • 9 years ago

      It was pushed back. You’ll see the 7000 series on them as soon as it’s viable though.

      • OneArmedScissor
      • 9 years ago

      Nothing, actually. They did scrap 32nm, though, forcing them to go with 28nm further down the road.

    • spiritwalker2222
    • 9 years ago

    Is it just me, or does anyone else get the impression the 6990 will just slightly outperform the 5970?

    I already feel disappointed.

      • khands
      • 9 years ago

      It’s obviously not going to be the same jump in performance as the last couple of generations, but until they can drop a node I wouldn’t expect it to be.

    • tejas84
    • 9 years ago

    Seems like AMD is admitting that the 6970 won’t beat the 5970.

    Therefore the GTX 580 will remain the fastest single GPU but of course the 6990 will take the top prize as fastest card…unless nvidia has another surprise planned!

      • bittermann
      • 9 years ago

      Please enlighten us, since you have no actual benchmarks to prove anything you say.

      • jdaven
      • 9 years ago

      The GTX 580 also doesn’t beat the 5970 in every benchmark. Techpowerup had the 5970 beat the GTX 580 by about 6% in the summary table. So that puts the 6970 slightly below the 5970 as the chart above shows or right about the performance of the GTX 580.

      Guys, we are talking about a few percent in performance here. I seriously doubt many of you would choose the GTX 580 over the 6970 if the performance difference were a mere 5% or less. These two cards are going to be very close.

        • bittermann
        • 9 years ago

        I hope they are that close in performance! Price wars here we come…and with memory prices falling this Holiday season might be a great time to upgrade.

      • [+Duracell-]
      • 9 years ago

      I don’t think the HD 6970 was intended to displace the HD 5970, anyways. It’s a single GPU solution, and the 6xxx generation is more of an evolutionary step than a total architecture change.

      Antilles was intended to replace it.

      • can-a-tuna
      • 9 years ago

      Why should it beat a dual-GPU card? You’re not making any sense. It’s enough that they’ll beat the GTX 580.

      • TaBoVilla
      • 9 years ago

      Your username tejas84 always reminds me of what a major failure the NetBurst-based Tejas processor was, getting canceled and all, cast into oblivion for being such a hot-head. Why don’t you just do the same?

        • Meadows
        • 9 years ago

        g{

          • TaBoVilla
          • 9 years ago

          I failed miserably =D

    • internetsandman
    • 9 years ago

    Living in Vancouver myself, it’s pretty sweet to see all of those references to the nearby ski resorts and attractions

    • Meadows
    • 9 years ago

    I guess they had to show /[

      • bdwilcox
      • 9 years ago

      I wish they’d show the 6950, which is what I’m hankering for.

        • khands
        • 9 years ago

        I’m pretty sure it’s going to release at either $300 or $350, the difference of which has pretty staggering consequences actually.

          • bdwilcox
          • 9 years ago

          “the difference of which has pretty staggering consequences actually” Like what? I’m really not following you.

            • khands
            • 9 years ago

            At $300 it’s low enough that a price war would push pretty much everything else they have below it down as well; at $350 they’ve got room to avoid stepping on the 6870 should Nvidia invade that price space.

          • OneArmedScissor
          • 9 years ago

          There’s a $70 difference just between the 6850 and 6870. $300 would be giving it away.

            • [+Duracell-]
            • 9 years ago

            For all we know, it could end up being:

            HD 6950 – $300
            HD 6970 – $400
            HD 6990 – $500

            Since the 69xx series is going to be targeted towards the performance/enthusiast crowd, it’ll make sense to price it accordingly, since most people that want that level of performance will either run crossfire/SLI or just drop the money on a card like this.

            If we can extrapolate performance from that chart alone, the HD 6950 should be priced around where the HD 5870 is right now, which you can find from $300-$350.

            But then again, all speculation.

            • Vaughn
            • 9 years ago

            HD 6950 – $300
            HD 6970 – $400
            HD 6990 – $500

            This looks pretty accurate, Duracell, but I believe the 6970 will be on par with the 580 and carry the same $499 price tag. The 6990 will probably be $600, as it should destroy all.

            $300-$350 seems about right for the 6950.

            • urbain
            • 9 years ago

            I think the 6970 will outperform the 580. Based on the reviews of the 580, it’s only 20-30% faster than the 480, which means some 30-40% faster than the 6870. So if the 6970 is 80-100% faster than the 6870, it’s practically going to be at least 40% faster than the 580 “in most cases”.
            And of course the 6990 will be a GOD that I am under.

            • khands
            • 9 years ago

            I really don’t see that kind of difference without a die shrink. It’d be awesome, but I don’t think a simple rebalancing and a couple of tweaks can eke out that kind of performance.

            • Silus
            • 9 years ago

            That’s a pretty big if. From the HD 4890 to the HD 5870, the performance increase was 35-40%, 50% maximum and only in some cases. But from the HD 6870 to the HD 6970 you’re assuming it will be 80-100%? Good luck with that 🙂

            The only time that really happened (actually more than 100% in most instances) was when NVIDIA introduced the G80 (8800 GTX), and that represented a major shift in GPU architectures, since they went from fixed units for specific tasks to a unified approach. This won’t happen again any time soon.

            • khands
            • 9 years ago

            I thought the 5870 was about 80-90% faster than the 4870. Slightly under the 4870×2.

            • Silus
            • 9 years ago

            Don’t let me stop you. Just read the reviews at launch. And I didn’t say HD 4870; I said HD 4890 (which isn’t that much different from the HD 4870 anyway).

            • khands
            • 9 years ago

            Just double-checked Tech Report’s 5870 article; it did pretty much match the 4870 X2 and averaged about 33-50% faster than the 4890, which means the 4890 was a better card than I remembered it being.

            Also, yes, I had misread your prior post as 4870 rather than 4890.

      • can-a-tuna
      • 9 years ago

      Jen-Hsun’s eyes will pop out when he sees Cayman.

        • Meadows
        • 9 years ago

        Unlikely; I don’t doubt NVidia are hard at work too.

          • JustAnEngineer
          • 9 years ago

          And what indications do we have that NVidia are “hard at work” *[
