UPDATE: New Catalyst betas boost 7000-series performance

Later this week, AMD is going to release a fresh batch of beta Catalyst drivers. These new drivers will feature a gaggle of optimizations for graphics processors based on the Graphics Core Next architecture—that is, most members of the Radeon HD 7000-series family, from the 7750 on up.

These optimizations will purportedly yield substantial dividends in the form of performance increases, which should be visible across the board, in both DirectX 10 and DirectX 11 titles. AMD touts an “average” performance boost of 15%, but it says increases of as much as 30-40% can be seen in some Battlefield 3 maps. Only GCN-based offerings will benefit, though. AMD says it’s already had five years to optimize the VLIW architectures of 6000-series and older Radeons, but it’s only beginning to tap into the full potential of newer, GCN-based products.

Source: AMD. Note that the Y-axis isn’t zeroed.

In the case of the 12.11 beta drivers, tapping more deeply into that potential has involved a number of tweaks throughout the software stack. AMD says many of the tweaks have to do with memory management—specifically, the handling of textures and surfaces as they move back and forth across the GPU’s memory and caches. Other tweaks involved understanding how popular shaders run on GCN-based products and where bubbles occur in the pipeline. AMD says it’s also optimized thread and surface management, and it mentions miscellaneous, “generic level” improvements.

Radeon HD 7000-series users can purportedly look forward to further performance increases over the next year or so, too, as AMD continues to optimize its drivers for the GCN architecture.

Coupled with the new game bundles introduced today, these optimizations could further sway consumers toward Radeons during the holiday shopping rush. Of course, we’ll have to see for ourselves whether AMD’s performance claims check out in the real world. It’s not uncommon to see driver releases promise meaty performance increases that turn out only to apply in best-case scenarios. We already know the latest optimizations may yield only marginal benefits at higher tessellation and antialiasing levels.

Incidentally, AMD also has WHQL-certified 12.10 Catalyst drivers in the pipeline for this week. However, those drivers won’t feature the GCN optimizations. Radeon HD 7000-series users will want to grab the non-WHQL-certified 12.11 beta release in order to enjoy the best performance.

Update 4:00 PM: The Catalyst 12.11 beta drivers are now available from AMD’s knowledge base site. They support Windows 8, 7, and Vista.

AMD has also released the WHQL-certified Catalyst 12.10 drivers, which can be downloaded from AMD’s Support & Drivers page. Full release notes for these drivers can be perused here.

Comments closed
    • Deo Domuique
    • 7 years ago

    It’s O.K… Now, could we have an in-depth, intense scrutiny of 12.11 by you, TR? I’d really like a nice one on this… It will be interesting to see what might have changed, like power draw or anything else. I’d really appreciate it; I never trust such nice graphs made by Nvidia or AMD.

    • Firestarter
    • 7 years ago

    Just a heads up, after installing the 12.11 drivers, all text in XBMC has disappeared.

      • indeego
      • 7 years ago

      Dude you settled for text up until now.

    • Squeazle
    • 7 years ago

    Can you update their graph? Just for fun?

    • Firestarter
    • 7 years ago

    Installed it last night and did some Borderlands 2. With the added benefit of heavy placebo, the seat-of-the-pants-o-meter registered a Damn-Fine framerate.

      • Hattig
      • 7 years ago

      I approve of your FPS scale. What’s under “Damn-Fine”?

        • Firestarter
        • 7 years ago

        I’d say Pretty-Good, but it really depends on what kind of day it has been. Yesterday, for example, was a good day. Also, I had been consuming a tasty white wine that I accidentally bought because the bottle looks very similar to a wine that has been a favourite of ours for a while. Anyhow, drivers were installed, frames were generated, and psychos, rats and improperly large creatures were getting various critical bits blown off of their soon-to-be corpses at point-blank range with a nice shotgun. In addition, said actors in my spiel were set on fire by the aforementioned shotgun. You’d think that shiny new graphics drivers would do nothing to improve the screams of those burning alive, but somehow they did! All in all, it was a good game.

    • ronch
    • 7 years ago

    While these performance increases through driver updates can be seen as positives for both AMD and Nvidia, they can also be seen as glass-half-empty scenarios. They only mean that the earlier drivers weren’t as optimized or efficient. Both companies are expected to increase performance as much and as cheaply as possible for the sake of competition, and one method is to squeeze the last remaining drops of performance out of their hardware.

      • odizzido
      • 7 years ago

      Everything takes time… if they were to wait until their drivers were perfect, the GPU would be obsolete before it launched.

    • alienstorexxx
    • 7 years ago

    will you guys test these drivers?

      • Damage
      • 7 years ago

      Yep. Just have another project to finish first!

        • tbone8ty
        • 7 years ago

        😉

        • MadManOriginal
        • 7 years ago

        Nice! New drivers are something that’s not tested often enough in online reviews by major sites. I know it’s got to be time-consuming, but at the same time, reusing results from launch drivers for 6-12 months may not be representative of current performance either. TR to the rescue!

        • spuppy
        • 7 years ago

        Great! None of these claims matter until they’ve been looked at under a microscope.

        • BIF
        • 7 years ago

        So when Techreport tests the drivers, please also include tests for GPU folding.

        Folding is the main reason that I am planning to purchase more GPU than I “need” for my actual graphics work or gaming.

        So how much more bang for my buck would I get with the new ATI drivers as opposed to a comparable GTX 680 or 690?

      • xeridea
      • 7 years ago

      Other review sites have tested. TechPowerUp shows a 4-10% average gain, depending on the card:
      HD 7750: +4%
      HD 7770: +5%
      HD 7850: +4%
      HD 7870: +10%
      HD 7950: +7%
      HD 7970: +7%
      HD 7970 GHz: +7%

      Would be nice to see frame time charts to see how well higher latency frames are smoothed out, but gains have at least been verified. Also, it varies somewhat by game, but there is significant improvement across the board, unlike typical driver optimizations.
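As a quick sanity check on the per-card numbers quoted above, here is a small illustrative Python sketch (purely for arithmetic; the figures are the ones TechPowerUp reported, as relayed in this comment):

```python
# Per-card gains quoted from the TechPowerUp Catalyst 12.11 review, in percent.
gains = {
    "HD 7750": 4, "HD 7770": 5, "HD 7850": 4, "HD 7870": 10,
    "HD 7950": 7, "HD 7970": 7, "HD 7970 GHz": 7,
}

# Unweighted average across the seven cards.
average = sum(gains.values()) / len(gains)
print(f"Average gain across cards: {average:.1f}%")  # ~6.3%
```

That lines up with the "4-10% average, depending on card" summary, and is notably below AMD’s advertised 15% average.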

        • jdaven
        • 7 years ago

        According to the techpowerup.com review, the 7970 GHz Edition is now the fastest single-GPU graphics card, beating the 680 by 4% in the all-resolutions summary graph.

        [url<]http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/23.html[/url<]

        Good work, AMD!

          • BestJinjo
          • 7 years ago

          The 7970 GE has already been faster since June 2012:
          [url<]http://www.techpowerup.com/reviews/AMD/HD_7970_GHz_Edition/28.html[/url<]

          and even in recent testing before Catalyst 12.11:
          [url<]http://www.techpowerup.com/reviews/ASUS/HD_7970_Matrix/26.html[/url<]

          The reason the 680 looked better recently is that everyone kept focusing on BF3 and ignoring all the other games where it was losing. Games like Borderlands 2 kept the focus on the GTX 680 despite the 7970 GE beating it in Witcher 2, Metro 2033, Sleeping Dogs, Dirt Showdown, Alan Wake, etc. Almost all major websites were already reporting the HD 7970 GE as the faster card starting in June 2012, but most people don’t read reviews and were stuck in Spring 2012, when the GTX 680 was faster than the 925MHz 7970.

            • Hattig
            • 7 years ago

            You’ve hit the nail on the head, and AMD has actually done something clever (for once) in having a funky name for what would otherwise be a normal driver update release. This refocuses people on the current performance, not the performance in Spring.

        • tbone8ty
        • 7 years ago

        +1 for latency charts

        also image quality hasn’t been modified so this is another +1

        • Hattig
        • 7 years ago

        Thanks for that table – looks like the 7870 really benefits – and 1080p is an ideal resolution for that card. 7970 owners should be triple monitor gaming at the very least 🙂

        And free significant performance enhancements are a win-win regardless.

    • MadManOriginal
    • 7 years ago

    The best part about the graph, which no one has mentioned yet: not only did they exaggerate the gains by starting at 90%, [b<]they also exaggerated them by ending the graph at 110%![/b<] If they’d shown the full gains above 110% for Unigine and Civ 5, it would have made the other gains look smaller. lolol.
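To put a number on how much a truncated axis inflates the visual difference, here is a small illustrative sketch (hypothetical numbers, not AMD’s actual chart data): a bar for a 7% gain drawn on an axis starting at 90% looks 70% taller than the baseline bar, versus only 7% taller on a zeroed axis.

```python
def apparent_ratio(base, improved, axis_min):
    """Ratio of drawn bar heights when the axis starts at axis_min
    instead of zero."""
    return (improved - axis_min) / (base - axis_min)

base, improved = 100, 107  # baseline = 100%, a hypothetical 7% driver gain

print(apparent_ratio(base, improved, axis_min=0))   # zeroed axis: 1.07
print(apparent_ratio(base, improved, axis_min=90))  # axis from 90: 1.7
```

The underlying data is identical in both cases; only the axis origin changes how dramatic the bars look.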

    • MadManOriginal
    • 7 years ago

    ‘purportedly’…TR’s favorite word which I pretty much never read anywhere else, ever!

      • Damage
      • 7 years ago

      …purportedly.

      • HisDivineOrder
      • 7 years ago

      “Purportedly, they love the word, ‘purportedly.’ However, I do not think the purported use of the word, ‘purportedly’ truly matches up to the purported usage of said word. Purportedly.”

    • flip-mode
    • 7 years ago

    People, while we all hate it, the graph is done just the way a marketing department should do a graph. Any marketing department that doesn’t accentuate the positive is inept.

      • Airmantharp
      • 7 years ago

      Not just that, but the graph is designed to show the differences visually; percentage differences like the ones Nvidia publishes with each driver release aren’t nearly as dramatic. When you’re trying to communicate…

      • cynan
      • 7 years ago

      Yup. It’s pretty well taken for granted that the vast majority of all marketing is at least a little misleading, if not downright dishonest.

        • sunner
        • 7 years ago

        “Yup. It’s pretty well taken for granted that the vast majority of all marketing is at least a little misleading, if not downright dishonest”.

        Cynan, look at it from Marketing viewpoint….

        “Our ‘dishonesty’ is done for noble purpose of relieving JoeSixpak of $$ he might otherwise waste on unhealthy drugs&booze”.

        • rrr
        • 7 years ago

        How is starting at something other than 0% to accentuate the differences dishonest?

      • Parallax
      • 7 years ago

      Wait, didn’t AMD just lay off a bunch of people? Does this mean they let engineers go BEFORE the entire marketing department?

    • south side sammy
    • 7 years ago

    nice chart. reminds me of an auto insurance commercial. the fine print is too small and out of focus to read…..

    • indeego
    • 7 years ago

    What happens when they release a version without “never settle?” Does this imply AMD has…um…settled…after all?

    And yeah the graph is fscking redonculous. This is what happens when you cut huge swaths of marketing departments, eh?

    • Goty
    • 7 years ago

    *cue the complaining by people who can’t read graphs*

    Ah, I see I’m already too late.

    Y’know, life must be pretty good if the biggest thing you have to complain about is the domain of a graph.

      • Arclight
      • 7 years ago

      This does qualify as a first-world problem: “AMD just posted a graph showing performance improvements, but I know they are exaggerated, so I have to wait until someone reviews them.”

    • Arclight
    • 7 years ago

    Ah dat graph

    • Meadows
    • 7 years ago

    NEVER SETTLE driver? What’s next, the RIGHT ON TIME wall-mounted clock? DEAD MEAT sausage products? SO HARDCORE walnut packets? GOOD FOR YOU mayonnaise?

    Ridiculous.

      • bittermann
      • 7 years ago

      Never go full retard…

        • shaq_mobile
        • 7 years ago

        LOL hahahahaha

        well i did like “DEAD MEAT sausage products”

        • flip-mode
        • 7 years ago

        Epic reply.

    • cynan
    • 7 years ago

    According to [url=http://www.anandtech.com/show/6393/amds-holiday-plans-cat1211-new-bundle<]AnandTech[/url<], while these new drivers offer a significant performance boost at lower resolutions, they apparently don’t do much for higher ones (i.e., the HD 7970 shows a significant boost at 1080p but not at 1600p), where it counts [i<]more (under the majority of conceived usage scenarios as determined from an independent pole of PC gamers)[/i<].

    Oh well, better than nothing. The 7900-series Radeon cards were always somewhat underperformers at lower resolutions relative to Kepler. Maybe these drivers address that deficit.

    Edit: Added link

    Super late edit in italics to appease the 1080p sticklers below 😉

      • Firestarter
      • 7 years ago

      The German Computerbase.de also found a healthy boost in performance, also at 1920×1080:
      [url<]http://www.computerbase.de/artikel/grafikkarten/2012/bericht-amd-catalyst-12.11-beta/4/[/url<]

        • Firestarter
        • 7 years ago

        The release was apparently delayed because of a bug in BF3: [url<]http://youtu.be/hT2nnvvRK-I[/url<]

        According to [url=http://www.pcgameshardware.de/AMD-Radeon-Hardware-255597/News/AMD-Never-Settle-170-USD-Spielepaket-und-der-Wundertreiber-Catalyst-1211-Beta-1031436/<]pcgameshardware.de[/url<], the problem occurs on about every fifth game launch on certain configurations. The version that was reviewed has been leaked: [url<]http://www.pctreiber.net/asrock-bios-downloads?did=556[/url<]

      • shaq_mobile
      • 7 years ago

      Why does it only count at 1600p? The majority of gamers right now *probably* use 1080p. Besides, it’s all free for the end user.

        • DancinJack
        • 7 years ago

        It doesn’t only count at that resolution. Nor is that what he said. If you’re buying a 7900 series card though, wasting that power on a 1080p resolution is nuts.

          • bittermann
          • 7 years ago

          I guess the old saying fits, “Opinions are like aholes, etc…”

          • Firestarter
          • 7 years ago

          [quote<] If you're buying a 7900 series card though, wasting that power on a 1080p resolution is nuts.[/quote<] 120hz

          • Waco
          • 7 years ago

          Bull. A 7950/7970 is a great buy for 1080p gaming. Today’s overkill is the card that will last for years.

          My 4870X2 lasted me from launch day till earlier this year. That’s a LONG freakin’ time.

            • rrr
            • 7 years ago

            And your wallet cried because of electric bills, I assume.

            • clone
            • 7 years ago

            Unless you’ve been living in the dark your whole life, never turning on a 60-watt light bulb, no TV, no furnace, no AC, no fridge, no home stereo, only using a smartphone you charge on your neighbor’s outside socket… using an LED flashlight for shaving… all the while paying $2.00 for each bottle of water…

            If they haven’t been doing this, then no, they likely never noticed a difference in the electric bills.

        • BestJinjo
        • 7 years ago

        Hardware Canucks shows the HD 7970 GE now beating the GTX 680 by 9-15% at 2560×1600:
        [url<]http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/57413-amd-12-11-never-settle-driver-performance-17.html[/url<]

        A Gigabyte Windforce 3X HD 7970 GHz Edition with 1100MHz clocks for $450 on Newegg just obsoleted all GTX 680s at $450+ for high-res gamers.

      • Arclight
      • 7 years ago

      Thanks for the link. Wow, the BF3 improvement is pretty significant. Nice.

      • flip-mode
      • 7 years ago

      My opinion: 1080P counts.

        • cynan
        • 7 years ago

        Lol. I honestly didn’t mean to tread on the pride of gamers playing at 1080p. Nor did I mean to imply that people with sub 4MP displays shouldn’t be buying HD 7900 cards.

        I just thought it was sort of obvious that eking out a few more FPS would be of greater benefit, by and large, to someone with a single HD 7970 gaming at 2560×1600, who in a handful of games may run into situations where a smooth gaming experience comes into serious jeopardy where the same gamer playing at 1080p with the same card would not. Perhaps those with 120Hz 1080p monitors may also find themselves in this situation.

        Yes (performance at) 1080p is important too.

          • flip-mode
          • 7 years ago

          Ah, yes, that stands to reason.

          • A_Pickle
          • 7 years ago

          I dunno, man. Gaming at 1080P, with all the settings cranked up, is still far and away better than any console. Right now, I’m packing a Phenom II X6 1055T and a Radeon HD 6850. I’d love an HD 7950.

      • BoilerGamer
      • 7 years ago

      Actually, reading the TechPowerUp review:
      [url<]http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/23.html[/url<]
      [url<]http://www.techpowerup.com/reviews/AMD/Catalyst_12.11_Performance/5.html[/url<]

      the % gain you get at 1600p is about 5% (4 FPS) for most 7XXX cards. While this is smaller than the 7+% (6 FPS) at 1080p, it is still nothing to sniff at. In fact, a 4 FPS increase at 1600p is probably more significant than a 7 FPS increase at 1080p, given how low the 1600p base FPS was.

      • derFunkenstein
      • 7 years ago

      That’s kind of what we should expect. At higher resolutions, there are other factors that make compute performance matter only up to a point. ROP throughput and memory bandwidth start to get in the way.

        • cynan
        • 7 years ago

        Yeah. The AnandTech article said something similar. These drivers are most likely optimizing shader efficiency, while, as you say, memory bandwidth is less driver-dependent (more of a fixed quantity), and this comes into play at higher resolutions.

          • Airmantharp
          • 7 years ago

          I’m trying to figure out why memory bandwidth would be resolution-dependent; does the GPU somehow render fewer pixels per second as resolution increases? FPS should go down, but shouldn’t the number of pixels rendered per second be mostly the same?

          Restated, does FPS decrease proportionately more than resolution increases with increasing resolution?

          I honestly think it’s the opposite. As resolution increases, other dependencies are eclipsed by GPU performance; since memory bandwidth doesn’t change, it should be less of a limitation as resolution increases, not more.

            • derFunkenstein
            • 7 years ago

            Nobody said it’s resolution dependent. I just said that it’s a scarce commodity at higher resolutions and as a result it becomes an issue before the un-optimized programmability. If you can only push so much down the hallway, making a bigger demand creates a bigger block.

            • Airmantharp
            • 7 years ago

            My point is that it’s a scarce commodity regardless of resolution. I don’t see how it relates to the discussion.

            • willmore
            • 7 years ago

            Caches are of a fixed size and textures, etc. grow with resolution, so more cache misses mean more loads from memory.

            • Airmantharp
            • 7 years ago

            Textures grow with resolution? Please explain.

            There may be a few things that are technically more limited by increasing resolution, but I’ve usually found (and almost always with modern GPUs) that pixels/second increases with increasing resolution.

      • sschaem
      • 7 years ago

      7970?

      Dragon Age 2, 2560×1600: 35 FPS -> 47 FPS
      Over 30% performance increase

      WoW, 2560×1600: 63 FPS -> 73 FPS
      Over 10% performance increase

      BF3, 2560×1600: 46.6 FPS -> 49.9 FPS
      7% performance increase
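The relative gains quoted above can be checked with a quick Python sketch (using only the before/after framerates from this comment):

```python
def pct_gain(before, after):
    """Relative improvement between two framerates, as a percentage."""
    return (after / before - 1) * 100

# Before/after framerates quoted above (HD 7970 at 2560x1600).
results = {
    "Dragon Age 2": (35, 47),
    "WoW": (63, 73),
    "BF3": (46.6, 49.9),
}

for game, (before, after) in results.items():
    print(f"{game}: +{pct_gain(before, after):.1f}%")
```

This gives roughly +34%, +16%, and +7%, consistent with the "over 30%", "over 10%", and "7%" figures.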

      • ZGradt
      • 7 years ago

      That’s to be expected. High resolutions are fill-rate bound, and no software trickery can get around that.

      • willmore
      • 7 years ago

      Misspelled ‘poll’.

        • cynan
        • 7 years ago

        +1 Ah crap. I did indeed. Sometimes homonyms are hard ;-(

          • willmore
          • 7 years ago

          I can only imagine what hell English must be for ESL learners.

      • Deo Domuique
      • 7 years ago

      So 1080p is one of the “lower resolutions” and 1600p one of the “higher resolutions”? Are you kidding me? We’re talking about the two biggest resolutions, and 1080p is widely used, at least relative to 1600p. I won’t even mention the vast majority using even lower ones…

      Honestly, when you said “lower resolutions” I thought you were talking about 1366×768 etc.

        • Firestarter
        • 7 years ago

        Pish, youngins. I fondly remember when 320×200 was the standard, and getting Quake to run in 400×300 VESA LFB mode was like getting your eyes lasered, for free! That was way back when getting your eyes lasered was at the very cutting edge of medical technology.

        Then, of course, came the 3DFX Voodoo.

        • cynan
        • 7 years ago

        If you are a PC gamer/enthusiast and are stuck with a [b<]desktop[/b<] display of 1366×768 or lower, then man, I feel for ya. I thought I was late to the LCD display game, but even my 19″ analog CRT from 1998 was 1280×1024, and the second-hand 20″ I got in the early 2000s could do 1600×1200.

        Furthermore, this whole article is not directed at you. It is for people with HD 7000-series video cards. If you’ve purchased an HD 7000-series card before first updating your desktop display past 1366×768, you’re doing it wrong. For enthusiast gaming (on a desktop, and we’re largely talking about desktop GPUs here), 1080p is pretty much near the lower end of the spectrum resolution-wise these days, like it or not.

          • clone
          • 7 years ago

          just trying something out [quote=”whatever”<]blah blah blah[/quote<]

    • Firestarter
    • 7 years ago

    Well, a few percent here and there is always cool, but I’m always pretty skeptical about this kind of claim. Maybe if the test rigs aren’t already busy testing something else, you could do a quickie average-FPS test to see if they pulled it off? Pretty please?

    Sugar on top? 😀

    • Shambles
    • 7 years ago

    Oh look, a dishonestly proportioned graph. AMD and nVidia. So different, so much the same.

      • Meadows
      • 7 years ago

      Had they drawn the graphs starting from zero, the differences would’ve been invisible. And still, this is a driver; users should be glad that software alone is occasionally able to improve performance further at all.

        • cynan
        • 7 years ago

        Misleadingly proportioned visuals are still misleading.

          • Meadows
          • 7 years ago

            Not misleading; they labeled the scale.

            • cynan
            • 7 years ago

            The fact that they labeled the Y-axis correctly means the graph is not incorrect (the default assumption would be a zero starting point). However, by virtue of the chosen scale, it is still misleading.

            • MadManOriginal
            • 7 years ago

            “…However, by virtue of chosen scale, it is still [s<]misleading[/s<] marketing." FTFY!

      • Ryhadar
      • 7 years ago

      If this graph came without a labeled Y-axis, I would agree with you. Since it does, about the only “dishonest” thing about it is that the Civ 5 and Unigine benchmarks are “soooo good” that AMD wouldn’t finish drawing the bars, which looks stupid, by the way.

        • rrr
        • 7 years ago

        Not sure why you got downvoted. I guess some ‘savvy’ TR readers are too mystified by numbers; they just look at the bars and get misled all the time, instead of thinking for themselves.

      • Alexko
      • 7 years ago

      Ordinarily I’d agree with you, but for a driver update, I think it’s reasonable, since those typically provide only small improvements.
