Radeon Software 16.7.1 addresses RX 480 power draw concerns

Late yesterday, AMD addressed concerns regarding the bus power draw of its Radeon RX 480 graphics card in a statement to the press and in a post on the AMD Gaming Facebook page. The upcoming Radeon Software 16.7.1 update will include changes that lower the amount of power the card draws through the PCIe slot. The update will also add an option called "Compatibility Mode" that reduces the card's total power consumption, presumably for the lowest-end systems out there. AMD says Compatibility Mode will have a "minimal" performance impact.

Radeon Software 16.7.1 also includes a set of optimizations for the Polaris architecture that purportedly increases performance by as much as three percent in some games. AMD says this performance increase should help to offset any performance decreases that users experience by turning on Compatibility Mode in Radeon Settings.

AMD notes that despite the fix, it still believes that the RX 480's power draw behavior as configured at launch doesn't threaten motherboards or other components in a system. RX 480 owners who are nonetheless concerned should be getting peace of mind soon, though: AMD will be releasing Radeon Software 16.7.1 "in the next 48 hours." We'll have to wait and see what folks with in-depth power testing equipment find out once they get this new driver-card combo on the test bench.

The full text of AMD's statement follows:

We promised an update today (July 5, 2016) following concerns around the Radeon™ RX 480 drawing excess current from the PCIe bus. Although we are confident that the levels of reported power draws by the Radeon RX 480 do not pose a risk of damage to motherboards or other PC components based on expected usage, we are serious about addressing this topic and allaying outstanding concerns. Towards that end, we assembled a worldwide team this past weekend to investigate and develop a driver update to improve the power draw. We’re pleased to report that this driver—Radeon Software 16.7.1—is now undergoing final testing and will be released to the public in the next 48 hours.

In this driver we’ve implemented a change to address power distribution on the Radeon RX 480 – this change will lower current drawn from the PCIe bus.

Separately, we’ve also included an option to reduce total power with minimal performance impact. Users will find this as the “compatibility” UI toggle in the Global Settings menu of Radeon Settings. This toggle is “off” by default.

Finally, we’ve implemented a collection of performance improvements for the Polaris architecture that yield performance uplifts in popular game titles of up to 3%¹. These optimizations are designed to improve the performance of the Radeon RX 480, and should substantially offset the performance impact for users who choose to activate the “compatibility” toggle.

AMD is committed to delivering high quality and high performance products, and we’ll continue to provide users with more control over their product’s performance and efficiency. We appreciate all the feedback so far, and we’ll continue to bring further performance and performance/W optimizations to the Radeon RX 480.

1: Based on data running ’Total War: Warhammer’, ultra settings, 1080p resolution: Radeon Software 16.6.2 74.2 FPS vs. Radeon Software 16.7.1 78.3 FPS; Metro: Last Light, very high settings, 1080p resolution: 80.9 FPS vs. 82.7 FPS; The Witcher 3, ultra settings, 1440p: 31.5 FPS vs. 32.5 FPS; Far Cry 4, ultra settings, 1440p: 54.65 FPS vs. 56.38 FPS; 3DMark11 Extreme: 22.8 vs. 23.7. System config: Core i7-5960X, 16GB DDR4-2666, Gigabyte X99-UD4, Windows 10 64-bit. Performance figures are not averages and may vary from run to run.
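
For scale, here is a quick sketch of what AMD's own footnote figures work out to per title, with the FPS values copied straight from the footnote above. Note that the Total War and 3DMark deltas land above the headline 3 percent.

```python
# Per-title uplift implied by AMD's footnote (Radeon Software 16.6.2 vs. 16.7.1).
results = {
    "Total War: Warhammer (1080p ultra)":  (74.2, 78.3),
    "Metro: Last Light (1080p very high)": (80.9, 82.7),
    "The Witcher 3 (1440p ultra)":         (31.5, 32.5),
    "Far Cry 4 (1440p ultra)":             (54.65, 56.38),
    "3DMark11 Extreme":                    (22.8, 23.7),
}

for title, (old_fps, new_fps) in results.items():
    print(f"{title}: {100 * (new_fps - old_fps) / old_fps:+.1f}%")
```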


Comments closed
    • Pancake
    • 3 years ago

    From what I’ve read around the web, even in “compatibility” mode the PCIe power is still right at the limit, if not slightly over. I wouldn’t be comfortable with that. I have to maintain my “do not buy” recommendation.

    • willyolio
    • 3 years ago

    Is TR going to do a test to see how the new drivers affect system power draw and performance?

    • ronch
    • 3 years ago

    [u]Even if[/u] they don't go beyond the PCIe power spec, it's obvious they intended to draw the maximum allowable current. So why are they encouraging overclocking so much so that they even went out of their way to produce that OC utility called... "What, man??!" They must be aware that any sort of OC will immediately make the card exceed the PCIe power spec.

      • Mr Bill
      • 3 years ago

      The consensus appears to be that the VRMs on this card are hugely over-spec’ed and that altering the balance to draw less power from the PCIe bus will hardly strain the remaining VRMs that take power from the PCIe cable. You can follow along here: [url=https://techreport.com/forums/viewtopic.php?f=3&t=118144&start=180]About that Polaris Power Issue[/url]. chuckula already linked all this below; he was the thread starter.

      • EndlessWaves
      • 3 years ago

      Yes, and?

      Overclocking is by definition pushing things beyond spec. Violating third-party specifications is perfectly common with other components. For example, nobody bats an eyelid at recommending a heatsink above Intel’s CPU socket weight limit. Even the popular 212 Evo exceeds the spec for the LGA 1150 socket.

    • ronch
    • 3 years ago

    I am quite sure AMD wanted to sell as many RX 480s as possible before Nvidia could respond. They know Nvidia won’t sit around while they introduce a card like this, so they’re gonna milk the 480 for all it’s worth as fast as they can. It’s just a real pity Nvidia didn’t even have to stick out their foot for AMD to trip and splat their face on the floor.

    Nonetheless, I’d go for one of those newfangled 8-pin RX 480s if I were planning to get a 480. I mean, why take the chance?

    • HisDivineOrder
    • 3 years ago

    This whole thing reminds me of the R9 290/X. That was another card where nVidia rained on AMD’s parade weeks ahead of launch, AMD was forced to overclock out the gate beyond the power/heat they were expecting, shenanigans happened, and suddenly a mediocre cooler was made to look absurd with insane heat/sound matched with efficiency that looked positively last generation.

    Put simply, AMD did not intend to clock the RX 480 at the speeds it launched with until they realized that if they didn’t, no one would see it as anything except a disappointment. And this card could not be a disappointment. There HAD to be some spin where it wasn’t. The company’s future required it.

    That’s not a great place to be.

      • cynan
      • 3 years ago

      One way or another, you’re right: AMD needed to push this card beyond what they were initially hoping for. However, whether this is directly a result of being blindsided by Pascal, creating a need to boost clocks, as you imply, or rather something process-related that results in reduced efficiency (e.g., voltage leakage across transistors), requiring more power to hit the clocks they had planned all along, is not clear to me.

      If it is something that will be remedied with the next stepping or GloFo fab run, then things might not be quite so dire.

        • HisDivineOrder
        • 3 years ago

        That’s the thing. I always take GloFo as a pejorative. I have yet to see GloFo ever match any major Fab at the equivalent size/process with the same level of efficacy/performance.

        I take GloFo screwing up as a given. AMD binding their future to GloFo was probably the worst move they made after they sold their Fabs.

    • chuckula
    • 3 years ago

    If I was a smart marketer I’d come up with a cool term like “overwatting” and then spin it as a positive aspect of the cards.

    Like: Bringing the full overwatting experience to consumers everywhere!
    Or: Nvidia doesn’t have a product in this price range that can overwatt like Polaris!

    • smilingcrow
    • 3 years ago

    The real curiosity for me is the “compatibility” mode. If it is actually a ‘compliance’ mode, then call it what it is.
    I can imagine a PR person saying that it’s so the board is compatible with compliance standards. 🙂

    • maxxcool
    • 3 years ago

    The FACT that they’re fixing this is my issue. It clearly means that, after sampling their own stock and returned review samples, they validated the claim to such a degree that just ‘rejecting the notion’ was not a safe financial answer.

    Which, in its basest form, is to me an admission with lesser ownership of the problem.

    Bad form AMD …

    edit: the=they

      • Pitabred
      • 3 years ago

      there = they’re

        • maxxcool
        • 3 years ago

        derp

    • anotherengineer
    • 3 years ago

    I guess this works for both 4GB and 8GB versions?

    [url]https://www.techpowerup.com/223913/amd-retail-radeon-rx-480-4gb-to-8gb-memory-unlock-mod-works-we-benchmarked[/url]

      • Vaughn
      • 3 years ago

      It only works on the 4GB cards; there is no extra memory to unlock on the 8GB cards!

        • DoomGuy64
        • 3 years ago

        [quote]it only works on the 4GB cards[/quote] and review models. I doubt this will work with retail cards at all.

          • biffzinker
          • 3 years ago

          A couple of reviews over on Newegg for the retail RX 480 4GB say otherwise; I would imagine it works for the Sapphire RX 480 4GB as well.

          [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16814150771&SortField=0&SummaryType=0&PageSize=10&SelectedRating=-1&VideoOnlyMark=False&IsFeedbackTab=true#scrollFullInfo]Newegg - XFX Radeon RX 480 4GB[/url]

            • DoomGuy64
            • 3 years ago

            That is a pleasant surprise, but I wouldn’t count on it happening for every card. Good news for the lucky few, I’m sure.

          • anotherengineer
          • 3 years ago

          [url]https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=RX+480&interface=&memType=&memSize=[/url]

          Let us know 😉

          edit - c'mon, RX 480 users, upload your BIOSes to TPU

          • Waco
          • 3 years ago

          Do you just love replying and making things up? You just make statements without any evidence when evidence that directly contradicts you is widely known and available.

            • DoomGuy64
            • 3 years ago

            Personally, I find it much worse to make general statements like “all 4GB cards will unlock to 8GB, so don’t worry and buy one,” because the supply is most likely limited and a lot of people are going to get burned following that logic.

            But hey, feel free to keep giving us advice, [i][b]TmarTn[/b][/i].

            • Waco
            • 3 years ago

            Nobody made a general statement that it works for all of them, nor did anyone suggest buying a 4 GB card just to unlock it. However, you seem to excel in making explicitly false statements. 🙂

            There’s a chance it’ll work on the reference boards, that much is clear. Obviously it’s not guaranteed, but it’s also obviously not just review cards that it works on.

            Perhaps you’re just a bit salty since you’ve been called out repeatedly for false statements recently? Is that why you’re insinuating I’m some CS:GO player nobody cares about?

    • jensend
    • 3 years ago

    By using half the bus width of the card it replaced, and by going with GDDR5 rather than GDDR5X to meet price targets and shipping dates, AMD likely had to overclock the memory to avoid being heavily bottlenecked. I’d guess this is much of why the power situation is what it is.

    [url=https://www.computerbase.de/2016-06/radeon-rx-480-test/12/#diagramm-performancerating-speicherbandbreite]Computerbase.de[/url] and others have shown that overclocking the memory even further brings sizeable benefits, and that even though overclockability is low with present boards (power delivery issues again?), some samples of the GPU can be undervolted substantially. It may be that Polaris 10 is capable of using GDDR5X (edit: Pascal chips are compatible with both) and that it would just require a new PCB, etc. If so, then with that and careful binning, AMD would have a noticeably better "480X" on its hands without silicon changes.
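
To put numbers on the bandwidth gap jensend describes, here is a back-of-the-envelope sketch. The 256-bit/8 GT/s RX 480 and 512-bit/6 GT/s R9 390 figures are the commonly cited reference specs; the 9 GT/s line is a hypothetical memory overclock, not a shipping configuration.

```python
# Peak GDDR5 bandwidth in GB/s = (bus width in bits / 8) * data rate in GT/s.
def bandwidth_gb_s(bus_bits: int, data_rate_gt_s: float) -> float:
    return bus_bits / 8 * data_rate_gt_s

print(bandwidth_gb_s(256, 8.0))  # RX 480 (8GB) reference: 256.0 GB/s
print(bandwidth_gb_s(512, 6.0))  # R9 390:                 384.0 GB/s
print(bandwidth_gb_s(256, 9.0))  # hypothetical +12.5% OC: 288.0 GB/s
```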

      • chuckula
      • 3 years ago

      [quote]some rumors have said that some Pascal chips are compatible with both[/quote]

      I don't think it's much of a rumor when the GP104 that ships with the GTX-1080 and GDDR5X is the exact same GP104 that ships with the GTX-1070 and GDDR5.

        • jensend
        • 3 years ago

        Doh! I hadn’t remembered that about the 1070. I breezed over the reviews and forgot that crucial spec (I was going to pay more careful attention to TR’s review…)

          • chuckula
          • 3 years ago

          It’s OK, TR hasn’t actually posted a GTX-1070 review anyway.

            • jensend
            • 3 years ago

            Yes I am aware of this

            • jensend
            • 3 years ago

            [url]https://www.youtube.com/watch?v=xECUrlnXCqk[/url]

      • Leader952
      • 3 years ago

      [quote]Computerbase.de and others have shown that overclocking the memory even further brings sizeable benefits[/quote]

      The RX 490X will probably have GDDR5X memory.

      • Mat3
      • 3 years ago

      Not to mention the whole debacle of 8GB of RAM on board the 4GB cards, and the 4GB cards being discontinued because those memory chips were hard to obtain. What a boatload of GDDR5 hassles.

    • Sargent Duck
    • 3 years ago

    I bet there was some DAMAGE control going on over at AMD the past week

      • Krogoth
      • 3 years ago

      Impressed by this post.

      • tipoo
      • 3 years ago

      He’s also looking masterful in that beard

      [url]https://twitter.com/scottwasson/status/748162402265403392[/url]

    • fellix
    • 3 years ago

    Mining coins with three RX 480s? Don’t use unpowered PCIe risers:

    [url]https://bitcointalk.org/index.php?topic=1433925.msg15438647#msg15438647[/url]

      • tipoo
      • 3 years ago

      Isn’t GPU mining terribly uneconomical ever since FPGAs and ASICs hit? And even those are uneconomical now.

        • xeridea
        • 3 years ago

        I don’t keep up on it much these days (though I used to mine heavily). Bitcoin ASICs have been out for a few years, so that hasn’t been worth it for a long time. Litecoin I know has FPGAs; I’m not sure about ASICs, but it is more memory-hungry, so a full ASIC might not be worth it. There are some niche altcoins that can still be mined with GPUs, though I don’t know which ones. There are sites that tell you the profitability of each coin if you want to know.

        • PrincipalSkinner
        • 3 years ago

        There are some new coins which are ASIC-resistant. Ethereum is popular right now amongst GPU miners.

    • derFunkenstein
    • 3 years ago

    And you don’t even have to sign up for another account to download these drivers. Hallelujah.

      • DrCR
      • 3 years ago

      First to release will be a founders edition of the driver, for pay.

      • egon
      • 3 years ago

      Any other reasons not to buy Nvidia this time? The G-sync vendor lock-in issue is the first that comes to my mind – what else?

      In choosing between an RX 480 and a GTX 1060, it’ll be tough to decide what I find more distasteful: the AMD card’s excessive power consumption (beyond the PCIe issue: at idle, playing video, driving multiple monitors), or the sum total of Nvidia’s recent arrogance.

        • tsk
        • 3 years ago

        The only reason I’d choose the RX 480 over the 1060 is that Nvidia doesn’t support the adaptive-sync standard. I would like to have the 120W TDP of the 1060 in my new ITX system.

        • anotherengineer
        • 3 years ago

        The additional up front $$ ??

        I remember when the top-tier cards were ~$475. Well, that’s what I recall paying for the X1900 XT on the e-tail market about a month after launch, when it was king of the hill and I still lived with mom and dad, lol.

        Seems Nvidia was able to price upper-midrange cards at what the flagships used to go for, and pushed the flagship prices up to $1k per card.

        Maybe I’m just bitter that the mid-range ($150-$225, imo) has stagnated since the 7850 days.

        As for AMD’s excessive power consumption, is it them, GloFo’s silicon, or both? Hopefully the AIBs have some better non-reference designs.

          • ptsant
          • 3 years ago

          So true. I remember when $300 meant top-of-the-line, like when I bought the ATI 9800. The occasional card would go up to $400-$450, but $500 was considered obscenely expensive up until the launch of the 780 by nVidia at $649 (580 -> $499, 680 -> $499, 780 -> $649). Then of course came the “Titans” for people with more money than common sense, and finally the Founder Tax.
          The 980 was a nice exception to this rule, happily. We will have to wait a few more weeks to see if the price approaches the MSRP.

          To be fair, AMD also launched the Fury X at $649, but anything above has always been dual-GPU (for better or worse).

      • psuedonymous
      • 3 years ago

      I’ve yet to have to sign into anything to download Nvidia drivers, beta or otherwise, through the website or GFE. That idea seems to have been silently scrapped without notice.

        • nanoflower
        • 3 years ago

        Not at all. The idea is just now coming to fruition. Going forward, it will be necessary to log in to get the GeForce Experience drivers. It may be possible to get a stripped-down version from Nvidia, and it will certainly be possible to pick up drivers from other sites (along with anything else those sites wish to add), but for the full drivers you will need to log in. That’s been confirmed by many sites and by Nvidia.

    • Waco
    • 3 years ago

    Now we just need to wait to make sure overclocking doesn’t destroy the slot power budget…

      • chuckula
      • 3 years ago

      From what we’ve seen, it looks like AMD already did the overclocking for you before they shipped the card.

        • Waco
        • 3 years ago

        You know what I’m saying though. 🙂

          • tipoo
          • 3 years ago

          over-overclocking

      • DoomGuy64
      • 3 years ago

      You’ll run into thermal issues before power, considering the stock cooler.

        • Waco
        • 3 years ago

        This is untrue, given that even a modest overclock was pushing 100 watts sustained through the PCIe slot.

          • DoomGuy64
          • 3 years ago

          I didn’t hear it was that high; that sounds exaggerated. It’s also a moot point if the driver update changes the power draw to use the power supply instead. If that changes, then you will indeed run into thermal issues before power.

          That said, it seems like what issues existed were under specific circumstances and have been exaggerated to the point of hyperbole. Sure, it’s a potential problem, but was it an actual problem for the majority of users? Nope.

            • Waco
            • 3 years ago

            There were numerous tests that confirmed >100 watts through the PCIe slot when overclocking.

            No, this isn’t a moot point. It’s even worse for users who will want to run a pair (or more) of cards. Not many cards have ever approached this level of draw from the slot, even when overclocked. Personally, I’d love a tunable in the driver that would let you set a cap on the slot usage, since the PCIe cables can handle far in excess of the entire card’s 12V current demands.

            • cygnus1
            • 3 years ago

            I read elsewhere that on the reference cards’ PCBs the power sources are joined electrically, and it’s not possible to draw much more or less from one or the other, meaning the only way to decrease slot load is to decrease total board power use. I’m very much anticipating seeing whether that’s confirmed when this driver comes out. ‘Compatibility mode’ could possibly be dropping the card’s total power budget to about 120 or 130W to ensure that the PCIe 12V draw stays under the 66W max (remember, 75W is the total across the 3.3V and 12V PCIe rails).
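
cygnus1's 66W figure matches the PCI-SIG CEM numbers. As a quick worked check (5.5A on the 12V rail and 3A on the 3.3V rail are the spec's per-rail current limits for a 75W slot):

```python
# PCIe x16 slot power budget per the PCI-SIG CEM spec:
# the 12V rail is limited to 5.5A, the 3.3V rail to 3A.
rail_12v = 12.0 * 5.5  # 66.0 W
rail_3v3 = 3.3 * 3.0   #  9.9 W
print(rail_12v, rail_3v3, rail_12v + rail_3v3)  # 66.0 9.9 75.9
```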

            • Waco
            • 3 years ago

            I don’t think that’s true after reading and talking to people about the VRM design.

            • DoomGuy64
            • 3 years ago

            Nothing like completely ignoring the title of this topic, like the fix doesn’t exist (strike one). Pointing out niche scenarios (Crossfire) that aren’t the majority of usage scenarios (strike two). Then we finish off the FUD by doubling down on the hyperbole and ignoring the fix (strike three).

            From what I’ve heard, the default fix is exactly the scenario of pulling the extra current from the power supply cables. Compatibility mode is for older motherboards, and people with masochistic tendencies.

            • Waco
            • 3 years ago

            Try reading without your red glasses. I’m not critical of this at all, I just hope they actually capped consumption in ALL scenarios.

            • tipoo
            • 3 years ago

            [url]https://www.pcper.com/reviews/Graphics-Cards/Power-Consumption-Concerns-Radeon-RX-480/Overclocking-Current-Testing[/url]

            • DoomGuy64
            • 3 years ago

            [quote]Because this is with overclocked settings, AMD is not really responsible for the specification breach in this specific instance[/quote]

            [url]http://www.tomshardware.com/reviews/amd-radeon-rx-480-power-measurements,4622-2.html[/url]

            [quote]"To be clear, your motherboard isn't going to catch fire. But standards exist for a reason."[/quote]

            • tipoo
            • 3 years ago

            Waco:
            ” given that even a modest overclock was pushing 100 watts”
            You:
            “I didn’t hear it was that high, sounds exaggerated.”

            Now when I show you proof of the above, you’re saying “but it’s overclocked”. Well yeah, that’s what you reacted to, if you recall. It was a one liner, hard to miss the point of what you responded to.

            • DoomGuy64
            • 3 years ago

            Your example was running the card out of spec, which is why it sounded high and exaggerated to me. I heard the card was pushing around 80 watts, tops.

            We’ve had numerous sites test the card, and they found that normal settings were not an immediate problem, so you guys are going out of your way to exaggerate the issue by highlighting worst case scenarios.

            Pretty obvious this is one of those things blown out of proportion by hypochondriacs. It’s a small problem. There’s no need to hyperventilate, scream the sky is falling, and use worst case scenarios to drum up FUD. People can recognize the issue just fine for what it is, without you exaggerating it.

            • Waco
            • 3 years ago

            Nobody exaggerated anything. Your lack of reading comprehension is the problem here.

            • DoomGuy64
            • 3 years ago

            No, you just used misleading examples of overclocked settings to spread FUD. Never mind that factory settings are not that bad, and the average user is not going to overclock.

            • Waco
            • 3 years ago

            Which I clearly stated. Thanks for playing.

            • tipoo
            • 3 years ago

            My example was exactly what you replied to when you said 100W seemed exaggerated. Waco mentioned an overclocked card = 100W right off the bat. Maybe if you keep moving the goalposts you’ll get somewhere.

            • DoomGuy64
            • 3 years ago

            By all means, continue to bring up examples of 1500mhz overclocks and crossfire. Not misleading power numbers at all.

            I’m just pointing out that your examples are what they are. Overclocked. Not stock.

            • Waco
            • 3 years ago

            Which was stated to begin with.

            Sigh.

            • DoomGuy64
            • 3 years ago

            Not clear enough.

            • Waco
            • 3 years ago

            I’m sorry you can’t read, but we can’t help you by writing more.

            • DoomGuy64
            • 3 years ago

            No, but it sure satisfies that internet warrior troll itch.

            • Waco
            • 3 years ago

            I hope you note the trail of bad reputation you’ve gotten from this. You’re the only troll that refuses to use logic.

            • DoomGuy64
            • 3 years ago

            What logic? There is a difference between overclocked and reference clock power draw. Pointing out the worst case scenario is disingenuous and fear mongering.

            The actual scenario where the reference power draw was a problem was pretty slim. You pretty much had to make it a problem, to make it a problem. Which is exactly what you did by emphasizing overclock numbers.

            Anyways, now that the driver update fixed the issue, it’s even more hilarious that you still want to argue about it. Your argument is a dead horse at this point. Literally. The thread has moved down the list, and you’re the only one still reading it.

            Anyone who was following this topic would just read the newer article on the fix. The only reason I’m still posting is pure amusement, and you’re providing lots. Trying to wrap my head around how your logic works is interesting. You say the sky is falling; I look up, and it’s still there. Funny guy.

            • Waco
            • 3 years ago

            Keep trying to put words in my mouth if you wish, but it’s not impressing anyone except yourself.

            I clearly stated my concerns. Those concerns, by the way, are not mitigated entirely by the new driver update.

            You may not care if systems are pushed beyond specification in ways that can damage hardware. I do. That’s fine, but claiming [i]nobody[/i] should care is disingenuous.

            • tipoo
            • 3 years ago

            >This is untrue, given that even a modest overclock was pushing 100 watts sustained through the PCIe slot.
            >I didn’t hear it was that high, sounds exaggerated.

            You go on pretending this one’s on us; the funny thing about the internet is that it’s all right here.

      • ptsant
      • 3 years ago

      The fix involves programming the IC controller to redirect more current to the PCIe plug and less to the MB slot. The partition between the two is programmable, not fixed.

      You can theoretically draw 50W from the slot (which is well within the spec) if your PSU can handle 100-110W on the 6-pin.

      This has been posted here. Interesting read:
      [url]http://www.overclock.net/t/1604979/a-temporary-fix-for-the-excess-pci-e-slot-power-draw-for-the-reference-rx-480-cards[/url]
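
To make the repartition ptsant describes concrete, here is a small sketch. The 160W board power figure and the 50/50 launch-time split are assumptions for illustration, not measured values:

```python
# Same total board power, shifted between the slot and the 6-pin cable.
# The 160W total and the 0.50 launch-time slot share are assumptions.
board_power_w = 160.0
for label, slot_share in (("launch behavior", 0.50), ("repartitioned", 0.31)):
    slot_w = board_power_w * slot_share
    print(f"{label}: slot {slot_w:.0f} W, 6-pin {board_power_w - slot_w:.0f} W")
```

The second line lands right at ptsant's 50W-from-the-slot scenario, with roughly 110W shifted onto the 6-pin cable.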

        • Waco
        • 3 years ago

        Right, I understand the fix. I’m hoping they did it correctly and didn’t rush it. 🙂

          • ptsant
          • 3 years ago

          My point is that you can do the redirect yourself if you use MSI Afterburner. This is obviously risky, but if you are overclocking you are probably aware that it is a risky activity.

          Anyway, I don’t think people buying the reference card should be looking for overclocks. If you know you want to overclock, wait 1-2 weeks and buy a Nitro or Strix with an 8-pin power connector. Problem solved.

            • Waco
            • 3 years ago

            I haven’t seen measurements to detail if and how that works. Have you?

    • meerkt
    • 3 years ago

    A nice load of weasel wording there, nay?

    [quote]"we are confident that the levels of reported power draws ... do not pose a risk of damage to motherboards or other PC components based on expected usage"[/quote]

    Translated: we abuse the spec, and in high-load scenarios it might damage your hardware.

    [quote]"Users will find this as the “compatibility” UI toggle in the Global Settings menu of Radeon Settings. This toggle is “off” by default."[/quote]

    It looks better than reversing the toggle and calling it "overclock mode (use at your own risk)". Needed because they want to edge out the GTX 970 by default?

      • shank15217
      • 3 years ago

      People buy high-end motherboards with gold capacitors for what? If my ASUS ROG SUPER EPEEN MEGA-ATX board died due to the slight over-draw of an RX 480 card, I wouldn’t be upset at AMD; I would be upset with ASUS. Most DIY gamers are building rigs with motherboards that can overclock, undervolt, overdraw, etc., so this is called blowing things out of proportion, much like the GTX 970 RAM issue.

        • Waco
        • 3 years ago

        No prior cards have had this kind of power draw through the slot. It doesn’t matter how well built the board is, since the slot itself becomes the limiting factor beyond a certain point.

        • meerkt
        • 3 years ago

        Maybe it will be fine, maybe it won’t. That’s what standards are created for: so that everyone knows what to expect from everyone else, and design accordingly.

        I don’t know if expensive motherboards are more durable and forgiving, but the 480 isn’t targeting the high-end market so it’ll also be used on cheap mobos. Even if it were high-end, you can’t rule out people running high-end cards on low-end mobos.

          • shank15217
          • 3 years ago

          Maybe one 480 isn’t, but how about two 480s? You are ruling out the fact that the people who might actually care would probably be buying CPUs that can overclock, non-generic power supplies, and decent motherboards. I am not saying what AMD did is right, but I don’t think there is really that much of an impact.

        • chuckula
        • 3 years ago

        Yeah, but wasn’t Polaris supposed to be all about “bringing gaming to the masses” who are using 5 year old AM3 motherboards that cost $49.95 when new? Since nobody wants to spend all that money on those nasty overpriced Nvidia products?

        At least that’s what [s]Lenin's[/s] [u]AMD's[/u] [url=http://startlr.com/wp-content/uploads/2016/06/1465814555_394_The-new-advertising-campaign-AMD-implemented-in-insurrectionary-style.jpg]advertising posters[/url] led me to believe.

          • slowriot
          • 3 years ago

          I am so confused by your actual positions…

          Do you or do you not agree the RX 480 is targeted to budget buyers? Because to me it seems the answer is obviously yes. AMD identified their target market with the RX 480. Which is precisely why the “powergate” issues are so concerning, because they could especially impact those buyers on budget/old equipment.

          If shank15217 were right, and therefore AMD’s marketing wrong, I think he would be right that it doesn’t matter as much (not the same as not mattering, but not a disaster IMO). But I think he’s wrong about the type of person buying these cards, and therefore AMD’s marketing is right and their execution/design was what dropped the ball.

            • shank15217
            • 3 years ago

            Why is budget = old? It could be high end if you just get 2 cards for crossfire.

            • slowriot
            • 3 years ago

            Budget buyers tend to not upgrade with every new generation of hardware. They tend to upgrade one major component at a time. It’s likely if someone is considering a RX 480 then the other parts of their system are a generation or two or three older.

            Crossfire (or SLI for that matter) does not scale consistently or well enough to justify buying 2x RX 480 over a single GTX 1070 (or AMD equivalent).

            • chuckula
            • 3 years ago

            [quote]Do you or do you not agree the RX 480 is targeted to budget buyers?[/quote]

            Yeah, and nothing I have ever said has contradicted that. I was the one getting massive downthumbs for questioning why AMD chose to use a $600 Intel CPU on an ultra-high-end X99 motherboard to demo the Rx 480, since that is clearly not the target market for these cards.

            In this particular thread, the original poster was going on and on about how high-end motherboards shouldn't have a problem with the Rx 480. I logically pointed out that the Rx 480 is [b]not[/b] intended for high-end motherboards, but for the low-end motherboards that are much more likely to have issues with an out-of-spec piece of hardware.

            • slowriot
            • 3 years ago

            This is what you said…

            [quote<]Yeah, but wasn't Polaris supposed to be all about "bringing gaming to the masses" who are using 5 year old AM3 motherboards that cost $49.95 when new? Since nobody wants to spend all that money on those nasty overpriced Nvidia products? At least that's what [s<]Lenin's[/s<] AMD's advertising posters led me to believe.[/quote<] My point being... No, that isn't "logically pointing out" that the RX 480 is not intended for high-end motherboards. It's sarcasm and statements like "AMD's advertising posters led me to believe." if anything make it sound like you don't agree with AMD's marketing.

            • chuckula
            • 3 years ago

            Uh… sarcasm means something, but not what you seem to think it means.

            If I were being sarcastic, I would have gone on and on about how nobody would ever use an Rx 480 in a cheap motherboard. See, that would be “sarcasm,” which is saying one thing while really meaning another.

            Additionally, AMD’s rather offensive advertising most certainly [b]did[/b] lead me to believe that Polaris isn't meant for high-end systems. That's why I was questioning the premise that the Rx 480 poses no problem just because an expensive, overprovisioned motherboard may not have a problem with it.

            • slowriot
            • 3 years ago

            OH please.

            [quote]Yeah, but wasn't Polaris supposed to be all about "bringing gaming to the masses" who are using 5 year old AM3 motherboards that cost $49.95 when new?[/quote]

            This alone? Ok, maybe not sarcasm.

            [quote]Since nobody wants to spend all that money on those nasty overpriced Nvidia products?[/quote]

            But that? That is CLEARLY not something you believe in. So what is it other than sarcasm?

            [quote]At least that's what [s]Lenin's[/s] AMD's advertising posters led me to believe.[/quote]

            And this? Is that more not sarcasm?

            • chµck
            • 3 years ago

            My definition of budget is <$100, tbh…
            The first GFX card I bought was an Nvidia 6600 GT for about $115.
            Next was a 4870 for $280 at launch, which was the high-high end back then.
            So, at least to me, a 480 for $240 would be considered high end.

        • Voldenuit
        • 3 years ago

        [quote]People buy high end motherboards with gold capacitors for what?[/quote]

        People with $200 motherboards don't buy $200 GPUs. People who buy $200 GPUs stick them in $50 motherboards, or OEM pre-builts.

        • ronch
        • 3 years ago

        Yeah. Me too. If someone stabs me with a knife I wouldn’t hate the person who stabbed me because hey, it’s my fault for having a soft belly and not wearing armor. /s

      • Concupiscence
      • 3 years ago

      I can understand how it’s seen as weasel wording, but this just reads like routine damage control and an effort to put customer (and investor) minds at ease. I wouldn’t hesitate to snag a (non-reference) RX 480 tomorrow if I had $250 just lying around.

      • Anonymous Coward
      • 3 years ago

      That’s not weasel wording; they are explaining a reality which is more complex than two or three words.

    • smilingcrow
    • 3 years ago

    So seemingly AMD are saying it’s a non-issue, yet have issued a ‘fix’ to reduce the load on the PCIe slot (which was reported as being way out of spec) and have also added a second ‘compatibility’ mode which reduces total power consumption as well.

    Advanced Massaged Delivery – the new paradigm in delivering PR.

      • flip-mode
      • 3 years ago

      We should be fair here – AMD isn’t saying it is a “non-issue”; they are only saying they do not think there is a risk of damage. But it is clearly regarded as an issue by AMD – they have very quickly responded to the situation and released a driver update to deal with it.

        • smilingcrow
        • 3 years ago

        As they say there is no risk of damage I wish they would say why they have felt the need to address this with two ‘fixes’.
        It’s a very unusual situation for sure.

          • Froz
          • 3 years ago

          It’s simple. They felt the need because everyone is talking about it. It really might not be a problem at stock clocks (that’s what Tom’s Hardware is now saying, for example), but it could be an issue when overclocking.

          On the other hand, did you really expect their lawyers to let them admit in the official announcement that the card could damage other components?

            • smilingcrow
            • 3 years ago

            Froz: “that’s what Tom’s Hardware is now saying, for example”

            Toms: “Our modified second round of measurements replicated the results of our first round. The amount of current flowing through the motherboard PCIe slot’s 12V rail exceeds the upper limit of the tolerance range described in the PCI-SIG specifications. Our repeated measurements of the reference card show currents 23 percent above the PCI-SIG norm. The voltage tolerance range doesn’t change this at all because only the current matters here. ”

            The data hasn’t changed. It’s out of spec by a lot, and that is not a good idea even if the odds of damage are probably low. Why take a chance?
            Glad they could fix this in a driver, as at first I doubted it would be that simple.
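
For scale, converting Tom's 23-percent figure into watts, assuming it applies against the 5.5A CEM limit on the slot's 12V rail:

```python
# Slot 12V current reported at 23% above the PCI-SIG limit of 5.5A.
spec_amps = 5.5
measured_amps = spec_amps * 1.23
print(f"{measured_amps:.2f} A -> {measured_amps * 12:.1f} W")  # 6.77 A -> 81.2 W
```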

            • Froz
            • 3 years ago

            I’m not saying that it’s not out of spec; I’m just saying that, according to Tom’s Hardware, it shouldn’t be the problem everyone was making it out to be.

            [quote]4. Current hardware should be able to handle this amount of current without taking any damage, as long as the motherboard’s slots are clean and not corroded. It’s also advisable to make sure that the graphics card sits precisely in its slot. This should always be the case, though, even with significantly lower amounts of power.[/quote]

            [url]http://www.tomshardware.com/reviews/amd-radeon-rx-480-power-measurements,4622-2.html[/url]

            And:

            [quote]Here's why we don't think this is such a big deal. When the specifications were introduced, four pins were reserved for the 12V power supply, two each on the top and the bottom of the board. This can be seen in the picture of an old graphics card above. The neighboring third pin on the top was reserved, but not actually assigned to anything. But now, in the latest standards, it’s generally used as an additional 12V pin, along with the other four, making it possible that a total of five pins are available for the 12V power supply. However, the allowed current was not changed to be compatible with older hardware.[/quote]

          • nanoflower
          • 3 years ago

          If they didn’t put out a ‘fix’, the conversation would have gone on for months even if not one person could show a problem caused by their implementation. That’s the way of the Internet, so the only solution for AMD is to put out a fix that removes the main complaint (too much power being drawn from the PCIe bus) and thus puts a damper on the discussion.

            • smilingcrow
            • 3 years ago

            A fix is good and necessary, no doubt, and seemingly will be timely, which is all good.
            But I still feel that the press release is self-congratulatory and too vague, as it doesn’t clearly state exactly what the two fixes are doing.
            Given the relative seriousness of the issue, I think it would be better to come clean and state clearly and in detail what the changes are.

          • BobbinThreadbare
          • 3 years ago

          I bet stability is a big reason. Trying to draw more power than the slot is designed for might mean no damage, but it could mean less power than needed, and instability.

        • Beelzebubba9
        • 3 years ago

        While the power issues cost AMD a sale to me, I can’t really fault AMD for how they’ve handled the issue post launch. It’s just unfortunate they couldn’t have planned their hardware better in the first place.

    • Froz
    • 3 years ago

    [quote]Performance figures are not average, may vary from run-to-run[/quote]

    That's so funny. They might as well have put random numbers in there. With a 3% difference, it probably means they ran the test many times, selected the worst run with the older driver, and compared it to the best run with the newer driver. In reality there might be zero difference between them.
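
Froz's scenario is easy to simulate. In the sketch below, the 74.2 FPS mean and the 2% run-to-run noise level are assumptions for illustration only; the point is that cherry-picking the best and worst runs from identical configurations can manufacture an apparent gain larger than 3%:

```python
import random

# Two identical configurations with a few percent of run-to-run noise:
# picking the best run of one and the worst run of the other can
# manufacture an apparent "uplift" out of nothing.
random.seed(42)
true_fps = 74.2
runs = [random.gauss(true_fps, 0.02 * true_fps) for _ in range(10)]
fake_gain = (max(runs) - min(runs)) / min(runs)
print(f"apparent uplift from noise alone: {fake_gain:.1%}")
```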

      • tipoo
      • 3 years ago

      Odd conclusion to make from that caveat. Silicon has varied chip to chip since forever.

        • Froz
        • 3 years ago

        Ah, I see. I have misunderstood it. I blame it on my English :).

    • USAFTW
    • 3 years ago

    It’s a relief that the load on the PCIe slot can be lessened. I would leave the compatibility toggle off, given that the 6-pin connector can actually supply much more than 75W, since it’s much more robust.

      • Bonusbartus
      • 3 years ago

      A glance at the PCIe specification tells us that while a 2x3 power connector may only be used for a power draw of up to 75W, the connectors themselves are rated for currents up to 8A, which would be 96W per pin.
      Also, the 2x4 connector specification explicitly notes that the contacts used in the auxiliary power connector (2x4) are of the correct rating to meet a 7.0A requirement, which would be 84W per pin; used on a 6-pin card, that means it should be able to handle 84x3 = 252W of peak power, if I read the spec correctly…

      The PCIe spec for the wires to the auxiliary power connectors, however, is 18AWG, which for power transmission is only rated for currents up to 2.3A; that would end up at about 82W for a 6-pin connector.

      edit: cable spec
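
Bonusbartus's arithmetic, spelled out (taking the 8A, 7A, and 2.3A ratings as quoted above):

```python
# Watts per 12V pin = 12V * rated current; a 6-pin plug carries three 12V pins.
print(12 * 8.0)      # 96.0 W per pin at the 8A contact rating
print(12 * 7.0)      # 84.0 W per pin at the 7A aux-connector rating
print(12 * 7.0 * 3)  # 252.0 W across three 12V pins
print(12 * 2.3 * 3)  # 82.8 W implied by the 2.3A 18AWG wire rating
```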

        • Waco
        • 3 years ago

        That cable spec is the wrong one to look at – chassis wiring is the one you should take note of (where 18 gauge is good for 16 amps).

    • chuckula
    • 3 years ago

    I updated this thread in the forums: [url]https://techreport.com/forums/viewtopic.php?f=3&t=118144[/url]

    In a nutshell: the "default" profile in those drivers still exceeds the letter of the PCIe standard, but as a practical matter, drawing extra power [within reason] from the 6-pin PCIe connector is not going to cause anywhere near the issues that you get from drawing too much power from a motherboard. So as a practical matter, the "default" settings give you the same card TR reviewed, but without some of the potential headaches.

    I'm actually much more interested in the "optional" driver setting that purportedly complies with the PCIe spec 100%. I have a feeling this is the "real" Polaris that AMD originally intended to launch, not the hot-clocked "new" Polaris that they pushed out the door.

      • tipoo
      • 3 years ago

      Seems like it. They also say the 3% boost in games from the new driver should substantially offset any impact from the lower-power setting, so it seems pretty clear to me that they got worried and pushed the card past what they found was its peak efficiency point, or at least past the better tradeoff of efficiency to performance (since power use scales more than linearly with voltage and frequency, while performance scales less than linearly with clock), in favor of boosting it a bit more.

      That whole debacle doesn't seem worth it if 3-5% is all they squeezed out.
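
The more-than-linear claim follows from the usual dynamic-power relation P ≈ C·V²·f. A toy example, with hypothetical scaling figures:

```python
# Dynamic power scales roughly as P = C * V^2 * f, so a clock bump that also
# needs a voltage bump costs superlinearly. The example figures are hypothetical.
def relative_power(f_scale: float, v_scale: float) -> float:
    return f_scale * v_scale ** 2

print(relative_power(1.08, 1.05))  # ~1.19x the power for a 1.08x clock
```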

      • Froz
      • 3 years ago

      I recall from that Twitch channel that the 6-pin implementation on this card is out of spec anyway, using a sense pin as ground? I’m not sure if I understood it correctly. Some people also claimed that it means the connection could safely deliver more power than a normal 6-pin, and that the sense pin is not needed, as the card has other means of detecting whether the cable is plugged in (so it would not try to pull all the power through the PCIe slot, but instead probably just wouldn’t turn on at all).

        • chuckula
        • 3 years ago

        Yeah, I had heard their 6-pin connector on the board was a little unusual in nature.

        Detecting if the cable is plugged in is relatively easy: No power coming in? No cable.

        • tipoo
        • 3 years ago

        I think that part is actually fine

        [url]https://forum.beyond3d.com/posts/1928167[/url]

        "AMD doesn't need the sense function at the 6pin plug because the VRM controller detects if a phase doesn't have a supply and can signal that. That works basically as sense as well and one can use all three pins for the ground connection (as the plug is wired anyway)."

        • nanoflower
        • 3 years ago

        As I understand it, the 6-pin connector can deliver the same power as an 8-pin connector without any problems, but the concern is with PSUs that don’t use cables capable of handling that much power. That supposedly isn’t an issue with any PSU on sale today, which is why AMD went with a solution that allows them to draw more power than people might expect through a 6-pin connector.

      • Chrispy_
      • 3 years ago

      I’d be interested in the “optional” driver too. I also suspect that this is what the original, ideal balance of performance and power consumption would have been, with the 5.5 TFLOPS spec.

      Had the RX 480 launched in a vacuum, this is what we’d have had. Just like the 290 series, where Hawaii was a much more efficient performer at its 925MHz “sweet spot” than at the 1000+MHz needed to compete with the Titan and GTX 780, I suspect the RX 480 will fall somewhere short of a GTX 970 but be a quiet card with better performance/watt.
