AMD issues statement on R9 290X speed variability, press samples

In the wake of our investigation into whether retail Radeon R9 290X cards are slower than press samples, AMD has issued a statement on these matters. Here’s the text in full:

Based on feedback from the enthusiast community, we’ve implemented an all new PowerTune mechanism in the AMD Radeon R9 290 series that exploits the full capability of the individual GPUs rather than clamping performance to a least-common-denominator type of capability level. This has the advantage of improving overall performance but does result in some degree of performance variability. These changes will also result in some degree of run-to-run test variability based on environmental and operational conditions in un-controlled test environments.

The range of performance differential is not expected to meaningfully change the user experience but we’ve taken note of recent reports that the degree of variability is higher than expected. Reasonably we would expect the variability to occur both above and below the performance of the press samples, however it appears that most reported performances are biased towards the low side. We are actively investigating these reports and we will update when we have completed our investigation.

It is important to note that it is to be expected that the more thermally limited the setting, the more variation can naturally occur. AMD Radeon R9 290X, by default, ships with two BIOS settings for different acoustic levels, and the “Uber” setting was designed to limit the level of thermal throttling; comparisons in this test mode are more consistent. Furthermore, AMD has implemented comprehensive PowerTune controls within the OverDrive panel of Catalyst Control Center, and users can tweak to their own desired balance between acoustics and performance.

Interesting. We’ll continue to track this story as it develops. I’m looking forward to seeing what AMD has to say in its next update.
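For readers who want a concrete picture of the mechanism AMD is describing, here's a toy model of how a PowerTune-style controller trades fan speed against clock speed under a thermal limit. Every constant and the one-line thermal model below are invented for illustration; AMD has not published its actual algorithm.

```python
# Toy model of a PowerTune-style clock/fan controller.
# All constants and the thermal model are invented for illustration;
# AMD has not published the real algorithm.

TEMP_LIMIT_C = 95      # throttle point reported in 290X reviews
CLOCK_MAX_MHZ = 1000   # the advertised "up to" clock
STEP_MHZ = 10

def control_tick(temp_c, clock_mhz, fan_pct, fan_cap_pct):
    """One control tick: spend fan speed (acoustics) first, then shed clocks."""
    if temp_c > TEMP_LIMIT_C:
        if fan_pct < fan_cap_pct:
            fan_pct = min(fan_pct + 5, fan_cap_pct)
        else:
            clock_mhz = max(clock_mhz - STEP_MHZ, 300)
    else:
        clock_mhz = min(clock_mhz + STEP_MHZ, CLOCK_MAX_MHZ)
    return clock_mhz, fan_pct

def simulate(fan_cap_pct, ambient_c=30, ticks=500):
    clock, fan = CLOCK_MAX_MHZ, 20
    for _ in range(ticks):
        # Crude steady-state thermal model: heat rises with clock,
        # cooling rises with fan speed.
        temp = ambient_c + clock / 8.0 - fan
        clock, fan = control_tick(temp, clock, fan, fan_cap_pct)
    return clock

print("quiet-style fan cap (40%):", simulate(40), "MHz")  # settles well below 1000
print("uber-style fan cap (65%): ", simulate(65), "MHz")  # holds the full clock
```

The takeaway: once the fan hits its cap (the quiet BIOS), clock speed is the only remaining control knob, so chip leakage, ambient temperature, and case airflow all surface directly as performance variability. Raise the cap (the “Uber” BIOS) and the variability collapses, which matches AMD's claim above that comparisons in that mode are more consistent.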

Comments closed
    • jonjonjon
    • 9 years ago

come on, you aren’t that naive, are you? What would you do if it was your company? I know I would make sure I was sending review sites good chips. Maybe not as greedy as AMD with the top of the top, but you know all the companies benchmark every chip before they send it out. Why wouldn’t you?

    • JumpingJack
    • 9 years ago

    Can you provide proof that everyone does it? I would like to see that.

    • jdaven
    • 9 years ago

This. +1. No one wants to comment on your post because it takes away all the hate. I, for one, vote for no more pre-release hardware reviews.

    • NeelyCam
    • 9 years ago

    I’m not a medical professional, so I don’t use those terms on people. But for what it’s worth, no – I don’t think Klimax has an IQ under 70 (even if standard deviation of 30 is used). I think it’s somewhat fair to say that the vast majority of people on this site have an above average intelligence.

    • tcubed
    • 9 years ago

I think that was the technical intention of PowerTune in the first place… Marketing just banked on it and tried to sell the card as both a cool & quiet card and as a performance powerhouse, ultimately rushing everything out before a driver that could do both was ready… and a mess ensued… It could really be both with a decent cooler and decent CCC controls/driver.

    • tcubed
    • 9 years ago

I’m no medical professional; it’s my assessment as a layman on an online forum. Am I not entitled to an opinion? It is my opinion that killmax was behaving like a moron in that particular case, exactly following the definition of the term. Obviously it was not a diagnosis –

    BUT…

If you took it as a diagnosis and disagree with it… are you a medical professional? Or is it just your opinion that, by the definition of the term and to your knowledge, he did not behave like a moron?

    • NeelyCam
    • 9 years ago

“To be a moron is not a name it’s a medical condition…yeah look it up! You only chose to see it as an insult, it’s just an assessment.”

Are you a medical professional with the expertise to diagnose that medical condition? If you are, then yes - I guess you were making a medical assessment. If not, you’re using the word as an insult. So, which one is it?

    • tcubed
    • 9 years ago

Last I recall there was another fix in the last few days; I read something about Crossfire & Eyefinity – might be wrong… too tired to look it up though.

    • Airmantharp
    • 9 years ago

    The first fix came through over the summer; we’ve heard nothing since.

    • tcubed
    • 9 years ago

OK, you may be right… even so, they committed to fixing it up to 4K, and they probably will, if the current progress is anything to go by…

    • Airmantharp
    • 9 years ago

    Scroll up. They’ve fixed Crossfire to 1440p; that’s it. 1600p or Eyefinity is still broken for everything under the R9 290 series.

    • tcubed
    • 9 years ago

Well, they sort of fixed it with the new drivers (I think a few days old now) – but you’re right, old cards will still suffer some because of it. Still, they really did make good on the promise to deliver better drivers during 2013. I don’t think this problem is easy to solve, especially with so many generations to look after. I think the only case not fixed is CFX + Eyefinity + 4K – which is really a very rare case.

    • Airmantharp
    • 9 years ago

    That’s only one product of the very many that they are selling. They said they’d fix all of them, yet they’ve left over half a decade’s worth of their customers out in the cold.

    • tcubed
    • 9 years ago

    Yes… it is a Marketing & PR disaster no doubt about it!

    • tcubed
    • 9 years ago

I did not say early silicon in that sense… early in the sense of production-ready silicon, just one of the early batches… maybe I wasn’t clear about that.

But yes, you are absolutely right… if it is misrepresented they have to be held accountable.

On the industry terms: English is my third language, so I’m sorry if I misspelled or used “inadequate” terms.

And yes, benchmarks are a marketing vessel, no doubt about it! But sending in a golden chip is very dangerous precisely because of something like this scandal. If this problem had been on the other end of the spectrum and retail GPUs couldn’t hit & keep 1GHz in uber mode, then I would have been really pissed, because my concern is performance. However, you’re right, noise & heat are also parts of the picture, but for me they are secondary as I will never buy a reference card anyway. That’s why I say it’s just a red herring.

    • Airmantharp
    • 9 years ago

    And they made their point clear- they under-spec’d the cooler so far that their potential customers would have to buy their competitor’s product if they wanted something that was cool and fast without voiding their warranties!

    • JumpingJack
    • 9 years ago

“Press samples are early silicon; a few months in, the fabbing will get better and yields will be better and you will start seeing more of the better sort… which has always happened… So what is it you’re so upset about??”

Oh, I see now… so press samples are not actual released silicon; they are actually sent before the product is finished. You mean when I look at a CPU and its reported stepping, or a GPU and its reported stepping in a review, that it is actually not that stepping, the same stepping I see when I purchase the product? Or, as you point out, more of the better sort (you are not using industry terms correctly, to be honest, but with your extreme intellect I follow what you mean)… so in this case AMD magically had a great sort, but the sort turned to ‘poorer silicon’ in the retail samples?

Here is the story… you, me, the other people in this discussion, and many, many others who simply visit hardware sites use the hardware reviews to make purchasing decisions. What these sites do is procure a product, run tests, and produce charts and graphs on various aspects of those products. This includes performance as well as power consumption and, in the case of GPUs, noise factors as well. These are compared against other products of the category along with price, and we all make our decision. AMD, Intel, Nvidia, Crucial, Rosewill, Seasonic, Corsair… on and on and on. All submit products for scrutiny; they use the hardware press as a marketing tool, a means of advertising their product to the masses.

As a consumer, I trust that the products these companies submit to the reviewers are representative of the products they intend to sell me. What upsets me is if a company submits a sample to the press that represents what I will buy in the store, and it turns out that what I saw evaluated is not the same as what I purchased. Now, if what I purchase turns out better… gravy, man, and bad for that company because they have under-advertised their product. However, if what I purchase (assuming it is fully functional and free of defects) is less than advertised, then I have been ripped off. And the ‘everybody else does it’ argument does not hold water.

Now, it is not clear if AMD did this intentionally… gave reviewers special silicon with special BIOSes in order to shine a good light on the product. As I said before, I am not convinced either way. However, if AMD did in fact misrepresent the product then it is fraudulent. Not only is it fraudulent, it is illegal, and AMD should be held accountable.

    • tcubed
    • 9 years ago

hey, you said they didn’t solve it… now you drag old silicon into the discussion… smart move… wrong answer… look up 290X CFX, then we’ll talk

    • Airmantharp
    • 9 years ago

    I’ve never read a benchmark/test/article. I’ve never owned an AMD GPU, or AMD CPU. Hell, I’ve never played a game!

    Do you really think that you’re smarter than every single poster here? That you’re the authority on everything GPU?

    Please tell me how smooth the frame-times are on a pair of R9 280x or older/lower cards at 2560×1600, or any Eyefinity configuration. I’ll wait.

    • tcubed
    • 9 years ago

They attempted to show that the card can also run quietly like NVIDIA’s do… it was a stupid move, and a mess of a PR & Marketing campaign…

Quiet mode is, to my assessment, just a marketing stunt… as you can see in reviews, the cards behave within 1% variance and are super stable in uber mode, so that’s not some “unattainable” boost; it’s actually the way this thing is meant to work.

    • JumpingJack
    • 9 years ago

Ok, so AMD gives us a ‘quieter’ mode because, in their desire to have the performance crown, they jacked the power so high that their reference cooler was too loud…. I get it, I get it now…. but nobody, according to you, really runs a flagship GPU in quiet mode. Interesting. I see now, thank you!!!

    You are so smart.

    • tcubed
    • 9 years ago

None of it is fraudulent… geez… they pick one out of a bunch and test it; if they’re happy with it they ship it, otherwise they choose another. Simple. Press samples are somewhere in the middle-to-top of the bunch; you will see retail silicon doing better and doing poorer on all sides. But when you send something out for review you want to avoid looking like an idiot by sending a product with a manufacturing defect or sub-par performance. This doesn’t mean you won’t get the advertised speed; you will just have a bit more heat or might need to use a bit more voltage…

But the same can be reversed…

Press samples are early silicon; a few months in, the fabbing will get better and yields will be better and you will start seeing more of the better sort… which has always happened… So what is it you’re so upset about??

    • tcubed
    • 9 years ago

Obviously because of the damn leafblower of a fan they have… They knew it was shit, so instead of putting the effort into developing a new one they just tweaked the settings.

    • JumpingJack
    • 9 years ago

Ohhhh, great, so all players cherry-pick their silicon? So every review of every product I read is fraudulent, because everybody does it?

    (Edited for a typo)

    • tcubed
    • 9 years ago

    have you ever been hired?? Cause it sure doesn’t look like it.

    • tcubed
    • 9 years ago

and now the conclusion:

Yes, that’s exactly it… because better silicon has less leakage, meaning lower power draw, higher frequency capability, and less heat dissipation. So your assessment is correct, and it still is not a problem, because:

Even if the silicon is poorer on retail cards (which it most probably is, because ALL do it; Intel and NVIDIA do it too – choosing good silicon to send to reviewers – they don’t want a crashing chip or a sub-par evaluation; it’s just the nature of things to choose & most probably pre-test the silicon you send out) – it just means you will get less overclock headroom and more instability at low voltage, which is exactly what you expect from poorer silicon. But this doesn’t mean your silicon isn’t going to behave within advertised parameters. It’s just not going to do well on underclocks, because of higher resistance (leakage), or on overclocks, because of heat buildup (also leakage).

Now, you got that? OK, now look it up on the internet; if I’m not right I’ll stand corrected!

    • JumpingJack
    • 9 years ago

No, I have never been fired, and if I had found myself in the presence of such excellence and extreme intelligence as yours, I would have worked doubly hard not to be fired, as I am in such awe of your superior intellect.

    You are so smart.

    • tcubed
    • 9 years ago

    those were the exact words of the last guy I fired… hrm… do I know you?

    • JumpingJack
    • 9 years ago

    I have… I bow to your extreme intellect, you are so smart.

    edit…

“That particular case may have, yeah, why not, happens all the time… still we’re talking about “power saving mode” problem not performance mode problem… I don’t see why you would buy a flagship gpu and then run it in power saving… and on top of that have a problem because of it…”

Oh, yeah, err, ummm, o’ wise one… could you please explain: if people don’t buy a card to run in quiet mode, why did our beloved AMD develop and advertise it as a feature? I anxiously await your response…

    • tcubed
    • 9 years ago

err… what are you babbling about there?? Just re-read the reviews and my comments… then re-read yours, come back, and explain it so you don’t look the way you just did…

    • tcubed
    • 9 years ago

Geez… you got me there… I’m no authority on anything… like yourself. But you may, at your discretion, choose to look CFX benchmarks up… or not… and post something stupid like this…

    • tcubed
    • 9 years ago

Surprisingly enough for me… being an AMD fan and all… the last GPU I bought was an NVIDIA one… The last 6 systems I built were 2 Intel + NVIDIA and 4 AMD APUs… so yeah, I’m really an AMD fan… Oh wait… I also made 4 laptop recommendations… all Intel: 2 with AMD cards, one with NVIDIA, one with integrated Intel…

– To your comment – my comment, insult- and name-calling-laden as it was, brought lots of info to the discussion… what did you chip in with yours?

– I hate the damn leafblower fan on the AMD reference card… That’s why I wait for a non-ref one…

To be a moron is not a name it’s a medical condition…yeah look it up! You only chose to see it as an insult, it’s just an assessment.

    • JumpingJack
    • 9 years ago

Ok, I see… so the press ‘power saving mode’ is better than the retail ‘power saving mode’ because the retail cards have poorer silicon and lack the ‘special BIOS’, is that it?

    Wow, you are so smart.

    • tcubed
    • 9 years ago

    That particular case may have, yeah, why not, happens all the time… still we’re talking about “power saving mode” problem not performance mode problem… I don’t see why you would buy a flagship gpu and then run it in power saving… and on top of that have a problem because of it…

So let’s make this clear: you have a stability problem with a downclocked GPU in the flagship segment?? Is this a… mobile chip? A… server chip? An… embedded chip? Well… NO… guess what, you will put it in the highest gear possible and floor it – nobody gives a crap about low-power instability in a $500+ flagship GPU…

    unless…

You’re not in the market anyway… oops… like, say, an NVIDIA fan on an AMD article… or… one who will never have the money to buy one anyway…

    • JumpingJack
    • 9 years ago

    Ahhhhh there we go, so the retail cards just have poor silicon is that it?

    • tcubed
    • 9 years ago

It doesn’t need to make you feel better… it’s just the way it really is…

Now, the instability might be due to poor silicon, which is a very valid explanation, but it might also be due to other components being bad… That particular case you’re talking about isn’t particularly known for its great quality, you know…

    • tcubed
    • 9 years ago

    why so serious??

    • JumpingJack
    • 9 years ago

Ohhh, yes, good point. I glossed over the concept of flashing the retail cards with the press BIOS; indeed, that ‘special’ BIOS seems to make the retail cards go faster, no doubt.

It appears that the ‘special’ press BIOS also drops the voltage slightly, in one case causing the retail card to be unstable…. that surely explains the data, as lower voltage gives more thermal headroom, allowing the ‘special’ press-sampled cards to run faster…. wow, this makes me feel much better.

    • JumpingJack
    • 9 years ago

    Wait, but that is what is so entertaining to read 🙂 🙂

    • NeelyCam
    • 9 years ago

    Seriously, enough with the name-calling. If you can’t respond without personal insults, maybe you shouldn’t respond at all

    • NeelyCam
    • 9 years ago

“You seriously believe any real enthusiast gives a crap that its flagship card doesn’t handle low power correctly?? You dump $500 to what? Play it cool? This is just too hilarious…”

As an AMD fan you have obviously adopted this attitude because AMD can’t keep its cards quiet.

“Look it up you moron”

Calling names when you can’t argue logically.

“geez… nvidia fans are some of the worst… topped only by Apple fans really…”

Your insult-laden whining here makes AMD fans sound worse than NVidia/Apple fans.

    • kithylin
    • 9 years ago

    VARIABLE CLOCKS ARE BAD!!!!

I still do not understand why review sites, users, and everyone else are just “accepting” that these new “PowerTune” and “Boost Clock” features on these new video cards are a “good thing” at all.

For YEARS, since the Voodoo era, all video cards have had 100% fixed clocks in 3D. No clock changing up or down. All video cards had one clock speed, and they ran at it. Whatever power usage or heat it generated was just “part of having that video card” and we all dealt with it.

It worked for a -VERY- long time.

Instead, today we have both ATI/AMD & NVIDIA advertising (and selling) their video card products claiming a certain clock speed. But in reality, in most situations, none of the video cards sold will -EVER- hit their advertised clock speeds during gaming on a regular basis. They get hot, throttle back, and sometimes run as much as 20% slower.

The bottom line is… this is called “false advertising.” But people continue to buy these products and put up with it, even though they’re lied to on a regular basis these days.

Both ATI/AMD and NVIDIA should at least offer customers the ability to -COMPLETELY DISABLE- boost/powertune if they want to. Either via software or a hardware switch to flick on the card itself, something. Someone has to stand up to these companies and tell them that what they’re doing is WRONG AND WE DON’T WANT IT!!

    • Airmantharp
    • 9 years ago

    Yes, preach that AMD has fixed all of their drivers beyond a single monitor and single GPU. That ‘scandal’ still hasn’t been resolved. You must be the final authority on everything!

    • shalmon
    • 9 years ago

Talking about voltage… I wonder… couldn’t the “incredibly large” 1% variation be chalked up to variability in pertinent system components? Say… power supply… motherboard… BIOS version and settings… and the load of God knows how many additional components on said power supply? I mean, these things affect CPU targets, why not GPU?

    • tcubed
    • 9 years ago

I don’t endorse this product… it has a shit blower… why the hell should I buy a shit blower if I can buy a perfectly good non-ref/custom card when they come out (at the same price and with a warranty)… and I can wait till then?

Yes, I agree with you, I shouldn’t have had to… AMD should have done its homework and built a decent cooling solution. Actually… I don’t recall any AMD ref card that was actually quiet… not to the degree this one “blows”, but they were all pretty noisy.

So… if I can choose to wait a month or two for AIBs to get their pretty coolers out, why not do it? I never bought a reference-design card, not with ATI/AMD and not with NVIDIA, though admittedly NVIDIA’s ref cooling solution is much better.

    • bfar
    • 9 years ago

    You shouldn’t have had to wait, tcubed. The fact that you are waiting is not a great endorsement of the current product.

    • KenLuskin
    • 9 years ago

AMD chose to use a cheapo cooler for the reference model… WHY?

    I blame the marketing and Sales dept!

    AMD marketing has been HORRIBLE!

    The actual GPU is fantastic!

    CONCLUSIONS:

    1) The GPU team is excellent

    2) The “business” folks who chose the cooler are horrible.

    3) The marketing folks are just as bad or worse than those who chose the cooler.

    4) Heads should roll in the “business/sales/marketing” depts at AMD.

    5) AMD needs to consult with outside testers before making decisions on coolers/settings.

    ******Most important for AMD and gamers, is that AMD is now putting out top of the line GPUs that are dramatically lowering the price that Nvidia can charge for their slightly better competing products.*******

    This is the REAL story!

The BS about golden press copies is just a bunch of nerd GEEKS run amok, making a mountain out of a molehill.

    • tcubed
    • 9 years ago

    You might be right it’s been quite a while…

    • ronch
    • 9 years ago

    Don’t look now, gerbils, but there seems to be a new troll-bot on the loose around here, or maybe it’s an old, banned troll-bot who just registered a new username.

    • tcubed
    • 9 years ago

Again a misinformed post by killmax… dude, go read the reviews that actually try to understand what’s wrong so you get a clue… get a second opinion from somebody with twice as high an IQ as yours… I guess a dog will do…

    • tcubed
    • 9 years ago

Well, this is a change in attitude…! Wonder if you took the time to get informed… well, it certainly looks that way! Kudos!

About the frame variance: there were reviews years before NVIDIA released that software and made it into a scandal. NVIDIA turned a blind eye to it until they fixed it with drivers on their side… then they just released the tools they had made for, and used in, testing the issue, so that now everybody could point the finger at AMD.

Kudos to NVIDIA for picking up the problem and fixing it. In the GPU business it has always been like this, one company pointing the finger at the other. In the end we the consumers stand to gain from it!

NVIDIA fixed the heat issue in Fermi
NVIDIA and AMD both fixed the video quality variances
NVIDIA and AMD both fixed the frame variance, both single-GPU and SLI/CFX
With the last update AMD fixed the PowerTune fiasco – so this entire discussion is useless…

Who won? Us, the consumers! Whatever your GPU choice might be!

    • tcubed
    • 9 years ago

Look it up you moron… plus, the point was to illustrate the stupidity of this scandal. Just like the exploding-Ti event. This is a non-problem blown up by a few misinformed and unprofessional hacks. There are some serious reviews out there explaining in great detail what happened, and also a new driver that fixes the issue.

So what the hell are you talking about??

You seriously believe any real enthusiast gives a crap that its flagship card doesn’t handle low power correctly?? You dump $500 to what? Play it cool? This is just too hilarious… just like you little NVIDIA fans squirming like toads at any minor issue in the AMD camp.

    • tcubed
    • 9 years ago

The point was headroom and overclocking, you toad! Geez.

    • tcubed
    • 9 years ago

Dear killmax, you have zero understanding of laws and legislation, or of the term “merit”. But you throw around terms you know nothing about.

You have zero understanding of technology, too. If you had the faintest idea what you were talking about, you would realize quiet mode is actually a new software function that you have absolutely no idea how it works. Uber mode is in fact the old way cards were kept from exploding; talk about implied…

First, AMD did not advertise a certain FPS or a certain frequency, but “up to”.

I don’t say AMD is a saint; you will find posts of mine, even on this thread, trashing what AMD does wrong. But this is blown waaay out of proportion by idiots like you, who have a hard time thoroughly reading or understanding reviews or posts.

Plus, 1% variance exists in ANY hardware unless it’s mission-critical… we’re talking consumer-grade hardware… btw, 1% variance is expected with NVIDIA too, so just stop being such an idiot.

And irony is warranted, as you dumb NVIDIA fans fail to understand that this problem affects the low-power setting of a flagship GPU… as if you are going to use it to browse your favorite NVIDIA spots…

    • Klimax
    • 9 years ago

No. That is not awesome. That is horrible on quite a few levels, but then one can argue that if one wants to be cheap, one gets what one pays for. Unreliable performance, where you play a lottery on whether you get the performance you saw out here or are stuck with a worse card.

But in no way can anybody say it is good engineering. Well, for AMD it is good: when the chip can’t reach the target frequency, it is the customer’s problem, not theirs. But that shouldn’t be lauded!

This is cheating the customer, plain and simple. There is no way you can whitewash it.

    • Klimax
    • 9 years ago

Great idea: destroy the warranty, pay most of the difference in price between the 290X and 780 Ti for a custom cooler, and then spend even more time fixing AMD’s problems.

    • Klimax
    • 9 years ago

You act as if AMD were a saint. AMD fanboys can often make NVidia’s fans look sane.
Evidence for the following: exploding cards (a large-scale issue, not singular incidents).
As for fixed benchmarks, AMD was there and might actually have been first to that (IIRC Quake 3).
Video quality? How about AMD cheating on it too, and quite recently.

Dear AMD fanboy, next time make sure you don’t look like an idiot and a fool by writing nonsense while ignoring the problems on your side.

    • Klimax
    • 9 years ago

Fit for purpose? Misleading customers? Pretty sure there are quite a few things they could be hit by.
1% variance? Try again. Oh, and BTW, take a look at Far Cry 3 and how the 290X handles that. (BTW: Uber mode is not the default, has absolutely horrible characteristics, and still doesn’t solve the problem; it’s just a bloody hack.)

As for NVidia, warranty covers this (manufacturing defect). If it’s not honored, there is a lawsuit. We have been there already a couple of times, including with Microsoft, NVidia, and likely others too. (Not sure about ATI or AMD.)
As for Fermi’s temperature, I missed it back then, but when I was looking through return rates published by a French hardware site I expected a higher percentage on Fermi. No such thing there; they were quite reliable. We’ll see if AMD managed the same thing, but I wouldn’t bet either way. (They already had a couple of sudden changes, so who knows what will go wrong.)

Why would a 780 Ti explode? There is so far no such thing. (Evidence to the contrary pending, obviously.)

Frankly, don’t speak of fans; irony meters are not yet strong enough to handle such bloody irony.

    • Bensam123
    • 9 years ago

Arguably, the ‘Uber’ mode also holds the fan at a certain level by sacrificing performance.

I’m sure changing the BIOS would result in less stability, as those BIOSes probably use less voltage, which means less heat, which means a cooler and quieter system. But at the same time, manufacturers are flashing a BIOS onto the cards with higher voltage because they require it… I’m guessing due to using inferior components, which was a good point someone else brought up on the initial review.

The fault for all of this may not lie with AMD at all, but rather with manufacturers using subpar components. The more I hear about this stuff, and the more information that rolls in, the more it seems like manufacturers are pinching pennies (less than that) to save a buck or two.

Which in and of itself may be good in its own way: this may signify a way of picking out good manufacturers from bad ones. If things are now sensitive enough that we can see variance in the quality of components, then manufacturers may once again differentiate themselves from one another on something besides warranty. Normally they just increase fan speed to deal with more heat (inferior components? just pump more voltage through them), but AMD is enforcing a static fan speed across all cards, which makes that impossible without tweaking or custom drivers…

    • Voldenuit
    • 9 years ago

“voodoo2's… man those were some seriously mean cards… expensive as fku too I remember they were sold for 1k a pop if I remember correctly and in sli… you got to be really rich :D”

From memory, Voodoo 2 cards were MSRP’ing from $300-400 at launch, depending on make (some were factory overclocked). You might be thinking of Quantum 3D’s Obsidian cards, which were high-end Voodoo 2 SLI-on-a-stick cards with models ranging from $1000-$2500.

    • Bensam123
    • 9 years ago

    Voiding warranty is a ridiculous non-point. That’s like pulling the tag off your mattress.

That aside, who says you can’t get a card that performs as well as TR’s first sample? There are a lot of confounding variables here that could be responsible for the discrepancy, especially considering their second review unit performed almost on par with the HIS model. It could all come down to manufacturer-picked components (cheaper board components, cheaper TIM, inferior power delivery overall, cheaper PCB). More data points are definitely needed here before claiming foul and declaring that there is no way for an end user to receive a similar experience.

Heck, a symptom of this is the high variability in fan speed across coolers. Notice how the HIS fan performed poorly when working on PWM (original driver), but when the driver is forced to hold a certain RPM it functions status quo? That’s a sign of a low-end fan motor which requires more voltage to operate at the same RPMs as the original design. One could also draw a correlation between this and the low-end performance of the card in general. TR could ascertain whether or not the TIM is to blame by simply reapplying a new one across the two retail cards. The most likely component to be changed across manufacturers is the TIM, and we may see one company or another changing their components. Normally this would go unnoticed, but now people are paying close attention to PowerTune, and it’s very sensitive to changes…

Normally, if something has inferior components and needs more cooling, the fan simply cranks up, but in this case AMD is enforcing a standardized fan speed across all the cards, which are made by different manufacturers using different components.

AMD could simply bump up the fan speed a bit to maintain 1GHz, or even a minimum clock speed. It could be almost completely guaranteed through modification of PowerTune. So at the end of the day the end user may not even need to do anything, but then people will point out the noise being worse even if performance is static across the board. Traditionally, cards ramp up fan speed to deal with higher GPU temperatures; instead AMD this time around has chosen to decrease clock speed, which was probably a really bad decision from a political standpoint.

You can see this yourself with past GPUs ramping up the fan more under Furmark than under Battlefield 4 or whatever; it’s completely common and no one makes a big deal about it. AMD’s approach was just different this time around, and the cooler is in no way maxed out; it’ll just produce more noise the faster they spin it.

    • HisDivineOrder
    • 9 years ago

    Because it’s AMD. If it were nVidia, you’d see the same people screaming about how horrible it is to expect people to void their warranties to get a proper cooling solution.

    Never trust the opinion of someone who picks a video card based on the corporation as if they were a sports team or their favorite super hero. All you’ll get are people who ignore the problems of one company while screaming about the other company at every chance, regardless of relevance to the topic at hand.

    The new Powertune has pervasive problems that are obvious to anyone who thinks about how the scheme works and apparently are easily discerned when cards are compared to one another. Is it strange that people are actually arguing that nVidia shouldn’t have bought AMD product from a neutral third party to help review sites have more cards to compare?

    Thus, ignoring the, “It’s nVidia, they must be evil,” I arrive at the core argument of, “How dare nVidia buy more 290X’s from a neutral third party for a comparison article!”

    Which I find to be quite strange as logic goes. AMD should have done it themselves. Not the first (or apparently the last) time nVidia will step in to illustrate some ongoing hardware problem AMD should have figured out on their own, though, eh?

    Who was it who figured out how to detect frame latency issues on AMD cards again? PCper, TR, and… oh, right. nVidia.

    nVidia isn’t a benevolent, do-gooder corporation in it to save gaming, PC gaming, enthusiast PC’s, or whatever. They’re in it to milk gamers for every dollar they can. AMD’s just the same. Once you accept that, you can talk about when nVidia has problems with solder melting, leaf blowers, or GPU’s burning up after a driver update; or AMD has problems with Crossfire, frame latency, 4K gaming, or now their new Powertune.

    It’s a lot easier and a lot less stress once you start talking about the problems they actually have in context.

    • NeelyCam
    • 9 years ago

“Dear AMD – I know you had this offer where I could get the card if I paid a random amount between $400 and $600, but I just *hate* uncertainty!!! Why can’t you just charge $600?!? I’d be so friggin’ pissed off if my neighbor got it for $450 and I paid $500. If it was $600, everyone would be GUARANTEED to pay the same amount - it’s much better”

    • Voldenuit
    • 9 years ago

“AMD engineers wanted to offer something more, but people freaked out about lack of “guaranteed performance”, instead of being happy with the extra boost they got beyond what the “guaranteed performance” would’ve been…”

“Dear AMD, I’d love to pay you *up to* $599 for a 290X, but here, have a dollar instead. It’s the same thing, right?”

    • madgun
    • 9 years ago

    Hey AMD shill how much did you get paid for the advertisement?

    • KarateBob
    • 9 years ago

Review samples should represent the retail products people can buy. If 90+% of review cards outperform a Titan, but less than 25% of retail cards do, then SOMEBODY SHOULD BE FIRED AT AMD.

    • tcubed
    • 9 years ago

you make me laugh… stability like… say, chips falling off the board… or exploding cards… or whining blower noise… wooden cards, or PhysX, or stereoscopic 3D nobody uses… or fixed benchmarks… or video-quality corner-cutting…

yeah…

totally worth it… to pay 20% more for 5% more performance, 3 old games from only one publisher, and a $100 voucher against a failed console attempt… Be sure to look for an explosion-hazard sign on it; if you find one, don’t go on a plane with it!

Well, I guess being a fan or incredibly stupid doesn’t hurt, and it’s not illegal either – so just go ahead!

    • deb0
    • 9 years ago

    AMD can go kick rocks. This is yet another reason why I’ve been an nvidia consumer for years: Stability, performance, simplicity and innovation. Yes, it comes at a premium, but imho, these attributes are worth paying for.

    • tcubed
    • 9 years ago

really… now honestly… you dump $500 on a card to run it quietly?? What will you run with it, Microsoft Office? PowerPoint? Minesweeper? Hrm… don’t tell me your next concern will be battery life, will it?

I get being upset that the leaf-blower fan is noisy – I get that, I don’t like it either, and I’m waiting for custom coolers – but to call it fraud / false advertising etc. and make a scandal out of quiet mode on a flagship GPU is ridiculous.

The quiet-mode/performance-variance problem would be a real problem if any of the following applied:

1 – The card were a professional-grade GPU (workstation/server)
2 – It were a mobile GPU
3 – It were a $100 GPU
4 – It affected uber mode

I don’t see any of them in this case…

    • tcubed
    • 9 years ago

Not normal mode, “quiet” mode – meaning the card tries to hold the fan’s RPM under a certain level by sacrificing performance/frequency, that’s it…

Just go to Tom’s or LegitReviews (I think); they have tested the variance and both came to the same conclusions:

1 – In uber mode, variance is negligible
2 – In quiet mode, the retail card performed ~10% lower with its delivery BIOS
3 – Updating the BIOS, using either the press-sample BIOS or the one offered by PowerColor (or even AMD itself), would bring the variance down to ~1%

Draw your own conclusions after you read those in-depth articles; they are quite good!

    • tcubed
    • 9 years ago

Intel never needed to exploit anything; they just bribed everybody and kept quiet… Talk about professionalism, you make me laugh my ass off.

    • tcubed
    • 9 years ago

yeah, like what laws… tell me, I’m curious… + Please do explain why in uber mode there is no difference? If you want performance, just flip it into uber mode and that’s it… good luck proving 1% variance has any merit in a court…

About the laws… did NVIDIA get a fine for Fermi chips falling off the board? Hmm? “The cards are designed to work at that temperature” – remember? Were you this pissed back then?

Or will you rather gamble and buy a 780 Ti and hope yours doesn’t explode? (Because an AIB made a mistake.) Hmm?

geez… nvidia fans are some of the worst… topped only by Apple fans really…

    • tcubed
    • 9 years ago

    voodoo2’s… man those were some seriously mean cards… expensive as fku too I remember they were sold for 1k a pop if I remember correctly and in sli… you got to be really rich 😀

    • tcubed
    • 9 years ago

Most probably a stupid mix-up with BIOSes… I highly doubt it’s more than a stupid mistake, because this has gotten AMD nothing but bad press, and it would have been totally stupid to do it intentionally in the first place, as this whole “scandal” was predictable…

Also, I would really hire some decent PR guys; this is just ridiculous…

Also, I am quite sure that anyone who wants performance will just put this card in uber mode and be done with it – you only get about 1% variance with that.

If you want “quiet mode”, you wait, like me, for non-ref and custom boards with custom coolers, and basta!

    • tcubed
    • 9 years ago

Must be nice to be a worthless nvidia fanboy with obvious reading deficiencies! –

Here’s a song fitting your level of understanding:

Tra la la la la… show us some benchmarks…
Tra la la la la… to support your claim…
Tra la la la la… or go back to your momma
Tra la la la la…

But if anyone dares to point the finger at exploding 780 Tis, everybody goes “Can’t hear you, my fingers are stuck in my ears!”

    • tcubed
    • 9 years ago

Yes, and 2 of them had the decency to update the BIOS and see the variance drop into the 1% area… The rest are just keeping quiet about this + only those 2 actually mentioned the difference between quiet and uber mode…

The rest stick to the old story and imply golden chips, which is simply not true! And they hope it will be such a scandal that they can turn attention away from their flagrant incompetence… At least the other 2 had the spine & technical know-how to dig deeper and find the root of the problem.

The variance is a problem, but not a technical one: a pure PR/Marketing one…

And none of the mentioned review sites say anything about NVIDIA’s exploding 780 Tis… makes you wonder why, doesn’t it? If this card is pushed to the limit… then why does a simple AIB error make the card explode, hrm?? Nobody looks at that now… because it’s much more important to discuss something that, by the way, is now fixed with the new beta drivers… where you can choose how to set your card up…

    • tcubed
    • 9 years ago

What the hell are you talking about?? There are guys out there keeping this card at 1.1-1.2GHz all the time, full throttle @ 70C, with waterblocks or advanced air cooling…

But I do agree AMD’s PR is the worst! I mean… wtf… this statement is a complete joke.

    • bitcat70
    • 9 years ago

“they deserve a big kudos for acting professionally”

In that context? Probably. But they don’t need to act “unprofessionally” if all they need to do is “ask” (nudge, nudge, wink, wink) OEMs not to use those other guys’ chips.

    • derFunkenstein
    • 9 years ago

    Why can’t I buy cards that perform as well out of the box? Why do I have to mod my brand new shiny card? Why am I expected to void the warranty?

    • Bensam123
    • 9 years ago

No, I don’t remember those things, but I’d believe it. A couple of ‘nudges’ in slides is different from trying to get reviewers to make a big deal out of a relatively small problem that can easily be fixed with a new TIM, a couple more percentage points on a fan, or just a new fan in general. Heck, there are still a lot of things that could account for the discrepancy, and a lot of good ideas in the original article besides “AMD is a horrible company that sends super-cherry-picked models out to reviewers.”

    At least TR now has enough R9s for a quad setup to test the new Radeon Crossfire stuff.

    • clone
    • 9 years ago

    agreed.

    • NeelyCam
    • 9 years ago

“I can’t remember when Intel took advantage of their competition’s (that would be AMD, of course) misfortune, but I’m sure they have.”

I’d be surprised if there weren’t any snipes in the press after Bulldozer came out.

    • ronch
    • 9 years ago

Intel, AMD, Nvidia — they all ‘stoke the fire’. Remember when Intel had Sandy Bridge chipset problems? AMD came up with their ‘Ready, Willing, and Stable’ marketing campaign. Remember when Nvidia had melting ball-contact problems? AMD also came out with some nice marketing slides saying how they ensure the quality of their chips. I can’t remember when Intel took advantage of their competition’s (that would be AMD, of course) misfortune, but I’m sure they have. If they never have, they deserve a big kudos for acting professionally so consistently, considering how AMD has stumbled so many times as their competitor in the CPU space. Quite honestly, I think Intel reps are some of the most professional out there. Maybe AMD should poach some of them if they can afford to: AMD probably has the worst marketing and PR hacks in the industry.

    • sid1089
    • 9 years ago

    Yeah. They make very good cryptocoin mining cards.

    • Vasilyfav
    • 9 years ago

    TL;DR: AMD admits shipping R9 290s with a barely adequate cooling system.

    • ronch
    • 9 years ago

Whatever they say and whatever anyone says, I’m chalking this up to lousy QA testing and/or marketing. The marketers were probably the ones who wanted to push these chips all the way to their thermal and electrical limits. Perhaps 95C is ‘normal’, according to some folks, but the fact that these cards can’t stick to their target performance even with the (better?) retail coolers means there’s something amiss. This reminds me of those 220W-TDP FX chips that AMD boasts run at so-and-so clock speeds, but only when Turbo has kicked in, which obviously also depends on environmental conditions (cool weather, hot weather, whatever). So naturally, some of these 290/290X chips will be able to reach higher clocks more easily, some will tend to run a bit cooler, some are destined to be used in cold areas or hot areas, etc. QA testing means testing different chips under different conditions and binning them accordingly. The fact that AMD didn’t even cite base clocks suggests they have something up their sleeve, something fishy if you ask me. With practically every computer part you can buy today, there are guaranteed performance levels, and all units bearing the same model number/specs should be able to operate at a minimum baseline, which AMD’s PR hacks are silent about when it comes to these cards. As ever, AMD’s PR dept. is one of the worst.

So sad to see how AMD is just a shell of its former self. I think AMD needs to fire more crazy marketers. Give them pink slips instead of 8-balls.

    • allreadydead
    • 9 years ago

Do you mean you would buy a 3rd-party 290 with the stock cooler, or that you’d buy one with a 3rd-party cooler, like ACX or DirectCU?
I’d buy a 290 non-X with 2-slot aftermarket cooling, but that was not my point.
My point is merely the one TR made: they sent out BETTER samples and misled everyone. If they were aware of it, and that’s what I understood from the statement, it’s called fraud. That’s the part I take issue with, and that’s why I said “MAKE me buy”. They’d need to convince me they made an honest mistake if I were going to buy that card…

    • bitcat70
    • 9 years ago

[grammar Nazi]
“would of never”
What’s that? Does it grow in one of the forests the Hobbit has to cross?
[/grammar Nazi]

    • Airmantharp
    • 9 years ago

    As others have mentioned, this is more of a marketing flub than a technical one.

    From a technical standpoint, AMD’s PowerTune allows you to get the best out of your GPU. You choose how much cooling to give it, and it runs as fast as it can. That’s awesome.

    • Klimax
    • 9 years ago

Fermi ran stably at its announced clocks. This doesn’t. And it’ll likely get worse. (See Far Cry 3 for the future.)

    • Klimax
    • 9 years ago

And that’s the problem. It’s a bloody money-based roulette, where the house is AMD. You got a card which has a hard time getting even close to the stated boost? Bad luck…

Sorry, unacceptable. And in fact it might run afoul of a few laws in Europe, I suspect…

    • DragonDaddyBear
    • 9 years ago

I doubt AMD did anything intentionally underhanded. Unlike GPUs of just a few generations ago, this one runs as fast as it can up to a thermal limit, rather than at a predetermined clock, resulting in the possibility of variance (based on case, airflow, ambient temp, etc.). We know not all CPUs are alike: some OC better, some can run at slightly lower voltages, some run hotter, etc. Given how large this piece of silicon is, I wager the effect is even more pronounced on GPUs of that size. Perhaps AMD sent out a “higher bin” than was sold to customers, but based on their response(s) I think it’s just a fluke.

Good on AMD for not making excuses, though, and for working hard on trying to lower the standard deviation.

    • Bensam123
    • 9 years ago

Interesting… so the intended benchmark mode was ‘uber’, and uber has much less variability than the normal operating mode? This definitely should be confirmed… if so, that means simply flipping the card to uber will fix the issue people have with this card, which is that it doesn’t produce stable enough performance across cards.

Of course, that also means it’ll run faster and louder. But it seems as though TR’s entire premise for their dismay with AMD was based on normal mode not offering performance that was stable enough across cards.

This is all putting aside that another cooler would fix these issues, or that simply turning the fan up by a few percentage points would probably do so as well. They could even make a slider which would maintain 1GHz by making the fan spin faster. Once again, of course, it would be louder…

It seems as though none of this is set in stone. All of this can be fixed in one way or another, but people are choosing to view it as a problem that can’t be changed, a permanent ‘cheat’ by AMD (IF AMD sent cherry-picked cards in the first place). In other words, that end users never get the performance of the benchmark reviews.

TR definitely should look into the compromise necessary to maintain a steady 1GHz across cards (turning up the fan speed, new TIM) and the variability of uber mode, for the sake of being fair.
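For what it’s worth, the slider being suggested is just the same control loop inverted: fix the clock target and let fan speed absorb the variance instead. A minimal sketch of that idea, with invented thresholds (this is the commenter’s proposal, not AMD’s code):

```python
# Sketch of the "hold the clock, vary the fan" policy suggested above.
# Thresholds are invented; this is the proposed logic, not AMD's code.

TEMP_LIMIT_C = 95

def hold_clock_tick(temp_c, fan_pct):
    """Raise fan speed at the thermal limit instead of shedding clocks;
    the cost is noise rather than performance."""
    if temp_c > TEMP_LIMIT_C and fan_pct < 100:
        return min(fan_pct + 5, 100)
    if temp_c < TEMP_LIMIT_C - 5 and fan_pct > 20:
        return fan_pct - 5  # relax back down when there's thermal headroom
    return fan_pct
```

Under this policy a leakier chip or a hotter case shows up as a louder card rather than a slower one, which is exactly the noise-for-consistency trade-off described above.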

    • Bensam123
    • 9 years ago

I agree… AMD catches a lot of flak, no matter how small the error (4-6 FPS). It’s unfortunate that all this has resulted in a blow-up of hate. It seems like the haters were just waiting, lurking for something like this to appear so they could rage at AMD for not being the company they want it to be and not living up to the other godsend companies.

No one even seems to care that NVIDIA is stoking the fire on this. Sure, it’s in NVIDIA’s best interest to do it, but that doesn’t mean people need to agree with it or like it.

    • dragosmp
    • 9 years ago

And the new ones are just as bad.

If the old marketing team has been fired and the new one makes tons of errors ==> the problem is not (only) in the marketing department.

The leadership has been changed also.

…it’s probably bigger than any single department in AMD; they are getting pushed as a company to deliver something at any price, while sacrificing common sense and their reputation.

PowerTune is a great technology, certainly better than what their CPU division managed; weaker than Intel’s, better than NVIDIA’s and the APUs’. It’s mind-boggling that anyone expected a hot card with a crappy cooler in a closed case, whose performance/speed depends on ambient temperature and airflow, to run as fast as it does on the press’s open test beds.

    • flip-mode
    • 9 years ago

I was actually surprised by the responsiveness of the statement, and beyond that, surprised that AMD responded at all. Usually AMD seems to have its fingers in its ears.

    • Spunjji
    • 9 years ago

    A properly designed system will run cooler than an open test bed. If you’re not properly designing your system and putting in a graphics card that draws 250W of power then you are objectively Doing It Wrong.

    • Spunjji
    • 9 years ago

    This really isn’t anything at all like Bulldozer. It’s more like Fermi.

    • Spunjji
    • 9 years ago

    Cheers DB.

    • Spunjji
    • 9 years ago

    What does being advertised have to do with anything? There is a measurable baseline speed, they just never said outright what it is.

    • Spunjji
    • 9 years ago

    There’s a practical guaranteed minimum that can be easily inferred from testing of the card, if you care to read the relevant articles.

    • Klimax
    • 9 years ago

Aka you get what you pay for. Either a sort-of-adequate solution, or the best there is.

    • Klimax
    • 9 years ago

Best? You have no guarantee on any base or boost frequency, unlike with NVidia, where you do know what the variance will be.

No, it is not the best way, not at all, because you have zero guarantee you will get the promised features/performance.

    • BestJinjo
    • 9 years ago

I’ve heard the same story since 2009; since then, thousands of users with AMD cards have gotten their cards paid for. Litecoin is here to stay for at least 3-4 months, and it only takes 1 month to pay off a single R9 290/X.

    • bfar
    • 9 years ago

‘Review BIOS’… cool…

Why were reviewers being given something different from what was being sold to consumers? The reasons may well be innocent, but most consumers would be justified in expecting that media reviews would be based upon the exact same product that’s delivered to their homes.

    • NeelyCam
    • 9 years ago

This alone points to the utter stupidity of Bitcoin/Litecoin/whatevercoin. It’s like these people think they can magically create value out of nothing with nothing but some electrical energy used to “mine” these “coins”.

The value of these “coins” is based entirely on the *temporarily* observed value of the “coin”. There is nothing of actual value backing any of these. These are crazy speculation schemes at best, Ponzi schemes at worst. Central banks should shut all these down, so people would go to work instead of spending their time building useless computing farms doing nothing of value and just heating up the planet for nothing.

    • NeelyCam
    • 9 years ago

    Both

    • NeelyCam
    • 9 years ago

    Wow, facts hurt like a b****

    • NeelyCam
    • 9 years ago

    My apologies. I wasn’t well informed. I can’t be an expert in everything.

    But at least I try to be nice when I communicate with people on a topic I’m an expert in. That scores me some points, no..?

    • tcubed
    • 9 years ago

No! Because the card needs to be kept quiet, read: a certain fan RPM… it throttles the card so it doesn’t go over a certain level of cooling need. So in quiet mode it sacrifices performance for acoustics by capping fan speed. In uber mode it is performance-oriented and scales the cooling… got it now?

    • tcubed
    • 9 years ago

Well, then you are like me and wait for a non-ref cooler and updated drivers… simple… I know I am!

    • Arclight
    • 9 years ago

If the review was made using Uber mode, I think it warrants further investigation. Running the card in Quiet mode does not show the whole picture. We’ve seen cards cooled with aftermarket solutions maintain 1GHz and go beyond without issue; imo this is just another confirmation of how bad the stock cooler is.

It’s important for end users to know what to expect when they use the card as-is, but they have been warned by many review sites not to buy the reference design unless they can deal with the thermal/noise issues, and now the lower performance when using the fan at lower speeds (aka Quiet mode, even though acoustically it’s not very quiet).

    • Gragodine
    • 9 years ago

What? You have no idea what you’re talking about, do you? Neither AMD nor NVidia has anything to do with cryptocurrencies. You need to inform yourself before making dumb comments like those. PLUS, the days of mining bitcoins with GPUs are long over. Even then, AMD cards would still outperform NVidia cards.

Also, just so you know, there are more cryptos than just Litecoin and Bitcoin:
    XPM
    PPC
    NMC
    TRC
    and more.

    • kalelovil
    • 9 years ago

    Unfortunately it seems they culled their QA team as well.

    • cegras
    • 9 years ago

    Huh? So that means in quiet mode, there’s no guaranteed minimum performance.

    • HisDivineOrder
    • 9 years ago

Actually, I think he did submit the short, concise version of their statement, by trimming all the fat that went into verbose detail about how the system is supposed to work and how it actually is working…

Even I thought their statement was overly verbose for how remarkably little it said. That’s saying something. 😀

    • NeelyCam
    • 9 years ago

    Intel should make some at 14nm and take over the market since others are stuck at 28nm

    • HisDivineOrder
    • 9 years ago

    nVidia didn’t sponsor attack articles. nVidia contributed competitor cards from a third party source for an article about 290 variability. Did nVidia know that the variability was there? Probably. Did they tell TR to say it?

    If they explicitly told TR to report the results exactly as they did word for word without TR doing testing, then that’s advertising. If TR was going to do the test anyway and nVidia merely added more cards to the pile to prove the point, then I wouldn’t call that advertising. I’d call that nVidia trying to help illustrate something that’s more obvious to them (since I’d imagine they buy a lot of competitor product to evaluate the competition) than it is to the poorer review sites.

    I suppose nVidia could have just done an article illustrating the problem, but that’s not going to be received well, is it? But if there’s a problem, are you saying you’d rather not find out even if it’s nVidia that’s the one seeing it?

    Were you also against the articles revealing the frame latency problems AMD had been experiencing for years? Because nVidia was the one that walked AMD’s techs through figuring those out, too. It’s not a big leap to think nVidia might understand better than AMD how bad the variability is, considering nVidia seemed more aware of AMD's frame latency issues, too.

    If this were AMD helping nVidia figure out problems, we’d have so many forum posts saying, “Wow, AMD is so kind and generous for doing nVidia’s job for them!” But when it’s nVidia doing it, it’s nefarious “attack articles!”

    Look. If there wasn’t something wrong, there wouldn’t be an attack article. There’d be nothing to attack. So who are you shooting here? The messenger or the message? If you want to blame someone for there being a problem, blame the people who built the damn product that’s got problems.

    Not the people trying to show you the problem. It’s not even the first time this has happened. Did you even notice how even AMD acknowledges that they too are now starting to recognize problems? 😉

    • NeelyCam
    • 9 years ago

    Exactly, and people would’ve praised the great perf/$.

    AMD engineers wanted to offer something more, but people freaked out about lack of “guaranteed performance”, instead of being happy with the extra boost they got beyond what the “guaranteed performance” would’ve been…

    • HisDivineOrder
    • 9 years ago

    The feeling seems to be they are desperate. They counted on getting by with a card that can’t do what they wanted it to do at reasonable temperatures AND fan speeds/volume. So they sacrificed one or both to get the performance up. Reminds me of Bulldozer, except instead of just having subpar performance, they decided to ramp up the speeds and NOT improve the cooler.

    The cooler was supposed to be enough because they weren’t going to push the silicon to the ragged edge like they did. So the cooler now looks subpar because it’s being asked to do something it wasn’t designed for.

    They had the choice to either further delay the part (already delayed from early 2013) or release as it is, relying on non-reference designs with different coolers to take up the slack.

    This has happened before, you know. Look at the early reviews of the 7970 GHz Edition. Months ahead of actual release, those reviews showed cards whose cooling solutions were outrageously loud. AMD wound up not releasing the reference design for months after that review and let non-reference designs redeem the concept.

    This time, they HAD to get something out there ahead of the next gen consoles, so they couldn’t afford to sit on it.

    Or so they thought. I think they’d have been better served allocating all 290X’s to non-reference designs with better cooling solutions.

    • HisDivineOrder
    • 9 years ago

    Or buy a non-reference version with a cooler up to the task of keeping it at max performance, of course.

    • JustAnEngineer
    • 9 years ago

    Is sponsoring attack articles against your competitor a type of advertising?

    • HisDivineOrder
    • 9 years ago

    That’d be fine if these “max overclocks” could be disabled and we could get a general feeling for what the baseline will actually be. Instead, the “max overclocks” are being represented in a way that doesn’t fall in line with the typical user experience, because most reviews happen rapidly in open testbeds rather than in closed systems.

    Counting on that, AMD can rely on reviews showing these cards in a better light than they deserve. Lots of people have said this was inevitable given AMD’s new PowerTune, but it seems like the reviewers are only now realizing it.

    • HisDivineOrder
    • 9 years ago

    I’d argue that the nVidia version is better not because it led to superior performance, but because it provided more reliable performance that can be counted on. And the spin masters–they of questionable claims–at nVidia didn’t even try to sell cards based only on the boost clock. Imagine if they had. They could have advertised huge speed numbers if they had gone by only the absolute top speed the cards were capable of.

    That’s not even the Boost clock. That’s the higher than boost clock the cards often managed to hit.

    But nVidia–nVidia of all companies!–thought that was too shady to do.

    AMD didn’t. Think on that.

    • HisDivineOrder
    • 9 years ago

    They shipped them cards directly from Newegg. Are you saying that nVidia used contacts inside Newegg to package up bad 290X’s so as to look brand new and then get TR to report on something most sites are also saying is seemingly happening despite limited supplies and comparison points?

    I’m pretty sure TR isn’t being paid by nVidia. They were sent cards to do tests. Is there special advertising being shown I’m missing here? 😉

    Anandtech on the other hand…

    • HisDivineOrder
    • 9 years ago

    Must be nice to ignore the fact that AMD jumped on the idea of selling a product whose performance jumps between extremes rapidly and uncontrollably and call that, “tried to offer maximum performance on each card.”

    But what if anyone dares point out that the performance jumps uncontrollably between extremes, and that this result is not ideal for people trying to get a certain level of return on their investment?

    I just don’t know. I think you’re trying too hard. I think people who pay $400-550 for a video card ought to have a baseline level of performance guaranteed out of box no matter what. You know? A …baseline clockspeed that’s advertised and guaranteed?

    Seems reasonable to me.

    • sschaem
    • 9 years ago

    Dead on, every word.

    I wonder how AMD engineering feels about those bozos in marketing… Must be depressing.

    • SCR250
    • 9 years ago

    No serious miner uses GPUs anymore. ASICs are the way.

    • IPlayNaked
    • 9 years ago

    I wonder why. Maybe it was only mentioned here because Nvidia sent them the cards and wink wink, nudge nudged them to do it?

    Same reasons, I imagine.

    • JustAnEngineer
    • 9 years ago

    Ponzi scheme or money laundering scheme?

    • JumpingJack
    • 9 years ago

    People are not harping (or should not be harping) on the fact that the cards throttle. AMD has obviously made this a high-end card, intended for performance at the sacrifice of power and thermals. At 95C you are only 15-20 degrees away from thermal runaway. They have left almost no room for engineering margin.

    What people are harping on is that the performance presented in reviews is different from the performance you will get when you buy one retail. I count at least 4 different HW review sites that have demonstrated that their press samples were higher performing than retail versions of exactly the same card. This is the consistent observation among them all.

    The impression, rightly or wrongly, is that AMD sent only the best performing samples to the press, and as such, what you buy is not going to perform to what was advertised. The layman term for this is cherry picking.

    Now, every time there is a major HW release, there are almost always some fanboys from the opposing side who bring up the ‘cherry picking’ argument. Often, in fact, this is how you can tell they are fanboys, because there has never really been any direct evidence of cherry-picking, be it from Nvidia, Intel or AMD — at least not until now. Eyebrows are being raised with the 290X because there is now some indication that cherry-picked samples were indeed used to seed the press.

    I am not convinced, personally, either way…. but the odds are stacking up against AMD in this case. AMD acknowledges this:

    [quote<]Reasonably we would expect the variability to occur both above and below the performance of the press samples, however it appears that most reported performances are biased towards the low side. We are actively investigating these reports and we will update when we have completed our investigation.[/quote<] So we will need to wait and see... or we may never know, if they treat this the way I suspect they might.

    • Billstevens
    • 9 years ago

    You're correct on point 1; this does come off as an overclock.

    The upsetting thing is that all initial reviews pointed to higher performance. That means it is likely that AMD put out its best GPU spins, the ones that could handle this overclock, to win the performance crown. When in reality, to maintain stable performance on all 290X GPUs, they needed to be 5% or 10% slower than the competition…. It’s misleading.

    I don’t fully agree with point 2. The inability of certain GPUs to handle the same voltage settings as others, and thus to run well with the same firmware, indicates that the quality of the GPU you happen to get will determine your ultimate performance. Your cooling solution won’t make you a better overclocker if you become unstable at certain frequencies and voltage settings for reasons other than heat.

    • DPete27
    • 9 years ago

    My 2 cents

    1) People should look at the new PowerTune as a max overclock. Everything (CPU/GPU/RAM) OCs differently. This does complicate things from a performance-crown standpoint, though. No ideas on how to tackle that problem.

    2) This whole fiasco will be nullified when custom coolers hit the market in a couple weeks. With an appropriate cooler throttling won’t be an issue.

    • derFunkenstein
    • 9 years ago

    What’s golden is that the HIS sample couldn’t take the lower voltage, duh. Probably time to read the words in the article. If they can’t all handle it, then it’s not really valid is it? Some people might get lucky, but others won’t.

    • aggies11
    • 9 years ago

    Fair enough. I’m not sure it’s just the binning that’s a problem though.

    This time around they had a specific performance target (wanting to match/beat their competitors). Their chips could do it, but needed to be pushed to the limit.

    The problem here is that since they are essentially automatically overclocking their chips, variability is a real concern. The difference in voltage between their review firmware and retail firmware is a very dangerous gamble to take. I’m guessing they put “target” voltage level(s) in the press-sample firmware and hoped that they would get enough chips of sufficient quality to match. That didn’t work out.

    Normally you leave yourself a little headroom for these sorts of things. I’m guessing that marketing/sales pushed for this, probably over some engineers’ protests.

    It’s just a marketing nightmare trying to sell variable-quality products in a market that is used to fixed performance. (And the competitors are delivering it, too.)

    • chuckula
    • 9 years ago

    Effusive praise of AMD is fine (although it could be at least a smidge on-topic and related to your earlier article) but the links to shady eastern-european websites and promises of “thousands a month!” are a bit over the top.

    [Edit I see ximage21 is obviously using those Litecoins to hire sockpuppets to downthumb not only me… but Damage…]

    • Andrew Lauritzen
    • 9 years ago

    How is that any consolation for someone who doesn’t want a leaf blower in their machine? Remember that even the rather questionably-named “quiet” mode is the default, and not exactly quiet to start with.

    • tcubed
    • 9 years ago

    Yeah, like exploding 780 Tis or… chips falling off of the board, that kind of fluff?? A variance of 1% in quiet (read: economy) mode with the review BIOS qualifies as cherry-picking… cool….

    • Pwnstar
    • 9 years ago

    Which is why they fired their marketing team first.

    • bfar
    • 9 years ago

    Cherry picking reviewer samples is also smart and devious.

    I say let Nvidia and AMD tear lumps out of each other. It’s healthy competition and it keeps them both on their toes. They’ve both released great stuff over the years, and they’ve both put out some rubbish along with questionable marketing gaffes. The point is, when they fluff it, they need to be reminded of what we expect as paying consumers. Keep that up and we can look forward to good things in the future.

    • Damage
    • 9 years ago

    Link removed. Pondering the spam ban requests.

    • tcubed
    • 9 years ago

    Does this never end? Wtf is golden about a chip when using the same BIOS on retail shows a variance of about 1 percent??? Geez… golden chips my ass, stupid PR & marketing… the chips are fine, and also did you not get the part where only quiet mode is affected? No? Course not, why the hell would you; the PR text is such BS it’s incredible. You actually need to have a decent IQ and reading comprehension to get it… you obviously don’t qualify.

    • derFunkenstein
    • 9 years ago

    AMD made its biggest mistake by not advertising its card with a baseline speed and advertising it can go “up to” a certain speed within its thermal/power limits. Add to that the incredibly golden press samples and the result is people can’t buy the performance that was reviewed.

    • maxxcool
    • 9 years ago

    hahah too bad you will never see the return…

    • Airmantharp
    • 9 years ago

    This is the kind of flexibility that AMD’s PowerTune should expose: the ability to use any variable as a target. Further, AMD should develop the software to better highlight which particular limit is being hit and causing the card to throttle. Is it voltage, clock speed, GPU temperature, VRM temperature?

    And hell, they should throw a stability test in there too; maybe even run it ‘in the background’ and use it as another variable for throttling, allowing the GPU to exceed all limits if everything is good, and dialing it back and reporting to the user if some issue arises, like a wonky power feed.
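    A rough sketch of the kind of reporting being asked for here (all limit names and values are invented, not any real driver API): check telemetry against every configured limit and surface whichever one is binding:

    [code<]
    # Hypothetical multi-limit arbitration with user-visible reporting.
    LIMITS = {
        "gpu_temp_c": 95.0,
        "vrm_temp_c": 115.0,
        "power_w": 290.0,
    }

    def binding_limit(telemetry):
        """Return (limit_name, amount_over) for the most-exceeded limit,
        or None if everything is within bounds."""
        worst = None
        for name, cap in LIMITS.items():
            over = telemetry[name] - cap
            if over > 0 and (worst is None or over > worst[1]):
                worst = (name, over)
        return worst

    sample = {"gpu_temp_c": 96.5, "vrm_temp_c": 101.0, "power_w": 275.0}
    hit = binding_limit(sample)
    print(f"throttling on {hit[0]}, {hit[1]:.1f} over its limit" if hit
          else "no limit hit, clocks free to rise")
    [/code<]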

    • jihadjoe
    • 9 years ago

    Agree. They should have marketed the chip as 850MHz with a 1GHz boost and everyone would have been happy.

    • odizzido
    • 9 years ago

    I bet all those people who fried their cards on the SC2 menu would have loved a card that throttles well.

    As someone who didn’t have that problem, I still quite like it.

    Now what I would really like is to have some of those cards with a good cooler and to be able to set my own max temperature in the CCC.

    • Andrew Lauritzen
    • 9 years ago

    I don’t think anyone is disagreeing with the existence of “turbo”-like features driven by chip sensors. The problem is that they aren’t validating and binning their chips well enough given that. The consumer needs to be able to reasonably expect a certain level of performance out of a GPU they buy, and that includes the variations in voltage and such that the individual chips themselves have.

    AMD’s response is quite reasonable I think, but it’s clear they do need to do better in terms of binning and validating their parts.

    • Palek
    • 9 years ago

    The story of AMD in a nutshell.

    • NeelyCam
    • 9 years ago

    I think I figured out why ximage’s message was AMD marketing instead of Litecoin marketing: AMD boards/CPUs selling out has nothing to do with Litecoin. I think he just latched onto the Litecoin marketing train because, according to him, AMD GPUs offer some advantage there

    • Airmantharp
    • 9 years ago

    Outside of a GPU being able to determine its max stable clock speeds for itself, AMD’s method comes off as the best solution we’ve seen so far, and it will likely impress once better coolers are attached.

    • jss21382
    • 9 years ago

    Granted, ASICs aren’t available for Litecoin yet; imo it’s only popular right now because GPU computing is still possible. Once ASICs hit this currency the GPU crowd will move somewhere else, same as when they shifted from Bitcoin to Litecoin.

    • Ringofett
    • 9 years ago

    ?? I rarely ever look at the forums, but I almost never see posts like that in articles’ comment sections (though it’s possible they get deleted before I ever look).

    • tcubed
    • 9 years ago

    dude… did you not read the statement? It affects only quiet mode… in uber mode there are no problems… ALL cards can go in both quiet mode and uber mode… so what the hell are you babbling about lottery/minimum guaranteed performance?

    • cegras
    • 9 years ago

    The question is, is there at least a guaranteed minimum performance? When people played the batch lottery with overclocking intel/amd chips, at least they had an advertised frequency.

    • chuckula
    • 9 years ago

    And dedicated ASICs will destroy any GPU and cause the whole [s<]Ponzi scheme[/s<] uh.. I mean "market" to collapse.

    • NeelyCam
    • 9 years ago

    I wonder why none of this is mentioned on Anandtech’s “AMD Center”

    • f0d
    • 9 years ago

    actually nvidia has never been good at mining bitcoins
    amd has the cryptocurrency mining market cornered

    • f0d
    • 9 years ago

    bitcoin didn't have any name recognition a few years ago either, and people thought it was dodgy back then too

    i don't blame ximage for reacting to people saying they “never heard of them”, and he does seem like a non-english speaker trying to educate

    • jdevers
    • 9 years ago

    The link was pretty shameless, but he is actually quite accurate. Every AMD GPU above the 7850 is sold out pretty much everywhere, and this is the reason why. People stopped mining Bitcoin on GPUs a while ago (it's a money-losing proposition now with the ASIC miners), but Litecoin is huge money right now.

    Litecoin has been in the mainstream press a lot lately when the price jumped from a few dollars to about 40 in November.

    [url<]http://www.forbes.com/sites/timworstall/2013/11/28/bitcoin-litecoin-were-well-into-south-sea-bubble-territory-here/[/url<] [url<]http://www.forbes.com/sites/reuvencohen/2013/11/28/cryto-currency-bubble-continues-litecoin-surpasses-a-billion-dollar-market-capitalization/[/url<] And many others. I'm not pimping like the OP did, but I will say that I have made a decent bit of money on Litecoin (if I had 20/20 hindsight and wasn't so risk averse I would be typing this from an island in the South Pacific instead of in an ice storm...oh well, at least I get the free cooling =).

    • tcubed
    • 9 years ago

    ok, AMD PR is obviously ridiculously stupid…

    AMD take a lesson:

    a) nobody cares how cool you tech guys are if you're justifying a blunder

    b) You hid the actual message in a mess of a press release. What are you, like 10?

    c) You NEVER say that the variance is higher, in any context, because idiots will take it out of context and use it as a citation

    d) You don’t make a 3-paragraph statement in which you blame yourself and don’t point to the actual problem…

    e) nobody cares about tech explanations… keep it vanilla.

    f) forum and review readers have very little IQ or technical knowledge - keep it stupid simple

    Who the hell wrote this shit!?!?

    This is a press release:

    “After careful consideration of online buzz and reviews we came to the conclusion that the usage of powertune is very misunderstood and misrepresented.

    While uber mode has a variance in the 1% range, quiet mode is affected by our effort to keep the card both cool and quiet while performing at optimal levels. We have taken your feedback on this feature into consideration and are announcing that with our current driver update you may now control all these settings on your own.

    This change should increase the reliability of benchmarking and flexibility of setup to suit any needs and environment.

    While this affects only quiet mode and should show variance in both directions, we’re investigating the precise circumstances described in the online reports.”

    Now – I wrote that in 5 minutes and every nvidia monkey around would understand it – but no… you HAD to write a lengthy, useless blabber that conveys zero confidence and zero message. Geez… Sometimes I do wonder whether this company really wants to keep being undervalued because of bullshit like this or would rather increase its valuation & market opinion…

    Edit: added pt f in light of recent comment on this page…

    • Goofus Maximus
    • 9 years ago

    I guess the smart engineering was mated to some abysmally bad marketing.

    • superjawes
    • 9 years ago

    Still better than consumers (or anyone) discovering that their retail cards were consistently and noticeably below the benchmark. JohnC started a thread in the forums before TR made any mention of the issue.

    • NeelyCam
    • 9 years ago

    ximage’s original message did sound like spam marketing (‘make 1000s of dollars per month!’) – I just wasn’t sure if it was marketing for Litecoin or AMD

    • superjawes
    • 9 years ago

    My mistake. Corrected.

    • NeelyCam
    • 9 years ago

    Bitcoin market cap is >10x that.

    • f0d
    • 9 years ago

    litecoins are real and have been around for a while

    people used to say bitcoins were not real at one stage too (and i made hundreds of bitcoins when nobody knew about them – oh how i wish i didn't spend them back then and still had them now)

    not sure about the link, as i hardly ever click links when i don't know exactly where they go

    • NeelyCam
    • 9 years ago

    So, litecoin is AMD’s answer to NVidia’s Bitcoin?

    Well, sorry – seems like Bitcoin has cornered the market and litecoin is an also-ran. We don’t really need multiple virtual currencies

    • slowriot
    • 9 years ago

    Huh? ximage21’s reaction tone makes sense to me given chuckula’s equally hostile accusations…

    • Antimatter
    • 9 years ago

    The marketcap of Litecoins recently crossed the $1 billion mark, so they’re quite popular.

    • NeelyCam
    • 9 years ago

    [quote<]that aside litecoins are real[/quote<] I'm sure they are real, and I'm sure there are other similar virtual currencies that haven't been mentioned here. But that's not the point - the point was that Litecoin doesn't have the same name recognition as Bitcoin, and ximage got upset when this was pointed out

    • NeoForever
    • 9 years ago

    [quote<] If the benchmarks models were in the lower 50% of all samples and people were generally getting cards with better performance, AMD would look a lot better right now.[/quote<] While true hypothetically, no one would have spent this much time and attention if retail cards were doing better than the benchmarks. It would have been a fact not well exposed, and hence AMD might have looked better, but only in the eyes of a few.

    • NeelyCam
    • 9 years ago

    “Better” is relative; NVidia achieved higher performance with a larger chip. And I never said that NVidia’s engineering wasn’t smarter.

    • slowriot
    • 9 years ago

    Where are you finding a GTX780 for $450?

    • superjawes
    • 9 years ago

    [quote<]That being said its a tough pill to swallow that spending 400 to 500 dollars on a card is a gamble on what kind of performance to expect. I mean there is no way around feeling screwed if you end up with the slower card after spending the same amount of money.[/quote<] This exactly. If the benchmark models were in the lower 50% of all samples and people were generally getting cards with [i<]better[/i<] performance, AMD would look a lot better right now. But as it stands, with the general trend seeming down, the 780 looks like a better choice at $500. Fortunately, it doesn't seem like this is an issue on the 290 (non-X), so that's probably still the best choice if you're looking for top-tier performance.

    • anotherengineer
    • 9 years ago

    AMD needs some PR like this

    [url<]http://www.youtube.com/watch?v=cf6NjW-o2qs[/url<]

    • tbone8ty
    • 9 years ago

    This would never have been a problem if AMD had spent time designing a good cooler.

    They expected a 2-yr-old 7970 ref cooler design to work, and on a bigger, hotter chip lol

    But good on them for keeping in contact with their fans and trying to resolve issues as of late

    • Meadows
    • 9 years ago

    If it’s smart engineering, why did NVidia do better?

    • Billstevens
    • 9 years ago

    Sounds like there is a price to pay for undercutting Nvidia's prices on the high end. By keeping yields high and prices low, we end up with much more variability in silicon quality. Overall it seems like even the not-so-great 290 cards bring an excellent level of performance for a much better price than we are used to.

    That being said, it's a tough pill to swallow that spending 400 to 500 dollars on a card is a gamble on what kind of performance to expect. I mean, there is no way around feeling screwed if you end up with the slower card after spending the same amount of money.

    Maybe some more free games could soften the blow

    • f0d
    • 9 years ago

    i didn't click the link – it might be dodgy, it might not (i hardly ever click links when i don't know exactly where they go)

    that aside litecoins are real

    • NeelyCam
    • 9 years ago

    [quote<]we’ve implemented an all new PowerTune mechanism in the AMD Radeon R9 290 series that exploits the full capability of the individual GPUs rather than clamping performance to a least-common-denominator type of capability level[/quote<] This is smart engineering - AMD tried to offer maximum performance on each card. IMO, they don't deserve the attacks and scandals and whatnot that resulted. But whatever greatness the engineers developed was rendered pointless by PR/Marketing presenting this in a bad way. And NVidia marketing jumped on this PR disaster like a rabid wolf. They are smart and devious
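    To put the quoted engineering idea in toy terms (the numbers here are entirely invented): instead of clamping every chip to the worst sample's stable clock, each chip runs at whatever it individually can sustain:

    [code<]
    # Toy contrast between "least common denominator" clamping and
    # per-chip limits; per-sample capabilities are made-up numbers.
    chips_max_stable_mhz = [947, 1000, 962, 981, 1000]

    lcd_clock = min(chips_max_stable_mhz)   # old way: everyone held to 947
    per_chip = chips_max_stable_mhz         # new way: each runs its own max

    print(f"clamped: every card at {lcd_clock} MHz")
    print(f"per-chip: {per_chip} -> faster on average, but variable")
    [/code<]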

    • Billstevens
    • 9 years ago

    I’ve always liked the Tech Report forums because there are virtually no shameless adverts. I say good work to the admins and community.

    • ximage21
    • 9 years ago

    Business Insider covered Litecoin and its relationship to Bitcoin several days ago (basically it is a virtual currency that is like “silver” compared to Bitcoin “gold” and is mined mostly by AMD GPUs because of hardware superiority).

    [url<]http://www.businessinsider.com/introduction-to-litecoin-2013-11[/url<]

    • Laykun
    • 9 years ago

    This whole problem reminds me of when I got my Dell L521X. The cooling solution was just inadequate for the hardware inside, and as a result it throttled itself considerably (although here the throttling is much less severe). I get the feeling nothing will really be fixed with the current generation of 290X cards and it won’t be till the 390X when you’ll see a revised cooling solution.

    • NeelyCam
    • 9 years ago

    Bitcoin I’ve heard of. Litecoin… never. Wired, PBS, NBC, Time and WSJ have a HUGE number of articles. Sorry for not spending the time to read all of them.

    However, I find it interesting that you seem personally insulted about people not knowing about litecoin. Instead of blaming us, maybe you should put more effort towards brand awareness. And a tip: spamming forums and comments sections of tech sites and insulting their readers is bad marketing

    • nanoflower
    • 9 years ago

    I said I had never heard of Litecoin because I haven’t. Every discussion I’ve read/watched/heard of digital currency and mining has been around Bitcoin. There may be other forms of digital currency out there, but Bitcoin is the one that gets mentioned by the majority of people. Besides, the link points to a foreign-language site, which is always a red flag on an English-language site (whether it’s correct or not).

    • chuckula
    • 9 years ago

    BAN! WITH! FIRE!

    • bittermann
    • 9 years ago

    Nope… but the truth must sting a little if all you can come up with is a dig at my username. Might want to up your integrity standards a bit as well.

    • Stickmansam
    • 9 years ago

    The reference 290X is not worth it if you’re not going to water-cool it, put on a custom cooler, or mod it.

    • ximage21
    • 9 years ago

    Both Bitcoin and Litecoin, and virtual currency in general, have been in the major media for a number of months!

    From Wired to PBS to NBC to Time to Cnet to Engadget to the Wall Street Journal; how can you say that with a straight face?

    The link is from a tech site covering the performance of AMD Radeon R9 cards along with the worldwide demand for them.

    • kamikaziechameleon
    • 9 years ago

    I’m very happy with TR transparency on this.

    I doubt there is anything truly exceptional happening here in regard to pre-release units not reflecting the actual product. I actually recall a few TR editorial remarks on how annoying it was to work with pre-release, non-consumer hardware.

    Ultimately it comes down to this: the 290X might not be worth its own premium. Prices will adjust, and then the people who were holding back will get it and everyone lives happily ever after.

    • kamikaziechameleon
    • 9 years ago

    Bitter much man?

    This is the highest integrity website for tech I frequent.

    • Vhalidictes
    • 9 years ago

    Eh. It’s an issue, to be sure. But considering the price differential vs the green team I’m not too upset.

    • superjawes
    • 9 years ago

    [quote<]They admitted to performance variability and bias of the variability to the low side.[/quote<] Right there you disprove your own point. It just boils down to what I summarized, which is that retail 290s are performing under the press samples. AMD uses a wordy, overly-complex way to say this. And most of the press release just repeats how the 290 series works; it's not an explanation of the problem. The problem has nothing to do with PowerTune and everything to do with benchmarks that come across as misleading. There's really not a nice way to put it. If AMD shipped cherry-picked 290s to inflate the benchmarks, that's shady, and it really hurts anyone who picks an AMD card based on said benchmarks.

    • nanoflower
    • 9 years ago

    Yep.. Never heard of Litecoin before this and with the link included I would say this is a spammer so bringing down the ban hammer seems justified.

    • nanoflower
    • 9 years ago

    I agree with Jdaven. There’s not much more that AMD could say at this point unless they just wanted to say “oh yeah, we cherry picked the GPUs for our review samples.. Sorry about that.” I don’t see any company stepping up to say that if it were the case, which it may not be.

    • panthal01
    • 9 years ago

    TR is actually my favorite website and the first I look at every day, and it has been for many years. It's actually rare for me to feel inclined to post about something.

    With that being said, this whole article has bothered me. Between Nvidia buying the cards, them not being tested in uber mode, and the fact that every single vendor since the beginning of reviews has sent their best samples. Krogoth alluded to this earlier, so maybe some of you have not been around long enough to appreciate that.

    Before someone says I'm biased: I have used nothing but Nvidia cards since I ditched my Voodoo 2s in SLI (now that was an uber setup for the time).

    • maxxcool
    • 9 years ago

    wow double advert in one post… nice… BAN!

    Joined:Tue Oct 22, 2013 2:01 pm
    Last visited:Thu Dec 05, 2013 1:58 pm
    Total posts: 0
    (0.00% of all posts / 0.00 posts per day)

    • Airmantharp
    • 9 years ago

    If said products were available, this wouldn’t be an issue.

    • south side sammy
    • 9 years ago

    since when is it news that manufacturers send cherry-picked parts to review sites? ati/nvidia/intel, etc. all do it and probably almost always have.
    the only way around it is for review sites not to accept pre-release product and to hold the review until the card can be purchased the way everyday Joes have to buy it.
    that’ll never happen.

    • DaveBaumann
    • 9 years ago

    FYI – we asked for the firmware flash/test to be done. At present we have not been able to replicate the results seen, and the retail BIOS should be the same as the press BIOS (checksums we have gotten from others indicate this to be the case), but it’s part of the investigation. The press BIOS is posted in TPU’s BIOS database, so anyone can try it.
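    For anyone who wants to try that comparison themselves, a minimal sketch (the file names are made-up examples, e.g. a dump saved with GPU-Z versus the press BIOS downloaded from TPU's database):

    [code<]
    # Check whether two BIOS dumps are byte-identical by comparing hashes.
    import hashlib

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    press = sha256_of("press_290x.rom")    # hypothetical file names
    retail = sha256_of("retail_290x.rom")
    print("identical" if press == retail else "BIOSes differ")
    [/code<]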

    • chuckula
    • 9 years ago

    Hrmm… banhammer for spamming with that last edit? I can’t tell the difference between that and the usual spam links that infest the forums now & again.

    Edit: Yeah, multiple edits and he even put in a spam-link. Way to make AMD look bad…

    • Forge
    • 9 years ago

    No, they completely dodged the firmware question, which is a smoking gun.

    AMD lied to us all, by sending out cherry-picked golden review samples, with non-retail tweaked firmwares that give higher clocks and higher performance.

    [url<]https://techreport.com/review/25712/are-retail-radeon-r9-290x-cards-slower-than-press-samples/6[/url<] All they "admit to" here is that there's a difference between reviewed and retail cards, and then they spin the complete lie that they have no idea why. They know EXACTLY why.

    • ximage21
    • 9 years ago

    The biggest news is that just about all retailers around the world are sold out of AMD HD7950, HD7970, R9-280, R9-280X, R9-290, R9-290X and 7990 cards along with tons of AMD motherboards and CPUs.

    People are apparently not just gaming on AMD, but also mining virtual currency like Litecoin and making 1000s of dollars per month with multi-Radeon GPU rigs!

    • bittermann
    • 9 years ago

    TR is the only one making a mountain out of a molehill. Honestly, I stopped reading the article after “Nvidia offered to buy the cards”. Don't care if it's slightly justified; that just seems pathetic (for NV, not TR) in my book. 3rd-party OEM products with better cooling will eventually take care of all the crybabies screaming throttling!

    • jdaven
    • 9 years ago

    Uh, no. The statement was very clear. It used common English words and takes 30 secs to read. They set up the problem in the first paragraph, gave an explanation and the resulting action in the second paragraph, and put extra information and a summary of how the products should perform in the last paragraph.

    They admitted to performance variability and a bias of the variability to the low side. Admitted that the drivers and hardware react to different environmental conditions. Agreed to an investigation and improvements to the power algorithms.

    Please submit how you would have liked AMD to word the statement.

    • drfish
    • 9 years ago

    [quote<]Reasonably we would expect the variability to occur both above and below the performance of the press samples, however it appears that most reported performances are biased towards the low side.[/quote<] I'm not thrilled about the situation, but I am satisfied with this acknowledgement of it. I don't envy the guy who has to benchmark a chip that is smart enough to run as fast as it decides it can, but the writing has been on the wall for a while and it looks like it's time to adapt.

    • superjawes
    • 9 years ago

    The statement uses a lot of words, runaround, and restating of what the 290 series does to say:

    [quote=”AMD”<]Yes, retail 290 series cards are performing below press benchmarks.[/quote<] And it's not like this is a new issue either. This surfaced weeks ago, but now people like Scott have been able to independently retest and present results from third party investigation.

    • Firestarter
    • 9 years ago

    I’d buy one of the 3rd party 290’s

    • jdaven
    • 9 years ago

    Uh, I thought the statement covered everything so I’m not sure about your criticism.

    Anyway, the real evaluation will come after AMD’s investigation and how AMD deals with the results.

    • allreadydead
    • 9 years ago

    In before krogoth;
    I’m not impressed :\ Come on AMD, MAKE me buy a 290..

    • Forge
    • 9 years ago

    Wow, Such PR
    much BS
    Doge hate

    Seriously, AMD? Hiding behind this? No comment on review sample firmware increasing performance when applied to retail cards?
