Are retail Radeon R9 290X cards slower than press samples?

With the release of the Radeon R9 290 and 290X, AMD upended the high-end graphics market by offering performance competitive with Nvidia’s existing products at substantially lower prices. The new Radeons didn’t just improve the value proposition, either. The R9 290X captured the overall GPU performance crown, wresting it away from the GeForce GTX 780 and Titan by the slimmest of margins. Such little differences are magnified in the world of high-end graphics, where the spoils—and sales—often go to the victor. After all, if you’re forking over something north of 500 bucks for a graphics card, bragging rights are probably involved to some extent.

You can imagine, then, how things went a bit pear-shaped when folks started reporting that Radeon R9 290X cards purchased at retail don’t seem to perform as well as the review units AMD supplied to the press.

Whoops. Sounds bad, doesn’t it? How can that be?

Well, from here, things get kind of complicated. Although the retail R9 290-series cards appear to have the same basic hardware and specifications as the review samples, the zillion-dollar question is what happens during everyday operation. You see, like the Turbo Boost mechanism in Intel CPUs, the Radeons’ PowerTune algorithm adjusts clock speed dynamically, from moment to moment, in response to current chip temperatures, the GPU workload, and the video card’s pre-defined power limits. For one reason or another, folks found that at least some retail R9 290-series cards seemed to operate at lower clock speeds than those initial review units.
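AMD hasn’t published the innards of PowerTune, but any scheme like it boils down to a feedback loop: sample the chip’s temperature and power draw, then nudge the clock up or down within a defined window. Purely as an illustration, here’s a minimal sketch of such a loop in Python. The 95°C target, 727MHz floor, and 1GHz ceiling reflect what we observed in our own logs later in this article; the step size and power limit are invented numbers, not AMD’s.

```python
# Illustrative DVFS loop in the spirit of PowerTune/GPU Boost. This is not
# AMD's actual algorithm; the constants marked "hypothetical" are invented.
from dataclasses import dataclass

@dataclass
class Limits:
    temp_target_c: float = 95.0   # throttle point we observed in testing
    power_limit_w: float = 290.0  # hypothetical board power limit
    clock_min_mhz: int = 727      # apparent (undocumented) base clock
    clock_max_mhz: int = 1000     # advertised "up to" clock
    step_mhz: int = 13            # hypothetical adjustment step

def next_clock(clock_mhz: int, temp_c: float, power_w: float, lim: Limits) -> int:
    """Pick the next clock state from the current temperature and power draw."""
    if temp_c > lim.temp_target_c or power_w > lim.power_limit_w:
        # Over a limit: step the clock down, but never below the base clock.
        return max(lim.clock_min_mhz, clock_mhz - lim.step_mhz)
    # Headroom available: step back up toward the peak clock.
    return min(lim.clock_max_mhz, clock_mhz + lim.step_mhz)

if __name__ == "__main__":
    lim = Limits()
    clock = lim.clock_max_mhz
    # Fake telemetry: the card warms up, crosses the target, then recovers.
    for temp, power in [(80, 250), (90, 270), (96, 285), (97, 288), (94, 280), (93, 275)]:
        clock = next_clock(clock, temp, power, lim)
        print(f"temp={temp}C power={power}W -> clock={clock}MHz")
```

The practical upshot of any loop like this is that two physically identical cards can settle at different clock speeds if one runs a few degrees warmer or needs a touch more voltage, which is exactly the behavior at issue here.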

AMD identified one apparent cause of the problem pretty quickly: the blowers on some retail cards weren’t spinning as fast as expected, and the reduced cooling capacity resulted in lower clock speeds. This explanation was quite plausible. Heck, we’d already seen how an increase in blower RPM can improve the R9 290X’s performance when we switched it into “uber” fan mode during our initial review. One can imagine that different blowers might not respond to increases in voltage quite the same way. If blower RPM were varying substantially from card to card, that might well explain the clock speed differences.

AMD soon issued a fix in the form of a software update. The Catalyst 13.11 beta 9v2 driver sought to equalize blower speeds from card to card by monitoring RPM directly, thus hopefully improving performance on retail cards that seemed to lag behind.
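AMD hasn’t spelled out exactly what the driver changes, but the reported difference is a move from commanding a fixed PWM duty cycle, which lets two blowers settle at different speeds, to servoing the duty cycle against a measured RPM target. Below is a hypothetical toy model of that distinction; the duty-to-RPM response and the control gain are invented, and only the roughly 2,200-RPM figure matches the cap we observed later under a worst-case load.

```python
# Open-loop PWM control vs. RPM-feedback control, as a toy model. The blower
# response (rpm_per_pct) and the gain are invented for illustration only.
def blower_rpm(duty_pct: float, rpm_per_pct: float) -> float:
    """Pretend blower: RPM scales linearly with the commanded duty cycle."""
    return duty_pct * rpm_per_pct

def closed_loop_duty(target_rpm: float, measured_rpm: float, duty_pct: float,
                     gain: float = 0.01) -> float:
    """Nudge the duty cycle until measured RPM reaches the target, so
    card-to-card blower differences wash out."""
    error = target_rpm - measured_rpm
    return min(100.0, max(0.0, duty_pct + gain * error))

if __name__ == "__main__":
    # Two hypothetical blowers, one a bit weaker per percent of duty cycle.
    for rpm_per_pct in (55.0, 47.0):
        duty = 30.0
        rpm = blower_rpm(duty, rpm_per_pct)
        print(f"open loop:   {duty:.0f}% duty -> {rpm:.0f} RPM")
        for _ in range(25):  # close the loop on a ~2200-RPM target
            duty = closed_loop_duty(2200.0, rpm, duty)
            rpm = blower_rpm(duty, rpm_per_pct)
        print(f"closed loop: settles near {rpm:.0f} RPM at {duty:.1f}% duty")
```

With a fixed duty cycle, the weaker blower in this toy example spins a couple hundred RPM slower; with the feedback loop, both converge on the same target, with the weaker blower simply working harder (and presumably sounding louder) to get there.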

That change seemed sure to help, but as we discussed on our podcast, we had lingering questions. Had blower speeds increased generally, making the R9 290-series cards even louder? Because, you know, they were awfully darn loud before. More importantly, how much of the card-to-card variance remains, even with the new driver? I really wanted to know.

We had motive to test some R9 290X retail cards against our press samples, but we lacked the means. Although you may have heard stories about the glitzy lifestyles of semi-obscure hardware reviewers, the truth is that we can’t just order up several $549 graphics cards on a whim. Heck, these days, I can’t order lunch on a whim. Loading up a shopping cart at Newegg with 290-series Radeons wasn’t really an option.

Then something funny happened. We got a call from the folks at Nvidia offering to purchase a couple of retail R9 290X cards for us to test. The cards would be ordered from Newegg and shipped directly to Damage Labs for our scrutiny. The sample size wouldn’t be large, only two cards (with boxes still sealed) pulled at random from Newegg’s stock, but apparently the green team was confident enough in the likelihood of differences between our review samples and the retail cards to make the purchase. Since we were interested in exploring the question—and a little amused by the prospect of these fierce competitors buying one another’s products—we accepted the offer.

A couple of days later, we took delivery of two Radeon R9 290X cards: one from HIS and the other from Sapphire. Apart from the stickers on the cooling shrouds, the two look to be identical to one another and to our two R9 290X review samples. Almost immediately, I started some initial testing, to see if I could spot any obvious differences between the cards. Little did I know how much work lay ahead.

Our testing methods

Our test systems were configured like so:

Processor: Core i7-3820
Motherboard: Gigabyte X79-UD3
Chipset: Intel X79 Express
Memory size: 16GB (4 DIMMs)
Memory type: Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz
Memory timings: 9-9-9-24 1T
Chipset drivers: INF update 9.2.3.1023, Rapid Storage Technology Enterprise 3.6.0.1090
Audio: Integrated X79/ALC898 with Realtek 6.0.1.7071 drivers
System drive: Corsair F240 240GB SATA SSD
Power supply: Corsair AX850
OS: Windows 8.1

The graphics cards were configured as follows:

Card                       Driver revision             GPU base clock (MHz)   GPU boost clock (MHz)   Memory clock (MHz)   Memory size (MB)
GeForce GTX 780 Ti         GeForce 331.82 beta         876                    928                     1750                 3072
Radeon R9 290X sample 1    Catalyst 13.11 beta 8/9v2   —                      1000                    1250                 4096
Radeon R9 290X sample 2    Catalyst 13.11 beta 9v2     —                      1000                    1250                 4096
HIS Radeon R9 290X         Catalyst 13.11 beta 8/9v2   —                      1000                    1250                 4096
Sapphire Radeon R9 290X    Catalyst 13.11 beta 9v2     —                      1000                    1250                 4096

Thanks to Intel, Corsair, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

In addition to the games, we used GPU-Z to log clock speeds, fan speeds, temperatures, and voltages during our test sessions, along with MSI’s Kombustor utility as a synthetic worst-case workload.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Our first test case: Skyrim

We knew up front that finding solid answers to our questions about card-to-card variance might be difficult. For one thing, the R9 290X is particularly sensitive to ambient temperatures. A warmer environment can produce lower clock speeds, and cooler ambient temps can lead to higher clocks. Unfortunately, Damage Labs isn’t set up for precise climate control. I’m lucky to have breathable oxygen most of the time. I did try to keep room temperatures from rising too high by cracking open a window when running a load test caused the room to heat up, but maintaining a perfectly steady environment just wasn’t realistically possible.

Our solution for that problem was to take lots of samples, especially for our main test case, in Skyrim. (We chose this game because it’s a reasonably taxing workload, which is also why we’ve used it for testing GPU power consumption in the past.) We monitored each card’s vitals for 30 minutes while running the game with our character standing still in a particular spot, and we repeated that session three times per config to see how much clock speeds varied from run to run. With seven configs in the mix, that works out to 10.5 hours of testing at a minimum, just in Skyrim. The actual testing time was much longer, since when we started, we were kind of clueless about the best way to proceed.

Our primary goal was to compare the performance of the retail and press sample 290X cards with the new Catalyst 13.11 beta 9v2 drivers, which attempt to equalize blower speeds. However, out of curiosity, we also decided to test our initial review unit and the HIS retail card with the older 13.11 beta 8 driver, which doesn’t equalize fan speeds, to see how much of a difference that software change makes.

We’ve plotted a number of variables from our test sessions below. You can click the buttons beneath the plots to see the results from each card. The plots come from one of the three test runs, while the bar charts show the median results from three runs. Also, note that the unit of time on the X axis in the plots is seconds. Somehow, I failed to include the units when making the graphs. Too much sitting around in white fan noise has dulled my wits, apparently.

The card labeled “290X sample 1” is the review sample from AMD that we used in our R9 290X review. Sample 2 came to us a couple of weeks later, also from AMD, after we requested a second card for CrossFire testing. The HIS and Sapphire cards are the retail units. I’ve also included numbers from our GeForce GTX 780 Ti review sample for comparison.


You can see several things in the plots. Each 290X card starts out at a solid 1GHz. Then, its clock frequency drops and bounces around as the GPU reaches its temperature limits and PowerTune starts working to balance temperature, power draw, and operating frequency. The amount of time before the clock throttling begins varies, depending mostly on how warm the GPU was before we fired up Skyrim. To account for this variability, we clipped off the first five minutes (300 seconds) of the test period when calculating the average clock speed (or fan speed or FPS) for each run.
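For the curious, here’s roughly how each 30-minute session gets reduced to a single number: throw away the first 300 seconds, average the rest, then take the median of the three per-run averages for each config. This is a sketch rather than our actual tooling; the CSV column name and the one-sample-per-second assumption are placeholders, not GPU-Z’s real log format.

```python
# Sketch of the per-run reduction: drop the warm-up period, average the rest,
# then take the median across runs. Column names are placeholders.
import csv
from statistics import mean, median

def run_average(path: str, column: str = "core_clock_mhz",
                warmup_s: float = 300.0, sample_interval_s: float = 1.0) -> float:
    """Average one logged variable over a run, ignoring the warm-up period."""
    values = []
    with open(path, newline="") as f:
        for i, row in enumerate(csv.DictReader(f)):
            if i * sample_interval_s >= warmup_s:
                values.append(float(row[column]))
    return mean(values)

def config_median(paths: list[str], column: str = "core_clock_mhz") -> float:
    """Median of the per-run averages for one card/driver config."""
    return median(run_average(p, column) for p in paths)

# Hypothetical usage with made-up file names:
# config_median(["his_290x_run1.csv", "his_290x_run2.csv", "his_290x_run3.csv"])
```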

Click over to the HIS 290X’s results, and you can see that the newer Catalyst driver’s blower-speed change has paid off. The HIS card clearly runs slower with the older Catalyst 13.11b8 driver. By contrast, our initial review unit is largely unaffected by the change. You can also see that sample 1’s clock speeds are somewhat higher than the other cards’. If we plot the median clocks across three runs, here’s how things line up:

That’s a relatively large amount of variability across four copies of the same card, especially since we’re running a popular game. Skyrim isn’t a peak workload in terms of power consumption or thermals. You can see that the initial review unit is the fastest of the bunch, regardless of which driver we use, while the HIS card brings up the rear. The newer driver does help the HIS card make up some ground, but it still trails sample 1 by quite a bit.

The graph above is just a summary, though. This table will give you a sense of the clock speed variability from run to run, as the temperature in Damage Labs fluctuated.

Clock speed (MHz)             Run 1   Run 2   Run 3   Median
Sapphire R9 290X                891     924     902      902
HIS R9 290X                     882     893     889      889
HIS R9 290X – 13.11b8           845     832     835      835
290X sample 1                   928     952     930      930
290X sample 1 – 13.11b8         947     958     957      957
290X sample 2                   913     912     903      912
GTX 780 Ti                     1005    1004    1005     1005

Clock frequencies for the individual cards varied by as much as 22MHz from their three-run medians during our Skyrim tests. Ideally, we’d impose stricter temperature controls and do even more testing. Heck, in an ideal scenario, we’d have a much larger selection of 290X cards to test, and I’d be conducting those tests from the temperature-controlled lower deck of my enormous luxury yacht anchored off the Yucatan coast, aided by a team of cheerleader interns. Sadly, that’s not the case. Still, I think the trend for each card begins to become clear after several runs—and we have more data to review.


The new Catalyst drivers raise the HIS card’s blower speed by over 300 RPM. Obviously, they’re addressing a very real problem. Those drivers also raise the blower RPM slightly on sample 1, our first review unit. That means the 290X will be a little louder overall than our initial review indicated. (All of these tests were conducted in the 290X’s default fan speed profile, not in “uber” mode.)

Notice the slight saw-tooth pattern on the plots for each of the 290X cards when used with the new drivers. The plots are much flatter with the 13.11 beta 8 drivers, which means the sound coming from the blower should be smoother and less variable, not quite as easy for the ears to notice. Presumably, when those little spikes happen with the new drivers, the GPU is reaching a thermal limit and needs additional help. The ensuing ramp up is abrupt, although the ramp down is more linear. Contrast that to the fan speed curve for the GTX 780 Ti, which is smoother than an Nvidia marketing pitch.

Also, in the “stuff you didn’t expect” department, notice that the blower RPM for the GeForce GTX 780 Ti is higher than for any of the 290X cards, even though the 780 Ti is much quieter under load than the R9 290X. Nvidia’s blower appears to have a slightly smaller diameter, but I’m impressed that it runs at substantially higher RPM and produces less noise.


Each plot starts with a flat line for the first couple of minutes, and then the FPS numbers start varying up and down at regular intervals. Why? Because after your character stands still in Skyrim for a while, the game switches to an “attract mode” where the camera pans around him in a circle. The variance you see in the FPS plots is caused by that constantly changing perspective.

The clock speed variance we saw above translates into performance differences fairly predictably. The total gap in terms of FPS is fairly small, since we’re testing at 4K resolutions where Skyrim doesn’t run at really high frame rates. There is a 10% gap, though, between the HIS 290X and sample 1 with the 13.11 beta 8 drivers. With the newer drivers, where blower speeds are more even, the FPS gap narrows to about 5%. We might see larger differences at lower resolutions, where GPU speeds are likely to play a larger role than they do at 3840×2160. (At 4K, memory bandwidth has gotta be a big constraint.)
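One way to put rough numbers on that, using the median Skyrim clocks from the table above and treating sample 1’s 930MHz (beta 9v2) result as the baseline:

(930 − 835) / 930 ≈ 10.2% clock deficit for the HIS card on the 13.11 beta 8 driver
(930 − 889) / 930 ≈ 4.4% clock deficit for the HIS card on the 13.11 beta 9v2 driver

Those clock gaps track the FPS gaps reasonably well, which is about what you’d expect when GPU clock speed isn’t the only thing limiting frame rates at 4K.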

Thing is, small differences in performance can cost you a lot of money at the very top end of the graphics card market, with bragging rights on the line. The price difference between the Radeon R9 290 and 290X is $150, and the two were 3% apart in our overall performance assessment. That example’s a bit extreme, but we saw a 12% difference between the GeForce GTX 780 and the GTX 780 Ti. The price difference between those cards is $200.

The worst-case scenario: MSI Kombustor

In addition to a real-world game workload, I thought it might be interesting to try something more extreme. MSI’s Kombustor utility is based on the infamous FurMark tool that the GPU companies have taken to calling a “power virus.” Early versions of PowerTune were created in part to deal with programs like this one, which tend to push GPUs to their power and thermal limits.

Since Kombustor heats up a GPU pretty quickly, I decided to shorten our test sessions to 20 minutes each. As before, we then disregarded the first five minutes as a warm-up period. We also used only two test runs per config here, to keep our total testing time in check.


Interesting. For the most part, the 290X cards settle in at 727MHz, which appears to be the 290X’s undocumented base clock, and range up or down from there only intermittently. There is some variance among them. The Sapphire, for instance, drops as low as 400MHz at one point. By contrast, our first review sample somehow manages to avoid dropping down to the 727MHz baseline. As a result, sample 1 maintains a much higher average frequency than the rest of the bunch.

Oh, I should mention something. I logged GPU temperatures throughout all of these tests. The reason you don’t see them plotted here is simple: they’re boring, flat lines at 94-95°C for each 290X card. PowerTune is quite effective at maintaining its target temperature, as is Nvidia’s GPU Boost. The other variables are the ones that fluctuate.


Hmm. Even in this extreme thermal workload, both of the press cards’ fan curves are relatively flat, with only an occasional bump above the roughly 2200-RPM target. The HIS and Sapphire cards both make sudden forays into higher-RPM territory, only to decay slowly and then repeat the process. The retail cards will surely be louder as a consequence of their higher blower RPM.

This divergent behavior suggests there’s some sort of difference between the review units and the retail cards. Either the fan control algorithms are somehow working differently, or the chips on the retail cards simply require more voltage (and thus generate more heat) in order to operate at similar clock speeds.

The most striking difference in this context, of course, is the performance of our first 290X review sample. That card’s average frequencies are as much as 128MHz faster, yet its RPM curve remains lower and flatter than the retail units’.

Crysis 3

To round out my testing, I decided to try a couple more games, to see if the clock speed differences we’d observed would persist across other workloads. I was only able to conduct a single, 25-minute test session for each config, but I’m hoping the additional data proves enlightening, even if it isn’t as solid as our triple-run Skyrim results.

As in our other game tests, we simply had our character stand still and look at a (mostly) static scene. This game doesn’t go into any sort of attract mode, so the workload doesn’t vary much at all over time.




Our initial review sample continues to outperform the rest of the 290X cards in Crysis 3, as we’ve seen elsewhere. The difference is larger in terms of clock speeds than in frame rates, again probably because we’re testing at a massive 4K display resolution.

Yes, I should have tested at a lower resolution. We might well see more separation between the 290X cards if I had. I fail at media sensationalism yet again.

Battlefield 4

I tested BF4 much like Crysis 3, over a 25-minute period while viewing a static scene with relatively high quality levels at a 4K resolution.




The basic trend holds, with review sample 1 outperforming the other 290X cards. Again, the frame rate differences are pretty small in absolute terms, but nobody wants to play BF4 at 36 FPS, anyhow. The bigger problem is the clock speed difference of just over 5% between sample 1 and the two retail cards. That gap will translate into larger FPS deltas in other scenarios.

A firmware solution?

We were about to wrap up our work on this article when we became aware of another variable that might warrant some attention. In fact, the folks at AMD pointed out this issue, since they’re currently puzzling over it themselves.

Nate over at Legit Reviews has been looking into this same set of problems, and he found that firmware differences between the press and retail cards might be playing a role. Like us, he measured clear differences between the performance of his 290X review sample and some retail cards. He then extracted the firmware from his 290X review unit, flashed it to a retail 290X, and tested again. Turns out the retail 290X performed better when using the press sample’s firmware.

Why is that? We don’t know, and neither does Nate; AMD hasn’t answered our repeated inquiries about what the cause might be. At AMD’s request, though, we captured the firmware from 290X sample 1 and flashed it to our HIS retail card. We then ran this card through our triple-session Skyrim test to see how it fared.

Clock speed (MHz)                  Run 1   Run 2   Run 3   Median
HIS R9 290X – retail firmware        882     893     889      889
HIS R9 290X – sample 1 firmware      908     907     912      908
290X sample 1                        928     952     930      930

Hmm. With the firmware change, the HIS card’s clock speeds look to be up by about 20MHz in this test scenario, but they’re still about 20MHz lower than the clocks of 290X sample 1. Could the change be due to some difference in cooler RPM?

Not that I can tell. Heck, the HIS card could simply be faster because of variances in ambient temperature, although frankly, I doubt it. I let the room heat up a bit during the final test run with the press sample’s firmware, and the HIS card was still faster than in our first round of tests.

For what it’s worth, the alternative firmware didn’t alter the HIS 290X’s performance much at all. The card averaged about 76 FPS with either firmware revision.

The HIS card did seem to be a little unstable with the press firmware, though. Our test rig locked up several times during Skyrim testing. Could it be that the press sample firmware applies a lower GPU voltage over time? Slightly lower GPU voltages would explain both the higher clock speeds—due to added thermal headroom—and the instability, if the Hawaii GPU on the HIS card isn’t quite up to the task of stable operation at those voltage levels.

To find out, I dug into the GPU-Z logs once more.

Average VDDC (mV)                  Run 1   Run 2   Run 3   Median
HIS R9 290X – retail firmware       1097    1101    1105     1101
HIS R9 290X – sample 1 firmware     1092    1092    1096     1092
290X sample 1                       1069    1061    1070     1069

VDDC (mV)                          Median   90th percentile   99th percentile   Peak
HIS R9 290X – retail firmware        1109              1141              1156   1164
HIS R9 290X – sample 1 firmware      1094              1125              1148   1164
290X sample 1                        1070              1125              1148   1172
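For reference, here’s the sort of reduction that turns a long VDDC trace into the medians, percentiles, and peaks above. It’s a sketch: the short list of values stands in for the per-second readings in a real log from one of these sessions, and nearest-rank is simply one reasonable way to compute the percentiles.

```python
# Summarizing a logged voltage trace: median, 90th/99th percentiles, and peak.
# The tiny list below is a stand-in for real per-second GPU-Z samples.
from statistics import median

def percentile(samples: list[float], pct: float) -> float:
    """Nearest-rank percentile; good enough for summarizing a long log."""
    ordered = sorted(samples)
    rank = round(pct / 100 * len(ordered)) - 1
    return ordered[max(0, min(len(ordered) - 1, rank))]

def summarize_vddc(samples_mv: list[float]) -> dict[str, float]:
    return {
        "median": median(samples_mv),
        "p90": percentile(samples_mv, 90),
        "p99": percentile(samples_mv, 99),
        "peak": max(samples_mv),
    }

print(summarize_vddc([1094.0, 1101.0, 1109.0, 1125.0, 1141.0, 1156.0, 1164.0]))
```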

Turns out the press sample video BIOS runs the HIS card’s GPU at about 10-20 mV lower than the retail firmware does. The GPU on the press sample 1 card is obviously a higher-quality piece of silicon; it runs at higher frequencies with lower average and median voltages, and it does so without instability.

What should we make of this seemingly minor voltage delta? Honestly, I don’t know. PowerTune is a dynamic algorithm, and it will supply more voltage in order to reach higher clocks if the thermal headroom allows. This 10-20 mV variance could be caused by ambient temperature differences rather than firmware changes. Still, the fact that the HIS card isn’t quite stable with the firmware from the press sample makes me wonder.

So now what?

We’ve learned a few things definitively in our testing, I think. First of all, AMD’s software fix to equalize blower speeds in the Catalyst 13.11 beta 9v2 driver release definitely improves the worst of the low-clock-speed problems that some R9 290X owners observed. The fix appears to raise fan speeds overall for 290X cards, slightly for our initial review unit and more dramatically for our HIS retail card.

Beyond that, I think we’ve collected enough data to say with confidence that our initial R9 290X review unit, sample 1, is superior to the two retail cards we tested, regardless of the driver or firmware revision. Even with the blower speed fix in place, our first review unit runs at 5-10% higher clock speeds than the retail cards, depending on the workload. That clock speed advantage translates into a 5-10% edge in frame rates, though usually toward the lower end of that range at 4K resolutions. Sample 1 appears to achieve these clock speeds at lower voltages than the retail cards, too.

Furthermore, the two retail 290X cards exhibit higher fan speeds in our peak thermal workload, MSI Kombustor. They make intermittent, abrupt forays above the 2200-RPM limit imposed by the Catalyst 13.11 beta 9v2 driver, while our two press samples stick much closer to the 2200-RPM cap.

Do any of these findings really matter to current or prospective 290X owners? The short answer is that, if you’re concerned about performance, the differences amount to 10% or so at most. You can decide what to make of that fact. I’m sure some happy 290X owners won’t really mind as they’re gleefully slicing through opponents in Battlefield 4. Good for them.

Personally, I think our results matter in a few specific ways. I’ve already mentioned that the 290X’s fairly generous card-to-card variance isn’t a good fit for the realm of high-end video cards, where performance differences of less than 10% can command a premium of $150 or more. Bragging rights aren’t cheap, folks.

More notably, our review of the Radeon R9 290X likely overstated the product’s performance—and understated its noise levels—compared to the average card shipping to consumers today. Evidently, AMD chose to include some of its very best Hawaii GPUs aboard the review samples it supplied to the press. We’re not the only publication to notice this fact. A number of other media outlets have looked at this issue and found that their review units outperform retail 290-series cards, as well. Once our findings were clear, we contacted AMD and asked them to comment on this matter; it seemed proper to give them a shot at explaining themselves. Unfortunately, we still don’t have a statement or any convincing explanation for what happened.

The 290X’s relatively broad card-to-card variance stems from the decisions AMD made when defining this product. This new version of PowerTune is the first major dynamic voltage and frequency scaling (DVFS) scheme without an advertised base clock. AMD probably should have given the 290X a guaranteed base frequency somewhere north of 727MHz and chosen a more conservative peak clock speed, as well. Doing so could have resulted in less performance variance from one card to the next. That formula might have made the 290X more difficult to produce at high volumes—and it very likely wouldn’t have allowed hand-picked 290X review units to snatch the overall performance crown away from the GeForce GTX 780 and Titan on launch day. It would have had the great virtue, however, of setting more honest expectations.

I should mention that, between the cooler RPM tweak and the firmware questions it has raised, AMD still appears to be refining its 290-series graphics cards in fundamental ways that affect their operating speeds and noise levels. That’s both good, because some fixes may be forthcoming, and a little odd, since this sort of engineering work is generally supposed to take place before a product finds its way to consumers.

We still have some practical questions to answer about the gap between our experience with the initial review sample and the current reality with retail products. Our testing here has focused on primary variables like clock frequencies and blower RPM, but we need to measure the practical effects. How does moment-to-moment clock speed variance in retail 290X cards affect individual frame rendering times? We’ve only tested FPS so far, which doesn’t tell you what happens, ahem, inside each second. We also haven’t yet measured noise levels with the new drivers. We’ll have to find these things out in future testing, which we intend to conduct with retail cards and firmware whenever possible.

Update 12/5/13: AMD has issued a statement on these matters.


Comments closed
    • Diplomacy42
    • 9 years ago

    you understand that they don’t actually get to keep that money, right?

    • beck2448
    • 9 years ago

    Rubbish. EVGA 780 ti Classified just reviewed stable at 1290 OC. Way more headroom than 290x which even with the WINDFORCE cooler only raised 40 to keep it stable.

    • tcubed
    • 9 years ago

    Like the half billion $ revolving door credit they just got?? Or the 60M in net profit they just made last quarter… fairly sure they could you know… just compute how many annual wages would fin in there…

    • dodozoid
    • 9 years ago

    I´d like to see numbers…

    • jihadjoe
    • 9 years ago

    Silicon that can run the higher frequencies at lower voltages is “best” silicon!

    • Airmantharp
    • 9 years ago

    Who’s the PR monkey now?

    • tcubed
    • 9 years ago

    Let me put it bluntly… NOBODY AT NVIDIA / AMD gives a fuck about anybody on the forums (myself included) , they just keep the buzzfire hot so they get brand awareness… And guys like him (HDOr) only make it easy…

    Let me also emphasize that if he(HODr) were a true PR person he would have never written such a lengthy and weak argument… he obviously has a beef with rory read and how he runs the company… Well my assessment (and most people on wall st.) is that if it weren’t for him AMD would have been bankrupt long ago, he has steered AMD away from what looked like inevitability back then, diversified the portfolio, lowered liability and contractual bonds and ultimately generated free cash-flow. He is a shark no doubt and far from the flower power hector ruiz type of guy… he’s a one-shot-one-kill type of guy, action oriented and determined, he’s a fighter. Without this type of attitude amd would be dust right now. We’ll see how well he fares now as he has brought the company back to flotation, will he be able to steer it towards a long strategical target and execute on it? Will he get out of debt fast enough so it doesn’t start to pile up and also slow enough to not starve the company? Well we shall see in 4-5 years now won’t we 🙂

    • tcubed
    • 9 years ago

    talk about pushing to the limit…

    Nvidia(780Ti): Advertises 875MHz – 928Mhz core clocks – and actually runs 1000+Mhz almost all the time. Besides that it runs GDDR5 @ 7GHz – it’s theoretical maximum! 384 bit interface.

    AMD(290x): Advertises up to 1ghz – actually runs around 750-1000 in quiet mode, 950-1000+ in uber mode, uses GDDR5 @ 5 GHz, 512 bit interface.

    290x on waterblock: 70C @ 1200mhz coreclock & 5.4HGz mem clock have been seen…

    My conclusion: Nvidia has a brilliant cooler, AMD has a shit cooler. Nvidias card is maxed out and you would probably be able to squeeze another 10% more performance out of it with overclock… AMD has lots of headroom(also apparent in the fact that 290 is close to 290x because of less heat dissipation problems)

    So who’s pushing it to the limit?

    • tcubed
    • 9 years ago

    If checksums match why do retail cards go within 1% variance vs press release cards in quiet mode when the bios is moved over to them (As legit reviews did for instance)? Similar results were seen by Tom’s HW review updated bios insignificant variance (I think @ Tom’s they used powercollors bios and that brought it within 2-3% of press)

    Something doesn’t ad up here, already 2 articles written way earlier then this one show very little variance with press release or updated bioses… Something is fishy to say the least… either that checksum claim is a lie or a mistake or it’s something else entirely.

    • tcubed
    • 9 years ago

    Well.. I have to agree with swaaye… changing a GPU cooler is not as easy and straight forward as a CPU one, you need a bit of skill and might also void the warranty which is not the case with cpu coolers which are fairly easy to install & remove and don’t void any warranty! On the other side CPU’s are sold bare … are video cards ever sold bare?(so you don’t pay for a crappy cooler in the first place?)

    • tcubed
    • 9 years ago

    Not having a cooling setup to handle 290/290x well…

    For instance I have 4 fans (not counting the 2 on the powersource) on my case 120->200mm, 1 front 1 back 1 side(for cpu aid), 1 bottom(for gpu aid), I also have a huge cpu cooler with 2 120mm fans on it and my old card has 3 fans… talk about airflow… and noise.. I never ever hear a thing from it – For a while I even ran with the box cpu cooler and I hardly ever heard that one(even with slight overclock active)…

    And by the way a case may have a much better airflow then “open space”: In a case you can force the airflow and it might actually result in better results, don’t forget in open space you have virtually no dissipation on the back of the card and hot air might build up around the card and in the pcb if running it for extended periods of time for benchmarks for instance. I’m fairly certain that in my case I would have better airflow – I’m just being picky and don’t want to pay for a leafblower of a fan they call a cooler… I ain’t payin’ for that shit man! But I will no doubt buy a custom/non ref card. But this is my choice as a customer.

    Do you honestly believe that anybody dumping 500$ on this card will put it into a restricted airflow situation?(when a big good atx will set you back what? 150-200$? and you can use it for ever and ever- well until you find something better) And if they do don’t they just deserve what’s coming?

    • tcubed
    • 9 years ago

    Amen to that!

    • tcubed
    • 9 years ago

    good thing we all know how wide your knowledge is:

    ——–> . <—- look closely it’s there!!! I swear!!!

    • tcubed
    • 9 years ago

    wise words! But there should be a limit to the extent of blowup you do and this goes all the way…

    I think what Intel did with it’s “marketing funds”(+ Dell, HP & IBM) and “cripple AMD” (Intel and then Cinebench to showcase it…) were much worse – still nobody got this inflamed about it… even to this day I have yet to see an article bringing down the hammer on Intel for that… they got sued they lost they payed – nobody cares… AMD screws up a cooling solution & the low power setup for the flagship gpu and everybody goes berserk.

    • Kaleid
    • 9 years ago

    “So I ended up buying a Thermalright TRAD2 GTX which costs more but blows the S1 out of the water.”
    Oh I doubt that. Perhaps the cooler didn’t have proper contact, or perhaps you ignored the VRM?
    Thermalright makes their coolers too small and the fins way too close to each other, which makes it necessary to have noisy high RPM fans.

    With a 700rpm Scythe 120mm fan I don’t get over 45C in any game that I have tested so far. And that’s the highest RPM fan in my entire system.

    “Anyway – you pay for that stock cooler when you buy a card. I would rather not pay for garbage and then add another $30-50 to get the job done properly.”
    You get crap coolers with CPUs too, so why not simply change them both?

    • tcubed
    • 9 years ago

    WOOT – hell no! Do you want to expose the *HUGE* 1% variance of uber mode are you insane?? WTF dude… they need to have a scandal to get the name plastered all over the internet and if they get financing from nvidia … what the hell nobody’s going to bother about that.. cause cards are expensive… you make me laugh my ass off one placement on this site makes 4 cards worth of money in one day(provided non idiotic placement pricing – haven’t checked… maybe I should…)

    They need to sell advertising space, with every user that comes here they get money so it’s only natural they wouldn’t just go like

    “AMDs flagship GPU behaves erratically in power saving mode” – nobody would give a flying fuck about such a title…

    but

    “Are retail Radeon R9 290X cards slower than press samples?” – WOOT! Scammers! BURN in HELL! You rigged the benchmarks etc…

    Cudos to the guys making headlines you would really thrive in tabloids!!

    Calling it variance and hiding the fact it’s only in quiet mode is the way to do it… then implying golden chips… perfect receipt for a shitstorm like this one… and techreport stands only to gain from anybody coming on this site.

    On the other side AMD really asked for it with this kind of PR this was bound to happen. They knew they had a bad cooler why overhype it?? I would have gone for a less dramatic approach and let the cards shine for themselves – They just asked for it! AMD get a hold of that runaway PR of yours!!

    • tcubed
    • 9 years ago

    “THE place for investigative journalism” – LOL – Tom’s article on this is 10x more indepth and 2 weeks earlier, and they contacted AMD repeatedly to figure it our… they have put lots of effort into it TR just made a write-up with a scandal theme then got cards with NVIDIA money and did not do uber mode tests just quiet mode tests and called it a day… you’re funny!

    • curtisb
    • 9 years ago

    The last sentence in your quote says it, though:

    [quote<]I'd get spun up about the potential disparity, but [i<]I don't think the 780's rasterization rates are likely to limit its gaming performance any time soon[/i<].[/quote<]

    • tcubed
    • 9 years ago

    tell me killmax in a “Monkey see monkey do” scenario are you the monkey and HDOr the brain?

    • dodozoid
    • 9 years ago

    From TRs review of GTX 780:
    “Interestingly enough, that change means different things in different cases. Some GTX 780 cards will have all three SMX units in a single GPC disabled, so the entire GPC goes dark. In that case, the card will have four raster engines, so its peak rasterization rate will be four triangles per clock. Other 780 cards may have their disabled SMX units spread around, so all five GPCs and raster engines remain active. Which configuration you get is presumably the luck of the draw. I’d get spun up about the potential disparity, but I don’t think the 780’s rasterization rates are likely to limit its gaming performance any time soon.”

    Its in higher segment too (not long ago it was good 100 bucks more than 290X)
    Thats why I asked what impact does it realy have and which of the two was reviewed

    • tcubed
    • 9 years ago

    oh killmax… Like you would know what “facts” and “good logic” actually mean… you never ever make any sense, your logic is mostly broken and your facts & info on the industry is close to non-existent yet you comment like a rabid fan that you are…

    • tcubed
    • 9 years ago

    well said… your only problem is that you have this kind of insight only into amd… Let’s make a list of nice corporations that did far worse then AMD:

    Intel, HP, Dell, IBM…

    Neither Nvidia nor AMD have ever impacted the it world quite as much as foul play by these guys… The’re saints both of them considering these guys…

    Yet… somehow you only talk about AMD as the villain and NVIDIA as the saint… talk about bias…

    now about NVIDIA:
    Benchmarks fixing, video quality corner cutting, chips falling off of the board, 104C is normal, wooden card flashed as real thing, developed PhysX only to then drop hardware acceleration for it and not saying so… getting multimonitor support 2 years after AMD etc…

    Now some of “these things” were done by AMD too – the fact of the matter is: The tech report is 2 weeks after Tom’s and Legit Reviews reviews even though they started this, got cards bought with nvidia money…

    One would assume that TR would be the fastest with a response to the variances… but no it was Tom’s who had assumed a problem in benchmarking from the very beginning and have pursued the problem as long as they could…

    Now to the AMD screwed up with:
    – Battle field 4 PR announcement
    – Reference cooler
    – Variance PR announcement

    They mostly screwed up on the PR & Marketing part not on technical stuff…

    Plus: DUDES! These guys whine about POWER SAVING mode ona FLAGSHIP PGU??? Don’t you find this the least bit amusing??

    reporter: Hello, what will you use your 500$ GPU for?
    HDOr: Well, I will put it in powersaving mode so I don’t have to pay 1$/month more on electricity.
    reporter: And what games will you run on this GPU?
    HDOr: Lots of brand new GPU bound games!
    reporter: Can you give us some titles?
    HDOr: Just triple A title, very popular ones that require lots of compute power!
    reporter: Like?
    HDOr: Anything that runs on NVIDIA and not on AMD of corse cause they have incredibly bad hardware.
    reporter: So what games are you going to run?
    HDOr: well certainly not Battlefield 4 – as it comes bundled with AMD and not NVIDIA cards!
    reporter: WHAT GAMES?!?
    HDOr: Well you know…
    reporter: No I don’t!!! What are they??
    HDOr: Well…minesweeper… solitair…
    reporter: (facepalm)
    rest-o-world: (facepalm)
    HDOr: What?

    ..
    HDOr: What??
    everybody: (facepalm)

    Now my post was at least funny – Nvidia fandiot!

    • curtisb
    • 9 years ago

    Different market segments entirely. If you’re purchasing a GTX 760, you’re not exactly looking for bragging rights when it comes to performance. There’s a roughly $300USD different in price from a GTX 760 to an R9 290X…more in some cases.

    Also, the issue here isn’t necessarily the variance in card to card. It’s that the review units sent to the press consistently perform higher than what is available for purchase. At this point it’s not clear if this is due to cherry picking, a firmware difference, a blower difference (quality, manufacturer, etc.), or any other number of things.

    • alienstorexxx
    • 9 years ago

    no one cares about bitching on nvidia 😉

    • alienstorexxx
    • 9 years ago

    you’re one of those who always could overclock it’s gpu the same as press samples?
    i see this new power tune as an oc algorithm more than a stock frequence algorithm. you migh probable see two HIS gpus perform different.

    • alienstorexxx
    • 9 years ago

    c’mon hawaii chip isn’t part of the cooler design, stop doing quick responses and try to be reasonable.

    • alienstorexxx
    • 9 years ago

    you can argue if you don’t like the card to be that hot, you can’t argue that this level of temperature isn’t going to make your gpu last less because of it.

    • alienstorexxx
    • 9 years ago

    the best part is that no one gets paid, it’s just people expressing their opinions.

    everyone make it’s posts based on their experience, sometimes more, sometimes less, what they decide to belive about all they’ve learned and that’s reflected on posts you can read here.
    at the end, it’s just opinions mostly biased according what they like.

    journalist are the ones who have the more responsability because they are the ones getting paid for what they say, it’s their job, demand them to be serious when they post, demand them to be fair when they analyze graphs, when they talk about the product you’re going to buy, when they write their final conclusions, not these fanboys.

    • swaaye
    • 9 years ago

    Well I don’t read Anandtech as much as I used to.

    • BIF
    • 9 years ago

    Eyefinity with six monitors or bust.

    • BIF
    • 9 years ago

    Yes, I believe there is an overclock setting.

    • BIF
    • 9 years ago

    How can you guys have not noticed it; wasn’t Anandtech who was among the first to agree to AMD’s press blackout dates? When was this, a year ago? 18 months?

    Every site who accepted that limitation should have an asterisk next to them.

    • BIF
    • 9 years ago

    I’m just going to start -1’ing anybody who writes a post is longer than my 24″ screen is high. Don’t worry, it’s in portrait mode. But even so, in this case it is…

    -1

    Cheers! 😉

    • BIF
    • 9 years ago

    LOL, you could have just picked TWO to start and offered more if anybody was interested. But now that we got the history of the world in burst-mode, we have nothing left to talk about, nor do we have the energy reserves to do so anyway…

    I need a nap! 🙂

    • BIF
    • 9 years ago

    WTF? Next time, please consider putting TLDR tags at the beginning of your rant. I think my ISP is going to tell me I went over some monthly limit just because I loaded this page…

    • dodozoid
    • 9 years ago

    Talking of variability, have anyone explored this?

    From TRs review of GTX 760:
    “The way Nvidia disables the SMXes also means different GTX 760 cards will have different tessellation capabilities. Remember, the GK104 chip’s eight SMX units are paired up inside four GPCs, or graphics processing clusters, and each GPC has a raster engine that can rasterize one triangle per clock. To make a GTX 760, Nvidia can either disable one entire GPC or turn off two SMXes in two separate GPCs. In the former configuration, one of the raster engines goes dark, and the card rasterizes three triangles per clock. In the latter, all raster engines survive the culling, and the raster rate goes up to four per clock.”

    What is the difference in performance between those two options and which one was reviewed?

    • Bensam123
    • 9 years ago

    I don’t know… Thermaltake, Noctua, Phantik, and Zalman seemed to make quite a living off of that… There are more every year. I would like to point out there is already a 3rd party heatsink available on the market…

    But this is just amounting to ‘no ur wrong’. A month will tell.

    • Bensam123
    • 9 years ago

    We don’t know a game doesn’t hit Nvidias threshold till it’s tested in the same way the 290 series was tested, which is highly focused on heat generation and dissipation. Up til this point people haven’t really went out of their way to push or test them in such a way (at least as far as I know of).

    It also is not AMDs chip that doesn’t have thermal headroom, but the heatsink… Once again arguably it still does, they just have to turn up the fan.

    • Klimax
    • 9 years ago

    Kepler so far has quite thermal reserve, no game hits it yet. On the other hand 290x is already hitting thermal limits in Far Cry 3, so we have already indication how it will look in future.

    Which BTW might even cause failure for Mantle regardless of anything else, because it’ll try to push cards even harder…

    • Klimax
    • 9 years ago

    Ad Hominem at its best. And no evidence about his supposed status either. Arguing against your favourite company, doesn’t make one shill, especially when he has it grounded in facts and good logic.

    • swaaye
    • 9 years ago

    I hadn’t noticed it either. Wow….

    • Ratfish
    • 9 years ago

    Oh wow, never noticed that. Yikes.

    • Airmantharp
    • 9 years ago

    No.

    You cannot equate the tiny market for replacement coolers with the availability of retail options shipping with third-party coolers. ASUS, MSI, Sapphire, et. al. have coolers ready to go for the R9 290 series. If they were allowed to sell them, they would be!

    • Bensam123
    • 9 years ago

    No it’s not a good case, but it’s a representative worst case scenario which is good for figuring out what cards do when they’re pushed to extremes.

    Yup, it may indeed get worse for AMD… it may also get worse for Nvidia, that’s why a worst case scenario is a good idea.

    • Bensam123
    • 9 years ago

    There being only one third party cooler right now is a good example of how they haven’t had enough time to design a new one yet. There will most definitely be more then one company designing a cooler if this is this big of a deal. That’s money to be made right there. Heck AMD is probably in the works of doing so themselves as well considering how this has went.

    Pretending AMD has some sort of secret agreement to keep it’s boardmakers from changing the cooler is folly.

    I will once again iterate, you won’t lose your warranty unless you forget to ship the thing back without the stock cooler attached to it. And if that’s the case, I would assume you shouldn’t be changing out the heatsink in the first place and should just bump the fan speed by like 3% which would fix all these issues.

    • ptsant
    • 9 years ago

    I appreciate your honesty in mentioning the source of your “inspiration” in this article and also in the previous article on 4k/eyefinity (https://techreport.com/blog/25399/here-why-the-crossfire-eyefinity-4k-story-matters). However, I am not sure it is ethical to let the competition, ie Nvidia, define the agenda. In the end, this is the definition of marketing. Making a car go fast is engineering. Making speed an important quality in the eyes of the consumer is marketing. You see the problem here.

    Since I have been reading techreport for quite a long time, I would also like to remind you a similar situation when “AMD attempted to shape review content” (https://techreport.com/blog/23638/amd-attempts-to-shape-review-content-with-staged-release-of-info). Maybe you are applying double standards?

    In the end, even though I recognize many of the qualities that make me visit regularly this site for tech news, I really wonder whether coverage of AMD products is truly objective.

    • Airmantharp
    • 9 years ago

    While this is the way it goes with most releases, AMD’s partners have had plenty of time to build custom coolers for these cards. It’s not like the variables haven’t been known for months!

    And don’t get your panties in a knot when I point out that some random “$59 GPU Cooler” isn’t a fair point of comparison, at least unless you’ve put that particular cooler on one of your own cards. There’s always hidden costs, which include the highly probable loss of warranty which is difficult to define a value for.

    • Klimax
    • 9 years ago

    I don’t think Furmark would work too well for real world test, but games like Far Cry 3, which are already hitting thermal limits on 290 are better case.

    BTW: 4-6 are missing now, but it’ll likely get much worse. See Far Cry 3.

    • Bensam123
    • 9 years ago

    Because the cards are fresh out of the shop and they haven’t had enough time to design custom coolers yet? This is the way it goes with pretty much all releases.

    Any solution is a solution regardless of your moral catch all you’re slapping on it. It can’t be recommended just like you can’t recommend cutting the tag off your pillow which you find super annoying because it is against the law. If you don’t recommend people change their coolers on their GPU you shouldn’t recommend people use a different cooler on their processor. Taking that a bit further, they really shouldn’t even build their own computer as building one will expose anyone to having to install a heatsink in the first place.

    You’re really grasping at straws if you’re now summing up a cooler with ‘necessary tools’ which amounts to a couple screw drivers and thermal paste (apparently you can’t use thermal paste you get with it). Oh and shipping and handling… can’t forget those hidden costs!

    • Bensam123
    • 9 years ago

    The only reason people found out about the variability ‘problems’ is because AMD messed up the fans and used PWM instead of RPMs to equalize fan speeds, which caused a noticeable performance and GPU speed drop. No one would notice 4-6fps missing.

    Sure, there is a guaranteed minimum no one has bothered testing these things though. For instance pushing the 780ti through furmark and record what it does compared to a retail card. In the case of Nvidia, that may just result in a louder fan, which is traditionally what GPUs have done in the fast if they’re heating up too much… they don’t reduce GPU speed, they increase the fan speed. The only time they drop GPU speed is when there is no way the fan can keep up.

    • Bensam123
    • 9 years ago

    Aye… I would say this is the BIGGEST variable out of all of them. This is all about noise. They can fix the performance variability simply by turning the fad speed up on the cooler, but people are just waiting for them to do that so they can cry ‘foul!’ on that as well too. Even though performance will be completely static across the board, it’ll then be about the noise being too loud compared to press samples (regardless of how much).

    I’ve mentioned it before, there is 60% additional speed on these coolers. ‘Uber’ adds 10% on and it definitely makes a performance difference. It’d take a lot less then that to fix this little variability. Ideally they should’ve made the fans variable in the first place instead of the performance. That was probably a bad call on their part. As performance is a big deal (especially for some people as Scott put it), but noise… not so much.

    It is still quite weird that decibel results weren’t included in this or the difference in sound that would result from equalizing performance of the HIS card with card 1, even though time was taken to reflash the firmware.

    • Bensam123
    • 9 years ago

    Dude, there is no way they know you put a third party cooler on your card unless you get rid of the old cooler or forget the little memory module TIMs.

    Although I’m sure you’ll cry foul as this is a crime. It’s also a crime to tear the tag off your pillow. The only reason they state something like that voids your warranty is to stop people from doing it who don’t know what they’re doing and invariably botch it up, like forgetting to put thermal paste on it.

    Or wait for a different version of the card which has better cooler, which I already mentioned two paragraphs above buying your own cooler with $150.

    • jihadjoe
    • 9 years ago

    But if AMD releases a new driver, or company X releases a new game/benchmark you can’t do a comparable re-test anymore because you’ve already altered the card.

    • Jigar
    • 9 years ago

    Never mind, i am out of coffee to argue paid shills.

    • Jigar
    • 9 years ago

    WOW, loads of them.

    • Klimax
    • 9 years ago

    AMD Portal.

    • Klimax
    • 9 years ago

    So you don’t have anything to counter his argument so you toss out Ad Hominem? You shouldn’t have any upvotes, because your post is logical fallacy and not contributing anything to discussion.

    • Zizy
    • 9 years ago

    But luck of the draw happens to stock operation of every single electronic part. The only difference is that in the past you only got laptop that ran a little bit longer or shorter because of slightly different power consumption, now it is also a bit faster or slower. This speed thing is an issue of every single dynamic power stuff. Should we abolish automatic OC because of that? Hell no.
    Just make suitable bins – difference between top and bottom part should be small enough. I do not mean limiting boost here, frequency could go from 100MHz base to 5GHz boost… just every single part should behave predictably. 5% variability is ok enough, hard to make it lower without artificially restricting better parts or creating tons of bins. 10% is already too high. 15%+ is absolutely unacceptable.

    There are 2 real issues:
    Misleading marketing. Card is being marketed as up to 1GHz or something, while it should be 700-ish MHz with boost up to 1GHz, like CPUs are.
    Obviously, it should be also benched when operating under stock clocks, as this is the performance guarantee – there should be a switch to disable this and every single review should have both default (with boost) and baseline (without) data.
    <EDIT: Interpretation of results might get even more problematic than now. Should one emphasise performance under default conditions in a decently ventilated case and note that your own card could be a few percent faster or slower than this number? Or should one state performance under base conditions and note that unless you have case completely without ventilation your card will be (significantly) faster. />

    And there is a potential third issue, which hasnt been confirmed yet:
    AMD cheated and knowingly sent faster samples than average.
    This one is really really problematic if true.

    • aggies11
    • 9 years ago

    That makes sense. I can see why you’d have that opinion.

    But I think you can go into the settings and manually crank the fan speed to 100%, if you wanted. And it still wouldn’t void any warranties.

    So from that standpoint, I’d say that any fan speed is largely arbitrary, and one are “max performance”. But there are two that AMD definitely ships “as is” out of the box. But the fact that they give us the choice at all suggests that they known people might not like the acoustics of “Uber” and so provide a (marketing only) labelled “quiet”.

    But based on how powertune works, I’d imagine that even at the Uber fan mode, the lower volted press sample should always show a clockspeed advantage? I’m not a thermal engineer so I won’t hurt my brain too much thinking about that one.. 😉

    But I 100% agree with you that, considering that they shipped a part that is very much performance limited by cooling, with an inadequate cooler, is kinda silly. It likely would be the easiest thing they could do to increase performance, and probably not be expensive.

    They must really be pinching pennies on this one I guess.

    • Disco
    • 9 years ago

    You weren’t rambling at all. I guess my only point is that by testing in ‘uber’ mode, we would not be selecting an arbitrary fan speed or performance level. We would be running the cards at a sustained ‘max’ performance level that AMD would consider within acceptable (warrantee still valid) thermal limits. If the retail cards were significantly lower performing when this state was enabled, then I would really think something fishy is going on. But if all cards (retail or press release versions) can all achieve the same performance without voiding the warrantee, then there is no issue. But that wasn’t tested, so we can’t really come to any solid conclusions (in my opinion). That’s my basic issue.

    And obviously, the stock cooler is a piece of crap. I find it amazing that the NVidia cooler is actually running at a higher fan speed, but so much quieter. The AMD card is pretty amazing (price vs performance) so you have to hand it to their gpu engineers. But it’s unbelievable that they slapped such a noisy cooling system on their ‘crown jewel’ product.

    • madgun
    • 9 years ago

    Surprised to see the kind of responses from AMD shills. There is no point in attacking Techreport. It’s THE place for investigative journalism.

    No one is stopping you guys from buying the 290X so please get a life! It’s not as if AMD is putting money in your back pockets..

    • swaaye
    • 9 years ago

    I put an S1 on my 6970 and Furmark would put it into a thermal runaway condition even with 2x 80mm fans. So I ended up buying a Thermalright TRAD2 GTX which costs more but blows the S1 out of the water. Hawaii is an even higher power chip.

    Anyway – you pay for that stock cooler when you buy a card. I would rather not pay for garbage and then add another $30-50 to get the job done properly.

    • Shobai
    • 9 years ago

    Check DaveBaumann’s response in the other thread; it seems that the review and retail BIOS checksums match. That would seem to rule out firmware variation.

    • Shobai
    • 9 years ago

    From the AMD response comments

    [quote=”DaveBaumann”<]FYI - we asked for the firmware flash/test to be done. At present we have not been able to replicate the results seen, and the retail BIOS should be the same as the press BIOS (checksums we have done from others indicate this to be the case) but its part of the investigation. The press BIOS is posted in TPU's BIOS database so anyone can try it.[/quote<]

    [edit: need to learn the formatting tips...]

    I'm guessing he's referring to these? [Is it legal for me to link there?]

    [url=http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=ATI&model=R9+290X&interface=&memType=&memSize=<]TechPowerUp VGA BIOS Database[/url<]

    • aggies11
    • 9 years ago

    Sorry if I rambled a bit.

    If your main point is asking why they are being tested in a throttled state, the answer I was trying to convey is that, technically, even Uber mode is still throttled. The nature of AMD’s PowerTune setup is that any additional thermal headroom will translate into a clock speed increase (up until the maximum, which doesn’t seem to be reached). So any increase in fan speed (i.e. quiet vs. Uber) should lead to an increase in performance.

    So my idea was that any fan speed (other than 100%) really is an arbitrary line in the sand to use. So using quiet is fair as that’s where Scott decided to draw his line in the sand. And presumably for noise reasons.

    If your idea is that no one uses the 290X in quiet mode, and everyone is using Uber, then I can certainly understand that reasoning. But I’m not sure we can immediately assume that. And further, whether it would have really closed the delta any, considering that it’s just a % fan increase.

    • aggies11
    • 9 years ago

    My personal take on it is: It’s not so much that the press sample was cherry-picked to be the “best” silicon.

    But rather that the initial press sample is running at lower VOLTAGES. Since voltages are set in the firmware, the press samples have different firmware than retail. That is the key difference.

    Why is it a difference? Well, as Scott points out, voltages impact heat. Since the performance of the 290X is capped by a thermal limit, a higher-volted card is going to have a lower ceiling (than if it were lower-volted) as it runs hotter, clock for clock.

    Now, initial press samples having different firmware isn’t out of the ordinary, but it usually doesn’t matter too much. However, due to the specifics of AMD’s PowerTune setup (where they are essentially automatically overclocking each GPU to its limit), these small changes in voltage result in a significant real-world performance impact.

    Why did they have to change the voltage? If we are being charitable, the initial press samples were examined, and a “safe” minimum voltage was determined to get them to run stable. That same voltage, when tested on the large run of chips for retail, couldn’t guarantee stability across the breadth of chips AMD wanted to bin for retail. So they had to up the voltage (to cover the chips that required the highest voltage to be stable). If we are being uncharitable, then we can say the review samples had custom firmware allowing them to reach a higher ceiling than retail. I’d personally split the difference and say that the yields for the silicon weren’t as good as they wanted, and so they had to loosen their tolerances in order to meet the supply they wanted. The lower quality required an adjustment of the firmware.

    As for the answer to the question, the second sample seems to have come later, presumably meaning later production, i.e. closer to retail voltage. So closer to retail performance.

    Having sample 1 run the retail firmware (and raise its voltages) would be the best indicator, as that should bring it in line with retail performance.

    • the
    • 9 years ago

    The curious part is the second press sample as it is behaving much like a retail card. When was the second card requested and received?

    • Ratfish
    • 9 years ago

    For the unaware, what do you mean by the bit about Anandtech?

    • Kaleid
    • 9 years ago

    I change the cooler on all cards I own, and most of the time the king is Arctic Cooling S1 plus a 120/140mm fan.

    • Airmantharp
    • 9 years ago

    So, you believe that reviewers shouldn’t investigate issues like these? That ignoring said issues would be more impartial?

    • Trickyday
    • 9 years ago

    Seems absolutely crazy. My neighbour has an identical car to me. His car tops out at 98. My car only gets to 94 (maybe it was a warmer day).

    We are not completely stupid, and a point has been made. This sort of persistent digging reflects the darker side of impartial reporting. Most readers should apply more than a pinch of salt when reading any review site, especially those with a conviction.

    • swaaye
    • 9 years ago

    Yeah I will never buy one of these cards with the stock cooler and I am amazed anyone else has (other than the water people).

    • Kaleid
    • 9 years ago

    All of this nonsense could be avoided if they just bothered to make a better cooler.

    • shaq_mobile
    • 9 years ago

    take your science elsewhere. we don’t need your facts when we have our opinions.

    • shank15217
    • 9 years ago

    It’s not a made-up metric; performance per die size is a useful metric for GPUs fabbed on the same process, which these are. TR constantly references it all over the place in its GPU write-ups, but hey, why read the reviews, it’s more fun to shill the forums.

    • NeelyCam
    • 9 years ago

    [quote<] I guess that's why 95 degrees all the time makes sense to them.[/quote<]

    95 degrees all the time [i<]does[/i<] make sense, if the components can handle that. If you don't agree, I can try to explain it once more, but I've tried that before and people still seem to fail to understand the underlying reasons.

    • Disco
    • 9 years ago

    I understand what you are saying, but it still doesn’t really address my main point.

    Why did an article get written about variation among cards that are working in a ‘throttled’ state? Of course there will be variation due to all the factors that have been discussed… chip quality, voltage, fan speed etc… the main consideration is that PERFORMANCE is not the #1 variable when it is running in this condition. So why is the performance being used for comparison?

    I have no idea if this is the case, but what if the HIS card (with the lowest relative performance) happens to be the quietest? Then one could argue that the ‘quiet’ mode is working really well. Because performance is not the #1 criterion.

    If we are going to contrast the performance of different cards, then they should be in PERFORMANCE/UBER/SUPERMAX (whatever you want to call it) mode. And that level has been set by AMD at whatever speed they chose, so that should be the level the card is ‘guaranteed’ for, and that is why we would test at that level (not whatever arbitrary level we want).

    • puppetworx
    • 9 years ago

    Yes if they were smart, however if they then had to come back and retest the card…

    • Damage
    • 9 years ago

    AMD’s statement on 290-series variability and press vs. retail performance:

    [url<]https://techreport.com/news/25733/amd-issues-statement-on-r9-290x-speed-variability-press-samples[/url<]

    • derFunkenstein
    • 9 years ago

    Touche.

    • aggies11
    • 9 years ago

    Very good point. People have complained about AMD’s powertune strategy for a while now, and this is the very obvious drawback.

    But I don’t think we need to jump to conclusions of cheating. From Scott’s testing it seems pretty straightforward.

    Every AMD card with PowerTune gets a “free” overclock. A very smart overclock that will push the card to its operating limits given the current temperature/environment, but not dangerously so. Put the card in a case with bad airflow and it will never overheat; it’ll just give you the best performance possible in that setup. From an engineering standpoint, it’s a nice idea: no upper or lower bounds, just the perfect balance of performance/temperature at any given point. From a marketing/sales point of view it’s tricky, though. There is no maximum you can advertise and no minimum you can guarantee.

    That’s nice and all, but just as with any overclocking there is variability. PowerTune tries to be smart about it, equalizing for fan speed and thermal throttling. But as anyone familiar with overclocking knows, PowerTune does not account for voltage. (What? Of course it does…) Not in the way that a typical overclocker would. You want to find the lowest voltage that keeps you STABLE at that clock speed. (More voltage = higher temps, which limits the ceiling of your overclocking efforts.) Enthusiast OCers spend lots of time testing their systems for stability.

    PowerTune unfortunately can’t do this. It can’t lower the voltage and test for what’s “STABLE”. Stable voltage settings are baked into the firmware. They are the same for all cards. That is the Achilles’ heel in this setup. The voltage settings must be set the same for all cards, which means the potential of every card must be limited by AMD to the “lowest common denominator” of all the cards they are selling. They have to test their chips and pick values that will be stable across the board.

    The difference between review units and retail is a result of changed voltage values between the review and retail hardware. Looking at Scott’s experiments with the HIS card not being stable, it seems these voltage increases were necessary.

    Why? Either because the review units were from an earlier batch of chips, and when the newer batches came back those voltage values wouldn’t work across all chips. Or because they needed to get more aggressive in their binning tolerances to get the quantity of chips they wanted.

    Doesn’t make it a good idea, but not necessarily cheating.

    TLDR – this doesn’t have to be due to cheating on AMD’s part, just the downsides of their PowerTune strategy. PowerTune basically overclocks cards dynamically, on the fly. But to overclock well you really need to be sensitive to each individual chip’s voltage requirements. It can’t do this; voltages have to be set across the board for all boards, in firmware. The early review samples had lower voltages than the retail cards (possibly due to the smaller sample size in their testing – issues that weren’t evident until more chips were produced/tested – or aggressive binning needed to produce enough saleable chips). But this small change has the potential to lower the performance ceiling across all the cards.

    So it doesn’t seem like much of a mystery; it’s rather straightforward. And while it probably wasn’t on purpose (i.e. planned cheating), it still sucks, as it means review samples aren’t indicative of retail performance. But that is a big downside to the “gamble” AMD is taking with their PowerTune strategy. It relies on overclocking to extract maximum performance, and OCing is incredibly variable/sensitive.

    • SuperSparky
    • 9 years ago

    Might as well give up the denial. Fanatics will never recognize the truth when it is presented to them. You see, the truth can fall out of the sky, land on their face, and start to wiggle, and it makes no difference. In their eyes, their idol can never do any wrong, and there must be some conspiracy.

    There is also the possibility of the “truth” being perceived more than it really is, and thus being turned into a conspiracy itself.

    Corporations have many owners, stockholders, and typically most of those are investments from pensions, 401Ks, retirement accounts, etc. In other words, ordinary people. However, that doesn’t leave out the heavy hitters with large ownership interests that can effectively breathe down the necks of panicking executives, which in turn breathe down the necks of their managers, which in turn breathe down the necks of their employees. The bottom line is, when a company is faltering, the execs have to answer to the stockholders, and many, like kids in trouble, try to hide it or blame it on someone else instead of owning up to the mistake and fixing it honestly.

    Hand-picking silicon and using top-end parts to create “review” units was probably something they figured wouldn’t be noticed (and maybe not a factor they considered negative for reviews), and their thinking was likely to “make sure nothing went wrong with the reviews”, at least initially. Although I do not dismiss malice either. I can imagine a scenario of an executive pressuring a manager, pressuring a team, to put together units of top quality to send to reviewers, because of the company’s precarious standing. All are thinking that if they screw this up, there go their jobs. So frankly, if I were an engineer, I would put together the best, top-quality components to send to those reviewing the latest and greatest product of the company.

    Just food for thought.

    • clone
    • 9 years ago

    O’Reilly loves to claim balance; it doesn’t work for him either.

    • clone
    • 9 years ago

    life teaches why, no need to explain the obvious.

    • aggies11
    • 9 years ago

    It’s a bit more involved than that. As long as the card isn’t hot enough, the clock speed will keep increasing. Once the card is hot enough, the fan speed will increase up to its limit. And the process repeats until both the fan speed limit and the temperature target are reached.

    The reason the “quiet” vs. “Uber” mode distinction isn’t as relevant is that all cards have the ability to increase fan speed. Uber isn’t actually max fan speed either; there is still 50% more fan speed to be had. If you want to go to 100% you can, with really any card. But all cards try to set reasonable fan speed limits for noise reasons.

    Put another way: you could say, why stop at Uber, why not test at 60% fan speed? If you are interested in the best performance, fan noise should never be an issue. And keep going from there…

    So we have to draw a line somewhere. The fan speed limits that the cards ship at seem reasonable enough, and also seem to be at the limits of what most people consider acceptable for fan noise.

    The names for the fan speed settings are just marketing. Quiet or Uber, the name doesn’t matter. It’s a question of fan speed and noise, and where you set the limit for what most people consider acceptable.
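
    To put that feedback loop in concrete terms, here is a minimal toy sketch of a PowerTune-style controller. It is not AMD’s actual algorithm; the temperature target, fan cap, clock range, and the crude thermal model are all made-up assumptions purely for illustration:

    TEMP_TARGET = 95          # degrees C throttle point (the figure discussed in the article)
    FAN_CAP = 55              # percent; hypothetical "quiet mode" fan limit
    CLOCK_MIN, CLOCK_MAX = 700, 1000   # MHz

    def steady_state(ambient_c, volts, fan_cap=FAN_CAP):
        """Iterate a crude thermal model until clock, fan, and temperature settle."""
        clock, fan = CLOCK_MIN, 30
        temp = ambient_c
        for _ in range(2000):
            # More clock and more voltage mean more heat; more fan removes more heat.
            temp = ambient_c + clock * volts * volts * 0.09 - fan * 0.5
            if temp < TEMP_TARGET and clock < CLOCK_MAX:
                clock += 5                 # headroom left: raise the clock
            elif temp > TEMP_TARGET and fan < fan_cap:
                fan += 1                   # too hot: spin the fan up first
            elif temp > TEMP_TARGET:
                clock -= 5                 # fan already capped: throttle the clock
            else:
                break                      # settled against the limits
        return clock, fan, round(temp, 1)

    print(steady_state(ambient_c=22, volts=1.15))   # cooler room / lower voltage -> higher clock
    print(steady_state(ambient_c=28, volts=1.20))   # warmer room / higher voltage -> lower clock

    Fed a slightly higher voltage or a warmer room, the same loop settles at a noticeably lower clock, which is the quiet-mode variance story in miniature.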

    • aggies11
    • 9 years ago

    I think a more interesting point to focus on is the difference in voltages between the firmware revisions.

    The way I’ve always looked at this is that PowerTune means AMD has “automatic overclocking” built in. That’s fine and nice. But unlike typical “factory overclocked” boards that have a guaranteed (i.e. tested) frequency, this is more like typical enthusiast DIY overclocking: different chips are going to produce different results.

    AMD takes a gamble and makes the “advertised” (read: expected) performance the variable OC performance. A dangerous gamble if there is much variation in chip production. Anyone familiar with overclocking knows a lot about the luck of the draw in terms of how much headroom is available for OC. Not everyone’s results are equal, and sometimes there can be a big difference.

    Specifically, PowerTune tries to compensate for this by pushing each chip as hard as it can up to the set thermal limit and fan speed. Those are dynamic things it can adapt to from card to card.

    What it can’t adapt to is the voltage required for a given clock speed to be “stable”. There is no way to measure that (unlike fan speed RPMs or temperature). If you OC, you know that different chips can require differing amounts of voltage to hit the same clocks. So AMD has to set these values in firmware such that they work for ALL CHIPS, even if some are worse/better.

    Depending on their binning tolerances, they most likely had to set the retail voltages higher than the press samples’. Or rather, after testing large batches of chips, they had to up the voltages to guarantee all boards would be stable. But as written in the article, higher voltages mean higher temps, which means boards hit their throttling points sooner.

    Regardless of temperatures or fan speeds, these voltages are hard-locked in the firmware, and they are likely what’s responsible for the variation we are seeing. How much this means to you as an enthusiast is a personal choice, but as a business strategy it’s a tricky marketing move.
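
    As a rough back-of-the-envelope illustration of that last point (an assumed first-order model, not AMD’s actual numbers): dynamic power scales roughly with V² × f, so under a fixed thermal/power budget a small firmware voltage bump costs a disproportionate amount of sustained clock.

    BUDGET_W = 250                      # hypothetical board power the cooler can dissipate
    K = BUDGET_W / (1.15 ** 2 * 1000)   # fit the model so 1.15 V sustains 1000 MHz at the budget

    def sustained_mhz(volts, budget_w=BUDGET_W):
        """Frequency the budget allows at a given voltage, from P = K * V^2 * f."""
        return budget_w / (K * volts ** 2)

    for v in (1.15, 1.18, 1.20, 1.25):
        print(f"{v:.2f} V -> ~{sustained_mhz(v):.0f} MHz sustained")

    # In this toy model, a ~4% voltage bump (1.15 V -> 1.20 V) costs roughly 8% of
    # sustained clock -- about the size of the press-vs-retail gaps being discussed.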

    • Disco
    • 9 years ago

    I agree. This is a very important question that needs to be answered. I don’t own a 290X, so I can’t say I completely understand how boost works. But I think it means that it just runs at the 1GHz (or whatever it’s supposed to be) and the limits on the fan speed are removed.

    Anyway, at the most basic level, it’s the boost speeds/performance that need to be cross-checked between different versions of the card. Not some ‘your mileage may vary’ kind of ‘quiet’ mode where high performance is not the most important variable.

    It’s like complaining that your notebook isn’t running at full speed when you have the power settings on ‘balanced’ to conserve battery life, and because of some firmware differences, the performance under the balanced settings has a ±5% variation.

    • maxxcool
    • 9 years ago

    someone likes DDD 😉

    • Lans
    • 9 years ago

    That and posting data for multiple runs with recorded ambient lab temperature would be very helpful/enlightening.

    Right now, I am left wondering how much temperature swing there was over the course of >= 10 hours, given that some days there is a huge difference between the high and low temperatures. Maybe it doesn’t matter, but there is nothing presented that says one way or the other.

    • HisDivineOrder
    • 9 years ago

    [quote<]Now toss these in a case with poor airflow, and a room at 25C and they could end up running at 700 Mhz.[/quote<]

    That's why AMD's PowerTune technology is so clever. Most reviewers test their cards on open testbeds, not closed systems with realistic layouts. Open testbeds may favor tri-fan cooling setups, but blowers are going to benefit, too. Just not as much.

    I suspect most people are going to have cases, and most people will not have a cooling setup capable of handling an R9 290/290X at full speed during a typical gaming scenario. It's just so much heat to be pumping out. So yeah, I think a lot of people are going to be dropping down 30 minutes in, or in hour 2, of a long gaming session and not even realize it except from sudden performance drops and/or sharp increases in volume from their card's cooling solution.

    Let's hope the long-delayed, barely acknowledged custom coolers with non-reference boards wind up redeeming the line.

    • Wildchild
    • 9 years ago

    tl;dr

    • jihadjoe
    • 9 years ago

    That’s all well and good, but luck of the draw shouldn’t apply to stock operation. Overclocking, sure, but cards should meet a certain baseline when running stock.

    How would you like it if certain FX8350s only ran at 3.8GHz? The 290X is advertised as a 1GHz part, so AMD should have shipped it with a cooler that allows it to run at 1GHz all the time.

    • Zizy
    • 9 years ago

    Hum yeah, card 1 is 4-6% faster than the other 3, which are equal. Now what? Enough to call AMD cheating? Luck of the draw?
    What if you try firmware of retail cards on samples?

    Also, on page 6 you state there is no difference between the HIS firmware and the retail one in these two runs. Both show a flat line with occasional bumps above it. On page 3 the graph is completely different. Yes, it’s a different application, but it still looks very strange.

    • HisDivineOrder
    • 9 years ago

    You’re welcome. 😉

    • HisDivineOrder
    • 9 years ago

    Yet mine illustrates WHY you can’t trust them. There are a lot of people who insist that AMD is a company to be trusted, a company that is UNLIKE Intel and nVidia. These people need evidence.

    So I’ve given it to them by illustrating various lies in the context of a campaign of deceit. Especially during the Rory Read years.

    The funny thing is if I hadn’t said those lies specifically, then I would have gotten people saying I was a paid nVidia shill (read below) who didn’t reference the lies.

    Can’t win if I do, can’t win if I don’t.

    • HisDivineOrder
    • 9 years ago

    Because people paid by nVidia have a tendency to say that nVidia and Intel lie. 😉

    If only nVidia WERE paying me and I had to STRUGGLE to invent things to say about AMD’s supposed generosity, things would be a lot better.

    Afraid not. Sorry.

    • lilbuddhaman
    • 9 years ago

    -http://en.wikipedia.org/wiki/Planned_obsolescence

    -American Car manufacturers in the 90’s

    -EA

    • dodozoid
    • 9 years ago

    I wonder, can GPU-Z read ASIC quality of r9 290 series? If so, could you please post numbers of all four of your cards, Scott?

    • Klimax
    • 9 years ago

    Did you think they could hire any competent people in place of the old, fired team with the budget they have?

    • Diplomacy42
    • 9 years ago

    Err, I would expect THE SAME CARD to have a variance of 5% given the parameters of the boost clock and the variability of in-game performance demands. A 7% deviation from the mean is really not that excessive.

    • Arclight
    • 9 years ago

    Can you tell us how the cards do in Uber mode? Is there still a big difference in GPU frequency?

    • Klimax
    • 9 years ago

    I don’t think it ever translated there, for a long time. (There are way too many factors in price besides die size – including the way a chip is designed and how much you can bin for a given product – see GK110.)

    The closest we got, I think, is the original Bulldozer, because it had a huge die and too many transistors and thus was quite costly to produce, which left AMD with too little profit (when they had to cut the price significantly).

    • Klimax
    • 9 years ago

    [quote<]Nvidias units also feature a boost feature, but retail cards have not been monitored to see if a similar variance was also present in their cards (which is something that should be looked into). [/quote<]

    Unlike with AMD, there are no current reports of such a problem, and IIRC some sites have obtained further units on their own too. Second, with GeForce we have a guaranteed minimum and maximum, and thus we know beforehand the maximum permitted variance (short of a driver bug or a botched BIOS), so I'd say it's a different case from AMD's.

    As far as I can say, I monitored my own Titan (initial batch, aka original revision) and it matches or exceeds the frequencies seen in reviews and is quite stable. (I think I still have some runs saved somewhere from testing two fan settings and later when monitoring VMEM with GPU-Z.)

    • dodozoid
    • 9 years ago

    Call me a grammar nazi, but “Their is data to support…” makes all your arguments invalid in my eyes, even though you may have a point.

    • Jigar
    • 9 years ago

    I have been following your comments for 2 months on TPU as well; it’s now confirmed you are being paid by Nvidia, dude.

    • Diplomacy42
    • 9 years ago

    I have to admit, I don’t understand that line at all. It’s November. Put a test bench in the yard, cover it with a tarp and call it good.

    Over the course of a 30 hour test, you could even get some interesting relative ambient temperature data.

    • Klimax
    • 9 years ago

    Not sure what exactly “this stunt” refers to, but your two references aren’t, to my knowledge, the same as the current AMD case.
    The 480 performed, but was hot and noisy (IIRC), and it hit its targets on both press and retail cards. The 5800 had a broken architecture, which required one of the first driver-level, game-specific optimizations (proper instruction ordering), not unlike BD or Netburst in the CPU arena, and again press and retail units didn’t differ much.
    Not the same. Or were you referring to something else? If so, what is it, and links please.

    • Klimax
    • 9 years ago

    Success? Any evidence for it?

    • Diplomacy42
    • 9 years ago

    Honestly, I thought it was a little funny hearing TR underscore and double-underline the comments of an early-adopter fanboy, lol. Useful, valuable, and funny.

    • NeelyCam
    • 9 years ago

    Just replace “press samples” with “initial press samples” and you’ll get the point. The first batch of samples are what got reviewed by most of the media, and those reviews were used to make the conclusion that AMD cards matched NVidia’s in performance but at a much lower price.

    Don’t get hung up on mathematical facts again and focus on how all this is applied in real life

    • bfar
    • 9 years ago

    Nvidia were widely criticized for those products. AMD deserve the same scrutiny.

    • Spunjji
    • 9 years ago

    I’ve not once had an issue with re-fitting the original cooler on a card for warranty purposes, but it’s true that YMMV here.

    • Spunjji
    • 9 years ago

    It *should* translate into the final cost of the product, all things being equal. Not saying it always does…

    • NeelyCam
    • 9 years ago

    I agree; noise is one of the metrics of interest in this performance/price/noise three-way… and that applies to Uber as well as Quiet. If the retail samples all perform the same in Uber mode, my feeling is that, to compensate for power efficiency and cooling variance, the noise levels vary.

    My take is that the whole 290X “scandal” is largely just a PR screwup by AMD. Once these cards come out with better coolers, they’ll match Nvidia’s in performance and noise and will probably still be priced lower.

    • NeelyCam
    • 9 years ago

    I need a bigger display on my phone; I broke my finger trying to scroll through that

    • NeelyCam
    • 9 years ago

    Please explain how they are false. Looking at the results, the conclusions make sense

    • beck2448
    • 9 years ago

    The Crossfire debacle really put me off AMD for the foreseeable future. That was horrendous: selling a high-priced product that did not actually deliver the frames to the player, and they denied, denied, denied until the entire enthusiast community exposed it with independent testing, and they STILL took eons to even address it. I don’t trust them, and their marketing guys are such arrogant idiots.

    • ahmedabdo
    • 9 years ago

    No. The conclusions in this article are unfortunately false!

    • Klimax
    • 9 years ago

    [quote<] Hawaii put performance per die size squarely back in AMD's corner, and now this.[/quote<]

    A made-up meaningless metric is a made-up meaningless metric. It doesn't serve anything. (Well, one exception: moving goalposts.)

    • Klimax
    • 9 years ago

    Or only as far as you can throw their PR.

    • Arclight
    • 9 years ago

    I’ve only looked at the graphs and read the last page. Was the test really done without Uber? If so, that would certainly matter, since if I were to purchase this card I would make sure to set it up to perform the best it can.

    The follow up question would be, is there as a big difference in clock rates while in Uber mode?

    • clone
    • 9 years ago

    lol, his shredding of the star wars crowd was a classic.

    “perfect……. for me to poop on.”

    • clone
    • 9 years ago

    don’t trust any company….my post is shorter yet says the same.

    • cobalt
    • 9 years ago

    I think the controversy is a concern that the batch of cards sent widely to press outlets for reviews might have been cherry-picked and provide much better performance than what folks buying the actual cards get. Thus, reviews would consistently show a skewed view of reality. You don’t think that sounds a little disconcerting?

    I could understand your desire to dismiss this as random variation, except this may not be an isolated case; Scott pointed to several other outlets reporting the exact same findings.

    Further, the second card was requested separately, and it arrived weeks after the one they sent for review. (I can’t tell from Scott’s wording whether or not it arrived after the first review was published.) So even if it performed identically to the retail cards, that doesn’t eliminate concern over the card that was initially sent for the review.

    • kilkennycat
    • 9 years ago

    One item somewhat overlooked is the chip area of the Hawaii GPU — 430 mm². The nVidia GK110 is 550 mm². With virtually identical power dissipations, the thermal transfer efficiency of the GPU packaging itself comes into play.

    Is there anything about the internals of the Hawaii GPU package that ensures equivalent thermal transfer efficiency versus the internals of the GK110 package, in spite of the Hawaii chip being about 25% smaller in surface area?

    Memories of Ivy Bridge vs Sandy Bridge immediately spring to mind.

    Did AMD also screw up the thermal design of the Hawaii GPU package itself, requiring the heatsink/fan combo to work harder to help offset the problem?
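
    For a rough sense of the scale involved (assuming, purely for illustration, a similar ~250 W of GPU heat for both chips and the die sizes quoted above), the heat flux density works out as follows:

    power_w = 250                              # assumed GPU heat output, same for both chips
    dies_mm2 = {"Hawaii": 430, "GK110": 550}   # die areas as quoted in the comment above

    for name, area in dies_mm2.items():
        print(f"{name}: ~{power_w / area:.2f} W/mm^2")

    # Hawaii: ~0.58 W/mm^2 vs. GK110: ~0.45 W/mm^2 -- roughly 28% higher heat flux
    # through the smaller die for the same total heat, so the package and cooler
    # have to work harder even before any difference in cooler quality.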

    • Disco
    • 9 years ago

    I must be missing something. I don’t understand what the controversy is. In ‘quiet’ mode, there are some variations in performance – between one card supplied directly by AMD, another card supplied directly by AMD, and two retail units. OK. I get that. But what’s the ACTUAL problem (I’m being serious with this question)?

    The variations are relatively minor, but they do exist. At least the driver revision seems to mostly fix it. BUT, no end user who is looking for the total top performance is missing out at all. This individual would just put it in BOOST mode, which is what they should do if they want the highest FPS or whatever for Battlefield etc. People who leave their card in the ‘QUIET’ mode aren’t looking for the absolute top levels of performance, and probably don’t care about 2-5 fps difference. If they did, they would put it in BOOST mode. Am I misunderstanding something?

    Regarding the sample firmware vs retail firmware, why isn’t there an examination of the differences in BOOST mode? Isn’t that where we should be looking if there has been a true mis-representation of max performance?

    Seriously, someone explain to me what the real problem is here. I have been a very long-time reader of TR (no dash!), and I have to say that I don’t really like how Scott completely fails to mention once, on that entire last page of the article, that the second card supplied DIRECTLY from AMD is pretty much identical to the retail cards. Where’s the conspiracy?

    • Pwnstar
    • 9 years ago

    Holy…

    • Airmantharp
    • 9 years ago

    Conspiracy?

    No. This is exactly what we expected- that literally any negative variance in tolerances would cause these cards to perform worse than the review samples.

    The problem is that said variances are still at up to ~10% [b<]after[/b<] AMD fixed the first major variance in fan hubs by changing how they regulate cooling.

    • Diplomacy42
    • 9 years ago

    Look, I have no doubt that every press sample had a manual burn-in process where they looked for any hint of problems before sending off their sample to be tested. Not doing this would be negligent, but the implication of these articles has continuously been that you-will-not-find-these-cards-in-the-wild. That simply isn’t true.

    Personally, I think that if you get a card that is performing too poorly, it should be subject to RMA, but other than that, the actual numbers don’t bear out any type of conspiracy or subterfuge. Even the press samples had a fair degree of variance.

    • MFergus
    • 9 years ago

    On the clock speed comparisons, both the retail and press sample cards were in quiet mode, so it’s a fair comparison. For all intents and purposes, quiet mode is the default and normal mode of the card. It’s quiet compared to Uber, but by no means is it quiet; it would still be considered a loud card if that were its only mode.

    • Airmantharp
    • 9 years ago

    Or in the bearings or magnets in the fan hubs, or the…

    AMD ironed out the cooling issues quickly and in good form – but the ASIC variability isn’t nearly as easy to cure.

    • FuturePastNow
    • 9 years ago

    Well, HIS and Sapphire didn’t make the cards; they just slapped stickers on them. Since they’re stock designs, they came off the same production line as the review samples. That doesn’t mean there’s no variation in the quality of the TIM used or its application, though.

    • mesyn191
    • 9 years ago

    It’s not that messed up of a situation at all.

    TR’s testing was all done in quiet mode, as was the testing by the other sites that are playing up the issue. Some of them even reduced the fan cap further. In Uber mode it’s not a problem at all.

    • NarwhaleAu
    • 9 years ago

    No business does that. Basic economics tells you that.

    Company A does exactly what you think every company does – builds a substandard, overpriced, poorly designed card, but at a lower price, so that it can exploit the high end later on.

    Company B comes in, builds a slightly better card than A, but at a slightly lower price point and immediately captures the vast majority of the market share.

    Company C comes in, builds a much better card with a profit margin of 5 to 10% (which is the gross margin most companies make – check finance.yahoo.com) and captures most of the market share of A and B.

    Company A craters as sales plummet and is soon extinct. Company B has a decent product, so cuts its price so that consumers get a better deal and that it can remain in business (it needs to maximize profits, which is roughly where the volume and price of the supply and demand curves intersect).

    What you think is widespread just doesn’t exist that often. Most companies legitimately try to do a good job. They may stretch the truth, over-promise and under-deliver and sometimes they cut corners, but all in all, they are attempting to deliver a good product that consumers will buy.

    The idea they deliberately build a sub-par product is nonsense.

    • JumpingJack
    • 9 years ago

    Good point, my mistake, I was reading Titan as Ti…

    • shank15217
    • 9 years ago

    I am not surprised that card makers got these cards out to retail as fast as possible to make a quick buck; this thing has bad QA written all over it. After all the effort AMD engineers put into making Hawaii (I really can’t blame them, because they don’t decide what the base and peak clocks should be), this is a sad situation. Hawaii put performance per die size squarely back in AMD’s corner, and now this.

    • shank15217
    • 9 years ago

    Yea right? Next time at least compare the right cards…

    [url<]https://techreport.com/review/25611/nvidia-geforce-gtx-780-ti-graphics-card-reviewed/11[/url<]

    • jihadjoe
    • 9 years ago

    Except all Kepler cards hit their advertised boost speeds, and oftentimes go beyond.

    • HisDivineOrder
    • 9 years ago

    How long will certain people continue to insist that AMD is a benevolent Robin Hood trying to rob the rich (nVidia, Intel) and give to the poor (consumers)? They’re a corporation. They do what all the corporations do. Every single thing. They’re not any better. In fact, when someone is desperate – even a corporation – they start to do crazy, even horrible things just to try and get ahead long enough to survive.

    Should I even list all the shady things AMD has done lately?

    They’ve been nothing but shady with the R9 290 and 290X. They “reveal” it without specs. They insist you’re going to be buying it on a day that comes, goes, and is gone. They tell you Battlefield 4 is coming with all cards, then backtrack and say, “We never said anything different” despite the fact they actually did say something different in about as official a way as one can imagine (head PR guy).

    They throw out review cards with special firmware and cherry-picked hardware that performs far better than the regular product. They count on people not being able to get enough cards at the initial release to truly test the differences between the released and the reviewed product. They can call on the benefit of the doubt and get great sales for the holiday period while everyone is in disarray, with fewer facts than normal.

    It doesn’t hurt they flew all these people out to a tropical island, showed everyone a group of long-term strategy technologies that require a lot of developer support they currently don’t have, and then did their bait ‘n switch. I don’t put it past them to have released those widespread rumors that there was a version of the 280/280X having TrueAudio.

    But that’s just about the R9 290/290X, really. Let’s recall the last real CPU launch by AMD. “You can review the parts that are great right now if you want to be released of embargo on THOSE specific parts. Review only the things we say you can review, say only the things we let you say, and you can review the entire CPU a month later once we’ve decided you can. You know, right after product is shipping.” Hmmm… that doesn’t sound the least bit like cherry-picking what is said, does it? Naturally, TR didn’t go for it, but they suffered hits on the site for it, hits that went to other apparently less credible sites that DID take those hits greedily. In exchange, AMD got initial reviews, hoping that people wouldn’t go back for the update after having a positive impression of their latest product.

    From a Phenom shipping with horrible errata that hobbled performance, to the Crossfire problems both at the initial Radeon 7970 launch and this year when they could no longer deny their widespread and rampant problems with frame latency, to the 4K problems, you have an AMD that ignores problems until it is caught elbow-deep in the cookie jar. They promise solutions, hint that they’re coming far sooner than they are, and then release updated product to truly solve the problem. Until then, they refuse to acknowledge the problems.

    There’s the (so-called) “high end” FX CPUs, too. They’re announced as “only for OEMs” and as high-dollar CPUs, but within a month they’re sold everywhere and for far less, too. Hmmm… did they do that as a publicity stunt? Would you call that a lie or just marketing trickery? Seems odd either way.

    And then there’s the delay of Volcanic Islands. You know, the R9 290/290X and the replacement parts for the Radeon 7950/7970 aka the R9 280/280X, etc. Supposedly coming at the beginning of this year on every comment that came from their mouth last year, this year when the time came, they shrugged and said, “Wut? We never said nothin’. These aren’t the droids you’re looking for.” Waving their hands at the entire intarwebz.

    When Kaveri was to come earlier this year. “These aren’t the droids you’re looking for.” Suddenly, Richlands makes perfect sense, coming when it did. Naturally, we all want to know what happened. “We released CPU’s precisely as we intended to exactly when we intended to in exactly the way we always intended and nothing at all changes. No delays, no problems, everything is great, and we’re working exactly as intended.” Jedi Mind Trick again, just for good measure.

    Everything they’re doing speaks to a company more focused on moments of great spin rather than consistent greatness. They fire entire teams of people to hire individual names. They figure the names will make for great press releases and take away some of the bite of losing masses of people despite being cheaper in the long term. They release bundles and bundles of games, but keep the same line of cards for over two years. The games take away most of the sting, right? They want that initial great review on their APU even if the final reviews smacked down their CPU performance. They love the initial buzz of having a $400-550 card line that can beat nVidia’s $500-700 series even if the final analysis shows they were sending their very best to get those initial scores. They’ll even try to soothe users who are burned by lack of Battlefield 4 despite a highly quoted, widespread announcement email (right before a weekend no less!) by top level brass announcing said BF4 for everyone… by giving it away to a thousand people via a freebie campaign. This is no doubt far, far cheaper than giving the game away to all their customers instead.

    People, this is AMD. A company desperate to scrounge up a profit. This is the company Rory Read is making. They’re counting on loyalty, on doubt, on obfuscation, and on more than a little la-la-la’ing by people who either remember bygone days or just don’t expect this kind of deceit on a regular basis to be real and not “overblown.”

    It’s not that this “kind of” thing hasn’t been done by other corporations before. This “kind of” thing has happened before and every time we see it, reviewers go berserk. They should. It’s what they should do. I suppose that’s why AMD helped a lot of reviewers feel better about them–to give them more leeway during the last few months–with their little tropical island giveaway. They also bought Anandtech, lock, stock, and barrel. Effectively, if not literally. It helps to have Anandtech avoiding most scandals with a good long delay before acknowledging or responding to said controversies. They ignored the frame latency problem and even denied it for a while. Then they ignored the 4K problem for a while, too. They also took part in the APU review of selective parts with glee. I suppose that’s when they showed up on AMD’s radar as receptive to “special advertising.”

    The problem isn’t that other companies haven’t tried to do things like this. It’s that there’s a reason they don’t get away with it. There’s a reason that we get annoyed by it. It doesn’t matter if it’s nVidia or Intel, we should and have gotten annoyed, angry, and railed against it.

    AMD themselves have done things “like this” before in the past. The problem here is that they’ve done so much in just the recent past, one thing after another, layered, that it’s hard to even make a short list of it all.

    TLDR; don’t trust AMD. They’re desperate and they’re doing anything at all to make an impact, even if that’s going to wreck their reputation in the long term. Everything they’ve done suggests they don’t see a long term. I guess that’s why 95 degrees all the time makes sense to them.

    • Airmantharp
    • 9 years ago

    Hey, I’m just adding an addendum, in agreement, with you and with the article.

    AMD sent out cards that met certain levels of performance and set certain expectations, which were then not met by average retail samples.

    • Airmantharp
    • 9 years ago

    If you had asked us what GPU to buy before the R9 290X and GTX780 Ti, we’d tell you to wait. If you ask us right now, well, we’ll still tell you to wait. Wait for AMD’s partners to get the ‘good’ 290(x) cards on the market.

    If you’re looking for a high-performance solution, at least.

    • Airmantharp
    • 9 years ago

    Do you come with a setting [i<][b<]other[/b<][/i<] than jackass?

    • HisDivineOrder
    • 9 years ago

    Throwing on a third party cooler, ruining your warranty, and… well, let’s hope the card doesn’t fail for whatever, right?

    Yeah, $150 for a third party cooler, but not for the warranty. Cheaper just to get the solution that’s done out of the box.

    Or wait for the final version of the card that comes with good coolers.

    • HisDivineOrder
    • 9 years ago

    “Back in the day…when they had to endure a long-term load (several hours).”

    Big difference from today, with modern short-term loads (<5 mins).

    • Fighterpilot
    • 9 years ago

    Oh good!…Another bash AMD article.
    Just get over it Scott.
    The R9 cards are a success.
    You’ll just have to live with that until you can fawn all over Maxwell.
    (I’ll wait for that article for you to prove me wrong)

    • JumpingJack
    • 9 years ago

    Incorrect, the 290X does not draw less power; rather, the 290X draws significantly more power than the 780 Ti. (EDIT: This was incorrect; I looked at Titan and read it in my brain as 780 Ti.)

    [url<]https://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/12[/url<]
    [url<]http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review/15[/url<]
    [url<]http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/63742-amd-radeon-r9-290x-4gb-review-16.html[/url<]

    In general there is a 30-40 W delta, with the 290X drawing more power. Though I have seen some reviews that put uber mode within 10-20 W of the Ti.

    • Gam3sTr
    • 9 years ago

    Wow, this will affect my GPU of choice in the future. If I have money this Christmas, that is.

    • Krogoth
    • 9 years ago

    They have done it plenty of times in the past.

    Nvidia fangirls and marketing drones are getting too excited over something that their precious company has done before.

    GTX 480, Nforce 3/4 (passive cooling), GF3 Ti500 and FX-5800.

    • Krogoth
    • 9 years ago

    Not true at all.

    Back in the day, the higher-end Prescotts, Smithfields and Barcelona (Phenom I) had constant throttling issues with the stock HSF when they had to endure a long-term load (several hours). IIRC, the 480 and 2900XT shared similar fates with their stock coolers, especially in chassis that didn’t have enough breathing space.

    With silicon made in the pre-throttling era, the symptoms took the form of BSODs, graphical artifacts and so forth. It usually happened to long-time owners of said hardware.

    • Deanjo
    • 9 years ago

    Pretty much all GTX cards have a second DVI port and yet they keep their temps in check. Dropping the second DVI port to accommodate better cooling shouldn’t have to be done. It’s just a piss poor cooling solution being overtaken by a hard pushed GPU.

    • willmore
    • 9 years ago

    Error on page 5. The first graph on the ‘HIS’ tab shows frame rate and not clock speed.

    • ClickClick5
    • 9 years ago

    Pfft, buy Forge a 12-pack and he’ll do it for a week.

    • entropy13
    • 9 years ago

    Eyefinity setup with Z87 and 4770K! lol

    • Diplomacy42
    • 9 years ago

    They shouldn’t even have a quiet mode, they should have done something like Precision X or Afterburner and made every card customizable on a slider between performance and volume. Most of their problems seem to stem from the fact that people think that they can apply last gen preconceptions to something that is designed to break them.

    • interbet
    • 9 years ago

    They should never have shipped these things with quiet mode enabled by default. It really does confuse people, and apparently even editors, who should know better and are probably getting emails from competitors constantly harping on this.

    • Shobai
    • 9 years ago

    On a related note, are the retail sample firmware dumps being uploaded somewhere? And if so, where would an interested party look for them?

    • Diplomacy42
    • 9 years ago

    If you took away the variable clocks, fan speeds and voltages, and programmed them like the GTX 580, they wouldn’t be set at 1000MHz; they would be set closer to 900, maybe 920. The extra headroom is extra. Saying “my card doesn’t have as much extra as his card” is kind of silly; it would be like complaining that “my card can’t be overclocked as well as his card.”

    • Airmantharp
    • 9 years ago

    For what other reason would we not have third-party coolers shipping on R9 290(x) cards other than AMD’s interference at some level? It was obvious on release day that better coolers were needed.

    As for voiding your warranty- well, any solution that results in a voided warranty isn’t a retail solution. That’s an entirely different argument. I’m not saying people shouldn’t do it, or that I wouldn’t do it, rather that it’s not something that can be recommended.

    And for the price of third party coolers? Here’s the thing- it’s not just the listed price of said cooler. It’s tools to make the change and if necessary customize the part/accessories like memory or VRM sinks, it’s supplies like TIM or glue, it’s extra parts like fans that quickly add to the price with quality parts, it’s shipping for everything, and then you lose the warranty that was part of the cost of the card!

    So, I’m still definitely a proponent of third-party coolers, but only when they come installed from a vendor as stock with a warranty.

    • Airmantharp
    • 9 years ago

    They undoubtedly have- but this is the first real example we’ve seen in the desktop arena. In the mobile arena where everything has always been power/heat constrained to some extent it’s much more widespread. Take a look at how Futuremark is handling some of the major phone manufacturers :).

    • Airmantharp
    • 9 years ago

    Please. The Nvidia products being compared, along with previous AMD products, had such advertised speeds- and they hit them. The R9 290(x) don’t; rather, potential buyers would have to compare performance based on reviews. The issue is that unlike previous examples where review performance could be relied upon, the R9 290(x) had no margin for variability.

    With the very best samples sent to reviewers, it’s pretty easy to understand that the average performance most purchasers of these cards see will not match what reviewers published. That IS a problem.

    • Airmantharp
    • 9 years ago

    The initial review confirmed quite clearly that changes in ambient temperature made a difference in performance; but ambient temperature doesn’t change that fast. Note the section where the cards are tested again in a warmer environment, and where the pristine review sample is shown to run faster than the retail samples even when run in said warmer environment.

    TR shared their results based on their investigative testing and we likely haven’t seen the last of their efforts here, given that Scott detailed their less controlled variables and untested variables in the article.

    • Bensam123
    • 9 years ago

    Boost speed is a maximum, not a minimum. If AMD listed a minimum of, say, 850MHz, it would pretty much fit every game out there, except engineered heat-production programs like Furmark.

    • Bensam123
    • 9 years ago

    Why do you say they’re preventing their partners from improving the cooler?

    Voiding your warranty with a 3rd-party cooler is silly. That’s a catch-all, and there is practically no way for them to know you changed out your cooler unless you misplace the original when you need to RMA the card. It’s just there to keep less tech-savvy people from attempting it and botching their unit.

    Cost in this case is the $150 you save on buying a 290X covering the $150 you need to put towards a good cooler. It doesn’t cost you an extra $150… you save money.

    Curiously, you originally were a proponent for 3rd party or better coolers, now you’re saying they don’t matter and AMD should just suck it up and be the horrible company they are… why?

    • Airmantharp
    • 9 years ago

    Because your GTX580’s will actually hit their advertised clockspeeds just as well as the GTX580 review samples did. The R9 290Xs clearly are not.

    • Chrispy_
    • 9 years ago

    This is probably what happened.

    AMD designed a 250W cooler and then realised they’d need to run the chip hotter than that to meet expectations.

    • Airmantharp
    • 9 years ago

    Read the review?

    Scott stated that neither ‘inside the second’ nor noise testing was done in this review, intentionally, in order to keep man-hours in check. The lack of either does not invalidate the conclusions in the least.

    • Diplomacy42
    • 9 years ago

    I have 2 Nvidia GTX 580s. I run them at stock speeds and voltages, but use a custom fan profile when playing certain games. One of the cards runs stock at 1000mV, the other runs at 1063mV. Currently, one of them is running at 46 degrees and the other is running at 41 degrees.

    I am confused why this is any different… at least until AMD says outright “yeah, you caught us. we cherry picked our press samples out of a batch of 10,000 and odds are you won’t get anything like those”

    • NeoForever
    • 9 years ago

    Spoilers are well deserved if someone reads comments before reading the article 😉

    • madgun
    • 9 years ago

    Talk about an AMD shill. Techreport is not changing your buying preferences but merely investigating a certain occurrence which should not have happened.

    And please bring some evidence about NVidia’s clock variance. AMD is not that dumb that it wouldn’t expose Nvidia if given a chance.

    • Airmantharp
    • 9 years ago

    The argument, either way, would definitely be moot if AMD had improved the cooler or allowed their partners to. The problem? You still can’t get a retail sample with a different cooler. That means that improving the cooling beyond stock results in voiding your warranty, and in most cases, even more costs.

    • madgun
    • 9 years ago

    Simple logic. Nvidia has a minimum base clock defined, and they have much more relaxed thermal headroom. Whereas the 290s have no minimum base clock defined and can adjust to whichever clock setting is usable at the given thermal headroom.

    • UnfriendlyFire
    • 9 years ago

    I’m interested to see if Nvidia, Intel, and other tech companies have also pulled this stunt.

    • Bensam123
    • 9 years ago

    I know people are hating on him for questioning the validity of the article, but when it comes down to razor-thin levels of variance, all this stuff matters. That’s why you don’t see people trying to find the cure for AIDS outside in the open… Really, any sort of scientific testing uses highly stringent conditions where there is as close to absolute control of all the extraneous variables as possible.

    The smaller your tolerances and room for error, the more control you need. A trip through any reputable university will show you just this. I’m not sure how much it matters in the scope of the article (retesting definitely helps with this), but given the tolerances we’re seeing here, cracking a window may definitely affect the results, more so if this article was written over more than one day (and night).

    • sunaiac
    • 9 years ago

    Well, thank you, nVidia, for showing that the supposedly awful changes in frequency don’t change a thing in actual game performance, and that your product is still totally overpriced, as well as having the same frequency variance.

    Thing is, we buy performance for price, not frequency. Try again.

    • krutou
    • 9 years ago

    Wouldn’t they just do the tear down AFTER the benchmarks?

    • Firestarter
    • 9 years ago

    Which only affected overclockers so far. This time it affects [i<]everyone[/i<].

    • Bensam123
    • 9 years ago

    Interesting… so this could also come down to board manufacturers skimping on the quality of their chosen components.

    • tanker27
    • 9 years ago

    Thanks for the assurance, well actually, peace of mind.

    • Bensam123
    • 9 years ago

    I don’t know what JP0P660 is, but there are a few people mentioning the TIM and I would agree. Makes it seem like one of those scenarios where TIM really does make a difference.

    • Bensam123
    • 9 years ago

    I believe some would argue that the stock cooler is a separate entity and could easily be replaced by someone other than AMD. This seems to be arguing semantics though.

    • Bensam123
    • 9 years ago

    I’ve seen the density argument before, but there are quite a few people replacing their stock coolers with 3rd-party ones, and they aren’t running into this issue.

    The 780 Ti uses a different cooler, so it’s not an apples-to-apples comparison. You could also put the 780 Ti cooler on the 290X and might get improved results from the 290X (if this were possible).

    • Bensam123
    • 9 years ago

    Interesting results in a lot of different ways, at the end though you don’t seem to take into account that your second review unit should also be cherry picked (but doesn’t look cherry picked in the results, even worse them some retail units at some points) and draw the conclusion that review unit(s) were cherry picked.

    Some more things to consider: ‘noise levels’ are mentioned a few different times, lampooning AMD for results posted in the initial review, but there are no decibel readings in this review. The difference between sample 1’s fan speed with the two drivers is 0.976% (Skyrim). That also means the review sample performs slightly better with its slightly higher fan speed (or corrected fan speed, I suppose it would be), but that wasn’t mentioned.

    Nvidia’s cards also have a boost feature, but retail cards haven’t been monitored to see if a similar variance is present there (which is something that should be looked into). It seems as though this variance wouldn’t even have been noticed had people not run into the fan issue afflicting some cards before the driver update.

    Truth be told, AMD should totally just make a carbon copy of the 780 cooler and plaster it on a B2 revision of their cards. It would fix all their PR problems and then some… maybe even allowing them to reach higher clock speeds.

    Another point worth mentioning, and I’m sure people will love me for it: Mantle is coming… How much of a difference it will make, and whether or not it catches on, is up in the air… but for Battlefield 4 you may end up outrunning a $700 graphics card with a $150 one.

    Although this doesn’t completely absolve all of the above issues, at the end of the day $150 will buy you a nice 3rd-party cooler, and then you may have quieter computing and may even be able to overclock it a bit. Sorta surprised TR didn’t try slapping a 3rd-party cooler on the 290X for some contextual retribution.

    • Airmantharp
    • 9 years ago

    Sure- except that it implies that the cooler was ordered for a GPU putting out less heat, and then the GPU was cranked up later in the product cycle to the absolute limit of the cooler, with no margins for fan speed or GPU die variability to ensure consistent performance.

    • sschaem
    • 9 years ago

    … Looking into this, it seems the VRM also has the potential to overheat.
    From some reports, I see people getting 8% better performance from better cooling of the VRMs.


    After doing some more research I have discovered the R9 290X will throttle when the VRMs get hot, so I changed the TIM to JP0P660 on the MEM & VRMs, with some interesting results…

    Standard UBER Mode
    OLD TIM 2260
    NEW TIM 2445

    Maybe we have been looking at the wrong location for heat throttling?

    • sschaem
    • 9 years ago

    Not all ASICs are created equal, and with a product that is limited by power dissipation, you are bound to get fluctuations in top performance. That’s a given…

    But what a F.U situation AMD put itself into, again.

    a) Why keep that on the down-low and send special review boards with top-of-the-line ASICs?
    b) They decided to put a second DVI port that blocks ‘half’ of the exhaust grille (the 7970 didn’t have one).
    c) They decided to save $10 by using a ‘lighter’ cooler block.

    I understand that AMD doesn’t have to build a cooling solution that can handle FurMark,
    but the fact that we see this in Battlefield 4 is a facepalm.

    The 290X chip is a wonder, but the cooling solution is inadequate, since ALL boards are certified to hit at least 1GHz but rarely do so…

    Seriously, how does this design make any sense? Was the extra full-size DVI port worth it?
    [url<]http://cdn4.wccftech.com/wp-content/uploads/2013/10/AMD-Radeon-R9-290X-Cooler.jpg[/url<]

    • Herem
    • 9 years ago

    Looking at the results, and ignoring the HIS R9 290X – 13.11b8 numbers, it appears that the press sample 2, HIS, and Sapphire cards all perform within a small margin of error of each other.

    Now that all of the cards have been baselined with their default TIM application, it would be great if you could try applying aftermarket TIM to see if that is the differentiating factor between the cards.

    • the
    • 9 years ago

    I would like to see some of this testing as well, but there are some restrictions in play for good testing. There is a need for a ‘control’; in this case, that’d be the review units supplied by AMD. Scott is lucky enough to have acquired several retail samples, but there is still the need for a control on the retail side of things. At best, one of the retail samples could be disassembled for testing in this manner, assuming additional samples could hypothetically be acquired. The catch is the terms on which the retail samples were acquired (are they going back to Newegg or Nvidia? Do they go back at all?), which may impact the ability to perform such mods.

    • Milo Burke
    • 9 years ago

    Have you seen the price of a 780 ti? Sure, Nvidia has a budget to promote their line, but not even [i<]they[/i<] can afford a 780 ti!

    • the Lionheart
    • 9 years ago

    The 780TI uses a far better cooler with a fan that provides much better airflow at high RPM and less noise.

    • the
    • 9 years ago

    There are explanations for a variation in firmware between review and retail samples. It could simply be the gap between the release-to-manufacturing date needed to hit a launch date and when press samples are sent out for review. The press samples likely come from a batch used for internal testing/validation and received newer firmware than the release-to-manufacturing batch. This of course depends on the timeline as well as the dates on the firmware. Unfortunately, some of this data could be under NDA and difficult to get (the dates on the firmware). 🙁

    FWIW, it has been disclosed in reviews that there was a late change to drivers/firmware for the vanilla R9 290 while press samples were already sent out.

    • the Lionheart
    • 9 years ago

    What, are you planning on working for both companies?

    • slowriot
    • 9 years ago

    [quote<]I don't. Nvidia isn't in business to be fair to AMD.[/quote<]

    So what are you suggesting? That Nvidia knows its cards have just as much variance?

    Again, if the GTX 780 Ti cards are markedly more consistent than the 290X, then Nvidia would stand to benefit from proving it by providing retail samples in the same fashion. I'm not asking Nvidia to be fair; I'm wondering why they didn't go all the way and nail the 290X's coffin shut. I just can't come to a conclusion until we know whether this type of situation is unique. I would have liked TR to hold off until they could have tested retail samples of both models.

    I'm willing to bet we'd see as much variance among GTX 780 Ti retail cards, and that Nvidia knows this... and they also know that by the time it's proven, the PR battle will already be won.

    • NeoForever
    • 9 years ago

    I’m also curious about this. 290X variance is a real issue, but what else will we find if we dig deeper? Exactly *how much* variance is there in Nvidia cards?

    ^ This is especially important considering Nvidia agreed to fund the 290X retail samples. Meaning they must be really confident, for example, that 780 Ti retail cards vary much less than the 290X.

    • superjawes
    • 9 years ago

    It’s not just the power consumption, though. The Hawaii chip might draw less power, but it’s also a physically smaller piece of silicon. The [i<]density[/i<] of the power is the bigger problem, because whatever heat you generate has to be blown off to keep temperatures down, and the 290X is dropping from peak frequencies because it hits its temperature ceiling.

    So while it might draw more power under load, the 780 Ti also runs [url=https://techreport.com/review/25611/nvidia-geforce-gtx-780-ti-graphics-card-reviewed/11<]about 11 degrees C cooler[/url<], and that means it can cool itself without cutting into performance like the 290X does. Furthermore, even if you put a good aftermarket cooler on the 290X, you could put the same or a similar cooler on a 780 Ti and also get improved results.
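    For a rough sense of that density argument, here is a back-of-the-envelope sketch using commonly cited die sizes (roughly 438 mm² for Hawaii and 561 mm² for GK110) and an assumed ~250 W board power for both; these are ballpark public figures, not measurements from this article:

        # Back-of-the-envelope heat flux per unit of die area (assumed round numbers).
        dies = {
            "Hawaii (R9 290X)":   {"area_mm2": 438, "power_w": 250},
            "GK110 (GTX 780 Ti)": {"area_mm2": 561, "power_w": 250},
        }
        for name, d in dies.items():
            print(f"{name}: ~{d['power_w'] / d['area_mm2']:.2f} W/mm^2")
        # Even at similar board power, the smaller die has to shed its heat through
        # less surface area, so the cooler works against a higher heat density.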

    • Krogoth
    • 9 years ago

    Not really; it is just more sensationalism over something that every player in the semiconductor industry has pulled at some point in time.

    • chuckula
    • 9 years ago

    [quote<]I find it odd that Nvidia would offer to purchase two retail R290X cards but not offer to also purchase two GTX780Ti cards.[/quote<]

    I don't. Nvidia isn't in business to be fair to AMD. Conversely, AMD isn't in business to be fair to Nvidia (see Mantle), and there's no rule saying that AMD couldn't make TR the same offer that Nvidia made TR (only for the 780 Ti instead)... in fact, I'd welcome it to see if Nvidia has been up to anything on its end.

    • slowriot
    • 9 years ago

    This is my big question. I’m not sold that this level of variance is unusual. I find it odd that Nvidia would offer to purchase two retail R290X cards but not offer to also purchase two GTX780Ti cards. It seems Nvidia was very confident in regards to the R290X variance. Wouldn’t they want that directly compared to the supposed consistency of the GTX780Ti cards?

    • the Lionheart
    • 9 years ago

    Well, the stock cooler is the cause of the problem, not the thermal limit on the chip, as is being suggested. The cooler is simply horrible. In fact, some people were able to get substantially better thermals by removing part of the I/O plate on the back of the card.

    • Krogoth
    • 9 years ago

    This kind of nonsense has been going on for years. Pressure from marketing and management can sometimes force engineers to cut corners and push their designs to their limits to make the product look better than it is. Cherry-picking the samples that get sent to professional reviewers is old news.

    I always thought that a researching customer would take results from various sources, compile them, and put them in a range. Retail units should be within 1-5% of that range.

    Of course, some sue-happy customer is going to force reviewers and marketing drones to put in a “results may vary” disclaimer.

    • Krogoth
    • 9 years ago

    All of the semiconductor companies have been guilty of this at some point when they tried to push something to its limits in order to “beat” the competition.

    AMD did it before with the T-bred A, Barcelona, and Thunderbird back in the day. ATI did it with the HD 2900, X1900 XTX, and X800 XT PE. Intel did it with Prescott, Smithfield, and the Pentium III Coppermine 1.13GHz. For Nvidia, it was the GF3 Ti 500, FX 5800, and GTX 480.

    • derFunkenstein
    • 9 years ago

    Not irrelevant. The stupidity that knows no bounds! Just when you think AMD’s PR can’t get worse, it does!

    • indeego
    • 9 years ago

    And OCZ sent you and other reviewers non-crashing SSDs. News at 11.

    • the Lionheart
    • 9 years ago

    This is truly unfortunate and funny: AMD shot themselves in the foot, and Nvidia, being the snake they are, is totally exploiting the issue.

    But from the looks of it, it seems that the 780 Ti exhibits some clock speed variance too, going from Crysis 3 to BF4.

    The funny thing is, the 290X is right there with the 780 Ti in FPS despite running at a substantially lower clock speed and consuming less power under load. If only AMD had equipped the card with a decent cooler, they would’ve had a 10/10 winner on the market, and most likely the GPU crown at this price.

    • weaktoss
    • 9 years ago

    That’s a pretty expensive shill! Especially when so many people seem to enjoy doing it for free…

    • the
    • 9 years ago

    Indeed. I would argue that there are two nuanced things in this conclusion: the first is the performance of the initial press sample, and the second is the general variance of all R9 290X cards.

    There is data to support the idea of golden samples given to the press, but it is not conclusive on its own. The second press sample conforms to how a retail card behaves. What would be helpful is a bit of a timeline of when these samples were requested, when they arrived, and what BIOS revisions were used (particularly for the retail cards).

    A driver revision and additional data points have closed the R9 290X’s performance gap overall. Still, the performance profile remains cloudy, as some cards will perform better than others while within specification. A 10% variance in performance can command a $150 price premium here. That’d have the effect of appearing as a line instead of a point on TR’s conclusion scatter plots.

    [s<]AMD's PR silence on this matter is note worthy but I wouldn't use it for judgement on the press sample matter either.[/s<] AMD does have the generic 'we're looking into this matter' response now. Edit: Grammar fixing per furher dodozoid 🙂

    • Milo Burke
    • 9 years ago

    Has anybody compared a press version with a retail version of the 780 ti or a similar card from the green team?

    • Milo Burke
    • 9 years ago

    Scott, perhaps in lieu of having a temperature controlled laboratory, you could record the ambient lab temperature for each test?

    Look at the colored bar chart and small paragraph right above it on this CPU cooler review: [url<]http://www.hardwaresecrets.com/article/Cooler-Master-Hyper-212-EVO-CPU-Cooler-Review/1407/6[/url<]

    • slowriot
    • 9 years ago

    I read some of the comments before the article. I expected to find a much larger spread in the results. Instead, what I see is that the biggest difference among the cards using the latest driver is roughly 6%, or just over 2 FPS. What am I missing? Are people not noticing that the card which was significantly slower was running the outdated driver?

    • Airmantharp
    • 9 years ago

    You can’t really compare raw FPS here; they’re running the cards in a GPU-limited situation at lower FPS to keep the CPU variable as far out of the equation as possible. At this resolution with these games, you’d have two or three of these cards.

    At more reasonable resolutions, a 10% difference in performance is a pretty big deal, especially for users of higher performing displays trying to hit 120Hz to 144Hz.

    • CocoBongo
    • 9 years ago

    Hi folks! I honestly don’t consider it a big deal. Really. I mean… come on… both NVIDIA and AMD are running close to the edge when it comes to $500+ cards, so… low-30s FPS vs. mid-30s FPS in Battlefield 4 with *everything* maxed out… I for one would probably not notice it. And even if I did, I’d probably scale something down a notch and forget about it.

    @TR: Do you still have the previous generation of cards from AMD/NVIDIA that you were given to review? Could you check whether this was true for those too? What about two generations ago? Did anybody even check?

    Cheers!

    • drfish
    • 9 years ago

    [quote<]I fail at media sensationalism yet again.[/quote<] Classic!

    • Airmantharp
    • 9 years ago

    The stock cooler is part of the design.

    • the Lionheart
    • 9 years ago

    The 780 Ti consumes 20-30 watts more power than the 290X review sample under load? The problem here is the stock cooler on the 290X/290 cards, not the thermal limit of the Hawaii chip. Of course, the turbo scheme is also a bit too aggressive, but that’s not the main issue.

    • the
    • 9 years ago

    Excellent article, Scott. It seems there is an undeniably large variation between R9 290X cards, with your initial press sample being better than the rest. Though with a small sample size it is difficult to conclude absolutely that the press samples were cherry-picked golden samples. (Granted, the same data does not in any way discount this either.) So the data hunt has to go to other possible sources of variation and/or acquiring a larger sample size for testing.

    What are the actual BIOS revisions of the cards? For that matter, are all the cards using the same board revision?

    Edit: Re-reading the article again, I think it would be helpful to have a timeline of events (NDA permitting, of course), i.e., when did the two press samples arrive vs. the retail samples? It’d be helpful to put the BIOS revision dates on this timeline, though I’m not sure how this information could easily be obtained short of dumping the file and looking for a date code in it. The idea that keeps popping up in my mind is that the press samples used a newer BIOS than some of the retail samples. This could happen for several reasons due to the delays involved in mass production.

    • oldDummy
    • 9 years ago

    IMO,

    Cherry-picking review products is neither new nor extraordinary. Ethical? By the nature of current competition, my belief is that when survival rears its ugly head: yes.
    They did not lie.

    • Milo Burke
    • 9 years ago

    I hope you don’t game with a webcam on…

    • Goty
    • 9 years ago

    [quote<]Press samples got known good (or perhaps I should say 'best') GPUs[/quote<] This statement isn't supported by the data, though. If the press got "good" samples, then the variation between any press sample and a "bad" retail sample (e.g. the HIS card) should be much more than a fraction of one percent, shouldn't it? The "5-10%" number quoted by the author is NOT representative of the performance produced by the retail cards.

    • Goty
    • 9 years ago

    Yes, the variance IS the point, but that’s not how the conclusion is worded. The author goes out of his way to use numbers from a card NOT using the fix supplied by AMD to cast the product in a harsh light and to push his indefensible conclusion that AMD cherry-picked their press samples.

    • Forge
    • 9 years ago

    If anyone from AMD is reading, I’m available to shill for cheap. A nice top end Eyefinity setup, complete, would probably keep me on retainer for a year or two.

    • moose17145
    • 9 years ago

    My pricing broke down to about this (in case anyone is interested; I’m sure there are at least one or two others out there looking into a home lab who want more pricing information):

    Routers – $300.00
    Each 2821 router was 100 dollars after shipping: $75.00 for the actual router and $25.00 for shipping. Given what these things weigh, I wasn’t gonna argue that it probably cost 25 dollars to ship each one.

    Switches – $50.00
    The 24-port Catalyst 2950 switches were only 25 dollars apiece after shipping ($10 for the switch and $15 for shipping, LOL). I only needed to buy two switches, since a friend gave me one of his 2950 switches he wasn’t making use of anymore.

    Other – $150.00-200.00
    Another $50 for a two-post rack kit, another $50 or so for a WIC-1T and a WIC-2T card, and maybe another $50-100 or so in misc. cables, new 2GB CF cards, and other misc. hardware.

    Total – $550.00 – 600.00
    So I guess I spent a few more dollars than I thought initially. Still think I did pretty well on pricing, though.

    • odizzido
    • 9 years ago

    It made me smile, though it was totally irrelevant 😛

    • Damage
    • 9 years ago

    The R9 280X is based on the same Tahiti chip as the Radeon HD 7970 and has an older generation of PowerTune with a much narrower range of clock speeds. The 280X is still a bit faster than the GTX 770 and remains my pick in that segment.

    • tanker27
    • 9 years ago

    Umm, what about the rest of the R9s? Does the 280X suffer from this too? And to think I just pulled the trigger and bought an R9 280X. :/ First time in a long, long time I would own something other than NVIDIA…

    • derFunkenstein
    • 9 years ago

    Full disclosure is good. But if Nvidia made the same offer to lots of other sites and TR is the only one that fessed up, that reflects badly on everyone else, not Scott.

    • chuckula
    • 9 years ago

    A lot of people think that Nvidia’s cards have just as much variance. I’m sure there’s some variance, although I doubt it’s as great because the Nvidia cards aren’t running at their thermal limits.

    However.. to be fair… if AMD still had a marketing department and if they truly believe that Nvidia is up to no good, there’s nothing stopping them from emailing Scott and making him an offer he can’t refuse….

    • YukaKun
    • 9 years ago

    There was a time in F1 when cars burnt their engines just to have the fastest qualifying lap before the race. And when I say burnt, I mean melted and sold as junk after the qualifying laps.

    Maybe the press-sample firmware is close to that: they push the card to the very limit so it can show good results. And I would think Nvidia might do it too. I’ve seen variance from Nvidia as well, but as usual, no one notices it or makes it public.

    Cheers!

    • derFunkenstein
    • 9 years ago

    He did say they were sealed in the box, indicating that they were picked like any other order.

    • Andrew Lauritzen
    • 9 years ago

    The aggregate is not the point, the span/variance *is* the point. No two cards with the same name should ever vary by that much in any workload. As a consumer, it’s unacceptable for a card priced at that level to fall into the next lower bucket of performance in a game, especially one as popular as Battlefield.

    I agree that there’s not enough evidence to claim that AMD is specifically cherry-picking review samples (although I’m sure they and everyone else actually are), but that’s not even the problem here. The problem is that these cards are being pushed too close to the edge and exhibiting poor uniformity due to that.

    • lilbuddhaman
    • 9 years ago

    I dare say that virtually every modern business does exactly what you described in the first sentence.

    • IPlayNaked
    • 9 years ago

    What facts? The temperature wasn’t controlled except by “leaving a window open.”

    We know for a fact the cards are thermally constrained in quiet mode. Everyone noted that in their reviews. The fact that the clock speed difference is sub-5% given an open window in a completely uncontrolled environment says to me that there’s nothing to see. And if you either switch the card into uber mode (again, something every review noted), or adjust the fan speed profile very slightly (if you’re averse to flipping a switch), then the problem disappears, almost regardless of environment.

    • superjawes
    • 9 years ago

    TR Kickstarter?!?! Can a stretch goal be the lolcat version of the site from April Fool’s Day? 😀

    • IPlayNaked
    • 9 years ago

    Why is no one pointing out this line?

    [quote<]I did try to keep room temperatures from rising too high by cracking open a window when running a load test caused the room to heat up, but maintaining a perfectly steady environment just wasn't realistically possible.[/quote<]

    Why is this line being ignored, given that we know the sole limiting factor of this test is temperature, and every other variable only matters in how much it relates to temperature? Why was this review even written, given that? Why is the headline "Are retail Radeon R9 290X cards slower than press samples" when the cards were only tested here in quiet mode, but the performance claims were made about uber mode?

    Objectively, you know the cards are not slower, but that temperature differences affect the boost clocks, so why were tests not done on Nvidia cards? Intel CPUs? AMD CPUs? Anything with a boost clock? You have one here, a single one. If you only tested a single AMD one, you wouldn't see an issue either, so why the blind trust? Why didn't they offer 2-3 of their own cards?

    And finally, why is any of this even relevant, given that you're trying to run the GPU in quiet mode? Uber mode shows every one of these devices to be 100% the same. So why the fuss? Well, the fuss is because Nvidia sent you some nice cards. Who would turn that down, and who wouldn't give them the article they want?

    Quite honestly, I consider this review, test, whatever you want to call it, to be less than useless. Not controlling for temperature in a temperature-constrained scenario is a complete joke. Not mentioning anywhere in the article that uber mode does not have the same issues is insane. I can understand you making a value judgement and saying that it can't stand that AMD is releasing wildly different cards; I even support that to an extent. What I can't understand is you not mentioning anywhere that it can be entirely mitigated with a switch on the card, or fan speed settings in AMD's software. What that tells me is that this is way more about being a marketing arm for Nvidia and way less about getting realistic and worthwhile information to your readers.

    • superjawes
    • 9 years ago

    Actually, if you take a gander at the [url=https://techreport.com/aboutus.x<]"About us"[/url<] page, you can probably figure out why TR couldn't just go out and buy a few cards...

    And Scott accepted the offer (which was to buy cards from a third party, Newegg) because it's a big issue. Consumers rely on benchmarks because we can't test drive the latest GPUs to find out which one we want to buy. If a retail card isn't performing to its benchmark, that's very bad for the consumer.

    • Prion
    • 9 years ago

    I’m kind of glad that competing companies are willing to use the free press to attempt to stir up a poopstorm. It makes my day more entertaining, at least.

    • FuturePastNow
    • 9 years ago

    Most likely it means there’s a noticeable variability in the quality of GPUs shipping to consumers. Press samples got known good (or perhaps I should say ‘best’) GPUs, and some consumers will surely get those, too. But many won’t, and we have no way of knowing how widespread this is or what any given card will ship with. And Nvidia either knew or bet that the “won’t” in this case was likely to be found in two semi-random retail cards.

    As Scott notes:

    [quote<]I've already mentioned that the 290X's fairly generous card-to-card variance isn't a good fit for the realm of high-end video cards, where performance differences of less than 10% can command a premium of $150 or more. Bragging rights aren't cheap, folks.[/quote<] Dishonest? Yeah, a little. Press samples have been cherry-picked since the dawn of time, but usually there isn't as much variance among products sold as the same model at the same price. [i<]Usually[/i<] in the video card industry, if two GPUs perform 10% apart, the slower one would be binned into a cheaper product. Or the better one binned into a more expensive product, whichever.

    • jihadjoe
    • 9 years ago

    Maybe they heard it on the [url=https://techreport.com/review/25659/the-tr-podcast-145-the-mailbag-and-top-tier-graphics<]podcast[/url<]? Or perhaps they saw TR had an [url=https://techreport.com/news/25609/updated-retail-radeon-r9-290x-cards-may-be-slower-than-press-samples<]article[/url<] where they ask their 290X owning readers to chime in with their experiences... I'm pretty sure it's somebody's job at Nvidia to read what the press says about their products.

    • cosminmcm
    • 9 years ago

    Great prices you have in the US. I paid more than $1000 for three 3550s and one 3560 from ebay.co.uk (a year and a half ago). The switches are a must, as they can’t be fully emulated, but GNS3 is just great when at work or on the go. Good luck in your studies; I think the beginning is the most interesting part. You start to lose the enthusiasm as you get older, and learning isn’t that attractive anymore.

    • Antimatter
    • 9 years ago

    The third-party coolers can’t come fast enough. It’s so unfortunate for AMD; if they had just included a better cooler, the 290(X) cards would not be dogged by as much controversy.

    • Deanjo
    • 9 years ago

    Because AMD couldn’t afford to purchase the cards for them.

    • chuckula
    • 9 years ago

    Did you have any evidence that the retail cards you received had been used before (either by Nvidia or even by a regular customer who returned them)?

    Is there any evidence that Nvidia did anything other than merely ordering the cards just like any ordinary customer would do?

    Obviously Nvidia is a biased party, but as long as they weren’t actively gaming the system or tampering with the cards, then I trust your results.

    • jihadjoe
    • 9 years ago

    Off-topic:

    Now that you have four 290Xs, please test quad-crossfire! You can even say “Thanks Nvidia” on the report. =)

    • Damage
    • 9 years ago

    Yeah, I’m pretty sure I wasn’t the only one who got that offer, but that’s up to others to disclose. I knew the moment the offer was made that I would be telling everybody up front where the 290X retail cards came from. That’s the best I can do here, just be up front with everyone. We may try crowd-funding these little projects in the future.

    • GeneS
    • 9 years ago

    I see his point. I think any journalist or politician would agree that it does not and cannot pass the ‘smell test’

    • landsome
    • 9 years ago

    That’s a good thing in general, but a bad thing in this case.

    It seems OK when you review a product (although repasting and retesting is OK too, as long as the product is also tested in its original form – you just give the reader more info, and she is still free to choose depending on how risk averse she is).

    But not OK when you “review” a problem, as in this case. You might simply miss a plausible explanation. In the Computerbase.de test, repasting increased the 290 speed by 17-22 MHz, which in that case explained the (admittedly small) variation from the press 290.

    • Pwnstar
    • 9 years ago

    Triumph, is that you? [url<]http://www.youtube.com/watch?v=r5g_gs6nnyo[/url<]

    • anotherengineer
    • 9 years ago

    “if both Nvidia and AMD are guilty”

    Guilty of what?

    The AIB partners are making the cards, and although they are following a reference design by AMD, it does not mean that every single component used on the cards is exactly the same as what was used on the AMD review cards.

    • WATERCHEMIST
    • 9 years ago

    They should pull the recommendation immediately.

    • derFunkenstein
    • 9 years ago

    If they’d been shipped directly from nVidia to TR I’d be with you, but they weren’t. nVidia knew the problem was SO WIDESPREAD that it most likely affects every card to a degree so they had a 100% chance of getting what they wanted. AMD discredited themselves, IMO.

    • Milo Burke
    • 9 years ago

    I was reasonably sure this issue was overblown. After reading this article, I see that it’s a real problem.

    I find myself increasingly wary of the sensationalism of the other news vendors, and more loyal to the genuine fact-finding of Tech Report.

    • Scrotos
    • 9 years ago

    I assume they were calling all their reviewer contacts and making the same offer. Damage should contact his reviewer buddies for confirmation.

    And I don’t see what the big deal is. If Nvidia buys retail ATI cards shipped direct from Newegg, it’s not like they are tampering with anything. It’s an opportunity, and Nvidia is savvy enough to take it.

    That’s marketing. That’s business.

    • anotherengineer
    • 9 years ago

    Because he wanted some free Christmas gifts 😉

    Who wouldn’t want two new R9 290X cards for free??

    • hieu.noob
    • 9 years ago

    Except waiting for retail means being late on launching the review, which probably equates to lost revenue. It’s a slippery slope.

    • derFunkenstein
    • 9 years ago

    If it were as easy as saying “avoid all the HIS ones and you’re basically fine,” then great, we could probably move on. But it’s not that easy, because the article has a sample of two retail cards: enough to say “yes, this is most likely a possibility,” but not enough to say “this model is OK and this one is not.” And in every case, the first press sample was the fastest.

    • moose17145
    • 9 years ago

    The home lab is complete for now. I built it just as a study aid for the CCNA. CCNA studies are currently progressing (at a slow pace, as it’s been a hectic couple of months, but progressing nonetheless), and I am currently halfway through the ICND1 book for the ICND 100-101 test.

    I am rather enjoying having the physical switches and routers so far. Packet Tracer and GNS3 are great tools, but I seem to learn and retain the knowledge better having actual hands-on routers I can physically touch and remote into. I know that from the command-line standpoint it SHOULDN’T matter… but for some reason it does with me. Maybe it’s just because I enjoy setting up the routers and switches and being able to physically see them working, doing what I told them to do, on actual hardware rather than just a simulation. Packet Tracer and GNS3 just feel so artificial that it kinda takes the fun out of configuring routers and switches for me.

    Either way, I do not regret the money spent (just under 500 dollars in total on eBay) on the home lab. I am enjoying having it quite a lot, and as a study aid it definitely seems to be doing its job. A bit overkill for the CCNA? Perhaps, but I have something I ENJOY using, which makes a load of difference.

    • maxxcool
    • 9 years ago

    I’d like to see if the Nvidia cards do the same thing. If they don’t… then pull your recommendation on the 290X and tell AMD you’re telling your readers not to buy into bait-and-switch bullcrap.

    They will never do it again…

    If both Nvidia and AMD are guilty, well, then we’ll need a “retail review” after the “pre-release” review… and in the pre-release review, be sure to note that these are “golden sample” cards… like when AMD and Intel would only send the best-overclocking CPU they could find.

    • swiffer
    • 9 years ago

    The thermal paste application would be my first thought as well.

    Source: I’ve replaced the stock thermal paste on many Sapphire cards from the HD 3870 to the 6970 and noticed immediate temperature and audible fan speed drops in each instance. In addition to the firmware differences, HIS may be using either a cheaper TIM, or one with different physical properties (longer lifespan, less efficient).

    • Goty
    • 9 years ago

    Ahhhh, that explains it. You’re cherry picking your data. Good one, didn’t actually see that coming. Now, if you actually pick two comparable data points, you might get a meaningful result.

    • Arclight
    • 9 years ago

    What a mess…

    • Goty
    • 9 years ago

    Only if you can explain to me how a 6% difference somehow magically becomes 15%.

    *EDIT* Nevermind, I figured it out. You intentionally compare the numbers where the HIS sample is running its fan at a significantly lower RPM. Good one, there.

    • anotherengineer
    • 9 years ago

    Agreed.

    However, it just goes to show that if AMD does not explicitly tell the AIB partners to use only “this” thermal paste, applied in “this” amount, in “this” way, there will be variances from maker to maker.

    Edit: a reason why I used to stick to ATI-branded cards when they existed; they ‘seemed’ to be better than the AIB cards for whatever reason.

    • moose17145
    • 9 years ago

    [quote<]Messing around with the clocks (especially RAM) makes a lot of difference.[/quote<]

    Yea, I have noticed that. I spent a lot of time trying to tweak that dang 7970... it just acts like it's a low-hashrate card, and nothing I do seems to let me break 600 khash/sec on it... a bit of investigation on the internets later revealed that I wasn't the only one having issues with that card pushing low hash rates.

    The 290s seem to be doing fantastic at mining, though, once you unlock their fan speeds. When I left the fan at its stock 40% max fan speed (and also stock clocks, for that matter), the card was only pushing about 400 khash because it was hitting its 95-degree thermal limit and throttling itself pretty badly. Unlocking the fan sure made a world of difference! At stock speeds it was rarely spinning above 50% and was holding the stock clocks rock steady. Amazing what a difference that extra 10% fan speed made! Seems like 1GHz core and 1.5GHz memory is about the sweet spot for the 290s for mining.

    • swaaye
    • 9 years ago

    It’s also entertaining that AMD was somewhat evasive on the press sample vs. retail firmware front. As if they don’t know exactly what their own firmware is doing.

    • chuckula
    • 9 years ago

    I did, that’s why I downthumbed your post.

    Go read the article first, then explain why a 15% performance drop in Battlefield 4 is no big deal.

    • dpaus
    • 9 years ago

    [quote<]Then something funny happened. We got a call from the folks at Nvidia offering to purchase a couple of retail R9 290X cards for us to test.[/quote<]

    Yes, that is funny. Damn funny. How would anyone at Nvidia know you weren't able to purchase the cards yourself? Who told them? If nobody told them, why would they phone you out of the blue to offer to buy some for you? And why would you accept anything from any company for the sole purpose of discrediting their competitor?

    Damn funny.

    • chuckula
    • 9 years ago

    Battlefield 4:
    Review sample: 37.5 FPS
    Retail product: 32.6 FPS
    Result: Review sample is 15% faster than the retail part. Sorry, but for this level of product that is *way* too much of a variance to be considered acceptable.

    Please read the article next time before you post accusations that TR is either biased or incompetent. The only person blowing things out of proportion is you in overly minimizing a real issue.
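    For reference, the 15% figure is just the relative difference between the two FPS numbers quoted above:

        review_fps, retail_fps = 37.5, 32.6            # the Battlefield 4 figures quoted above
        print(f"{(review_fps / retail_fps - 1) * 100:.1f}% faster")   # prints "15.0% faster"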

    • Goty
    • 9 years ago

    Read my latest post. It’s much less of an issue than the article would lead you to believe.

    • Goty
    • 9 years ago

    So, AMD cherry-picked their review samples, but one of said two review samples is almost an exact match to the two retail cards. So what you’re actually saying is 50% of AMD’s review cards might be faster than retail cards with an error of +/- 50%.

    You know what they say about small number statistics.

    The responsible way to have concluded this article would have been to mention that yes, there is variance between cards and that, yes, consumers should be aware of this fact. Other than that, the conclusions come off as overly accusatory when you can’t make any sort of concrete claim as to the superiority of the press sample cards.

    Seriously, there is an average of less than 1% clockspeed difference between the 2nd 290X press sample and the HIS 290X, and a 0.081% difference in actual in-game performance between the two. Comparing to the faster sample, the differences are 5.3% in clockspeed and 4.5% in actual performance. Where do you get your “5-10%” number in the conclusion? If I were in school writing a lab report, such a conclusion would have led to a failing grade for misrepresenting the data.

    • cosminmcm
    • 9 years ago

    So how is the preparation for the CCIE lab working out for you? Hurry up; from June we’ll get version 5, and we don’t want that, do we? 😛

    • jjj
    • 9 years ago

    So at this point, I guess, the entire new line should be tested against retail cards.
    And maybe future GPU reviews should include clocks and fan speeds so we can more easily spot discrepancies.

    • Firestarter
    • 9 years ago

    ~670 khash/s. Messing around with the clocks (especially RAM) makes a lot of difference.

    • WaltC
    • 9 years ago

    The inclusion of a “result” from a *single* 780 Ti *sample* in this article is fairly disturbing, as it would seem to have no basis for inclusion at all. The article purports to examine minute differences between R9 retail cards but includes a result for only *one* 780 Ti card, and a “sample” at that. That’s the “stuff I didn’t expect”: to see a single 780 Ti *sample* (furnished by Nvidia?) in an article comparing several off-the-shelf R9 cards. It’s rare that I see an article so thoroughly grasp at straws, *seemingly* just to try to diminish the huge success AMD has on its hands with these products…!

    [quote<]Also, in the "stuff you didn't expect" department, notice that the blower RPM for the GeForce GTX 780 Ti is higher than for any of the 290X cards, even though the 780 Ti is much quieter under load than the R9 290X. Nvidia's blower appears to have a slightly smaller diameter, but I'm impressed that it runs at substantially higher RPM and produces less noise[/quote<]

    Yea...;) That's the other thing I didn't expect to see: [i<]nary a mention[/i<] of the $150 difference in MSRP between the 290X and the 780 Ti, if that still exists. I'm actually more impressed by the price differential...! Different strokes, I guess.

    As far as the 780 Ti result goes (notice the lack of a plural): unless TR was comparing it with 4-5 other off-the-shelf retail "samples" of 780 Ti products, you might as well toss out the 780 Ti result entirely, as it would seem irrelevant to everything discussed in the article. Eh? Test one R9 against itself and you'll get extremely consistent results, just as consistent as testing only one 780 Ti "sample" against itself, which is what this article does (for some strange reason). This article leaves some fundamental questions unanswered, such that I'm now wondering what entity it was that "suggested" TR write an article like this...:) (No wonder AMD is silent. Speechless, likely; the company is being polite, I suppose, in waiting for you to discover the glaring inconsistencies here before they have to come along and point them out.)

    IMO, if you artificially hobble these cards by intentionally setting max fan values of less than 100%, you'll get back unpredictable results, because almost every system is in some way different from others in terms of airflow, case temp, and other thermal properties. No two cards are 100% identical, either, in terms of ASIC yield, fan, cooling, etc. Measuring slight differences of 2-3% between like cards would therefore also seem irrelevant. Max fan speed should always be set @ 100% in a comparative card review because, just as moose17145 points out, setting the max value at 100% does not mean you're going to see 100% spin-up of the fan at all! (He is seeing far less than 100% fan even though that's the max value he's set.) It wasn't clear in the article whether the fan maximums were set to 100% on all of the cards; sorry if the info is there but I missed it somehow.

    • chuckula
    • 9 years ago

    Yes, because it was an issue that has come up and TR went out of its way to put these cards to the test in an independent manner.

    Now, if this turned out to be a non-issue I wouldn’t have been surprised if the published article was shorter, but I’m sure TR would publish it all the same.

    • clone
    • 9 years ago

    I’m hoping for all the bad AMD news that can be had, anything to lower the price of a full R9 290X to under $500 would be wonderful and much appreciated.

    more pooping on AMD please.

    • moose17145
    • 9 years ago

    I have noticed that being able to undervolt them for mining seems to be a trait of many of the 7950s out there. May I ask what your hash rates are for that 7950? My 290 is pushing just over 875 khash/sec at its current settings.

    I also have a 7970 GHz Edition card, but I have to fight tooth and nail for that card to push anything above 550 khash/sec. It works great for gaming, not so much for mining. Thinking I might just sell that one on eBay.

    Also, chuckula, what the heck does everyone on this site have against cryptocurrency mining? If you can do it and make money, then why not? I rent a place and electricity is included in the rent, so I pay the same flat rate whether I mine or not. So if my electricity is free, then why not?

    The main point I was getting at was that the card seems to be able to handle its stock, and higher-than-stock, clocks just fine if you unlock the fan some to allow it to cool itself properly. I agree the reference cooler leaves a bit to be desired compared to the competition, but it would appear that the Hawaii silicon DOES have a bit of headroom when given proper cooling. So when third-party solutions come around, I wouldn’t be shocked in the least to see hot-clocked variants with lower than 95-degree target temperatures.

    • clone
    • 9 years ago

    highly doubtful.

    • Goty
    • 9 years ago

    You really think this would have been published had there been no differences?

    • NarwhaleAu
    • 9 years ago

    No company I know that wants to stay in business deliberately releases a poor product in the hope that they can then release a better product and capture the untapped market. Your entire post is an example of the negative publicity and public sentiment that not releasing a solid product can generate. Far more likely is that they had a product that they stretched beyond its design capabilities in order to try to win the performance crown. At a lower performance point, the cooler is more than adequate.

    To your point, though, it’s not poorly cooled (the cooler is removing a lot of heat!). It is sub-optimally cooled. It’s cooled enough to reset the high-end video card space, so it’s not like the card is a flop. They purposely made a card to hit a price point – and at the price point they released, we don’t get an awesome cooler. What we get is a card that, on the one hand, is noisy, hot, and has stability issues, but on the other hand made the Titan obsolete overnight and forced Nvidia to drop prices on their high-end card by hundreds of dollars.

    Once they bed their silicon down, reduce costs, improve drivers etc, they’ll have the margin headroom to release a GHz version. That will be the signal for me that it’s time to buy.

    The good news for the rest of us is, all this controversy may reduce demand, driving down prices even further. 😀

    • Ratchet
    • 9 years ago

    Not to be “that guy” (again) but have there ever been any similar “retail” tests done on Nvidia cards or even other types of industry supplied review samples like CPUs, motherboards, and memory?

    • Firestarter
    • 9 years ago

    my HD7950 needs significantly less voltage when mining as opposed to gaming

    • chuckula
    • 9 years ago

    Cause: [quote<]The card seems to be perfectly stable and is used to mine litecoins[/quote<] Effect: [quote<]Edit: I notice the downvotes are coming... [/quote<] 'nuff said.

    • Deanjo
    • 9 years ago

    The more AMD relies on pushing their designs close to their limits with little headroom, the more you will see discrepancies like this.

    • chuckula
    • 9 years ago

    That’s a very good thing. A consumer end-user should be under absolutely no responsibility to break down & rebuild a video card just to get the level of performance that is stated on the box and ostensibly demonstrated by independent review sites like TR.

    If you want to go crazy overclocking beyond spec, then feel free to go exotic* but that should never be a prerequisite for getting what’s promised on the box.

    * I’ve delidded a Haswell so I’ve done that too. At the same time, I also tested the chip out at stock speeds and it worked as advertised even with the crappy retail cooler.

    • Damage
    • 9 years ago

    None of them were stripped down–for this very reason. All of my results come with the original TIM application intact.

    • puppetworx
    • 9 years ago

    I remember someone over at PCPer pointing out the TIM problem in this equation; that is, most reviewers strip down the review cards they get for photographs (especially of the shiny new silicon), and therefore they all must apply their own thermal paste (usually AS5 or another aftermarket TIM).

    TR’s original review didn’t include any teardown pictures, but it would be nice to know if the card was ever stripped down.

    • puppetworx
    • 9 years ago

    AMD’s silence on this is shameful. It’s been some weeks since the accusations started, and now here it is in black and white. AMD must have known there were differences; it does not speak well for their trustworthiness to know that they willingly misrepresented their products to customers.

    • tviceman
    • 9 years ago

    Funny how people were complaining that Nvidia’s boost was akin to a GPU lottery when it was first introduced. At least current Nvidia cards guarantee a certain level of performance out of the box and don’t require headphones to enjoy. They learned their lesson with GF100.

    • esterhasz
    • 9 years ago

    Yeah, a German site (computerbase.de) changed paste on the Sapphire model and the delta with the reference card went down to 1%. Seriously, though, AMD really went to the hilt on this one.

    • anotherengineer
    • 9 years ago

    So many factors to consider.

    The thermal paste could be another item since clocks are temperature dependent.

    “The GPU in the press sample 1 card is obviously a higher quality piece of silicon; it runs at higher frequencies with lower average and median voltages without instability. ”

    AIB partners could have even better silicon, but there are a ton of components on a board (inductors, caps, resistors, even the temperature sensors themselves, etc.), and these all have tolerances and ranges.

    The Sapphire seemed to be very close to the one review sample also……..

    Even though there is a difference, how much of one should be a concern? If a +/- 3% tolerance is given, that would work out to about 30MHz at a 1GHz clock.

    Now toss these in a case with poor airflow, in a room at 25C, and they could end up running at 700MHz.

    • moose17145
    • 9 years ago

    You know how you solve this issue? Let the fan spin up faster than 40%, or whatever crazy low percentage it defaults to, if it needs to!

    I currently have an XFX R9 290 (non-X version) that is overclocked:
    -Core clock set to 1,000 MHz
    -Memory Clock set to 1,500 MHz
    -Max fan speed is set to 100%
    -Target GPU temperature lowered to 85 Degrees
    -Power Limit set to +30%

    The card seems to be perfectly stable and is used to mine litecoins, so it is under full load 100% of the time. I rarely see the fan spin up beyond 65%, and it holds the clocks right at 1GHz core and 1.5GHz memory. I honestly don’t notice the card being nearly as loud as everyone says it is. Granted, I do have a stack of three Cisco Catalyst 2950 switches and three Cisco 2821 routers right next to my computer as well… so anything is gonna be quiet compared to those…

    Edit: I notice the downvotes are coming… but the fact is, these things do seem to have some headroom on the clocks if you give the card the ability to cool itself properly. That being said, yes, I agree AMD should have spent some more time on their coolers. It would have been nice to see a proper competitor to that premium blower that NV is putting on their cards.

    • superjawes
    • 9 years ago

    Thanks for that clarification. I still find it strange that they don’t have an “investigating” statement, but I don’t want to criticize them for not communicating if they are in some way.

    • Damage
    • 9 years ago

    Don’t get the wrong impression. AMD PR has been communicating with us, but they’ve so far chosen not to make a statement about what happened with the evidently superior press samples.

    • superjawes
    • 9 years ago

    The silence is surprising because this was reported on a few weeks ago. It’s not like they haven’t had time to respond somehow. Heck, they could just do PR hocus pocus and say “We are aware of the reports and are investigating.”

    • derFunkenstein
    • 9 years ago

    SPOILER ALERT

    • derFunkenstein
    • 9 years ago

    Probably. I just wanted to use the Dumb and Dumber quote.

    • derFunkenstein
    • 9 years ago

    The “why” is very easy – there are some chips that could tolerate lower voltages and some that needed a boost. To get the best performance possible, they sent review cards with the aggressively low voltages on silicon they knew could handle it. To get higher yields, they cranked it up a bit and it had a negative effect on performance.

    And they had to know the effect this would have. They knew what they were doing all along for sure. To say otherwise implies such stupidity that they never would have come up with this silicon in the first place.
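    A textbook approximation shows why a small voltage bump costs clock speed under a fixed power cap: dynamic power scales roughly with V² times frequency. The constants and voltages below are made up for illustration, not AMD’s actual tuning data:

        # Approximate dynamic power: P ≈ k * V^2 * f. With a fixed power budget,
        # a die that needs more voltage can only sustain a lower clock.
        def sustained_mhz(power_cap_w, core_voltage, k=0.2):
            # k lumps capacitance and switching activity into one made-up constant
            return power_cap_w / (k * core_voltage ** 2)

        budget_w = 208.0  # arbitrary core power budget for illustration
        print(round(sustained_mhz(budget_w, 1.02)))  # better die at lower voltage -> ~1000 MHz
        print(round(sustained_mhz(budget_w, 1.10)))  # leakier die at higher voltage -> ~860 MHz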

    • chuckula
    • 9 years ago

    This article actually came to the exact opposite conclusion of what I expected… wow! I was under the impression we would see a debunking of claims that there were large variations in card performance and that the initial reports were just due to minor issues with fan speed that were corrected with early drivers. This report along with all the other information out on the web seems to put a big hole through that theory though.

    I think the real smoking gun is the firmware swap that Legit Reviews found where the review cards don’t use the same firmware as the retail cards, and the retail cards showed a big performance boost when reflashed.

    Now, you have to ask *why* would there be different versions of firmware floating around out there? Surely AMD isn’t trying to make its retail cards perform worse on purpose right?

    Well, think about the review process: A site like TR gets a card in for review with a relatively short (2 weeks… maybe?) window before the NDA lifts and everybody needs to get a review posted to stay relevant. During that time, the card is being benchmarked feverishly but then…. what happens? The card may get returned to the manufacturer or if TR gets to keep it, then it likely stays on a shelf and only gets pulled out for the occasional followup comparison with other video cards. That’s not exactly prolonged use over the course of months or years.. now is it?

    I’m thinking that the review firmware is showing performance of the cards in the best light with potential sacrifices to long-term board stability that wouldn’t manifest themselves during a typical review process. The retail firmware will keep the cards more stable over the long run, but at the cost of lower performance and greater fan noise to pull it off.

    I’m not overly impressed with AMD’s behavior here, and their wall of silence to TR is making this look less & less like an innocent mistake.

    • Firestarter
    • 9 years ago

    I’m pretty sure there’s been a bit of cherry-picking in press samples of previous generations, but with normal static clock targets instead of “turbo”, “boost” or a hard limit of 95C, it only really mattered in the overclocking tests. This time, it actually matters for [i<]everybody[/i<].

    • jdaven
    • 9 years ago

    With all the cheating as of late, it might be best to test only retail versions of products and not press samples. Also, real-world tests should take precedence over industry benchmark tests.

    • chuckula
    • 9 years ago

    [Ricky Ricardo]Rory!! You got some ‘splainin to do![/Ricky Ricardo]

    • derFunkenstein
    • 9 years ago

    Holy crap, AMD, this is bonkertown. Just when I thought you couldn’t possibly be any dumber, you go and do something like this… and totally redeem yourself!

    • lilbuddhaman
    • 9 years ago

    I just feel like AMD purposely made a poorly cooled card so that they could later do a “GHz Edition” card, as well as let vendors do their various “OC” versions, but still gave reviewers the cream-of-the-crop test cards for the initial release performance crown and whatnot.

    • Wirko
    • 9 years ago

    Ingenious. You actually said “first” without actually saying “first”.

    • Forge
    • 9 years ago

    Finally, an article that breaks the rule! When the title is a question, the answer is no. In this case, though, it’s yes!
