With the release of the Radeon R9 290 and 290X, AMD upended the high-end graphics market by offering performance competitive with Nvidia's existing products at substantially lower prices. The new Radeons didn't just improve the value proposition, either. The R9 290X captured the overall GPU performance crown, wresting it away from the GeForce GTX 780 and Titan by the slimmest of margins. Such little differences are magnified in the world of high-end graphics, where the spoils—and sales—often go to the victor. After all, if you're forking over something north of 500 bucks for a graphics card, bragging rights are probably involved to some extent.
You can imagine, then, how things went a bit pear-shaped when folks started reporting that Radeon R9 290X cards purchased at retail didn't seem to perform as well as the review units AMD supplied to the press.
Whoops. Sounds bad, doesn't it? How can that be?
Well, from here, things get kind of complicated. Although the retail R9 290-series cards appear to have the same basic hardware and specifications as the review samples, the zillion-dollar question is what happens during everyday operation. You see, like the Turbo Boost mechanism in Intel CPUs, the Radeons' PowerTune algorithm adjusts clock speed dynamically, from moment to moment, in response to current chip temperatures, the GPU workload, and the video card's pre-defined power limits. For one reason or another, folks found that at least some retail R9 290-series cards seemed to operate at lower clock speeds than those initial review units.
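To make the dynamic behavior concrete, here's a minimal sketch of how a PowerTune-style control loop might pick the next clock speed. The function name, step size, and limits are illustrative assumptions, not AMD's actual algorithm; the point is simply that the delivered clock depends on temperature and power headroom from moment to moment.

```python
# Hypothetical sketch of a PowerTune-style dynamic clock loop.
# All names, thresholds, and step sizes here are illustrative
# assumptions, not AMD's actual implementation.

def next_clock(current_mhz, temp_c, power_w,
               max_mhz=1000, min_mhz=727,
               temp_limit_c=95, power_limit_w=250,
               step_mhz=13):
    """Pick the next core clock based on temperature and power headroom."""
    if temp_c >= temp_limit_c or power_w >= power_limit_w:
        # Over a limit: step the clock down to shed heat and power.
        return max(min_mhz, current_mhz - step_mhz)
    # Headroom available: step back up toward the peak clock.
    return min(max_mhz, current_mhz + step_mhz)
```

A loop like this is why cooling matters so much: a card whose blower moves less air runs hotter, spends more time in the step-down branch, and settles at a lower average clock than an otherwise identical card.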
AMD identified one apparent cause of the problem pretty quickly: the blowers on some retail cards weren't spinning as fast as expected, and the reduced cooling capacity resulted in lower clock speeds. This explanation was quite plausible. Heck, we'd already seen how an increase in blower RPM can improve the R9 290X's performance when we switched it into "uber" fan mode during our initial review. One can imagine that different blowers might not respond to increases in voltage quite the same way. If blower RPM were varying substantially from card to card, that might well explain the clock speed differences.
AMD soon issued a fix in the form of a software update. The Catalyst 13.11 beta 9v2 driver sought to equalize blower speeds from card to card by monitoring RPM directly, thus hopefully improving performance on retail cards that seemed to lag behind.
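The difference between the old and new behavior amounts to open-loop versus closed-loop fan control. The sketch below contrasts the two approaches; it's an illustrative model under assumed names and gains, not AMD's driver code.

```python
# Illustrative contrast between open-loop duty-cycle fan control and a
# closed-loop controller that targets a measured RPM. Function names
# and the gain value are assumptions for illustration, not AMD's code.

def duty_cycle_update(duty_pct):
    # Open loop: the driver commands a duty cycle and hopes for the
    # best. The same duty cycle can yield different RPM on different
    # blowers, so cooling (and thus clock speed) varies card to card.
    return duty_pct

def rpm_target_update(duty_pct, measured_rpm, target_rpm, gain=0.001):
    # Closed loop: nudge the duty cycle until the measured RPM matches
    # the target, compensating for per-blower variation.
    error = target_rpm - measured_rpm
    return min(100.0, max(0.0, duty_pct + gain * error))
```

By steering toward a measured RPM rather than a fixed duty cycle, the new driver should make cooling capacity, and therefore sustained clock speed, more consistent from card to card.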
That change seemed sure to help, but as we discussed on our podcast, we had lingering questions. Had blower speeds increased generally, making the R9 290-series cards even louder? Because, you know, they were awfully darn loud before. More importantly, how much of the card-to-card variance remained, even with the new driver? I really wanted to know.
We had motive to test some R9 290X retail cards against our press samples, but we lacked the means. Although you may have heard stories about the glitzy lifestyles of semi-obscure hardware reviewers, the truth is that we can't just order up several $549 graphics cards on a whim. Heck, these days, I can't order lunch on a whim. Loading up a shopping cart at Newegg with 290-series Radeons wasn't really an option.
Then something funny happened. We got a call from the folks at Nvidia offering to purchase a couple of retail R9 290X cards for us to test. The cards would be ordered from Newegg and shipped directly to Damage Labs for our scrutiny. The sample size wouldn't be large, only two cards (with boxes still sealed) pulled at random from Newegg's stock, but apparently the green team was confident enough in the likelihood of differences between our review samples and the retail cards to make the purchase. Since we were interested in exploring the question—and a little amused by the prospect of these fierce competitors buying one another's products—we accepted the offer.
A couple of days later, we took delivery of two Radeon R9 290X cards: one from HIS and the other from Sapphire. Apart from the stickers on the cooling shrouds, the two look to be identical to one another and to our two R9 290X review samples. Almost immediately, I started some initial testing, to see if I could spot any obvious differences between the cards. Little did I know how much work lay ahead.
Our testing methods
Our test systems were configured like so:
|Chipset||Intel X79 Express|
|Memory size||16GB (4 DIMMs)|
|Memory type||DDR3 SDRAM at 1600MHz|
|Memory timings||9-9-9-24 1T|
|Chipset drivers||INF update, Rapid Storage Technology Enterprise 126.96.36.1990|
|Audio||Realtek 188.8.131.5271 drivers|
|System drive||Corsair F240 240GB SATA SSD|
|Power supply||Corsair AX850|
|Graphics card||Driver||Base clock (MHz)||Boost clock (MHz)||Memory clock (MHz)||Memory size (MB)|
|GeForce GTX 780 Ti||GeForce 331.82 beta||876||928||1750||3072|
|Radeon R9 290X sample 1||Catalyst 13.11 beta 8/9v2||-||1000||1250||4096|
|Radeon R9 290X sample 2||Catalyst 13.11 beta 9v2||-||1000||1250||4096|
|HIS Radeon R9 290X||Catalyst 13.11 beta 8/9v2||-||1000||1250||4096|
|Sapphire Radeon R9 290X||Catalyst 13.11 beta 9v2||-||1000||1250||4096|
Thanks to Intel, Corsair, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing.
Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.
In addition to the games, we used the following test applications:
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.