Modern CPUs and GPUs employ turbo mechanisms that adjust frequencies based on factors like temperature and power draw. Although they aspire to higher speeds, these clock-boosting schemes have well-defined base frequencies that guarantee a minimum level of performance. Or most of them do, anyway. AMD defied convention with its Hawaii-based Radeon R9 290 and 290X by publishing only the peak burst frequency for those cards. We know the GPU speeds under ideal conditions, but there's no minimum baseline.
Our reviews of the Radeon R9 290 and 290X illustrate that both cards have trouble maintaining their peak boost frequencies while running games. As the GPUs heat up, the clocks scale back from their peaks, and frame rates drop accordingly. The performance hit only works out to a few FPS on the press samples we've tested. However, Tom's Hardware has seen a bigger performance drop on a couple of retail 290X cards purchased from Newegg.
According to the site, the GPU frequency of one of the cards falls as low as 727MHz—well below its 1GHz boost rate—and stays there. As a result, the retail 290X is notably slower than not only the equivalent 290X press sample, but also the Radeon R9 290 supplied by AMD. Tom's doesn't provide much detail about the behavior of the second retail 290X, but it says that card is also slower than the 290 press sample.
AMD told Tom's Hardware that something is wrong with the retail cards, and I'd tend to agree. If the GPU consistently pulls up 27% short of the advertised boost speed, that ain't right. But since AMD hasn't defined a base frequency for the R9 290X, a 727MHz sustained GPU speed technically isn't out of spec. We've asked AMD to comment on this matter, and we've also inquired about whether the 290X should be running as slowly as 727MHz under normal gaming workloads.
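For the record, that 27% figure comes straight from the clock speeds involved. A quick sanity check of the arithmetic:

```python
# Shortfall of a 290X sustaining 727MHz versus its advertised 1GHz boost clock
boost_mhz = 1000      # advertised peak boost frequency
observed_mhz = 727    # sustained frequency reported by Tom's Hardware

shortfall = (boost_mhz - observed_mhz) / boost_mhz
print(f"{shortfall:.1%}")  # → 27.3%
```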
Chip-to-chip variance is common in semiconductor manufacturing. Any overclocker can tell you that some GPUs are simply more comfortable running at higher speeds—and with lower voltages—than their peers. Those differences typically don't produce large performance deltas between stock-clocked products that share the same model number, but that appears to be what's happening with the Radeon R9 290X. The dynamic PowerTune mechanism in the Hawaii GPU, coupled with AMD's apparent desire to wring every last drop of performance from its new Radeons, seems to have created a situation where performance can vary quite a bit based on the individual characteristics of each chip. Worse, it's possible reviewers may have been seeded with cherry-picked samples capable of maintaining higher speeds than typical retail products.
We're eager to hear more from AMD about this issue, and we'll update this story as new details roll in. We also have a couple more review samples we can test internally. If you bought one of the new, Hawaii-based Radeons, you can monitor the GPU frequency with TechPowerUp's GPU-Z utility. We're curious to hear your results.
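GPU-Z is a Windows tool; for Linux users, a minimal sketch of one way to read the current core clock is to parse the open-source driver's sysfs output. This assumes a kernel that exposes a `pp_dpm_sclk` file, whose lines look like `1: 727Mhz *` with an asterisk marking the active state; the path and availability vary by driver and kernel version:

```python
import re

def parse_active_sclk(text):
    """Return the active core clock (the line marked '*') in MHz,
    from pp_dpm_sclk-style output; None if no line matches."""
    for line in text.splitlines():
        m = re.match(r"\d+:\s+(\d+)\s*Mhz\s*\*", line)
        if m:
            return int(m.group(1))
    return None

# Hypothetical sample mimicking the sysfs format:
sample = "0: 300Mhz\n1: 727Mhz *\n2: 1000Mhz"
print(parse_active_sclk(sample))  # → 727
```

On a real system you would read the text from something like `/sys/class/drm/card0/device/pp_dpm_sclk` (path is an assumption) and poll it while a game runs to see whether the clock sags below the advertised boost.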
Update: AMD has issued a statement on the matter.
A media outlet has uniquely reported instances of AMD Radeon R9 290X boards purchased in retail that have exhibited an uncharacteristic level of performance variance as compared to press samples issued by AMD. We’re working to secure the board(s) in question for further analysis. Boards purchased by other media outlets have not exhibited similar characteristics that we’re aware of. In the meantime, we’ve identified areas where variability can be minimized and are working on a driver update which will minimize this variance. We will provide an update shortly.
Interesting. We should note that TR reader JohnC has encountered clock speeds well below 1GHz on his retail 290X. You can read about his experiences in the forums.