Custom-cooled Radeon R9 290X cards from Asus and XFX reviewed

OK, so these Radeon R9 290X cards have been kicking around for a shamefully long time in Damage Labs without getting a proper review. They were, a few months back, a Very Important Consideration in my list of things to cover.

You see, the first wave of retail R9 290X cards with AMD’s reference coolers tended to run pretty hot, and as a result, they were sometimes slower than the initial press samples. That fact was somewhat scandalous, of course. The folks at AMD told us that it shouldn’t be that way, and they took a two-pronged approach to addressing this problem. First, they asked to take one of the retail 290X cards I’d tested into their own labs for further testing, pledging to get to the bottom of the issue. Second, they pointed to the upcoming release of 290X cards with custom coolers as a reason why this problem shouldn’t matter so much in the near future.

After that, well, we never heard anything definitive back from AMD about the problems with reference-based 290X cards. We asked about it, but AMD didn’t have any findings to share after looking at the retail card we gave them. I got the distinct sense they were hoping that story would just kind of fade away.

Besides, they said, the custom-cooled offerings were on the way. Two of them arrived in Damage Labs, one from Asus and another from XFX, and I began testing them. Somehow, though, I was briefly distracted from that task by higher-priority reviews.

And then, suddenly, none of it mattered.

The crypto-currency mining boom hit hard, and demand for high-end Radeon cards skyrocketed. Supplies of the R9 290X were so tight in North America that Newegg’s auto-pricing algorithm briefly marked up the cards to $900—and that was kind of a good thing, since they were at least in stock. For months, gamers in North America were effectively priced out of the market for most of the R9-series Radeons.

Naturally, I found it hard to prioritize reviewing a couple of video cards that were nearly impossible for PC gamers to acquire at a reasonable price. Plus, I was distracted by other pressing matters, like the astounding Radeon R9 295 X2.

Happily, right about now looks like an appropriate time to revisit these custom-cooled 290X cards. For one reason or another, supplies of high-end Radeon GPUs appear to have recovered. The cards are in stock at retailers like Newegg and selling for reasonably sane amounts, not far off their original suggested list prices. Meanwhile, I’ve used these custom-cooled 290X cards in multiple tests, including the R9 295 X2 review, and I have good things to report. Might as well finally get on with it.

XFX R9 290X Double Dissipation

Let’s be frank: depending on how you look at it, you either have to give XFX some credit or some grief for naming its graphics cards “DD edition.” Because breasts. That’s clearly the imagery the name is meant to evoke, and the twin fans on this cooler certainly are large and full. Ample, one might even say. And you know, there’s very little that a lonely male gamer likes more than a really large pair of… fans.

I almost think the too-clever name is kind of a shame, since this cooler is definitely sexy on its own terms. We first got a look at this basic design on XFX’s R9 280X card, and it was practically overkill atop that mid-sized GPU (although in the best way possible). The folks at XFX almost certainly had the 290X’s beefy Hawaii GPU in mind when they designed this thing.

As you can see, this Double Dissipation cooler is larger than the circuit board beneath it in every way that counts. The result is a video card that’s 11″ long—about a quarter inch longer than a reference 290X—and protrudes fully three-quarters of an inch above the top of the expansion slot area. This video card is one of the tallest ever to pass through Damage Labs, and XFX has used the additional height to house a large heatsink. The fins stick up above the PCB and run the length of the card, allowing for more cooling capacity than would be possible in a conventional dual-slot format. The obvious downside is that this card won’t fit into every case. You’ll want to check for clearance before ordering one of these things.

Yes, the XFX logo on this puppy lights up when the card is powered on. That little touch is purely cosmetic, but I can’t help myself—the eighth-grader somehow still inside my head thinks it looks awesome. Between the light show and the cooling shroud’s distinctive flat finish and rounded corners (surely Apple will sue), XFX’s take on the R9 290X would be a perfect addition to a custom gaming rig with a big case window.

Beyond the bling and beefy cooling, this card follows AMD’s template for the 290X very closely. The ports on that nifty XFX backplate are the same as the reference card’s, with one DisplayPort, one HDMI, and two DL-DVI connectors. The GPU has a 1GHz Boost clock, and it’s paired with 4GB of GDDR5 memory running at 5GT/s on a 512-bit interface, just like on the stock 290X.
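If you’re wondering what that memory spec works out to, the arithmetic is simple enough to show in a quick Python sketch. (This is back-of-the-envelope math, not output from our test tooling.)

```python
# Peak memory bandwidth for the stock R9 290X configuration:
# bandwidth = (bus width in bytes) x (transfer rate)
bus_width_bits = 512        # memory interface width
transfer_rate_gtps = 5.0    # GDDR5 at 5 GT/s

bandwidth_gbps = (bus_width_bits / 8) * transfer_rate_gtps
print(f"Peak memory bandwidth: {bandwidth_gbps:.0f} GB/s")  # 320 GB/s
```

That 320 GB/s result matches AMD’s official spec for the reference 290X.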

Perhaps best of all to gamers looking for a high-end graphics solution, the XFX 290X DD Edition is in stock right now at Newegg for the low, low price of $599.99. That’s 50 bucks above AMD’s original list price for the reference 290X, but a whole lotta water has passed under the bridge since then. From today’s perspective, a 290X card with custom cooling for that price seems like blessed relief.

Asus R9 290X DirectCU II OC

If that XFX card somehow wasn’t quite beastly enough for you, perhaps this entrant from Asus will do the trick. The DirectCU II cooler has a name that makes engineering sense—it refers to the dual 10-mm copper heatpipes that make direct contact with the GPU’s surface—but sounds clumsy enough to make me wish for a return to XFX’s not-so-veiled references to boobies.

If the XFX cooler is practically spilling out of its dual-slot shirt, then this Asus one is showing a bit of nip. The Asus card is 11.5″ long—which is pretty long as these things go—and, at peak, that heatpipe sticks up 1.5″ above the top of the expansion slot cover. Many of today’s best PC cases leave enough clearance for this extra height not to become a problem. However, I suspect this card may not fit into some of the more compact mid-tower cases on the market.

So long as it fits, though, the oversized cooler ought to be a blessing. Asus says this thing has a 30% larger dissipation surface than the reference design, and the picture above pretty much confirms it. Beyond the size, Asus says it has worked to make the cooler more effective by angling its fan blades to send air both downward and outward, in order to take better advantage of the expanded heatsink area.

This card’s PCB is larger than stock, too, and is clearly a custom design. Asus tells us the board has six power phases feeding the GPU, up from five on the reference design, and four phases for the memory I/O PWM, up from two stock. The firm has used concrete alloy chokes to quiet the electronic buzzing noise one might otherwise hear. For what it’s worth, I didn’t notice any buzzing noise coming from either the XFX or Asus 290X cards in regular use.

The two tones of stickers in action. Source: Asus.

Asus doesn’t have a light-up logo on its DirectCU II cooler, and frankly, I think the default all-black aesthetic is pleasing—very Batmobile, which is always a good thing. There is a bit of bling available to those who want it, however, in the form of two sets of custom stickers, one red and one gold, that can be attached to the cooling shroud in order to match the look of the system around it. These aren’t just shiny stickers, either. They’re metallic in look and feel, substantial and thick. If you can get ’em on straight, the card’s bound to look like it came from the factory that way.

Speaking of which, the DirectCU II OC comes from the factory with a higher-than-stock Boost clock of 1050MHz and 4GB of GDDR5 memory running at 5.4GT/s. That’s a 5% faster GPU clock and an 8% faster memory clock than the reference cards, and having the two together ought to translate into a modest-but-consistent performance improvement. The R9 290X DirectCU II OC is in stock at Newegg right now for $619.99, and like the XFX card, it comes with your choice of three “free” games as part of the Never Settle Forever bundle.
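Since I’ve just thrown a couple of percentages at you, here’s where they come from, in the same sort of quick back-of-the-envelope sketch:

```python
# The Asus DirectCU II OC's factory overclock versus reference speeds.
ref_gpu_mhz, oc_gpu_mhz = 1000, 1050    # Boost clocks
ref_mem_gtps, oc_mem_gtps = 5.0, 5.4    # GDDR5 transfer rates

gpu_gain = (oc_gpu_mhz / ref_gpu_mhz - 1) * 100
mem_gain = (oc_mem_gtps / ref_mem_gtps - 1) * 100
oc_bandwidth_gbps = 512 / 8 * oc_mem_gtps   # same 512-bit bus as stock

print(f"GPU clock gain: {gpu_gain:.0f}%")               # 5%
print(f"Memory clock gain: {mem_gain:.0f}%")            # 8%
print(f"Peak bandwidth: {oc_bandwidth_gbps:.1f} GB/s")  # 345.6 GB/s
```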


So do these run slower than the 290X review samples?

In a word, nope. In fact, the heart of this review is right here on this page, in these few words. I tested these cards just like I did the ones in our original article on 290X variance, subjecting them to long periods of sustained loads and tracking clock speeds over time. I was going to plot out all of the results for you, so you could compare the cards visually and such, but the numbers themselves were a serious demotivator. You see, both the XFX card and the Asus maintained constant clock speeds, even during our worst-case thermal workload, MSI Kombustor. The XFX stayed steady at its 1GHz peak frequency, and the Asus one-upped it by staying constant at 1050MHz.
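For the terminally curious, the clock tracking itself boils down to a dead-simple polling loop. Here’s a minimal sketch of the idea, assuming a Linux box with AMD’s amdgpu driver, which flags the active GPU clock with an asterisk in sysfs. Our actual testing used Windows-based logging tools, and the card0 path below is an assumption, but the principle is identical either way.

```python
# Poll the active GPU core clock once a second under a sustained load
# and log it, so any thermal throttling shows up as dips over time.
import time

SCLK_PATH = "/sys/class/drm/card0/device/pp_dpm_sclk"  # assumes card0

def current_sclk_mhz():
    with open(SCLK_PATH) as f:
        for line in f:
            if "*" in line:                 # '*' marks the active state
                return int(line.split(":")[1].strip().rstrip("Mhz *"))
    return None

start = time.time()
while time.time() - start < 600:            # ten minutes of logging
    print(f"{time.time() - start:6.1f}s  {current_sclk_mhz()} MHz")
    time.sleep(1)
```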

This is an outcome so simple I figured I didn’t need to draw you a picture.

Steady speeds mean both of these cards are faster than retail 290X cards equipped with AMD’s reference cooler. They should also be somewhat faster than our original R9 290X review unit in its default “quiet” fan mode. Only in its noisy “uber” fan mode is a reference-cooled 290X able to avoid PowerTune’s temperature-based clock throttling entirely. Both of these custom-cooled cards have no trouble doing so.

That’s very good news for prospective R9 290X owners, in my view. I consider this steady-speed operation the new “normal” for 290X cards generally, and I used the XFX card in my R9 295 X2 review, so the performance numbers you see there ought to be superior to a reference-cooled 290X’s numbers.
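Before we move on, a toy model may help illustrate how temperature-based clock management plays out with two different coolers bolted to the same GPU. To be clear, this captures only the general shape of PowerTune’s behavior; every constant below is invented for the sketch, and AMD’s actual control loop is far more sophisticated.

```python
# Toy PowerTune-style throttling: when the GPU reaches its temperature
# target, the controller sheds clock speed; a stronger cooler never
# gets there and holds the peak clock. All constants are made up.
AMBIENT_C, TEMP_TARGET_C = 25.0, 94.0
PEAK_MHZ, FLOOR_MHZ, STEP_MHZ = 1000, 727, 13
BOARD_POWER_W = 250.0

def simulate(thermal_resistance_c_per_w, seconds=600):
    temp, clock = AMBIENT_C, PEAK_MHZ
    for _ in range(seconds):
        # power scales roughly with clock; temp relaxes toward equilibrium
        power = BOARD_POWER_W * clock / PEAK_MHZ
        temp += 0.1 * (AMBIENT_C + power * thermal_resistance_c_per_w - temp)
        if temp >= TEMP_TARGET_C:
            clock = max(clock - STEP_MHZ, FLOOR_MHZ)   # throttle down
        else:
            clock = min(clock + STEP_MHZ, PEAK_MHZ)    # recover headroom
    return clock, temp

for label, r_c_per_w in (("reference-style", 0.33), ("oversized custom", 0.20)):
    clock, temp = simulate(r_c_per_w)
    print(f"{label} cooler: settles near {clock} MHz at {temp:.0f} C")
```

Run it, and the weaker cooler oscillates in the mid-800MHz range pinned at its temperature target, while the stronger one sits at 1000MHz with thermal headroom to spare, which is the same pattern we saw on the test bench.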

Cooler performance

With that bit of good news in mind, let’s look closer at exactly how these custom coolers perform while running Battlefield 4.

Welp. Both of the cards are nearly whisper-quiet under load, registering fewer decibels on our meter than the reference GeForce GTX 780 Ti—and pretty much embarrassing the reference-cooled HIS 290X. All the while, the custom-cooled 290X cards are able to keep GPU temperatures relatively low, as well. This is a major improvement.

I suppose I should say a few words in defense of the R9 290X’s stock cooler. After all, it has a single blower that exhausts heat from the PC case and ought to be more tolerant of having another card nestled up next to it in the adjacent expansion slot. The reference cooler is shorter than either of these custom coolers, and it doesn’t jut up into the space above the expansion slots. AMD had good reason to make the stock cooler in the form that it took. Still, I suspect most folks will prefer the tradeoffs offered by custom-cooled cards like these.

Power consumption

Probably thanks to their cooler operating temperatures, the Asus and XFX 290X cards draw a little less power under load than the reference-based HIS offering. The Asus card may be a little more efficient thanks to its custom power delivery, as well.

Also notice how similar in power draw the GTX 780 Ti and R9 290X are. Although it fits into the same space as the reference 290X cooler, the GeForce GTX 780 Ti’s cooler manages to keep GPU temperatures lower while making less noise. Chalk up another strike against the 290X reference cooler, I suppose. The counterpoint here is that the GTX 780 Ti GPU is a bigger chip with more surface area, so it doesn’t present the same thermal density problems that the 290X coolers must face.

Battlefield 4 performance

Now that we’ve seen how those custom coolers perform, we could probably end the review immediately. But this is a PC hardware review site, so we’re required by international law to pad this baby out with some unnecessary benchmark results.

Our weapon of choice for that task is BF4, where we’re comparing the various 290X cards and a competitor from the green team, the GeForce GTX 780 Ti. We conducted these tests a while back, with older graphics drivers and an older version of BF4, but they should be sufficient to demonstrate the differences between the various 290X cards.


Interesting. I’ve said that the XFX 290X should be faster than the reference-cooled HIS card while gaming, and I stand by that assessment. We’ve seen the HIS throttle in similar scenarios in ways that impact performance measurably. That obviously didn’t happen here. What we can say with confidence is that the custom-cooled cards are unlikely to be affected by thermal slowdowns at all.

Also, as you can see, the Asus card’s higher core and memory clocks translate into a minor performance advantage over the other two 290X offerings. Add in the goodness of AMD’s Mantle API, and the Asus 290X essentially matches the GeForce GTX 780 Ti in our latency-focused 99th percentile frame time metric, which is a better way of assessing gaming smoothness than average FPS.
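If you’re new around here, the way a 99th-percentile frame time falls out of a frame-time log is easy to demonstrate. Here’s a toy sketch with made-up numbers; our real analysis pipeline is more involved, but the principle is identical.

```python
# The 99th-percentile frame time is the time under which 99% of all
# frames in a test run were rendered, so a handful of slow frames
# drags it up even when the FPS average looks healthy.
import math

def percentile_99(frame_times_ms):
    ordered = sorted(frame_times_ms)
    rank = math.ceil(0.99 * len(ordered))        # nearest-rank method
    return ordered[max(rank - 1, 0)]

# 97 smooth frames at 16.7 ms (60 FPS) plus three 45 ms hitches:
frames = [16.7] * 97 + [45.0] * 3

avg_ms = sum(frames) / len(frames)
print(f"Average frame time: {avg_ms:.1f} ms (~{1000 / avg_ms:.0f} FPS)")
print(f"99th percentile: {percentile_99(frames):.1f} ms")
```

The average barely budges, at 17.5 ms, or roughly 57 FPS, but the 99th-percentile number jumps to 45 ms, which is a much better match for the stutter your eyes would actually see.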

Conclusions

At this point, I should probably be reporting on the overclocking potential of the individual cards or delving deeper into overall comparative performance, because that’s generally what we do here. I think, however, that we’ve covered the basic points that I wanted most urgently to address.

The bottom line is that these custom-cooled Radeon R9 290X cards from XFX and Asus are almost embarrassingly superior to AMD’s reference design. You’ll have to accept that these puppies may require a larger case and an open slot next door. If you can live with that, what you’ll get in return is a cooler, quieter, better-looking graphics card whose performance should be higher because the 290X’s occasional thermal throttling has been banished.

I’m not sure I could choose between the Asus and XFX cards. The Asus is faster and snazzier thanks to a custom board design and high-zoot components, but it costs $20 more. And the XFX card lights up. Once you’re down to the lighting considerations, well, these are matters of taste. Take your pick, folks.

As for the question of how these new-look Radeon R9 290X products compare to the GeForce GTX 780 Ti, well, you can take a gander at the results summary in our R9 295 X2 article to get a sense of 4K gaming performance. (Hint: the GTX 780 Ti is faster, but it also costs $100 more.) I think performance at 4K is a little bit different than what you’ll see at more common resolutions, though. We have additional testing in the works with these cards, including that FCAT stuff I’ve mentioned before. Stay tuned.

You can yell at me about the lack of overclocking results on Twitter.

Comments closed
    • Crackhead Johny
    • 5 years ago

    “Let’s be frank: depending on how you look at it, you either have to give XFX some credit or some grief for naming its graphics cards ‘DD edition.’”

    It could just be that the card is an upgrade.

    Which he spells thusly, u-p-g-r-a-y-e-d-d with two D’s, as he says, “for a double dose of this pimping”

    • ronch
    • 5 years ago

    I wish card manufacturers would employ nicer, cleaner-looking coolers like the one used by XFX here. Compared to the XFX, the Asus cooler looks like it was just taken off the shelf and slapped on Asus’ card. Many cards are unfortunately like this, with coolers that are a tad longer than the boards, exposed board components, etc. I know that it’s a non-issue for most folks, and it was for me too until I saw how refined this XFX here looks. It’s obvious XFX gave their product’s appearance a little more thought. I’d do away with the LEDs though — I don’t like cases that have side panel windows on them and bright lights seeping out from my case’s seams.

    • Prestige Worldwide
    • 5 years ago

    Just curious, what CPU / frequency was used for these benches, and how much ram at what speed?

      • Damage
      • 5 years ago

      Here are the specs for my GPU test rigs:

      [url<]https://techreport.com/review/26279/amd-radeon-r9-295-x2-graphics-card-reviewed/4[/url<]

        • Prestige Worldwide
        • 5 years ago

        Many thanks!

        Edit: I have the same CPU, it’s like you knew I would want to know exactly how it would run on my rig! Above and beyond the call of duty, you have gone 🙂

    • DPete27
    • 5 years ago

    Another confirmation that Asus’ DCU coolers are not ideal for Hawaii cards. They perform well on Nvidia chips because they have a larger surface area. Sapphire’s Toxic cooler is king on the 290/x

    • cegras
    • 5 years ago

    Why does TR never disassemble and look at the cooler? It seems especially relevant in an article like this where the cooler is the main differentiating point.

      • Billstevens
      • 5 years ago

      I believe they answered this a few times. If I recall, they stated that they only get a certain number of testing samples and thus do not want to risk damaging cards by taking them apart, since they may want to keep them for future comparison testing.

    • JustAnEngineer
    • 5 years ago

    Edit:
    Doh!

      • Voldenuit
      • 5 years ago

      BF4 was [url=https://techreport.com/review/26092/custom-cooled-radeon-r9-290x-cards-from-asus-and-xfx-reviewed/4<]tested at 2560x1440[/url<].

    • Voldenuit
    • 5 years ago

    Waitaminute… are both these cards DVI-D only? I don’t see a cross-shaped flange which would indicate a DVI-I on either card.

    While I doubt many ppl would buy a $599 card to run an ancient VGA monitor off it, it might be worth noting.

      • mczak
      • 5 years ago

      The Hawaii chip does not support VGA – AMD apparently omitted the circuitry required for analog signals… According to the plans, other future chips should follow.
      If you want to connect an old-style VGA monitor, you can of course still use an active DP->VGA adapter, and those aren’t all that expensive.

      • sweatshopking
      • 5 years ago

      i have a 770 and run vga off of it….

      • Bensam123
      • 5 years ago

      You can buy a displayport to VGA adapter for $20 if you need to.

    • Bensam123
    • 5 years ago

    Curiously, why would you not review the cards when you guys made a big deal out of it regardless of gamers not being able to afford them? Miners definitely were buying them and needed information like this to make informed purchases (I myself being one of them). The information probably would’ve been more useful considering their demand then, then now.

    However, since then I had to make my own choices as far as mining hardware goes and now am getting bitten by it. After mining for four months, cards start to exhibit the wear and tear you see in video cards after about 2-3 years of normal gaming sessions every day.

    The XFX has one of the worst coolers I’ve seen and is one of the worst video cards out of all the bunch I’ve tried. They’re simply not that good. It also has a manufacturing defect. That little piece of metal that sits in the middle and spins? It’s not always center. While this doesn’t matter with a sticker, the piece of metal weighs a lot more then that sticker. This results in fan wobble, which then eventually causes the fans to fail.

    I’m starting to have 7970 fans (and cards) fail that were otherwise working perfectly fine because of this piece of metal. Every fan that has failed (not always two fans on each card) and ones that are exhibiting signs of earlier failure (noise, low RPMs, increased heat on the motor head) have had a piece of metal that was slightly off and caused the fans to wobble. This is a huge deal, and neither XFX nor reviewers seem to notice it. I’m not the only one that is having problems with these fans failing either. The fans simply aren’t durable enough to spin the little piece of metal on top. Having figured this out, I’ve removed the piece of metal from all other cards (it just sticks on).

    The VRMs on XFX cards are also low quality. They use some sort of cheap VRM that results in power distribution that isn’t always steady. This can cause hashrates to vary, and of course affects the overclockability of the cards themselves.

    I also have cards with the above Asus coolers; mine is a three slot model, and despite what you think, bigger isn’t always better. The three slot Asus coolers, of the same design as the cards reviewed here (they use the same one on pretty much all the cards), don’t cool that much better then other cards. It doesn’t need three slots, and perhaps this is a newer version of the cooler as it’s on a 290x instead of a 280x (which is what I have).

    I can vouch for the VRMs though. Asus cards are solid overclockers and the fans as well as heatsinks are very well constructed. The power they distribute is very clean if you look at GPUz numbers.

    However, that giant piece of metal on the back of the cards hinders heat dissipation. I’ve seen manufacturers (such as Visiontek) just throw them on there and not even connect them to the memory modules. They essentially become giant heat walls. They also take up critical space between cards if you have them seated next to each other (or hanging next to each other if you have a mining rig).

    I would highly recommend Asus cards though. They definitely seem very high quality.

    That said, I’ve also noticed two big issues in a lot of heatsinks being attached to video cards. One is they encase the heatsink almost completely so air can’t move around or through the cards. This is a really big deal. While funneling the air through the heatsinks is good, they take it a step further and just fully encase the damn thing. It’s like entombing a card for death. The three slot Asus card I mentioned suffers from this. Although the above model in the review looks more open, which is good.

    But that’s not nearly as bad as the XFX models. As you can see, they just cover everything with plastic so there is nowhere for the air to escape. Although I have no proof for this as of yet, I also believe this increases strain on the fans pushing the air through, as they have more resistance, resulting in shorter fan life expectancy, regardless of how well it cools. I imagine these cards being horrid for smokers.

    Which leads to the second issue. The heatsinks aren’t far enough off the PCB and there isn’t enough room underneath to properly blow air through them. It just blows air into the board and it stagnates. So you end up in a situation where, regardless of how good the fan is or how high quality the heatsink is (putting aside death spinners, literally), they don’t cool nearly as well as other heatsinks.

    Asus models are much better as I noted compared to XFX, but they still also have this same issue. Asus has more problems with encasing their heatsinks too much, rather then not having enough room under them.

    Surprisingly, Visiontek cards I have actually cool really well. They feel and look super cheap. However because of this, there is less heatsink material underneath the fans so there is more room for air to flow through and the shrouds just cover the top. I don’t know how long the coolers will last, but they also use cheap looking stickers on them so I’m not worried about extra strain on the fans. Really simpler turns out to be a lot better in this case.

    Most of the cards I’m running are 280x’s; however, I also have some 290s, and heatsinks matter a heck of a lot more for them, as do VRMs. The cleaner the power and better the cooling, the better they perform, and I’m not just talking about throttling. It seems there is a definite correlation between spikes in performance and power distribution to the cards, which can be watched through GPUZ. This once again depends a lot on the VRMs on the cards themselves. Better VRMs result in smoother performance, as long as it’s all cooled properly.

    It’s interesting… when you think about video card reviews, you think they’re generally all the same; I did. I thought at one point XFX was one of the best manufacturers to buy from. But if you run a graphics card 24/7 for a few months, issues start to become quite apparent. I will once again reiterate: Asus fans are very solid, as are their heatsinks and power distribution.

    It’s really a shame TR didn’t hop on this. They could’ve written up a formal and professional guide on mining and their experience with cards. This would’ve given them an excuse to stress test their hardware for more data, relate to a fast growing community (miners), gain readership, and actually earn some money while doing it. All of the above is invaluable. Maybe Geoff can look into this, since he does more sporadic long term stuff (like SSDs).

    All of my experiences with mining have been pretty much from the enthusiast community. There aren’t any real professional websites doing write ups on it. Not just ‘this is how you mine, lets write a article to get some views!’. There are tons of new websites starting up to talk about mining, but they’re all really amateur. There still aren’t any sites that focus on benchmarks or offer empirical results on mining hardware. The best you can do is skim forums looking for user posts which can vary wildly from person to person, even among the same hardware. Reddit seems to be one of the best resources for such information.

    It’s weird, it almost feels as though professional websites are intentionally ignoring the mining community and just waiting for it to burn out and disappear rather then having to do more work and actually embrace it. TR hasn’t even done anything regarding hash rates for stuff like the above video cards. We didn’t even get overclocking results, which also matters for mining. Hashrates DO vary from manufacturer to manufacturer, model to model (for the same GPU too). It depends on a lot of things and is extremely dynamic, but capable of being quantified as specific cards (the same model number from the same manufacturer) are usually the same within a smaller percentage of variation.

    I guess you guys have your hands full with other things, but this was and is a pretty big deal with tons of people participating in it. AMD’s sales can definitely attest to this. However, ASICs are starting to emerge and the exodus to things like Scrypt-N and X11 has begun; where the market will be in six months… no one knows.

      • Meadows
      • 5 years ago

      amaze
      wow
      much wall of text

        • Bensam123
        • 5 years ago

        Read it, it’s much more useful then that doge meme.

          • derFunkenstein
          • 5 years ago

          Well it started out “why didn’t you just do it sooner? miners miners miners” so I kind of quit.

            • Bensam123
            • 5 years ago

            Yeah… maybe you should’ve read it, there was more there then miners… I assume we’re both here for the article about GPUs and how good the coolers are. I’m sorry if my area of expertise in which I use GPUs isn’t approved of by you.

            • Meadows
            • 5 years ago

            I refuse to read full articles in the comment section. Additionally, you get -1 for confusing then/than.

            • DancinJack
            • 5 years ago

            He should get -1 for each instance. Honestly, that made me stop reading more than the wall of text.

            • Bensam123
            • 5 years ago

            You read the comments before you read the post and then decided you shouldn’t read the post before even knowing what it’s about?

            ><;

            • Chrispy_
            • 5 years ago

            If the text box for the comment is longer [i<]then[/i<] it is wide, it's time to learn a way of communicating multiple points more effectively. For several ideas, bullet-list them and [i<]than[/i<] pad out anything that needs clarifying.

            • chuckula
            • 5 years ago

            [b<]EYE[/b<] [i<]SEA[/i<] WHAT YOU DID [i<]THEY'RE[/i<]!

            • Bensam123
            • 5 years ago

            I refuse to have meaningful information given to me that can’t be distilled into one paragraph!

            • derFunkenstein
            • 5 years ago

            Your “area of expertise” is a hilarious waste of energy.

            • Bensam123
            • 5 years ago

            mm… so now giving people information on products that fail is a moral dilemma. Are you going to explain to everyone that you shouldn’t read my post because it’ll give kids autism now?

            I do appreciate the righteous attitude though.

        • ptsant
        • 5 years ago

        +1 for use of the doge meme, even though Bensam does have some valid points…

          • Bensam123
          • 5 years ago

          Shhh… don’t mention you actually read the post or people will start hating on you for being a long post sympathizer! XD

      • cynan
      • 5 years ago

      With respect to XFX cards, the previous iteration of the DD coolers as found on Tahiti cards were notoriously sub-par. Are these cooling and reliability issues with XFX of which you speak specific to your 7970s? Because these new DD coolers seem to be a significant improvement.

      Not that I’ve experimented too much, but I’ve always found advertised improved power circuitry to largely be a gimmick, while cooling is paramount when it comes to performance. Likely because, if kept within the thermal envelope for which they were designed, the reference power specs are more than adequate.

        • Bensam123
        • 5 years ago

        I have 7970 and 280x XFX cards. They all have the same problems. I think I had the ‘low end’ 280x’s as they don’t have the metal spinners in the middle. The inherent problem with that doesn’t change. These still have the metal spinners and I’m sure they all aren’t applied exactly on the center.

        Googling XFX and mining brings up all sorts of lovely things.

        You probably wouldn’t notice VRMs normally… but when you game extensively or do anything that’s intensive, it starts to show. Video cards DO draw a decent amount of power. We’re talking close to 300w a pop, depending on the vcore for the chip, the ASIC quality, and of course whatever you end up with for core/mem. VRMs I think matter more then people give it credit for.

        ‘Adequate’ isn’t the same as performs perfectly.

          • Billstevens
          • 5 years ago

          You realize none of these gaming GPUs were ever designed or tested to survive round-the-clock full-load operation…

          I don’t even know if Titans or Tesla professional GPUs were meant to work under that kind of load, though I suspect since they are meant for scientific data crunching they probably have more hearty cooling designs and use better quality components.

          These are gaming cards and were not meant to be used for mining. I am holding back a lot of anger here so I will just stop right there…

            • Bensam123
            • 5 years ago

            Says who, dude? There is no sticker anywhere that says ‘don’t use 24/7 or they don’t work’ or ‘We won’t warranty these cards past X hours of use.’ If by full load you mean ‘load’ too, then I surely hope they’re made to do that, otherwise we’re all screwed.

            Sorta interesting that you’d say these cards are meant purely for gaming… There are all those sick individuals using them for other things like watching movies, rendering graphics, and my god… sitting on the desktop! The heinous sin of all! This is sick, sick I tell you!

            Do you know what a render farm is?

      • anotherengineer
      • 5 years ago

      Since no one else did……..TL;DR

      🙂

        • Bensam123
        • 5 years ago

        Thanks bro… They tried to say that with more words a few posts up, but I couldn’t understand it cause it was too long.

      • Irascible
      • 5 years ago

      Plus four for passion and useful data relevant to the topic at hand. Minus one for *too much* passion for the crypto-bubble. ;^)

      Your critics might have learned something useful had they read it. All the mining stuff almost stopped me from reading it as well.

        • Bensam123
        • 5 years ago

        There isn’t really anything about mining till the last couple paragraphs.

        I’m not sure why mining would make you stop reading it though. It’s happening whether you like it or not. It’s like rebelling against overclockers or something like that.

          • Irascible
          • 5 years ago

          It’s nothing like “rebelling against overclockers”. Overclockers aren’t money changers.

          I plus three’d you because it’s a well written, albeit too long, comment. And you’re right in that the cards are the focus of your reply, not mining. Nevertheless, I find mining distasteful at best. That’s no comment on you personally. That’s my opinion.

          Used car salesmen need to have thick skins as they are viewed prejudicially. Crypto-miners are in the same boat. I’m not saying it’s right. It just is. And it’s happening whether you like it or not.

            • Bensam123
            • 5 years ago

            I don’t think money changing fits… It’s more along the lines of financial transaction processing and minting under heavily controlled conditions.

            The whole reason cryptos work is because miners exist. If they didn’t, the market would freeze or be easily exploited due to a lack of security (confirmations between miners serve as security redundancy for the block chain). I, and other miners, are essentially doing what you pay credit card companies and banks to do.

            I appreciate it though. There seems to be quite a misconception around here about mining, but that’s what happens when people don’t understand a topic and make a superficial opinion based on whatever ‘facts’ they have available.

      • Billstevens
      • 5 years ago

      I browsed some snippets from your wall of text and the answer is a resounding yes. We are waiting for mining to die at least when it comes to wasting perfectly good gaming GPUs.

      If you need a GPU with higher-quality components to survive month-long endurance tests to earn you money, you should have to pay a premium to get that higher-quality card.

      The build and design of these “gaming” GPUs is just fine for what they were intended to do. I don’t want to pay extra for a card to play games because a bunch of god damn miners decide it’s a good idea to see if theirs will catch on fire if they leave them running at full load for half a year.

      You’re barking up the wrong tree for sympathy at a gaming hardware site.

        • Bensam123
        • 5 years ago

        Dude, you can pick up GPUs for 60% of the cost on eBay now. Take that blinding hate out of your eyes and look around awhile. Not just that, I offer you meaningful information that would allow you to make an informed purchase, and all you can do is act like a haughty stuck up…

        YOU PAY A PREMIUM FOR HIGHER QUALITY CARDS! That’s why they don’t all cost the same doofus.

        As far as using GPUs solely for gaming, I think you’re being completely irrational here. It’s just as silly as someone saying ‘you can only use these CPUs for word processing lolz’. Rejecting the idea that GPUs can be used for anything other then gaming is very shortsighted.

        At no point did I look for sympathy, either. You just acted like an arrogant douche to someone that was helping you make a better purchase.

          • Billstevens
          • 5 years ago

          You’re right, there is no disclaimer on a GPU that says don’t run this at full load for 3 months straight, probably because they haven’t had to. But you have to assume that most video card makers are going to stress test and rate their card architectures based on “normal” use cases. These are not enterprise-class cards; they were not designed to be worked this hard and still maintain their stated warranty.

          Now, generally I am fine with the idea that GPUs, made for gaming mind you, are able to be used to do scientific research, folding, and different non-gaming rendering tasks. But the GPU mining craze went way past that point and actually drove gamers right out of the market for their own products…

          Forgive a gaming hardware review site if we don’t want to encourage the use of these cards for an activity that makes it more expensive and difficult for us to buy that card to play games.

            • Voldenuit
            • 5 years ago

            I think both Bensam123 and his detractors have merit in their arguments.

            While I do think that mining places undue stress on the graphics cards beyond what the expected use case would be, it might also be a good way to stress test cards from different vendors to pick out the cream of the crop vs the ones that fail early.

            In that respect, it’s no less legitimate a practice than running Furmark, or TR’s SSD endurance rig, both of which definitely run hardware past normal use scenarios.

            • Bensam123
            • 5 years ago

            Or folding.

            Keep in mind the things I mentioned aren’t even related to the hardware itself failing, like caps exploding or VRMs burning out. It’s related to fan failures and improper cooling, which is pretty basic. If I’m figuring this out, that means something pretty fundamental went wrong here…

            Most of the people replying to me didn’t take time to read my original post and figure out exactly what I’m talking about. Fan failures would definitely happen regardless of how much load you’re putting on it. Making it spin a bit faster obviously would make it burn out faster, but it’s still inevitable because of what exactly is wrong with them (off-sized metal caps on the fans).

            • Bensam123
            • 5 years ago

            Dude… Some people game 12-14 hours a day, sometimes longer. It’s entirely possible to utilize a GPU for extended periods of time, just the same as a CPU. I know I am; I’m someone that leaves the computer on 24/7, for instance… I know there are plenty of people that leave their computers on for doing other tasks, like render work, encoding, pretty much any sort of video editing… not even taking into account server usage (you don’t need to buy Xeons to use a computer as a server).

            Normal use cases include having a fan NOT fail within 4 months. Are you telling me I burned the fan out by having my computer on? Having a fan spin at 100% utilization should NOT burn it out in four months. Where these cards are failing has ABSOLUTELY NOTHING to do with the workload being put on them. There are people that have case fans running 24/7/365. Fans should not simply burn out, nor should they be mounting metal plates on them that are more then likely applied by hand and off center. That’s just dumb.

            Dude, you didn’t leave the market. The prices are back to normal, you can buy cards for 60% on eBay, stop being an irrational hater. Even during the craze you could still buy Nvidia cards, which were priced exactly the same as normal, if you really, really, really needed to.

            Why is mining different from scientific research or folding? Do you even know what mining does? Folding is actually less useful then mining is, and this is putting aside people using their computers for whatever they deem fit, which doesn’t need to fit on your ‘approved’ list of items that you seem to be abiding by and think everyone else should abide by. TR actually goes out of their way to benchmark hardware for folding and scientific research.

            This is a hardware review site. This is NOT a ‘gaming’ hardware review site. Just because they have gaming benchmarks and occasionally talk about games, does not make it exclusively gaming… just the same as they cover other aspects in their benchmarks, like said folding and scientific research.

            Once again, you are being extremely irrational. YOU COULD ALWAYS BUY NVIDIA, and you can now purchase hardware for 60% of original cost! Meaning you can get a 280x for $200 on eBay right now. You can get dual 280x crossfire for $400. This is a gamer’s wet dream, you silly goose, and that’s because of mining!

            • HisDivineOrder
            • 5 years ago

            Mining is more intensive than folding or gaming. Mining is basically a “game” of trying to beat out how much money you spend in power trying to make money before the bubble of whatever new ___coin is in fashion bursts to the point where power costs more. Plus, throw in your hardware failures and whatnot.

            You say that cryptocoin experience is the same as gaming or folding, but I’d argue the majority don’t care about those experiences. I know when I read recently that Gigabyte is going to be testing SOME of their cards for a week doing coins, I thought, “Wut? That’s not really that useful.” But it sounds neat to them and they get to make money doing it, so that’s a win for them. That’s a real marketing point COMBINED with real moneymaking. Why not, right?

            But for the end user to be running large arrays of these cards packed in tighter than sardines for months and months 24/7? You really think that represents the normal workload enough to be even remotely relevant to this site? This site is for users who are building gaming PC’s and enthusiast PC’s.

            I’m surprised these GPU makers haven’t already added some kind of proviso to a licensing agreement that says you’re voiding your warranty if you mine. Once more users start trying to do what you’re suggesting, they very well might do it. Look at your complaints. “Omg, the caps blew up. Omg, the fans started to wobble. Omg, the whole card is unstable.” Yeah. 24/7 for four months at 100%.

            That’s insane. Nobody runs their gaming machine 100% for 24/7 for months and months on end. Especially not with arrays of multiple cards packed together densely so much that heat can’t escape like some miners do.

            The Miner workload is not the gamer workload and it’s not the enthusiast workload. The Miner design ethos for their system is not the gamer design or the enthusiast design. The Miner is trying to make the most out of the least amount spent while milking a given system for as long as they can before they chuck the hardware and replace it to start over again. The enthusiast usually wants their system to last. The gamer wants their system to last, period. There is some overlap between gamers and enthusiasts. There’s very little overlap with the Miner system.

            Also, I could point out to you that I told you and other AMD fanboy apologists that running your card at insanely high temperatures just to get around power leakage was going to lead to early failures when combined with coolers built for several generations ago, but I think you’ve outlined enough money loss on hardware with AMD to make the point far more decisively than I did. I can only imagine what would have happened with those reference boards you loved so much way back…

      • HisDivineOrder
      • 5 years ago

      You shouldn’t be surprised by the reactions of enthusiasts to you. You’re taking a hobby and turning it into something else entirely.

      What you’re doing reminds me of Steve Jobs. Specifically, when he took the Computer club from way back and turned it into a corporation dedicated to the exact opposite of what the early computer club was about.

      In the early days, computer designs were free and open. People were sharing everything freely. They exchanged tips, suggestions, information. Then Steve Jobs showed up, took their early work, and locked everything down while trying to wring as much money as possible from it. His big idea was to take what they were doing as hobbyists and turn it into something he could own, control, dominate, and thus profit from. He made Woz stop contributing.

      That’s you. You’re the enthusiast who’s “thinking different” to wring every dollar out of our hobby as possible. Forgive us if we look down on you. You reek of opportunism and then you show up here wanting everyone to cater to your needs after you’re part of the reason these cards have actually gone up in price instead of going down. You’ve made every enthusiast have less options for upgrades because of the inflation of cost for several months and you don’t think that’s going to cause everyone to look at you differently?

      Just take your profits. Don’t mind the rest of us. We’re just here to have fun in the PC enthusiast park. Not set up a digital strip mine and mine everyone else out by raising prices so high no one else can play in the park… 😉

        • Wild Thing
        • 5 years ago

        Blimey, cue the violins.
        Think people would stop folding out of principle if Folding@home suddenly became profitable?
        To the point that it paid off their video card?
        Pretty sure most AMD gamers that mined on the side and paid off the cost of their card(s) are pretty happy right now.
        Best leave it at that.

    • Meadows
    • 5 years ago

    At what RPM do the breasts spin on the XFX card?

      • dmjifn
      • 5 years ago

      Variable, depending on the level of excitement of the card.

        • Meadows
        • 5 years ago

        Do they get hot?

          • sweatshopking
          • 5 years ago

          you wouldn’t care if they did.

            • Meadows
            • 5 years ago

            I’m not *that* gay.

    • USAFTW
    • 5 years ago

    This is the first review site where I’ve learned what the double-D “DD” cooling on these XFX cards actually refers to. I honestly didn’t get it before.
    Aside from the naming scheme, the XFX looks cooler to my taste, but the Asus has a better PCB.
    Maybe I’ll wait for 20nm cards (quite a long time I suppose), when we get less power hungry cards that can power 4K resolution and cheaper 4K displays to go along with them.

    • anotherengineer
    • 5 years ago

    “Naturally, I found it hard to prioritize reviewing a couple of video cards that were nearly impossible for PC gamers to acquire at a reasonable price. Plus, I was distracted by other pressing matters, like [u<]playing games on my 4k monitor[/u<]" There, edited that for you Damage 😉

    • hoboGeek
    • 5 years ago

    I definitely like the XFX. Cool as can be.
    Now, I have $9, I need a volunteer to sponsor the other $590 in order to buy this puppy.

    • I.S.T.
    • 5 years ago

    [quote<]Interesting. I've said that the XFX 290X should be faster than the reference-cooled HIS card while gaming, and I stand by that assessment. [/quote<] That XFX should be ASUS.

      • Damage
      • 5 years ago

      No, I meant XFX.

        • I.S.T.
        • 5 years ago

        Yeah in retrospect i misread the sentence. Oops.

    • Freon
    • 5 years ago

    Enjoying the single page view.

    $599 is still too salty for me. It doesn’t seem like a clear win against the 780 Ti anyway if I were spending that kind of money.

    • wizpig64
    • 5 years ago

    the xfx one has sold me on looks alone. now if only i had the need for a new GPU 🙁

    EDIT: how easy is it to take apart the shroud? I’m wondering about spraying it with plasti-dip, if that’s not insane.

      • tay
      • 5 years ago

      Yeah I don’t care what the inside of my computer looks like, but the XFX card looks really nice. Simple clean lines, and even the lighting is classy. Unusual for PCs in my experience.

      • Bensam123
      • 5 years ago

      XFX cards usually come with warranty stickers on the screws for the GPU on the back so you can’t remove the heatsink without voiding it. TR didn’t note this, not sure if it’s that way with all the models, but it’s that way for all the XFX’s I own currently.

        • wizpig64
        • 5 years ago

        yup, they’re on this too 🙁

        [url<]https://techreport.com/gallery/26092/-/72055/xfx-card-back-2560[/url<]

        • jpinconline
        • 5 years ago

        I have the ASUS card and it has a warranty void sticker as well. I contacted ASUS and they WILL void your warranty, whereas XFX will not void the warranty in North America as long as you contact them first. I would not have bought my ASUS 290X had I known this before purchasing.

          • Bensam123
          • 5 years ago

          I have Asus 7970s with the triple wide coolers and they don’t have the stickers. The one in the review doesn’t have them either.

            • jpinconline
            • 5 years ago

            My 290x DirectCU II and the pictures on Newegg have stickers over one of the 4 screws mounting the heatsink.

            After doing research, I was able to find that Sapphire, MSI, and XFX do not void warranties as long as no damage is done to the PCB, whereas ASUS confirmed they do void warranties if the sticker is broken.

    • cynan
    • 5 years ago

    Charging more for a slightly faster factory “overclock”, all else being equal, is a good way of parting fools from their money. (And I remember commenting something similar a year ago when the prospect of paying $50+ more for a GHz edition of the HD 7970 was prevalent).

    If you can empirically determine that the beefier power circuitry in the ASUS actually conveys a real-world benefit – which would probably require somewhat intensive overclocking analysis across a few card samples – then perhaps the ASUS is worth a few bucks more.

    If not, then the somewhat cheaper (and apparently at least marginally better cooled) XFX probably deserves the win.

    Edit: And kudos to XFX, as they had some of the worst custom cooling solutions for Tahiti, funbag-reminiscent monikers or not.

      • Billstevens
      • 5 years ago

      It’s 20 bucks… hardly a price gouge, but yeah, not a huge difference between the 2 cards outside of a small overclock which may net you a frame or 2 improvement.

      I believe the take away from this testing is that AMD cooling engineers fail at life.

        • cynan
        • 5 years ago

        Even at the same price, I’d probably take the XFX as the cooler seems to be a tad better. (You can always bump the clock speed up a few notches yourself). The ASUS would only win if I was convinced that the advertised custom power components actually resulted in a real benefit (and was not just marketing).

          • Billstevens
          • 5 years ago

          My current card is a DirectCU II 560 Ti and it’s been a great card. The only real complaint I saw about it was that it came packaged in foam and didn’t have an anti-static plastic baggy, which I admit was odd and probably not smart. Maybe it was anti-static foam…

          I personally liked the aesthetic of the DirectCU cooler design. I’ve mostly owned Asus and EVGA cards, though never their overpriced overclock versions… All of them have worked past the point of me wanting to upgrade.

          Either of these cards sounds like a good choice. May as well save 20 bucks though. Every little bit counts.

      • USAFTW
      • 5 years ago

      Couldn’t agree more with your valid points, sir.

    • StaticFX
    • 5 years ago

    that is one sexy looking card! 🙂
