AMD’s Radeon R9 290 graphics card reviewed

Sometimes, in this job, the task is rather complex. Delving deep into the guts of a new GPU architecture, summarizing a chip comprised of billions of transistors, understanding the subtleties of frame dispatch and delivery—these things can be hard to do well. At other times, though, things are actually rather straightforward. Happily, the Radeon R9 290 isn’t a difficult product to understand if you’re familiar with its big brother, the Radeon R9 290X. Heck, what you need to know is this: it’s nearly the same product but a way better deal. Allow me to explain.

The Radeon R9 290

You see, the Radeon R9 290 is almost the same thing as its elder sibling. The 290 shares the same basic card design and cooler, and it’s based on the same brand-new “Hawaii” graphics chip as the R9 290X. AMD knows not everyone is willing to fork over 550 bucks to have one of the fastest graphics cards in the known universe. To help ease the pain a bit, they’ve strategically trimmed the R9 290’s graphics performance and reduced the price accordingly.

| GPU | Boost clock (MHz) | ROP pixels/clock | Texels filtered/clock (int/fp16) | Shader processors | Rasterized triangles/clock | Memory transfer rate (Gbps) | Memory interface width (bits) | Starting price |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Radeon R9 290 | 947 | 64 | 160/80 | 2560 | 4 | 5 | 512 | $399 |
| Radeon R9 290X | 1000 | 64 | 176/88 | 2816 | 4 | 5 | 512 | $549 |

Well, I say “accordingly,” but between you and me, I think they may have been a bit too generous. The table above tells the story. Versus the R9 290X, the 290 has had only two minor adjustments: the peak clock speed is down from 1000MHz to 947MHz, and the number of active compute units on the chip has been reduced from 44 to 40. That means the 290 has a truly enormous amount of shader arithmetic power, but not quite the borderline terrifying capacity of the R9 290X. Both should be more than sufficient.

Now look at the other specs. The 290 retains the Hawaii GPU’s full complement of 64 pixels per clock of ROP throughput, so it has loads of pixel filling and antialiasing power, and it can still rasterize four primitives per clock cycle for high-polygon tessellation goodness. Even better, the R9 290 has the exact same memory config as the 290X, with a 512-bit-wide path to four gigabytes of GDDR5 running at 5 GT/s. Memory bandwidth is often the limiting factor in graphics performance, so this choice is especially notable.

But yeah, AMD somehow dropped the price by $150 compared to the 290X. That’s a mighty big price break for not much change in specs. The 290 stacks up very well against the fastest graphics cards available today.

| Graphics card | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Peak shader arithmetic rate (tflops) | Peak rasterization rate (Gtris/s) | Memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- |
| Radeon HD 5870 | 27 | 68/34 | 2.7 | 0.9 | 154 |
| Radeon HD 6970 | 28 | 85/43 | 2.7 | 1.8 | 176 |
| Radeon HD 7970 | 30 | 118/59 | 3.8 | 1.9 | 264 |
| Radeon R9 280X | 32 | 128/64 | 4.1 | 2.0 | 288 |
| Radeon R9 290 | 61 | 152/86 | 4.8 | 3.8 | 320 |
| Radeon R9 290X | 64 | 176/88 | 5.6 | 4.0 | 320 |
| GeForce GTX 770 | 35 | 139/139 | 3.3 | 4.3 | 224 |
| GeForce GTX 780 | 43 | 173/173 | 4.2 | 3.6 or 4.5 | 288 |
| GeForce GTX Titan | 42 | 196/196 | 4.7 | 4.4 | 288 |

The R9 290 has higher theoretical peaks of ROP throughput, shader flops, and memory bandwidth than a thousand-dollar GeForce Titan. And it’s just not that far from the R9 290X in any of the key graphics rates.
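If you want to check the math behind those peaks yourself, they fall out of simple arithmetic on the specs in the first table. Here’s a quick sketch assuming the GPU sits at its full boost clock—which, as we’ll see in a moment, it often doesn’t:

```python
# Back-of-the-envelope peak rates for the R9 290, derived from the spec table above.
# Assumes the GPU holds its full 947MHz boost clock, which PowerTune rarely sustains.
boost_ghz   = 0.947   # peak boost clock, GHz
rops        = 64      # ROP pixels per clock
texels_int8 = 160     # int8 texels filtered per clock
shaders     = 2560    # stream processors
tris_clock  = 4       # triangles rasterized per clock
mem_bits    = 512     # memory interface width
mem_gtps    = 5.0     # GDDR5 transfer rate, GT/s

print(f"Pixel fill:      {rops * boost_ghz:.0f} Gpixels/s")            # ~61
print(f"Texel filtering: {texels_int8 * boost_ghz:.0f} Gtexels/s")     # ~152 (int8)
print(f"Shader rate:     {shaders * 2 * boost_ghz / 1000:.1f} tflops") # ~4.8 (FMA = 2 flops)
print(f"Rasterization:   {tris_clock * boost_ghz:.1f} Gtris/s")        # ~3.8
print(f"Memory b/w:      {mem_bits / 8 * mem_gtps:.0f} GB/s")          # 320
```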

Of course, the numbers above are theoretical peaks. Especially in the case of Hawaii-based cards, the GPU won’t always be operating at those clock speeds. AMD’s PowerTune algorithm raises and lowers GPU clock speeds dynamically in response to various workloads, and it does so more aggressively than any other GPU we’ve seen before.

Now realize that both the R9 290 and 290X apparently have the same PowerTune limits for power draw (~290W, from what I gather, although AMD has been coy on this front) and temperature (94°C). You can imagine what that means for actual operating clock speeds. In fact, here’s a look at the operating clocks during our short (4-5 minute) warm-up period in Skyrim for power and noise testing.

Once the cards have both heated up, near the end of the span of time in question, the 290X’s clocks drop down to nearly match the R9 290’s. During those moments when the clocks almost match, the only real performance difference between the two is a small amount of texture filtering and shader computing power. Now, this is just one scenario. You will definitely see both of these cards throttle more with different workloads, and changes in ambient conditions will cause GPU speeds to vary, too. Also, as you can see, raising the fan speed limit on the 290X by putting it into “uber” mode keeps its GPU clocks closer to 1GHz. Just know that we’re talking about some pretty small differences between the 290 and the 290X in its stock fan mode. We’ll show you more of the actual performance shortly.
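If you’re wondering what that throttling behavior looks like in the abstract, here’s a deliberately simplified sketch of a PowerTune-style limiter. The 94°C and ~290W figures are the limits cited above; the step size, floor clock, and sample readings are invented purely for illustration and aren’t AMD’s actual algorithm:

```python
# Toy model of a PowerTune-style clock governor. The 94C and ~290W limits come
# from the article; everything else (step size, floor clock, samples) is made up.
TEMP_LIMIT_C  = 94.0
POWER_LIMIT_W = 290.0
BOOST_MHZ     = 947    # R9 290 peak boost clock
FLOOR_MHZ     = 700    # hypothetical minimum clock for this sketch

def next_clock(clock_mhz, temp_c, power_w):
    """Step the clock down when either limit is exceeded, back up otherwise."""
    if temp_c > TEMP_LIMIT_C or power_w > POWER_LIMIT_W:
        return max(FLOOR_MHZ, clock_mhz - 13)
    return min(BOOST_MHZ, clock_mhz + 13)

clock = BOOST_MHZ
for temp, power in [(78, 240), (90, 282), (95, 294), (96, 298), (93, 287)]:
    clock = next_clock(clock, temp, power)
    print(clock, "MHz")
```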

First, let me tell you a little story about the 290’s early life and upbringing, which will help you understand how it became the card it is today.

Back in its formative days—that is, when it arrived in Damage Labs roughly two weeks ago—the R9 290 wasn’t quite the same. Although we didn’t know the price yet, AMD supplied us with info showing the R9 290 positioned against a specific competitor: the GeForce GTX 770, a $399 card from the green team. The 290 was well prepared to take on this foe, more than ready to embarrass the competition with its performance.

Then, just as the 290’s big day approached, the wily green team decided to slash prices rather dramatically in response to the new Radeons. Suddenly, the GTX 770 was out of the 290’s price range, down at $329, and the closest competition was the GeForce GTX 780. The GTX 780 was now priced at $499, but it was faster than the 290 and came bundled with three major games and a $100 discount on Nvidia’s Shield handheld Android game console. One could conceivably make a case for the GTX 780 over the R9 290—and the 290X, for that matter.

AMD’s product team sprang into action, delaying the 290’s release by a week and supplying us with a new driver intended to help the card match up better against the GeForce GTX 780. The one change contained in that driver was an increase in the card’s max fan speed. Originally, the 290 shared the same “40% of max speed” limit as the R9 290X in its default, or “quiet,” mode—and it was a little more subdued than the 290X, according to our decibel meter. The new driver raised the 290’s fan speed limit to 47%. That change alone endowed the 290 with a few percentage points of additional performance, with the obvious tradeoff that it was a little louder while gaming.

So that’s how the R9 290 as you’ll know it came to be. This card is a little more aggressive and noisier than originally expected, but its difficult upbringing hardened it against the knocks it’ll encounter in this cruel world. Totally like Eminem. Or some other rapper.

Any of them, I guess.

Anyhow, AMD tells us the Radeon R9 290 should be available at online stores starting today, thankfully in higher numbers than the so far hard-to-find R9 290X. We’ll have to wait and see how the supply picture meets demand, of course.

Test notes

To generate the performance results you’re about to see, we captured and analyzed the rendering times of every single frame of animation during each test run. For an intro to our frame-time-based testing methods and an explanation of why they’re helpful, you can start here. Please note that, for this review, we’re only reporting results from the FCAT tools developed by Nvidia. We usually also report results from Fraps, since both tools are needed to capture a full picture of animation smoothness. However, testing with both tools can be time-consuming, and our window for work on this review was fairly small. We think sharing just the data from FCAT should suffice for now.
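For readers new to these metrics, here’s a minimal sketch of how the headline numbers are computed from a list of per-frame render times. The frame times below are made up purely for illustration; the real data comes from FCAT captures:

```python
# Toy frame-time data in milliseconds (illustrative only, not from our captures).
frame_times_ms = [14.2, 15.8, 16.1, 33.0, 15.5, 52.4, 16.0, 14.9, 17.3, 61.2]

# Average FPS: total frames divided by total time.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# 99th-percentile frame time (nearest-rank): 99% of frames finished this quickly or faster.
pct99_ms = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]

# Time spent beyond 50 ms: for each slow frame, count only the portion past the threshold.
beyond_50_ms = sum(t - 50.0 for t in frame_times_ms if t > 50.0)

print(f"Average FPS: {avg_fps:.1f}")
print(f"99th-percentile frame time: {pct99_ms:.1f} ms")
print(f"Time beyond 50 ms: {beyond_50_ms:.1f} ms")
```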

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

| Component | Details |
| --- | --- |
| Processor | Core i7-3820 |
| Motherboard | Gigabyte X79-UD3 |
| Chipset | Intel X79 Express |
| Memory size | 16GB (4 DIMMs) |
| Memory type | Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz |
| Memory timings | 9-9-9-24 1T |
| Chipset drivers | INF update 9.2.3.1023, Rapid Storage Technology Enterprise 3.5.1.1009 |
| Audio | Integrated X79/ALC898 with Realtek 6.0.1.6662 drivers |
| Hard drive | OCZ Deneva 2 240GB SATA |
| Power supply | Corsair AX850 |
| OS | Windows 7 Service Pack 1 |

The graphics cards were configured as follows:

| Graphics card | Driver revision | GPU base core clock (MHz) | GPU boost clock (MHz) | Memory clock (MHz) | Memory size (MB) |
| --- | --- | --- | --- | --- | --- |
| GeForce GTX 660 | GeForce 331.40 beta | 980 | 1033 | 1502 | 2048 |
| GeForce GTX 760 | GeForce 331.40 beta | 980 | 1033 | 1502 | 2048 |
| GeForce GTX 770 | GeForce 331.40 beta | 1046 | 1085 | 1753 | 2048 |
| GeForce GTX 780 | GeForce 331.40 beta | 863 | 902 | 1502 | 3072 |
| GeForce GTX Titan | GeForce 331.40 beta | 837 | 876 | 1502 | 6144 |
| Radeon HD 5870 | Catalyst 13.11 beta | 850 | — | 1200 | 2048 |
| Radeon HD 6970 | Catalyst 13.11 beta | 890 | — | 1375 | 2048 |
| Radeon R9 270X | Catalyst 13.11 beta | ? | 1050 | 1400 | 2048 |
| Radeon R9 280X | Catalyst 13.11 beta | ? | 1000 | 1500 | 3072 |
| Radeon R9 290 | Catalyst 13.11 beta 5 | — | 947 | 1250 | 4096 |
| Radeon R9 290X | Catalyst 13.11 beta 5 | — | 1000 | 1250 | 4096 |

Thanks to Intel, Corsair, Gigabyte, and OCZ for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Also, our FCAT video capture and analysis rig has some pretty demanding storage requirements. For it, Corsair has provided four 256GB Neutron SSDs, which we’ve assembled into a RAID 0 array for our primary capture storage device. When that array fills up, we copy the captured videos to our RAID 1 array, comprised of a pair of 4TB Black hard drives provided by WD.
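To give a sense of why that capture array needs to be so fast, here’s a rough back-of-the-envelope calculation. For this sketch we’re assuming uncompressed 2560x1440 capture at 60Hz with 24-bit color; the actual capture format can differ:

```python
# Rough storage math for uncompressed FCAT video capture (assumed format).
width, height, fps = 2560, 1440, 60
bytes_per_pixel    = 3          # 24-bit RGB, no compression

bytes_per_sec = width * height * bytes_per_pixel * fps
print(f"Sustained write rate: {bytes_per_sec / 1e6:.0f} MB/s")      # ~664 MB/s
print(f"Five-minute test run: {bytes_per_sec * 300 / 1e9:.0f} GB")  # ~199 GB
```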

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.


The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Crysis 3


Click through the buttons above to see frame-by-frame results from a single test run for each of the graphics cards. You can see how there are occasional spikes on each of the cards. They tend to happen at the very beginning of each test run and a couple of times later, when I’m exploding dudes with dynamite arrows.



We’re at a nice place with our current selection of games, test scenarios, and the latest video card drivers. The FPS averages and our various frame latency-focused metrics tend to agree about which solution is best. In the case of Crysis 3, there are still some spikes in rendering times for each card, but those appear to be caused by some sort of CPU or system performance limitation. The cards from both brands are all affected similarly by those slowdowns, as our “time beyond 50 ms” metric demonstrates.

At the end of it all, the R9 290 outperforms the GeForce GTX 780 ever so slightly in this test sequence, and it’s just an eyelash behind the R9 290X.

Far Cry 3: Blood Dragon




We’re not moving around the level or doing anything too crazy in this test sequence. We’re mostly just sniping bad guys from a ways off. As a result, all of the cards produce relatively consistent frame times throughout this test. The R9 290 performs very well, rendering every single frame in 33 milliseconds or less. The 290X is measurably faster, but the difference would be very tough to perceive.

GRID 2


This looks like the same Codemasters engine we’ve seen in a string of DiRT games, back for one more round. We decided not to enable the special “forward+” lighting path developed by AMD, since the performance hit is pretty serious, inordinately so on GeForces. Other than that, we have nearly everything cranked to the highest quality level.




Just like the 290X, the R9 290 renders each and every frame in less than 16.7 milliseconds. That’s a perfect 60Hz rate of production. Notice that the FPS average is 50% higher than that. You can’t really get a feel for smoothness with FPS averages alone.

Then again, this game just isn’t much of a test for video cards this fast. Everything from the GeForce GTX 770 on up cranks out frames at a near-perfect 60 FPS rate. Interestingly, though, the latency curves show that none of the cards are producing frames quickly enough to match a 120Hz display. Even the fastest cards are above the required 8.3 milliseconds per frame.
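The frame-time budgets behind those refresh-rate targets are just simple division; here’s the quick conversion for reference:

```python
# Frame-time budget required to saturate a given display refresh rate.
for hz in (60, 120, 144):
    print(f"{hz} Hz -> {1000.0 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms
```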

Tomb Raider





Although the 290’s average of 43 FPS might seem iffy, in truth its performance is stellar, with virtually no latency spikes and every frame produced in under 30 milliseconds. Those nice, smooth latency curves tell the tale. The 290 again comes out slightly ahead of the GeForce GTX 780, although the two are practically equivalent here.

Guild Wars 2




Don’t let those occasional frame time spikes on the faster cards bug you too much. This game has some kind of issue that causes the fastest solutions to run into periodic spikes, but the spikes themselves are fairly small in magnitude. We’d be better off without them, of course, but they’re not easy to notice while playing.

Here’s one game where the GTX 780 outperforms the R9 290. The 290 continues to acquit itself well, running just a smidgen behind the R9 290X.

Power consumption

Please note that our load test isn’t an absolute peak scenario. Instead, we have the cards running a real game, Skyrim, in order to show us power draw with a more typical workload.

The 290’s power draw at idle is a little higher than the 290X’s, possibly due to differences in voltage or chip binning. This small delta isn’t anything to worry about, though.

Under load, well, like I said, the power limits on the 290 and 290X appear to be the same. The 290 is often faster than a GeForce Titan, but it gets there by using more power.

Noise levels and GPU temperatures

AMD’s eleventh-hour decision to raise the 290’s fan speed bought a few percentage points worth of added performance, but it means the 290 is relatively loud for a high-end graphics card.

For what it’s worth, had AMD stuck with the original plan, the 290 would have been quieter than the R9 290X. The 290 originally registered 46.5 dBA on the meter under load.

Conclusions

Ok, you know the drill here. We’ll sum up our performance results and mash ’em up with the latest prices in a couple of our handy scatter plots. As ever, the best values will gravitate toward the top left corner of the plot, while the worst will be near the bottom right. The two legacy Radeons are shown at their introductory prices to make you dance with glee about progress—or, you know, not be impressed, if somehow that’s your reaction.


Remember how I said up front that my task was simple this time around? Here’s why. The R9 290 is just ever so slightly slower than the R9 290X and essentially matches the GeForce GTX 780. Yet it costs $150 less than the 290X and a hundred bucks less than the GTX 780. This card’s value proposition is outstanding. AMD clearly wants to win in this product category, and they’ve practically sacrificed the viability of the R9 290X in order to do so. The R9 290 is virtually the same thing as the 290X at a handsome discount—and it’s a way better value than the GeForce GTX 780, too.

Much has been made of the R9 290X’s relatively high power draw, operating temperatures, and noise levels. Obviously, the R9 290 shares these same characteristics, with a somewhat louder default fan profile. In my view, the only one of these properties that’s really worth fussing over is the noise, since it’s the thing you’ll notice in day-to-day use.

We’re apparently going to have to face this price/performance-versus-acoustics tradeoff for a while, so I spent some quality time with the R9 290 trying to get a handle on what I think of the noise, beyond the readings on the decibel meter. I’ve gotta say, there are some mitigating factors. For one, I like AMD’s choice to stick with a blower that exhausts hot air out of the case rather than going for a triple-fan cooler that doesn’t. I’ve seen those fan-based aftermarket coolers perform poorly in multi-GPU configs, and they often occupy quite a bit more space—maybe even a third expansion slot—in order to work their magic. I’m also not convinced AMD’s cooler is a poor performer and therefore noisy, as some folks seem to think. Remember, it has more heat to remove than any of the coolers on the other cards we tested. Finally, I don’t think this blower’s ~49 dBA reading is the worst of its type. The quality of the sound isn’t grating. Subjectively speaking, there are much more annoying coolers in this territory on the decibel meter. The impressively smooth, gradual ramp of fan speeds up and down in the new PowerTune algorithm helps make the noise less noticeable, too. This ain’t an FX-5800 Ultra, folks.

Before you run off and do some damage to your credit card, I would advise waiting just a few more days. I’ve been working on upgrading our GPU test rigs to Windows 8.1, attaching a 4K monitor, and installing new releases like Battlefield 4 and Arkham Origins. Fancy new game testing at 4K will soon commence. I really need to spend some quality time closely inspecting the behavior of AMD’s new XDMA-based CrossFire mechanism, too. As you might be aware, Nvidia plans to release the GeForce GTX 780 Ti in two days. You can imagine some activity in Damage Labs around that development. If you can sit tight, we’ll know a lot more very soon.

Then again, if you can’t wait and want to pull the trigger on an R9 290 today, I can’t say I’d blame you. It’s a great value, and nothing that happens later this week is likely to change that fact.

Follow me on Twitter like a boss.

Comments closed
    • kamikaziechameleon
    • 6 years ago

    This card is 900 dollars right now. Update that graph.

    • Geonerd
    • 6 years ago

    TR, are you guys allowed to modify these cards?
    (Hey, removing the stock leaf blower is just a minor tweak!)

    Is there an aftermarket cooler currently available that will work with these puppies?

    If so, you know what to do! 🙂

      • Meadows
      • 6 years ago

      Fix something just so it finally performs as advertised? Isn’t that AMD’s job by default?

        • Geonerd
        • 6 years ago

        IMO, ‘broken’ is maybe too strong a word, although I agree that the thermal situation is certainly ‘sub optimal.’ 🙂

        In any case, I AM very curious to see how much extra clock speed these cards can attain with a high quality aftermarket air HS+F combination.

          • sschaem
          • 6 years ago

Put some earplugs in and run the stock card at 100% fan and you will get your answer 🙂

          The card should at least do 1GHZ solid on games. So about 10% faster overall ?

          quick note. at 1ghz full load and a fan cooler, expect HUGE amount of heat to swirl in the case.
          So you will need a solid case with good airflow to move that amount of air. No free lunch…

          Maybe AMD found that this causes more problems in term of system failure, thats why they insist on a exhaust blower ?

            • Pwnstar
            • 6 years ago

            No, the exhaust blower is so you can put a bunch of them side by side for SLI.

            • JustAnEngineer
            • 6 years ago

            It’s “Crossfire” rather than “SLI”, but yes.

            • Pwnstar
            • 6 years ago

            You know what I mean.

    • Gam3sTr
    • 6 years ago

    I have a question, will this be good for editing (high res video, app dev) ? I know cpu does most of the work but I would like to know. I may be saving up to crossfire two of these.

      • Airmantharp
      • 6 years ago

      That will depend quite a bit on exactly what applications you’re using and what types of editing you’re doing. Can you give us more information on your setup, workload, other tools, etc.?

    • NarwhaleAu
    • 6 years ago

    I had told myself that when I could purchase a R9 290X or GTX 780 for around $400 I would do so. The R9 290 seems to fit that, but I’m hesitating.

    I can’t help but think that while this card is GREAT – both from a performance and price point of view (its $50 cheaper and much higher performance than I expected), that it is really going to sing once it is on a 22nm or 20nm process.

    …maybe I have that backwards though, and both heat and noise will increase on the smaller node.

    • Modivated1
    • 6 years ago

    If you want crossfire power just imagine, you can buy 2 R9 290x’s for $1100 or 3 R9 290’s for $1200!

    Which would you buy? …..Power Over kill, even on 4k gaming!!

      • Airmantharp
      • 6 years ago

      Well, given that nabbing a board to support two GPUs is quite a bit cheaper than three- I’d save my pennies with the top bin 290X and put the money into a custom loop!

        • Modivated1
        • 6 years ago

        Yeah plus I never thought of the Power requirements to support 3 of the 290’s, you would definitely need a 1600 watt PSU and I am not ready part with the additional cash for a setup like that.

          • Airmantharp
          • 6 years ago

          I’d think that 1200W would do, given ~300W/card plus say ~200W for the rest of the system comes out to 1100W peak load. Maybe get a 1350W unit to be safe :).

      • chuckula
      • 6 years ago

      [quote<]Which would you buy?[/quote<] That's easy. The ones with complementary earplugs (and a fire extinguisher if you want to go buy 3... yikes).

    • indeego
    • 6 years ago

    99th percentile still matching almost exactly the Average FPS. Good job, TR. Seems like now though that you have to do double the work for zero difference in outcome.

      • Airmantharp
      • 6 years ago

      “Trust, but verify”

      Please continue to deride the one group of people committed to ensuring that these companies actually provide the products that they’re promising through exhaustive and invasive testing. Maybe you’d prefer to read THG instead?

        • indeego
        • 6 years ago

        I don’t see them doing image quality tests anymore. Do you still trust them?
        I don’t see them testing Office applications for CPUs anymore. Do you still trust them?
        etc etc. I could go on and on for tests that used to be important that are abandoned.

        The FPS has basically matched the 99th % almost to the “t” for over a year now. Even when they started there was very little difference between them.

        Great groundbreaking research and all that, but it’s extraneous at this point. At least combine? Or do people really sit there like zombies and compare between the two looking for differences?

        edit: Also the end of that review:

        Armed with that info, we can dispense with the talk about game bundles, rebates, and pricing shenanigans that might shift the value math in favor of one camp or another. Instead, we have a crystal clear recommendation of the GeForce GTX 660 Ti over the Radeon HD 7950 for this winter’s crop of blockbuster games. Perhaps AMD will smooth out some of the rough patches in later driver releases, but the games we’ve tested are already on the market—and Nvidia undeniably delivers the better experience in them, overall. ”

        You know what happened? AMD DID release bundles AND price drops that made them a better buy than equivalent GeForces.
        [url<]https://techreport.com/review/24646/tr-april-2013-system-guide/4[/url<] (They recommend a 7870!) [url<]https://techreport.com/review/24646/tr-april-2013-system-guide/6[/url<] (They recommend a 7950!) A mere three months after that review!

          • Airmantharp
          • 6 years ago

          Yet AMD still hasn’t released a fully-functional frame-pacing beta driver, or a WHQL frame-pacing driver at all? TR shouldn’t ensure that they make good on their promise?

      • superjawes
      • 6 years ago

      Except, you know, Average FPS puts the 780 ahead of the 290 instead of almost even and closes the gap between the 290X and 290X Uber data points.

      Also, this might be an old example but [url=https://techreport.com/review/23981/radeon-hd-7950-vs-geforce-gtx-660-ti-revisited/11<]this[/url<] is an example where the distinction makes a [i<]huuuuuge[/i<] difference in the results.

        • indeego
        • 6 years ago

        Oh yeah that $25 made a [b<]massive difference.[/b<] The prices listed on TR are outdated the very minute they are published. So even that is kinda silly. It also doesn't take into play bundles, rebates, SLI/CF differences, drivers, driver features, TDR/crashes. beta drivers, other features, etc. That one graph is cute and all, but really, a $25 difference for 9 FPS on 99th %ile (likey fixed on AMD's part after the review) is [b<]silly.[/b<]

          • superjawes
          • 6 years ago

          What? I wasn’t talking about prices. I was talking about performance. The price is how you judge the performance.

          And sure, it doesn’t take everything into consideration, but when your metric changes which card is considered the fastest, that’s a big deal.

            • indeego
            • 6 years ago

            You were considering prices, you linked a page where price/performance was the entire point!

            How silly. You *must* consider prices. How can you not? Then we can just compare some arbitrary card to whatever we want, call it a day.

            The graph TR linked doesn’t tell the whole story. In their paragraph after they completely ignore aspects of buying a card that are important factors. Kinda like seeing the forest through the trees. Big picture. These are amazingly complex devices. The FPS/frame issue is closed as important.

            Next.

      • travbrad
      • 6 years ago

      [quote<]99th percentile still matching almost exactly the Average FPS. Good job, TR.[/quote<] I know you were being sarcastic, but sites like TR are one of the reasons the frametime and framerate measurements are so close now. That wasn't always the case when they first started this kind of testing, particularly when it came to multi-GPU setups. We saw/heard anecdotal micro-stuttering issues for many years before TR/PCPer started testing frametimes, but once there were sites actually proving it was a real issue, most of the problems got worked out pretty fast. Nothing gets a big company moving like holding their feet to the fire.

      • Zizy
      • 6 years ago

      Start fraps, record gameplay and throw data in a short program in Matlab. Data in, graphs and numbers out. The very same thing you would want to do if you want just average FPS.
      The only thing remaining is to place everything on the image and write the thing. With consistent layout there are no troubles placing stuff, no extra work here. The text is about as brief as it can be, average FPS wouldnt shorten this by much. They know everyone looks at pretty graphs and reads just the conclusion 🙂

      So, almost no need for ANY extra work, let alone double the work. Just twice as many graphs.

    • chuckula
    • 6 years ago

    I’m getting really tired of these paid Nvidia shills complaining about the ultra-realistic audio output of these amazing Radeon solutions.

    When you are playing Battlefield 4 and you are coming into a hot LZ in the chopper do you know what these cards will give you? I’ll tell you: THE MOST REALISTIC HELICOPTER AUDIO EXPERIENCE POSSIBLE! Now that’s what I call [b<][i<]TRUE[/b<][/i<] audio you paid Nvidia shills! When you are playing Thief WITH MANTLE ENABLED do you know what you'll get when you sneak up right behind that guard? I'll tell you: THE MOST REALISTIC HELICOPTER AUDIO EXPERIENCE POSSIBLE! He'll be so busy looking for the helicopter that he'll never hear you coming! Automatic WIN when using the only officially approved cards for the MANTLE gaming experience!

      • JustAnEngineer
      • 6 years ago

      That’s one of your more creative trolling efforts in a while.

        • chuckula
        • 6 years ago

        Somebody needs to show the Jimbo/Fighterpilots of the world that it takes intelligence to write a proper troll instead of the usual “Hey Beavis, huh huh.. Nvidia/Intel/anybody not AMD sucks! huh huh” level of discourse they are used to.

          • auxy
          • 6 years ago

          Okay, I lol’d. Here’s your begrudging +1.

          • superjawes
          • 6 years ago

          You should be ashamed, chuckula. You totally missed the obvious joke…

          Airmantharp, etc. – “They need to put a blower on this card.”
          – “Huh huh huh…blower…huh huh huh.”

      • the
      • 6 years ago

      Heh. Reminds me of back in the day when I had an old Seagate Barracuda SCSI hard drive that was loud enough it genuinely was confused for a helicopter flying around. You couldn’t hold a normal conversation near that system when I had the case off of it.

    • tomc100
    • 6 years ago

    It’s time that techreport start using Battlefield 4 to test with.

      • indeego
      • 6 years ago

      Too new. Give it a few patches. No game of that caliber will run perfectly OOB on either AMD or Nvidia.

      • the
      • 6 years ago

      From the article’s conclusion:

      “Before you run off and do some damage to your credit card, I would advise waiting just a few more days. I’ve been working on upgrading our GPU test rigs to Windows 8.1, attaching a 4K monitor, and installing new releases like Battlefield 4 and Arkham Origins. Fancy new game testing at 4K will soon commence. I really need to spend some quality time closely inspecting the behavior of AMD’s new XDMA-based CrossFire mechanism, too. As you might be aware, Nvidia plans to release the GeForce GTX 780 Ti in two days. You can imagine some activity in Damage Labs around that development. If you can sit tight, we’ll know a lot more very soon.”

    • Disco
    • 6 years ago

    I understand that the acoustics of a card can be very important to some people. I see a lot of discussion about the noise levels under load. For myself, I don’t see what the big deal is. I have a 7970 which can ramp up fairly loud when under load, but for any of my computer time where I need ‘to think’ (i.e. work), the gpu is NOT under load and I can barely hear my computer’s fans. When I am playing games and pushing my card, I have headphones on and can’t hear the fans at all. I only hear them once I finish playing and take off the headphones. They’re loud for a little while as the card cools off.

    Obviously quieter is better, but I think the importance of the acoustics under load are being blown out of proportion. The main thing you don’t want is some high frequency squealing fan noise that cuts through the headphones, and it doesn’t appear that these 290(X) cards have that characteristic.

      • Airmantharp
      • 6 years ago

      But what if you want to play with speakers? What if you use your computer for some other workload that benefits heavily from GPU acceleration?

      Nvidia and AMD’s partners have shown that it’s well within reason to quietly cool ~300W GPUs, and Nvidia has shown that it can be done with blowers!

      That’s why there’s no excuse for AMD’s omission here. Their blower is definitely quite nice, but it wasn’t designed to keep up with this GPU when pushed, and push AMD did.

        • Disco
        • 6 years ago

        I’m just pointing out that the acoustics are not the end all be all characteristic that some people in these 290 threads are making it out to be for the ‘typical’ gamer, who I assume will play with headphones. For those of you that don’t play with headphones, or have other uses for the gpu, obviously the acoustics become more important. But just because they are important for you, doesn’t mean you need to convince every other gamer of that fact. Just wait and see what the 3rd party coolers add to the mix.

          • Airmantharp
          • 6 years ago

          The point will be moot with third-party coolers. We already know what they can do in terms of dB/W TDP, this GPU will not be a challenge.

          But really, acoustics are a big deal to a lot of people. Sure, there are plenty that don’t care, but there are plenty more that make acoustics a focus of their builds- usually people who’ve owned something louder than they were comfortable with in the past :).

      • superjawes
      • 6 years ago

      Just think of it this way:

      [b<]290[/b<] +Competitive Price +As Fast as a 780 -Hot -Noisy [b<]780[/b<] +Cool +Quiet -More expensive than a 290 What you're looking at is a card that is cheaper but about as fast as a 780, and one of the [u<]tradeoffs[/u<] is noise. Anyone looking to purchase a card needs to consider all of the pros and the cons when making a decision. That's what people are talking about...perhaps some are making a bigger deal than it needs to be, but that's the general perspective.

    • CaptTomato
    • 6 years ago

    A KILLER card other than the cooler, which will be fixed by the usual suspects.
    Btw…7950 and 7970 DO NOT get remotely hot with custom coolers.

      • Airmantharp
      • 6 years ago

      No card gets ‘hot’ with custom coolers- the heat just gets dumped into the case :).

        • CaptTomato
        • 6 years ago

        So, my oclocked 7950 Vapour X has never exceeded 70c in my stock Antec 1100 case, IOW, a basic but brilliant case like the 1100 can easily deal with heavy oclocks if the custom cooler can.

          • Airmantharp
          • 6 years ago

          You’re doing it right. If you use a high-end open-air cooled card, you put it in a large enclosure with lots of balanced airflow. The card has plenty of fresh air to exchange with so that it never winds up using it’s own exhaust to ‘cool’ the card.

          The problems that you’ve avoided are two-fold- if you’d used a smaller enclosure with less robust cooling, the card would be running hotter and potentially suffocating itself and running slower than optimum, and if you had a second card in there, you’d likely need significantly more airflow to keep them from strangling each other as arrays of GPUs using open-air coolers are wont to do.

            • CaptTomato
            • 6 years ago

            I hear what you’re saying, but aren’t all these latest cards{Nv+AMD} throttling relative to temps?…if so, who in their right mind would buy a small case for Xfire/SLI…..?

            • Airmantharp
            • 6 years ago

            Well, I did!

            My Fractal Design Define R3 is about as small as you can make an ATX mid-tower, yet it has 3x120mm fans, 4x140mm fans, and I have a pair of GTX670’s with blowers and an H80i on the CPU (as that rear 120mm fan).

            Load temps are still in the 70’s, below the throttling limits, and noise stays under control with all of the fans set to run low, and with the case outfit with noise dampening material. The intakes are all filtered too, so the case remains dust-free after two years of use!

            But that’s why I like blowers- they’re the only reasonable way to build a quiet, compact multi-GPU system, which was my goal this time around.

            • derFunkenstein
            • 6 years ago

            Is the R4 significantly larger than the R3? Because the R4 is the biggest case I’ve ever owned. Widest, tallest, deepest. The only case I ever had that was larger was one of those full-ATX Chieftech towers that were popular about 10 years ago.

            • Airmantharp
            • 6 years ago

            It’s a little bigger, I believe, as they made room behind the motherboard for better cabling. But it’s dwarfed by the P180 it replaced :).

            • derFunkenstein
            • 6 years ago

            I never had one of those, bought one on Ars used and there was a mishap in shipping – the front of the case snapped off. Not the seller’s fault, i don’t think, but it went to UPS and I got the insurance and got the R4.

    • deathBOB
    • 6 years ago

    This review (and the recent story about the Steam Box) highlights why we need to move away from ATX. The GPU is the single most important component in a gaming PC. It consumes the most power, it produces the most heat, and it provides to most benefit for each dollar spent. It should be treated as such with room and cooling appropriate for its power and importance.

      • Airmantharp
      • 6 years ago

      Actually, if you have no intention of running two GPUs, ITX makes the most sense. However, if you’re going to run two or more, only ATX or larger really makes sense- well, ATX for two, EATX for three on air, either for four on water.

      But the attention really should be focused on ITX. Literally every necessary component has been shrunk, and if you need no other expansion slots, you can get top-end performance out of a tiny little box that makes the first-gen PS3’s look ginormous :).

        • deathBOB
        • 6 years ago

I think you’re missing my point or maybe I didn’t explain it well. Current standards like ATX, ITX, etc. treat the GPU like a secondary component with most space and cooling capacity going to the CPU. That’s what’s wrong. There is no reason such a powerful and important component like this card should be shoehorned into the tiny space provided by the ATX, ITX, etc standard.

        BTX was supposed to deal with high power CPUs, but it was never needed because CPU power leveled off. GPU power hasn’t and we’re still stuck with add-on cards treated like an afterthought by ATX, ITX, etc. We need a BTX-like standard to deal with GPUs.

          • Airmantharp
          • 6 years ago

          Well, I did miss your point- I’m not seeing GPUs being treated as ‘secondary’ components that don’t get their allotment of space inside standard form factors; the expansion slot area, at least in ATX, is quite large, and I’m not sure how one might make an improvement to it without radically changing the form factor; even the SteamBox example just put the GPU on a riser to flatten the enclosure out, which is a pretty common configuration.

          But as you can see from my first response, I am quite interested in your idea- could you explain it a little further?

      • albundy
      • 6 years ago

      so you’re saying that it needs more room and cooling, and then you say it should be in an atx box. notice your contradiction?

        • deathBOB
        • 6 years ago

        Can you point me to where I said it should be in an ATX box?

    • ClickClick5
    • 6 years ago

    Three comments:

    1) Another great write up!
    2) Good to see AMD throwing out a serious card.
    3) The amount of Nvidia fanboi comments are priceless!

      • Fighterpilot
      • 6 years ago

      Yep its a laugh,
      AMD’s second stringer just kicked sand in the face of Nvidia’s whole lineup.
      R9 290 is this year’s enthusiasts/gamers choice I’d say.

    • sschaem
    • 6 years ago

    Oh my… $399 and I have a feeling it will spank the $699 780 ti at any and all games that support Mantle.

    I also wonder how much tweaking is possible… who run any PC part at stock voltage anymore?

    ex: on my fx-8350 I went from 3.5 on all core stock to 4.2ghz on all core, with a voltage drop.
    I wonder if the same crazy overvolting is being done on those chips ?

    • Kretschmer
    • 6 years ago

    I don’t understand why AMD is offering the 290 and 290X if the performance delta is this small. Wouldn’t it be better to differentiate based on robust/quiet cooling instead of a tiny difference in frame times? E.g. ship the 290X with a monster cooler and bump the price up a bit more.

      • Arclight
      • 6 years ago

      My guess is that the R9 290 will fall behind the R9 290x as drivers will get better.

      • sschaem
      • 6 years ago

      upto 10% faster for $150… for some people that make sense.
      The r9-290x is for high end gaming PC, probably in the $1500+ range. so paying <10% extra to get upto 10% extra FPS makes sense.

      The 290x is AMD Titan… but its $549.
      nvidia use to price the Titan at $999, GTX 780 $649… $350 more, for little more FPS

      I’m ok with AMD pricing. $399 and you get pretty much GTX 780 performance.
      You recently had to pay $650 for that…

      If only AMD could do the same with CPUs. a I7-4930k for <$400 would be more then welcome.

        • Airmantharp
        • 6 years ago

        A six-module/twelve-core Steamroller with a slight bump in IPC over Piledriver could find a home in my system. I’ll figure out how to cool it quietly on my own!

        As for GPU prices- talking about them now is pretty silly, given that we’re still one release out from rounding out this refresh cycle and that prices look to need more adjusting. Oh, and we don’t have any AMD cards you’d actually want to buy- and we don’t have a solid forecast on their arrival. They can’t come soon enough!

      • the
      • 6 years ago

      I guess the R9 290X can be seen as AMD’s halo product, much like nVidia’s Titan. The performance delta between the halo product and the tier below is minor but the pricing delta is not. Thus the high end cards are there to make the tier below seems reasonably priced. And it works. Before the price cuts, paying $650 for a GTX 780 is outrageous but not when you put it up against the $1000 Titan. As for the R9 290X, at $550, it does feel like a halo product as that is still a lot to put down for a video card, especially compared to the R9 290 at $400 with nearly identical performance.

      • Freon
      • 6 years ago

      Looks like AMD is just countering the 780 Ti and their 770/780 price cuts, which were a counter to the new 280x/290x. Finally, some competition driving performance to cost ratio again! About f’ing time.

    • Kretschmer
    • 6 years ago

    Scott,

    Would it be possible to follow-up with some quick tests on underclocking and fan speed reduction? I’m curious about the performance impact of running fans at 40%, 30%, etc.

    I’m a potential customer of the 290 who wouldn’t necessarily require 100% of the performance at my monitor’s native resolution. If I could double my current performance for $400 while retaining decent acoustics, this would be a tempting product.

      • LukeCWM
      • 6 years ago

      This is exactly what I’m looking for as well. I’m still at 1080p. I bet I could max the settings in Tomb Raider [i<]and[/i<] lower the fan speed and still have fantastic frame rates and frame times.

        • CaptTomato
        • 6 years ago

        More than likely, but if you’re patient you can also have excellent cooling and maximum GHZ at peak loads.
        It seems odd that TR readers should ever buy reference cards that have slightly inadequate coolers.

        • Airmantharp
        • 6 years ago

        If you plan on sticking with 1080p, a single 290X with a custom cooler will likely be an unbeatable value that will last you for years.

          • LukeCWM
          • 6 years ago

          I’m all for high resolutions, but more out of a desire for progress than a personal near-future upgrade plan. I like gaming, but it’s not my primary hobby, which deserves the lion’s share of my hobby-dedicated funds. =]

          I’ll probably upgrade to 1440p when prices come down a bit more (decent monitor for $200 or so), which I’m hoping 4k will accelerate.

          As it is, $400 is a stretch for me for a GPU. But the value proposition of the 290 is wooing me.

          I sincerely appreciate Scott including the 5870, because that’s what I’ve got in my system now! Yikes!

            • Airmantharp
            • 6 years ago

            Never regret getting some miles out of a card!

            And yeah, it’s hard for me to spend $$$ on gaming these days, being a bit of a photography geek now myself. Hell, I just picked up a new camera system, on fire sale, for the price of two R9 290X’s! Thing is, I’ll get far more enjoyment out of that camera system :D.

            • LukeCWM
            • 6 years ago

            Nice, which camera? And which lenses?

            • Airmantharp
            • 6 years ago

            Canon EOS-M, with:
            -18-55 kit zoom
            -22/2 pancake (incredible little lens, like the 40/2.8)
            -11-22 (from Canada, still in the post!)
            -90EX compact flash
            -EF -> EF-M adapter, which works incredibly well with all of my eight EF lenses

            I wanted a second, compact body to back my 6D up, and the fire-sale prices of the EOS-M were pretty hard to pass up given that it works just as well with adapted Canon lenses as with it’s native lenses, and image quality from it’s 18MP sensor is the best Canon has produced, exceeding my expectations. The 22/2 is amazing little lens, too, making the camera cargo-pocketable, while the 11-22 is the optically best wide-angle zoom Canon has made, the only one that’s stabilized, as well as the cheapest, making for great lightweight landscape work.

            Now, I mainly got it to get the lenses, adapter, and flash on fire-sale, as I expect Canon to upgrade the body soon enough, at which point I’ll probably sell the body and kit zoom to upgrade :).

            • CaptTomato
            • 6 years ago

            I have a 7950, so if you bought a custom cooled 280x aka 7970ghz, you’d do very well at 1080p, but 290 would be better for heavy SSAA.

            • LukeCWM
            • 6 years ago

            Yeah, the 280x isn’t as shiny, but it’s probably much more appropriate for my budget and uses. I bet it would be a great card for $300.

            Maybe something nice will work its way into the $250 price point to save even more coin, but still following this fantastic price/frametime ramp up that extends from $200-400 on this chart. Crazy!

    • scmpj
    • 6 years ago

    Don’t know why this 290 doesn’t appear in the main search (Newegg) but here is the link for what I think is the last 290 available online.

    [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814131527&Tpk=n82e16814131527[/url<]

    • d0g_p00p
    • 6 years ago

    Well dangold. It’s time for my yearly video card upgrade come January and I usually prefer nVidia based video cards. I have been eyeballing the GTX 770 for quite some time. However AMD has made this potent card for a price that is right in the ballpark of what I am planning on spending. It reminds me of when the 4850 & 4870 was released and the performance you could get for the price.

    I looks like I’ll be on the red team for 2014. It’s good to be back!

    • HisDivineOrder
    • 6 years ago

    Good on AMD for continuing the price war even lower. I respect that and I like that a lot. I like their performance level they’re offering.

    I wish it was quieter. A LOT quieter. I mean, this card as it is not one I could ever invest in. Not ever. I mean, no. Just no. I want to hear myself think and that would not let me do that. Perhaps my hearing is better than yours, people who could tolerate this, or perhaps I’m just more finicky about such things in my old age, but the days of me tolerating Delta fans or FX5800’s never existed. Haha, but I tried them and yeah, no.

    I think AMD could have scrapped their reference launch and had OEM’s putting their third party coolers on that from Day 1 and they’d have been WORLDS better off in the reviews. I don’t know if nVidia is going to feel the pressure to drop prices against a card this loud. I know AMD fanboys have been waiting for something, anything at all, to crow about for what seems like years now, but man to crow about a card with this volume level feels ridiculous for fanboys that were arguing about how power efficient the Radeon 7970 was at launch previously.

    Does whatever AMD suit you, no matter what it is? If (or perhaps now when?) nVidia releases a card this loud just to reach higher performance levels, I’d say the same then. A computer component should not be this loud if it’s going into my computer. It just shouldn’t. It’s ridiculous levels. I can only imagine how much louder it would get once you put it in an actual case and it had less airflow than an open bench. It would be enclosed, but that’d be a small amount of buffer to block out the piercing sound of the fan when it starts throttling like crazy because of lack of open airflow, which is what happens when you have a card operating so close to the limit it can handle. Moreover, other parts are going to start suffering when heat like that is raging inside your system for hours on end and the heat starts raising the temperature of the entire system. That much power converted to heat ain’t all going out of your system because that’s what the whole point of their raising the temperature to such a high level and using the same cooler is.

    They want to use the fact that higher temperatures will transfer more heat for the same cooling apparatus, but the consequence of that is they’ll also have more passive heat transfer through the card or any surface on the card out into your system than a blower for a more conservative card. I don’t know about you, but I can’t say my enclosed system is as good at airflow as an open bench past a certain point of heat.

    I really want to love the card. I really do. $400 is great as a price point for the performance offered. It’s a heck of a deal. $100 plus bundle is a lot to pay, but the sound would drive me crazy and soon I’d be sorry if I hadn’t paid the extra for the quieter card with somewhat less performance. I had the same choice between the 670 or the Radeon 7970 last year and that’s how I chose the 670 then, too. I’m glad I did and I’d be glad I did again faced with the same choice between somewhat higher performance at incredibly higher acoustics. Moreover, your entire system will be working harder to push that hot air out, raising the speeds of more of your fans on top of the volume of the card. At least, that would happen in my system.

    No, I’m sorry, AMD. I want to, but the compromises you made are too loud for me. I looked at Anand’s attempt to quiet the card by lowering the speeds and fan speed to try and match the volume level of the 780 out of box (where they compared the benchmarks). The joke is you can’t match the 780’s out of box acoustics with a 290, no matter how low you go. Performance plummets and the acoustics never get better. So I’d have to replace the cooler altogether.

    I hope maybe that custom coolers show up and fix it, but this thing is putting out so much heat in that vaunted smaller space, I don’t know how much of that can be easily transferred out by the open air coolers based on tri- and dual fan combos. I know the Gigabyte tri-cooler windforce that was beloved for being quiet with nVidia was loud for the 7970GHZ. Much louder. Hard not to imagine the silent coolers for 780’s being much louder when paired with a card pumping out this much heat and all THAT heat will be going into your system for certain. Again, raising your overall system temps and forcing your fans to run louder on top of that.

    I can’t escape the feeling that AMD got a GPU part back that didn’t perform as well as they’d hoped at the voltage they thought was reasonable, so they decided winter was coming, people wouldn’t have hot rooms for a few months, and they could get away with releasing mini-space heater elements as video cards. They figure by the time people start having warmer temps and hot summers, they’ll have something else to talk about to keep people distracted from the throttling issues that will creep in as time goes on, solder gets old, and house temperatures start to rise.

    I don’t think these are the temperatures these cards were originally built to do. I don’t care what they say to the contrary. They are leaking power like crazy at levels that vaguely remind me of Bulldozer and they are pumping out heat like crazy as a result. Like with the FX 9xxx series, they’re pumping out a high performance part at a ridiculous power usage and heat production cost, hoping people will lock onto the benchmarks without worry for how loud it will be.

    I predict Newegg won’t be taking ANY returns on the R9 series. If they haven’t already, they’re going to lock that down in a hurry.

    The underlying technology is great and I love the return to form for price wars by AMD. But I hate the cooler and I’m beginning to think they’re driving their underlying GPU too hard (or using more faulty dies at higher voltage levels) to try and catch up to a more mature 780 architecture through sheer brute force.

    The consequences of that are too much for my ears to bear.

      • DancinJack
      • 6 years ago

      Dude, your posts are too long. I applaud you for putting effort into them, but you’re really not going to get people to read them if they are this long.

      • swaaye
      • 6 years ago

      I sense instability in the force…

      • pohzzer
      • 6 years ago

      “I hope maybe that custom coolers show up and fix it.”

      Like that’s going to happen.

        • auxy
        • 6 years ago

        Why wouldn’t it happen? Or are you being facetious?

          • Airmantharp
          • 6 years ago

          Looks like classic jackassery to me, auxy :).

      • Bensam123
      • 6 years ago

      A two page long rant about the sound and why you shouldn’t buy the card because of it if you don’t want to read that.

        • Airmantharp
        • 6 years ago

        Thanks for the TL;DR. I thought about reading it on three different occasions, but I just couldn’t convince myself that there was a topic concerning this release that deserved so much attention.

    • jdaven
    • 6 years ago

    Talk about night and day reviews. Compare TR’s conclusion with Anandtech’s. Here is the first paragraph of Anandtech:

    “Bringing this review to a close, it’s admittedly not very often that we write a negative video card review, especially for a major SKU launch from NVIDIA or AMD. Both companies have competitive analysis teams to do benchmarking and performance comparisons, and as a result know roughly where they stand long before we get their cards. Consequently they have plenty of time to tweak their cards and/or their pricing (the latter of which is typically announced only a day or two in advance) in order to make a place in the market for their cards. So it’s with a great deal of confusion and a tinge of sadness that we’re seeing AMD miss their mark and their market, and not by a small degree.”

    Total opposite conclusion as TR. I guess Chuckula’s crackpot theory that Anandtech is pro-AMD is just that crackpot.

      • chuckula
      • 6 years ago

      I never said Anand was pro-AMD… I said that accusations of a huge anti-AMD bias at Anand were silly in the light of AMD plowing a rather large amount of advertising dollars into Anandtech.

      Also, you’ll see plenty of positive reviews of AMD products on Anand’s site, but you are on a fishing expedition to prove that Anand is in an evil anti-AMD conspiracy just because every other word in their review isn’t taken directly from AMD’s marketing materials. So basically, if Anand has such a huge anti-AMD bias, why is AMD so stupid as to actively fund his evil campaign? Answer that one Jdaven… let’s see if you can get an answer before 2015 or so….

    • anotherengineer
    • 6 years ago

    Nice little review Scott.

    One question, I see all your testing seems to be done at 2560×1440 resolution, is this done to really stress the card and/or other reasons?

    I’m just a bit curious why you don’t do any testing at 1080p since it seems to be so common?

    [url<]http://store.steampowered.com/hwsurvey[/url<] expand primary display resolution. (32.5% 1080p vs 1% 1440p I know steam is only a rough estimate) Thanks

      • superjawes
      • 6 years ago

      Stress is the primary reason. More pixels = more work to do. If you don’t stress the cards, you’ll likely see them all cap out performance and you won’t be able to tell much difference between two cards.

      And even if you pick between cards that are overkill for 1080p, the extra headroom will allow for keeping “eye candy” turned on for longer.

      • HisDivineOrder
      • 6 years ago

      Because the card isn’t going to see much hard work at 1080p. People who buy these cards want resolutions higher than 1080p, so testing at 1440p (or even 1600p) is a better indicator of where the cards’ performance levels are.

      Plus, do yourself a favor and stop gaming at 1080p. That’s for the console gamers. Splurge a little, invest in a new monitor and get some high resolution action going. Your eyes will thank you.

        • anotherengineer
        • 6 years ago

        I don’t game anymore; no time with the little ones. Also, I see enough people that put big hardware in their PC and use an old CRT because it works and they don’t want to “waste money on a monitor” when they can spend it on “an SSD”!!! Crazy stuff IMO, but to each their own.

        Also, I’m not saying don’t run the 2560×1440 test; I’m just curious why there isn’t a 1080p test as well, since it’s so common. Time constraints, maybe?

          • indeego
          • 6 years ago

          Any card >$200 made in the past 3 years will run 1080p at acceptable framerates. You might even say 4 years. My GTX 460 never had a problem with games at those resolutions. (Nor did it have issues with 120Hz monitors.)

            • travbrad
            • 6 years ago

            [quote<]My GTX 460 never had a problem with games at those resolutions.[/quote<]

            How long ago did you upgrade from your GTX 460? Mine was already starting to struggle towards the end of last year in some games @ 1080p. Obviously I could play less demanding games (Source engine games and console ports), but in games like Natural Selection 2 and Planetside 2 the GTX 460 wasn't really cutting it anymore. Even with most settings at low I was dropping to the low 20s FPS in NS2 when it was busy. I got a 660 in early spring and those games run sooo much smoother.

        • kamikaziechameleon
        • 6 years ago

        Console gamers don’t even game at 1080p; they game at 720p, FYI.

          • anotherengineer
          • 6 years ago

          Don’t tell Neely that! 😉

          • Airmantharp
          • 6 years ago

          720p if they’re lucky!

            • auxy
            • 6 years ago

            Indeed! [url<]http://forum.beyond3d.com/showthread.php?t=46241[/url<]

          • NeelyCam
          • 6 years ago

          WHAT??!!

        • auxy
        • 6 years ago

        Obviously, DPI matters, not resolution. DPI on a 27″ 1440p panel is 109, barely higher than the ~96 of a 23″ panel at 1080p.

        Find me a 1440p monitor which supports >100Hz — and has a panel with the pixel response to not turn everything into a blurry mess, like the Catleaps, Shimians, and so on do — and we’ll talk.
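
        For reference, the pixel-density figures being tossed around are just the diagonal resolution in pixels divided by the diagonal size in inches. A quick sanity check of that math:

        [code<]
        import math

        # PPI = diagonal pixel count / diagonal size in inches.
        def ppi(width_px, height_px, diagonal_in):
            return math.hypot(width_px, height_px) / diagonal_in

        print(f'27" 2560x1440: {ppi(2560, 1440, 27):.1f} PPI')  # ~108.8
        print(f'23" 1920x1080: {ppi(1920, 1080, 23):.1f} PPI')  # ~95.8
        [/code<]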

          • Airmantharp
          • 6 years ago

          They both matter-

          But find me a 1440p/1600p monitor that has great color, viewing angles (or rather, no color shifting off-axis), and panel uniformity, and we’ll talk.

          The problem? There is no ‘perfect’ solution yet. None. Not CRTs, not OLED, not various forms of LCD, not plasma, so the solution is derived through prioritization.

            • auxy
            • 6 years ago

            That EIZO Foris FG2421 might be close to what we want, even if it is “only” 1080p.

            I don’t mind “only” 1080p, just run SSAA — problem solved!

            • Airmantharp
            • 6 years ago

            I agree- it’s very, very close. Maybe they’ll do something in 1440p/1600p using DP to push the refresh rates?

            Then again, I don’t think I’m up for investing in a new monitor if it doesn’t have G-Sync :).

        • LukeCWM
        • 6 years ago

        I can’t afford a higher resolution monitor than 1080p, nor can I afford the highest end cards that would just cap out anyway.

        I’m a dedicated TR reader (and listener!), and I’d be very happy to know what kind of card would get me maxed settings at 1080p in Tomb Raider or Sleeping Dogs at a consistent 60 FPS.

      • Freon
      • 6 years ago

      You probably do not need a $350+ video card to run 1080p. Hell, I think my $280 video card will do everything just fine at 2560×1440.

    • UnfriendlyFire
    • 6 years ago

    Wait for the aftermarket coolers.

    Let’s see what ASUS, Gigabyte, and other companies can pull off.

    • alienstorexxx
    • 6 years ago

    [i<]Before you run off and do some damage to your credit card, I would advise waiting just a few more days. [/i<]

    It’s not that much damage for the performance given; it’s a perfect deal. Besides, Nvidia isn’t launching any competition to this card; they just can’t compete. Price-performance isn’t in Nvidia’s plans. Just buy AMD if you want the best price-performance; there isn’t much more to review. You can see 4K tests on Anandtech or Guru3D. Both the 290X and the 290 perform outstandingly.

    • esterhasz
    • 6 years ago

    Honest question: is it going to be complicated to undervolt/underclock that thing a little to bring down the fan noise and power consumption? I guess 95% of performance at 85% of power shouldn’t be impossible, in theory at least.
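
    For what it’s worth, the usual first-order estimate is that dynamic power scales roughly with f·V². A sketch under that assumption only (real cards add leakage and PowerTune’s own behavior, so this is illustrative, not a prediction):

    [code<]
    # Crude dynamic-power estimate: P_dynamic is roughly proportional to f * V^2.
    # Illustrative only -- ignores leakage and PowerTune's dynamic clocking.
    def relative_power(clock_scale, voltage_scale):
        return clock_scale * voltage_scale ** 2

    # Hypothetical example: drop clocks ~5% and voltage ~7%.
    print(f"~{relative_power(0.95, 0.93):.2f}x power at ~0.95x clocks")  # ~0.82x
    [/code<]

    Under that (very rough) model, ~95% of the clocks at ~93% of the voltage lands in the same ballpark as the 95%/85% guess above.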

    • dpaus
    • 6 years ago

    Hmm, wasn’t I saying something the other day about drawing a ‘best fit’ curve through the R9 270X and R9 280X data points?

    I sincerely hope that AMD isn’t sacrificing too much margin on this card. But at a minimum, they’re costing Nvidia a lot of cash.

    • jdaven
    • 6 years ago

    On these high end products, haven’t we left the days of the HSF combo? Shouldn’t we be migrating en masse to water cooling solutions by now?

    • BoBzeBuilder
    • 6 years ago

    Why, AMD? More than anything, they just killed off their newly launched R9 290X and any profits they hoped to squeeze out of it. They should have aimed for better thermals and lower performance, and this card still would have been a great value while preserving the 290X’s rightful place. Instead, they launch the same card at a lower price tag. Hey, it’s excellent for the consumer, I suppose.

      • pohzzer
      • 6 years ago

      Market share. What can Nvidia do to counter the 290 plus Mantle, start selling the 780 at a loss?

    • hasseb64
    • 6 years ago

    Nvidia > AMD in power efficiency.

      • alienstorexxx
      • 6 years ago

      amd > nvidia in performance per dollar, performance per mm2 and consumer care.

        • HisDivineOrder
        • 6 years ago

        nvidia > amd in # of WHQL drivers per year, great coolers per reference card, good linux drivers, money spent, and money saved in a few years at your ear doctor.

        amd > nVidia in decibels, power wattage, temperature of your card, # of different versions of the same beta for the Battlefield 4 launch in a one week period, and dollars saved per use that will be invested into water cooling or third party cooling.

        Fun times.

          • Waco
          • 6 years ago

          Why does anyone care at all what temperature the card runs at? My 4870X2 routinely ran well over 100 C and I couldn’t have cared less.

            • superjawes
            • 6 years ago

            Heat tends to shorten the useful life of computer hardware…

            • MiG754
            • 6 years ago

            My 4850 would also routinely run over 100 degrees and even idle at 85, which is probably the main reason it started melting exactly after its two-year warranty period was over, and I had to buy a new card one more year later, even though I otherwise wouldn’t have needed one at all… Heat is bad. Otherwise PCs wouldn’t have cooling.

            • swaaye
            • 6 years ago

            Cards have been running hot for ages. People were complaining about Voodoo3 being a 90C card back in 1999. There are lots of Voodoo3 cards still working perfectly fine.

            The problem more recently has been solder failure caused by thermal cycles and the resulting thermal expansion/contraction. The composition of solder changed in the past decade, and that has caused problems. I don’t know if the temps GPUs run at are a variable that affects longevity significantly. I would imagine it has been analyzed thoroughly by smart people, though.

            • superjawes
            • 6 years ago

            The smaller the silicon, the more likely that things break, too.

    • Mr. Eco
    • 6 years ago

    As for non-reference coolers – there is already an article at Tom’s with a number of R9 280X cards. Using good cooling solutions as provided by Asus, Gigabyte, Sapphire, etc. dropped temperatures 25-30 degrees Celsius.

    • Fighterpilot
    • 6 years ago

    Awesome card, needs a quieter cooler and a bit of overclocking.
    The $400 price is a (well deserved) smack in the face to Nvidia and at least gets some sanity back to high-end card prices.
    Well done AMD, not perfect but damn good.

    • Prestige Worldwide
    • 6 years ago

    I can’t be alone in wanting to see BF4 as a staple in new GPU reviews. Any chance we could have it included in the GTX 780 Ti review alongside AMD’s new cards?

    edit: typo

      • oldDummy
      • 6 years ago

      This article implies that it’s coming within a few days.
      4K inclusive.

      • willmore
      • 6 years ago

      But, will they let AMD use the Mantle code path or “keep it fair” by limiting all cards to DX11?

        • Prestige Worldwide
        • 6 years ago

        They should use both options, DX11 and Mantle. DX11 needs to be included to have an apples-to-apples comparison, but they should also show what (if any) real advantage Mantle can provide.

        But most importantly, I just wanted to see the most relevant multiplayer FPS that most users who shell out for a GK-110 or Hawaii card are likely to be playing included in GPU reviews. Correct me if I’m wrong, but haven’t the multiplayer portions of all Crysis games been flops in the medium / long term?

          • Airmantharp
          • 6 years ago

          You know, TR hasn’t been big on benchmarking the multiplayer element of games due to repeatability issues, right?

          I’d love for them to take a stab at it too, but the only ‘good’ place I’ve found that kind of information has been at HardOCP, and the guys over there are very clear about the reliability of the data they’re presenting and that there’s a good dose of subjective judgement involved.

            • superjawes
            • 6 years ago

            Yeah, even if TR were to include multiplayer benchmarks, I doubt they would include the results in the final price/performance analysis because of the variation.

            The only game I can think of that could give genuinely meaningful results is Starcraft 2, but even then you’re cheating a little bit by going through the replay feature, and I’m not completely sure if that is the same stress level as a live game.

        • Airmantharp
        • 6 years ago

        I’d expect them to do what they always do: exhaustively explore all of the options and release a very well-presented rundown of their effects.

        I’ve no doubt that they’re dying to pound out the effects of Mantle support in Frostbite 3 in conjunction with AMD’s GCN lineup, just like the rest of us :D.

    • jimbo75
    • 6 years ago
      • chuckula
      • 6 years ago

      Sure Jimbo, let’s test this card out against my GTX-770 playing Metro Last Light… on Linux since Steam does that now. Unfortunately, while the 290 undoubtedly has higher end hardware, AMD’s refusal to give proper driver support for Linux means my GTX 770 still wins.

        • jimbo75
        • 6 years ago
          • chuckula
          • 6 years ago

          Lol.. (since I don’t have to use them) Catalyst drivers under Linux.

            • brucethemoose
            • 6 years ago

            I just started using Linux and the Catalyst beta with my APU; it isn’t that bad.

            Also, lol at gaming on Linux.

            • chuckula
            • 6 years ago

            [quote<]Also, lol at gaming on linux.[/quote<] I've been using Linux since the 20th century and the uptick in the ease of use and power of gaming in the last year has been tremendous. Wait 2 years, you won't be lolling then. [Oh, and that's not even counting Android, which is already becoming a major platform for mobile gaming].

            • HisDivineOrder
            • 6 years ago

            You’re betting against the future lol’ing at gaming on Linux. 😉

      • HisDivineOrder
      • 6 years ago

      I don’t know. I wouldn’t say it’s THAT fine.

      I mean, honestly when you can say that it’s a card so loud you can’t hear anyone talk over it without yelling, you’ve got yourself a card that ain’t THAT fine.

      nVidia already moved the bar to acoustics. AMD is trying to backtrack in desperation, but if nVidia responds in kind, we’ll all be the ones to suffer.

      Remember when AMD was launching the Radeon 7970 as an efficiency part way back in the day? Them days are long gone now.

    • oldDummy
    • 6 years ago

    Darn.
    Just bought a GTX 690, and it looks like I bought on the cusp of obsolescence.
    Well, I have ~80 days to “step up” if possible.
    The question is: will a new dual-chip offering come within that time frame…or if it’s needed…
    Should know within a few days.
    Great article.
    EDIT: corrections…of course.

    • superjawes
    • 6 years ago

    So the takeaway from this article is: “don’t buy a 290X.”

    AMD probably could have cut performance on this card a little bit and still made an excellent value proposition. The only real issue is that, like the 290X, it’s still loud and hot, but for $150 less, I’d be willing to overlook that.

      • HisDivineOrder
      • 6 years ago

      Actually, the takeaway is the R9 290 is pretty much maxed out from the get-go.

      The R9 290X is not. You can up the fan speed on it yourself and ensure it boosts as constantly as the R9 290. Then you’ll have your advantage for the R9 290X. It’ll sound as loud as the R9 290, but if you can take the sound of one, you could take the sound of the other.

      I couldn’t stand a card that loud, but if you can, that’s where the R9 290X comes in.

        • mesyn191
        • 6 years ago

        If you pop a better cooler on either the 290 or the 290X, both suddenly tend to get significantly more overclocking headroom.

        This suggests the design and hardware itself isn’t really maxed out but is instead heat + power limited.

    • Risme
    • 6 years ago

    Thanks for another informative review, Scott.

    • ultima_trev
    • 6 years ago

    I sure hope a better cooler (or drivers) can improve the position of the R9 290X somewhat; thermals, noise AND performance! In this state, it’s not even worth $50 more than the R9 290, let alone $150 more.

    That being said, I can’t wait to see a DirectCU II TOP, Vapor-X, Windforce 3X, or Twin Frozr IV version of the 290X! Even at $650 I thought the GTX 780 was certainly the better deal versus the R9 290X simply because it has much better thermals/acoustics; however, aftermarket versions of the R9 290 will no doubt be price/performance kings in the high end.

    AMD, after the flop that was the HD 7xxx series GPUs and FX series CPUs, you have miraculously restored my faith in ye!

      • USAFTW
      • 6 years ago

      The Radeon HD 7000 series were decent performers; it’s not fair to dismiss them as failures. That’s why they stayed on sale until now and people kept buying them, although game bundles might have had something to do with it.
      But I agree with everyone who says the reference cooler is a letdown. AMD needs to spend some dollars on thermal management for their upper-end cards. The reference coolers on the HD 7970, R9 290X, and R9 290 have a lot of copper, a dense fin stack and everything; they just need some tweaking. Also, getting some nice blowers with more air pressure and less noise would be a nice improvement.
      I, like Scott, prefer blowers to open-air designs, as long as they’re not obtrusively loud. Hence I’m excited for aftermarket coolers on these.
      And whoever wants to see an FX-8450 Steamroller, thumbs up!

    • pohzzer
    • 6 years ago

    The R9 290 plus Mantle is going to be a ball breaker.

    • jstern
    • 6 years ago

    I’m not an expert on graphics cards, but I read other benchmarks about the Uber card, and it was designed to run at 95°C, with the fans barely making any noise, always running at 40%. They even left it running for 2 or 3 days straight, running a benchmark in a loop, and it was still working perfectly at the end. No crashes or any errors. So I’m feeling sorry for this card because it’s misunderstood.

      • Airmantharp
      • 6 years ago

      The temperature’s fine- but you can make a 780 faster while still producing less noise than this cooler in ‘quiet’ mode. Unlock the card’s full potential, and you’d be forgiven for wondering why your computer isn’t hovering :).

      So yeah, we’re still waiting on the custom coolers. AMD’s blowers can sit this one out.

    • Meadows
    • 6 years ago

    A performance drop, and the card is [i<]even louder than before[/i<] for it. The only saving grace is the attractive price point, if even.

      • Krogoth
      • 6 years ago

      Blame the mediocre reference HSF.

      Aftermarket solutions can remedy most of the noise and temperature issues.

        • Meadows
        • 6 years ago

        It doesn’t matter. These are the cards people review.

          • Arclight
          • 6 years ago

          I guess frame times and price/performance don’t matter anymore, the stock cooler is where it’s at. I never looked at it this way before you mentioned it…it makes so much sense. /s

            • Meadows
            • 6 years ago

            I have open headphones, I don’t need a whining PC.

            • Arclight
            • 6 years ago

            You must be the whole market and the 290 will probably never have a custom cooler or an aftermarket cooler. Oh my, how can AMD not see the error of their ways…

            • willmore
            • 6 years ago

            [quote<] I don't need a whining PC.[/quote<] - That one is so easy, I'm not even going to do it.

            • willyolio
            • 6 years ago

            PEBKAC?

            • willmore
            • 6 years ago

            No, it was going to be “you certainly don’t need a whining PC, you whine enough by yourself.”

            Edit: so? No, ‘you’!

        • HisDivineOrder
        • 6 years ago

        If they can keep up with its temperature and fan requirements. If.

        I don’t get why it’s so difficult for AMD to design a custom cooler that’s worth something. You’d think they’d stop reusing the same crappy part they’ve used for several reference designs, even as the parts use more and more power and produce more heat, and just sit down and say, “Hey, maybe we ought to fix this.”

        But nope, they keep re-using the same cooler time and again, ignoring how much it was originally rated to handle and just raising the temperature target on each later product.

        Then they tell you the product was “rated” to handle that temperature. I’m sure they tested it. Honest.

          • MFergus
          • 6 years ago

          Scott seems to think the cooler isn’t that bad, it’s just a ton of heat to remove. I wonder if Nvidia will start to raise the noise levels on their cards or start marketing how they are quieter. Performance is a lot easier to market than being quieter.

            • Airmantharp
            • 6 years ago

            I doubt it. Nvidia knows their target market expects well rounded products, and they’ve built their brand on meeting that expectation these last few years.

            But a set of comparisons mapping out performance/dB and dB/watt would definitely be nice. As bad as the AMD blower apparently isn’t, it’s still quite likely more bad than the Titan blower :).
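
            Tabulating that kind of comparison is straightforward once reviews publish the raw FPS, power, and noise numbers. A minimal sketch with made-up placeholder figures (and dB is logarithmic, so treat these ratios as a rough convenience metric rather than rigorous physics):

            [code<]
            # Noise-normalized comparison from review-style data.
            # The figures below are placeholders, NOT real measurements.
            cards = {
                "Blower card A": {"fps": 60.0, "watts": 250.0, "dba": 45.0},
                "Blower card B": {"fps": 55.0, "watts": 230.0, "dba": 50.0},
            }

            for name, d in cards.items():
                print(f"{name}: {d['fps'] / d['dba']:.2f} FPS per dBA, "
                      f"{d['dba'] / d['watts']:.3f} dBA per watt")
            [/code<]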

            • Antimatter
            • 6 years ago

            AMD’s blower fan isn’t bad compared to other blower fans. However a double/triple fan solution (I think) would likely offer superior temperature and noise levels.

            • Airmantharp
            • 6 years ago

            Compared to which other blower fans?

            Nvidia’s blowers seem to be able to dissipate nearly as much heat while producing less than half the noise at the same basic level of performance. If I want a blower, it’ll be on an Nvidia card, and like others here, I definitely prefer blowers. They’re quieter when you put them inside a case that feeds them air.

            • Pholostan
            • 6 years ago

            The fan is AFAIK almost the same. The Titan reference design also dumps hot air inside the case; not all hot air is exhausted out the tiny expansion slots. That probably plays a big part in it being less loud.

        • Vasilyfav
        • 6 years ago

        What you’re saying is that the actual cost of the card is $399 + whatever the cost of the aftermarket cooling solution is that lets this card work properly.

        That means the price/performance charts need redoing.

      • NeelyCam
      • 6 years ago

      Your view of the situation is too narrow.

      Yes; it’s louder and slower than what came before it. But keep in mind that what came before it was heavily binned; the magical chips that were within the target performance window while staying within an acceptable power consumption window.

      The X-chips were the cream-of-the-crop, high-performing, lowish-power chips. They deserve their price points. What’s notable here is that in the pure mainstream performance/price points, AMD chips just took the crown. Instead of tweaking the design to be quiet, they tweaked it to be loud as hell but high-performing.

      That’s the exact opposite of what I would personally do, but I can see why it works for the majority. This time around, NVidia is going with performance/noise, while AMD is going with performance/dollar… and AMD is probably going to win.

      Well done!

    • jjj
    • 6 years ago

    After THG got far worse performance from retail cards vs. the review sample, I can’t trust the 290 and 290X numbers until we get a lot more retail card tests and see how they really clock.
    I hope AMD didn’t try to con us and the retail cards perform well, but that has to be verified at this point.

      • Airmantharp
      • 6 years ago

      Performance is tightly coupled with the thermal envelope; as has been mentioned, these cards are at the ragged edge of said envelope, and thus any decrease in cooling efficiency will directly result in a decrease in performance.

      Also, don’t read THG. It makes you less smart.

        • jjj
        • 6 years ago

        It’s ok, I would need to read it for a few million years to drop to your level.

      • nanoflower
      • 6 years ago

      Did they test in an open case or an enclosed case? What was the ambient temperature of the room when they started testing? Scott showed that just a few degrees difference in room temperature can make a difference in the performance of the AMD reference card.

    • Arclight
    • 6 years ago

    The card might as well come without a cooler cause that PoS is going right off. That said, OMG this card is amazing for the price. A good cooler and some OC love and this will match and even surpass the stock 290x.

    The poor bastards who bought the GTX 780 at $600 or more a few weeks ago…

      • JustAnEngineer
      • 6 years ago

      [quote=”Arclight”<] The poor bastards who bought the GTX 780 at $600 or more... [/quote<] You can see the pain that Airmantharp is feeling in his hundreds of redundant anti-AMD posts.

      • bfar
      • 6 years ago

      The 290 is the way to go from today by a long shot.

      Those who bought 780s in May/June/July aren’t that badly off though. That’s an additional 5 months of top-end performance, which is quite a long time with respect to graphics cards these days. It’s still highly competitive with the very top end, especially with a little overclocking.

      • chuckula
      • 6 years ago

      LMFTFY:
      [quote<]The poor bastards who bought the [s<]GTX 780[/s<] [u<]R9-290x[/u<] at [u<]$550[/u<][s<]$600[/s<] or more* [s<]a few[/s<] [u<]last[/u<] week[s<]s ago[/s<]...[/quote<] Also remember that a lot of those "poor bastard" Nvidia fanboys likely jumped on the 780 at launch over 5 months ago. Anybody expecting to buy a high end part that will never get outpaced by future parts doesn't understand how the computer business works. * Potentially quite a bit more given the low availability.. a search of Amazon yesterday showed the only affiliate claiming to have the card in stock selling it for $750.

        • Fighterpilot
        • 6 years ago

        Aww….upset your wittle feelings did it.
        Fanboy-trolls like you are always so cute when their world falls apart…

          • chuckula
          • 6 years ago

          Uh… I was making an accurate observation based on facts since $150 is $150 no matter if it’s going to AMD or Nvidia. You’ve never made anything but personal attacks since you obviously don’t know anything about the underlying technology.

          Funny but true observation: I believe that Airman Tharp is actually an Airman in the real world. I believe that Fighter Pilot is a Fighter Pilot… with his GI Joe toys in his mom’s basement.

            • HisDivineOrder
            • 6 years ago

            Don’t resort to ad hominem. There are so many other problems that there’s just no reason to.

        • superjawes
        • 6 years ago

        I would feel bad, but anyone looking to buy their own computer parts should be doing their research no matter what the price point. We knew that Nvidia was in a position where they could cut prices on the 700 series when the 290X released, and we knew that the 290 was on the way, and we knew that the 780 Ti was on the way.

        Btw, no one should buy a 290 right now, either. Give it at least a week for the 780 Ti reviews and the market to settle.

      • Vasilyfav
      • 6 years ago

      So this card is going to need an $80 aftermarket cooler just to work properly is what you’re saying.

      GJ making a great reference design AMD.

        • Mr. Eco
        • 6 years ago

        The cards with non-reference coolers will be at about the same price, no more than a few bucks higher.
        Opinion based on the R9 280X cards with non-reference coolers, e.g. [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814127759[/url<]

        • Arclight
        • 6 years ago

        [quote=”Vasilyfav”<]So this card is going to need an $80 aftermarket cooler just to work properly is what you're saying. [/quote<] Yes, but aftermarket coolers usually outperform not only stock coolers but even custom coolers; that's how they justify the price. Expect the custom-cooled cards, non-OCed versions, to cost ~$420, while the custom-cooled and heavily OCed versions will run ~$450. Like others said, the R9 290 is at this moment the best option for a gamer, if he/she plans on adding an aftermarket cooler later down the road, or is willing to wait for the custom-cooled designs.

    • SetiroN
    • 6 years ago

    It’s so strange to me that closed-loop graphics card coolers aren’t more common.
    Even though that kind of liquid cooler doesn’t really perform better than a proper air cooler on a CPU, they seem like the perfect match for graphics cards, where things are very different: unlike on CPUs, on GPUs the switch to water alone makes 20-30°C worth of difference, regardless of radiator and pump quality, because the very limited space and positioning don’t allow for very well-performing air coolers. The pump would fit perfectly in place of the blower fan, and even a single 120mm radiator would have several times the radiating surface of any stock cooler.
    It’s time.

    • USAFTW
    • 6 years ago

    The price is simply outstanding!
    Rumors had it at 450 bucks, but based on AMD’s price structure for older cards, I always thought it would be 400.
    Am I right (or alone) in thinking that at $400 the 290 is a steal?

      • Damage
      • 6 years ago

      You’re not wrong, Walter. You’re just a shill.

        • USAFTW
        • 6 years ago

        How am I a shill for AMD when I haven’t gotten a 280X for free and do in fact own a 460?
        Not a good practice, accusing people of being corporate shills.
        I hate AMD. I love NVidia. That good enough for ye?
        No love for a brother expressing his own PERSONAL thoughts?
        I know that you’re getting tired and everything, but better to get some sleep first before hammering your own readers for something you’ve only accused them of and they haven’t done.
        People in forums of this type are usually free to take the side of one company; that’s not a good basis for accusing someone of getting paid by a particular company.

          • Airmantharp
          • 6 years ago

          I can’t think of a better endorsement- might call AMD for that check!

          (I’ll be here all night)

            • USAFTW
            • 6 years ago

            I’m just saying better respect each other’s opinions more. This is an american website, with the emblem “liberty” plastered all over. This is not soviet Russia. I realize my mistake was not being a troll in my first post.

            • Airmantharp
            • 6 years ago

            Your name is ‘USAFTW’ yet you can’t remember to capitalize ‘American’? We’re supposed to respect your opinion?

            • USAFTW
            • 6 years ago

            Yes, you are. Your name is Airman, right?
            This is a place where people can openly express their opinions and not fear some troll who picks on you for a spelling error.

            • Airmantharp
            • 6 years ago

            I’m just making light of Damage trolling you- that’s a rare delight, to be sure!

            (but yeah, if you could please capitalize the names of countries, that’d be cool)

            • USAFTW
            • 6 years ago

            Well, I’m not a native so it’ll be a while ’till a GMT +3:30er gets these right. Excuse my English, American!

            • Airmantharp
            • 6 years ago

            Excused, lol.

            I apologize for antagonizing you, native speaker or not. I was hoping to find out just why Damage called you out as a shill, honestly. And yeah, I could care less about misspellings, though I did find that particular misspelling ironic given your name :).

            • Anomymous Gerbil
            • 6 years ago

            Why do Americans say “[b<]could[/b<] care less", when they mean "[b<]couldn't[/b<] care less"? /waits for smartarse "Could care less..." response.

            • willmore
            • 6 years ago

            As an American (as in States of, not just somewhere on the two continents), I’d suggest a massive ignorance of the language?

            To be clear: you’re not the ignorant one.

            • superjawes
            • 6 years ago

            For the record, I don’t think it’s fair to lump all Americans into one group. We are a large country with many dialects.

            And yes, Americans make fun of other Americans for said dialects.

            • clone
            • 6 years ago

            your understanding makes the question pointless.

            • superjawes
            • 6 years ago

            [url=https://techreport.com/news/25609/report-retail-radeon-r9-290x-cards-may-be-slower-than-press-samples?post=775266#775266<]I am disappointed...[/url<]

          • Ryu Connor
          • 6 years ago

          “The lady doth protest too much, methinks.”

        • Pwnstar
        • 6 years ago

        If he were a shill, he’d be banned.

      • JustAnEngineer
      • 6 years ago

      [quote=”Some lunatic”<] I'm waiting for the Radeon R9-290. It's going to be 85-90% as fast for less money. [/quote<] [url<]https://techreport.com/news/25561/radeon-r9-290x-scarce-online?post=771206[/url<]

        • clone
        • 6 years ago

        the negative being that power consumption turned out to be about the same.

        on the positive side performance is even closer to R9 290X than predicted and the price is even lower than predicted.

          • Airmantharp
          • 6 years ago

          I hope, honestly as all get out, that someone puts a good blower on these cards when AMD gets around to putting out the 8GB versions and getting whatever their version of G-Sync is on the cards.

          Till then, I’ll be weighing the 6GB vs. the 12GB versions of the 780Ti when they get here…

            • clone
            • 6 years ago

            Scott says the cooler is actually very good; he literally went into detail about it….. why are you still so heavily invested in crapping on it?

            as for the 12GB 780 Ti, it’s still going to be a 28nm chip built around a tweaked/stripped Titan core, and the prospect reminds me of budget cards that brag about having 2GB of RAM when the best they can make use of is 512MB.

            having not seen the benches yet, my guess/prediction is that the 780 Ti will be a little higher clocked and a little more capable than Titan, with a whack load of RAM thrown onto it so that it can pretend to be a game changer.

            it’d be great if it was a game changer, but tossing a bunch of RAM at a die-process limitation seems like the wrong solution.

            • Airmantharp
            • 6 years ago

            And Anandtech literally recommends that you don’t buy it, and they go into incredible detail to show what happens when you make it as quiet as a 780; namely, it’s slower.

            These things are subjective, sure. But the numbers don’t lie.

            • clone
            • 6 years ago

            that’s interesting in a few ways.

            1: it shows that the 780’s cooler is no better than AMD’s, so did AMD really “screw up the cooler”?
            2: it gives the impression that AMD designed R9 290 to be a 20nm part but couldn’t wait any longer and had to go with 28nm.
            3: it’s nice that you can dial back AMD’s part to equal a “lowly 780” and be just as quiet.
            4: if the parts were in quantity I wouldn’t hesitate to grab the cooler that allows for this kind of variability.

            all of that said both AMD and Nvidia are kind of in a pickle on this one, reference coolers have to meet certain space limitations, especially in regards to SLI and Crossfire.

            I don’t believe the cooler is bad, but it’s clear that in order to achieve the best performance from a single GPU, the cooler will likely kill any chance of going Crossfire unless a motherboard comes out with PCIe slots spaced more widely apart….. which, to be clear, probably should have been introduced 4 years ago if not earlier…. kinda surprised it still hasn’t been done.

            • Airmantharp
            • 6 years ago

            Well, I wouldn’t say that AMD ‘screwed up the cooler,’ as by all accounts it’s a very nice cooler. You don’t get to >50dBA without producing offending tones by skimping on the cooler, after all!

            But it is quite apparent that the cooler is under-spec’d for the level of performance they’re demanding from this GPU, and that they could have alleviated any concerns during this launch by tackling that issue head-on before release, and would have truly knocked this one out of the park. Hell, if they’d done that, I might already own two…

            But they didn’t- the acoustics, while apparently not ‘offending’, are unacceptable for most when compared to Nvidia’s offerings in the same arena. Nvidia was criticized for loud coolers too, and they responded by applying some real elbow grease to the situation and providing a solution that’s been universally lauded as the new standard. So no, AMD didn’t do anything ‘wrong’, as much as they have highlighted just how much Nvidia’s efforts to quiet their GPUs have paid off.

            So, for 1 through 4:

            1. The 780’s cooler is far superior if you normalize any of the variables to make a direct comparison. For both dB/FPS and dB/watt, the Titan cooler produces better results. That means that Nvidia’s expensive cooler is capable of dispatching similarly large amounts of energy while producing less noise, as well as being capable of providing higher performance for each dB of noise produced.

            2. As far as which node the part was designed for- well, that’s a discussion for a circuit engineering forum :). What we do know is that TSMC has had regular problems with each new node in recent memory. 40nm and 28nm have proven to be a challenge for both GPU vendors, resulting in tardy releases of new parts after tardy fab startups of the new nodes they were designed for. And as you mention, AMD has commented that they’ve had to fast-track development of parts using a hybrid of current-gen and next-gen technologies due to failures at the fab.

            3. As with #1, if you dial AMD’s part back to measured 780 noise levels, it’ll be slower; the cooler just isn’t as efficient. Still, this is subjective, given that the noise profile of AMD’s cooler is reported to be rather inoffensive, so one could reasonably argue that it could be objectively ‘louder’ than the Nvidia cooler without a subjective effect on perceived noise levels, allowing the AMD part to match or exceed the 780.

            4. You can have all of these reference cards you want :). We both know that better solutions are on the way, both from the industry and from the community, and that there’s a tremendous interest in unleashing the full performance potential of these cards while keeping the racket down.

            As for using multiple open-air-cooled 290Xs, you really just need to find a board that allows you to space the cards out, and an enclosure that provides lots of vertical space below the bottom card and horizontal space ‘above’ the expansion slot area, and then set the system up for a high level of bottom-to-top airflow. If you can pull that off, you have a good chance of keeping the GPUs at their boost or overclocked speeds quietly.

            The reason I dislike that setup, personally, is that it requires a large enclosure, more fans with careful attention paid to intake and exhaust balance where intake airflow exceeds exhaust airflow only marginally, and careful attention paid to fan filters.

            Now, I’ll add that there’s nothing wrong with this setup, just that I prefer a more compact, sealed system :).

            • clone
            • 6 years ago

            780 Ti seems to be pushing 50 dB as well, so I guess you’re SOL.

            • Airmantharp
            • 6 years ago

            You like distorting facts, don’t you?

            780Ti SLI is quieter than a single 290X in quiet mode, and it’s quieter than the 290X across the board, while providing better performance.

            [quote=”Ryan Smith at Anandtech”<]Though even with the increased noise levels in particular, GTX 780 Ti is still able to outperform 290X on noise while also delivering better gaming performance, which makes this another tidy victory for NVIDIA.[/quote<]

            • clone
            • 6 years ago

            when AMD was superior across the board you cried and whined endlessly about the R9 290X and R9 290’s cooler hitting 50 dB. I responded with a more objective “when you factor everything in it’s really quite good” and “the cooler isn’t that bad”.

            in response to this you whine and cry, and cry and whine endlessly about how crappy the AMD cooler is because it’s hitting 50 dB…. the tears are golf-ball sized, endless whining and crying, non-stop for 2 weeks…. literally, Airmantharp, a hundred-plus posts about how bad the cooler is, how loud it is, how anyone saying otherwise is a fanboy. you cite reviews, you talk about how reviewers are idiots when they don’t agree with you, they don’t understand, you talk about the unhappy masses, you ignore performance, you ignore price, you just want to whine and cry about the 50 dB.

            well….. the Ti is hitting 50 dB… and Airmantharp’s response is….. “but, but, but, but, …. the performance.”

            yeah, I get that, and you won’t find me saying the cooler or the card is crap, I didn’t before and I’m not now…. you cried about noise… oh my, the noise, it’s so bad, until today when it no longer matters?

            • Airmantharp
            • 6 years ago

            We knew that the 780 Ti would be louder, and yes, it is ‘loud’. Nvidia would have had to spin a more efficient bin for the 780 Ti, or improve on the cooler further, and neither of those possibilities happened, so performance is right where it was predicted to be.

            But as already pointed out, pick any performance target for the 290(X), and stock cooler to stock cooler, the 780 will be quieter, in many cases significantly so!

            See, that was the point. We knew AMD would have to push their new GPU to the limit to challenge Nvidia’s larger GPU’s performance, which they did successfully, and we knew that they’d have to put some real effort into the cooler to keep noise levels from getting uncomfortable- which they did not.

            Remember, I’m all for AMD staying in the game. If we lose either AMD or Nvidia, prices will skyrocket and innovation will slow to a crawl. And really, AMD has a whole lot of great stuff coming down the pipe. To gain my business, all they’re really missing is compatibility with G-Sync equipped monitors, and I outright expect that to be a non-issue for them.

            • clone
            • 6 years ago

            while I knew 780 Ti would be faster, I did not expect it to be running so close to R9 290X in both power consumption and noise.

            I’ve been looking at AMD’s power consumption as a tradeoff for the additional features that came with the core, while Nvidia’s card, though capable, isn’t as feature-rich, and I’d assumed that was a tradeoff made for the sake of power consumption.

            PC Per was equally surprised, and assumes it’s only because Nvidia was caught off guard by the R9’s performance that they decided to go so aggressive on frequency and power at the expense of noise.

            personally, I could care less about the “noise” issue; it was always vapor to me.

            for me it’s a toss-up; as mentioned earlier, I didn’t expect much from Nvidia aside from the last bit o’ perf they could squeeze out of it.

            for those who don’t care, the 780 Ti is wonderful, but the darling looks to be the R9 290…. at least in its segment.

            • Airmantharp
            • 6 years ago

            Yeah, Nvidia seems to have just asked themselves, ‘why not?’. As long as they produce the measurable increase in performance over the R9 290X that GK110 was always capable of while charting lower on the acoustics results, their customers will be happy. Just like AMD, Nvidia knows that their partners have quieter open-air coolers ready to go, too.

            And when it comes down to features, it’s really TrueAudio vs. G-Sync. I still think Nvidia wins this one, as I can live with Audio staying the way it is, I think :), but G-Sync solves a problem that’s been annoying gamers like myself for well over a decade. Luckily, G-Sync looks fairly straightforward to implement on the GPU side, so AMD may actually have the advantage here if they can get it working!

            • clone
            • 6 years ago

            audio, while interesting, is vapor to me. G-Sync can’t be ignored, but I’m a legacy guy, and any added DX functionality immediately gets the nod, especially when it’s in the new-generation consoles as well.

            the user base with DX 11.2 is where I’m looking.

            • Airmantharp
            • 6 years ago

            DX 11.2 is still a ‘wait and see’ thing for me, along with Mantle. It’s very hard to objectively quantify the advantages of such technologies with examples in hand, and forecasting their effects is downright impossible :).

            Sadly, G-Sync is the one technology that’ll make a universal difference- yet it isn’t universal yet :/.

      • Medallish
      • 6 years ago

      I was quite amazed at the price as well; AMD could easily have asked $449, but this is most likely due to the fact that they got a lot more performance out of the 290 just before release, and before that update it was going to be positioned between the 770 and the 780.

      Btw, fan noise isn’t that bad, according to everyone:
      [H]
      “Fan noise at 47% is negligible, we heard the fan, but it was nowhere near annoying or loud while gaming. Even when we increased it to manual of 100% it did not increase much beyond 50% while gaming.”

      Personally, though, I never stick with stock coolers, not even nVidia’s, if I can avoid it; my preferred graphics solution company isn’t my preferred cooling solution company.

    • RdVi
    • 6 years ago

    The performance and price were definitely a surprise. Now for those aftermarket coolers.

    I was going to wait a year for 20nm, but given the price, I might just jump on one of these if aftermarket coolers make the difference I expect them to make. I blame my 1440p monitor for the upgrade itches… 🙁

    • Bensam123
    • 6 years ago

    And there are results that I think even the most hardened Nvidia supporter can’t argue with, I hope. A $400 graphics card on par with a $1000 one (unless we’re all going to pretend the Titan never existed). AMD definitely has been going balls deep and it’s showing. There isn’t a lot of room for interpretation here… all they have to do now is fix the biases that exist and still persist (like AMD has bad graphics drivers).

    I appreciate the nod to rear-exhausting fan coolers. This has been something I, as well as others, have been pointing out for a while. Exhausting hot air out the back of the case completely removes it instead of expecting case fans to do all the work. Test benches are completely different from a normal case. Even if a double or triple cooler performs well on a test bench, that doesn’t mean it’ll do the same inside of a case (especially with poor airflow).

    TR should consider adding a 7850, 7870, and/or 7950 (and Nvidia equivalents) crossfire setup in here to compete on the price comparisons. You could get a 7950 setup for almost the same as an R9 290; it’d be interesting to compare the performance, power draw, and noise of such setups. Seeing as Crossfire/SLI have matured quite a bit over the years, a lot of people are looking at the prospect of upgrading their last-gen cards with an extra one to extend their life instead of simply buying a newer card.

    I see such results as almost more relevant than adding older-gen cards in (like 2-3 gens back).

    As far as I can tell, upgrading my 7870 with an extra card for $150, for instance, is a superior solution in just about every way compared to buying an R9 290, but there aren’t really sites that compare cards in such a way. We have a thread on the forums currently discussing such a thing too:

    [url<]https://techreport.com/forums/viewtopic.php?f=3&t=90293&p=1183681#p1183681[/url<]

      • SetiroN
      • 6 years ago

      Lol, no. You aren’t very literate about TR’s work in real world graphics performance during the last two years, are you?
      Many sites compare cards in such a way. Serious ones won’t even consider them as alternative options, until AMD actually fixes crossfire frame pacing to run in every game.
      “Superior solution” haha. I was hoping to never see such nonsense again, here on TR at least.

        • Bensam123
        • 6 years ago

        No, I suppose not… that’s why I’ve been commenting on TR articles for over a decade. I’m sure you’ve seen such posts in the comments section though, being the literate one you are.

        I have yet to see a website that compares last gen solutions to current ones on a price/performance curve. If you of course believe these exist, you’re welcome to point them out.

        AMD fixed frame times in crossfire solutions (outside of 4K and big Eyefinity setups, which 99% of users don’t have). I was also talking about Nvidia (SLI and crossfire) on a price/performance graph. You, once again being ever so literate, of course noticed this.

        I was hoping I’d never see your nonsense on TR ever again too? (not a very nice thing to say, but I’m sure you can appreciate this)

        • BestJinjo
        • 6 years ago

        You are behind the times and don’t understand the new CrossFire tech on the R9 series.

        The R9 290/290X have dual XDMA engines, which means CrossFire is now done via PCIe, not via CrossFire bridges.

        “We’ve been telling our readers for years that CrossFire just didn’t feel as good as SLI while gaming. Those times have changed, at least on the new Radeon R9 290/X series. The new CrossFire technology has improved upon the CrossFire experience in a vastly positive way. Playing games on the Radeon R9 290X CrossFire configuration was a smooth experience. In fact, it was smoother than SLI in some games. It was also smoother on the 4K display at 3840×2160 gaming, and it was noticeably smoother in Eyefinity at 5760×1200.

        We saw from 20-40% performance advantages with R9 290X CrossFire over GTX 780 SLI. These are real, large increases, in performance. That extra money spent, is more than going into more performance delivered and a better gameplay experience.”
        [url<]http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/10#.UnrJD_nbTXk[/url<]

          • Airmantharp
          • 6 years ago

          Now we just need TR to verify AMD’s solution to ensure that there isn’t any frame-time goofery going on. And AMD to provide support for G-Sync monitors :).

      • Lazier_Said
      • 6 years ago

      Exhausting coolers do indeed take the load away from the case fans. Big, slow fans in relatively open locations which can take that load no sweat.

      And they do it by greatly restricting airflow on what’s already the most restricted, hottest, and noisiest component in the system.

        • Bensam123
        • 6 years ago

        People say that all the time, but no one has tested it. None of the benchmarking results for heat and noise are done in a standardized enclosure so we can actually see real world benchmarks as far as that goes.

          • Airmantharp
          • 6 years ago

          Kyle at the [H] did an article a while back where he shoved a GTX260 immediately between a pair of GTX480s into what is still the best air-cooling enclosure on the market, Silverstone’s FT-02. Said that it was the quietest he’d ever heard the GTX480s perform under load, but it sure did heat his room up fast :).

            • Bensam123
            • 6 years ago

            The enclosure didn’t make any noise?

            • Airmantharp
            • 6 years ago

            The noise level was far below what the same setup would have produced in any other enclosure available at the time.

            [url=http://www.hardocp.com/article/2010/09/30/my_quiet_galaxy_geforce_gtx_480_sli_build#.UnqJ0fltitM<]Here's the article, for reference.[/url<] He used the uglier/cheaper RV-02, same thing as the FT-02, just with more plastic. The FT-02 is still the premier air-cooling case, as old as it is!

      • clone
      • 6 years ago

      I’d wait 2 months before getting excited about crossfiring HD 7950’s or HD 7870’s…… the Crossfire drivers have improved radically for the XDMA’s but it’s still an unknown if the older Crossfire setup has an inherent flaw or not at 4k resolution.

      and that’s a question that is aimed squarely at your focus.

      on a side note: it was nice to read a more in depth explanation about the acoustic characteristics of the reference cooler with a definitive “the noise isn’t annoying.”

      personally shocked to see the price is $150 below R9 290X.

        • Bensam123
        • 6 years ago

        You can do crossfire without XDMA on older cards.

        Almost no one has a 4K monitor, so that’s a non-issue unless you plan on dropping $3.5k on a 4K monitor in the next few months.

        All of that aside, this pertains to Crossfire AND SLI, AMD AND Nvidia. It’s weird that both you and Set seem to think Crossfire is the only option I’m talking about here. I did mention both in my original post. Both offer a value proposition: upgrading an older card instead of buying a newer one.

          • clone
          • 6 years ago

          in a thread about AMD, in your post you mention AMD products several times over with SLI getting a nod in brackets…… what did you think we’d all assume?

          anyway I just did the work for you by looking at some more comprehensive reviews on R9 290X and all of the cards included in the list.

          the card I chose to compare was the HD 7870 because you’ve already got one.

          while you will on occasion (rarely) get close to an R9 290 in performance, you will not surpass it, and you will always have lagging support and higher CPU overhead, as is the nature of Crossfire and SLI, along with higher power consumption and heat in your case.

          you’ve already mentioned 4K isn’t your goal here, and for 2K I’d say that for the immediate future 2x HD 7870s is more than enough.

          it’s a given you won’t get much for your HD 7870, and it’s also a given it won’t cost much to get a 2nd HD 7870. add this to your focus on doing it and looking for any reason to justify it….. hell, just go for it, you don’t really have much to lose, so take the leap and good luck.

            • Airmantharp
            • 6 years ago

            He really should just go through with it. At least both cards would be GCN, and if they’re 2GB cards, he should be okay for a year or so.

            ‘Course, a single custom-cooled 290 or a GTX770/780 would be a better deal in the long haul, with more RAM and more legs. But hey, we’ve already worked all that out in the forum :D.

            • cynan
            • 6 years ago

            I think there are too many unknowns to say for sure. If you have the extra cash now and want to play it a bit safer, then I’d agree: the faster single GPU always wins, especially in the case of going multi-GPU with AMD, as they’ve not yet convincingly been able to foster confidence in their drivers as far as crossfire configurations go, despite the frame-pacing fix that came this summer.

            However, if you’re already the owner of something like a 7950/70, then for $100-$150 less than an R9 290 you’ll get performance that significantly exceeds either a GTX 780 or a 290. But no, perhaps not at 4K; frame-pacing issues aside, these older cards don’t have the memory bandwidth or buffer space. But they could very well hold up for a while at 4MP and less (@60Hz).

            The point is, it all depends how risk-averse you are and how much you are willing to budget at the moment. Spending < $300 on a second HD 7950/70 could end up being very good bang for your buck at the moment and hold up fine for the next couple of years. Or, if driver support isn’t there, and memory requirements of games suddenly go through the roof (which is still not certain, despite the “next-gen” console titles looming), perhaps not so much.

            I think recommending a GTX 770 at this point to anyone looking for a top-tier gaming GPU, however, is not doing anyone any favors. 2GB is simply not going to be enough for anything over 1080p. And if you’re talking about the 4GB version, that’s even worse, as they’re currently priced higher than the R9 290 and are only a hair or two shy of the 780’s new price, though that could change quickly.

            • Bensam123
            • 6 years ago

            RAM seems to be something that is just as blown out of proportion as the 4k crossfire crap. 3GB of memory isn’t going to save you from the extra 2GB you’re missing. This is, of course, if it’s fully exploited in the near future. Not like software is ever made to scale down to less robust hardware either.

            • Airmantharp
            • 6 years ago

            It’s blown out of proportion for now- even BF4 seems to run smooth when constrained by VRAM, but as you say, it is most definitely made to scale down quite well.

            The real difference will be made clear when games ship with some real assets- BF4 is not any more impressive than BF3, graphics-wise. There’s more stuff, sure, but the texture quality is only slightly improved as a whole. Expect significantly more from true ‘next-gen’ games, and the resulting sacrifices in fidelity to allow them to run on lower-end hardware to be harder to swallow.

            Heck, I’m in the same boat as you. I have 2GB cards at 1600p, which return roughly the same framerates two of your cards would at 1080p, and I’m going to be feeling that crunch soon too, and worse, as I’m looking toward 4k as well.

            • Bensam123
            • 6 years ago

            IF and WHEN… You don’t recommend 3GB of vram right now for the same reason you don’t recommend everyone buy a hex core, simply because you don’t need it right now or any time soon. Will it make a difference right now? Sure… Will it in the future? Also sure… But is it worth spending the extra hunk of dough on it right now when there are other really good options available? No…

            Building for the future is always good, but only up to the point where you’re spending an inordinate amount of money to do so. In two years, when you may need a VRAM upgrade to use these mega textures, it will be right around the time when you’ll be looking at video card upgrades anyway. There is so much changing right now that we really don’t have any idea how much VRAM will be on video cards then, how HSA will influence EVERYTHING, and what Mantle will change… Of course, there will also be new doodads and thingermajigs on video cards you’ll want to purchase them for as well.

            Video cards only last so long…

            • Airmantharp
            • 6 years ago

            You’re spot on- but the VRAM thing starts to make a whole lot of sense when you look at the bigger picture. For me, it’s hard to consider purchasing or recommend purchasing a video card with less than 4GB-6GB of VRAM right now. I use them for at least two years, and that’s probably below average, putting any purchase made today well into the timeframe that true next-generation games will be hitting the market.

            These new consoles don’t really have a lot of rendering power, but they do have an unprecedented amount of memory available to games. If you build it, they will develop for it; and then they’ll port it to the PC with even more stuff available.

            My expectation? If you buy a card or set of cards with 2GB or 3GB of VRAM now, you’ll be wishing you’d gotten more before this time next year.

            • Bensam123
            • 6 years ago

            It’s that I’m more familiar with the AMD lineup than Nvidia’s, so I don’t know the exact matching cards offhand. If I wasn’t talking about Nvidia, I WOULDN’T MENTION THEM.

            You’re free to read into the text all you want, but what I said is sitting right there.

            I mentioned the 7870 as an example. The overall bottom part of the thread was about dual GPU setups and their price/performance compared to buying a big new GPU.

            Part of why I’m suggesting this is because no one is testing it. You’re once again free to point me to benchmarks showing the 7870, 7950, or any of the Nvidia equivalents compared to an R9 290 in a summarized price/performance plot. From what I’ve seen, though, the 7870 in CrossFire surpasses a 7970 by quite a wide margin, which in turn is surpassed by a 290X by a not-so-wide margin. But that’s all from looking back and forth between different tests and comparing different cards, which is why I was asking TR to test things.

            • Airmantharp
            • 6 years ago

            I’m surprised that we’re not seeing such reviews either, given the inherent value involved, but I suspect that it may be related to AMD’s driver priorities. I’ll definitely be pining for such comparisons when they finish out their WHQL frame-pacing driver for sure!

            • clone
            • 6 years ago

            OK, found one; look below, I’ve provided a link.

            • Bensam123
            • 6 years ago

            AMD or Nvidia… both should be compared for mid-range sli/crossfire upgrades compared to current gen cards.

            • clone
            • 6 years ago

            This R9 290X review has some SLI results; you’ll need two GTX 760s or better in SLI to compete with it.

            [url<]http://www.legitreviews.com/amd-radeon-r9-290x-video-card-review_126806[/url<]

            • Airmantharp
            • 6 years ago

            Which sounds about right when you scale everything out and account for imperfect SLI scaling- and makes the value prospect of custom-cooled versions quite enticing. In theory, I could replace my GTX670’s with one and likely wind up with better performance, but given that I’m set on doubling my resolution to 4k sooner rather than later, two’s the minimum :).

      • chuckula
      • 6 years ago

      [quote<] A $400 graphics card on par with a $1000 one (unless we're all going to pretend the Titan never existed). [/quote<]

      What I love about you is that you selectively pretend that the Titan is literally the only GPU that Nvidia sells other than maybe a GTX-640. You are expressly ignoring the GTX-780, which isn't really much more expensive than the 290 while offering much better acoustics in the process.

      Where was this process of yours when AMD launched the FX-9590 at $900? Where was your post saying "The 4770K is a CPU at $330 that is faster than a $900 AMD CPU... (unless we're all going to pretend the FX-9590 never existed)."? I can tell you where it was... in the back of your head, where you killed it once you realized it couldn't be twisted in a half-truth way to make AMD look good and everyone else look bad.

      • superjawes
      • 6 years ago

      [quote<] A $400 graphics card on par with a $1000 one (unless we're all going to pretend the Titan never existed). [/quote<]

      No one is pretending that the Titan doesn't exist, but--as stated numerous times--Titan's value is more than just gaming. No one should buy a Titan unless they know, beyond a shadow of a doubt, that they need more than just gaming power.

      A fairer comparison is to the 780, and I would likely recommend a 290 over a 780 because the performance is similar for $100 less.

        • Bensam123
        • 6 years ago

        I don’t know… if you’re buying a Titan for more than gaming, wouldn’t you be looking at something not marketed towards gaming, like a Quadro?

          • superjawes
          • 6 years ago

          A Quadro would not have good gaming performance, though. The example I used last week was Youtube Let’s Play content. Such users need strong gaming performance for quality, but they also can make heavy use of the computing ability since, after they record content, they have to edit and render the video into a format that can be uploaded.

          Ideally you would record on a gaming machine and export to an editing/rendering machine, but that is far less economical than being able to do both on one.

            • Airmantharp
            • 6 years ago

            I stand by superjawes’ assertion that a strong compute card is a godsend for heavy content-creation users. My long-gone GTX570 would blow these GTX670’s out of the water, for instance, and a pair of Titans would be ideal for my setup, if they made 12GB versions for 4K use.

            Granted, I’d prefer not to take out a loan to support a hobby… so that’s never going to happen 🙂

            • Bensam123
            • 6 years ago

            Yeah and if you’re actually rendering things you’d look into buying a card actually made for it, a Quadro.

            Titan is a curiosity marketed towards gamers. That’s why it’s called a ‘Titan’ and not a low end Quadro model.

            • Bensam123
            • 6 years ago

            I stream almost every night, and almost all of it is done on my CPU… encoding. None of it is done on the GPU. If you’re talking about the ‘Let’s Play’ series, they don’t do any rendering beyond normal video game work; the rest is done on the CPU.

            If you’re talking about actual video editing applications, that’s different and falls into the Quadro category. A Titan doesn’t help at all with streaming or recording any more than a 780 does.

            People don’t encode with GPUs, whether Intel, AMD, or Nvidia, using DirectCompute or CUDA. They barely support their built-in encoders. OBS only started supporting Quick Sync last month, and you need to jack up the bitrate to use it.

          • cobalt
          • 6 years ago

          Quadros, like ‘real’ Teslas, cost a lot more. From a quick price search, the Quadro K6000 is around $5,000, and the Tesla K20c is $3,000. The Titan is a steal if you need high double-precision compute capability but can skip things like GPUDirect. But it’s not a great value if you only want gaming performance; that’s where the 780’s strength lies on the NVIDIA side.

            • Bensam123
            • 6 years ago

            Yup… Those are real professional grade cards. I don’t think anyone is looking to ‘skimp’ on things when they start getting into workstation rendering.

            • cobalt
            • 6 years ago

            There’s a (growing) difference between compute and rendering — the Titan is more like a personal Tesla. The Quadros still have some interesting features that make them useful for various rendering things, like CAD, render farms, and display walls.

    • puppetworx
    • 6 years ago

    Consider your jimmies rustled, Nvidia.

      • Airmantharp
      • 6 years ago

      Just means another price drop!

    • slowriot
    • 6 years ago

    Damage, any clues on when R9 290 and 290X cards with third-party coolers might start arriving?

      • HisDivineOrder
      • 6 years ago

      The question that is on everyone’s lips. We’re all like, “Wow, great performance, great price… sooo… how long do we have to look at the tease and not get the goods?”

      Because this card is way, way too loud. Just read anandtech’s review to see that. I don’t know what’s up with TR, but man that noise level is unacceptable and I can’t believe they’re justifying it as, “Hey, I’m good. I can handle it. I can.”

      I think maybe reviewing the 290X destroyed your hearing and now you can’t hear the 290 as well. 😛

      I really don’t think anyone should be encouraging a move back toward loud video cards. We just got off that train not so long ago and here you are inviting nVidia to ramp up the 780’s fan speeds to “catch up” to AMD?

      Because that’s what they could easily do. Ramp up the max temp, ramp up the fan, and suddenly nVidia’s back in the game easy as pie. Anyone could play that game.

      It’s harder to make a card that performs with great acoustics. Clearly, AMD can’t do it.

        • BestJinjo
        • 6 years ago

        No, it’s more like AMD does the bare minimum with the reference design, but this allows AIBs to differentiate their products and come up with superior coolers/upgraded components. It’s no wonder AIBs prefer AMD over NV, and many have complained for years after working with NV. Regardless, slowriot’s comment was asking when aftermarket R9 290s would show up, but instead you went on a tangent discussing the reference design.

        You realize PC enthusiasts who run up to three cards and overclock on air cooling prefer open-air coolers to reference designs anyway? Very few people even consider reference NV or AMD cards.

          • Airmantharp
          • 6 years ago

          People running two or three (or four) cards prefer blower-style coolers, whether they be reference designs, offered by AIBs, or as separate aftermarket solutions they themselves install. Assuming they don’t just put the cards under water, which is recommended for most triple- or quad-card setups.

            • Diplomacy42
            • 6 years ago

            I would never get anything that exhausted into my case, ever.

            • f0d
            • 6 years ago

            I actually prefer the open-air coolers, as I have plenty of airflow in my case and they perform great (my favorite is the Gigabyte Windforce, which I have two of atm).

      • Antimatter
      • 6 years ago

      Rumours suggest at the end of the month.

      • mesyn191
      • 6 years ago

      Updated rumor: Kyle heard mid-December.

      [url<]http://hardforum.com/showpost.php?p=1040382043&postcount=1[/url<]

      • JustAnEngineer
      • 6 years ago

      I thought there was a clue here, but it seems to have disappeared again.
      [url<]http://www.sapphiretech.com/presentation/product/?cid=1&psn=000101&lid=1&psn=&lid=1&leg=0&gid=3&sgid=1227[/url<]

    • funko
    • 6 years ago

    How come cards in the AMD 7XXX series are not included in the latest reviews, but the 6XXX and 5XXX are? Did I miss something?

      • Damage
      • 6 years ago

      The Radeon R9 280X included here = Radeon HD 7970 GHz Edition.

      Same thing, give or take a few MHz.

      The Radeon R9 270X = Radeon HD 7870.

      So the same Tahiti and Pitcairn GPUs are right there in yer face. 🙂

        • funko
        • 6 years ago

        got it, thanks!

        • Bensam123
        • 6 years ago

        Not everyone realizes this, that’s why it’s important to still have them for the sake of price comparisons. You could even label the R9-280x as a 7970 equivalent or something, just so people know that they’re pretty much identical.

          • Srsly_Bro
          • 6 years ago

          I mentioned that in the review of the 290x. Apparently informing the readers is too much of a hassle.

            • Damage
            • 6 years ago

            You mean like this:

            “In all cases, though, the 290X offers a nice increase over the Radeon R9 280X—which is just a re-branded Radeon HD 7970 GHz Edition, essentially.”

            [url<]https://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/6[/url<]

            Do you even read, bro? Seems to me like you're not really offering anything constructive in your criticism.

            • esterhasz
            • 6 years ago

            You’re right, of course, but it could be made even more explicit without too much additional work, e.g. by creating a comparison table that could be reused in future reviews.

            I went back to the 6870 review (it’s my current card) to get reminded again that the 5870 is basically identical in performance. I didn’t mind, but it’s indeed hard sometimes to fit one’s own current performance into the charts…

            • Srsly_Bro
            • 6 years ago

            nope. I went straight to the benches. thanks for replying to me though. 🙂 +1

    • danny e.
    • 6 years ago

    Hopefully, the shrink to 22nm next year fixes some of the heat/noise issues. It might not be dust-buster loud but it’s sure awful hot. And it isn’t whispering.

      • NeelyCam
      • 6 years ago

      TSMC calls it 20nm.

      And I’m not sure if it’ll fix anything; the people moving from 40nm to 28nm had a choice to either make the system quiet or make it fast. Enthusiasts went with “fast,” and this is the result. The same thing is gonna happen at 20nm.

    • cynan
    • 6 years ago

    As soon as these puppies start shipping with some decent aftermarket direct-flow, multi-fan coolers, these will be pretty sweet for the price. Until then, I wouldn’t touch ’em.

    With GPUs such as this drawing 350W and running at 94 deg C, maybe we’ll finally see more than one closed-loop aftermarket GPU liquid cooler? Heck, it’s about time OEMs started offering them on their higher-end offerings out of the box. I know I’d pay an extra $40-$50 if it meant quiet performance at sane temperatures, with some actual overclocking headroom to boot.

    • derFunkenstein
    • 6 years ago

    So it would have been just as compelling but quieter if they hadn’t messed with the fan speeds. Lame.

    Definitely time to reconsider the GPU fan, though. Yikes, those temps. I know it would just throttle down without the higher fan speed rather than run hotter; it can’t run much hotter than it is.

    • DeadOfKnight
    • 6 years ago

    LET THE PRICE WAR BEGIN!

      • Airmantharp
      • 6 years ago

      My thoughts exactly-

      12GB GTX780Ti, I’m looking at two of you.

        • BestJinjo
        • 6 years ago

        Do you have any benchmarks that show 780’s 3GB of VRAM being insufficient for games against the Titan’s 6GB VRAM? I can’t see how having 12GB will benefit this generation of cards for gaming alone, unless you use your card for compute tasks in which case the 780Ti is probably not suitable since it will be DP-crippled.

          • Srsly_Bro
          • 6 years ago

          His epeen would shrink considerably if he went with 3GB instead of 12.

          • Airmantharp
          • 6 years ago

          I was being facetious, given that Nvidia has been a bit stingy with graphics memory, but I do consider 6GB to be my minimum for a new GPU. It just won’t be happening this generation.

            • BestJinjo
            • 6 years ago

            6GB the minimum for a gaming GPU? Most games use less than 2GB of VRAM. Otherwise, GTX680/770 2GB would tank in benchmarks. I can’t think of many that use more than 3GB at resolutions up to 2560×1600. Can you?

            6GB of VRAM might matter if you are running 3x 4K monitors; however, in that case you won’t have enough GPU power with today’s generation.

            • Airmantharp
            • 6 years ago

            [i<][b<]MY[/b<][/i<] minimum. That's next gen games with more assets at 4k. If you have the VRAM, you can just add cards until you have the performance you need.

    • Krogoth
    • 6 years ago

    Another interesting review; shame that AMD hasn’t bothered to improve its mediocre reference HSF.

    Anyway, 290 is another round in the price wars. Nvidia has no choice but to reduce MSRP on 770 and 780 to make them price competitive. 280 and 770 are starting to look very attractive for people who want a ton of power without spending a mortgage payment.

      • HisDivineOrder
      • 6 years ago

      I don’t know. nVidia may try to ride it out based on the fact that the 770 wasn’t dropped to the same price point as the R9 280X. nVidia probably thinks, “Hey, we have a bundle and we’re a LOT quieter. We’ll sell.”

      And I think they may be right about that. The R9 290 is really, really loud. Most sites are saying it’s louder than any card released (relatively) recently and it’s putting off a LOT of heat, too. Whether it goes into your case or your room, it builds up faster than you think.

      How many are going to eventually see the argument between the R9 290 vs 780 turn into an argument for efficiency and acoustics? I suspect more than a few will.

      Then toss in the bundle. Right now, the bundle is going for over $50 on ebay, but I’m sure that’ll go down in time. Still, right now it looks pretty damn good and AMD’s got… no bundle as of yet.

      They keep saying they’re going to add it, but they haven’t done so?

    • JosiahBradley
    • 6 years ago

    I want this card as bad as I did the 9700 Pro. You will be mine! R300 vs R290? A series that was ripe for the 9xxx moniker. It’s like Déjà vu in the best way.

      • Krogoth
      • 6 years ago

      Reminds me more of the X1900 and HD 4870 than anything else.

        • derFunkenstein
        • 6 years ago

        Hot, loud, and fast. Good comparisons.

          • JosiahBradley
          • 6 years ago

          I can fire up my X1900XT right now and it only has one of those qualities you mentioned, fast. Maybe you had a bad case or improper ventilation.

          I ran that card for 6 years without a problem. That’s also a ton of catalyst updates.

        • JosiahBradley
        • 6 years ago

        My last video card (before this excellent 7970 Lightning) was the X1900XT. Still have it too and it works. But before that was the 9700 Pro and I beat it to death playing UT. My old nVidia 4600 Ti couldn’t match that level of awesome. And my bro was stuck on the GeForce MX 200. But everything was an upgrade from the TNT2 and the z-80 I was gaming on as a kid. Nostalgia!

          • Airmantharp
          • 6 years ago

          You don’t upgrade very often, do you? 🙂

          (The 4600 Ti competed quite successfully against the 8500 Pro it launched against; the 9700 Pro was the next generation and competed with the initially ill-fated ‘FX’ series.)

            • JosiahBradley
            • 6 years ago

            I upgrade when I have the money, and to whatever seems to be the best deal. I lean towards Ati/AMD a bit more because they match my philosophy on open computing a bit more. But before the 7970 Lightning I was looking more into the GTX 680, because I love Skyrim and mods, but by the time I got around to it, and drivers improved, an OC’d 7970 beat out the OC’d 680 and had the better price.

            Edit: You should see my CPU upgrades. z80, p3, 2700+, 5000+ X2, 1090T X6, 4670K.

            • Airmantharp
            • 6 years ago

            The CPU upgrades are all solid- mine look much the same, just trade the X6 for a C2Q and throw in a few Pentium IV’s I resorted to using instead of dealing with VIA’s crappy chipsets :).

            Otherwise, I have to agree for the HD7970 in a single-GPU setup. I went with Nvidia because my previous HD6950 Crossfire setup was haphazard at best when running BF3, which I bought the cards for, and because the HD7970’s were just too damn hot and too damn loud with the reference blowers, and the ‘open air’ solutions just aren’t well suited for running more than one card at once :D.

          • Krogoth
          • 6 years ago

          I had an ASUS X1900XT for a while, until one of the solid caps blew out. 🙁 The card was fast and a bit toasty for GPUs in its heyday.

          • jihadjoe
          • 6 years ago

          I had a 9500 that unlocked! I was so stoked when it did.

      • Jakall
      • 6 years ago

      I think you mean you want this card as bad as we all wanted the 9500 Pro. That was the equivalent of the R9 290.

      • dashbarron
      • 6 years ago

      I was just thinking something along those lines because I had a 9700 PRO and have had two (x2 SLI) NVIDIA GPUs since. This is a very positive sign that AMD is offering some very competitive products at an attractive price. I’d be happy to switch back to AMD for my next setup in 3-4 years.

    • torquer
    • 6 years ago

    I personally prefer Nvidia and have run their stuff exclusively for years. I own 2 780s and 2 670s.

    I am extremely pleased to see AMD competing so well. Competition is a wonderful thing, even if the Internet can’t see the forest for the trees.

    All Nvidia and Intel fans should be rooting for a thriving and competitive AMD. Without them, our beloved companies would turn to the dark side quicker than Anakin Skywalker.

      • Airmantharp
      • 6 years ago

      And without the dark side, there can be no light..

      See how quickly AMD prices up their products when they have a clear advantage- they’re just as bad :D.

        • torquer
        • 6 years ago

        When competition is fierce and thriving, everyone wins. In a monopoly, everyone loses (including the monopolist, eventually)

        Everyone loves to bash on the phrase “corporations are people, too,” but in a way it’s true. As goes human nature, so goes the nature of corporations. It is natural for companies without strong competition to start “being evil,” or at the very least become complacent. Just look at Blackberry. Eventually a plucky upstart will come along to unseat them, but in the meantime none can resist the urge to raise prices, slow innovation, etc.

        *IF* AMD were ever to fail, it would be a sad day even if I haven’t owned any of their products in a few years. We should all remember that if it weren’t for AMD, the Core architecture from Intel would likely not exist.

          • jihadjoe
          • 6 years ago

          A bit OT, but going on that tangent:

          If it wasn’t for AMD, then maybe the horrible NetBurst architecture would not exist! Intel basically did that because they were getting into a MHz race with AMD’s Athlon.

          And remember that the first iterations of the Core architecture (Core and Core 2) were basically just a return to the old P6 with lower power consumption, a double-speed FSB, and an improved memory interface.

          P6 was just THAT good. The first generations of Athlons running DDR were actually slower clock-for-clock than Pentium IIIs running SDR RAM.

          • Diplomacy42
          • 6 years ago

          insert rant about the supreme court ruling that corporations are people here.

          {very angry rant}

        • BestJinjo
        • 6 years ago

        5870 was $369 and stomped all over the GTX280.
        7970 was $549 when 580 1.5GB was $449 and 580 3GB was $500-550.
        R9 290X is faster than the 780 but AMD released it at $549 when 780 cost $649.

        Your statement is not true.

          • Airmantharp
          • 6 years ago

          Your statement is shortsighted.

          When either company is ahead of the other, they price stuff as high as they can get away with. When they’re having to compete, they do the same thing, except that what they can get away with is usually lower. Note that Nvidia can usually get away with higher prices for longer periods of time than AMD; that’s something AMD has brought upon themselves.

      • anubis44
      • 6 years ago

      “All Nvidia and Intel fans should be rooting for a thriving and competitive AMD. Without them, our beloved companies would turn to the dark side quicker than Anakin Skywalker.”

      I believe that happened a long time ago…

        • superjawes
        • 6 years ago

        [quote<]I believe that happened a long time ago...[/quote<] In a galaxy far, far away...

    • End User
    • 6 years ago

    Another great review.

    Can you mention memory usage in future reviews?

      • Jon1984
      • 6 years ago

      Guru3D has memory usage in its reviews lately, although it would be much better to see all the good info here on TR 😉

    • Anvil
    • 6 years ago

    Nice review, and a nice card. I was expecting the 290 to be 450 bucks, but was hoping for the $400 price point. Looks like Nvidia’s thunder has been stolen again in the space of a week and change.

    Also, in before First, etc.
