AMD’s Radeon R9 290X graphics card reviewed

Well. The run-up to the release of this here graphics card has certainly been unusual. AMD revealed a bunch of details about the Radeon R9 290X and the new “Hawaii” chip on which it’s based at a press event one month ago. As a result, most folks have had a pretty good idea what to expect for a while. Today, the 290X should be up for sale at online retailers, and it’s finally time for us to review this puppy. Let’s have a look, shall we?

Say aloha to Hawaii

Not only is the Radeon R9 290X a beefy graphics card intended to compete with the likes of the GeForce GTX 780, but it’s also something else: the platform for a truly new chip, with updated technology inside. Most of the rest of the cards in the Radeon R7 and R9 series introduced recently are renamed and slightly tweaked cards based on existing silicon. Not so here. The Hawaii GPU that powers the 290X represents the next generation of GPU technology from AMD, with a number of incremental improvements over the last gen.

| | ROP pixels/clock | Texels filtered/clock (int/fp16) | Shader processors | Rasterized triangles/clock | Memory interface width (bits) | Estimated transistor count (millions) | Die size (mm²) | Fabrication process node |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GK104 | 32 | 128/128 | 1536 | 4 | 256 | 3500 | 294 | 28 nm |
| GK110 | 48 | 240/240 | 2880 | 5 | 384 | 7100 | 551 | 28 nm |
| Cypress | 32 | 80/40 | 1600 | 1 | 256 | 2150 | 334 | 40 nm |
| Cayman | 32 | 96/48 | 1536 | 2 | 256 | 2640 | 389 | 40 nm |
| Tahiti | 32 | 128/64 | 2048 | 2 | 384 | 4310 | 365 | 28 nm |
| Hawaii | 64 | 176/88 | 2816 | 4 | 512 | 6200 | 438 | 28 nm |

The main thing Hawaii is, though, is bigger: larger than the Tahiti chip in the Radeon HD 7970 (and R9 280X), with more of everything that matters. As you can see in the table above, Hawaii has very high counts of every key graphics resource. In fact, Hawaii matches up well on paper against the GK110 chip that drives the GeForce GTX 780 and Titan, even though it’s over 100 mm² smaller in terms of die area—and both of those GPUs are manufactured on the same 28-nm fab process at TSMC.

At its core, Hawaii is based on familiar tech: the Graphics Core Next architecture first introduced in the Radeon HD 7000 series. However, this is the next iteration of GCN, with some minor tweaks to the compute units and larger changes elsewhere. Also, AMD has overhauled the layout in this GPU in order to ensure the right performance balance at its larger scale.

The chip’s graphics processing resources are broken down into four separate “shader engines,” each one almost an independent GPU unto itself. Graphics tasks are load balanced between the four engines. Each shader engine has its own geometry processor and rasterizer, effectively doubling the primitive rasterization rate versus Tahiti. That upgrade should improve performance when there are more polygons onscreen, particularly with higher levels of tessellation. In addition, the geometry units have been tweaked to improve data flow, which makes sense. By all accounts, the geometry amplification that happens during tessellation remains a hard problem for GPUs to handle.

If you’ve been following these things for even a little while, looking at these shader engines will make you feel old. Each one of them has four render back ends capable of blending and outputting 16 pixels per clock cycle. That was pretty much a whole GPU’s worth of pixel fill and antialiasing power back in the day. And by “the day,” I mean two weeks ago, when we reviewed the Radeon R7 260X. Each shader engine also has 11 of the GCN compute units that give the GPU its number-crunching power. Every CU has four 16-wide vector math units. That works out to 704 “shader processors” per engine, again almost enough to match the scale of a mid-range GPU.

Now, multiply everything in the above paragraph by four, and you’ve got Hawaii, with a total of 2816 shader processors and 64 pixels per clock of ROP power. At an even 1GHz, Hawaii is capable of 5.6 teraflops of single-precision compute, making it easily the new leader in consumer graphics chips. For compute-focused applications, it can handle double-precision floating-point math at one quarter that rate, still well in excess of a teraflop.
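
Those peak numbers fall straight out of the unit counts above. Here's a quick back-of-the-envelope sketch, using the standard convention of counting a fused multiply-add as two FLOPs:

```python
# Back-of-the-envelope check of Hawaii's peak shader rates.
# Convention: a fused multiply-add counts as 2 FLOPs.
shader_engines = 4
cus_per_engine = 11
lanes_per_cu = 4 * 16            # four 16-wide vector units per compute unit

shader_processors = shader_engines * cus_per_engine * lanes_per_cu
clock_ghz = 1.0

sp_tflops = shader_processors * 2 * clock_ghz / 1000   # single precision
dp_tflops = sp_tflops / 4                              # Hawaii does DP at 1/4 rate

print(shader_processors)    # 2816
print(round(sp_tflops, 2))  # ~5.63 teraflops
print(round(dp_tflops, 2))  # ~1.41 teraflops
```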

All of this computing power is backed by a 1MB L2 cache. This cache is fully read/write capable and is divided into 16 partitions of 64KB each. The L2’s capacity is a third larger than Tahiti’s 768KB L2, and AMD says bandwidth is up by a third, as well. The firm claims Hawaii’s L1 and L2 caches can exchange data as fast as one terabyte per second, which is more than I can say for my USB 3.0 drive dock.

Oddly enough, the most intriguing thing about Hawaii’s basic architecture may be a fairly straightforward engineering tradeoff. The chip has eight 64-bit memory interfaces onboard, giving it, effectively, a 512-bit-wide path to memory. In order to make that wide memory path practical while keeping the chip size in check, the Hawaii team chose to exchange the complex memory PHYs in Tahiti for smaller, simpler ones. Complex PHYs, or physical interface devices, are necessary to drive GDDR5 DRAMs at peak clock frequencies, but they also eat up silicon space. AMD claims Hawaii’s 512-bit memory interface occupies 20% less die area than Tahiti’s 384-bit interface. As a result, Hawaii’s memory operates at lower speeds. The 290X’s GDDR5 runs at 5 GT/s, down from 6 GT/s for Tahiti-based cards like the Radeon HD 7970 GHz Edition. Still, overall memory bandwidth is up from 288 GB/s on Tahiti to 320 GB/s with Hawaii, thanks to the wider data path.
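
The bandwidth side of that tradeoff is simple arithmetic; this small sketch just restates the bus widths and transfer rates quoted above:

```python
# Peak memory bandwidth = bus width in bytes x transfer rate in GT/s.
def peak_bandwidth_gbps(bus_width_bits, transfer_rate_gts):
    return bus_width_bits / 8 * transfer_rate_gts

print(peak_bandwidth_gbps(384, 6))   # Tahiti / HD 7970 GHz Edition: 288.0 GB/s
print(peak_bandwidth_gbps(512, 5))   # Hawaii / R9 290X:             320.0 GB/s
```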

Of course, Hawaii’s advantage on this front extends beyond Tahiti. Nvidia chose a 384-bit interface and 6 GT/s memory rates for its competing GK110 chip, too.

The Radeon R9 290X

The end product of all of this silicon wizardry is the Radeon R9 290X, which has a single Hawaii GPU clocked at 1GHz and 4GB of GDDR5 memory running at 5 GT/s. Here’s how it stacks up, at least in theory, versus the competition.

| | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Peak shader arithmetic rate (tflops) | Peak rasterization rate (Gtris/s) | Memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- |
| Radeon HD 5870 | 27 | 68/34 | 2.7 | 0.9 | 154 |
| Radeon HD 6970 | 28 | 85/43 | 2.7 | 1.8 | 176 |
| Radeon HD 7970 | 30 | 118/59 | 3.8 | 1.9 | 264 |
| Radeon R9 280X | 32 | 128/64 | 4.1 | 2.0 | 288 |
| Radeon R9 290X | 64 | 176/88 | 5.6 | 4.0 | 320 |
| GeForce GTX 770 | 35 | 139/139 | 3.3 | 4.3 | 224 |
| GeForce GTX 780 | 43 | 173/173 | 4.2 | 3.6 or 4.5 | 288 |
| GeForce GTX Titan | 42 | 196/196 | 4.7 | 4.4 | 288 |

The R9 290X leads the pack by a mile in several key graphics rates, including ROP pixel rate, shader arithmetic, and memory bandwidth; it trails the GTX Titan slightly in texture filtering and primitive rasterization rates, and it essentially ties the GTX 780 in those same categories. The 290X’s combination of a killer ROP rate and gobs of memory bandwidth should make it particularly well suited for multi-monitor and 4K resolutions, especially when combined with high levels of multisampled antialiasing. Surely that’s the sort of target that Hawaii’s architects had in mind.

These cards should be available for purchase today at online retailers for the low, low price of $549.99. That may sound like a lot, but it’s a hundred bucks less than the sticker on a GeForce GTX 780. Unlike many of AMD’s recent graphics cards, the 290X starts life without any sort of game bundle. I guess you can pick the games you want with that $100 savings.

You can see in the pictures that the 290X requires two aux power inputs, one six-pin and one eight-pin. The 290X’s circuit board is 10.5″ long, bog standard for this class of graphics card and a match for the GTX 780 and Titan. However, its plastic cooling shroud extends a little beyond the PCB, bringing the total length to just under 11″. Strangely enough, AMD hasn’t disclosed a power spec for the R9 290X, but the card’s connector config dictates a max power draw of 300W, so long as AMD has honored the PCI Express power limits.
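
That 300W figure follows directly from the PCI Express power budget. A minimal sketch of the math, using the spec's per-connector limits:

```python
# PCI Express power budget: 75W from the x16 slot, 75W per 6-pin
# aux connector, 150W per 8-pin aux connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

max_board_power = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # one of each on the 290X
print(max_board_power)   # 300W ceiling, assuming AMD stays within the spec
```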

Additional goodness: TrueAudio, displays, and XDMA

AMD has built several new technologies into its Hawaii chip, and some are complex enough I can’t do them justice in the time I have to finish this review. Only two of AMD’s current graphics chips, Hawaii and Bonaire, have these next-level capabilities built in. I suspect we may see more GPUs from this same family in the coming months.

The most notable of the new features is probably the TrueAudio DSP block for accelerated processing of sound effects. There’s much to be said on this subject, and I intend to address TrueAudio in more detail in a separate article shortly. For now, you might want to check out my live blog from the GPU14 event for some additional details on this feature. We don’t yet have any software to take advantage of the TrueAudio hardware, but I suspect we’ll spend quite a bit of time with TrueAudio once the first games that support it arrive.

AMD has also freshened up the display block in its latest GPUs. You can see the connector payload above. Both of the DVI ports are dual-link, and the DisplayPort output is omni-capable: it supports multi-stream transport (MST) and can sustain the pixel rates needed to drive a single-tile 4K display at 60Hz, once such mythical beasts become available. Furthermore, Radeons will support the DisplayID 1.3 standard, written by an AMD engineer, that allows for auto-configuration of tiled 4K displays—provided those displays also support this standard. The current 4K monitors from Sharp and Asus do not, but AMD intends to recognize those panels and take care of them automagically in its drivers.

Perhaps the biggest change here is the elimination of the requirement that multi-monitor Eyefinity configs include at least one DisplayPort connection. With the 260X and 290X, users can finally connect three monitors via the HDMI and DVI links alone. Huzzah.

You may have noticed the distinct lack of CrossFire connectors on the 290X. There are, uh, vestiges where the “golden fingers” connectors ought to be, but no actual fingers. That’s because AMD has replaced the CrossFire bridge connector with a new solution called XDMA. Rather than pass data from GPU to GPU over a dedicated bridge, XDMA incorporates a direct memory access (DMA) engine into the CrossFire image compositing block. This DMA facility can transfer data directly from GPU to GPU via PCI Express, without a detour into system memory.

XDMA is purportedly compatible with AMD’s frame pacing tech, which reduces the micro-stuttering problems associated with multi-GPU teaming. Even more importantly, the firm claims XDMA can handle resolutions above four megapixels, including Eyefinity multi-display configs and 4K monitors. Since current CrossFire configs have serious problems with such setups, this new data sharing method could bring a very notable improvement.

AMD insists XDMA carries no performance penalty compared to a dedicated CrossFire bridge. They make a good argument when they point out that the situation can’t get much worse than it is now with CrossFire at resolutions above four megapixels. The Radeon driver software shifts frame data from the secondary GPU into system memory and then to the primary GPU. The end result? At 4K resolutions, the transfers are too slow, and pretty much every other frame is dropped completely, never to be displayed. Quicker, more direct GPU-to-GPU data transfers can only help.
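
For a rough sense of why direct PCIe transfers should be able to keep up, here's a back-of-the-envelope estimate. The 32-bit color buffer and the ~15.75 GB/s figure for a PCIe 3.0 x16 link are assumptions for illustration, and we're counting every displayed frame as crossing the link, which overstates the alternate-frame-rendering case:

```python
# How much data would the second GPU push over PCIe at 4K?
width, height = 3840, 2160
bytes_per_pixel = 4                       # assumed 32-bit color buffer

frame_mb = width * height * bytes_per_pixel / 1e6
traffic_gb_s = frame_mb * 60 / 1000       # worst case: all 60 frames/s cross the link

pcie3_x16_gb_s = 15.75                    # approximate usable PCIe 3.0 x16 bandwidth

print(round(frame_mb, 1))                       # ~33.2 MB per frame
print(round(traffic_gb_s, 2))                   # ~1.99 GB/s
print(round(traffic_gb_s / pcie3_x16_gb_s, 2))  # ~0.13 -- a small slice of the link
```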

The firm is confident that lower-bandwidth CrossFire configs aren’t any worse off without the dedicated bridge, either. In fact, they wanted to be sure before deciding to go with XDMA as their only solution, which is why 290X boards retain those phantom fingers. Early boards included bridge connectors for comparative testing. Once AMD was convinced the solution was solid, the fingers were, uh, snipped off.

So how well does XDMA work? We’re dying to try it, and we’ve asked AMD for a second 290X card explicitly for the purpose of testing CrossFire with XDMA at 4K resolutions, but we don’t yet have a second card. We’re hoping to rectify that problem shortly.

We have lots of questions about what sort of PCI Express configurations will prove to be suitable for high-resolution CrossFire configs. XDMA seems well-suited for systems based on Intel’s X79 chipset, with 16 lanes of PCIe 3.0 bandwidth running to two expansion slots, or for dual-GPU cards like the Radeon HD 7990 with PCIe switch chips onboard. Those are notable configs for high-end CrossFire setups. But will XDMA play well in systems with less bandwidth, like those based on Haswell or Richland processors with dual x8 PCIe links? Or systems with higher PCIe latency, like AMD’s 990FX platform? What happens with three- and four-card setups? We’ll have to push the limits in order to find out.

PowerTune gets smarter

The one other new feature in AMD’s latest GPUs is a smarter version of the PowerTune dynamic power management mechanism. This revised PowerTune is made possible by some enabling hardware: an interface to the card’s voltage regulator known as SVI2. SVI2 is built into several recent AMD chips, including Hawaii, Bonaire, and the Socket FM2 APUs. It allows these chips to gather real-time voltage and current feedback very quickly—the sampling rate is 40KHz, and the interface has a data rate of 20Mbps. The SVI2 interface also enables fast and fine-grained control over the power coming into the chip. AMD says it can make voltage switches in about 10 microseconds in steps as small as 6.25 mV, and the interface allows for multiple voltage domains per VR controller.

Armed with faster instrumentation and control, the new PowerTune is able to pursue the best balance of power consumption, temperature, performance, and fan speed available within its defined limits. The algorithm behind it all has evidently grown pretty complex. For instance, the 290X knows better than to crank up fan speeds in simple steps, because doing so can be acoustically jarring. Instead, it ramps up fan speeds gradually in order to maintain a small perceptual footprint.
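
AMD hasn't published the algorithm, so the following is a hypothetical illustration rather than PowerTune's actual logic; the slew-rate figure is invented purely to show the idea of ramping gradually instead of stepping:

```python
# Hypothetical sketch: move the fan toward a target instead of jumping to it,
# limiting the change per control tick to avoid an abrupt, audible step.
def ramp_fan(current_pct, target_pct, max_step_pct=0.5):
    delta = target_pct - current_pct
    step = max(-max_step_pct, min(max_step_pct, delta))
    return current_pct + step

fan = 20.0
for _ in range(30):            # simulate 30 control ticks chasing a 40% target
    fan = ramp_fan(fan, 40.0)
print(round(fan, 1))           # 35.0 -- still easing toward 40%, not snapped to it
```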

Also, somewhat like Nvidia’s GPU Boost 2.0 algorithm built into its GTX 700-series cards, one of PowerTune’s key parameters is the card’s GPU temperature limit. The algorithm will seek to maximize performance without letting the GPU’s temperature exceed the programmed peak. Functionally, that means the GPU clock will vary somewhat in response to different workloads and variance in ambient temperatures. In that way, it’s no different than the latest GeForces or most desktop CPUs. The performance of our biggest, fastest chips hasn’t been entirely deterministic for a while now.

The difference with the Radeon R9 290X is one of degree—or degrees. You see, the default temperature limit for the 290X is a steamy 95°C, which will definitely keep your toes toasty on a cool fall evening. Meanwhile, the card’s peak fan speed is 40% of its max potential. AMD is pushing the GPU pretty hard and asking PowerTune to keep things in check. Practically speaking, that means GPU temperatures generally remain pretty steady at nearly 95°C after a few minutes of gameplay—and GPU clock speeds vary more widely than we’ve seen in other graphics cards. Although the 290X can operate at its advertised “Boost” clock speed of 1GHz, it will often dip below that frequency when a game is running.

If you’d like to trade acoustics for performance, the 290X offers an “uber” mode where the fan speed limit is 55%. Just flip the little DIP switch next to the, er, missing CrossFire connector and reboot to get there. The 290X is running close enough to the edge that this adjustment to the fan speed profile can have a clearly measurable impact on performance. We’ve tested the 290X at both settings, so you can see the difference the “uber” switch had on our open-air test rig.

For a quick illustration, here’s a look at the data from our power and noise testing session in Skyrim, as logged by GPU-Z. The time represented on the graph is our warm-up period of approximately four minutes.

The 290X stays at the 1GHz Boost clock for most of the period, but near the end, with the default fan mode, its clock speeds begin to fluctuate, ranging as low as 943MHz for a moment. With the higher fan speed in uber mode, the 290X’s frequency mostly stays put, with only a brief drop to around 970MHz.
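
If you'd like to repeat this sort of logging at home, summarizing a GPU-Z sensor log only takes a few lines. This is a minimal sketch that assumes the log has been exported as CSV with a "GPU Core Clock [MHz]" column; the exact column name and format vary by GPU-Z version, and the file name is just a placeholder:

```python
import csv

# Summarize a GPU-Z sensor log exported as CSV. The column name below is an
# assumption -- check the header your GPU-Z version actually writes out.
def summarize_clocks(path, column="GPU Core Clock [MHz]"):
    with open(path, newline="") as f:
        clocks = [float(row[column]) for row in csv.DictReader(f) if row[column].strip()]
    boost_share = sum(1 for c in clocks if c >= 1000) / len(clocks)
    return min(clocks), max(clocks), boost_share

# "skyrim_gpuz_log.csv" is a placeholder file name.
low, high, boost_share = summarize_clocks("skyrim_gpuz_log.csv")
print(low, high, f"{boost_share:.0%} of samples at the 1GHz boost clock")
```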

Long-time readers, set your OCD tics at the ready, because performance testing just got a little more complicated. I conducted my first round of tests on the 290X before realizing how much impact GPU temperatures could have on performance. Going back over the data later, I saw that, in some games, the 290X’s performance dropped slightly with each successive test run. For example, in Tomb Raider, we saw FPS averages of 46.8, 45.3, and 44.1. We typically report the median score, which is 45.3 in this case.

To better understand the 290X’s behavior, I improvised a quick test over a longer time window. I ran our Tomb Raider test sequence twice starting with a cold card and then twice more at successive four-minute intervals. Here’s what I saw.

| | First run | After 4 minutes | After 8 minutes | After 12 minutes |
| --- | --- | --- | --- | --- |
| FPS | 47 | 44 | 44 | 44 |

At least in this case, the 290X looks to be pretty quick to reach its max temperature, and the card’s performance doesn’t change too much after it gets there.

Still curious about GPU speeds over the long run, I fired back up our Skyrim load test and used GPU-Z to log clock frequencies over a period of about 30 minutes. I’d graph it for you, but, well, the window was open in Damage Labs during that time—and the 290X remained rock steady at 1GHz throughout. Evidently, ambient temperatures have a pretty big impact on the 290X’s behavior. We’ll have to mull over how this information should affect our testing procedures going forward. We may need to raise the number of testing sessions back to five per card (which is better anyhow), institute some sort of warm-up period prior to testing, or install stricter climate controls in Damage Labs. Hmmm.

For tinkerers, AMD exposes quite a bit of control over the 290X’s PowerTune settings in the Overdrive section of the Catalyst Control Center. The card doesn’t have any additional thermal headroom available, but you can push even harder on the fan speeds, power limits, and max clock frequencies, if you wish. I doubt those tweaks will net much additional performance without a more robust cooling solution.

Test notes

To generate the performance results you’re about to see, we captured and analyzed the rendering times of every single frame of animation during each test run. For an intro to our frame-time-based testing methods and an explanation of why they’re helpful, you can start here. Please note that, for this review, we’re only reporting results from the FCAT tools developed by Nvidia. We usually also report results from Fraps, since both tools are needed to capture a full picture of animation smoothness. However, testing with both tools can be time-consuming, and our window for work on this review was fairly small. We think sharing just the data from FCAT should suffice for now.
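
For readers new to the metric, the 99th-percentile frame time is simply the render time that 99% of frames come in under. Here's one simple way to compute it, with made-up sample data:

```python
# 99th-percentile frame time: 99% of frames render in this many ms or less.
def percentile_99(frame_times_ms):
    ordered = sorted(frame_times_ms)
    index = max(int(len(ordered) * 0.99) - 1, 0)   # simple nearest-rank method
    return ordered[index]

# Made-up sample: mostly ~15 ms frames with occasional 42 ms hitches.
frame_times = [14.2, 15.1, 16.0, 15.5, 41.7, 15.8, 14.9] * 200
p99 = percentile_99(frame_times)
print(p99, "ms ->", round(1000 / p99), "FPS equivalent")   # 41.7 ms -> 24 FPS
```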

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

| Component | Configuration |
| --- | --- |
| Processor | Core i7-3820 |
| Motherboard | Gigabyte X79-UD3 |
| Chipset | Intel X79 Express |
| Memory size | 16GB (4 DIMMs) |
| Memory type | Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz |
| Memory timings | 9-9-9-24 1T |
| Chipset drivers | INF update 9.2.3.1023, Rapid Storage Technology Enterprise 3.5.1.1009 |
| Audio | Integrated X79/ALC898 with Realtek 6.0.1.6662 drivers |
| Hard drive | OCZ Deneva 2 240GB SATA |
| Power supply | Corsair AX850 |
| OS | Windows 7 Service Pack 1 |

| | Driver revision | GPU base core clock (MHz) | GPU boost clock (MHz) | Memory clock (MHz) | Memory size (MB) |
| --- | --- | --- | --- | --- | --- |
| GeForce GTX 660 | GeForce 331.40 beta | 980 | 1033 | 1502 | 2048 |
| GeForce GTX 760 | GeForce 331.40 beta | 980 | 1033 | 1502 | 2048 |
| GeForce GTX 770 | GeForce 331.40 beta | 1046 | 1085 | 1753 | 2048 |
| GeForce GTX 780 | GeForce 331.40 beta | 863 | 902 | 1502 | 3072 |
| GeForce GTX Titan | GeForce 331.40 beta | 837 | 876 | 1502 | 6144 |
| Radeon HD 5870 | Catalyst 13.11 beta | 850 | — | 1200 | 2048 |
| Radeon HD 6970 | Catalyst 13.11 beta | 890 | — | 1375 | 2048 |
| Radeon R9 270X | Catalyst 13.11 beta | ? | 1050 | 1400 | 2048 |
| Radeon R9 280X | Catalyst 13.11 beta | ? | 1000 | 1500 | 3072 |
| Radeon R9 290X | Catalyst 13.11 beta 5 | ? | 1000 | 1250 | 4096 |

Thanks to Intel, Corsair, Gigabyte, and OCZ for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Also, our FCAT video capture and analysis rig has some pretty demanding storage requirements. For it, Corsair has provided four 256GB Neutron SSDs, which we’ve assembled into a RAID 0 array for our primary capture storage device. When that array fills up, we copy the captured videos to our RAID 1 array, comprised of a pair of 4TB Black hard drives provided by WD.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

In addition to the games, we used the following test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Texture filtering

We’ll begin with a series of synthetic tests aimed at exposing the true, delivered throughput of the GPUs. In each instance, we’ve included a table with the relevant theoretical rates for each solution, for reference.

| | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Memory bandwidth (GB/s) |
| --- | --- | --- | --- |
| Radeon HD 5870 | 27 | 68/34 | 154 |
| Radeon HD 6970 | 28 | 85/43 | 176 |
| Radeon HD 7970 | 30 | 118/59 | 264 |
| Radeon R9 280X | 32 | 128/64 | 288 |
| Radeon R9 290X | 64 | 176/88 | 320 |
| GeForce GTX 770 | 35 | 139/139 | 224 |
| GeForce GTX 780 | 43 | 173/173 | 288 |
| GeForce GTX Titan | 42 | 196/196 | 288 |

Although the 290X has, in theory, much higher fill capacity than the Titan, this test tends to be limited more by memory bandwidth than anything else. None of the GPUs achieve anything close to their peak theoretical rates. The 290X’s additional ROP power will more likely show up in games using multisampled anti-aliasing.

The back-and-forth here is kind of intriguing. 3DMark’s texture fill test isn’t filtered, so it’s just measuring pure texture sample rates, and the Titan manages to outperform the 290X in that test. The results from the Beyond3D test tool are bilinearly filtered, and in the first of these, the 290X takes the top spot.

Once we get into higher-precision texture formats, a major architectural difference comes into play. Hawaii and the other Radeons can only filter FP16 texture formats at half the usual rate. Even the GK104-based GTX 770 is faster than the 290X with FP16 and FP32 filtering.

In all cases, though, the 290X offers a nice increase over the Radeon R9 280X—which is just a re-branded Radeon HD 7970 GHz Edition, essentially.
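
Those theoretical filtering rates are just texels-filtered-per-clock figures multiplied by clock speed, with GCN paying the half-rate FP16 penalty; a quick sketch using numbers from the spec tables earlier in the review:

```python
# Peak filtering rate = texels filtered per clock x clock speed. GCN filters
# FP16 textures at half its int8 rate; Kepler filters FP16 at full rate.
def filtering_rates(texels_per_clock, clock_ghz, fp16_factor):
    int8 = texels_per_clock * clock_ghz
    return int8, int8 * fp16_factor

print(filtering_rates(176, 1.0, 0.5))     # R9 290X: (176.0, 88.0) Gtexels/s
print(filtering_rates(128, 1.085, 1.0))   # GTX 770 at boost: ~(139, 139) Gtexels/s
```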

Tessellation and geometry throughput

| | Peak rasterization rate (Gtris/s) | Memory bandwidth (GB/s) |
| --- | --- | --- |
| Radeon HD 5870 | 0.9 | 154 |
| Radeon HD 6970 | 1.8 | 176 |
| Radeon HD 7970 | 1.9 | 264 |
| Radeon R9 280X | 2.0 | 288 |
| Radeon R9 290X | 4.0 | 320 |
| GeForce GTX 770 | 4.3 | 224 |
| GeForce GTX 780 | 3.6 or 4.5 | 288 |
| GeForce GTX Titan | 4.4 | 288 |

I’m not sure what to make of these results. I expected to see some nice gains out of the 290X thanks to its higher rasterization rates, but the benefits are only evident in TessMark’s x16 subdivision mode and with our low-res/extreme tessellation scenario in Unigine Heaven.

A couple of potential explanations come to mind. One, TessMark uses OpenGL, and it’s possible AMD hasn’t updated its OpenGL drivers to take full advantage of Hawaii’s quad geometry engines. Two, the drivers could be fine, and we could be seeing an architectural limitation of the Hawaii chip. As I noted earlier, large amounts of geometry amplification tend to cause data flow problems. It’s possible the 290X is hitting some internal bandwidth barrier at the x32 and x64 tessellation levels that’s common to GCN-based architectures. I’ve asked AMD to comment on these results but haven’t heard back yet. I’ll update this text if I find out more.

Shader performance

| | Peak shader arithmetic rate (tflops) | Memory bandwidth (GB/s) |
| --- | --- | --- |
| Radeon HD 5870 | 2.7 | 154 |
| Radeon HD 6970 | 2.7 | 176 |
| Radeon HD 7970 | 3.8 | 264 |
| Radeon R9 280X | 4.1 | 288 |
| Radeon R9 290X | 5.6 | 320 |
| GeForce GTX 770 | 3.3 | 224 |
| GeForce GTX 780 | 4.2 | 288 |
| GeForce GTX Titan | 4.7 | 288 |

Welp. This one’s unambiguous. That massive GCN shader array is not to be denied. The 290X wins each and every shader test, sometimes by wide margins.

Now, let’s see how these things translate into in-game performance.

Crysis 3


Click through the buttons above to see frame-by-frame results from a single test run for each of the graphics cards. You can see how there are occasional spikes on each of the cards. They tend to happen at the very beginning of each test run and a couple of times later when I’m exploding dudes with dynamite arrows.

You can see from the raw plots that the 290X looks good, with more frames produced and generally lower frame rendering times than anything else we tested. Every card encounters a few slowdowns, and the spikes on the 290X aren’t anything exceptional.

The traditional FPS average and our frame-latency-focused companion, the 99th percentile frame rendering time, pretty much agree here. That’s a good indication that none of the graphics cards are encountering any weird issues. When they don’t agree, as sometimes happens, bad things are afoot. What they agree on is simple enough: the 290X is the fastest graphics card in this test. The uber fan mode doesn’t seem to make much difference here.


We can get a broader sense of the frame time distribution by looking at the tail end of the curve. In this case, both brands of GPUs, faster and slower models, all suffer from a small number of high-latency frames in the last ~2% of frames rendered. I suspect the performance problem here is at the CPU or system level, not in the graphics cards themselves, since it’s fairly consistent.


Our “badness” index concentrates on those frames that take a long time to produce. For the first two thresholds of 50 and 33 ms, the results are pretty similar among the newer GPUs, which again suggests a CPU bottleneck or the like. However, for slinging out frames 60 times per second, once every 16.7 milliseconds, the R9 290X is easily the best choice.
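
The basic bookkeeping behind these thresholds is to add up only the time spent past each one. A minimal sketch, with made-up frame times:

```python
# Add up only the portion of each frame's render time beyond the threshold:
# a 40 ms frame contributes 23.3 ms to the "beyond 16.7 ms" total.
def time_beyond(frame_times_ms, threshold_ms):
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

frame_times = [15.0, 18.0, 40.0, 16.0, 55.0]     # made-up example
for threshold in (50, 33.3, 16.7):
    print(threshold, "ms:", round(time_beyond(frame_times, threshold), 1), "ms")
```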

Far Cry 3: Blood Dragon




Click through the frame time distributions above, and you’ll see very few frames that take beyond 50 ms to render from any of the cards—even with the geezer of the group, the Radeon HD 5870. However, we can still get a sense of gaming smoothness from the numbers, and all of them point to the 290X as the top dawg in this test scenario. There are a few minor frame time spikes on the GTX 780 and Titan, although they don’t really amount to much. Still, the R9 290X is glassy smooth, especially in its uber fan mode, which makes a real difference here.

GRID 2


This looks like the same CodeMasters engine we’ve seen in a string of DiRT games, back for one more round. We decided not to enable the special “forward+” lighting path developed by AMD, since the performance hit is pretty serious, inordinately so on GeForces. Other than that, though, we have nearly everything cranked to the highest quality level.




Everything from the GeForce GTX 770 on up turns in a near-flawless performance here, churning out nearly each and every frame in 16.7 ms or less. Only the 290X is the very definition of flawless, though, never once missing the beat at 60Hz.

Tomb Raider





Those averages around 40-45 FPS for the high-end cards don’t seem terribly impressive until you look at the frame time plots or our latency-focused metrics. Then you realize that the fastest cards never once produce a frame in more than 33 ms. That’s a steady 30 FPS or better for each of them. Of course, the 290X again leads the pack.

Guild Wars 2




This is an odd one, because the faster cards tend to have some minor frame time spikes to about 30 ms. You can see it in the plots. The GTX 780 and Titan suffer the most, but the 290X also participates in this problem. Seems like the cards with the highest frame rates are the most affected.

Nevertheless, the GTX 780 just edges out the R9 290X across multiple metrics for a rare outright performance win.

Power consumption

The Radeons have a unique capability called ZeroCore power that allows them to spin down all of their fans and drop into a very low-power state whenever the display goes into power-save mode. That’s why they tend to draw less power with the display off.

Please note that our load test isn’t an absolute peak scenario. Instead, we have the cards running a real game, Skyrim, in order to show us power draw with a more typical workload.

The 290X’s power draw under load is… considerable at roughly 40W more than the GTX 780. The card’s cooler will have more heat to expel as a result.

Noise levels and GPU temperatures

Remember that 95°C PowerTune limit? Yeah, the 290X runs right up against it with either fan profile. AMD calls the card’s default fan profile “quiet” mode and the more aggressive 55% profile “uber” mode. You can see why I’ve resisted calling the default profile “quiet.” The 290X ain’t exactly that.

Switching the fan to uber mode pushes the 290X past 50 dBA, which is somewhere near my personal threshold of true annoyance. Premium graphics cards have been making strides toward good acoustic citizenship in recent years, and we lauded the Radeon HD 7990 for furthering that trend. The 290X sadly loses ground on this front. Yes, it’s possible to tweak PowerTune with a lower fan speed threshold, but you’re sure to lose performance if you do so.

As for the 290X’s penchant for blowtorch-like temperatures, well, AMD has definitely chosen a more aggressive tuning point than the 80°C GPU Boost target on the GTX 780 and Titan. All things considered, I’d rather not lose my fingerprints when going to swap out a video card. However, I can’t bring myself to fret over GPU temperatures of 95° too much, since Nvidia chose the same target for the GTX 480 several years back. Heck, even the old GeForce 8800 GTX ran relatively hot, and I think some of those are still going strong to this day.

What AMD has done, though, is squeeze all of the thermal headroom out of this card. Don’t expect much overclocking success on a 290X with the stock cooler.

Conclusions

Ok, you know how this goes. We’ll magically compress our test results into a couple of our famous price-performance scatter plots. The performance scores are a geometric mean of the results from all the games we tested. We’ve converted our 99th-percentile frame time results into FPS for the sake of readability. As always, the better values will be situated closer to the top left corner of the plot. The worse buys will gravitate to the bottom right of the plot. Since the Radeon HD 5870 and 6970 aren’t current products anymore, we’ve shown them at their starting prices for comparison.
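
For the curious, here's roughly how the single score per card gets built; a minimal sketch with placeholder numbers, not our actual results:

```python
from math import prod

# One overall score per card: convert each game's 99th-percentile frame time
# to an FPS-equivalent, then take the geometric mean across games.
def geometric_mean(values):
    return prod(values) ** (1 / len(values))

# Placeholder 99th-percentile frame times in ms -- not our actual results.
p99_ms = {"Crysis 3": 26.0, "Tomb Raider": 29.0, "GRID 2": 15.0}

fps_equivalents = [1000 / t for t in p99_ms.values()]
print(round(geometric_mean(fps_equivalents), 1))   # the single number on the scatter plot
```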


Well, that was easy. The Radeon R9 290X is a bit faster than the GeForce GTX 780 and costs a hundred bucks less. Beats the Titan for nearly half the price, too. So yeah. AMD has substantially reduced the cost of graphics processing power in this category, and it has grabbed the overall performance crown from Nvidia in the process. What’s not to like about a faster-than-Titan graphics card for just over half the price?

Unfortunately, there are some pretty good answers to that question. I have to admit, I was more impressed with the Hawaii GPU’s architectural efficiency—it is a much smaller chip than the GK110, remember—before seeing the 290X’s power draw and temperature readings. Looks to me like AMD has captured the performance crown through a combination of newfangled architectural prowess and the time-honored tactic of pushing its silicon to the ragged edge. The Hawaii GPU brought the 290X to the cusp of success, but a bigger power envelope and a really aggressive PowerTune profile ensured the victory. That victory comes at a cost: a relatively noisy card, whether on the default or uber fan profiles, and positively toasty GPU temperatures. 290X owners will also see more variable clock speeds (and thus performance) than they’ve come to expect. These aren’t deal-breaker problems—not when there’s a $100 price difference versus the GTX 780 on the table—but they’re still hard to ignore.

More seriously, if you have any intention of using a Radeon R9 290X in a multi-GPU configuration at some point down the road, I’d advise you to put down the credit card and step away from the Newegg browser tab until we can test XDMA-based CrossFire thoroughly. Hopefully, we can do that soon, along with some additional single-card testing at 4K resolutions. If only we could raise the PowerTune limit on my frail human flesh, I’d have done some 4K testing already.

Of course, Nvidia has already signaled that it has a multi-pronged response to the 290X in the works. Most notably, something called the GeForce GTX 780 Ti is coming soon. We expect the 780 Ti to outperform the Titan, but the precise details about it are something of a mystery. Beyond that, the green team is taking a page from AMD’s book and rolling out a triple game bundle for the holiday season, along with some discount coupons for its Shield handheld gaming console doohickey. There’s also some GeForce-specific goodness in the works, including the amazing G-Sync and some low-overhead game recording and streaming features. All of those things sound great, and I truly do prefer the GTX 780’s muted acoustics and slick all-metal cooler. But just like the 290X’s rough edges aren’t deal-breakers, the GTX 780’s perks aren’t deal-makers. Not when one of those crisp new Benjamins is on the table. Nvidia desperately needs to cut prices, or AMD wins this round on sheer value.

Follow me on Twitter to see pictures of my food.

Comments closed
    • deb0
    • 6 years ago

    When I want to use my pc to heat my house, Radeon is my first choice.

    • ronch
    • 6 years ago

    Personally, I’m not a big fan of high end, gas-guzzling graphics cards. I’m more of a practical guy and I want a good balance between performance and power. Price is also a factor, as it usually is for most folks. Having said these, I wonder when they’ll trickle this new GCN iteration down to the midrange or low midrange ($130 / 60-80w) market segment. Running an HD7770 now, which I’m pretty happy with given how I don’t really have much time to play games these days.

      • Airmantharp
      • 6 years ago

      Well, for direct gaming performance, these GCN cores aren’t faster than the first ones found on your current card, and AMD has already rebranded the mid-range cards to fit them into their new naming scheme.

      I honestly don’t expect them to make a change until TSMC’s next node comes online, nice as it would be to see a performance SKU below $400 with support for TrueAudio.

    • brucethemoose
    • 6 years ago

    [s<]641[/s<] 642 comments here... what's the record for a TR article?

      • D@ Br@b($)!
      • 6 years ago

      R U gonna edit Ur post every time there’s a new comment?

        • Chrispy_
        • 6 years ago

        This code you use is intriquing, but what does R U stand for and who is Ur?

          • Meadows
          • 6 years ago

          Your word “intriquing” is intriguing.

          • D@ Br@b($)!
          • 6 years ago

          Serious?

          R –> Are
          U –> You
          Ur–> Your(brucethemoose)

        • BIF
        • 6 years ago

        He edited his post TWICE. For shame!

        Downvotes galore!

      • BIF
      • 6 years ago

      Eighteen billion.

      95% either for or against blue skys.

      The ones I’m worried about are those who vote “unsure” and “don’t know”.

      • Krogoth
      • 6 years ago

      Still the infamous 75GXP thread……

        • anotherengineer
        • 6 years ago

        This guy?

        [url<]https://techreport.com/news/2799/dr-evil-asks-gxp-problems[/url<]

    • Modivated1
    • 6 years ago

    The fact that this thread discussion has raged on for so long acknowledges the launch of the R9-290x as a success. It has certainly made an impact on the market, it will certainly sell well, and no matter what side of the argument you are on you cannot say that the competition walks away with the crown.

    The argument boils down to performance VS heat and noise. Which do you prefer? Either way if this product were a failure it would have been dismissed as clearly inferior in all categories and we would have move on to something else by now. But those in favor feel a passionate need to support it as well as those against it feel that they need to emphasize it’s flaws so that many of of the masses don’t buy into it’s merits.

    That alone makes this launch a victory, no manufacturer would really ask for more.

      • Airmantharp
      • 6 years ago

      Victory is a companies’ employees being able to keep roofs over their head and food on their plate.

      For those that prefer performance and low noise, the victory is that prices just went down, as predicted. Now they can have everything they want at a lower price from Nvidia, thanks to AMD!

      (the R9 290X is a resounding success- the reference version with the botched cooler isn’t)

        • CaptTomato
        • 6 years ago

        You aren’t the entire market though and you don’t need to keep repeating that the 290X has a “botched” cooler without acknowledging that water cooling, custom air cooling and AIB’s will solve this problem.

          • Airmantharp
          • 6 years ago

          Sure, I’ve mentioned that numerous times. Granted, given AMD’s history, that’s almost a given.

      • BlackDove
      • 6 years ago

      The fact that people are commenting a lot doesn’t mean it’s a good product.

      The GK110 beats it in most respects, especially heat and noise. With GK110 you get power, without heat and noise.

      SLI works as advertised too.

      • Chrispy_
      • 6 years ago

      The 780 is the cut-down version of the Titan.
      The 290X is the full-fat product.
      The fact that the full-fat product can beat the 780 comfortably, and do it with a cheap and nasty cooler for over a hundred bucks less is nothing short of staggering.

      At current pricing, the 290 (not the 290X) will totally destroy Nvidia because it will have two things:

      1) A lower price than even the 290X – making it $150-200 cheaper than the 780 with very similar (if not outright better performance

      2) OEM coolers which have always been vastly superior to the AMD reference coolers. If you can’t wait two weeks for cards with proper coolers to turn up, then the noise and heat is the result of your impatience.

      I have already linked elsewhere in this comments thread that AMD’s coolers are awful. Previous reviews have shown plenty of cards that dissipate higher wattages with less noise AND at lower temperatures. I’m just glad that OEM’s mostly ditch the reference cooler ASAP. In a few months you’ll struggle to find reference coolers at retail because we’ll have Asus DirectCU, Windforce, Twin Frozr, Vapor-X and other high-quality, reliable and efficient coolers instead.

        • Airmantharp
        • 6 years ago

        Good comparison- those after-market coolers can’t come fast enough!

        Further, I like the way you present the pricing comparisons. You make it seem like Nvidia is going to need to drop prices even further, and that’s good news for everyone :).

        • Chrispy_
        • 6 years ago

        Someone mentioned elsewhere in these 600+ comments that the 7950 was 2/3 the price of a 7970 at launch, for product segmentation reasons.

        If AMD do the same thing now for the 280(non-X) it would be around $370, but I suspect they’ll most likely match the street price of the factory-OC’ed 4GB GTX770 models which is around $400-$420.

        I’ll still bite at that price though….

      • Bensam123
      • 6 years ago

      Pretty much… compare it to the 226 for the 780 and the 220 for Titan. AMD actually had something interesting this time around and it put the red guys into comment mode and green guys into defensive mode.

      It’s not really just the performance, but all the other things attached to this card. If it was just a performance increase we probably wouldn’t see this much interest in the card.

      Like half the comments on this article are from Airman telling everyone how horrible the cooler is and that they shouldn’t buy it though.

    • npore
    • 6 years ago

    Just finished reading the review (been travelling). Good stuff as usual, looking forward to the new crossfire going through TR QC.

    Then I come to the comments… Ima need a TL;DR..

    I’m guessing there would be a bit of red and green inter-tribal violence, speculation around Mantle/TrueAudio/G-Sync, Krogoth not impressed, everyone else impressed by the price and performance, and the general consensus being you would be a fool to buy this (unless, of course, you needed a card right now) until we see better coolers, Nvidia price cuts and the 780 Ti?

      • Airmantharp
      • 6 years ago

      You got it 🙂

      • Chrispy_
      • 6 years ago

      Do you do TL;DR’s for other things?

      You’re hired.

    • BoBzeBuilder
    • 6 years ago

    Yeeeeeeeeoooooooowwww!

    600. The number of the beast minus 66!

    • Airmantharp
    • 6 years ago

    [url=http://www.youtube.com/watch?v=u5YJsMaT_AE&feature=youtu.be<]Purely for your enjoyment.[/url<]

      • Airmantharp
      • 6 years ago

      [url=http://www.youtube.com/watch?v=1W5OXvIzRKc<]And here's the non-parody version.[/url<]

        • sschaem
        • 6 years ago

        780
        [url<]http://www.youtube.com/watch?v=RwBCMa7W4dw[/url<] titan [url<]http://www.youtube.com/watch?v=1AtAClCESX0[/url<] The 290x, 780, titan are all loud at full load... And dont expect the 780 ti to be any quieter then the 780. Drop the performance of the 290x to GTX 780 level and the noise level will also drop to similar levels. At least the 290x is 100$ cheaper and give you the choice.

          • Airmantharp
          • 6 years ago

          +1 for good rebuttal- the ‘downvote’ trolls don’t really like anyone around here!

          Still, remember that ‘loudness’ has to be measured as sound pressure level (SPL) in decibels (dB). That’s what the charts on the reviews are for, as you can’t tell exactly how ‘loud’ each card is unless every variable is controlled and the test setup is reasonable.

          The YouTube videos can however tell you what the character of that sound is, as can subjective assessments of reviewers. That’s why I started the thread with the ‘purely for your enjoyment’ comment :).

      • Modivated1
      • 6 years ago

      /ROLLS ON THE FLOOR LAUGHING!! I am not on the same side of the coin as you when it comes to the R9 290x but it’s hilarious what people will do to get a laugh!

      That’s funny it really was. Wonder how much money they put into setting up that skit.

      • 0g1
      • 6 years ago

      MAAAAN that was funny. LMAO.

      • Meadows
      • 6 years ago

      Oh yes. Oh, very yes.

      • ClickClick5
      • 6 years ago

      One of the best posts in a long time!

    • blitzy
    • 6 years ago

    Anyone know what the ETA of the R290 non X is likely to be? I need a new graphics card in the next 3 months, just figuring out which one will suit best

      • JustAnEngineer
      • 6 years ago

      [url<]https://techreport.com/discussion/25509/amd-radeon-r9-290x-graphics-card-reviewed?post=770484[/url<]

        • blitzy
        • 6 years ago

        thanks

    • spanky1off
    • 6 years ago

    N00b question…I got a 5850 running off a mainbaord that has a pcie-2 connector (i think thats what its called)…the cpu is a Intel 2500K can i upgrade to this gpu card if it uses pcie-3? or would i need to upgrade the whole motherboard/cpu? cheers 🙂

      • Pwnstar
      • 6 years ago

      You can, as PCI-e 3 is backwards compatible with PCI-e 2.

    • CaptTomato
    • 6 years ago

    IMO, the GPU/CPU cooler is far more important than the case cooling, especially if you’re not going overboard on oclocking, as such, those planning to oclock should wait for a brand name with effective cooling.

      • Airmantharp
      • 6 years ago

      You’re right, but with this card’s power draw and ability to scale with better cooling, case cooling is even more important.

      To get the most out of one of these cards you’d want to have your case airflow setup to accommodate the kind of cooler it comes with- with a blower like this, I’d use Silverstone’s Fortress FT-02 as a reference.

      With the custom coolers that are coming, one of the larger, open Corsair cases would make a pretty good base; just add another intake fan and another exhaust fan to get more airflow in and out of the expansion card section. Fans are cheap :).

    • tanker27
    • 6 years ago

    WOW! As I stated elsewhere, I haven’t bought an AMD product since their infamous driver issues. But this, THIS is making me think twice. I am about to upgrade a Gen 1 i7 here in the next few weeks and man, I have some serious considerations.

    • alienstorexxx
    • 6 years ago

    nvidia, just surrender at this gpu, cut the prices already. stop that titan big steal. i think we all know who will won the next gen.
    the only one who worries about consumer wallet, and to give the best possible performance.

      • Klimax
      • 6 years ago

      Well, it ain’t AMD. Titan with unlocked fan will beat it. (I would do the test myself, but I don’t have monitors. Just 17”)

      Simply too thin lead.

    • Mr. Eco
    • 6 years ago

    How can I see the history of the most commented articles? Is this one with 399 posts the record?
    Definitely the hottest topic as of last two years.

      • chuckula
      • 6 years ago

      [quote<]Is this one with 399 posts the record?[/quote<] Not even close. Go back to the Bulldozer launch and the collective cycle of disbelief followed by rationalization took up way more posts.

        • rxc6
        • 6 years ago

        Pftt!! That one isn’t even half of the record.

        For all you youngsters: [url<]https://techreport.com/news/2799/dr-evil-asks-gxp-problems[/url<] It doesn't have more because the discussion was closed, but it is in a class of its own.

    • ultima_trev
    • 6 years ago

    Definitely one of the better reviews of R9 290X. Once again proving why TR has been my preferred review site for the past 7 years, even more so than AnandTech or Tom’s.

    As for the product reviewed, I find it ludicrous that AMD chose 95 centigrade as a safe threshold. It reminds me of my GTX 470 and the intolerable heat and fan noise that came with it. Although this thing seems even noisier. I hope slapping a Gigabyte Windforce cooler on this sucker will at least quell the racket somewhat. Those coolers seemed to have worked wonders for the HD 7xxx series, my HD 7850 no exception. While I could never justify paying more that $300 for a GPU, I sure hope the R9 290 is a bit more sane in terms of thermals and decibels. The GTX 780 may be a bit slower than R9 290X, however I actually feel its $100 premium is justified by the much quieter and cooler operation… And this is coming from an AMD fanboy.

      • Duct Tape Dude
      • 6 years ago

      Years ago I’d disagree, but from my own experience, I’ve never heard of a graphics card failing because of an overheated [i<]core[/i<], with the exception of the faulty nvidia 8400m/8600m series. I swapped the GPU in my old laptop which made me lose thermal control, so I've accidentally pushed it to 118C multiple times, and it's spent hundreds of hours gaming at 102-108C. Still works great with no artifacts. Silicon is more resilient than our current mindset would have us think.

    • Klimax
    • 6 years ago

    l do Krogoth. Not Impressed.

    Pure brute force. Against Fan limited Titan… If you want to see what Titan can do, just push Fan speed over 70%, you’ll trade silence for massive increase in clocks. (Dirt 3 Showdown 1056MHz ; Einstein@Home pushed to 950MHz).

    This won’t hard to beat for NVidia. (Seen some other results and 290x had rarely sufficient lead over stock Titan to clear OC headroom)

    BTW: [url<]http://www.gigabyte.cz/products/page/vga/gv-ntitanoc-6gd-b/[/url<] Sorry AMD, but this won't be sufficient. You don't lead you had with 7970...

      • Modivated1
      • 6 years ago

      If you read the details of many reviews a lot of reviewers could not stand the noise of uber mode (obviously a testament to how loud it was) so they tested it in quiet mode. So while you talk about unleashed fan mode that is not what the testing conditions or the results represent.

      The quiet mode produced those Titan and 780 beating numbers, just look at the review here again and read the mode the card was tested again. But! you bring up a good point what would it look like to see these cards go at it when you take off the governor and let them red line. Would the Titan take the crown? Would the R9 290x increase it’s lead? Would the outcome be similar where the numbers keep the same ratio’s?

      Both companies still have some choices to play with when messing with a refresh. Nvidia has a portion of the card it has gimped and can release future versions of card with full capacity, AMD has chosen to use slower memory and can choose to use faster memory and the aftermarket companies can design quality coolers to further push the limits of what the card can do. It’s a good time to be a Computer Enthusiast that’s for sure.

        • f0d
        • 6 years ago

        yet even in normal mode it was 7.8Db louder than titan (which is a lot) and 16 degrees hotter

        im not saying titan would have beaten it with the same noise and thermal headroom but i do wonder how much closer it would have been

        • Klimax
        • 6 years ago

        It is loud even in “quite”, which In a way is extremely newspeak. So your whole point is invalid.

        And I think I forgot to add that in full fan mode, temperature never exceeded 70°C, which means full fan speed was overkill and thus no need for such noise floor.

        Sorry, try again.

          • Goty
          • 6 years ago

          290X = Simple copper block, no heatpipes
          TItan = Vapor chamber cooler

          Given the same cooling capacity, the 290X would run marginally hotter than Titan and potentially outperform it to an even greater degree (depending on whether or not the given test causes the card’s clocks to throttle). And I’m going to guess we won’t be seeing AMD’s cheap cooler on many designs from here on out.

          So, in your words, ‘Sorry, try again.’

            • Airmantharp
            • 6 years ago

            If only someone would put one of those coolers on the 290X.

            • CaptTomato
            • 6 years ago

            It’s time for you to stop trolling and making a fool of yourself…….OF COURSE SOMEONE IS GOING TO PUT BETTER COOLERS ON THE 290X……AS MANY AS 4-6 NAME BRAND MANUFACTURES ARE GOING TO DO IT.

            • Airmantharp
            • 6 years ago

            Should I really need to be more specific for you? That wasn’t a troll. You’ll know when I’m actually trolling, I don’t do it very often. I’ve been quite serious in this thread, and that’s pissed a multitude of disillusioned fanboys off.

            I mean, is it really hard to get that by ‘those coolers’ I specifically meant the Titan vapor-chamber cooler mentioned by Goty in the post I replied to?

            Should I stop by your place, draw you a map, and hold your hand too?

            • CaptTomato
            • 6 years ago

            The truth is, you’ve been harping on about the 290x and it’s short comings{from your POV} for probably 250-300 comments…..we get it already, there’s no need for you to save anyone else from the 290x, and there’s no need to long for custom coolers as they’re inevitable.

            As a former owner of a HIS ICE cooled 4850 and a current owner of a Vapour X cooled 7950, I’m happy to inform you that both run quiet and cool, even when oclocked…….especially the 7950.

            The 290x may not have as much oclocking headroom, but with a custom cooler, it’ll be quiet, mid temps and POWERFUL.

            • Airmantharp
            • 6 years ago

            Oh definitely- and I’ve summed up my views elsewhere, but to keep things simple, I’m disappointed both that AMD shipped a cooler this loud at all, and that by doing so they’ve essentially given up on competing with Nvidia on well-engineered blower-style coolers.

            The open-air coolers are fine for most purposes, but they’re far from ideal when approaching cooling from a system perspective, especially if the goal of said system is to run more than one card as is useful when trying to power top-tier games on high-speed 120Hz+ monitors or higher resolution monitors.

            Just as an example, I’d need two R9 290x’s or two GTX780+ cards to be able to run games like BF3 (and BF4 tonight) at max settings at 2560×1600@60Hz. The mid-range GTX770/HD7970/R9 280X cards won’t cut it, and the same can be said for 120Hz+ users.

            • CaptTomato
            • 6 years ago

            Patience=problem solved+the 780’s are now cheaper as well.
            AMD probably releases the ref design as is so that those who want to water cool or buy regardless of temps can do so, and then the AIB’s can sell with fancy custom coolers.

            As for requiring 2 cards to run at 1600p maxed…….that’s your problem to some degree, ie, they’ll look very good running at 1600p at whatever a 290 or 780 can deliver…..you hardly need max SSAA with the level of texture quality built into most modern games.

            Being that I’m 99% certain that the AIB’s will solve the 290’s real or imagined problems, there’s not that much to worry about.

            The fact is, 7870, 7950, 7970, 270, 280, 290 and eventually 290x all will be excellent single GPU’s, of course if u water cool, 290x will be excellent immediately.

            I like Nvidia engineering btw, but have a hard time recommending their grossly inflated prices.

            • Airmantharp
            • 6 years ago

            Well, that is what I’ve been saying these past few months- wait. Wait for the AMD cards, and wait for the Nvidia response. I’ve mentioned that Black Friday would be a good time.

            And for 1600p maxed- well, I don’t need AA, but that has very little to do with texture quality, and no, I don’t bother trying to play games on max- I just ensure that I can play them well, which I absolutely can with my current setup.

            • CaptTomato
            • 6 years ago

            Well you have been giving the 290x a hard time though…..

            • Airmantharp
            • 6 years ago

            AMD botched the cooler, it deserves it.

            But I’ve only been giving the cooler a hard time, and poking fun at those who either insinuate or outright claim that somehow AMD just felled Nvidia with a sword, as if this week and a half between releases will somehow last forever :D.

            • CaptTomato
            • 6 years ago

            As I said to you, it seems to me it’s a different approach from AMD and Nvidia wrt the coolers, Nvidia’s are excellent out the box{Titan/780}, but AMD allows AIB to improve the cooling at a small cost increase.
            This isn’t a bad idea as water coolers don’t have to overpay for a air cooler they won’t use…..and remember, water coolers are hardcore oclockers so even “good” air coolers might not satisfy them.

            • Airmantharp
            • 6 years ago

            Sure- AMD could give a rats, Nvidia listened to their customers and improved their product. They made a ton of progress between the GTX400/500-series and the GTX600/700-series. AMD made negative progress from HD6000->HD7000->R9 290(X).

            And Nvidia allows AIB vendors to improve on their cooling too; usually the same basic cooler gets used on both brands’ cards.

            • Klimax
            • 6 years ago

            I doubt that we'll see the chip pushed much further; power consumption is already high as it is. We are talking about stock cooler vs. stock cooler. AMD loses.

            Furthermore, I posit that I can keep a Titan at 1046 MHz without the fan going to 85% (aka full permitted mode) while keeping the temperature under 75°C.

            Remember, we are talking stock vs. stock, and the characteristics of the alternatives are not known, thus you cannot use them.
            Nope, you need a new attempt; mine still hasn't been invalidated.

            ETA: stock->stock, although a stick cooler might be an interesting idea.. 😀

            • Goty
            • 6 years ago

            We DO know the alternatives, as they’re already seen on many boards from both NVIDIA and AMD, and it is well-known that their performance betters that of AMD’s stock cooler. Also, we aren’t pushing the chip any further, we’re just letting it hit its design limits when it isn’t hamstrung thermally.

            Listen, I understand you feel threatened; that’s a perfectly normal reaction for a fanboy, but that’s no reason to completely ignore logic.

          • Modivated1
          • 6 years ago

          I brought the point up not to address how loud the card is; it is obvious from many reviews that the cooler is not the strong point of this card.

          My point concerned your claim that the R9 290 needed to be cranked up to its top speed to beat the Titan, which is absolutely untrue. All this argument and bickering about noise and heat is irrelevant to me, as I am buying two of the best custom-cooled cards when the vendors are allowed to release them.

          Also, even if the performance tied or was slightly inferior to the Titan's, that would still make the Titan a massive waste of money for anyone who decided to buy it strictly for games. A $450 difference is way too much for a 5-10 frames-per-second lead. The 780 is more a matter of opinion, but I agree with many of the responses in this forum, as well as every reviewer I have read, and say that I would buy the loud card with more performance at $100 cheaper than the quiet one that is inferior in performance and yet costs more.

          Some may feel that the 780 is still a great performer and doesn't carry the vices of noise and heat. I have 5.1 THX surround with a bomb sound card, so when I am playing a game the fan will not bother me. When it is idle the fan will make little noise. Heat is a different story; that's why I will be waiting for Asus and the like to build me a better blower to keep the heat down.

            • Airmantharp
            • 6 years ago

            Just a few days later, the GTX780 is cheaper than the R9 290X. There’s a reason that the ‘value’ argument doesn’t hold much water- these price drops have been anticipated for months.

            • Modivated1
            • 6 years ago

            So…. you're saying that Nvidia had anticipated being triumphed over by the R9-290X! 🙂

            I can agree with that assessment 😉

            • Airmantharp
            • 6 years ago

            Sure- it’d be really out of character for AMD to disappoint them twice in a row 😀

            • Klimax
            • 6 years ago

            Have you really missed everything out there? A chip with a 95°C limit, using quite a bit more power as it is, isn't running at the edge at all? Heh. Sorry, but the bad news is that most of its characteristics are worse than the Titan's. (The Titan doesn't need to be boiling hot to have that performance.)

            You are missing quite a lot about the Titan too, because you think the Titan only had gaming value. Sorry, but you are wrong. There is quite a bit more to the Titan than just "the thing to be beaten by the 290X half a year later".

            I should know what value the Titan has, and it's nowhere near your idea.

            • Modivated1
            • 6 years ago

            At the end of the day a chip is only worth the performance I get out of it, not what it could do. If I don't have access to the benefit of what it can do, then that unused potential cannot be considered when assessing the value of the product. This point of view covers both points you are trying to make here.

            1.

            At 95°C I would get more performance out of an R9 290X at $550 than a Titan at $1000 off the shelf. (Keep in mind, if this boiling-hot thing is an issue causing the card to malfunction, I have an AMD warranty that covers it as long as I leave it at stock settings. The Titan could possibly get that performance if you overclocked it, but that would be at your own risk, with no warranty.) Clearly a better value for performance, if I am willing to deal with the sound and heat vices, which clearly seem worth $450 to me.

            2.

            I and many other gamers could not care less about the CUDA features of the card, because they do not improve our games in any way. So gaming value is the only consideration for us, and that is why the Titan is subject to such scrutiny on these forums for costing $1000. I know that the Titan is the best buy for those who use it for work and research purposes, but I couldn't care less about that aspect of the card because it is of no benefit to me.

            That leaves me with my verdict: the Titan, though a prime performer, is edged out by the R9 290X (whose lead grows with every new driver release). When you consider performance and price from my and many other gamers' point of view, the Titan is an embarrassing card for whoever would shell out that much additional cash for less performance than the R9 290X. You could bring up sound and noise, but that would not justify the $450 premium; you could bring up the PhysX features, but most games don't include them because PhysX is Nvidia's proprietary feature and would alienate half of the PC market, so no real gain there. From a gamer's perspective, a person would look foolish to even consider buying a Titan just to game; it's simply a rip-off deal.

    • jihadjoe
    • 6 years ago

    AMD should cut a secret deal with Intel and have them fab this card.

      • Airmantharp
      • 6 years ago

      Well, AMD would have to give their CPU business to someone else to even get a dialog going :).

      But aside from Intel, I don't know who could do a better job than TSMC and still provide the volume that AMD (or Nvidia) would need. I'd consider Samsung, but they don't seem to do much better than TSMC in the mobile area- and remember that TSMC is also making all of the console APUs too. It ain't the best process, but it sure is effective.

    • BehemothJackal
    • 6 years ago

    TR peeps,

    I would love to know how Scott gets 62 FPS on average for Guild Wars 2 with the 280x with maxed settings, Supersample, at 1440p. (no sarcasm intended. I wish I could get mine to run like that!) Is the processor the difference maker?

    For some reason I only get 44 on average with my MSI Lightning R7970 if I enable supersample, and I only play at 1920×1200. I am running a i7 2600k. Would appreciate any input. I noticed the same on previous reviews, but it’s really starting to bug me that I seem to have some kind of bottleneck somewhere.

      • Bensam123
      • 6 years ago

      Last year I tried to compare (while testing powertune) and I got 10 less fps when comparing with a 7870 and a 3570k. I’m not entirely sure myself… I just took it as crap I had running in the background. I was running at 1080p too.

      • Airmantharp
      • 6 years ago

      Step one: if your 2600K isn't screaming for mercy from its overclock, you're doing it wrong.

        • BehemothJackal
        • 6 years ago

        I have it @4.6GHz

      • JohnC
      • 6 years ago

      “Benchmarking” and “playing” are two different things, especially when it comes to multiplayer games 😉 Anyways, try a synthetic benchmark for a “better” comparison.

      • vargis14
      • 6 years ago

      Step 2: get GPU-Z and make sure it is running at PCIe 2.0 @ 16x while loaded with FurMark or some other windowed stress tester.

        • BehemothJackal
        • 6 years ago

        under load it jumps to 2.0 @ 8x

    • BIF
    • 6 years ago

    Scott, you forgot to include the F@H test results.

    • blitzy
    • 6 years ago

    That’s interesting, the 99th percentile and FPS graphs are quite similar. Looks like drivers from red/green are now handling frame delays more evenly, relative to overall performance

      • HisDivineOrder
      • 6 years ago

      Yeah, I was impressed by how smooooth the R9 290X was in the frame latency tests. That’s great news. I’m eager to see if Gigabyte or Asus can use one of their coolers to cool one of these down without it sounding like a hurricane.

      I’m also eager to see nVidia respond. This is the first spark of excitement I’ve had as a PC gaming enthusiast all year long. When the Titan came at $1k, I was very disappointed and 780 showing up at a (to me) absurd $650 also disappointed.

      The reason the $550 price point of the R9 290X gets a different reaction is mostly because it looks like it’s going to jumpstart a price war that might lead to some of these cards edging down toward $500 or less. Plus, the R9 290 (non-X) and whatever nVidia tries to use as competition against it should be solidly sub-$500.

      Now if AMD would just magically pop out a serious competitor to Haswell, enthusiast PC users would have something to really read about.

    • YukaKun
    • 6 years ago

    Why did you keep Global Illumination and Advanced Lighting OFF in GRID 2?

    Since it’s a benchmark of the heavy hitters, it really makes no friggin’ sense to do so.

    Will you guys keep it on from now on for the tests? I play the game with both options turned on and it looks amazing.

    Cheers!

    • B.A.Frayd
    • 6 years ago

    Nice review guys.

    Really excited to see the R9 290X outperform the Titan. I'm hoping that NV lowers their prices so I can get a 780 at a more reasonable price.

    • Srsly_Bro
    • 6 years ago

    Does anyone care to explain why the previous flagship product wasn’t in the benchmarks with the current flagship product? This blows my mind why it was omitted from this review.

      • codedivine
      • 6 years ago

      Probably because 280X is practically a 7970GE.

        • Waco
        • 6 years ago

        It literally is.

          • Bensam123
          • 6 years ago

          It still should be there for the sake of price comparisons. I'd rather have an almost-identical last-gen model in the benchmarks than two- and three-generation-old products. Not everyone knows that they're the same card, but people still know they can buy them. So they'll question it and go to a different site to find a review. Most people won't dig through older reviews to find results to compare against.

    • TwoEars
    • 6 years ago

    It’s actually faster than I expected.

    The initial rumors were “between 780GTX and Titan” but as far as I can see this thing actually beats the Titan in most benchmarks (not only on this site).

    And this is with relatively fresh drivers remember, the Titan drivers have had a much longer time to get everything right.

    It even manages to match or beat the 690GTX in many tests, and that thing is a beast as we all know.

    It’s also priced significantly below the 780GTX, which honestly is a bit of a shocker.

    So in one swift stroke AMD has basically rendered both the 780GTX and the Titan obsolete, not too shabby! It certainly deserves the Techreport award!

    I’m sure Nvidia will answer with pricecuts and similar but this is no easy card to match! What are they going to do? Bring their Titan card down to the same price point? That will screw up their whole price point ecosystem.

    All I can say is well done AMD!! Good to have you back in the game!

      • Deo Domuique
      • 6 years ago

      I expected it to be slightly slower than the 780, but with normal temps and voltage requirements, not this volcano. Of course, I don't agree with the Titan/780 either, due to pricing mostly. So I'll wait for things to settle down. My 7950 is truly adequate for the time being.

        • TwoEars
        • 6 years ago

        As is my 670GTX SLI setup. It still maintains a constant 60fps in everything I play with all the settings on max.

        Which is why we need better screens…. 2560×1440@100-120Hz would give me a reason to buy a fancy schmancy new computer. Until then my current old overclocked X58 i7-920 is doing just fine, more than enough horsepower it seems. I don’t think I’ve kept a computer this long in my life!

        And yeah – the 290X stock cooling blows, and not in a good way. Give it another couple of weeks and we’ll see some nice ASUS and Gigabyte solutions.

        But I don’t think most people are as picky as we are about noise, temp & power. I think 90% of the people will look at the benchmarks, the price and just buy one to play BF4 with. And they’ll keep their headphones on as well so it won’t matter too much.

        In my mind this is still a pretty clear win for AMD, especially at this price point.

          • JustAnEngineer
          • 6 years ago

          Why should people be picky?

          Temperature is a non-issue, except that it seems to contribute to throttling. With better cooling, the GPU could provide even more performance.

          Power is relevant only because it determines how quickly heat must be removed. For the hours that my graphics card spends at full tilt gaming vs. idle, screen off or system off, the small cost of the electricity is unimportant.

          It’s the noise that’s the issue. Even that isn’t a problem for many folks. The card is very quiet while idling. It’s only under gaming load that the GPU cooler gets noisy. If you’ve cranked up the volume on your game, you might not be bothered by this.

            • BlackDove
            • 6 years ago

            Why make excuses for poor design or efficiency?

        • WaltC
        • 6 years ago

        You’re making way too much over the temps, imo. It’s not like there’s a world of difference between 95C and 80C….;) Like 95C is lava and 80C is ice, but that’s the gist of some comments I read. My old 5770 has been running strong under load at 100C and has been doing it consistently for four years now (it’s now in another box.) AMD initially stated that the 5770 gpu could run at 100C indefinitely, and I am a believer after watching it do so for the last four years. The fan on the thing will fail before the gpu.

      • willmore
      • 6 years ago

      Though, we’re not using the newest nVidia drivers, right? We’re using the ones that are part of the reference platform, so that puts Titan and 780 at a bit of a disadvantage. There’s some downside to the reference platform concept.

        • TwoEars
        • 6 years ago

        Oh – forgot about that. Know anywhere you can find “latest drivers vs latest drivers”?

        That’s actually a worthwhile test when you’re in the business of crowning a new performance champion…

          • Airmantharp
          • 6 years ago

          Well, they had to use the ‘latest’ AMD drivers to test the R9 290X, since that’s all anyone has, but Nvidia’s drivers have been stable for years, so using the last WHQL to ensure consistency between reviews, just like they’re using ‘older’ drivers for the other AMD cards, does make sense.

          I expect that they test every new set of drivers, and commission articles when there’s a difference worth talking about, like AMD’s frame-pacing drivers.

    • Meadows
    • 6 years ago

    At the risk of sounding like Krogoth, this is a little underwhelming. I expected more for such power usage. Judging by the driver profiles, it looks like they were really desperate about squeezing out every bit of performance. As if they had realised late in production that “this won’t cut it quite like we thought it would”.

    [b<]However[/b<] (and this is a big, important "however"), the vast memory and the improved interconnect may just be what sets this apart from the rest. You might even be able to set the fan speed to 30 or 20 percent, the temperature limit to 80 or 85 degrees Celsius, and still end up with a pretty efficient Crossfire system with insane performance regardless. [i<]The wrong GPU in the right place can make all the difference in the world, Mr. Freeman.[/i<]

    • iatacs19
    • 6 years ago

    I await to see some custom solution like ASUS DCUII or MSI Lightning. These should give us better performance/acoustics numbers.

    Very impressive at $549 indeed! Well done AMD! Let it rain $549 GTX 780s!

    😀

    • JohnC
    • 6 years ago

    The food pictures are a lie, Scott 🙁

      • Damage
      • 6 years ago

      [url<]http://t.co/kP16lVuo2x[/url<]

        • JohnC
        • 6 years ago

        What… what is that? Is that a pie of some sorts? Or something that uncivilized savages outside of NY call a “pizza”?

        • willmore
        • 6 years ago

        Looks like Chicago style deep dish.

    • DPete27
    • 6 years ago

    Too many posts to read through to see if anybody has already mentioned this, but: wouldn't these rapidly fluctuating clock speeds be detrimental to frame delivery smoothness?

    Perhaps they don't fluctuate enough to cause problems, but there's something to be said for "less aggressive" clock-speed adjustments.

    • WaltC
    • 6 years ago

    Good review, Scott. You nailed it, too. This thing exceeds my expectations–especially in pricing. nV is going to have to seriously cut prices. AMD is a great company to have on the side of consumers, as just think where pricing would be if nVidia and Intel could have their preferences–horrid thought, really.

    While 95C seems too hot for some people, temps don’t matter if the gpu is designed for them. My last card, a lowly HD 5770 rang the bell under load at a solid and consistent 100C. AMD said that the gpu was designed to run indefinitely at 100C and I believe ’em. Almost four years later and that 5770 is still running strong under load in another box. When AMD says “indefinitely” you can take it to the bank, whether it’s 95C or 100C… I think the fan on the 5770 is far more likely to fail first (before the gpu.)

    • Chrispy_
    • 6 years ago

    AMD really need to fire the guy/team responsible for their reference coolers….

    [url=https://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/12<]R9 290X blower[/url<]
    348W total system power consumption
    Single fan cooling solution
    50.9dBA
    94 degrees C

    [url=https://techreport.com/review/22890/nvidia-geforce-gtx-690-graphics-card/9<]GTX 690 blower[/url<]
    380W total system power consumption
    Single fan cooling solution
    46.2dBA
    75 degrees C

    [url=https://techreport.com/review/20629/nvidia-geforce-gtx-590-graphics-card/11<]GTX 590 blower[/url<]
    365W card TDP alone (and a whopping 541W total system power consumption!)
    Single fan cooling solution
    48.2dBA
    87 degrees C

    I mean seriously, it's hotter and noisier than a GTX590's cooler by a significant margin, and the card itself is drawing way less power. AMD's blowers are a joke. Can you please have a word with them, Scott?

    Also, kudos for getting just the right level of detail. Ryan at Anand just left blank pages and I ran out of willpower reading about the architecture over at Guru3D.

      • MrJP
      • 6 years ago

      I agree with you in principle, though in fairness the 690 and 590 both have the virtue of spreading the heat output over a larger total die area (indeed two separate dies) and therefore might reasonably be expected to have more efficient cooling for a given heat output. In the same way that Haswell is hotter than Ivy Bridge at a similar TDP.

      Having said that, AMD’s reference coolers are still clearly inferior to their recent Nvidia counterparts. I suppose they simply don’t have the same resources to put into this, and so leave it up to the board partners to provide better options. In the long run this probably doesn’t matter since you can always find a decent option if you’re prepared to do some research, but it does mean they’re always going to be disadvantaged in launch comparisons like this where only the reference card is represented.

        • Chrispy_
        • 6 years ago

        Yeah, I did consider that as a possible reason but then I remembered that heatpipes and vapor-chambers are supposed to be super efficient at carrying heat away from a tiny contact point and moving it to a large fin array.

        The GTX 580 results would seem to confirm that it’s AMD who are failing to cool the chip, and not that surface area is the limitation:

        [url=https://techreport.com/review/20629/nvidia-geforce-gtx-590-graphics-card/11<]GTX 580 blower[/url<]
        394W total system power consumption (250+ Watts card TDP with 6+8pin power connectors)
        Single fan, [i<]single GPU[/i<] cooling solution
        46dBA
        85 degrees C

        And again, in comparison to the R9 290X's retail blower:
        348W total system power consumption (also 6+8pin power connectors)
        Single fan cooling solution
        50.9dBA
        94 degrees C

        It's much noisier. It's much hotter. AMD coolers are godawful. I [b<]*WANT*[/b<] to buy a 290X, but I'm not paying for something cooled that badly.

      • danny e.
      • 6 years ago

      this is a good point.

    • chµck
    • 6 years ago

    I honestly can’t say that I’m blown away by this card.
    Yes, it performs better than the titan and 780, but at the cost of higher power consumption and heat (under load).
    You could argue “what about the price?” nvidia can easily knock down the prices of their cards to make them competitive.
    I like AMD, but this card is just a healthy improvement over the last generation; just as it should be. Nothing to get excited about like the 4870 was.

      • superjawes
      • 6 years ago

      It is kind of a boring time overall for graphics cards, honestly. I’m hoping this time next year will be interesting. That will give Nvidia time to release a “real” new gen while AMD has a chance to filter new silicon into everything below this card.

      And, of course, we’ll have a better feel on how Mantle and G-Sync will work out.

      • swaaye
      • 6 years ago

      Its performance paints it similar to 4870 but it’s not as cheap and I suppose that’s what you’re looking at. We didn’t have $1000 cards in 2008 though and the $550 price for 290 is quite aggressive considering that.

      I didn’t like RV770’s idle power issues. Never bought into that generation from either vendor. Hawaii obviously has considerably higher load power consumption though, sort of Fermi-like I think.

      • LukeCWM
      • 6 years ago

      “nvidia can easily knock down the prices of their cards to make them competitive.”

      They can, but they haven’t so far. So I’m not ready to dismiss the price advantage of AMD just yet. =]

      It’s funny, because even between generations of cards, the value proposition for GPU’s continues to get better and better, as one reduces prices to outsell the other. I’m sure everything will be different in six months, and probably even in one month. But for today, you’d have to be crazy to get a Titan instead of a 290x if your primary focus is gaming.

      (Absurdly rich counts as crazy.)

        • chµck
        • 6 years ago

        Well, the 290x has yet to go on sale, but we’ll see how this plays out.

        • Diplomacy42
        • 6 years ago

        in all honesty, if you are willing to spend even 400 on a single gpu setup, you are probably a little crazy anyway.

        (absurdly rich counts as crazy 😉

    • jessterman21
    • 6 years ago

    I think the last time the word “uber” was still cool was in the days of SSX3…

    • swaaye
    • 6 years ago

    Nice to see that it is 2x as fast as my 6970. That’s a nice number and worthy of upgrade consideration.

      • Airmantharp
      • 6 years ago

      I know, right! I skipped the HD7950 because it just wasn’t enough of a bump from my HD6950 CF setup.

        • swaaye
        • 6 years ago

        I forgot to look at the power consumption. Oh dear.

          • Airmantharp
          • 6 years ago

          That’s only an issue once the ‘better’ open-air coolers start coming out- because that’s how much heat you’ll now have to figure out how to get out of your case on your own!

    • ptsant
    • 6 years ago

    Thanks for the extensive test. I don’t mind the power consumption (I have to justify my Seasonic X-750), but the noise and temperature would probably annoy me. Nevertheless, that’s something that can easily be cured by a decent cooler. We’ll have to see what the manufacturers bring to the table. I’m sure they’re going to differentiate their products.
    I’m also happy to notice that this is a hard launch. Products are underway even in Switzerland, that is a really small, secondary market.
    Finally, the great unknown in the equation is 290 price and performance. For most of us who game on 1200p or 1080p, the 290 should suffice for medium-high detail on most games.

    • albundy
    • 6 years ago

    +1 for the uber card that gets uber hot and one that i will never be able to afford.

    • brucethemoose
    • 6 years ago

    What I’m really interested in now is the R9-290.

    The 290X is in what I consider the stratospheric price range, but like the 7950/7970 OC for OC, the R9-290 will probably be 95% as fast as a 290X, and at a ~$400 price point would make similarly priced cards look silly.

      • Pwnstar
      • 6 years ago

      If $549 is "stratospheric," what do you think of the Titan?

      [quote<]what I consider the stratospheric price range[/quote<]

        • Airmantharp
        • 6 years ago

        Cheap for a compute-oriented card? A gaming card for yuppies? The best solution (until now) if you were trying to run stupid-high resolutions?

        Mostly I call it ‘I ain’t never gonna buy that’ :).

      • Chrispy_
      • 6 years ago

      The 7950 was 7/8 of a 7970 clock for clock for about 2/3 the price.

      I’ll take 7/8 of a 290X for under $400, sure.

        • JustAnEngineer
        • 6 years ago

        If the leaks are to be believed, Radeon R9-290 is 10/11ths of R9-290X at 94.7% of the maximum core clock speed with the same 512-bit memory. That’s at least 86% of the performance.
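
          Just to show where that 86% comes from, here's the paper math (assuming the leaked numbers hold, and that performance scales linearly with shaders and clock, which it won't quite do):

              # Rough scaling sketch from the leaked specs (2560 vs. 2816 shader processors,
              # 947 MHz vs. 1000 MHz max core clock) - not a benchmark, just paper math.
              shader_ratio = 2560 / 2816        # 10/11ths
              clock_ratio = 947 / 1000          # 94.7%
              print(shader_ratio * clock_ratio) # ~0.861 -> "at least 86%" on paper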

          • HisDivineOrder
          • 6 years ago

          It does look like a great card. Too bad it’s under another one of those coolers. Probably a worse one.

          That said, here’s hoping it has more than 90% of its effective performance because that would be a sick deal and put a lot of pressure on the plain-jane 780, too.

            • Airmantharp
            • 6 years ago

            The performance does look awesome- and if it follows the pattern of the above-mentioned HD7950, the ‘stock’ cooler likely won’t even be available at retail, so hopefully ‘launch’ retail availability will be comprised of third-party coolers in the same fashion :).

    • ssidbroadcast
    • 6 years ago

    Huh, well this is good but now I’m more excited from the more mainstream spinoffs from this silicon. Probably in the 256-bit mempath range. I’m gonna hazard a guess those will be known as the “R8” series… whenever that comes out.

    Then again, nVidia seems to be going whole-hog on Steam OS support so maybe when I build a steambox I’ll go with the vendor that has more support. Unsure.

      • chuckula
      • 6 years ago

      [quote<]mainstream spinoffs from this silicon. Probably in the 256-bit mempath range.[/quote<] They've already got one... it's the [s<]HD-7970[/s<] [u<]R9-280x[/u<].

        • Airmantharp
        • 6 years ago

        I think he didn’t realize that the R9 290(X) is actually the ‘high-end’ spinoff of the HD7970- it’s the result of AMD realizing that they’d need a bigger GPU to compete with Kepler* tit-for-tat :).

        *Note that it does, indeed, compete with Kepler, while still being smaller. That is a notable feat.

          • ssidbroadcast
          • 6 years ago

          The article makes it pretty clear that the R9 290X is totally different silicon from the R9 280. I understand the R9 280 is the previous gen. What I'm looking for is mainstream binned parts from the next-gen R9 290 "Hawaii" silicon.

            • chuckula
            • 6 years ago

            [quote<]The article makes it pretty clear that the R9 290x is totally different silicon from the R9 280. [/quote<] You're right, but.. it's not a totally different GPU architecture... it's still pretty much the same GCN cores, just more of them with a new memory interface. In order to cut down the bigger chip you'd be turning off GCN cores and potentially disabling part of the memory interface...

            Turn off enough GCN cores and you're right back where you started with the HD-7970.

            Turn off some of the memory? It sounds like it might not matter, BUT there's a huge catch here: as has been pointed out in the article and in some posts, AMD intentionally simplified the memory interface circuitry to cram a 512-bit interface into a smaller area than the 384-bit interface of the earlier HD parts. The new interface is expressly designed to work with slower memory and makes up for it in raw bandwidth. However, if you cut off that additional bandwidth, you are left with a memory interface that's slower than what the 7970 already has.
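
            To put rough numbers on that catch (a sketch assuming the reference memory speeds: 5 GT/s on the 290X and 6 GT/s on the 280X/7970 GHz Edition):

                # Peak GDDR5 bandwidth = bus width in bytes x transfer rate
                def peak_bw_gbps(bus_bits, gt_per_s):
                    return bus_bits / 8 * gt_per_s

                print(peak_bw_gbps(512, 5.0))  # 320 GB/s - R9 290X as shipped
                print(peak_bw_gbps(384, 6.0))  # 288 GB/s - R9 280X / 7970 GHz Edition
                print(peak_bw_gbps(256, 5.0))  # 160 GB/s - hypothetical half-width Hawaii cut-down

            A hypothetical 256-bit Hawaii derivative kept at Hawaii's memory speed would land well below what Tahiti already delivers.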

            • Modivated1
            • 6 years ago

            At the end of the day results are what matter, would you care if AMD went back and used the 1900 xtx design but managed to get it to produce better graphics/performance than anything out there? How it’s done is not nearly as important as actually getting there. In a way it’s even more of a feat to take what you have already and rework it to beat your rivals bleeding edge tech.

            It won’t matter to me if you lifted the hood up on the card and found Mickey Mouse running on a mouse wheel to power the card as long as they produced a Winner.

            • chuckula
            • 6 years ago

            [quote<]At the end of the day results are what matter,[/quote<] Judging from your other posts.. that very interestingly all seem to be directed to this one article... I'm not taking you for a very fair-minded type, but I'm going to give you the chance to redeem yourself with a display of intellectual honesty: If that statement you make above is actually true, then you will be more than happy to say that Nvidia can respond to this launch by enabling all the SMX units on the GK-110 and pushing out a competitive response to the R9. After all.. at the end of the day results are what matter... right?

            • Modivated1
            • 6 years ago

            Right, they would probably have the leading card on the market. The problem with that result is that their chip is naturally bigger, with or without all SMX units enabled, so they would have to have a pretty effective lead to justify the price they would demand. But if they could do so, then yes, they would have the best offering for the price and the flagship card, and thus the best result. I stand by my claim that the result is what matters.

            Give me the best price/performance and I will proclaim you the best at what you do, end of story.

            • chuckula
            • 6 years ago

            Fair enough… you’re a fanboy, but an honest fanboy*, which is in short supply around here.

            * So am I BTW.

    • chuckula
    • 6 years ago

    In honor of this launch, I though it fitting to give a little speech:

    [quote<] Comrade fanboys, this is your captain. It is an honor to speak to you today, and I am honored to be trolling the forums with you on the maiden review of our AMDland's most recent achievement. Once more, we play our dangerous game, a game of Battlefield 4 against our old adversary - The Nvidia Fanboys. For forty years, your fathers before you and your older brothers played this game and played it well. But today the game is different. We have the advantage. It reminds me of the heady days of Cypress and Jerry Sanders when the world trembled at the sound of our Opterons. Now they will tremble again - at the sound of our cooling fans. The order is: ENGAGE THE UBER FAN DRIVE. Comrades, our own fanboys don't know our full potential! They will do everything possible to benchmark us; but they will only benchmark their own embarrassment. We will leave our previous generation GPUs behind, we will pass through the Nvidia forums, past their G-sync nets, and run demos outside their largest OEM, and listen to their "Jen-Hsun speeches"... while we conduct 4K graphics drills. And when we are finished, the only sound they will hear is our laughter, while we get invited to a press junket in Hawaii, where the sun is warm, and so is the...GPU core temperature. A great day, comrade fanboys! We benchmark into history![/quote<]

      • dpaus
      • 6 years ago

      I tried reading that out loud with a cheesy Russian accent, but it actually sounds better with a faint Scottish brogue. It sounds best of all in a cheesy Russian accent done with a faint Scottish brogue.

      • Airmantharp
      • 6 years ago

      Back to the top with you- this needs to ship as the introduction for every reference R9 290X manual!

      • flowerspike
      • 6 years ago

      Huang: There is another matter… one I’m reluctant to…

      Read: Please.

      Huang: One of our GPUs, a Titan, was last reported in the area of Volcanic Islands. We have not heard from her for some time.

      Read: Jen-Hsun, you’ve lost another GPU benchmark?

        • Klimax
        • 6 years ago

        Titan: Titan here. They tried to disrupt our comms, sir. Dealt with situation and whole base by way of maximum power, but had to unlock the fan. Titan, over.

          • JustAnEngineer
          • 6 years ago

          You lose the meme. I don’t think that Tom Clancy wrote yours.

          [quote<] It would be well for your company to consider that having your shills and ours, your fanboys and ours, in such proximity... is inherently DANGEROUS. Flamewars have begun that way, Mr. Huang. [/quote<]

            • Klimax
            • 6 years ago

            Shoot memes, I take my own direction.

    • CaptTomato
    • 6 years ago

    Sapphire Vapor-X or ASUS, and we'll probably see 6-12°C drops in temps.

      • Airmantharp
      • 6 years ago

      The temperature can remain at 95c- it’s supposed to be there!

      I’d rather see a 6-12dB drop in noise- but hopefully far more :).

        • willmore
        • 6 years ago

        How about you get his chocolate in your peanut butter!

        Edited to add: I'm guessing someone didn't see the old Reese's Peanut Butter Cup commercials, where one guy with chocolate would run into some guy with a jar of peanut butter, with the result that the chocolate and the peanut butter got mixed. The punchline was always "You got your chocolate in my peanut butter!" "No, you got your peanut butter in my chocolate!" The tag line was "Two great tastes that taste great together."

        I’m suggesting that *both* of these things could be done at the same time to even better benefit.

          • Airmantharp
          • 6 years ago

          Actually, they are both done at the same time- better cooling efficiency can bear out as either more performance (assuming the card can handle it) or lower noise, or some compromise of both.

          Optimally, you'd be able to push the card as far as it can go and keep noise under wraps- but you'd probably need a water loop to get there. The third-party open-air coolers are likely a great compromise, up until you need to run more than one card :D.

        • CaptTomato
        • 6 years ago

        It was 33°C in my room at 2-3pm yesterday (Brisbane), so 95°C is out!!!, lol.
        In this instance, the stock card sux somewhat, but for those who do pony up for one of the respected brands, I suspect that temps, noise and oclocking should be more than acceptable, and let’s be honest, one must suffer a little with a high end card.

          • Airmantharp
          • 6 years ago

          The better the cooling system in your computer, the faster you run it, the hotter your room, because physics!

    • the
    • 6 years ago

    This isn't the first time AMD has moved Crossfire data over the PCI-E bus (low-end 5000- and 6000-series chips did this). However, this is the first time one Radeon card can perform a DMA operation directly to another Radeon card, thus saving a trip to main memory and additional latency. I'm really curious how the new Crossfire implementation scales between Z77/Z87, Z77/Z87+PLX and X79 platforms. This may be a good test case for the additional bandwidth of the X79 platform, especially for setups with three or four R9 290X cards.

    I'm curious what AMD removed from its GDDR5 memory controllers to make eight 64-bit controllers smaller than the previous six 64-bit controllers on Tahiti. I wonder if Tahiti's controllers supported differential signalling over the actual memory bus, which hasn't been used in any GDDR5 memory chips. That'd allow for a pretty good reduction in die usage in the memory controllers. Having said that, I'm really curious just how far the memory bus on the R9 290X can go and how memory-bottlenecked the GPU is. Doubling the number of ROPs with 38% more TMUs while only increasing memory bandwidth 11% sounds like a potential issue. I suspect that 6 GHz effective is possible, which would bring bandwidth up to 384 GB/s. Considering the number of memory chips and the board's power consumption, this boost to the memory bus may make the card exceed the 300W design limit of the PCIe spec.
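
    Sanity-checking those figures (a quick sketch; the 5 GT/s baseline is the reference memory speed):

        bus_bytes = 512 / 8       # 512-bit bus = 64 bytes per transfer
        print(bus_bytes * 5.0)    # 320 GB/s at the stock 5 GT/s
        print(bus_bytes * 6.0)    # 384 GB/s if 6 GT/s effective proves attainable
        print(320 / 288 - 1)      # ~0.11 - the ~11% step up over the 280X's 288 GB/s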

    I’m really curious if partners can do a custom PCB. Moving to an 8 + 8 pin auxiliary power configuration would be a thing to look for alongside higher clocks, faster memory and a better cooler. I’m not optimistic that much overclocking can be done given the reference cooler and PCB with a 300W limit.

    One thing I haven't seen covered anywhere around the web is TrueAudio. TR seems to have a followup as a work in progress. My main question here is whether the card can output audio only over the HDMI port to a receiver.

    Another random note is that this card does not natively support VGA. It appears any sort of VGA compatibility is going to come from either a DP or HDMI active adapter.

      • Goty
      • 6 years ago

      According to things I have read on teh interwebz, the reduction in die space came from logic that enabled such high memory speeds, meaning that 6 Gt/s may actually NOT be possible, even if the memory ICs are rated for it.

        • the
        • 6 years ago

        That's the thing: differential signalling hasn't been needed by the memory chips to hit 7 GHz. Originally GDDR5 was conceived with the idea that hitting the 7 GHz target would require that technology. If memory chips can reliably hit such speeds without it, why not remove it and save die space? Having said that, it is always going to be more difficult to increase bus width while maintaining clock speeds. Thus I wouldn't expect the R9 290X to hit 7 GHz or more like some GTX 770s can do, but 6 GHz doesn't seem outlandish.

        On that note, I forgot to add the possibility of custom PCB designs carrying 8 GB of memory. That'd be way overkill today, but with a new wave of consoles upon us, various ports could easily be using more than 4 GB of textures in the not-too-distant future.

          • Airmantharp
          • 6 years ago

          I don’t even think that a custom PCB will be needed- I’d be willing to bet that AMD already has 8GB versions ready to go, as FirePros if not as Radeons. Nvidia is shipping GK110-equipped pro cards with 12GB. Adding RAM isn’t hard, or expensive :).

          And yeah, I couldn’t conceive of getting a 4GB GPU today. I’m running 1600p, and along with 1440p users, 3GB is the bare minimum for current top-end games; 2GB barely cut it for the last round. I’m expecting to need at least 6GB for real next-gen games that take advantage of these consoles’ memory budgets, and I’d prefer 8GB just to be safe!

            • the
            • 6 years ago

            I should probably hunt down a high-res screenshot of the R9 290X to see the part numbers on the memory chips. There very well could be higher-density chips out there that would genuinely permit an 8 GB card without modding the existing PCB. That wouldn't address the power side of things if the memory chips are higher wattage (it uses a lot of memory chips to get a 512-bit-wide bus).
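
            For what it's worth, the chip-count arithmetic is simple (assuming the usual 32-bit-wide GDDR5 devices; the 8 GB option is hypothetical):

                chips = 512 // 32      # a 512-bit bus needs 16 of the 32-bit GDDR5 devices
                print(4096 // chips)   # 256 MB (2 Gb) per device for the existing 4 GB card
                print(8192 // chips)   # 512 MB (4 Gb) per device for a hypothetical 8 GB card

            So 8 GB on the same layout needs 4 Gb parts; the alternative is doubling the chip count in a clamshell arrangement, which is where the extra power and board-space questions come in.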

            The 12 GB Quadro K6000 does use a custom PCB to get that amount though. 🙂 An 8 GB FirePro card is well within reason though I’d still expect a slightly different PCB to account for a different port layout (four miniDP and one DL DVI?).

            • Airmantharp
            • 6 years ago

            If they don’t exist, I’m sure Samsung can make them 😀

          • willmore
          • 6 years ago

          Let me start by saying “I don’t know what they left out either, so this is just speculation.”

          The higher in frequency you go, the harder it is to keep signal integrity. You have to:
          1) use fancier PCB designs (better controlled physical properties, more layers, manually routed signal lines, etc.)
          2) more complex equalization
          3) higher signaling voltages
          4) other techniques indistinguishable from magic

          Since 3 is set by the memory standard, that's out. Only #2 affects die space much (since we ruled out #3), so I'd guess that's what they left off.

          The downside of this is that it might also affect putting multiple chips on each memory controller. Which means that any 8G variant would need memory chips of higher density--not twice as many.

          Again, that’s all just speculation. I imagine someone will ask AMD nicely and get an answer from an engineer who actually knows.

      • BlondIndian
      • 6 years ago

      Anandtech said that they removed complex PHYs on the memory controllers, and thus the Hawaii memory controller is 20% smaller than Tahiti's at the cost of lower clocks.

      AMD also used cheaper memory chips (Elpida IIRC) to lower cost, so OC on stock is bad. OEMs with Hynix chips might do better.

      There will definitely be custom cards, and 8+8 is almost guaranteed. Why would they not?

      TrueAudio already got a write-up on TR. However, the drivers/games for it are still not available.
      The interwebs do state that TrueAudio will be more like a filter between the CPU and the codec. It will be able to output over everything your PC already does.

      RIP VGA. Time to put the old bird to rest 🙂
      I still find ultra-budget monitors (think 14-17″ 768p TN) shipping with VGA, however I doubt they would pair those with the 290X.

        • the
        • 6 years ago

        Complex PHY could mean what I was suspecting: they removed differential signaling support on the data bus. It isn’t that specific unfortunately.

        Agreed on the memory chips. I’m eager to see what this card can be pushed to. Breaking 400 GB/s will likely be possible for some golden samples.

        It took awhile for partners to come out with custom Radeon 7970’s. Some high end cards don’t even get custom PCB treatment: case in point all Titan cards are reference PCB’s. AMD may simply not allow their partners to experiment. I haven’t heard either way. Regardless, I’m optimistic about reaching higher memory clock speeds even on the reference PCB.

        There are a handful of 23″ 1080p displays floating around today with only VGA. I know because a bean counter at work managed to order some to the despair of all those using them.

        • Airmantharp
        • 6 years ago

        TrueAudio is perhaps the most niche-oriented of the new technologies- but it does have the potential to do for gaming/simulations what hasn’t been done since the days of Aureal.

        PhysX was supposed to help here too, but we all know how that went, and yes, that's a dig on Nvidia. Still, TrueAudio looks like it's designed to combine the type of channel mixing that Creative/EMU do, along with the spatial calculations that Aureal did, with live world data that's already available to the GPU, using the GPU's extensive resources. It's one of those technologies that should tie everything together quite nicely and provide that 'next step' of immersion.

        As for VGA, well, unless you’re running an AMD FX CPU or a Xeon, you’ve got integrated video that should work alongside your discrete GPU and provide VGA output. If not, adapters are cheap.

    • LukeCWM
    • 6 years ago

    1) Excellent review. Scott, you’re my hero! *blushes*

    2) It’s funny, because the 5870 is by far the dog of all the cards tested, yet that’s exactly what I have! Thanks for including it, to show me the benefit for upgrading. (Assuming I’d game at exactly the same settings, I could see 2.5x the framerate for only $250!)

    3) I’m really glad you pointed out that we have no idea how Crossfire will perform here. That’s key, and further testing is needed. I feel other sites give products the benefit of the doubt and then write a glowing review. TR is more prudent and cautious, and that’s one of the reasons it is the best.

    4) I assume we’ll see a 4k shootout among the best cards? Probably after a second 290x arrives?

    5) Of course the 290x is much cheaper than the competition, but it looks like they didn’t put much into their cooler. Is it likely we’ll see appreciable improvements in sound and/or overclocking headroom once third parties start putting different coolers on the 290x? Perhaps EVGA?

    6) For the love of donuts, please [i<]somebody[/i<] write a troll post about TR showing their true Nvidia fanboy-ism. I'm dying to read it! =D

      • Jigar
      • 6 years ago

        Scott has been an Nvidia and Apple fanboy for ages; all the regulars at TR know it, so why should we mention the obvious? 😉

        • Modivated1
        • 6 years ago

        Fanboi'izum is fine as long as you are not delusional. Everyone is a fan of something; the key here is to be able to admit the truth when you see it and concede that the one you favored lost, that the rival has bested them.

        The real trolling begins when, no matter what the outcome is, the fanboi cheerleads for his/her team and proclaims doom and gloom for all rivals.

      • libradude
      • 6 years ago

      EVGA is a premier nvidia partner… so re: #5, no.

    • phez
    • 6 years ago

    So you can’t use water cooling because it’ll boil the water?

      • Airmantharp
      • 6 years ago

      Only if the pump breaks… but then you have other problems 😀

        • willmore
        • 6 years ago

          I'm starting to think that some people around here don't understand power, heat, and temperature. Are you getting that impression, too?

          • Airmantharp
          • 6 years ago

          Noise is the part that they’re really not understanding, in my opinion, but I agree with you. It’s like a whole new generation needs to be educated.

          Granted, I’m thinking phez was just being facetious. I actually appreciated his comment, it made me chuckle 🙂

            • willmore
            • 6 years ago

            You know, if the chip can tolerate just 5 more degrees, then they could use a water based phase change cooling system. 😉 That’d be cheap.

            Edited to add: Then it would become dependent on the elevation of the reviewer for how well it would perform. 😉 Wouldn't that get the fanboys' knickers in a twist!

            • odizzido
            • 6 years ago

            Water boils at around 95 degrees at about 1500m above sea level. There are a ton of cities above this elevation. Actually, there is a city that has over a million people living there and their boiling point is like 87 degrees.

            Even in my city water boils at 96 degrees. I am sure the 290X will reach that temperature sometimes.
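
            Those numbers roughly line up with the common rule of thumb of about 1°C lower boiling point per 300 m of elevation (a crude linear approximation, not real thermodynamics):

                def boiling_point_c(elevation_m):
                    # crude approximation: ~1 degree C drop per ~300 m above sea level
                    return 100 - elevation_m / 300.0

                print(boiling_point_c(1500))   # ~95 C
                print(boiling_point_c(4000))   # ~87 C, the kind of high-altitude city mentioned above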

            • phez
            • 6 years ago

            I was being serious. Because you know, the laws of physics or something like that.

            • Airmantharp
            • 6 years ago

            The GPU core is 95c- the cooling interface isn’t, because that energy is dispersed by conduction into some form of ‘sink’, be that a heatsink with fins or heatpipes (and then fins), or a waterblock, or a metal cup to hold dry ice.

            And if it’s a waterblock, not only will the interface temperature be cooler, but the water will both enter the block ‘cooler’ and will not heat up to that interface temperature, because it keeps moving.

            • willmore
            • 6 years ago

            Let's start with this: heat is not temperature. Temperature is a property of some mass with a given energy (heat energy in this case). Spread the same amount of heat over more mass and the temperature goes down.

            Next, heat flows from hot areas to cold ones. If a source of heat is applied to a mass which has little thermal transport to the outside (generally called 'ambient'), then it will heat up until that weak transport can move all of the heat being applied to the mass. This is equilibrium. If that thermal transport is improved, then the equilibrium temperature is lowered.

            So, let's look at your question: "So you can't use water cooling because it'll boil the water?" Now, that makes the assumption that the chip is an absolute temperature source--which isn't the case, it's a *heat* source. If we can remove the heat from it better than the existing HSF, then the temperature of that heat source will *decrease*. So, if we assume that a water cooling solution is better than the supplied HSF, then we won't be anywhere near that magic 100C point where water starts to make a simple thermal calculation more complex (phase changes are fun!).

            Background info: For a given heat generation, a better cooling system will have lower temperature differentials throughout. In proper terms, for a given heat flow, lower thermal junction resistance leads to lower junction temperature differentials. Also, CMOS chips decrease in performance with increases in temperature. So, a cooler chip will actually dissipate *less* power for a given clock speed and voltage. Now, in the temperature range we're working with, it's not a huge difference, but it might be enough to eke a few tens of MHz out of the chip.
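
            A toy example of that equilibrium (the thermal resistance figures are invented purely for illustration, not measured for any real cooler):

                def junction_temp_c(ambient_c, power_w, r_th_c_per_w):
                    # steady state: die temperature = ambient + power x junction-to-ambient thermal resistance
                    return ambient_c + power_w * r_th_c_per_w

                print(junction_temp_c(25, 250, 0.28))  # 95 C with a weak (hypothetical) cooler
                print(junction_temp_c(25, 250, 0.20))  # 75 C with a better (hypothetical) cooler, same 250 W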

            Additionally, since the thermal management of this chip is geared to limit power dissipation, temperature, and fan speed: if we take fan speed out of the equation with water cooling and lower the temperature to below where the chip will throttle the clock, then a water-cooled R9 290X should perform well better than one with the stock HSF. The one big caveat to that is *as long as we can use more power without exceeding the voltage limits of the chip*.

            Okay, this got way too long. More later, if it proves necessary.

            TL;DR: No, because *physics*.

            • Billstevens
            • 6 years ago

            Basically, you would need to do a fair amount of math, but a 90+C temp reading off the chip doesn't mean it's putting off enough heat to boil a flowing liquid cooling source. For the water to boil, the water itself would have to be 100 C. Since the flowing water in the cooler is all in contact, it should share its heat fairly evenly, so the chip would have to raise all of the water to 100 C. You have the small surface of a chip running at, say, 90 C heating up a much larger body of water. There is no way that water is going to get to 100 C as long as heat is being dissipated from it by the radiator.

            The water will be at some elevated temperature, dissipating the chip's heat through the radiator, but it should be well below the temperature of the chip itself.
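
            A sketch of the steady-state math (the 300 W load and 1 L/min flow rate are just illustrative assumptions, not measurements):

                def coolant_delta_t(power_w, flow_l_per_min):
                    c_p = 4187.0                       # J/(kg*K), specific heat of water
                    mass_flow = flow_l_per_min / 60.0  # kg/s, since a litre of water is ~1 kg
                    return power_w / (mass_flow * c_p)

                print(coolant_delta_t(300, 1.0))   # ~4.3 K rise across the loop at a modest 1 L/min

            So the loop water ends up only a few degrees above whatever the radiator can hold it at, nowhere near boiling.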

            • willmore
            • 6 years ago

            Billstevens, for a start, yes. Note I said that temperature is a certain amount of heat per unit of mass. The great thing about using water to cool is that it has a lot of thermal mass and it's easy to get a bunch of it past the thermal interface quickly. Compare this to air, which has a fairly low thermal mass, so you have to move *a lot* of it to get the same thermal equivalence.

            Here’s the numbers that make water cooling a good solution. The Volumetric heat capacity in “J/(cm^3·K)”:
            Water (at 25C): 4.1796
            Air (STP): 0.00121

            So, one volume of water has 3454 times the thermal mass of the same volume of air. That’s a lot of air you’d need to move to cool as well as water does!
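
            To make that concrete (allowing a 10 K coolant/air temperature rise; the 300 W load is just an illustrative number):

                power_w, delta_t = 300.0, 10.0
                water_flow = power_w / (4.1796 * delta_t)   # ~7.2 cm^3/s, i.e. ~0.43 L/min of water
                air_flow = power_w / (0.00121 * delta_t)    # ~24,800 cm^3/s, i.e. roughly 52 CFM of air
                print(water_flow, air_flow)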

          • Billstevens
          • 6 years ago

          Most people avoid physics courses or these types of problems. I loved fluid dynamics because the concepts were so easy to visualize and grasp. But the calculations can get as hard and complex as you like. All depends on how accurate you have to be. I guess you can get away with being a computer enthusiast and not have the engineer's tendency to look up how things work.

          I'll assume for now that most TR readers, though, are just happy to joke that, in a race to boil water, this bad boy would likely lead the pack if you programmed its stack without the thermal shutdown enabled.

            • willmore
            • 6 years ago

            As a computer engineer, I probably missed the joke. 😉

      • UnfriendlyFire
      • 6 years ago

      You want to boil water? Try using one of those first-generation closed-loop water coolers and bolt it onto the GPU. And block all of the vents with expanding foam to reduce the tiny radiator’s heat transfer efficiency to almost useless.

      Oh, and hack the BIOS or PCB so the GPU doesn’t throttle or shutdown when it reaches beyond 100C.

        • WaltC
        • 6 years ago

        Nah…if you want to boil water…use a stove. It's much simpler, trust me…;)

      • Meadows
      • 6 years ago

      It doesn’t have enough wattage for that.

        • willmore
        • 6 years ago

        Okay, now I’m pretty sure we do need a course in basic thermodynamics. *sigh*

          • UnfriendlyFire
          • 6 years ago

          Insulate the system, turn off all safeties, and watch the water boil.

            • Airmantharp
            • 6 years ago

            I’m pretty sure I could set up an experiment that would boil some quantity of water with 1W of energy.

            • Meadows
            • 6 years ago

            Yes, but there’s not just “some quantity” in a water cooling circuit, but more.

            • Meadows
            • 6 years ago

            The card will fail before the liquid phase does.

          • Meadows
          • 6 years ago

          Oh pull your head out of your arse, you know it as well as I do that this card can’t do that. It’s the same reason why cooking eggs on top of a Fermi card failed (it was fun publicity, sure, but seeing an online publication confuse “temperature” with “heat” was disquieting).

          Maybe I’m not an expert physicist, but I’m not wrong.

      • jihadjoe
      • 6 years ago

      Then you have phase-change cooling!
      OH TEH AWESOMENESS!

      • JustAnEngineer
      • 6 years ago

      Even if we really did need for the water to get a little hotter than 100°C in our hypothetical GPU cooler, that’s an easy problem to solve. Look at car engine cooling systems. By pressurizing the system, you can raise the boiling temperature. You can also add something to the water. Increasing the concentration of solute raises the boiling temperature at a given pressure.

      As others have pointed out, heat flows from hotter stuff to cooler stuff. The bigger the difference in temperature, the faster the heat flows. The larger the contact area, the faster the heat flows. The more thermally-conductive the material, the faster the heat flows. The shorter the distance, the faster the heat flows.
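
      Those four factors are just the terms of the basic conduction relation; a minimal worked example (the copper-baseplate numbers are invented for illustration):

          def conduction_w(k, area_m2, delta_t_k, thickness_m):
              # Fourier's law for a flat slab: flow scales with conductivity, area and
              # temperature difference, and inversely with thickness
              return k * area_m2 * delta_t_k / thickness_m

          # e.g. copper (k ~ 400 W/m*K), 4 cm^2 of contact, 20 K drop, 2 mm thick:
          print(conduction_w(400, 4e-4, 20, 2e-3))   # 1600 W of conduction capacity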

        • DaveBaumann
        • 6 years ago

          There's no difference between this ASIC and any other - they all get hot. In this instance, all we've done is change the fan control mechanism to explicitly target a temperature that is optimal from a performance and acoustics perspective. Putting on a water cooler is no different here than in other cases.
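
          For anyone curious what "target a temperature" looks like in practice, here's a minimal, purely illustrative control-loop sketch; it is not AMD's actual fan/PowerTune algorithm, just the general shape of one:

              def next_fan_duty(duty, temp_c, target_c=95.0, gain=0.02):
                  # simple proportional controller: nudge the fan toward whatever duty
                  # cycle holds the die at the target temperature
                  error = temp_c - target_c
                  return min(1.0, max(0.2, duty + gain * error))

              duty = 0.4
              for temp in (80, 90, 95, 97, 95):   # made-up telemetry samples
                  duty = next_fan_duty(duty, temp)
                  print(round(duty, 2))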

          • JustAnEngineer
          • 6 years ago

          I didn’t believe that there was any difference, Dave. I’m just a stickler for helping folks understand physics.

          The folks that have installed the commercially-available 3rd-party waterblocks on their early R9-290X cards are mostly running them at lower temperatures than the stock cooler’s target so that they can overclock more. Their performance-prioritized optimization doesn’t have as much challenge to balance acoustics with that goal as the stock cooler does.

      • D@ Br@b($)!
      • 6 years ago

      You can use water cooling, and whether it will boil depends on the pressure and temperature of the water.
      Water can boil at a temperature of 50 degrees C when the pressure is down to 0.1 bar.
      [url<]http://www.engineeringtoolbox.com/boiling-point-water-d_926.html[/url<]
      And you should use boiling water to cool, because boiling water can absorb much more heat per given amount or flow of water: the latent heat of evaporation of water is about 2,270 kJ/kg, versus a specific heat of 4.187 kJ/kg·K.
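      Putting those two figures side by side, a rough worked comparison (the 10 K rise is an arbitrary illustrative choice):

          # Heat absorbed per kilogram of water, sensible vs. latent
          c_p = 4.187     # kJ/(kg*K), specific heat of liquid water
          h_fg = 2270.0   # kJ/kg, latent heat of evaporation
          sensible = c_p * 10          # ~42 kJ/kg for a 10 K temperature rise
          print(sensible)
          print(h_fg / sensible)       # ~54x more heat absorbed by evaporating the same mass

      which is why boiling and phase-change schemes can move so much heat with so little coolant flow.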

    • Forge
    • 6 years ago

    I’m looking forward to the $600-650 Titan this is likely to provoke. Titan was mildly interesting at $1K; at $600 it becomes a very viable option for me.

      • Airmantharp
      • 6 years ago

      It’ll be faster than the Titan and called ‘GTX780Ti’ instead, but yeah, it will become very viable- as long as they remember to weld some RAM to it!

      • jdaven
      • 6 years ago

      But the R9 290X beats Titan in almost every single benchmark and is only $550. Why would the slower Titan be more attractive at $600?

        • l33t-g4m3r
        • 6 years ago

        OpenGL, FP16(wtf amd), 3d Vision (amd has nothing/crappy 3rd party), FXAA (better than mlaa), Ambient Occlusion, day1 game support, power, temps, noise, smoother framerates & less inconsistencies with indie titles, not to mention gsync with future monitors, and 6GB ram. There’s a few.

          • Airmantharp
          • 6 years ago

          See, HERE’S an unrepentant Nvidia fanboy!

          Now get off my case :).

        • Klimax
        • 6 years ago

        Usually barely, and without much of a safety margin. A fan-unlocked Titan would likely bury it, but that can’t be tested. Got smaller resolutions and no FCAT.

      • Pwnstar
      • 6 years ago

      There is no way nVidia is dropping Titan to $600. It is their prosumer compute card. The 780 and 780 Ti are what will drop in price, not Titan.

        • Airmantharp
        • 6 years ago

        Be careful, you’re using too much reason for this lot!

    • anotherengineer
    • 6 years ago

    Hey Damage, did you get a chance to test that thing out on the 4K monitor you have, or did you have to send it back to Asus? Or did I just miss it, since I am sick and half asleep?

    • Bensam123
    • 6 years ago

    Definitely hope to hear more about TrueAudio! That seems to be one of the most exciting bits about this card. It’d also be nice to test out XDMA and see if it needs PCIe 3 to stretch its legs or if it’s still fine on 1 and 2, or x16 vs x8… All the nitty-gritty stuff. ^^

    I don’t really see a problem with the thermals or power draw; I do, however, see one with the acoustics. I’m not sure I’d want one based solely on that, as I’ve come to value a quiet computer quite a bit.

    Hopefully the r9-290 will address all of the above problems, which we all know will show up shortly, probably after Nvidia introduces something to try and trump AMD. That’s a card I’m actually interested in buying as it will be priced reasonably well and perform quite well, maybe even with some overclocking headroom.

    Speaking of overclocking headroom, I’m surprised Scott didn’t crank the fan up to 100% and go for some extra MHz. It may not be something you’d want to endure for long periods of time, but I think it’d be great to see what kind of leeway is there.

    Overall a pretty sweet card release, if a bit delayed. The big parts of it I think are yet to come… GCN, Trueaudio, Mantle…. The bridge finally going is also a plus point. G-sync isn’t bad either, but it’ll probably be some time before we really see monitors showing up with it and for AMD to release their own version (or adapting g-sync). That’s another locked down bag of tricks.

      • willmore
      • 6 years ago

      [quote<]Speaking of overclocking headroom, I'm surprised Scott didn't crank the fan up to 100% and go for some extra mhz.[/quote<]
      I’d guess time was the big constraint--he had already run everything twice to test the ‘UBER’ setting. Gotta meet deadlines and sleep once in a while. I would request it for an upcoming article or a refresh of this one. *please* Don’t forget power and noise measurements for completeness’ sake. 🙂
      I need to find some other site willing to slap a water block on it and hook it to a tub full of ice or something. 🙂 Winter is coming and cooling gets cheaper then. Heck, I heat my office with my PC in the winter. I wonder what HPC project I should support this winter. F@H? Mersenne prime candidate factoring?

        • Great_Big_Abyss
        • 6 years ago

        Apparently he had finished his review by the 15th… that leaves, by my count, 9 days to play around with it.

          • Bensam123
          • 6 years ago

          Yeah… it seems Scott and Geoff have real-life stuff competing for TR time. Geoff doesn’t really do in-depth articles anymore except for the occasional hard drive or motherboard review. Scott has been skimping on the main articles (cutting out the 7xxx series to save time, Mantle, TrueAudio), I’m guessing because family time is competing with it.

          They should consider taking on another editor (or two) to do main articles.

            • Damage
            • 6 years ago

            I just want to say that I have been working really, really hard, like harder than any period in my life, in the past few months. Your perception and reality are far, far from one another.

            I suppose that makes sense, in a way, since we don’t talk about when we receive cards for testing, when we actually get drivers, when we find out about things like Uber mode, or exactly how much time is burnt on things like travel, testing, verification of results, or even graph creation. Or how hard writing is.

            I hope other TR readers don’t put any stock in your assessment. We are definitely not slacking off as a publication. Quite the opposite. We just have a demanding job where the final output is the distillation of a tremendous amount of work you don’t always get to see.

            • JohnC
            • 6 years ago

            I think most of the people understand and appreciate that, Scott, but you guys should still consider finding more editors – it would be beneficial both to you and to your readers.

            • Damage
            • 6 years ago

            I’m glad to hear folks want more TR. That is a business challenge in the current climate, though. We are thinking about ways to expand in spite of that, but it will require some innovation!

            • Bensam123
            • 6 years ago

            I wasn’t trying to be rude; that’s the way it seems from the outside. I was basing this on the reviews we receive, points in the benchmarks, and how thorough the review is. For instance, discussion of Mantle, GCN, and TrueAudio: normally you guys would dig into those more thoroughly and give them an entire section of a review (or part of a page), but we just got a little snippet this time around that says ‘coming soon’. Other reviews are like that as well. Haswell, for instance, didn’t have nearly as many processors in it as Vishera did, although there was the mega compilation at the end. Other articles people are hoping for never make the site, like a review of Creative’s Recon chips.

            I wasn’t saying you guys aren’t working hard, rather that you have too much on your plate now, which is why you should consider hiring more help. It’s entirely possible to be ‘too busy’. I think this is a sign of TR growing as a site, which isn’t a bad thing, but it’s important to note when you can’t keep doing everything yourselves.

            • Airmantharp
            • 6 years ago

            GCN’s been discussed ad nauseam; it’s three-year-old technology.

            Mantle will never really be easy to address, and there’s literally nothing to address that hasn’t been- just like TrueAudio, it exists only in marketing slides, for now. Both are incredibly interesting and will likely change the computing landscape at some point in the future, but we don’t even have a tech demo for either yet.

            Now, I’d love to see a review of Creative’s SoundBlaster Z line, and I’d love for the ‘Recon3D’ line to be forgotten from our collective memory, too- and an analysis of where on board audio is would also be nice.

            But hiring people? That’s scary. I mean, I’d love to work for TR too- but hiring writers for tech articles is a truly scary proposition, to say the least. For HardOCP, the best thing they ever did was to get Kyle out of the picture- that dude just can’t use the English language. For Anandtech, it’s been a back-and-forth thing; their video card dude is decent, and I cherish the stuff that Anand himself writes, but the overall quality of the site has notably decreased since he got out of the scene.

            • Bensam123
            • 6 years ago

            Airman stop being bitter about the R9-290x being a Titan stomper and Nvidia losing completely this time around. Your bias is starting to show. Take a step back, take a breather, and be a bit more objective.

            Shit even Chuckula is doing a better job of controlling himself (probably more so because he’s under fanboi watch).

            • Airmantharp
            • 6 years ago

            See, here’s the thing- I’m not bitter that it’s fast; quite the contrary. I’m bitter that AMD assed up the blower (again). I actually wanted one of these things, and could buy one or two right now.

            • Ryu Connor
            • 6 years ago

            Speaking as one of the mods able to ban. Let me kindly ask you to not taunt/presume the status of others. It’s always better to worry about your own standing in such things.

            • Bensam123
            • 6 years ago

            Huh, well it would’ve been nice if you reminded Chuckula of such things 10 months ago.

            • Airmantharp
            • 6 years ago

            If you perceive a post/message as a personal attack, you should send it to the admins for review; that’s what they’re here for!

            • Bensam123
            • 6 years ago

            It’s interesting that I say the same thing JohnC does (only he doesn’t mention them being too busy to flesh things out) and he gets +1 and I get -7.

            • Mentawl
            • 6 years ago

            How can you have a “main article” on Mantle or TrueAudio when these technologies are still in development? You need to lighten your expectations, sir. Rarely do I find info elsewhere online that is more comprehensive than TR’s.

            • Bensam123
            • 6 years ago

            Perhaps you missed this…

            [quote<]The most notable of the new features is probably the TrueAudio DSP block for accelerated processing of sound effects. There's much to be said on this subject, and I intend to address TrueAudio in more detail in a separate article shortly.[/quote<]
            TrueAudio is done, AFAIK, and it’s not unlike TR to talk about technologies that are still in development, say, G-Sync. TR does more than just benchmark.

            • Modivated1
            • 6 years ago

            The 7xxx series is done with! Why would you think there need to be more reviews of a last-generation card? Also, so much information has been coming out that I think they are swamped with work, and the articles they write are usually very in-depth; in fact, I would credit them with being the key factor in getting frame pacing to become an industry standard that almost all review sites use.

            I do think that they might benefit from another knight at the round table, seeing that Shortbread seems to be a lot more scarce, but the job they do with the team they have is among the best in the industry.

            Whatever personal favor they have for a product (if indeed they do have it) doesn’t affect their ability to be objective with reviews. They always concede to the truth of the matter, and that’s what counts when you are trying to weigh the pros and cons of tech.

            • Bensam123
            • 6 years ago

            The 6xxx series is done with, the 5xxx series is done with. It’s not about reviewing the cards, it’s about comparing the prices to current gen (for the 7xxx series), which are still relevant as 7xxx stock is being sold off (and the prices the 7xxx stock were at when the R9 and R7 reviews came out).

            Yes, they’re too busy… that’s why I was suggesting taking on more staff to fill in the gaps.

            I didn’t mention anything about biases. :l

      • BlondIndian
      • 6 years ago

      Agree that the 290 is the card to get. GCN has proved to be a pretty good arch overall.
      I love the idea of G-Sync but hate artificially limited proprietary protocols. Here’s hoping DisplayPort comes up with a new spec 🙂

        • Airmantharp
        • 6 years ago

        It’s the card to get, but as you’ve said elsewhere, wait for better coolers!

        And I think we can all agree that a wide-ranging ‘G-Sync’-like specification for DP is absolutely needed. I’m seriously hoping that someone in the hardware hacking community is able to demonstrate that an AMD or Intel GPU can be made to work with a G-Sync monitor when they become available.

          • BlondIndian
          • 6 years ago

          I said “wait for better coolers” w.r.t. the 290X. The 290 is the card where I’ll put my money (after reading reviews).

          I hope DP comes up with an alternative to G-Sync, one where the final monitor-side card is also an open spec. Having G-Sync as an Nvidia exclusive will ensure monitor prices remain high for a long time.

          Nvidia (and AMD for that matter) will exploit any advantage they have by price-gouging consumers as much as possible. Open standards are the way forward. I’m personally hoping Intel will champion the new spec, as they have a lot to gain from this on mobile platforms.

            • Airmantharp
            • 6 years ago

            Given how tightly coupled performance is to cooling, I’d expect that the 290 might not be that much of an improvement. It might even clock higher than the 290X, but it’ll need more voltage to hit those clocks, so if it does overclock to 290X speeds, it’ll probably be louder :).

            AMD has done themselves no favors with their coolers. They should probably have just used an open-air cooler as the reference cooler instead, since it’s pretty obvious that the blower is a real disadvantage, like it was for the previous generation, and because that’s what most people will be buying anyway.

            • Bensam123
            • 6 years ago

            There is less die there to power. Just like with the 6350 for AMD chips, you have much more thermal headroom there to overclock or even to manage normal temperatures.

            I disagree about an open-air cooler; that’s bound to make my case (everyone’s case) extremely toasty. But yes, they need something more like the Titan cooler…

            • Airmantharp
            • 6 years ago

            You’re right, so here’s two more things to think about-

            If the die is smaller, that means that the contact surface is smaller, and thus temperatures are going to be higher, assuming equal power draw- but the 290X uses more power than any other single GPU card on the market, including the much larger Kepler-based 780. There’s a lot of ways for that to go wrong.

            Open air coolers are the only way to go with this card- that’s if one of AMD’s partners doesn’t pull a rabbit out of their hat and bolt an actually useful blower to it. That’d definitely be unprecedented.

            Neither AMD nor Nvidia have a history of revising coolers after the fact, unless it’s to make them cheaper- so we can’t really count on AMD to do anything with it until they refresh it, if they can. I don’t hold out hope for a ‘not so stupidly loud’ edition. Maybe HIS will fix this one, if open-air coolers don’t work (and they don’t for me either).

            • Bensam123
            • 6 years ago

            Sure, if the surface area is what’s limiting heat dissipation and not the cooler. If it were the surface area, the temperatures couldn’t be kept in check.

            I don’t know; as other people have pointed out, the Titan cooler does a much better job, and I would agree. That isn’t an open-air cooler.

            • Airmantharp
            • 6 years ago

            If someone would source the Titan cooler and bolt it on to an R9 290X, that’d be fricken’ awesome. I want to believe that it’s possible, but history is running in the opposite direction.

            • Bensam123
            • 6 years ago

            Yup, I’m looking forward to the 290 as well. There is always a price premium on the top of the line card (even for AMD). The second best is really the sweetest spot for me (discounting double chip cards). That or the top of the line mid grade, but in this case they don’t exist yet.

            • Krogoth
            • 6 years ago

            I expect that 290 will yield roughly the same performance as 780 and hopefully, AMD will try to push it for a $399 price point which in turn forces Nvidia to do the same with 780.

            • Airmantharp
            • 6 years ago

            And hopefully AMD will not even bother bolting this cooler to it, and just let their partners put something useful on from the start, like they did with the HD7950.

        • Bensam123
        • 6 years ago

        Yeah, Nvidia really has to stop locking down all their new toys. People just come up with other options for the same thing, it delays market adoption, and then Nvidia’s version is eventually forgotten because no one uses it anymore in favor of whatever open standard pops out. I think that’s one thing AMD really understands and, unfortunately, Nvidia is being too greedy about.

          • Airmantharp
          • 6 years ago

          You’d likely agree that the situation is the same between Intel and AMD- and it is frustrating.

          But I don’t think AMD, Nvidia, or Intel are at a loss of understanding or are flush with greed- everything comes for a price, after all, and there’s more than one way to approach the process of making ‘high end’ and ‘low end’ parts.

    • Damage
    • 6 years ago

    User dzoner has been banned because he’s spigzone, who was already banned. Have his IP ranges flagged now.

      • chuckula
      • 6 years ago

      Really… I liked spigzone a whole lot more than dzoner (unless there’s more than one of them!)

      • ssidbroadcast
      • 6 years ago

      dzone’d!

      • ClickClick5
      • 6 years ago

      Again I post: [url<]http://imageshack.us/scaled/landing/11/banhammer.gif[/url<]

      • JohnC
      • 6 years ago

      I still see him making posts here :-/

        • chuckula
        • 6 years ago

        It is almost Halloween… ZOMBIES!

          • superjawes
          • 6 years ago

          Rule #2: Double Tap

            • HisDivineOrder
            • 6 years ago

            Rule #32: Enjoy the little things. 😉

        • Damage
        • 6 years ago

        I banned him even harder just now.

          • ClickClick5
          • 6 years ago

          Just maximum ban him/her. Cast him into the Nth dimension of banhood.

            • Klimax
            • 6 years ago

            Maximum BAN. (Voice from Crysis)

            • UberGerbil
            • 6 years ago

            TRIPLE BAN… MEGA BAN…
            ….
            [b<]ULTRA[/b<] BAN

            • ClickClick5
            • 6 years ago

            BANTACULAR!

            DAMAGE IS ON A BAN STREAK!

            • Damage
            • 6 years ago

            MODLIKE!

            • Chrispy_
            • 6 years ago

            I believe this ends with M-M-M-MONSTER BAN!

            • ClickClick5
            • 6 years ago

            No no wait! One more:

            BANISH HIM! *in the voice of Mortal Kombat*

            • Krogoth
            • 6 years ago

            to finish it off…

            (Damage grabs a Dustbuster and uses it blow away “dzoner”)

            DAMAGE WINS!

            BANTAITY!

            PERFECT!

            • ssidbroadcast
            • 6 years ago

            This is the best subthread on Techreport right now. 😀

          • superjawes
          • 6 years ago

          Used a bigger hammer?

      • lilbuddhaman
      • 6 years ago

      Aww I didn’t see what the ban offense was, was he the “I’m a cheater and I like it” guy?

        • Firestarter
        • 6 years ago

        that’s JohnC, and he’s still around afaik

        • Pwnstar
        • 6 years ago

        I’m pretty sure cheating in online games isn’t against the rules here, so JohnC wouldn’t be banned for that.

          • Fighterpilot
          • 6 years ago

          Yet, as avid online gamers,most TR regulars would consider online cheats as Internet scum….

      • Krogoth
      • 6 years ago

      Duke Nuked: “IT’S TIME TO KICK ASS AND BAN SOME SHILLS, AND I’M ALL OUT OF GUM…”

      • HisDivineOrder
      • 6 years ago

      Cue the dancing ewoks? 😀

    • willmore
    • 6 years ago

    AMD you must bundle this card with Titanfall!!!!

      • Jigar
      • 6 years ago

      Couldn’t agree more.

    • Airmantharp
    • 6 years ago

    Since I know you guys are working on 4k and probably CF companion articles, I’d like to request adding one more comparison using your data troves:

    FPS per dB!

    The question is, if you standardize on some level of ‘acceptable’ load noise by limiting fan speeds, how does that impact performance? Also, what happens when you limit the louder card to the noise ceiling of the quieter card, say, comparing to the Titan, which is probably most representative of the 780 Ti and will actually be positioned to compete with the R9 290X?

    If we’re going to put pressure on manufacturers to actually focus on ‘quiet performance’ in the way that Nvidia continues to do and AMD continues to flout, it seems only reasonable to show just how much margin there is between their various product lines.
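    As a rough sketch of how such a figure of merit might be computed once matched performance and noise numbers exist (the helper function and the choice to normalize against idle noise are assumptions for illustration, not TR’s methodology):

        # Hypothetical helper for the proposed "FPS per dB" comparison.
        def fps_per_db(avg_fps: float, load_db: float, idle_db: float) -> float:
            # Use noise above the idle floor so a quiet test room doesn't dominate the ratio.
            delta_db = max(load_db - idle_db, 0.1)
            return avg_fps / delta_db

        print(fps_per_db(60.0, 45.0, 32.0))  # hypothetical numbers, just to show the shape of the comparison

    Whether to divide by absolute dB, by dB over idle, or by something converted to sound power first is exactly the sort of methodology question the reviewers would have to settle.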

      • flip-mode
      • 6 years ago

      If noise evaluation during gaming is going to be meaningful then it has to be considered in the context of game audio. Other than that, the fan just needs to be quiet at idle and needs to run at idle speeds when it’s not gaming or otherwise heavily loaded, and it can’t change speeds too quickly or frequently when it is loaded to different degrees. I’ve had video cards in the past that would spin the fan up and down even for small changes in load, such as panning or zooming a view in Photoshop or CAD; there’s nothing more aggravating than a fan that varies its speed for every little change in load.

      I can’t remember which card I had that was like that…. maybe it was the HD 5870.

        • Airmantharp
        • 6 years ago

        I’ve got a laptop that does that- and its fans are whiny, too!

        And I agree that the measurement has to be standardized- and that that’s hard to make relevant to the varied majority of users.

        But consider this- on another forum, the ‘best’ entry-level gaming audio setup is considered to be the Sound-Blaster Z with Sennheiser HD558 open circum-aural headphones. I have the HD555s, the previous model, and I can tell you with certainty that the amount of noise my system produces at full load makes a difference.

        For those that don’t know, open headphones, and Sennheiser’s mid-range series in particular, are excellent for ‘imaging’, which means that they’re perfect for use in games that have effective positional audio. Of course, being open also means that you can hear everything going on around you- these aren’t Bose QuietComfort noise-cancelling types (which are horrific for positional audio). You can even have a conversation with them on, and forget that you’re wearing them!

        That’s just one reason to consider how much noise various parts produce. In addition to gaming and the content creation you mention, Flip, note that as more applications make better use of extended compute resources, how much noise GPUs make under load will become even more important; though we might switch the metric to ‘FLOPS per dB’ at that point instead :).

    • HallsMint
    • 6 years ago

    Is it just me, or does it make sense that if you’re spending this much on a single component, you probably also have the resources to get it set up in a liquid cooling loop, making the fact that the stock cooler isn’t beefy enough inconsequential?

      • Farting Bob
      • 6 years ago

      Liquid cooling, even for those who buy $500 GPUs, is still a minority market.

        • Great_Big_Abyss
        • 6 years ago

        Ya, a cooling loop with GPU block can easily add $500-$600 to a build. Plus, there are drawbacks. If you’re the kind of guy who changes video cards a few times a year for whatever reason, waterblocks usually aren’t swappable.

          • Deo Domuique
          • 6 years ago

          It’s not only the price of a liquid cooling solution, it’s also the fact that I don’t dare to try something I don’t know much about. In short, I hesitate even to think about water-cooling.

            • HallsMint
            • 6 years ago

            You’re making me feel reckless, considering I’ve thought about it!

          • Diplomacy42
          • 6 years ago

          I don’t think it would have to. If a company like AMD decided to make a reference water-cooled solution a la Corsair, they could probably do it for under $150 over the reference model (leaving plenty of profit to spread around), and it would likely sell like elephant ears.

          Considering the price disparity between the 780s and the 290X, I think it could be a missed opportunity, though obviously they will do well regardless.

            • Great_Big_Abyss
            • 6 years ago

            Kinda like the ASUS ARES II?

            I think it’s a great idea, and quite frankly, don’t know why an AIO GPU cooler hasn’t been done yet. If they can make a cooler with different mounting systems for Intel/AMD sockets, they can make a cooler with different mountings for Nvidia/AMD graphics cards.

            • HisDivineOrder
            • 6 years ago

            A missed opportunity?

            Look. I think you’re missing the forest for the trees here. The missed opportunity–the first of several missed OPPORTUNITIES no less–was when they didn’t make a great air cooler to go with their great GPU. Then they go to all this trouble to make a high end GPU with a potentially great audio DSP solution (that they then condemn to just the very top and bottom of their stack) and Mantle, yada yada, but they slap a crap cooler on it for all the initial reviews and for the poor saps they con into buying a GREAT Battlefield 4 bundle they paid for with obscene amounts of money. They set up the perfect serve. Threw it up high and true. Got their feet right. Jumped with perfect angle. Brought their hand back…

            …and hit the ball with their elbow.

            Them’s the missed opportunities.

            Them not making a water cooling AIO secondary option is just ANOTHER missed opportunity after all that.

            • Great_Big_Abyss
            • 6 years ago

            [quote<] Then they go to all this trouble to make a high end GPU with a potentially great audio DSP solution (that they then condemn to just the very top and bottom of their stack)[/quote<]
            Wrong. Bonaire has TrueAudio, and if I’m not mistaken, anyone who bought a 7790 will be able to enable it, too. The only reason it’s not in the mainstream cards is because the die lacks the necessary transistors.

            • Airmantharp
            • 6 years ago

            Spot on- expect TrueAudio to be in every new AMD graphics product going forward, possibly even their APUs. And you have to wonder if it isn’t in the console APUs in some form!

          • Pholostan
          • 6 years ago

          More like $250 I would say. A Cooler Master Glacer 240L* is about $130. GPU block about $100. Add some for a little tubing etc and you’re done.

          *Basically a Swiftech H220. Gotta love patent disputes.

      • clone
      • 6 years ago

      it does not make sense to me that anyone spending this much on a component should be required to spend more money on liquid cooling.

        • UnfriendlyFire
        • 6 years ago

        Nor does it make sense to use a budget cooler on a non-budget chip.

        I seriously doubt many manufacturers will use AMD’s reference design.

        • internetsandman
        • 6 years ago

        To be fair, people spend $600 on processors that need relatively beefy and expensive cooling solutions in order to perform their best

          • travbrad
          • 6 years ago

          It’s a lot easier to install an aftermarket heatsink on a CPU though. Installing a full water cooling loop in your PC is very time consuming and expensive in comparison. You can get a $30 heatsink that has great cooling performance too. A Hyper 212 Evo runs only 2-3C hotter than some of the best/most expensive heatsinks on the market.

          This is all somewhat of a moot point though, since I expect many of the actual cards will have better (non-reference) heatsinks on them to start with.

            • HallsMint
            • 6 years ago

            Isn’t that the fun of it all? Planning and building the loop. I feel as if it would make the computer experience more personal, like working on your own car.

            • Great_Big_Abyss
            • 6 years ago

            I see your point; I am one of those people in the midst of accumulating the materials to try my hand at my first custom loop.

            But there are people out there who just want a ‘plug and play’ experience. Just like there are too many people who don’t know the battery of a car from the exhaust manifold.

            • Airmantharp
            • 6 years ago

            It’s not just ‘plug-and-play’, though.

            There’s the maintenance time and reliability factor as well. That’s the reason I don’t have a custom water loop, and instead engineered my system for a positive airflow setup and used cards with very good blowers. Nearly as quiet and nearly as fast as a custom water loop.

            What I would like to see, though, is a 140mm bolt-on water-cooler for graphics cards, but I’m not sure if anyone would be willing to produce one given the limitations involved.

            • f0d
            • 6 years ago

            not much maintenance once you have built it (maybe clean out rads every so often if you have a dusty house like mine) and if you do a good job in the first place (like anything) there are no reliability concerns

            i have been building custom water cooled systems for around 15 years and apart from my first one when inexperienced i havent had any issues

            most of my systems i dont even touch until 2 or 3 years later when i decide to upgrade the watercooling parts

            i would bet its MUCH cooler than any air system – i run a 5.0ghz 1.5v 3930k 6core that barely reaches 70 degrees running linx (intelburntest/linpack in occt), at 4.6ghz its around 50 (temps dont constantly rise like most heatsinks) and this is in australia where its quite warm most of the time

            • Airmantharp
            • 6 years ago

            You’re spot on- if it’s done right the first time, you’ve got nothing to worry about.

            The ‘reliability’ question comes from the old engineering wives’ tale that insinuates that any increase in moving parts increases the number of things that can break :).

            That, and to do it right, you’re talking about going from thirty minutes to put an air-cooled system together to something like a week for a properly water-cooled system, assuming you have a life too!

            • travbrad
            • 6 years ago

            It depends on the person I guess. I like being able to build my own PCs and completely customize them choosing every single component, but the actual building/upgrading part really isn’t my idea of “fun”. For me building a PC is a means to an end, not the end in itself.

            The cost is still MUCH higher than a good air-cooled setup too, and really doesn’t offer a huge advantage in terms of heat or overclocking headroom. Unless you already have the highest end hardware possible (like a R9 290X) you’d probably get more performance by just buying a faster GPU/CPU in the first place. We really don’t know how much headroom the 290X would have with a water block either. Temperatures aren’t the only limiting factor in overclocking.

            • f0d
            • 6 years ago

            “I feel as if it would make the computer experience more personal, like working on your own car”
            it does 🙂

            i actually used to spend all my money on old cars and doing them up, eventually i just got a new car and starting on customizing my pc instead
            its much cleaner and easier to work on – also dont shred my knuckles as much 😛

            • HallsMint
            • 6 years ago

            Haha, yeah I’ve cut up my knuckles plenty under the hood of my pride and joy, too.

            I actually quite like the physical effort of working on a car, but I do see your point! Computers definitely don’t fight back as much. I’m glad someone else sees the value of it!

            • Airmantharp
            • 6 years ago

            What’s scary is that blood is full of iron, and very likely could be more electrically conductive- I think that’s why you’re supposed to turn the system off before working inside :).

            • HisDivineOrder
            • 6 years ago

            Yeah, it’s fun until that water cooling block you installed turns out to be slapped onto a defective card. You go to remove it and then ship the card back. Newegg/Amazon see that you removed the cooler, send it back to you, charge you for shipping both ways, and refuse the refund. Then you try it with the maker of the card, and they do the same.

            There’s you, $550 card, no warranty, defective product, but at least you got that water block on there. Yeah, fun.

            No. That’s why most of us don’t add water blocks to our video cards. We ain’t looking to destroy our warranties just to make up for AMD not putting a decent cooler on a $550+ card.

            I’d say the same if nVidia did it. I’d say the same of Intel did it.

            Here’s what AMD should do, though. Ship cards without a cooler at all. Put a giant sticker on it that says, “Please add after market cooler or card will burn up like a piece of toast.” Then they can have a hotline on the card in bright text so that when people call, having not installed a cooler, they can laugh at them.

            I’m sure that’d shave at least $2 (the estimated value of their cooler) off the overall price.

          • BlackDove
          • 6 years ago

          The highest end CPU from Intel, which is a 12 core Xeon, uses 130W. It doesn’t need watercooling.

          I like Asus, but the ARES II was a piece of junk. It’s basically a 500W paperweight, since CrossFire didn’t work on it, and it consumed A LOT more power than a Titan or 690.

      • JohnC
      • 6 years ago

      It’s not about “resources” or money, which are a total non-issue for some people. I’ve played with water cooling in the past, and I’ve also seen the type of damage leaking coolant can cause – the inconvenience of connecting/disconnecting/bleeding/cleaning all this shit and replacing your video card and motherboard (in case of a coolant leak) is not worth it.

        • HallsMint
        • 6 years ago

        I’ve seen some worst-case scenarios, too, but nothing that didn’t happen out of negligence/ignorance and the occasional honest mistake. That can be avoided by simply testing the loop with the computer off for a few days.

        • f0d
        • 6 years ago

        you test the watercooling system before you put in your hardware (usually for a few days) – if you did a good enough job in the first place you shouldnt have any issues

        the only time i have had any problems was with my first one when i was inexperienced about 15 years ago and i put in the hardware BEFORE i leak tested it for a few days

      • UnfriendlyFire
      • 6 years ago

      Your post reminds me of someone who used a stock Intel cooler on a $1000 i7-9xx years ago, with a 120mm intake fan and an 80mm exhaust fan on his desktop case.

      I cringed when I looked at his setup.

        • HallsMint
        • 6 years ago

        Some people just have a bit more money than sense, haha

        • Mr. Eco
        • 6 years ago

        I use stock Intel cooler on i5-3470 as I have got confidence in Intel design. And no case fans at all – the PSU fan removes the hot air.

        The very small case (made by me) houses also HD7850; temps to either CPU or GPU do not exceed 65-70 degree Celsius.
        [url<]https://www.dropbox.com/s/3m228sn82eltruh/P1010277.JPG[/url<]
        25cm x 15cm x 27cm

          • HallsMint
          • 6 years ago

          I really dig the DIY case, although I’d spend some time making the wiring a bit prettier 😛

          • HisDivineOrder
          • 6 years ago

          Props. I like that. I agree with the other guy. You should probably tidy up your wiring, though I’m not judging since my wiring is far worse. 😉

          You’re braver than I, not adding more fans, but I respect it. I salute you.

          • Airmantharp
          • 6 years ago

          I’ve gotta say, that’s neat- I’m always humbled by good minimalist engineering!

          (I wanted to say that it was ‘cool’, or maybe ‘hot’, but I didn’t want to mischaracterize your work :).)

      • Klimax
      • 6 years ago

      I don’t trust liquid cooling anywhere near my expensive components. Damn too high probability of accident.

        • f0d
        • 6 years ago

        only if you dont know what you are doing
        when i build a custom watercooled system they last for years without doing anything to them – if you know what you are doing in the first place there are zero issues with reliability

          • Klimax
          • 6 years ago

          Sorry, but no matter how well a thing is done, liquid is something I don’t want anywhere near those components. It’s simply too expensive an experiment for a not-really-significant increase (a Zalman CNPS12X is used), and it comes with increased risks. (A fan can fail, but it won’t take out any component.)

          But then I have Core i7-3930k and Geforce Titan. I am not going to risk any of these components.

          Simply my POV, I am not saying watercooling is bad, just pointing out why some of us don’t use it.

            • f0d
            • 6 years ago

            i still say its not a risk if you do it properly
            i have a 3930k also and sli 670’s and i never once considered it a “risk”
            imo there is a significant increase in both clock speed (i can easily do 5ghz 1.5v on my 3930k) and reduced temps

            • Klimax
            • 6 years ago

            I guess everybody has different calculation for risks.

            • Airmantharp
            • 6 years ago

            I’m thinking I should pay f0d to do my next build- if he/she lives close enough to pick it up, that is. I worked for UPS for eight years, I know better than to ship such a thing :).

            • f0d
            • 6 years ago

            i would if you lived in australia – i like new projects 🙂
            some things i do thats a bit “over the top” (because i like my systems to be safe for years)

            i use a torque wrench with settings just a little over hand tightness for the fittings – too tight and you could wreck the oring, too loose and you can have eventual leaks

            visual inspection of all orings on leak points – you would be surprised how many people dont even look to see if they got a dodgy oring

            i use both a ziptie and a hose clamp on the fittings just in case one fails

            when i test i put toilet paper around all the fittings and possible leak points as well as baby powder on final day – makes it easy to see the leaks

            i test the system running with a heat source (an old overclocked p4 motherboard and cpu i dont mind wrecking – they are about $10 on ebay) for about 5+ days (sometimes up to a week) and vary the temperature in the room as much as possible (heater on then aircon on)

            i use D5 laing pumps (doesnt matter which brand as long as its a D5) they are the most reliable pumps on the market imo and the only time i have ever heard of one failing is because the person building didnt keep water flowing through the pump (they are lubricated by the water flowing through them)

            test with 3 pumps in series to make sure it can take high pressure

            is it over the top testing it this way? it sure is as usually just running it for a few days with toilet paper around the joints is enough 99% of the time but i like to stress test the system as much as i can if im going to be keeping it for years

            i only use demineralized and deionized water with a silver killcoil in the loop – i dont trust water dyes and add-in stuff, the silver has done the job for me and i havent ever had any issues with it

            so there you go, thats how i test my water loops nowadays

            • Airmantharp
            • 6 years ago

            Not so sadly, I’m in Texas- which is a lot like Australia, just all the things trying to kill you are smaller 🙂

        • Fighterpilot
        • 6 years ago

        LOL…like your 17″ monitor?

      • jimbo75
      • 6 years ago
        • HallsMint
        • 6 years ago

        Not super surprising, and certainly nice to see. Hopefully TR will do at least a short review of some of the OEM cards and their cooling abilities.

      • odizzido
      • 6 years ago

      I’ve tried water cooling, and I found it to be more work than it’s worth.

      • moose17145
      • 6 years ago

      I laugh at all of you people’s silly water cooling loops! Just throw everything in an aquarium and be done with it!

      [url<]http://www.cadred.org/Images/Gallery/Originals/74ec989528caead2348e4b6773785df8.jpg[/url<]

        • moose17145
        • 6 years ago

        Seriously though, I do not understand everyone being so worried about leaks and stuff like that. If you are honestly THAT worried about a leak in a water loop shorting out your components, but still want many of the benefits of water cooling, put mineral oil in your loop instead. Mineral oil has a higher electrical resistance than air does, so there is no worry about it shorting anything out if it leaks. Sure, there might be a mess to clean up if it leaked… but it wouldn’t short-circuit anything.

    • BlondIndian
    • 6 years ago

    Hi, great review as usual, guys!

    I see that you are using an open testbed. That could sorta explain the few-percent difference in fps between TR and other sites like Anandtech.
    It would be lovely if you could add the average GPU speed during benchmarking for the various games. Something like this, maybe:
    [url<]http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/19[/url<]
    Hoping for a second article to fill in the missing blanks - CF, overclocking, 4K, etc.
    PS: Things are “heating” up at the GPU podium. I bet the 780 Ti is a higher-clocked version of the 780 running at a higher temperature target.

      • BehemothJackal
      • 6 years ago

      Nvidia 780 Ti (Temperature increase)? 😛

      • superjawes
      • 6 years ago

      From [url=https://techreport.com/news/25532/high-end-geforce-gtx-780-ti-coming-in-mid-november<]Cyril's News writeup:[/url<]
      [quote<]If you'll recall, the GTX Titan features a version of Nvidia's GK110 graphics processor with one of its 15 SMX units disabled. That means it has 2688 shader ALUs out of a possible 2880, and it filters 224 texels per clock out of a possible 240. All Nvidia has to do to make a faster product is to enable that last SMX. The company could conceivably raise clock speeds, as well, but it could forgo that step and still have a new, faster-than-ever top of the line.[/quote<]
      It might still be hotter and have a higher clock, but I think that offering a "fully capable" GK110 is their first move.
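      The per-SMX arithmetic behind those numbers works out cleanly; a quick sketch using only the totals quoted above:

          # GK110 totals from the quote: 15 SMX units, 2880 shader ALUs, 240 texels/clock
          total_smx = 15
          alus_per_smx = 2880 // total_smx       # 192 ALUs per SMX
          texels_per_smx = 240 // total_smx      # 16 texels filtered per clock per SMX
          enabled_smx = 14                       # Titan ships with one SMX disabled
          print(enabled_smx * alus_per_smx)      # 2688 shader ALUs
          print(enabled_smx * texels_per_smx)    # 224 texels per clock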

        • USAFTW
        • 6 years ago

        They could do that, and pay the price if the yields are bad enough.
        TITAN, which is supposed to be NV’s ultra high end, has bits disabled. It’s either for thermal constraints or for yields.
        BTW, I’m not really comfortable using a GPU (or any PC component) that has been deliberately castrated to serve market segmentation. Call me mad.

          • chuckula
          • 6 years ago

          [quote<]BTW, I'm not really comfortable using a GPU (or any PC component) that has been deliberately castrated to serve market segmentation. [/quote<]
          Oh really? Then why don't you tell me why you don't like:
          1. The HD 7970 (original edition before the GHz overclock)
          2. The HD 7950
          3. The HD 7870
          4. The HD 7850
          5. The HD 7770
          6. The HD 7750
          7. The HD 7730
          8. All of the rebadges of the above cards that AMD recently launched.
          9. Any FX-series chip other than the FX-9590
          10. Any Richland part lower than the 6800K.
          11. Any Kabini part below the 25 watt version.
          12. All the Temash parts since they are by definition cut-down Kabinis.

            • BlondIndian
            • 6 years ago

            Lopping off parts or die harvesting is one thing.
            Turning off features on good chips just for market segmentation (DP on the 780; Intel with TXT-NI, AES, etc.) is another. AMD enables as much as the card can do. No artificial DP perf cap.

            I think USAFTW’s point is valid (in spite of the corny name 🙂

            • Airmantharp
            • 6 years ago

            You haven’t heard of FirePros yet?

            • BlondIndian
            • 6 years ago

            The value of a FirePro or Quadro is in the professionally certified drivers.
            Charging more for certified drivers is a fair deal. What is bad is limiting features on a chip (like DP rate) just for product segmentation and not for yield/die-harvesting reasons.

            • Airmantharp
            • 6 years ago

            I guess it’s similar to AMD’s desktop CPUs- FirePros don’t command the market share of Quadros/Teslas, despite being largely as competitive as AMD is in the consumer graphics market, so they may be shipping ‘feature unlocked’ consumer GPUs minus the pro drivers to keep demand going like they ship their desktop CPUs with full-on ECC and virtualization support.

            • Deanjo
            • 6 years ago

            [quote<]What is bad is limiting features on a chip (like DP rate) just for product segmentation and not for yield/die-harvesting reasons.[/quote<]
            Is it really so different in other areas of electronics? You can get identical-hardware TVs, for example, but you pay premiums for them to activate “Smart TV” features in the firmware. Really, very few of the people that consumer-class cards target are going to use the DP-rate capabilities. There are also more differences than just the drivers on a pro card. Typically a pro card has to have robust error correction and be guaranteed to run 24/7 under full load. You are also paying for direct support from Nvidia/AMD instead of having to deal with your card manufacturer.

            • the
            • 6 years ago

            Except the error correction is also in the same silicon as the consumer cards: again, disabled for market segmentation. Ditto for a few other workstation-only features like 10-bit-per-channel color depth.

            Power circuitry could also be different on the workstation cards, but those board changes are usually seen when the port configuration is different and/or additional memory chips are needed on the PCB for higher capacity. The different ports and additional memory are among the few tangible things that can warrant a price premium.

            The main difference between a consumer card and a workstation card in terms of reliability is that the workstation variant may come with lower GPU/memory clocks. Even then, the lower clocks may not be explicitly about reliability but rather a side effect: as memory capacity is generally higher, the memory bus may require a lower memory clock, and the lower GPU clocks may stem from a need to hit a lower power consumption, typically under 225 W, whereas consumer cards have no issue going all the way to 300 W.

            Depending on who/where you pick up a workstation card, you’ll still have to work through them to get support. Rebranded FirePro’s from Barco come to mind (I think those cards may have special firmware to drive their 6 MP monitors.)

          • the
          • 6 years ago

          If yields are bad on fully functional GK110’s, there may not be any market segmentation going on at the high end. With a 550 mm^2 area, I can’t imagine fully functional GK110 yields are good. I think only the $5000 Quadro K6000 contains a fully functional GK110 and that is a very low volume SKU.

          The GTX 480 was in a similar situation when it launched years ago due to yields on the GF100. It took a bit of redesign and the GF110 was able to be manufactured with acceptable yields to release a fully functional chip to consumers under the GTX580 name.

        • BlondIndian
        • 6 years ago

        A fully capable GK110 is a long shot because of the huge die size. Yields would be very, very low; yield falls off rapidly as die area grows (faster than linearly, IIRC).

        Increasing shader resources would also require more memory bandwidth. Unless Nvidia raises the 6 GHz limit (on stock cards), I don’t see that happening.

        Increasing the number of shaders on the card will cause a higher TDP at the same clocks. Process improvements favor AMD more, since the 7970 was released when 28 nm was introduced; the GK110 came way later, and the 780 is recent.
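        To illustrate why big dies hurt, here is a sketch using the common first-order Poisson yield model; the defect density is an assumed, illustrative value, not a published figure, and the die areas are roughly the ones discussed in the article:

            import math

            def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
                # Y = exp(-A * D), with the area converted from mm^2 to cm^2
                return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

            D0 = 0.25  # assumed defect density in defects/cm^2, for illustration only
            for name, area in [("Hawaii", 438), ("GK110", 551)]:
                print(name, round(poisson_yield(area, D0), 2))   # ~0.33 vs ~0.25

        Even under this toy model the fraction of fully functional dies falls off noticeably as area grows, and the effect compounds with the fact that fewer large dies fit on a wafer in the first place.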

          • Airmantharp
          • 6 years ago

          They’ve been making GK110 for a very long time, so while what you’re saying is right, you’re missing how much stock of fully functional dies Nvidia might already have. Further, the longer something has been in production, the better the yields get, which also supports the idea that Nvidia might release a card with a fully functional GK110 in response.

          Higher TDP will only be an issue if they ignore the cooler; but that’s something AMD does. Nvidia has made incredible strides here.

      • Bensam123
      • 6 years ago

      It’d actually be nice if these were tested in a case. I think that’s actually the best way to do this. TR needs to lock down a baseline case and use it for all of their reviews. Something nice and mainstream that isn’t too far outside the norm.

        • Airmantharp
        • 6 years ago

        They’d need two- one enclosure set up for open-air coolers, the other for blowers :).

          • BlondIndian
          • 6 years ago

          No, they’d just have to use a stock enclosure, ideally one they recommend in their builds.
          Most guys (~95%) never modify their cases. The <5% niche are using custom solutions anyway.

            • Airmantharp
            • 6 years ago

            The enclosure stops being ‘stock’ when you start putting parts in it :).

            If you want to bring an anecdotal ‘most guys’ argument into the discussion, note that ‘most guys’ are pretty stupid. That’s why we have a forum here, for those smart enough to know that they need more information. No ‘guide’ will ever be enough.

            Here’s an example of why you’d need two separate enclosures (or more!):
            If you put a card with an open-air cooler on a bench, it performs great, better than if you put it in a ‘stock’ enclosure
            If you put a card with a blower-style cooler on a bench, it performs *worse* than if you put it in the same ‘stock’ enclosure

            However,
            If you put a card with an open-air cooler in an enclosure with good intake and exhaust airflow moving through the expansion slot section, it performs better than on a bench, and far better than in a stock enclosure
            If you put a card with a blower-style cooler in an enclosure with a positive pressure configuration, then it performs better than in a stock enclosure, and FAR better than on a bench

            There are plenty of examples of decent ‘open air’ enclosures, though they usually need more fans and fan speed tuning, but positive pressure examples are more rare- best example is Silverstone’s Fortress FT-02, and it’s definitely worth a look.

            • BlondIndian
            • 6 years ago

            Putting a standard build (CPU, MB, PSU, GPU, HDD, etc.) into a stock case will still be ‘stock’.

            The major difference between blowers and open-air coolers is that the latter do badly in poorly ventilated cases, while blowers are not affected much by the case. Both draw air from within the case; blowers just vent more of it outside. This would matter in CF/SLI or in cases without enough ventilation.

            The “most guys” argument is still valid because the review has to be valid in everyday situations for most people. Having two setups produces very little gain for a lot of extra work. TR already does comprehensive reviews; I doubt they’d have the time or resources to do two different setups for every review.

            • Airmantharp
            • 6 years ago

            You can try to make a baseline for both types of cards, but you’d be favoring one over the other, depending on what ‘standard’ case you choose. I mean, we could get all of this stuff in a $30 Rosewill, and it’ll work; and both types of coolers would be hampered. The open-air cooler for lack of air circulation, and the blower for lack of air intake.

            In reality, if you’re going to choose a GPU, you have to be aware of the effects that different cooling arrangements have on different types of GPU coolers. It’s a system; if you don’t treat the building process as a ‘combined’ solution, you’re asking for trouble.

            • Bensam123
            • 6 years ago

            Yup… Like a Corsair 650D or something like that. They could even take a ‘budget case’, in like the $50 range most people would actually have. Something with two 120mm fans on it…

    • mcnabney
    • 6 years ago

    Nice card, but I think I will be happy with the 7970 I just bought for $250 with game bundle. $500+ is just too much for a GPU.

    • internetsandman
    • 6 years ago

    This is weird. OC3D showed the 290X to be equal to or below the 780, but here it’s handily beating the green team in nearly every test. What’s going on?

      • Airmantharp
      • 6 years ago

      Everyone tests a little different- but don’t worry, that’s to your benefit!

      • Deanjo
      • 6 years ago

      There is always a bit of variation. Guru3D, for example, uses newer Nvidia drivers and shows better results on the Nvidia cards. Unless everyone uses the exact same setup, you are likely to see some sites differing slightly from others.

      • BlondIndian
      • 6 years ago

      Open air testbed affecting available turbo(powertune) maybe ??

      • Modivated1
      • 6 years ago

      Surprise, Surprise! 🙂 the team with Christmas colors has dropped the BOMB on the green team!

      Never trust rumors because they always disappoint.

        • Deanjo
        • 6 years ago

          Not sure how you think they dropped a “BOMB”. The price/performance difference isn’t great enough for fans of one side to switch to the other. AMD fans will get a nice upgrade to an AMD card, and Nvidia fans will more than likely stick to their brand because of brand loyalty and the performance delta being minimal at best.

          • Modivated1
          • 6 years ago

          Some reviews say that the R9 290x has the advantage over Titan, others say that it has parity to the Titan. All reviews clearly indicate a decisive but not massive lead on the current 780.

          My question here is concerning Titan: How much of a price difference do you need to buy the better card regardless of brand loyalty? 3x? because we are nearly 2x now and the rival card is arguably better and definitely at least even.

          I can see people going with Nvidia in spite of a $100 difference (I wouldn’t do it but I can see others) so the 780 will have it’s fans. It won’t however outsell the r9 290x in the market. To make things worse if the r9 290 regular card has even a slight advantage over the current 780 and is say $100 cheaper than the r9 290x than that will shame Nvidia’s current price/performance position in the market and they will be forced to drop their price dramatically.

          So while you might feel that the performance difference needs to be more dramatic to be considered a “BOMB” dropping, consider the financial impact and the consumer perspective change in the market. Then maybe, if you can see past being a fan, you’ll see where the “BOMB” has hit Nvidia.

            • Airmantharp
            • 6 years ago

            Define the better card- both what qualities make it better, and what tasks it should be better at. Then we can compare prices.

            For a single card, assuming that aftermarket coolers are available promptly, I’d define the R9 290X as the ‘better’ card for $550.

            Prices change that- configurations change that too. Want G-Sync? I do. I wish AMD had it. Want multi-GPU? Wish AMD had a WHQL driver that’s been tested by the community to fix their latency issues. Want a quiet blower-style cooler? AMD cards need not apply.

            Personally, I’m looking to run a 4k monitor in the next year or so- and while AMD is definitely stepping up their game, they’re not ‘there’ yet. Granted, I’m not buying Nvidia right now either- I’m not buying anything.

            But what’s ‘better’ depends on a great number of factors. The raw performance must be there, sure, but it’s not everything.

            • Modivated1
            • 6 years ago

            You should read more reviews. Latency/Crossfire and 4K issues have been largely resolved; in some cases the latency has outperformed Nvidia’s, so they are definitely competitive. That leaves G-Sync, which is definitely an Nvidia advantage that will be available on future GPUs if I understand it correctly.

            At the end of the day most people still cannot afford to purchase the super uber, uber system (like buying a 4K monitor for $$$$ and then buying dual cards with a 1200 watt PSU), so they will splurge a little and get the best bang for their buck. A card that costs $550 yet matches or beats a $1000 card sings to the majority of buyers. Sure, there are those that will still choose to go the other way for the advantages it brings, but they will be the minority unless the price changes dramatically.

            How likely is that when the actual manufacturing cost of the chip is greater for Nvidia than for AMD? If they do drop the price and accept a great loss in profit, then AMD may respond in turn by dropping their prices further, which may cause Nvidia to consider selling the chip at cost (that won’t happen, and it wouldn’t be a victory for Nvidia if they had to do that to outsell AMD). The two most tangible measurements of success we have for cards today are frames per second and frame pacing. By that measure I declare that AMD (at least currently) has the better card. All the other stuff is user-specific preference.

            You like G-Sync, I like Mantle. I am not going to buy a $$$$ monitor for 4K, and the latency issues for the Crossfire setup look nearly the same as Nvidia’s, and it is definitely worth paying $450 less per card than the $1000 Titan (nice name though).

            • Airmantharp
            • 6 years ago

            I read too many reviews.

            And I don’t trust many of them- but I’ve been reading reviews for a very long time. Skepticism is warranted.

            What I see is that AMD has made a fine GPU, slapped a horrific cooler on it, and still hasn’t *proven* that they’ve fixed all of their driver issues.

            Getting better? Sure, and that’s been documented. Tested WHQL driver? Nowhere to be found.

            So yeah, I’m still waiting.

            But why be so skeptical? BECAUSE I’VE RUN CROSSFIRE RECENTLY.

            • Modivated1
            • 6 years ago

            Is that with the new R9 290X Crossfire cards that use only the PCIe slots for Crossfire with no additional bridge connector? If not (which I highly doubt, since they were only released today), then I suggest you buy a pair and test them yourself if you cannot trust how different review sites harmonize over the verdict this card has produced across the majority of review sites out there.

            If you have read this review then you would notice that the latency issues have been resolved, and in most cases they are lower than Nvidia’s. Other sites show that 4K and Crossfire issues have also been resolved.

            What else do you have to be skeptical about? Less compute than Titan? OK, if you are buying it for compute. For games this is the better buy on price/performance. G-Sync? TR has stated that it is a must, and therefore a given that AMD will develop their own version, so no reason to switch there. Heat and noise, that’s a good one, but it won’t be once the aftermarket solutions hit the shelves; that’s what I am waiting for.

            If you can’t trust any review then you will have to just buy every card that hits the shelves and test for yourself. Who knows, if you can write an objective review then maybe you will be the next trusted site. Good luck to you, but I will cross-reference different reviews to find out what the common truth is and then make my choice.

            • Airmantharp
            • 6 years ago

            Were you born yesterday?

            I’ve been reading reviews for longer than any of the current review houses have been around. I remember when they became popular. Trust is earned, not given; further, AMD is the breaching party in this instance.

            They broke the community’s trust with their incompetence. They’re going to have to work to get it back. Shipping a brand-spanking-new GPU that’s billed to save the world without WHQL drivers that address the issues that they’ve had for years does not inspire more trust. It inspires less.

            • Modivated1
            • 6 years ago

            I agree trust is earned, not given, and however long you have been around is not the issue; many of these sites have been in the industry for quite a long time. If they haven’t earned your trust by now then no one can, and absolutely nothing anyone says will persuade you.

            I know you are not about to say that TR is swinging on the ball sack of AMD? As much as they have ridden AMD over bad frame pacing, no one with objective sense will agree with you there. No one said you had to trust AMD; trust the numbers behind the product.

            You are running out of legitimate arguments to defend your position. So far ….

            1. G-Sync is a product of Nvidia which I prefer over AMD advantages.

            Counter: G-Sync is not a developed product that ships with current GPUs, so there is no advantage to buying Nvidia there. Also it will require buying a new monitor of 60Hz or better.

            2. AMD’s latency sucks!

            Counter: TR and other sites show lower latency from the R9 290X in most games than the Titan or the 780.

            3. AMD’s GPUs won’t be able to handle 4K gaming.

            Counter: Other sites have reviewed Crossfire and 4K gaming and showed that the problems have been fixed and 4K is alive and well. Not that it matters for 99% of buyers, because they are not going to buy a 4K 30″ set for $3500 or more (probably not available at 60Hz or greater yet either) and then turn around and buy 2 Titans ($2000) or 2 780s ($1300). So 4K gaming is about 4 years away.

            So Ladies and Gentlemen the New Reigning Single GPU Champion of the World IS ….. The R9-290X!!

            I am not saying that the tides won’t turn when the 780 Ti gets here, but give credit where credit is due and stop hatin’.

            • Deanjo
            • 6 years ago

            [quote<]My question here is concerning Titan: How much of a price difference do you need to buy the better card regardless of brand loyalty? 3x? because we are nearly 2x now and the rival card is arguably better and definitely at least even.[/quote<] People that are buying the Titan over the 780 generally are not doing it for the gaming performance (unless they just like to blow money). A huge chunk of Titan owners are purchasing it for the computing prowess, and that is where the premium starts justifying itself. I own Titans; I bought them even with the price premium for CUDA development. Those cards have paid for themselves many, many times over. If I were to purchase a 780 for that, I would have to deal with the crippled DP performance, which makes it hard to optimize for full-blown Tesla setups that are utilizing DP capabilities. If I went the 290X route, I would save a few bucks on the hardware but I would also lose that source of revenue from not being able to use it for CUDA development (I know it does OpenCL, however the use of OpenCL in big-iron GPGPU computing is very small when compared to the CUDA deployments).

            • Modivated1
            • 6 years ago

            Ok, I see your point and it makes perfect sense. However, have you considered stacking 4 290Xs? I hear that you can only stack 3 Nvidia cards. I do not know the performance outcome, so this is not a suggestion but a question as to whether you would get more performance from 4-way Crossfire ($2200); I know that the price would be cheaper than 3 Titans ($3000).

            Something to consider anyway.

            • Deanjo
            • 6 years ago

            [url<]http://www.youtube.com/watch?v=PVOL7Fbd6go[/url<] [quote<] I hear that you can only stack 3 Nvidia cards [/quote<] You heard wrong.

            • Airmantharp
            • 6 years ago

            You’ve heard? So you don’t actually know?

            You should look up Vega.

        • superjawes
        • 6 years ago

        Isn’t green a Christmas color, too?

          • Modivated1
          • 6 years ago

          Yes it is. In fact, that is why I said what I said: AMD is green with their CPU line and red with their GPU line, thus the holiday spirit. Nvidia has more of a St. Patrick’s Day thing going on.

            • Airmantharp
            • 6 years ago

            I don’t like Nvidia more… but I do like St. Patrick’s day more :).

      • WaltC
      • 6 years ago

      [H], G3d, and TR all pretty much agree right down the line. Also, I don’t know how much it happens these days, but just a few years ago certain sites (nameless only because I don’t remember them) used to post faux reviews in which they’d often lift whole sections of text from a legitimate review and paste them into their own copy, lift the images, etc. In those reviews it wasn’t uncommon to see made-up frame rates based on the “reviewer’s” guesstimates. Today, on some smaller sites trying to get larger, it isn’t unusual to see blatant bias affecting the results they post, either. It’s very weird and fortunately not at all common anymore–at least not in the sites that I read on a daily basis (where it never happens).

      Assuming everything is legitimate, though, as someone else pointed out, the testing methods are different–which is to say possibly weird in some way not fully disclosed to the reader, if at all. About the most egregious example of vendor bias reflected in a review came years ago from sites like Tom’s Hardware, Sharky Extreme and AnandTech (except for AnandTech, all are under new ownership today, I believe.) Dr. Pabst, blue-ribbon writing at TH, reviewed TNT2 reference prototypes direct from nV clocked @ 175MHz, and benched them against stock-clocked 3dfx cards just for the purpose of raining on 3dfx’s parade. Things might’ve been fine except that when the TNT2 shipped, finally, it was factory clocked @ 150MHz–and nary a corrective word was uttered by Dr. blue ribbon (in those days 25MHz was a big deal.) Then AnandTech grabbed screenshots from the at-the-time brand-new 3dfx V3 and posted them, and they appeared gosh-awful looking–far worse IQ than the V2 produced. In the review he devoted one sentence to saying, “The screenshots I took looked nothing like the image quality I saw on the screen,” but he ran the incorrectly grabbed screenshots anyway, and when he discovered that current screen-grab software wouldn’t work with the V3 (you had to have slightly altered screen-grab software, which 3dfx provided for anyone who wanted it), he never bothered with a correction.

      Then there were the embarrassing episodes in which SE and AT talked up “AGP texturing” as if it was the Next Big Thing and, again, jumped all over 3dfx for telling the truth about it: “Our products don’t use AGP texturing because our local ram bandwidth is 20x faster than the AGP bus.” (As an aside, to illustrate the duplicity of those days, nV didn’t use AGP texturing, either, but the company claimed to use it!) I’m guessing that AT and SE didn’t understand the whole bus thing and just found it easier to mouth the words put into their minds by some mindless Intel/nV marketing employees at the time. Even today the GPU onboard RAM bus is 10x-20x faster than PCIe x16, which is why all of the performing cards have GBs of their own on-board memory, etc. Far as I know–AT never corrected itself. 3dfx mismanaged itself out of business, that’s for sure, with the STB factory acquisition in Mexico, but all of the nutball criticism the company had to endure during that time surely didn’t help anything.

      The point I’m trying to emphasize is never automatically believe the word of any single source on the Internet, but rather believe it when several [i<]credible[/i<] sources arrive at similar basic conclusions--you'll hardly go wrong (except in the unlikely event you get victimized by a Round-robin: bad info gets picked up and regurgitated by several sites, etc. But fortunately, that's uncommon these days, as well!) Be very picky about the hardware sites you read regularly, but always, always read more than one.

      • JohnC
      • 6 years ago

      Eh… Most of the gaming benchmarks are flawed in some ways on every review site, be it either because of different hardware configuration, different driver version, different in-game settings or different in-game areas used for testing. Some sites (I am not pointing at any) might also do “adjustments” to their results based on financial incentive provided by some hardware manufacturer 😉

      The only reliable way to determine performance difference is to buy the cards yourself and do your own tests, with your particular software/hardware configuration. Especially when it comes to multiplayer game tests.

        • dzoner
        • 6 years ago

        There are reliable review sites, TR being one. They suffice for all but the very deep pocketed and driven.

      • dzoner
      • 6 years ago

      The Mantle advantage will last for years.

        • Modivated1
        • 6 years ago

        If it takes off, it has my vote; I want to see it shine.

        • Klimax
        • 6 years ago

        Won’t take off, or we’ll see a horrific crash. Take your pick.

      • HisDivineOrder
      • 6 years ago

      Well, I think when you have a card that is going to run as hard as it can until it hits 95 degrees, then run the fans as hard as they can to lower the temp, and once it runs those as hard as it can, it’s going to drop performance… you’re going to wind up with different cards with varying limits in different testbeds in different cases in locations with different temperatures…

      …and you’re going to see some vastly different results when comparing the extreme edges.

      That’s a problem with pushing a card to the ragged edge as a matter of standard procedure. You don’t know if every card is going to be capable of running that way in every system, because of all the differences a test bed system will have from a conventional system in a case in a home with no direct A/C airflow to the system.
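
      As a rough illustration of why the testbed matters so much with a design like this, here is a toy sketch of a temperature-capped boost loop. To be clear, this is not AMD’s actual PowerTune algorithm; the cap, fan limit, clock floor, and scaling factors are all made-up numbers chosen only to show the behavior:

      # Toy model of a temperature-capped boost loop (illustrative only; not PowerTune).
      # All thresholds and scaling factors below are assumptions, not measured values.
      def sustained_clock(ambient_c, seconds=1800):
          temp, clock, fan = ambient_c + 30.0, 1000, 0.20   # assumed starting state
          for _ in range(seconds):
              heat = clock / 1000.0                         # more clock -> more heat (toy scaling)
              cooling = fan * (temp - ambient_c) / 42.0     # warmer intake air removes less heat
              temp += heat - cooling
              if temp >= 95.0:                              # at the cap: spin the fan up first...
                  fan = min(fan + 0.01, 0.55)               # ...up to an assumed "quiet mode" fan limit
                  if fan >= 0.55:
                      clock = max(clock - 5, 727)           # ...then shed clocks toward an assumed floor
              elif clock < 1000:
                  clock = min(clock + 5, 1000)              # headroom available: boost back up
          return clock

      for ambient in (22, 28):
          print(ambient, "C ambient ->", sustained_clock(ambient), "MHz sustained")

      Even in this crude model, a few degrees of difference in ambient or case temperature shifts where the clock finally settles, which is exactly why open testbeds, cramped cases, and warm rooms produce different numbers from the same card.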

      That said, it’s also hard to judge the card line with these reference coolers since I think the only people that are going to be buying a reference cooler-based card are the people who just HAVE to have Battlefield 4 with their R9 290X. I think a lot of people are waiting for later to see non-reference cards and what nVidia does in the meantime.

      I think BF4 may “convert” more than would have normally bought in early, but not as many as if AMD had provided a quality reference cooler.

      • jihadjoe
      • 6 years ago

      Ambient temps maybe? Or how long they pre-heat the card before considering benchmark results.

      • Arclight
      • 6 years ago

      It’s possible they ran into thermal issues; either way we should wait until adequately cooled SKUs are released by board partners. Also it was obvious that in some tests the OCed GTX 780 would take the lead, as it can outperform the stock GTX Titan as well.

      The story would be quite different with a properly cooled and also factory OCed R9 290x imo.

    • NarwhaleAu
    • 6 years ago

    Way to go AMD! I can now get better than Titan performance for half the price. Now I just need to decide if I should upgrade my 6970. This is going to force a response from Nvidia – if you have $1000 in your pocket you would be a fool to buy a Titan when you could buy two of these (taking AMD at face value that PCI Express-based Crossfire performs as claimed). It’s a win-win for all of us, regardless of whether you like the red or green team (or are truly ambivalent).

    This is going to be an awesome card when it hits $400 (my pain threshold for new graphics cards).

    • dzoner
    • 6 years ago

    The pricing hurt Nvidia is feeling now will become excruciating if BF4 Mantle comes with a 20-40% AMD performance premium and AMD includes a bundle with a wider selection of more desirable games.

      • Airmantharp
      • 6 years ago

      Nvidia doesn’t feel the pricing hurt- they feel their thick wallet while AMD has been dallying. They have quite a bit more levity in their current offerings than AMD, and a quick adjustment of MSRP would turn the situation around in a hurry.

        • dzoner
        • 6 years ago

        levity = leeway?

        Nvidia feels the pricing hurt. Developing its Tegra lines hasn’t been cheap and the returns have yet to materialize.

          • Airmantharp
          • 6 years ago

          Synonyms, I believe. Don’t quote me on that :).

            • superjawes
            • 6 years ago

            [quote<]Synonyms, I believe. Don't quote me on that :).[/quote<] Quoted! Because I'm a rebel 🙂

            • astrotech66
            • 6 years ago

            Definitely not synonyms.

          • Benetanegia
          • 6 years ago

          They have deep pockets and over the last couple years they have been spending a lot on many open fronts (not just Tegra SoC): their own CPU, their own console, their own tablet, their own phone, G-sync. That’s a lot of money spent that you’ve seen reflected in their financials, and it’s a lot of R&D when you are designing your first device, and entering a market you have no expertise in. But it will be much lower for future designs, starting this coming quarter.

          Nvidia can perfectly well do battle in the inevitable price war, if they choose to.

          – Financially they are more than covered. $3+ billion in the bank. They can afford to take $100+ million losses (like AMD’s been doing) for 30 consecutive quarters (yeah, 7+ years) and still not break a sweat…

          – GPU they do have. The revised GK110 GPU used on the K6000 (GK180/GK110b or whatever it is actually called), which is more efficient than the GK110 on Titan/780, fully enabled, plus the extra clocks possible thanks to the 50W and 20ºC of headroom they have to play with, if they are OK matching the 290X on those fronts, will undoubtedly run circles around the 290X. Without a doubt. It doesn’t even stand a chance. The only question is whether Nvidia will pair that superiority with a much higher price, then lower the 780. Or they will just slot it in based on matching perf/$ (likely scenario). Or if they will undercut AMD, not very likely tbh, but GTX680… margins vs volume etc. Don’t forget the GTX 260/275, GTX 470, 570, 560 Ti… cards of similar die size that sold for well below the $300 mark over the years.

            • Airmantharp
            • 6 years ago

            Right, on all points- good job!

            I haven’t even gotten to mentioning how cards with the same size GPU as the 780/Titan went for nearly $200 some years ago- and what a bargain they were :).

            • clone
            • 6 years ago

            Whenever AMD wins, the first response is “Nvidia is holding back”, the 2nd “they’ve got something in the wings and AMD will be crap soon enough”, and finally the 3rd “AMD’s drivers suck”.

            There are so many double standards applied when it comes to comparing AMD and Nvidia products that I’m not sure they’d fit in a post. Why is it always this way, why so much vitriol?

            Every so often I go back to Nvidia to see the “massive difference”, only to not see it…. yet again.

            if and when Nvidia comes up with a new card it will not redefine the market, Nvidia I suspect is working very hard on trying to grab 2% more than the bar AMD has set and once they reach it they’ll push prices right back up as high as they can.

            and then all those ppl who just love to poop on AMD can pay more and feel better about it.

            that said while I like R290X I’ll wait for a quieter one before I look to buy.

            • Airmantharp
            • 6 years ago

            1. For this generation, i.e. 28nm, Nvidia IS holding back
            2. They almost always have something in the wings- because they are almost always holding something back
            3. AMD’s most recent driver problems are undeniable- and they aren’t fixed yet

            Here’s a double standard for you- if they’d made the blower on the R9 290X as quiet in world-beating Uber mode under load as the Titan, there’d be no contest. But they didn’t. They didn’t even really appear to try.

            There’s one standard here, and it applies equally.

            • clone
            • 6 years ago

            28nm is holding both back, both have driver issues, and AMD’s in particular have been significantly reduced if PC Perspective and a plethora of other websites are to be believed, especially given the complaints are now focused on 4K resolution when using Crossfire configurations…. the 0.1% (Crossfire) of the 2% (high end) of the 5% (overall gaming presence) PC gaming market.

            It’s $400 less than Titan, and it’s not a double standard to mention that a quieter cooler can be bought with the money saved. (That’s not to say I don’t believe they could have charged more and installed a better one, but they didn’t, and the end user can with the money saved.)

            When AMD loses, people say it’s expected; when Nvidia loses, they deny it and say there is something coming.

            AMD won with the HD 4xxx using value as the tool, they won with the HD 5xxx by a full year, they tied with the HD 6xxx, they beat Nvidia with the 7xxx by a full quarter, and here they are beating Nvidia with the R9 290X in value and performance.

            The response… with HD 4: just shy of accusing them of cheating. The response with HD 5: “Nvidia has something coming”; in the 2nd quarter: “don’t you worry, Nvidia has something coming”; 3rd quarter: Nvidia holds up a video card held together by wood screws, calls it Fermi, and says “just wait, we have something coming”; 4th quarter: it sort of arrives in limited quantity. HD 6xxx was a tie; HD 7xxx: “just wait, Nvidia has something coming.”

            And here we are with the R9 290X beating Nvidia… getting an elite product ranking from TR and every other tech website, and the response to this… yet again.

            “Nvidia has something coming”

            • Benetanegia
            • 6 years ago

            [quote<]"Nvidia is holding back"[/quote<] Yeah pretty nice attempt at generalising and making it sound like an usual excuse. But no one here said nor implied that's what happens here. What it does happen is something very well known, that is happening NOW. That Nvidia has had a 15 SMX chip for months now, but only used 2688 on consumer cards. Were they holding back? I wouldn't say so, in the sense that they probably didn't do it on purpose, but rather as a fuction of yields, TDP and desired margins. But that does not preclude them from changing that function at any time and release the full featured chip. The fact that AMD gave them an extra 50w to play with only increases their options. [quote<]"they've got something in the wings and AMD will be crap soon enough"[/quote<] Are you denying the existence of the already announced 780 Ti? Because that's how it sounds like. I'm trying to find where Airmantharp or myself said "AMD will be crap soon enough" and couldn't find it. I said it will be substantially faster if NV chooses to, but I don't see where that sounds even remotedly far fetched considering that AMD had to increase the TDP by 25% in order to match Titan and further invent an Uber mode that makes it sound like a jet engine, just so it could beat it slightly on reviews. No sane person will use the uber mode while gaming.

            • clone
            • 6 years ago

            thank you for proving my point Benetanegia.

            • Great_Big_Abyss
            • 6 years ago

            I would definitely use uber-mode while gaming. I game with a high end headset (as I’m sure many sane people do) so honestly the noise from my rig under gaming load doesn’t bother me.

            • Benetanegia
            • 6 years ago

            I play with headphones too, and I have a card with a Twin Frozr cooler, which is a couple of orders of magnitude more silent, and I still hear it while gaming. Not enough to bother me, but I can still hear it, and I know what dB levels would bother me; 40+ definitely does, let alone 50. But whether you can hear it, or whether it bothers you with some headphones (closed ones, I guess) on, is beside the point anyway; a card should not make so much noise when there’s a means to fix that problem so easily (AIBs do it). Period.

            • Airmantharp
            • 6 years ago

            Me too!

            Except mine are open, for better positioning- which means I hear everything, both in the game and around me. It also means that my ears don’t sweat over time- trade-offs, you know.

            • clone
            • 6 years ago

            The sweating ears during action-packed gameplay are the worst. I switched from an expensive leather headset years ago to the cheapest foam I could find when doing multiplayer and have never looked back.

            I’m totally intolerant of a loud machine, and universally improve the cooling of the stock assembly or replace it / modify it as required.

            Just pulled the fan assembly off my GTS and attached a low-RPM 120mm in its place.

            • Airmantharp
            • 6 years ago

            I’m using a Sennheiser HD555- there’s a number of sets like it from them, but they have large cloth cups (replaceable), an open design that keeps sweating down significantly, and are quite light. Oh, and they sound amazing :).

            • Deo Domuique
            • 6 years ago

            This time their vitriol is truly justified… Nvidia stuck in their …. a nice $1000 treat while AMD came in at half the price. You get what I mean; wouldn’t you feel frustrated, betrayed, if you had gotten a Titan a while ago?

            They now writhe like fish out of water.

            • Airmantharp
            • 6 years ago

            If you bought a Titan, you either:

            A) Knew what you were getting into- knew that the price didn’t nearly justify the performance for gaming, and knew that said gaming performance would be quickly eclipsed by lower priced products, and/or you knew you were getting a hell of a deal for compute, or,

            B) You’re an idiot, and you should probably feel bad.

            I don’t want to speculate what percentage of Titan buyers fall into which group. I’ll say that I fall into C, not listed above, of people that didn’t buy a Titan :D.

            • Klimax
            • 6 years ago

            I have a Titan and no regret nor remorse. Especially since a simple fan setting can have a surprising effect.
            (Not to mention I’ve had it for the better part of a year, with support in some nice professional programs.)

            • clone
            • 6 years ago

            Buying a Titan for reasons not mentioned doesn’t make anyone an idiot any more than someone buying a car for 80k when one equal to it can be had for 40k.

            Some people like to own the best, and if they are making the kind of coin that leaves the Titan’s price as a curiosity, then it’s most likely not a concern.

            • Airmantharp
            • 6 years ago

            Yeah, I’m leaving out the ‘luxury’ purchases by implying that they belong in (b) :).

            (smart rich people don’t spend money if they don’t have to…)

      • Modivated1
      • 6 years ago

      A lot of controversy over the Mantle project. It is promising great potential for the PC, and if it is adopted well then the whole market (including Nvidia) will get on board. It’s not an AMD exclusive; anyone can pick up on it.

      I have heard the argument that it’s designed to take advantage of AMD hardware, that may be the case but if it does the job well enough and programmers take to it then Nvidia will make the needed changes to utilize this open source offering.

      The only thing I think should stand in the way of this product is whether or not it delivers as well as it claims to. The door is open for this to be used by anyone who wants to, and to have the industry advance significantly by accepting a more effective standard is what we have needed for a long time. Will it be more work for some hardware manufacturers to adopt this standard? Sure, nothing comes without its price, AND it’s what you get for being last to the table. AMD knows this lesson all too well, so the industry should accept Mantle and let AMD teach.

        • Airmantharp
        • 6 years ago

        Actually, the whole industry knows the lesson all too well- there were console games before there were PC games, and they were far better until 3Dfx came along.

        The thing is, there are already ‘standards’ that can be improved in similar ways that Mantle is promising. Microsoft has already stated that they’ve been working on as much; and I don’t expect Nvidia or Intel to leave performance on the table. Intel, especially, has more to gain from optimized APIs than even AMD; and they are really, really focused on graphics development.

        Mantle’s real value, like so many other ‘disruptive’ technologies, will be to push the industry toward solving a problem that’s gone on far too long.

      • Bensam123
      • 6 years ago

      I know he’s been banned, but the argument for Mantle still seems like a talking point that wasn’t given nearly enough cred in the article. As a matter of fact, there was little to no actual talk about this. Nvidia has been stomped on the performance front, on the price front, and then there are all these tasty bits on top of it coming out like Mantle, GCN OS improvements, and TrueAudio…

        • Airmantharp
        • 6 years ago

        If there were only something to talk about-

        Measuring the effect Mantle has, and then its value, will be largely impossible for review houses. We’ll have to take developers’ word that it’s useful.

        TrueAudio sounds like the revelation in audio we’ve been waiting for since Aureal died, and Nvidia bought up PhysX- pardon the pun. But again, it’s so far down the road. You can name a title that supports Mantle, but can you name one that supports TA?

        Price/performance is a ratio; and AMD does have Nvidia squarely beat today. Tomorrow will probably be different.

        You don’t have to troll reviews, forums, or comments very far to pick up on the vibe- everyone seems to want this GPU, but no one wants this cooler. Think about that. Anyone that’s truly touting AMD’s new golden child as the ‘second coming’ is really just stuck in a red-tinted orgy. They’re not going out and buying these things, they’re all waiting for the versions with coolers that got more than a passing engineering thought.

        And Nvidia knows this too. Don’t expect them to drop their prices and/or introduce new products until AMD’s card actually becomes a threat to their sales. That means volume shipments of AMD cards with decent coolers, and that is still a ways off; thus competitive Nvidia price cuts are also a ways off.

          • Bensam123
          • 6 years ago

          It’s not about measuring its effect right now, it’s about discussing the technology in an open-ended article. I find it hard to believe you don’t even respect that anymore.

          Since EAX died. Aureal never died, it was gobbled up by Creative and implemented in EAX.

            • Airmantharp
            • 6 years ago

            Here’s the thing- the last article on this site about Mantle is from the podcast- and they flat out stated that there isn’t anything to talk about. AMD hasn’t let any more details loose; marketing slides are it.

            I’m quite happy with TR refraining from a detailed analysis of marketing slides.
            —————–

            Creative gobbled up Aureal- but A3D never made it into EAX. Vortex audio cards had actual positioning hardware on them, not mixer chips from EMU, which is all the Live! and Audigy series were. Now EAX is dead, and we have- at best- OpenAL.

            But neither have anything to do with TA. TA is quite a lot like PhysX, and it’s going to take even more work for developers to use at the outset. Again, what games have been announced that are taking advantage of TA?

        • JohnC
        • 6 years ago

        There is nothing to talk about – no game is using Mantle right now (even BF4 will not have it at release), no other company (Nvidia/Intel) has shown interest in using it (and is highly unlikely to ever do that), and even AMD itself hasn’t provided much of the necessary info about it. Same goes for TrueAudio. If you just want to drool over propaganda slides – you should visit the Rage3D forums where the brain-dead fanatics like to congregate.

      • Pwnstar
      • 6 years ago

      Hmmm. SpigZONE or dZONEr?

        • Airmantharp
        • 6 years ago

        Banned or banned?

        See below 🙂

    • ClickClick5
    • 6 years ago

    Must….resist….for the 390x! My buy finger is getting twitchy!

      • Airmantharp
      • 6 years ago

      Just look at those Uber load noise numbers- you’ll be fine :).

        • Great_Big_Abyss
        • 6 years ago

        I’m downvoting you on all your 51 comments (11% of the total number of comments) just on principle alone. (Just kidding, I don’t have time to go downvote them all, but it’s the thought that counts.)

        Whoahhh…it’s waaay more than 51 negative comments from you…there’s a whole 2 other pages. Screw that. I’m not counting anymore, lol!

          • Airmantharp
          • 6 years ago

          Because pointing out that this card is over twice as loud (literally) as any other modern GPU is bad form?

          I’m thinking that too many people skipped to the comments after looking at the price and a couple performance graphs. I’d be all roses if this thing wasn’t so damn loud.

          (and I’ve definitely up-voted you more than once- because it’s not personal)

            • Great_Big_Abyss
            • 6 years ago

            I know it’s not personal. I was speaking tongue in cheek. Indeed, you make some very valid points, and I agree, this is a loud and hot card in its reference form.

            But you do harp on it so…

            • Airmantharp
            • 6 years ago

            I was really looking forward to AMD getting this one ‘right’. Oh, and I have a few days off, shame BF4 isn’t out yet- and I’ve been stuck at home waiting for ‘signature required’ shipments related to my photography habit.

            I guess what got my attention was the sheer amount of people positing that this card is the second coming without any sort of reasonable perspective. I mean, you probably can’t buy this card right away, but you don’t actually want to either; and then, trying to compare this card that you can’t and don’t want to buy with the performance and pricing of cards that you’ve been able to buy for months kind of seals the deal.

            So I’m just pointing out that the only thing anyone should really get out of this release is the performance potential of AMD’s new ‘large’ GPU. Figuring out how it fits into a system budget comes later :).

            • Great_Big_Abyss
            • 6 years ago

            I think part of the ‘second coming’ comes down to this:

            The two biggest factors in choosing a graphics card are Performance and Price.

            Heat output, power consumption, and noise all come after performance and price in most people’s priorities.

            Therefore, a video card that has the best performance for significantly less money than the competition will automatically be hailed a winner.

            If the 780 comes down to the same price as the 290X, then noise/heat and power will become a factor, and it will become the better option.

            But right now, at $100 less for MORE performance, the 290X is the one to go for.

            • Airmantharp
            • 6 years ago

            If it’s someone’s first high-end graphics card, that’s easy to understand.

            Everyone else should know better, though. Whether it’s CPUs or GPUs, for enthusiasts, balancing performance and noise is what it comes down to in the end.

            • Krogoth
            • 6 years ago

            Not for the ultra high-end market. Performance is the only metric. Everything else is secondary at best.

            • Airmantharp
            • 6 years ago

            This card is ‘ultra-high-end’?

            I realize that the definitions are highly subjective, but shouldn’t the adjectives used rank it below the HD7990 and GTX690?

            • Klimax
            • 6 years ago

            Many people apparently missed a memo then…

            • HisDivineOrder
            • 6 years ago

            I disagree with this.

            I think nVidia created a higher standard when they went with a cooler that showed you could have both for the 690, Titan, and then 780. Hell, they even launched the 770 with the cooler and everyone lamented that it wasn’t going to be a cooler you could regularly get on your 770. At 770 launch, I might have swapped out my Gigabyte 670 3-fan for a 4GB version had they offered one with the Titan cooler without the outrageous upcharge by EVGA. Instead, I bought a second Gigabyte 670 3-fan and went SLI, lamenting the 2 GB card.

            I think a great many people have forgotten what loud cards sound like and are going to realize it just after they install the R9 290X. Then you’re going to see lots of buyer’s remorse.

            This is not a slam on the R9 290X. That’s the shame of it. It’s just a slam on AMD’s cooling solution. For $500+, a great many people now expect a cooler that’s worth a damn.

            • Airmantharp
            • 6 years ago

            I couldn’t have (and haven’t) said it better- bravo!

            • Fighterpilot
            • 6 years ago

            “Everyone else should know better, though. Whether it’s CPUs or GPUs, for enthusiasts, balancing performance and noise is what it comes down to in the end.”

            Which just goes to show you should never ever pretend again to speak for enthusiasts.

            This thread has delivered the single biggest butthurt nvidia fanboi reaction ever.
            Bravo!

            • Airmantharp
            • 6 years ago

            I won’t speak for so-called ‘enthusiasts’ that are trying to build their first ‘fast’ computer. Most of them do not want to be helped, which I fully understand.

            Still, if you’re not working overtime flipping burgers to buy one of these, then you might just appreciate an objective look at how various parts will affect performance, noise, and heat, and what you can do about it. If you’re talking about it on a tech site as respected as TR, you just might have enough sense to consider all of the angles before you purchase!

            • Fighterpilot
            • 6 years ago

            Yes….. that makes sense. /eyeroll
            Back when I joined TR in 2005 I held the overclock record here for a Pentium 4 3.4GHz (Northwood) at 3.9GHz on high-end air.
            If only I’d been leet like you…..

      • Firestarter
      • 6 years ago

      just wait for the 290X cards with the big honking coolers, you know you want one of those 😉

        • ClickClick5
        • 6 years ago

        I kept the reference stock cooler on my 6970 for about two months, then bought one of the tri-fan Arctic coolers and installed it. Not because of the loud turbine sound of the stock fan, but because of the 89°C temps when playing. Now, hitting 65°C is a challenge. That I like.

        But if the 290X has this big of a performance leap vs the 7xxx, then the *hopefully* 20nm 390X will be better, and cooler. JUST in time for my next rig build for winter of 2014.

          • Firestarter
          • 6 years ago

          Yup, I got one of those Arctic Extreme coolers as well, to replace the stock fan on my launch-day HD7950. With a bit of pushing and prodding, it allows me to run mine at 1100MHz, 100% stable and pretty quiet 🙂

          As for the 390X, I get what you’re saying, but you’re going to need a lot of patience I think. TSMC is supposed to start producing 20nm chips somewhere in 2014, but who knows how long it takes for AMD and Nvidia to make something appreciably better than their current chips on the 28nm process.

            • HisDivineOrder
            • 6 years ago

            Who cares if they make something appreciably better?

            I’d settle for them making what they’re currently making appreciably cheaper.

            • ClickClick5
            • 6 years ago

            We shall see. I have been a long-time ATI fan, so whatever their top (single chip) card is around December of 2014, I’ll get it. Now if I have to wait an extra 2 months or so for the 390X by that time, sure, I’ll wait. If it is 28nm, then so be it.

            Also, I did have one Nvidia GPU in my whole life: a BFG GeForce 8600 GTS 256MB.
            I ended up killing it by OCing the life out of it with the stock cooler. I had F.E.A.R. running at 880 FPS at 1280×1024!

          • HisDivineOrder
          • 6 years ago

          I suspect they’ll do a refresh on the R9 290/290X somewhere midway through next year. I don’t think the “Don’t refresh the cards at all” experiment worked out as well as they’d hoped because I think it tanked pricing on their cards too low and they mostly had to match those pricing levels when they rebadged the cards.

          I think they’d have preferred it had they had pricing a bit higher in the mid-range cards.

          I also don’t think they can afford to sit out the Maxwell launch without some rebadging. I’d imagine around the time of Maxwell from nVidia (supposedly midway through next year as nVidia is already namedropping them now in public to regular people), you’ll see AMD dropping R9 290X (or rebadges) to incredible pricing.

          Imagine an R9 290X-level card (hopefully with a better cooler?) for $300.

          That’s what makes me so GIDDY about AMD at the high end, pricing low already. I love, LOVE price wars at the high end because they do so much “collateral damage” to all the other lines, giving us all prices we can respect across both companies and across all the lines we’d care about.

            • JustAnEngineer
            • 6 years ago

            AMD’s Hawaii chip is brand-new and at the top of the performance charts right now.

            Why should AMD need to bring out another new high-end GPU before TSMC’s 20nm production ramps up next summer?

            • Airmantharp
            • 6 years ago

            The real question is this- if you compare the increase in die area and SPs over the Tahiti GPU, does the power draw of the R9 290X make sense? Is there any room for improvement, such that clocks might be raised in a future product?

            Because Nvidia can change the whole market picture tomorrow. Not only have they not shipped a fully-enabled SKU of GK110 to the consumer market, they’ve also not clocked it as high as it could go, which means that they’ve got a lot of leeway in their product lineup to release a ‘refresh’.

            Granted, the wonderful thing about the R9 290X release is that we might finally see Nvidia price GK110 remotely close to historic levels. I’d love to see a GTX260-like release that approaches the $300 mark :).

            • Klimax
            • 6 years ago

            Just a data point: I can push Titan to the 1016+MHz area. (Under Crysis 3 it fluctuates 1016-1030; Dirt 3 Showdown is stable at 1046 – I had a mistake in previous posts.)

            Those chips have massive room for extra work.

    • UnfriendlyFire
    • 6 years ago

    I remember when the 7000s launched, AMD left a relatively large amount of OCing headroom, at the expense of being a bit behind on stock settings when compared to Nvidia’s 600 series. That was somewhat corrected with the GHz editions.

    Seems like this time they intend to achieve the max stock settings as a marketing tool for the relatively large non-OCing crowd. Not that there’s anything wrong with it as long as the other manufacturers such as Asus don’t try to use AMD’s reference design.

    (The common traits I’ve noticed with the non-OCing crowd that buy high-end components meant to be OCed are that they’re either ignorant about the OCing features, apathetic about the features, or fearful that they might turn their $1000 i7 chip into a paperweight.)

      • Airmantharp
      • 6 years ago

      I typically don’t OC GPUs either- but I’ve run the gamut of graphics cards, and I’m sold on EVGA FTW’s. Give me something with the best blower on the market and a decent, solid, stable overclock out of the box, and I’m happy.

      Now CPUs- well, unless I have to visit auxy to delid it, I don’t mind a little extra elbow grease there :).

    • jimbo75
    • 6 years ago
      • Airmantharp
      • 6 years ago

      What?? I can’t hear you over AMD’s new ‘quiet’ blower!

        • rxc6
        • 6 years ago

        Stop lying. You can hear him fine. As if you would ever get one of AMD’s new cards :P.

          • Airmantharp
          • 6 years ago

          Well, had Nvidia not released info about G-Sync, I’d be hoping that some sane third party would make a decent fully exhausting cooler for it- but without that, it’s a non-sale.

          And yeah, I am looking for a multi-GPU setup to power a 4k screen in the future.

      • Bensam123
      • 6 years ago

      Titanic indeed… XD

      • HisDivineOrder
      • 6 years ago

      Titan was a resounding success. They got lots of high end enthusiasts to go out and buy a GPU for $1k+ with smiles on their faces. Hell, a few people got more than one. With SMILES.

      These are people that were screaming when the Radeon 7970 came out at $550 a year before. They were FIGHTING online sites to beat OTHERS to buy $1k+ GPUs.

      How is that not mindblowing success?

      Lest we forget, AMD launched the 7970 at $550 and by this time last year, it was far less than that due mostly to somewhat overclocked, pushed-to-the-ragged-edge hardware launched somewhat above its weight class from the competition.

      History repeats itself.

        • xand
        • 6 years ago

        If nVidia launches something which does to the 290X what the 680 did to the 7970, that would be AMAZING.

        • Modivated1
        • 6 years ago

        First off, I want the most out of my hardware, so please push it to the ragged edge. It’s better than getting a gimped piece of hardware for nearly twice the price. Second, I bet all those “high-end enthusiasts” aren’t smiling now; they probably feel like idiots for buying and then seeing this type of performance pop up (unless they are fanbois).

        Finally, if you noticed, the memory on this card is the slowest of all the cards released in the last two generations of video cards, so there is still room to pull more out of this card. At the end of the day, just like last generation, we will have two flagship single-GPU cards in the same performance range.

        The difference will be in the price, and that is where the winner of this round will be decided. Personally, I think AMD has won this round, but like the Super Bowl it’s really too early to call.

    • derFunkenstein
    • 6 years ago

    I was way impressed until I saw the temperatures/power usage. Even then I’m still fairly impressed. It’s quieter than the [url=https://techreport.com/review/4966/nvidia-geforce-fx-5800-ultra-gpu/2<]GeForce FX 5800Ultra[/url<]

      • ClickClick5
      • 6 years ago

      Read the first page from that review. Oo0

      Quake III, UT2003, ah man…we have come a long way, and still further to go!

        • derFunkenstein
        • 6 years ago

        Yeah, blast from the past for sure.

      • ClickClick5
      • 6 years ago

      Scott, do read this GeForce review again and look at how far you have come! And the user comments, and the testing style too.

      This is why the internet is awesome. Nostalgia.

    • itachi
    • 6 years ago

    PS: I bought my HD5870 at 550 Swiss francs (about the same in $) at launch, not 400 😮 (re: the conclusion’s performance-per-dollar comparison)

    • Bauxite
    • 6 years ago

    I think I’ll wait for the asus triple slot version, it should really tame this hot beast.

    • itachi
    • 6 years ago

    Would have been nice to see even more games tested for such a high-quality card! Thinking BF4.. and the like (on a 64-man map plz)

    The card seems like a Titan killer, just as I expected (at least for all games except Guild Wars 2).. would have been nice to see more tests, and 3DMark Fire Strike too!

    The card runs way too hot though, of course… mmh, don’t know what to make of it; my HD5870 runs uber hot too, and it hasn’t really been a problem. It’s like I don’t need a heater most of the time lol..

    However, as you point out, overclocking this thing might be a problem.. I had in mind to wait for the pre-overclocked models.. those are gonna struggle to keep the temps down at this point (um, or maybe not)

    And yeah, lots of watts during load! Ah.. this sounds almost normal with that kind of temperature.

    I really think they could have improved their cooling solution for that kind of card, ugh.. still the same old “1 fan” design as my old HD 5870… wow.

    Anyway, thanks for this awesome review, I was waiting for it impatiently! You guys always make top-quality reviews, the most professional on the web in my opinion ;). Good job.

    • tomc100
    • 6 years ago

    Competition is good for everyone, especially consumers. Now that AMD has a product to compete against the Titan, Nvidia has to reduce their ridiculous prices. Also, I cannot wait to see games like Battlefield 4 and other Frostbite 3 engine titles use Mantle and further erode Nvidia’s performance. I also question why they didn’t bother to test Battlefield 3 or Arma 3, which are two of the best-looking games on the market. I will definitely buy this when they have versions with aftermarket coolers, and will sell my 7970 to make up the difference.

    • BoBzeBuilder
    • 6 years ago

    This thing needs a smaller manufacturing process and a few games bundled.

    • itachi
    • 6 years ago

    You talk about CPU bottleneck, but why the hell didn’t you use a 4770K or the version above the 3820? Bugs me!

      • willmore
      • 6 years ago

      Because that’s the current reference platform for all of their GPU testing. Good experimental method says to only change one variable at a time if you want to meaningfully compare results. That one variable is the GPU being tested. The rest must hold constant for meaningful results.

      Now, that doesn’t preclude doing a separate set of tests with a different CPU/MB config, but then they can’t reuse all of the work on all the past GPU testing that they’ve done. It would be a massive undertaking to redo all of those tests just to try out a theory that the CPU might be a limit in this one edge case.

      If they rerun the R9-290X tests with a different CPU to test the CPU limited theory, they can only compare the results to those for the R9-290X with the reference platform CPU. They can’t say “Oh, the nVidia cards would benefit as well.” They could infer that, but the testing would not have *shown* it.

      TL;DR: TR does good work and has limited resources.

        • itachi
        • 6 years ago

        Yeah, I see, that does make sense so we can compare to older tests, but I thought the 3820 was a bit weak compared to the top ones. Not by very much though, lol

    • chuckula
    • 6 years ago

    Good card at a good price for playing games, and fortunately it will force Nvidia to lower prices to remain competitive. The power consumption is a bit unfortunate since, as the review noted, these cards don’t have a huge amount of headroom for up-clocking.

    I didn’t see any compute benchmarks in there.. or did I just miss them? How fast are these things doing double-precision computation? Is there going to be a follow-up about how the R9 290X does in compute-bound loads?

    • danny e.
    • 6 years ago

    6 more watts than the 280X at idle. 81 watts [b<]more[/b<] at load. 94°C temps at load. 43 more watts at load than Titan. That's a lot of extra watts and a lot of heat. Power hungry, hot and loud. I'll pass. The 280X is looking more appealing even though it is ancient in GPU terms. Perhaps with the move to 20nm sometime next year things will look better.
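
    For a rough sense of scale, a back-of-envelope sketch (the usage pattern and electricity rate here are assumptions for illustration, not anything from the review):

    # What 81 extra watts at load adds up to over a year of gaming.
    # The 3 hours/day and $0.12/kWh figures are assumptions, not measurements.
    extra_watts = 81
    hours_per_day, days_per_year, usd_per_kwh = 3, 365, 0.12
    kwh_per_year = extra_watts * hours_per_day * days_per_year / 1000   # ~89 kWh
    print(f"~{kwh_per_year:.0f} kWh/year, roughly ${kwh_per_year * usd_per_kwh:.0f} on the power bill")

    The bill itself is small; the bigger practical issue is that every one of those watts ends up as heat the cooler and the case have to deal with.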

      • D@ Br@b($)!
      • 6 years ago

      Agree, except for me my 7970 is more appealing 🙂

      I don’t understand the card being more than three times as loud with a 100% higher DeltaT, while only using 30% more power and delivering 25% more performance!?
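
      A rough psychoacoustic rule of thumb (an approximation, not a measurement from this review) is that perceived loudness roughly doubles for every +10 dB, so “three times as loud” needs surprisingly few extra decibels:

      import math
      # Perceived loudness roughly doubles per +10 dB (rule-of-thumb approximation).
      loudness_ratio = 3.0                        # "three times as loud"
      delta_db = 10 * math.log2(loudness_ratio)   # ~15.8 dB
      print(f"{loudness_ratio:.0f}x as loud is only about {delta_db:.0f} dB more on a meter")

      Fan noise also climbs steeply, well faster than linearly, with fan speed, so a cooler that has to spin much harder to dump 30% more heat can easily sound several times louder without consuming dramatically more power itself.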

        • NarwhaleAu
        • 6 years ago

        Because sound is easy to generate with low power. I can generate it just with beans alone, and they aren’t that powerful.

          • Airmantharp
          • 6 years ago

          There’s more calories in those beans than you might know :).

        • Airmantharp
        • 6 years ago

        AMD appears to have given up on OEM coolers- that’s the one thing that we’ve been screaming for them (and Nvidia) to fix. Nvidia listened.

      • UnfriendlyFire
      • 6 years ago

      I think you should wait for the non-reference cooler designs.

      It’s rare for manufacturers to use AMD’s reference design unless they’re trying to sell budget high-end GPUs, which doesn’t give the best OCing performance…

        • danny e.
        • 6 years ago

        Power draw wouldn’t change. 81 watts more is a lot of extra watts.

    • itachi
    • 6 years ago

    I don’t get the “auto-downclocking” thing… what if you overclock it to just 1MHz more? Will it still drop down randomly? It seems to affect performance a little.. I’m a bit confused.. if it’s a temperature problem, why don’t they put 2 fans on it instead of 1? :S

    • JustAnEngineer
    • 6 years ago

    When can we expect to see the R9-290 (non-X)?

      • itachi
      • 6 years ago

      The 29th or 30th, I don’t remember, but it was leaked

      • D@ Br@b($)!
      • 6 years ago

      31 October….

      [url<]http://www.guru3d.com/news_story/radeon_r9_290_%28non_x%29_launch_date_revealed.html[/url<]

    • Dezeer
    • 6 years ago

    Thanks Scott for the great article, I noticed that I really love the frame time and frame latency charts that are not really provided by any other site. Keep up the good work.

    • dzoner
    • 6 years ago

    Next-gen and Mantle-optimized games are where the rubber meets the future-proofing – cost-performance road.

    The most telling 290X vs 780Ti face-off will be with the Mantle optimized Battlefield 4.

    • Mr. Eco
    • 6 years ago

    Great performance in all areas, great price. THE card for high-resolution / multi-monitor setups.
    Reminds me of the days of the HD 4870.

      • Krogoth
      • 6 years ago

      Reminds me more of the X1900XT and X1900XTX, only because of the price points.

      The 4870 similarity comes from the fact that AMD managed to release a high-end GPU that delivered nearly the same performance as Nvidia’s top product (GTX 280) for a significantly lower price point ($599 versus $399 at the time). Nvidia was forced to engage in a price war. The same is likely going to happen with the 770, 780, 780 Ti and Titan.

        • Modivated1
        • 6 years ago

        Funny you should mention the 4870. I bought the 4870 X2 back then at $499, and I am typing on it right now :). Guess you can say that I got my money’s worth.

        Anyway, I will be upgrading soon… as soon as the custom coolers come out. Let the good times roll!

    • flip-mode
    • 6 years ago

    Humph. That’s not a bad lineup, AMD. Seems like it’s been a while since either card maker has so convincingly outgunned the other at every price point.

    Looks like AMD needs a better heatsink/fan. Why not that triple-fan beast that was on the 7990?

      • Spunjji
      • 6 years ago

      I can only imagine they’re using the blower design to try to keep case temperatures fairly steady by guaranteeing that a certain amount of the card’s heat is exhausted directly. It helps take case airflow out of the equation as a variable; without that, they’d have difficulty guaranteeing the card’s performance.

      Of course, anyone spending this much on a card should be buying a proper cooler and putting it in a well-thought-out case. The key word there being “should”…

        • itachi
        • 6 years ago

        I’m not very confident about mounting custom coolers... especially on a brand-new card!

          • derFunkenstein
          • 6 years ago

          Just wait – I’m pretty sure you’ll see something better from AMD’s OEM partners. ASUS in particular has had better-than-stock coolers in recent years.

            • dzoner
            • 6 years ago

            All the AIB partners have killer cooling solutions substantially better than reference blowers.

            • Airmantharp
            • 6 years ago

            Blowers are better; if the AIB partners had better blowers than what AMD ships, I’d be all over them, but the only solution I’ve seen, HIS’s IceQ, is only marginally better, and likely not up to the thermal challenge here.

            • the
            • 6 years ago

            There is a bit of thermodynamics at play here. Blowers move air over all the cooling fins, and the amount of air being moved depends on the single blower fan. The downside is that by the time the air reaches the fins above the GPU, it has already been warmed by the VRM and memory portions of the cooler, lowering overall efficiency for cooling the GPU itself. The one advantage of this setup is that the intake airflow is typically cooler, as the hot air from the GPU is immediately exhausted outside the case.

            The dual- and triple-fan coolers pull air from inside the case and quickly move it over all parts of the GPU. The amount of air being moved is typically greater in this scenario, and it covers more heatsink area directly. The downside is that the exhaust goes right back into the case, so the hot air can be recycled over the GPU. Hence the heavy dependence on good case cooling to assist this thermal design. In a poorly ventilated case, these designs can actually perform worse than a blower design, because the air in the case is hotter than what would be coming off of just the VRMs and memory. In open air, the dual- and triple-fan setups should perform better thermally.

            • Airmantharp
            • 6 years ago

            And you’ve highlighted the point!

            Both solutions depend on properly setup case cooling. To add to what you’ve said above, with blowers, you focus on intake airflow, making sure that the blower always has a steady stream of cool air. This requires a positive pressure setup and a reasonably sealed enclosure, but done properly, video cards equipped with good blower-style coolers can match or exceed the cooling and noise performance of their open-air brethren, and they can do it in a smaller enclosure with fewer fans.

            • BlondIndian
            • 6 years ago

            Good point.
            I’d also note that in poorly ventilated cases blowers are better than open-air coolers.
            Ideally, no one would put this card in such a case. However, I’ve seen too many people do it. Heck, most of the OEMs do it.

            • itachi
            • 6 years ago

            I heard MSI’s overclocking coolers were good too, and there was another one I can’t remember. I just hope they have a good guarantee too... I’ll just get the best one (overclocked if possible) when I get the cash!! lol. And of course, I’ll check the benchmarks of the 780 Ti too... damn, imagine a pre-overclocked version of that one against a pre-overclocked R9 290X, oO. Sounds promising, and it sounds like it will force Nvidia to boost their Titan too... because otherwise it will make no sense at all!

        • flip-mode
        • 6 years ago

        Meh, if I were to spend several hundred dollars on a card, I’d want the provided heatsink and fan to be top-notch, and I’d prefer not to deal with the time, hassle, warranty voiding, potential for damage, and, let’s not forget, the added expense that comes with putting on an aftermarket cooler. For high-end cards, the stock cooler should be equal to or better than anything else, in my opinion.

          • dzoner
          • 6 years ago

          So wait a coupla weeks.

          • Airmantharp
          • 6 years ago

          Yeah, I don’t like the situation much either.

          Open air coolers from third parties mean that you have to design your system’s airflow around having to cool the expansion slot area. You’ll have to put exhaust fans there, which means you’re going to need more intake fans as well.

          All that in place of just making sure that the graphics cards are properly fed with cool air. AMD had a real chance to show that they were serious about making a quality high-end part, and they failed.

            • flip-mode
            • 6 years ago

            That may be too strongly worded; it’s not that the hardware lacks quality, it’s just not quiet. The cooler may be every bit as efficient as Nvidia’s, but it still has to handle more heat, so it’s going to have to spin faster.

            I dunno, given the types that usually shop at these levels, I can definitely see people picking the 290 over the Titan; it’s faster, which is often all that matters, and it’s also a good deal cheaper. Faster and cheaper is a pretty killer combo, and fan noise during gaming sessions usually isn’t an issue. If the card is quiet at idle, most people will be happy.

            AMD put out the fastest single-GPU graphics card in the world. It would be surprising if it were not also louder and more power hungry than slower cards, even if hitting homers on all three of those is not unheard of.

            • Airmantharp
            • 6 years ago

            You’re absolutely right- but it’s still disappointing :).

            The ONE thing I was hoping AMD would fix, they went backwards on. They really could have done much better, though I don’t discount their ability to do far, far worse either!

      • Bensam123
      • 6 years ago

      Dumping hot air in your case isn’t always a good thing. Triple-fan atrocities give great results on open-air test beds, but not in your PC. This is why I never recommend buying recirculating fans.

    • Arclight
    • 6 years ago

    With the delayed launch and rumoured price, I expected the card to be positioned, performance-wise, between the GTX 780 and the GTX Titan. Exceeding the GTX Titan in performance was a pleasant surprise, keeping in mind the price of this card.

    Custom-cooled variants will no doubt solve the heat/noise issues while being lower priced than Nvidia’s offerings.

    Personally, I think the price of this card should have ranged from $450 for the stock version to $550 for the custom-cooled, highest-clocked SKUs and $600 or $650 for the liquid-cooled one.

    That said, it’s still way better than I expected and the obvious choice for somebody looking for a card in this price range. GTX Titan owners should rightfully be a bit upset.

      • BlondIndian
      • 6 years ago

      I doubt Titan owners are too upset. They are usually green kool-aid drinkers (with the exception of those needing CUDA). So they will come up with excuses like
      [list<]* AMD drivers suckzzz * 290X is a dustbuster * PhysX Rockzzz * G-Sync is awesome[/list<] I believe a custom triple-slot cooler will pull more performance out of this card. Come on, ASUS...

        • Airmantharp
        • 6 years ago

        I wanted to just call you out for trolling- which you are- but instead, I’ll give you a nice point-by-point response:

        1. AMD drivers may not suck right now, but to ignore history is to repeat it, and we’ve yet to see that they’ve actually addressed all of the issues that have been identified over the last few years in a community-tested WHQL release. So no, they don’t suck, but they’re not out of the doghouse yet either. There are plenty of ‘red kool-aid drinkers’ that are still fuming over AMD’s lack of driver competence. I stopped fuming when I went Nvidia.

        2. The 290X is damn loud. AMD knew it was going to use more power and need better cooling, and they didn’t step up. Sure, it’s cheaper, and yeah, there will be aftermarket coolers, but consider that Nvidia is likely to respond with a full GK110 card that has their best cooler on it; how silly will this card look then?

        3. PhysX certainly doesn’t rock, but it has established itself as a real added value for some. I’m not one of those, but I can’t deny that it does add to the experience visually and that plenty of games are taking advantage of it.

        4. G-Sync, or rather the solution that it represents, IS awesome, and it needs to be everywhere.

      • Farting Bob
      • 6 years ago

      It’s the best-performing single GPU on the market and is $100 cheaper than its performance rivals, and you think it should have been priced considerably lower still? I think you confused “should have been priced at...” with “I wish this was priced at...”

        • Arclight
        • 6 years ago

        The lineup has a gap, performance- and price-wise, from the R9 280X to the R9 290X. Depending on how the R9 290 performs, my point may be invalid (if it can OC to match a stock R9 290X or at least come very close to it).

    • odizzido
    • 6 years ago

    The card looks great when you look at the performance, especially considering how much smaller it is compared to the Titan... but it begs for a better cooler.

    Also, thanks for including a 5870 in there. As a 5850 owner, it really speaks to me.

    • Welch
    • 6 years ago

    Obviously it would be next to impossible to measure... but in the whole price/performance equation, longevity is important too. A card isn’t worth anything if it burns itself out from poor thermal management, requires an aftermarket cooler, forces you to replace the fan because it runs non-stop, or consumes that much more power over its life span, or... you get the idea.

    There is a lot more to price/performance than just the sticker price. I’ll probably wait out this generation until they can bring down the temps and noise.

    • HisDivineOrder
    • 6 years ago

    I’m impressed by AMD’s pricing for the performance they’re offering. That pricing is lower than I expected and bodes well for where the R9 290 (non-X) will wind up and where the 780 will eventually land in the next few months post-nVidia-trying-not-to-drop-prices-era. I’m also EXCITED that nVidia has some competition at the high end again. Real competition is great. Price wars are great. Let’s see the 780 Ti come in lower than where the 780 is right now and the 780 throw itself down in the dirt, scrapping with the R9 290 in the sub-$500 space.

    That said, I’m less impressed by the cooler they chose and not impressed by that high temperature target. Sure, there are 8800 Ultras and other such products that hit high temperature targets, but those were also the days when nVidia became afflicted with solder problems, due at least in part to extreme amounts of heat putting strain on something already at the ragged edge in mobile systems where the heat just built up. I just don’t think I’d want a huge amount of heat pouring into a system on a regular basis. If the cooler were better, I’d be apt to ignore the temperature, but when the cooler is so subpar with such horrible acoustics, it becomes a real issue both ways.

    It’s amusing that with nVidia cards we all complain when the OEMs replace the Titan/780 cooler on the 770 with something else, but with AMD we just can’t WAIT till they get some different coolers, because AMD does such a horrible job designing/building them. Seems like they may have been better served by going with some kind of hybrid air/AIO water cooling for this card that moved the heat out to a radiator on the 120mm fan mount that almost all such systems have in the back. This is not a card for a tight, enclosed system anyway.

    Tactically, AMD did a great move here. I can’t help but wonder if they didn’t deliberately mislead certain sources with rumors of $650 to build up a lot of people saying, “Eh, $650 is not that impressive. I’d have gone for $550” (as people are apt to do) only to then drop it at $550. Suddenly, $550 seems like it’s less because people expected $650. (This is akin to what nVidia did by releasing the Titan as a “halo” product tease only to drop a mostly similar 780 a couple months later for a lot less.)

    This also puts nVidia in a bit of a bind. You know they don’t want to drop a 780 Ti for less than the $650 they released the 780 at, but it just seems like the 780 Ti has to come in matching the $550 price point, which is shaving a lot off the huge margins they’ve been making. Then again, after over a year of production of GK110’s, I imagine nVidia’s got some fully-enabled GK110’s sitting in a warehouse waiting for the day when they were needed.

    I imagine that plane Jen had to catch that kept Carmack from getting to pee was in fact a jet to a super secret facility with a warehouse full of “fully armed and operational” GK110’s. He probably had a retina scan, typed in a password (“Sooooundstooooooorm!!!!”) and then had to have his “guns” analyzed to be sure he was in fact Jen. That done, the computer allowed him entry and he walked into the warehouse, lights clicking on, successively…

    There, collecting dust, were all the GK110’s we’ve deserved for the last year. They waited for the day when AMD would arrive with real competition. Sighing, he probably turned and said, “I guess we have to sell them now. Damn it, I was hoping to get at least $750 for these. Damn them.”

    Let’s hope AMD never waits this long to compete again. We all suffer when they do.

    • RdVi
    • 6 years ago

    I’m pretty impressed; however, I’d be waiting for an open-air cooler personally. I’ll probably wait until 20nm, but if I were to get one of these, I’d probably even opt for a triple-slot cooler. I have no desire to go dual-GPU anymore and have settled on an ATX case for the next few years.

    So yes, it’s a shame it runs so hot and loud. I wouldn’t prefer the stock version personally, but there will soon be many aftermarket solutions which keep that in check, and the performance is certainly there. Those who want a multi-GPU setup are losing out, though; blowers are certainly better in a case with more than one card.

    • bfar
    • 6 years ago

    Hats off to AMD. Hands-down winner on price, no doubt about that.

    I’m not convinced by those power and thermal limits, though. If you allow a GK110 card to run at those parameters, are we looking at a pretty level playing field?

    • willg
    • 6 years ago

    Scott – have you figured out what happened with the original Sea Islands roadmap? Is this the delayed high-end card from that series or did they abandon that and this is something else?

    • f0d
    • 6 years ago

    I wonder how close an overclocked version of the 780 would be if given the same thermal headroom (none) and the same volume from the fans (high).
    I suspect if the 780 was overclocked enough they would be very close performance-wise (time for an ultra version? GTX 780 ULTRA?).

    Either way, I suspect price cuts from Nvidia so the 780 can come close to the price/performance ratio of this card.

    • madtronik
    • 6 years ago

    Great GPU. Just a shame that OriginPC customers won’t be able to enjoy it.

      • Spunjji
      • 6 years ago

      Funnily enough, I could have believed the “too hot” criticism they threw out if they did so /after/ this hit the market. Such an amusingly timed announcement.

        • HisDivineOrder
        • 6 years ago

        Well, they would know about that before we did.

          • jimbo75
          • 6 years ago
            • jihadjoe
            • 6 years ago

            I read that and started wondering why the CEO of Origin needs AMD to send him some cards, then realized I was thinking about the wrong Origin.

            • Great_Big_Abyss
            • 6 years ago

            And, it wouldn’t be the CEO of Origin, it would be the CEO of EA.

    • f0d
    • 6 years ago

    Seems like a nice card, but I personally wouldn’t get one with the volume it makes AND the heat it generates.

    Not that I care too much about the actual heat, but with the fan already as loud as it is, and the fact that it’s pretty close to its thermal limits, there’s pretty much no overclocking headroom at all.

    Maybe aftermarket coolers can improve on this situation. Hopefully. Until then I’ll keep my GTX 670 SLI at a constant 1305MHz core / 3733MHz memory (240GB/s), and it’s dead silent.

      • sschaem
      • 6 years ago

      Can you post a video of your GTX 670 at 1305MHz running FurMark?

        • f0d
        • 6 years ago

        Downloading it now, although I do suspect it will drop a little in FurMark because it’s an extreme test.

        Edit: it’s only doing 1150 - IIRC, isn’t there something in the driver that limits its speed in FurMark?

        Either way, from what I have seen, almost all cards vary their speed in extreme tests like FurMark.
        [url<]http://www.geeks3d.com/20101109/nvidia-geforce-gtx-580-the-anti-furmark-dx11-card/[/url<]

    • USAFTW
    • 6 years ago

    What’s with the cold reception?
    This thing has won the TH elite award.

      • Jigar
      • 6 years ago

      Everyone was expecting this kind of performance from the R9 290X.

        • f0d
        • 6 years ago

        Yep, it was exactly what I was expecting performance-wise.
        A little disappointing heat- and volume-wise.

        • WaltC
        • 6 years ago

        I think it’s more like “Everyone was hoping for this level of performance,” and certainly there would be very few who would also, at the same time, “expect” it to ring the bell at half the price of the Titan, which it has just felled and buried. That was a real surprise to me. Sure, AMD said “We don’t make $999 3D cards,” but that didn’t rule out $849 or $799 or $649, etc. $549 is an MSRP that indicates AMD is hot to sell all of these that it can make, and that it believes it can make a lot of them…;) A specialized, costly, low-yield-only GPU is something you might expect to see going for $999, if you can find one.

      • JustAnEngineer
      • 6 years ago

      A card that tops the performance charts for just over half the price of the competition should get some accolades.

      It’s clear that the reference designs are thermally limited. Let’s see what Asus, Sapphire, HIS, XFX, MSI, etc. can come up with when they install custom coolers.

        • Klimax
        • 6 years ago

        With brute force. Nothing impressive.

          • thanatos355
          • 6 years ago

          We’re PC enthusiasts! “Brute Force” is our middle name!

          Crank up the voltages, crank up the clocks, and to hell with warranties!

            • Airmantharp
            • 6 years ago

            See, now I’m sure all of the [H] posters have moved over here.

            • thanatos355
            • 6 years ago

            TR or death! 😛

            Seriously though, this card has exceptional performance, and its thermal and power envelope is almost identical to that of the GTX 480, which garnered high praise for its performance at release.

            Hypocrisy is no friend of mine.

            • Airmantharp
            • 6 years ago

            It was known for being hot and loud too…

            • Klimax
            • 6 years ago

            And who was it that mocked the first-gen Fermi cards?

            • Airmantharp
            • 6 years ago

            …everyone?

            • Klimax
            • 6 years ago

            Laughing. I was shooting for a particular company…

            ETA: A downvote for the previous post, that fast? I guess some people are already allergic to me… 😀

            • Airmantharp
            • 6 years ago

            You know, I must have read your comment five times before figuring out which company you were talking about. The GTX480 was noted almost universally as being too hot and too loud :).

            • Klimax
            • 6 years ago

            That particular irony is funny. (Should we tell The Fixer? :D)

            ETA: Sorry for being cryptic, but I thought it was clear…

            • Airmantharp
            • 6 years ago

            It’s alright; that generation isn’t one anyone is really fond of. The HD 5000s were loud and lacked decent tessellation support, and the 400 series was fast but even louder; they were both the butt of many jokes, and the high-end products were avoided by most. Faster framerates in Crysis weren’t going to make a better game, after all :).

            • Klimax
            • 6 years ago

            Crysis was good. (Although I don’t recall what I was playing back then, if I was playing at all at the time…)

            • CaptTomato
            • 6 years ago

            I had a HIS ICE-cooled, dual-slot 4850; its typical load temps were 72-75°C... and I don’t remember it being loud... Of course, just like the 290, the standard 4850 single-slot cooler was hot and possibly loud.

            Many of the name brands will fix the 290’s reference cooling, and one would assume that the standard 290 won’t be as hot to start with.

            That said, many who buy the 290 are going to water cool it.

            1080p and even 1600p gamers can buy the 280 and standard 290s without worrying about temps and noise.

            I have a Sapphire Vapor-X 7950, and it’s an overclocking, low-heat-and-noise champ... so Sapphire and ASUS will without question fix the 290’s reference card.

            • Diplomacy42
            • 6 years ago

            I’ve been fond of it; unlike the 600 series, it included one, soon to be two, worthwhile card(s).

            • Klimax
            • 6 years ago

            It runs on the edge, so if you push the Titan just a little bit harder it will eliminate this card easily, because it doesn’t have much margin left. The Ti will likely do that…

            Nothing wrong with brute force, but it is not impressive. (Especially with this small a lead.)

      • Fighterpilot
      • 6 years ago

      This is the Tech Report…damned with faint praise is the best you will ever see here for an AMD part.

      It’s a very good win for AMD and a blow to Nvidia’s price-gouging strategy.
      Wait till you see the Crossfire numbers and scaling with 2x R9 290X; it totally crushes anything in Nvidia’s lineup and does it at a price that should make JHH blush.

      The silence on how “smooooth” Nvidia’s high end cards are is deafening.
      Any of you Chucky chumps wanna discuss frametimes now?

      No?….figures.

        • Airmantharp
        • 6 years ago

        …we can discuss frametimes in CFX, at 4k.

          • D@ Br@b($)!
          • 6 years ago

          Yeah we can, thanks to the Guru,
          [url<]http://www.guru3d.com/articles_pages/radeon_r9_290x_crossfire_vs_sli_review_benchmarks,21.html[/url<] and Anand, [url<]http://www.anandtech.com/show/7457/the-radeon-r9-290x-review/12[/url<]

            • Airmantharp
            • 6 years ago

            I’ll wait for TR’s assessment, thanks 🙂

            • clone
            • 6 years ago

            Ahh, see, now you’re just pretending to be objective when you reject facts you don’t like.

            • Airmantharp
            • 6 years ago

            I don’t reject facts (well, not more than any other imperfect human being), but I do like to verify them first, and that requires a baseline.

            TR has proven, time and again, to be a reliable baseline. I do have others, of course, but I really, really want to see what TR’s investigation into CF/Eyefinity/4K performance reveals, particularly in light of AMD’s driver issues that affect those configurations and still haven’t been officially fixed.

            The numbers I saw from Anandtech do look pretty promising, though.

            • Modivated1
            • 6 years ago

            I have to agree that, while I don’t always agree with TR’s view, they are honest to the best of their ability. They may sometimes seem to favor a side with their commentary, but they never skew results in numbers or fail to acknowledge the positives and negatives of both sides.

            Right now they seem like Nvidia fans with the verbal beating they have been giving AMD over frame times up until now. However, I remember the introduction of the multi-monitor setup, where they couldn’t stop raving about it. At the end of the day, the problems they address are real; they are never in denial or delusional.

            That’s all I can ask, because everybody has their favorites. If they lean toward the green, then so be it, as long as they can acknowledge when the green takes a beating. Which they have, when it happens.

            • anotherengineer
            • 6 years ago

            TechPowerUp has a CrossFire review also; being north of the border, I also like [url<]http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/63742-amd-radeon-r9-290x-4gb-review.html[/url<]

          • Bensam123
          • 6 years ago

          Sure, when you own a 4K monitor.