Nvidia’s GeForce GTX 780 Ti graphics card reviewed

Boy, this is entertaining. AMD uncorks the Radeon R9 290X and captures the GPU performance crown. Nvidia counters by announcing price cuts and promising to introduce a mysterious new GPU, the GeForce GTX 780 Ti. Along the way, we’ve seen a host of new features and buzzwords injected into the high-end graphics space as part of the conversation: AMD’s Mantle low-level graphics API, Nvidia’s creamy smooth G-Sync display tech, and the TrueAudio DSP block built into the new Radeons. Prices have dropped pretty dramatically in the past month, too.

For a dying market, high-end PC graphics sure has a lotta vitality. Makes you wonder about the whole post-PC narrative that’s so popular right now. Hmm.

Anyhow, these are fun times. Things get even more interesting today, since Nvidia is finally pulling back the curtain on the GeForce GTX 780 Ti, its answer to the Radeon R9 290X. The GTX 780 Ti’s purpose in life is crystal clear: to be the best single-GPU graphics card in the world. We knew this fact even before the card arrived in Damage Labs. The intriguing question, in my view, was how Nvidia would achieve this feat. After all, the GK110 chip that powers the Titan and the GTX 780 has been around for quite some time now. How much additional goodness did Nvidia really have left in reserve?

The GeForce GTX 780 Ti

Yeah, turns out the green team was holding back quite a bit. The GK110 is the largest graphics processor ever made, over 100 mm² larger than AMD’s new Hawaii chip. We’ve known for ages that no GK110-based product—not even in the expensive, compute-focused Tesla lineup—comes with all of the chip’s units enabled. Nvidia has stated publicly that the chip has a total of 15 SMX units onboard, yet the GeForce Titan has one SMX disabled and the GTX 780 is down three SMX units. So the simplest way Nvidia could raise its game was to enable all of the GK110’s available units. But surely it wouldn’t be that easy, or they’d have done it already, right?

| | GPU base clock (MHz) | GPU boost clock (MHz) | ROP pixels/clock | Texels filtered/clock | Shader processors | Memory transfer rate | Total memory path width (bits) | Peak power draw | Price |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GeForce GTX 760 | 980 | 1033 | 32 | 96 | 1152 | 6 GT/s | 256 | 170W | $249 |
| GeForce GTX 770 | 1046 | 1085 | 32 | 128 | 1536 | 7 GT/s | 256 | 230W | $329 |
| GeForce GTX 780 | 863 | 902 | 48 | 192 | 2304 | 6 GT/s | 384 | 250W | $499 |
| GeForce GTX 780 Ti | 876 | 928 | 48 | 240 | 2880 | 7 GT/s | 384 | 250W | $699 |
| GeForce GTX Titan | 837 | 876 | 48 | 224 | 2688 | 6 GT/s | 384 | 250W | $999 |

Nvidia has evidently been keeping some juice in reserve for an occasion like this one. The GeForce GTX 780 Ti is powered by a GK110 with all 15 SMX units enabled, granting it a grand total of 2880 shader processors and 240 texels per clock of filtering power. That’s . . . plentiful, a nice increase from the GTX 780 and Titan. Impressively, the GTX 780 Ti also has higher base and boost clock speeds than those other cards, while operating at the same 250W power limit.
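
Those unit counts fall straight out of the SMX math. Here's a quick sketch, using the commonly cited Kepler figures of 192 shader ALUs and 16 texture units per SMX (those per-SMX counts come from Nvidia's architecture docs, not this article):

```python
# Per-SMX resources on GK110 (Kepler): 192 shader ALUs, 16 texture units.
ALUS_PER_SMX, TMUS_PER_SMX = 192, 16

for name, smx in [("GTX 780", 12), ("GTX Titan", 14), ("GTX 780 Ti", 15)]:
    print(name, smx * ALUS_PER_SMX, "shaders,", smx * TMUS_PER_SMX, "texels/clock")
# GTX 780    2304 shaders, 192 texels/clock
# GTX Titan  2688 shaders, 224 texels/clock
# GTX 780 Ti 2880 shaders, 240 texels/clock
```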

When I asked Nvidia where it found the dark magic to achieve this feat, the answer was more complex than expected. For one thing, this card is based on a new revision of the GK110, the GK110B (or is it GK110b? GK110-B?). The primary benefit of the GK110B is higher yields, or more good chips per wafer. Nvidia quietly rolled out the GK110B back in August aboard GTX 780 and Titan cards, so it’s not unique to the 780 Ti. Separate from any changes made to improve yields, the newer silicon also benefits from refinements to TSMC’s 28-nm process made during the course of this year. You can imagine Nvidia has been sorting its GK110B chips into different bins depending on their quality since at least August. The ones deployed on the 780 Ti are presumably the best of the best. The end result of all these measures is a GK110-based product with 15 SMX units enabled that achieves higher clock speeds at nice, tame voltages.

That’s not the whole story, either. You may recall that, in my R9 290X review, I explained how AMD’s Hawaii chip benefited from an engineering tradeoff. The design team chose to implement a simpler physical interface in order to allow a very wide 512-bit path to memory in less die area. The downside was that GDDR5 operational speeds would be relatively low, at 5 GT/s, but the additional width would make up the slack. The tradeoff worked. Although substantially smaller than GK110, the Hawaii chip in the R9 290X had more memory bandwidth, as much as 320 GB/s.

Well, the GTX 780 Ti is the revenge of the other approach to that tradeoff. The GK110 has a narrower 384-bit memory path, but it happily pairs up with GDDR5 memory chips running at 7 GT/s to achieve 336 GB/s of bandwidth—a bit more than the 290X.
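
The arithmetic behind those bandwidth numbers is straightforward; here's a quick sketch using the bus widths and transfer rates quoted above:

```python
def gddr5_bandwidth_gbps(bus_width_bits, transfer_rate_gtps):
    """Peak memory bandwidth in GB/s: bus width in bytes times transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_gtps

print(gddr5_bandwidth_gbps(384, 7.0))  # GTX 780 Ti: 336.0 GB/s
print(gddr5_bandwidth_gbps(512, 5.0))  # R9 290X:   320.0 GB/s
```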

| | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Peak shader arithmetic rate (tflops) | Peak rasterization rate (Gtris/s) | Memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- |
| Radeon HD 5870 | 27 | 68/34 | 2.7 | 0.9 | 154 |
| Radeon HD 6970 | 28 | 85/43 | 2.7 | 1.8 | 176 |
| Radeon HD 7970 | 30 | 118/59 | 3.8 | 1.9 | 264 |
| Radeon R9 280X | 32 | 128/64 | 4.1 | 2.0 | 288 |
| Radeon R9 290 | 61 | 152/86 | 4.8 | 3.8 | 320 |
| Radeon R9 290X | 64 | 176/88 | 5.6 | 4.0 | 320 |
| GeForce GTX 770 | 35 | 139/139 | 3.3 | 4.3 | 224 |
| GeForce GTX 780 | 43 | 173/173 | 4.2 | 3.6 or 4.5 | 288 |
| GeForce GTX Titan | 42 | 196/196 | 4.7 | 4.4 | 288 |
| GeForce GTX 780 Ti | 45 | 223/223 | 5.3 | 4.6 | 336 |

Overall, the GTX 780 Ti has the highest theoretical peak capacities for texture filtering, rasterization, and memory bandwidth of any single-GPU solution today. That’s true even though the numbers in the table above are somewhat skewed. You see, we compute these theoretical peak rates based on the “boost” clocks for each graphics card. Trouble is, Nvidia’s boost clocks are intended to reflect the card’s typical operating frequency, not the absolute peak. Nvidia doesn’t advertise the max clock speeds for its products. Meanwhile, AMD’s boost clock reflects an upper limit, and as we’ve been learning with the R9 290 series, typical operating frequencies can sometimes be substantially lower than that.

To accentuate this point, the green team pointed out that the GTX 780 Ti’s actual peak clock speed is 993MHz. At that frequency, the 780 Ti’s theoretical maximum shader arithmetic rate is 5.72 tflops, higher than the R9 290X’s best case.
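
For reference, the peak numbers in these tables are just unit counts multiplied by clock speed. A quick sketch (the GeForce unit counts come from the spec table earlier; the R9 290X's 2816 shader count is AMD's published spec, and 993MHz is the peak clock Nvidia quoted):

```python
def peak_tflops(shader_alus, clock_mhz):
    # Each ALU can issue one fused multiply-add (2 flops) per clock.
    return shader_alus * 2 * clock_mhz * 1e6 / 1e12

def peak_texels_gtps(texture_units, clock_mhz):
    # One bilinear texel filtered per unit per clock.
    return texture_units * clock_mhz * 1e6 / 1e9

# GTX 780 Ti at its 928MHz boost clock...
print(peak_tflops(2880, 928))      # ~5.3 tflops
print(peak_texels_gtps(240, 928))  # ~223 Gtexels/s
# ...and at the 993MHz peak clock Nvidia quoted
print(peak_tflops(2880, 993))      # ~5.72 tflops
# R9 290X at its 1000MHz "up to" clock
print(peak_tflops(2816, 1000))     # ~5.6 tflops
```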

For, you know, whatever that’s worth. We’ll be measuring delivered performance here shortly.

Nvidia has added one other feature to the GTX 780 Ti that’s not present in the GTX 780 or Titan, something called power balancing. This card takes its power from three sources: the PCIe slot, a six-pin power input, and an eight-pin power input. In normal operation, those sources ought to be more than adequate to meet its 250W power budget. When overclocking, though, it’s possible one of the three inputs could become overburdened and unable to supply more power. Nvidia says the 780 Ti can pull power from the other inputs in order to get everything it needs. The result should be some extra overclocking headroom in cases where input power is the primary limitation.
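
To illustrate the idea, here's a toy sketch of power balancing. It is not Nvidia's actual algorithm; the per-input limits are just the nominal PCIe numbers (75W from the slot, 75W from a six-pin connector, 150W from an eight-pin), 300W in total for this card.

```python
# A toy sketch of the power-balancing idea, not Nvidia's actual algorithm.
# Nominal per-input limits from the PCIe spec, in watts.
LIMITS = {"slot": 75.0, "6-pin": 75.0, "8-pin": 150.0}

def rebalance(demand_w, draw_w):
    """Shift load toward inputs that still have headroom until demand is met."""
    shortfall = demand_w - sum(draw_w.values())
    for name, limit in LIMITS.items():
        take = min(limit - draw_w[name], max(shortfall, 0.0))
        draw_w[name] += take
        shortfall -= take
    return draw_w, shortfall  # a positive shortfall means the card must throttle

# Overclocked to 280W with the eight-pin input already saturated:
print(rebalance(280.0, {"slot": 50.0, "6-pin": 60.0, "8-pin": 150.0}))
# -> ({'slot': 70.0, '6-pin': 60.0, '8-pin': 150.0}, 0.0)
```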

The GTX 780 Ti invades Hawaii

At $699.99, the GeForce GTX 780 Ti is priced like the best graphics card in the world. Nvidia has endeavored to soften the blow with various enticements. I continue to be, er, a fan of the swanky aluminum-and-magnesium cooler that the 780 Ti shares with other high-end GeForces. Also, for the holiday season, the card comes with a pretty nice bundle of games—Splinter Cell: Blacklist, Arkham Origins, and Assassin’s Creed 4—and a coupon for a $100 discount on Nvidia’s Shield handheld game console, which can stream games from a GeForce GTX-equipped PC. None of these things make a $700 graphics card feel like a good deal, but they help bridge the gap with the R9 290X, which sells for $150 less.

Interestingly, the GeForce GTX Titan will soldier on at $1K, even though it’s slower than the 780 Ti. The Titan doesn’t have much appeal left to gamers, but its full complement of double-precision floating-point units should continue to make it attractive to folks developing GPU-computing applications in CUDA. Like the 780, the 780 Ti can only do double-precision math at 1/24th the single-precision rate, not one-third like the Titan.

The GTX 780 Ti faces some formidable competition from the Radeon R9 290X. When discussing its new product with us, Nvidia took some time to explain how the 780 Ti differs from the competition. Some of what they offered in this context was FUD about the variable performance of the 290X cards in the market. Nvidia apparently tested about 10 different 290X cards itself and saw large clock speed variations from one GPU to the next. This is an issue that the media is beginning to tackle—and that AMD says it’s looking into, as well. We expect a driver update from AMD soon to address this problem.

However that story plays out in the coming days, Nvidia made a couple of relevant points in explaining how it avoided these issues in the GTX 780 Ti and other products. First, Nvidia’s dynamic voltage and frequency scaling algorithm, GPU Boost 2.0, works similarly to AMD’s PowerTune, targeting specific limits for temperature and power draw and pushing the GPU as hard as possible within the scope of those parameters. But GPU Boost 2.0 contains one variable that the PowerTune routine in AMD’s newest graphics card lacks: a clearly stated base or minimum clock frequency that acts as a guarantee of performance. For the 780 Ti, the base clock is 876MHz. Unless something goes entirely wrong, the GPU shouldn’t run any slower than that during normal use.
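
To make the distinction concrete, here's a toy model of a boost loop with a guaranteed clock floor. This is our own simplification, not either vendor's actual firmware; the 13MHz step is roughly Kepler's boost bin size, and the 83°C limit is the 780 Ti's GPU Boost temperature target.

```python
def next_clock(current_mhz, temp_c, power_w, *, base_mhz, boost_mhz,
               temp_limit_c=83.0, power_limit_w=250.0, step_mhz=13.0):
    """One step of a toy DVFS loop: push toward the boost clock while under the
    temperature and power limits, back off otherwise, but never drop below the
    advertised base clock (the performance guarantee)."""
    if temp_c < temp_limit_c and power_w < power_limit_w:
        return min(current_mhz + step_mhz, boost_mhz)
    return max(current_mhz - step_mhz, base_mhz)

# With base_mhz=876, the 780 Ti's clock can never be steered below 876MHz;
# a scheme without that floor could keep stepping down under a heavy load.
print(next_clock(889, temp_c=85, power_w=240, base_mhz=876, boost_mhz=928))  # 876.0
```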

Nvidia was very meticulous about explaining how GPU Boost works when it introduced the feature alongside the first Kepler-based card, the GeForce GTX 680. Clearly, the firm wanted to avoid negative user reactions to variable clock speeds. The absence of a baseline performance guarantee in AMD’s competing PowerTune algorithm isn’t necessarily a major drawback, but it could become one if the user experience varies too greatly. Building a known baseline clock into the card’s spec and operation is a good way to remedy that ill.

There’s also been quite a bit of discussion about why the R9 290X’s PowerTune limit is a relatively toasty 94°C and why its cooler generates so much noise. Much of that discussion has been focused on the GPU’s power draw and the amount of resulting heat to be removed—and whether AMD’s stock cooler is good at doing its job—but Nvidia offers a slightly different take.

The key variable, the firm contends, is thermal density, the amount of power to be removed within the surface area of the chip. The slide above, from Nvidia’s product presentation, illustrates the difference in thermal density between the GK110 chip on the GTX 780 Ti and the Hawaii chip on the R9 290X. Hawaii’s thermal density is substantially higher. The GK110’s relatively large surface area and lower power limit allows the GTX 780 Ti to run quieter and at lower temperatures with a similar-sized cooler.
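
A rough illustration of the thermal-density argument, using the commonly cited die sizes (roughly 561 mm² for GK110 and 438 mm² for Hawaii, our figures, not Nvidia's slide) and the 780 Ti's 250W limit for both chips just to isolate the effect of die area:

```python
# Rough thermal-density comparison. Die sizes are the commonly cited figures;
# we apply the 780 Ti's 250W limit to both chips for simplicity. If Hawaii
# actually draws more power, its density is higher still.
def thermal_density(power_w, die_area_mm2):
    return power_w / die_area_mm2  # W/mm^2

print(round(thermal_density(250, 561), 3))  # GK110:  ~0.446 W/mm^2
print(round(thermal_density(250, 438), 3))  # Hawaii: ~0.571 W/mm^2
```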

That’s the theory, at least. We’ll put it to the test shortly.

Test notes

| Original | Closest current equivalent |
| --- | --- |
| GeForce GTX 670 | GeForce GTX 760 |
| GeForce GTX 680 | GeForce GTX 770 |
| Radeon HD 7870 GHz | Radeon R9 270X |
| Radeon HD 7970 GHz | Radeon R9 280X |

The comparison on the following pages covers nearly all of the latest Radeons and GeForces, down to around $200. We’ve also included older Radeons dating back to the Radeon HD 5870, the first DX11-capable GPU. As you look through the results, you might be missing the most immediate prior generations, the Radeon HD 7000 series and the GeForce GTX 600 series. We haven’t slighted those GPUs. They’re here; they’ve just been re-branded. The table above shows how these re-branded Radeon and GeForce cards map to the older generation. Some of the clock speeds have been tweaked a bit during the re-branding, but generally, the differences are fairly minor.

Please note that our Battlefield 4 results come from a slightly different OS and software config than what’s listed in the tables below. For our BF4 tests, we updated to Windows 8.1 and the latest graphics drivers, including GeForce 331.70 and Catalyst 13.11 beta 8.

To generate the performance results you’re about to see, we captured and analyzed the rendering times of every single frame of animation during each test run. For an intro to our frame-time-based testing methods and an explanation of why they’re helpful, you can start here. Please note that, for this review, we’re only reporting results from the FCAT tools developed by Nvidia. We usually also report results from Fraps, since both tools are needed to capture a full picture of animation smoothness. However, testing with both tools can be time-consuming, and our window for work on this review was fairly small. We think sharing just the data from FCAT should suffice for now.
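
The headline numbers we report are simple to compute once you have a list of per-frame render times. Here's a minimal sketch (the frame times are made up for illustration; our actual scripts are more involved):

```python
import random

def frame_time_metrics(frame_times_ms, badness_threshold_ms=16.7):
    """FPS average, 99th-percentile frame time, and total time spent beyond a threshold."""
    ordered = sorted(frame_times_ms)
    pct99 = ordered[int(0.99 * (len(ordered) - 1))]
    time_beyond = sum(t - badness_threshold_ms
                      for t in frame_times_ms if t > badness_threshold_ms)
    fps_avg = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    return fps_avg, pct99, time_beyond

# A hypothetical test run: mostly quick frames with a few spikes.
random.seed(1)
times = [random.uniform(9.0, 15.0) for _ in range(4000)] + [35.0, 40.0, 28.0]
fps, p99, badness = frame_time_metrics(times)
print(round(fps), round(p99, 1), round(badness, 1))
```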

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

Processor: Core i7-3820
Motherboard: Gigabyte X79-UD3
Chipset: Intel X79 Express
Memory size: 16GB (4 DIMMs)
Memory type: Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz
Memory timings: 9-9-9-24 1T
Chipset drivers: INF update 9.2.3.1023, Rapid Storage Technology Enterprise 3.5.1.1009
Audio: Integrated X79/ALC898 with Realtek 6.0.1.6662 drivers
Hard drive: OCZ Deneva 2 240GB SATA
Power supply: Corsair AX850
OS: Windows 7 Service Pack 1

| | Driver revision | GPU base/boost clock (MHz) | Memory clock (MHz) | Memory size (MB) |
| --- | --- | --- | --- | --- |
| GeForce GTX 660 | GeForce 331.40 beta | 980/1033 | 1502 | 2048 |
| GeForce GTX 760 | GeForce 331.40 beta | 980/1033 | 1502 | 2048 |
| GeForce GTX 770 | GeForce 331.40 beta | 1046/1085 | 1753 | 2048 |
| GeForce GTX 780 | GeForce 331.40 beta | 863/902 | 1502 | 3072 |
| GeForce GTX Titan | GeForce 331.40 beta | 837/876 | 1502 | 6144 |
| GeForce GTX 780 Ti | GeForce 331.70 beta | 876/928 | 1750 | 3072 |
| Radeon HD 5870 | Catalyst 13.11 beta | 850 | 1200 | 2048 |
| Radeon HD 6970 | Catalyst 13.11 beta | 890 | 1375 | 2048 |
| Radeon R9 270X | Catalyst 13.11 beta | 1050 | 1400 | 2048 |
| Radeon R9 280X | Catalyst 13.11 beta | 1000 | 1500 | 3072 |
| Radeon R9 290 | Catalyst 13.11 beta 5 | 947 | 1250 | 4096 |
| Radeon R9 290X | Catalyst 13.11 beta 8 | 1000 | 1250 | 4096 |

Thanks to Intel, Corsair, Gigabyte, and OCZ for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Also, our FCAT video capture and analysis rig has some pretty demanding storage requirements. For it, Corsair has provided four 256GB Neutron SSDs, which we’ve assembled into a RAID 0 array for our primary capture storage device. When that array fills up, we copy the captured videos to our RAID 1 array, comprised of a pair of 4TB Black hard drives provided by WD.
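
To give a sense of why the capture rig needs that kind of storage, uncompressed video adds up quickly. Here's a rough back-of-the-envelope estimate; the 2560x1440, 60Hz, 24-bit figures are illustrative assumptions, not our exact capture settings:

```python
# Rough capture-bandwidth estimate for uncompressed FCAT video.
# Assumed settings for illustration: 2560x1440 at 60Hz, 24 bits per pixel.
width, height, fps, bytes_per_px = 2560, 1440, 60, 3

bytes_per_sec = width * height * fps * bytes_per_px
print(round(bytes_per_sec / 1e6), "MB/s sustained")                 # ~664 MB/s
print(round(bytes_per_sec * 90 / 1e9), "GB per 90-second test run")  # ~60 GB
```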

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

In addition to the games, we used the following test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Texture filtering

We’ll begin with a series of synthetic tests aimed at exposing the true, delivered throughput of the GPUs. In each instance, we’ve included a table with the relevant theoretical rates for each solution, for reference.

| | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Memory bandwidth (GB/s) |
| --- | --- | --- | --- |
| Radeon HD 5870 | 27 | 68/34 | 154 |
| Radeon HD 6970 | 28 | 85/43 | 176 |
| Radeon HD 7970 | 30 | 118/59 | 264 |
| Radeon R9 280X | 32 | 128/64 | 288 |
| Radeon R9 290 | 61 | 152/86 | 320 |
| Radeon R9 290X | 64 | 176/88 | 320 |
| GeForce GTX 770 | 35 | 139/139 | 224 |
| GeForce GTX 780 | 43 | 173/173 | 288 |
| GeForce GTX Titan | 42 | 196/196 | 288 |
| GeForce GTX 780 Ti | 45 | 223/223 | 336 |

This color fill test is usually a better indication of memory bandwidth than anything else. The GTX 780 Ti lives up to its specs here, outperforming the 290X.

Nvidia’s new flagship almost runs the tables in our texture sampling and filtering tests. Because the Kepler architecture can do full-rate FP16 filtering, the 780 Ti’s lead over the R9 290X becomes commanding when we get into higher-precision color formats.

I’m not quite sure why the 780 Ti trails the GTX 780 and Titan in the FP16 filtering test. I double-checked the results, and they are correct. Could be that the 780 Ti is hitting some kind of thermal limit. That sometimes happens in these directed tests.

Tessellation and geometry throughput

| | Peak rasterization rate (Gtris/s) | Memory bandwidth (GB/s) |
| --- | --- | --- |
| Radeon HD 5870 | 0.9 | 154 |
| Radeon HD 6970 | 1.8 | 176 |
| Radeon HD 7970 | 1.9 | 264 |
| Radeon R9 280X | 2.0 | 288 |
| Radeon R9 290 | 3.8 | 320 |
| Radeon R9 290X | 4.0 | 320 |
| GeForce GTX 770 | 4.3 | 224 |
| GeForce GTX 780 | 3.6 or 4.5 | 288 |
| GeForce GTX Titan | 4.4 | 288 |
| GeForce GTX 780 Ti | 4.6 | 336 |

I still don’t have any answers from AMD about why the Hawaii-based products don’t handle the higher levels of tessellation in TessMark all that well. Anyhow, the GTX 780 Ti has no such problem, and it cranks through the highly tessellated Unigine benchmark with ease, too.

Shader performance

| | Peak shader arithmetic rate (tflops) | Memory bandwidth (GB/s) |
| --- | --- | --- |
| Radeon HD 5870 | 2.7 | 154 |
| Radeon HD 6970 | 2.7 | 176 |
| Radeon HD 7970 | 3.8 | 264 |
| Radeon R9 280X | 4.1 | 288 |
| Radeon R9 290 | 4.8 | 320 |
| Radeon R9 290X | 5.6 | 320 |
| GeForce GTX 770 | 3.3 | 224 |
| GeForce GTX 780 | 4.2 | 288 |
| GeForce GTX Titan | 4.7 | 288 |
| GeForce GTX 780 Ti | 5.3 | 336 |

The full-fledged GK110 acquits itself well here, but the R9 290X still has the edge in the majority of our shader tests. ShaderToyMark is probably the most interesting of these tests, and in it, the GTX 780 Ti makes major strides compared to the GTX 780 and Titan.

Crysis 3


Click through the buttons above to see frame-by-frame results from a single test run for each of the graphics cards. You can see how there are occasional spikes on each of the cards. They tend to happen at the very beginning of each test run and a couple of times later, when I’m exploding dudes with dynamite arrows.



Intriguing. The 780 Ti produces the highest FPS average, yet it ties for fourth place behind a trio of Hawaii-based Radeons in the latency-focused 99th percentile frame time metric. Look at the latency curves for the different cards, and you can see why that is. The Radeons’ latency curves are shaped somewhat differently; frame times stay low for just a little bit longer before curving upward to about the same terminus as the competing GeForces. Credit AMD’s drivers and hardware for squeezing some rendering time out of the most difficult 1% of frames in this scene.

Still, this is a case where our choice to rule out all but the last 1% of frames rendered feels a little arbitrary. The latency curves are otherwise similar. Look at our “badness” measure, and the GTX 780 Ti spends less time working on frames that take longer than 16.7 milliseconds to render than any other card. Arguably, then, it’s the smoothest of the bunch.

The bottom line is that we’re talking about minor differences when all of the higher-end cards perform exceptionally well. There are a few frame time spikes on each of the cards’ plots, but those appear to be caused by CPU bottlenecks. Almost all of the cards are affected similarly by those momentary hiccups.

Far Cry 3: Blood Dragon




Haha. Wow. Talk about a slight victory. The GTX 780 Ti is faster than the Radeon R9 290X in its noisy “uber” fan mode, but only by one frame per second, on average, and a tenth of a millisecond in the 99th percentile frame time.

Battlefield 4

I took the time to add BF4 into the mix, now that it’s been released and we have drivers from both AMD and Nvidia optimizing BF4 performance. These results come from the single-player campaign. I’d like to test multiplayer eventually, but doing so and getting good, consistent results is not easy.

Since this game uses a 64-bit executable, it’s not compatible with the FCAT overlay, so I used Fraps to record frame times.

Also, here’s something different. As you may have noticed, Fraps frame time plots tend to involve more variance than FCAT plots. You’ll sometimes see a “heartbeat” pattern in Fraps results—with one long frame time followed by one or two relatively short frame times—that doesn’t show up in FCAT. That’s because Fraps records frame submission times early in the pipeline, while FCAT measures frame delivery at the very end of the process. In between the two, triple buffering tends to smooth out small hiccups before the frames are delivered.

Depending on how the game’s internal timers work, either Fraps or FCAT results may be more “correct” in describing the smoothness of the final animation. Each game seems to work a little differently, but our current understanding is that most game engines use some sort of moving average to determine how to advance their animation timing from one frame to the next. When that’s the case, then FCAT results are the better indicator.

At the suggestion of AMD’s Raja Koduri, we’ve attempted to simulate the effect of triple-buffering on our Fraps data by implementing a simple, three-frame low-pass filter (just a moving average, in this case). As you’ll see below, the filtered Fraps data is free from those quick “heartbeat” artifacts, and we believe it’s a more faithful representation of BF4 animation smoothness. You’ll still see frame time spikes in the filtered data, and those spikes are much more likely to have an impact on animation smoothness.
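
For reference, here's what that sort of filter looks like; a minimal sketch of a trailing three-frame moving average (the exact windowing in our scripts may differ slightly):

```python
def low_pass(frame_times_ms, window=3):
    """Trailing moving average over the last `window` frames, approximating
    the smoothing effect of triple buffering on frame delivery."""
    out = []
    for i in range(len(frame_times_ms)):
        lo = max(0, i - window + 1)
        chunk = frame_times_ms[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# A Fraps-style "heartbeat": one long frame followed by two short ones.
raw = [12.0, 12.0, 25.0, 6.0, 7.0, 12.0, 12.0]
print([round(t, 1) for t in low_pass(raw)])
# -> [12.0, 12.0, 16.3, 14.3, 12.7, 8.3, 10.3]
```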

On another front, we’ve chosen a longer 90-second window for our BF4 test scenario and added a bit of an in-game warm-up period for each card prior to testing. That should help make sure our results reflect the true performance of Hawaii-based graphics cards with aggressive PowerTune settings.




Chalk up a clean, if narrow, win for the GTX 780 Ti across all metrics here.

GRID 2


This looks like the same Codemasters engine we’ve seen in a string of DiRT games, back for one more round. We decided not to enable the special “forward+” lighting path developed by AMD, since the performance hit is pretty serious, inordinately so on GeForces. Other than that, we have nearly everything cranked to the highest quality level.




From one perspective, this game isn’t much of a challenge for the faster video cards in this group. Everything from the GTX 770 on up achieves near-perfect 60Hz frame delivery. However, as we’ve noted before, none of the cards crank out frames quickly enough to keep up with a 120Hz display. Doing so would require consistent rendering times of 8.3 milliseconds or less.

Don’t make too much of the small and perhaps unexpected differences in the latency-focused metrics between the GTX 780 Ti and the R9 290X in uber fan mode. There’s enough variance in how I drive during these test sessions to account for those. We’re talking about tenths of a millisecond in frame rendering times. We can’t squeeze all of the variance out of our manual testing methods. Not with me at the wheel.

Tomb Raider





Welp, another game, another closely contested victory for the GTX 780 Ti.

Guild Wars 2




Well, this throws a wrench in the works. We’ve noted before that Guild Wars 2 appears to have an issue that causes faster video cards to exhibit intermittent frame time spikes. The faster the GPU, the larger the spikes. Look at the latency curves for the top three GeForces, and you’ll see that the 780 Ti is most affected, followed by the Titan and then the GTX 780. Those three cards also produce the most frames overall, as the FPS averages indicate. The R9 290 and 290X also suffer from this problem, but much less so. The slower Radeons and GeForces are both completely unaffected.

The impact of this problem is minimal, since the spikes aren’t that large. The GTX 780 Ti wastes only nine milliseconds above our 33-millisecond “badness” threshold. Still, you can feel some little hiccups while playing, and the slowdowns drop the 780 Ti’s 99th percentile frame time to the middle of the pack. The shame of it is that this appears to be an application-level issue with fast GPUs, not some problem with the GPU hardware or software.

Power consumption

Please note that our load test isn’t an absolute peak scenario. Instead, we have the cards running a real game, Skyrim, in order to show us power draw with a more typical workload.

Interesting. Despite the card’s 250W peak power limit, our GTX 780 Ti-equipped test system draws even more power than the same system with an R9 290X in the PCIe slot. I think we’ll add some additional workloads to our power testing in the future, so we can get a better sense of how these things vary.

Noise levels and GPU temperatures

Wow. Looks to me like there may be something to Nvidia’s contentions about the impact of GPU thermal density. Despite its higher overall power draw, the GeForce GTX 780 Ti is substantially quieter than the Radeon R9 290X—while keeping chip temperatures over 10°C lower.

Nvidia has obviously pushed the envelope a little on temperatures and fan speeds in order to extract some additional performance out of the GK110. The 780 Ti’s GPU Boost temperature limit has risen to 83°C, and its fan speed limit has risen, too, compared to the 780 and Titan. Still, those limits seem positively conservative—like, Paul-Ryan-co-authored-the-bill-with-Ted-Cruz conservative—compared to AMD’s choices for the R9 290 and 290X cards. Frankly, I hope Nvidia doesn’t push much further next time. The 780 Ti’s understated acoustic profile is fitting for a premium product.

Conclusions

Ok, it’s time to boil down our test results to one of our famous value scatter plots. As always, the best combinations of price and performance will be situated closer to the top left corner of the plot, and the less attractive ones will be closer to the bottom right.


Click back and forth between the plots, and you’ll see that the GeForce GTX 780 Ti wins the FPS performance sweeps by a nice margin over the Radeon R9 290X. In the 99th percentile frame times, the 780 Ti lands above the R9 290X but below the 290X in its noisy uber fan mode. The application-level issues we saw in Guild Wars 2 hurt the GTX 780 Ti there, perhaps unfairly. Some of the other games appear to be up against CPU performance limitations with the very fastest cards, as well. We’re talking about very small differences in the delivery of the last 1% of frames rendered, but still, credit AMD’s driver team for making some nice strides in smooth frame delivery.

Based on the totality of our results, I’d say the GeForce GTX 780 Ti has returned the single-GPU performance crown to the green team’s trophy case in a controversial split decision. Perhaps more importantly, the GTX 780 Ti achieves these performance levels while running at much lower temperatures and noise levels than the R9 290X. This truly is the finest single-GPU graphics card on the market, and it’s a very attractive overall package. If you buy your video cards based on bragging rights rather than value proposition, then the GTX 780 Ti is the card to beat. You can even make a value case for it over the $150-cheaper R9 290X if you factor in the bundled games, Shield discount, and quieter operation.

For most folks, though, forking over 700 bucks for the GTX 780 Ti will seem like madness when the Radeon R9 290 offers 90% of the performance for $300 less. Yeah, the Radeon is noisier, but I’m pretty sure $300 will buy a lifetime supply of those foam earplugs at Walgreens. Heck, throw in another hundred bucks, and you could have dual R9 290s, which should presumably outperform the GTX 780 Ti handily.

In fact, that’s what I need to test next: the Hawaii GPU’s new XDMA-based CrossFire tech. I plan to throw in a pair of GTX 780 Ti cards and connect it all to a 4K monitor, too. I have a feeling the next few weeks will continue to be unusually entertaining. Stay tuned.

Twitter implements a 140-character low-pass filter on my thoughts.

Comments closed
    • kamikaziechameleon
    • 6 years ago

    This graph is a farce at this point. Prices are so dramatically off its ridiculous!

    Nvidia for the win when you look at the price desparity that is on new egg right now.

    • deb0
    • 6 years ago

    Nvidia is not a bargain brand. Yes, I wish they would be nice and price lower, but the ugly truth is they don’t have to. Got my 2 x evga gtx 780tis a couple of days ago and BF4 on ultra is absolutely stunning. If you want cheap, AMD is your brand.

    • deb0
    • 6 years ago

    My Catleap 27″ will love these beasts! Ordered 2 of these and they will be here tomorrow and I can’t wait!

    • Klimax
    • 6 years ago

    Looks like custom coolers might help not only 290x but 780ti too.
    [url<]http://www.pcper.com/reviews/Graphics-Cards/EVGA-GeForce-GTX-780-Ti-3GB-ACX-Preview-Overclocked-GK110[/url<]

    • Jigar
    • 6 years ago

    So R9 290 (Non X) is the best buy for enthusiast. Great.

      • Krogoth
      • 6 years ago

      770 and 280X (7970GE) are more sensible buys for the higher-end.

      290 and 780 (after upcoming price cut) are the next logical step with diminishing returns.

      290X and 780Ti belong in more money than sense/want the best no matter the cost tier.

        • Klimax
        • 6 years ago

        Or I already got 4K, so something must push those pixels… (alternatively I got Crysis 3 and want to have maxed out settings no matter what)

          • Krogoth
          • 6 years ago

          4K (8 Megapixels) gaming is pointless. Not even 780Ti and 290X can deliver a playable experience with all of the bell and whistles. 4K video content is going to be rare for a while. Don’t get draw in by the hype.

          4K at this time is only good for analyzing huge images, monitoring and other professional projects that benefit from having more screen resolution.

            • Airmantharp
            • 6 years ago

            Assuming a permissive size/cost product, 4k would be preferable to a pair of 4MP 1440p/1600p monitors for many uses. Not all, but many.

            Of course, I’d prefer to go from 30″ 1600p to 40″ 4k on my desktop, but as you say, the video cards are the challenge. There is no consumer GPU that has enough memory for 4k to be comfortable yet, though I have no problem bolting a smattering of cards into my system to hit my framerate targets.

          • jihadjoe
          • 6 years ago

          Well, if you have 4k that means you were willing to spend 5 grand on a monitor which means you probably have any GPU you want.

        • Prestige Worldwide
        • 6 years ago

        GK104 and Tahiti are still good performers but I wouldn’t recommend buying either when you can get a 290 (non x) for $399, unless you are getting a good sale price like $250 for a 7970 GHz. Paying $330 for a 770 is just silly when you can be in Titan territory with a 290 for another $70.

          • Airmantharp
          • 6 years ago

          Just make sure you wait for the custom coolers :).

          A 4GB GTX770 is still appealing today given the noise profile. Not as fast, sure, but at 1080p it wouldn’t matter (yet).

          • Krogoth
          • 6 years ago

          You got to remember that video cards beyond $299 don’t sell that much in volume. The bulk of the GPU sales are in $199-299 range. 770 and 280X/7970GE are plenty fast for their price points. 770 is going to get a price cut soon like the 780, which is going to be place somewhere in $249-$299 range. AMD will follow suite as well. Only a year later you get can closeto top-tier performance for only $249-299 when it used to cost $500+. That’s pretty incredible.

    • ronch
    • 6 years ago

    So nice to see AMD and Nvidia still neck and neck in the graphics arena. Competition is a good thing, for customers, that is.

      • deb0
      • 6 years ago

      There’s no neck and neck here. The price clearly reflects that.

    • confusedpenguin
    • 6 years ago

    Winter is a coming. Nice to know I can heat a small room with this thing.

    • ronch
    • 6 years ago

    Personally, if I was out for one of these high end video cards I’d go for the 290X despite its searing hot temperatures. Also of note is how the Nvidia cooler keeps the GPU (which consumes more power according to TR’s Skyrim graph) cooler while being a bit quieter. Just goes to show how the stock 290X cooler needs replacing.

    If only the CPU industry was this exciting.

    • Mr. Eco
    • 6 years ago

    [quote<]Nvidia apparently tested about 10 different 290X cards itself and saw large clock speed variations from one GPU to the next.[/quote<] Again NVidia tinkers with competitor's products, instead of working on their own. Have they again invited several web site writers, with recommendation to write about it?

      • MathMan
      • 6 years ago

      Are you honestly surprised that multinationals like Nvidia, AMD, Intel etc have dedicated teams that do competitive analysis?

      I bet that there are plenty of Hawaii dies out there that has been subjected to a very thorough and destructive X Ray imaging session in some Santa Clara labs.

        • Mr. Eco
        • 6 years ago

        It is not about competitive analysis. It is about having semi-secret meetings with journalists, pointing them what and how to write about.

          • Klimax
          • 6 years ago

          Unlike say Qualcomm…

          It’s not like many things weren’t reported even before NVidia pointed them out. Also do you really believe AMD wouldn’t do that? Or similar like allowing best results to get out early?

          Don’t make AMD out as anything but corporation it is.

      • Amgal
      • 6 years ago

      Par for the course.

      • Klimax
      • 6 years ago

      BTW: NVidia is doing same as Intel, who has whole teams only to analysis and predict where competition will go and how well they’ll likely do. (Introduced at the same time as tick-tock release cadency as part of never again allow competition to best us)

    • Fighterpilot
    • 6 years ago

    Bleh,talk about over hyped and under performing.
    $700 for a power hungry card that barely beats 290X in any games(unless they are cherry picked Twimtbp)
    Only Nvidia suckers will buy this one.

    • BIF
    • 6 years ago

    And here we go again with ANOTHER full, complete, and comprehensive review of a new graphic card.

    Except…

    No GPGPU information, no folding information. Not even vague speculation.

    Can anybody point me to a GTX 780 TI review that contains SOMETHING about GPGPU capability? Pretty please?

      • Prion
      • 6 years ago

      Amen! Do I want R9 290(X) or GTX 780(Ti) to mine the most bitcoins while I’m not gaming?

        • BIF
        • 6 years ago

        I don’t know anything about bitcoins, but I think I want a GTX 780 Ti for CUDA. I guess it can do OpenCL too, and both of those are used by graphic renderers. Sometimes one or the other, depending on the renderer.

        On the other hand, I know that the Radeon drivers are fine for F@H. I don’t know if the Nvidia drivers are.

        So given WHAT LITTLE I CAN FIND IN REVIEWS (hint hint), I would say that the safest choice at this point is STILL an HD 7990. Dual GPU will still yield more than a GTX 780 Ti-anything for a given slot, and HD 7xxx drivers. Should work for everything I want to do.

        There is Titan, but the HD 7990 is at least $200 cheaper and probably will do better, even if it does need two GPUs to accomplish it.

        Don’t get me wrong, I’m excited about the new technology from AMD and Nvidia, but how can I make ANY determination when I can’t even find one lousy review with GPGPU info?

        I’m frustrated that nobody in the blogosphere seems to think it’s important to test all these new cards to their fullest extent; namely, their ability to run code!

        And now I’m depressed that I actually resorted to using the word “blogosphere” in a sentence. All I want for Christmas is a more complete review!

          • Airmantharp
          • 6 years ago

          Do you actually, really, want to run F@H?

          I mean, if you do, then it shouldn’t be too hard to extrapolate the increase in shader cores over the HD7970 to project performance. And that’s assuming that people haven’t already tested for F@H performance in those specific communities.

      • chuckula
      • 6 years ago

      Anand has some benchmarks: [url<]http://anandtech.com/show/7492/the-geforce-gtx-780-ti-review/14[/url<] Basically: At single precision, the 780TI is actually ahead of Titan. At double precision, the Titan destroys everybody else (including the R9-290x) and the margin is so big that the Titan actually wins the price/performance ratio in those types of workloads. Of course, those types of workloads are limited, but in that niche Titan is still the king.

        • Amgal
        • 6 years ago

        Tomshardware also has a suite of non-gaming applications in their video card reviews now.

        • Klimax
        • 6 years ago

        Which isn’t surprising since 290x doesn’t have full DP. (For that you should buy their professional cards)

          • Klimax
          • 6 years ago

          Looks like I have too some personal downvoters… 😀

            • chuckula
            • 6 years ago

            Welcome to the club…

        • Visigoth
        • 6 years ago

        Agreed. In double-precision performance, if you do not need ECC support and other miscellaneous enterprise features, the Titan is a very affordable alternative to NVIDIA’s Tesla.

    • LastQuestion
    • 6 years ago

    I’d like to see that crossfire review also bench 1080p.

      • clone
      • 6 years ago

      overkill.

        • Firestarter
        • 6 years ago

        144hz

          • clone
          • 6 years ago

          it’s still overkill.

            • Firestarter
            • 6 years ago

            How so? What if you want 4xMSAA to go with that 144hz screen whilst playing BF4? If a single 780ti or 290X won’t cut it at 1080p, then it’s not overkill. It might not be great value for money, it might be a horrendous power hog and heat the room quite a bit, but as long as it will deliver the performance it will be worth it to someone.

            I would do it if I could spend my money with not a care in the world, that’s for sure.

            • f0d
            • 6 years ago

            i agree its not overkill at all
            i dont know of many games at all that i can keep a minimum 144fps on with my dual 670’s and 5ghz 3930k

            • f0d
            • 6 years ago

            i must have some personal downvoters (like a few others here) because i cant find any proof that any 1 card can do 144fps at 1080p

            anandtechs review of the 290 tested it and the 290x at 1080p and none of the games tested could get an average of 144fps

            • clone
            • 6 years ago

            can you see a difference using 4Xmsaa or do you just want to use 4Xmsaa, do you claim to be able to distinguish a difference between 120hz and 144hz or just want to have it?

            is 144fps the new 30fps?

            • Firestarter
            • 6 years ago

            OK I’ll bite.

            [quote<]can you see a difference using 4Xmsaa[/quote<] If you can't see the difference between no AA and 4xMSAA on a 1080p screen at desktop viewing distances, you really need to get your eyes checked if you haven't already. I'm not joking, that would be evidence of a lack of visual acuity. [quote<]do you claim to be able to distinguish a difference between 120hz and 144hz[/quote<] I cannot claim to see that difference as I do not own a 144hz screen (mine only goes to 120hz) nor have I ever seen one in action. [quote<]is 144fps the new 30fps?[/quote<] If we would to setup two computers next to each other, both running a first person shooter game like BF4, assuming you're familiar with FPS games, one running at 30fps on a 60hz screen and the other running at 144fps on a 144hz screen, and you could not tell the difference between those two, I would have to assume that you have suffered a significant reduction in [i<]mental[/i<] acuity, either through age, intoxication or trauma. Note that I do not claim that this is true for all games, and I think it might not be so completely obvious to someone who is not familiar with first person shooters on the PC. That said, I'm certain that almost everyone who actively reads TR would immediately be able to tell the difference and prefer the game running at 144fps on a 144hz screen. With that out of the way, I would like to say that high frame- and refresh rates most definitely have diminishing returns. That means that for most of us, a Crossfire or SLI setup with these high-end cards is not good value for money, as the extra framerate brought by extra GPU power only brings relatively small gains for the money spent. And that is before you consider that the game might be limited by the CPU. But, in my opinion, the term overkill is reserved for situations in where the extra power would bring [i<]very little, if any benefit at all[/i<].

            • swaaye
            • 6 years ago

            Actually, MSAA is somewhat of a waste these days since it misses so much in modern games. It doesn’t even hit all polygon edges most of the time, let alone the problems caused by shader effects and alpha textures (most games don’t support alpha test AA). I often end up with MSAA disabled (if it’s even available) because the performance hit is barely worthwhile.

            AA is really a mess at this point, but I think SMAA is alright. Too bad it’s not supported more often. SSAA is amazing but is so incredibly wasteful and demanding of the GPU power we have available.

            • Firestarter
            • 6 years ago

            it still helps quite a bit though, even if it only removes half the jaggies

            • clone
            • 6 years ago

            did I say no AA vs 4XMSAA?… nope, that said I tend to be in motion in fps’s…. not stopping and looking for flaws.

            I doubt you can see the difference between 120 and 144…. silly to be honest and certainly not worth TR dedicating a cppl days towards the endeavor.

            your 3rd response is….. flawed in every way.

            1st you lack understanding, 30 is a minimum playable level, your singular focus on 144 is…. flawed, hence I mentioned is 144fps the new 30fps? I’m not commenting on differences but instead on your request to have Crossfire and SLI benches for 1080p….. something those setups has so outgrown as to make the effort pointless. their is a tendency for ppl to underestimate those they are talking to…. yes you would be able to discern between 30fps and 144fps… but could you discern between 72 and 144?…. maybe but are you really sure and more notably would you consider the jump from 72fps to 144fps worth triple the price in gfx and psu to power them?

            you focused only on 144fps and even ignored 120fps like it would be life altering if it was anything less, then you added in “well if I didn’t have a care in the world and unlimited access to money I’d do it”….1st off no you wouldn’t, you go higher resolution and more displays because you don’t have a care in the world and more money than you can handle……. but ignoring that …. “I’ll bite”, so FireStarter how large do you believe the “I have unlimited access to money and not a care in the world, but…….. I only wants to game at 1080p” demographic is?

            • Firestarter
            • 6 years ago

            [quote<]did I say no AA vs 4XMSAA[/quote<] What else did you have in mind, 2xMSAA? Besides, motion amplifies aliasing rather than hiding it. [quote<]I doubt you can see the difference between 120 and 144....[/quote<] Yes and I'm not sure either. My points are equally valid for 120hz though. [quote<]your 3rd response is..... flawed in every way[/quote<] So is your retort. I replied to your comment on Crossfire and SLI being overkill at 1080p with the argument of 144hz displays. Maybe I should have elaborated, but considering that the typical refresh rate is 60hz, I figured the argument of 120hz displays was implied since it's also a sizable jump over what is typical and what was used in this test. The crux of my argument is that, although 60fps on 60hz screens is enough for many, [i<]some[/i<] gamers will want higher framerates on their displays with higher refresh rates. For those gamers, the additional performance beyond 60fps has more worth than for the typical gamer, hence the value for money proposition is different for them and they might consider an SLI/Crossfire setup to be worth their money even when [i<]you[/i<] think it's overkill at 1080p. As for why test it at 1080p, well, where are the 2560x1440 that support 120hz or 144hz? Nowhere to be found. That is why Crossfire/SLI testing at 1080p is relevant, not overkill. As to how large the demographic is and whether TR should want to cater to it, I don't know. All I'm saying is that I would be part of it if I didn't have another expensive hobby.

            • clone
            • 6 years ago

            focusing on gameplay minimizes the focus on the edges of images especially when at max details with single gpu pushing 60+ fps at 1080p.

            2nd part: not really, the question now becomes can you tell the difference between 80fps and 120fps because again…. I doubt it, I highly doubt it and once you are down to 80fps you are surprisingly close to single gpu performance at max details at 1080p.

            3rd response, I don’t disagree that some either have or are looking to buy 120hz screens, I’m one of them, I also don’t disagree that more frames is better…… but (and you know nothing matters before the but) single GPU’s today are solid with pushing 60+ frames at 2560 X 1440….. the “1080p answer” is in front of you……. their is no need, desire or interest in Crossfire or SLI setups just for 1080p….. .[b<]the setup is both pointless and overkill.[/b<] as to your closing statement, the "typical Crossfire and SLI user" is not looking to waste money playing at 1080p, they specifically went with the option for 2k and or 4k gaming. the reality in this discussion is that you aren't in the market for the option you are pushing yet for some odd reason you are trying to pretend their is a mass of individuals just dying to buy 2 GPU's to Crossfire or SLI just so they can play at 1080p levels of resolution.

            • Firestarter
            • 6 years ago

            [quote<]focusing on gameplay minimizes the focus on the edges of images[/quote<] If gameplay were all that matters then nobody would want a 290X or 780ti. Graphics matter, and therefor anti-aliasing matters. [quote<]can you tell the difference between 80fps and 120fps because again.... I doubt it[/quote<] I most definitely can tell the difference. [quote<]the "1080p answer" is in front of you....... their is no need, desire or interest in Crossfire or SLI setups just for 1080p..... .[b<]the setup is both pointless and overkill.[/b<][/quote<] Yeah, well, you know, that's just, like, your opinion, man. [quote<]the reality in this discussion is that you aren't in the market for the option you are pushing yet for some odd reason you are trying to pretend their is a mass of individuals just dying to buy 2 GPU's to Crossfire or SLI just so they can play at 1080p levels of resolution.[/quote<] I did not say that. I'm done arguing with you.

            • clone
            • 6 years ago

            then how could ppl play at 4k let alone 2k resolution without high end cards? which would you prefer? Gaming at 1080p with 4XMSAA or to game at 2560 X 1440 without MSAA? you do realize you could buy a nice new higher quality 2k display with the money saved from that 2nd gfx card.

            2nd: nah, doubt it, not in any meaningful way.

            3rd: not my opinion, by a huge margin, most of the interested worlds, it’s why it’s not bothered with.

            4th: true that, what you actually said was you personally aren’t in the market for Crossfire or SLI to play at 1080p, you also hinted that the only market that would be interested in gaming with Crossfire or SLI at 1080p on the newest cards would be the “money no object and no care in the world” demographic.

            • Firestarter
            • 6 years ago

            I get it, you’re absolutely right on all counts.

            • clone
            • 6 years ago

            of course I…… aaaaaawwww, I get it…. shame on you 🙂

            that said seriously which would you prefer?

            2560 X 1440 full resolution or 1920 X 1080 with 4XMSAA?

            according to the numbers running 2560 X 1440 is easier for a gfx card than running 1920 X 1080 AA enabled. (4 games compared at 2560 X 1440 v 1920 X 1080 with AA enabled)

            just be honest, which would you prefer?

            • Jason181
            • 6 years ago

            Says the guy that doesn’t have a high refresh rate monitor.

            • clone
            • 6 years ago

            says the guy who’s owned multiple high end video cards, has had multiple 120 refresh displays and 60 hz displays, has a cppl of 120 refresh displays now and has also owned multiple high end cpu’s over the years.

            fixed your response for you.

      • vargis14
      • 6 years ago

      Agreed 1080p should be tested and not just for high refresh rate monitors for regular 60Hrtz monitors as well

      Steams October report shows 1920 x 1080 Primary display resolution taking up 33% of all resolutions.
      With that kind of % of users it is just plain wrong not to test 1080p especially since most every high refresh rate monitors is 1080p @ 120-144Hrtz.

      I imagine the the desktop only % is much higher then the mixed computer type of 33%. Probably on the order of 50% when you do not include laptop/notebook displays ETC.

        • JustAnEngineer
        • 6 years ago

        I believe that I can understand why Damage isn’t spending a ton of his limited time testing low resolutions.

        If you’re going to limit yourself to 1920×1080 = 2.07 million pixels, why would you bother with a $700 graphics card? A $400 graphics card is more than fast enough for that resolution. Even a $200 graphics card should suffice. The test results for the high-end cards would be pretty boring if the charts consistently showed 0% of frames with any latency of concern.

        Once you’ve gone to a 2560×1440 = 3.7 MP or 2560×1600 = 4.1 MP or 3840×2160 = 8.3 MP display, you’re not going to want to go back to gaming at low resolution.

          • derFunkenstein
          • 6 years ago

          Yeah I’m thinking, a 760 or a 280X would be overkill for my 1080p display.

            • Klimax
            • 6 years ago

            Unless you want to push some games to absolute max. Give Crysis 3 max settings including AA and watch… (Or Arma 3 and like)

            • f0d
            • 6 years ago

            yeah i have troubles with those games with my dual 670’s and just 60fps at ultra+ settings

            also as firestarter said 144hz is hard to keep up with (144fps) in almost any new game at 1080p – i havnt found many (except for older ones) that i can keep a minimum 144fps with on my heavily overclocked gtx670’s and 5ghz 3930k

          • Bensam123
          • 6 years ago

          For more FPS and a smoother experience. The closer you get to 144fps with a 144hz monitor, the more you get out of the monitor.

          I haven’t seen any super high refresh rate TN panels above 1080p… I would’ve bought one had there been one. Having gamed at 144hz, I wouldn’t go back to 60hz regardless of the amount of pixels or color attributes (IPS).

            • f0d
            • 6 years ago

            i have tried 2650×1600 IPS monitors but imo they wasnt as good as a 144hz lightboost for gaming (for professionals they might be better)

            when i saw my friends 144hz lightboost monitor about 2 months ago i instantly fell in love with them and before i saw them for myself i actually thought it was just a gimmick

            so i sold my old monitor got an asus vg278he and i couldnt be any more happier with it 🙂

            • Bensam123
            • 6 years ago

            Yup… It’s one of those things you can’t believe till you see it. I still have a 60hz monitor sitting next to my 144 and even for watching movies it’s simply a better experience. I would definitely like a 4k IPS panel with the refresh rate and response time of a TN gaming panel, but they don’t exist. Chances are most gamers will be going the high refresh route instead of the high resolution, IPS route.

          • travbrad
          • 6 years ago

          I understand the reasoning to some extent, but on the other hand I bet there are just as many people with 120hz/144hz 1080p monitors as there are people with 1440p/1600p monitors. So a 1080p test would have as much real-world relevance as testing at 1440p does.

          Not everyone has decided 1440p is better than doubling your refresh rate when it comes to gaming. They each have trade-offs.

            • Bensam123
            • 6 years ago

            I’d say the majority of people who play games and would actually care about FPS would be those that are more likely to purchase high refresh rate displays over high resolution IPS panels. High refresh rate displays are definitely becoming very common in gaming circles as people learn about them.

            • Airmantharp
            • 6 years ago

            Well, cheap TNs have been well represented in gaming circles since they’ve been available- slightly more expensive TNs with faster refresh rates are certainly appealing to said crowd :).

            For those of use that use our systems for actual work, though, higher-resolution panels with far better performance are still king. And remember, they can see further than you :D.

            • Bensam123
            • 6 years ago

            Sure, but one could question testing for crowds that casually play games (if at all) verse play games all the time… Which is closer to the target base that would purchase and use such hardware?

            • travbrad
            • 6 years ago

            [quote<]For those of use that use our systems for actual work, though, higher-resolution panels with far better performance are still king. [/quote<] I agree, but we are talking about a $700 graphics card here, and not many people buy $700 graphics cards for "actual work". The question is how many of these people have 120hz+ displays versus how many have 1440p+ displays? If there are a similar number of each then it would make sense to test for both cases. Anyone who can afford to spend $700 on a graphics card can probably afford BOTH a 1440p IPS display and a 120hz 1080p display too, and get the best of both worlds.

            • Airmantharp
            • 6 years ago

            Nope, graphics cards for actual work cost quite a bit more than $700 :D.

    • blitzy
    • 6 years ago

    Just waiting for a 290 with a good third party cooler to popup and we have a deal.

    Honestly I would rather buy a 780 due to lower power consumption, but they’re overpriced

    • End User
    • 6 years ago

    3GB on a $700 card in the era of 2560×1440+ displays is just terrible. I’m already seeing games approach 3GB when I play at 2560×1440 on my 4GB card. This thing should have 6GB.

    • DeComposer
    • 6 years ago

    My sincere thanks, Scott, for an even-handed comparison of the competing flagship cards. I commend you on your methodology and empirically derived conclusions.

    My own preference has shifted several times over the years. Though I’ve been harboring a grudge (I lost a favorite gaming laptop to a soldergate-era chip failure), I trust that Nvidia has put its process failures behind it.

    I greatly look forward to your upcoming SLI/Crossfire article.

    • Klimax
    • 6 years ago

    BTW: On page 2 under “thermal density” you have precise reason why OC on Intel’s chips is getting more limited. (Process fine tuning for mobile is another, but even then you cannot beat physics)

    • cynan
    • 6 years ago

    Nvidia. The way it’s meant to be cooled.

    AMD. The way it’s meant to be priced.

    Happy medium, anyone?

      • Firestarter
      • 6 years ago

      So what is a 290 with an Artic Cooling Extreme strapped onto it then? I mean, that cooler isn’t exactly cheap, but I bet it would work wonders for the 290 and 290X.

        • cynan
        • 6 years ago

        [quote<]So what is a 290 with an Artic Cooling Extreme strapped onto it then?[/quote<] And arguable happy medium that doesn't currently exist in stock form (ie, without doing the work yourself and voiding your warranty in most cases)? But yeah, i'd definitely be looking into aftermarket cooling if I found myself with a 290 as it comes now. Sadly, I pretty much see it as a necessity, rather than an enthusiast option (as it should be).

          • jihadjoe
          • 6 years ago

          Custom versions of the 780ti and 290x will be very interesting, as both are hobbled in their stock forms.

          780ti needs more power.
          290x needs more cooling.

      • jihadjoe
      • 6 years ago

      I prefer to mix up their slogans:

      AMD: The way it’s meant to be priced.

      Nvidia: Never settle (for a crappy cooler).

        • cynan
        • 6 years ago

        You know, I think that [i<]is[/i<] better. ;-D

      • Great_Big_Abyss
      • 6 years ago

      you mean: AMD. Pricing Evolved!

      Gaah, JihadJoe beat me to it….kinda. Doh.

      • brucethemoose
      • 6 years ago

      AMD AND Nvidia should put you in charge of marketing.

      • PopcornMachine
      • 6 years ago

      That would be nice, but they’ve got to jam us somehow.

      But then, you can fix the cooling on AMD and still be cheaper than NVidia.

      Better that they gave us both though. In the end, I bought a 290.

      • anotherengineer
      • 6 years ago

      Matrox. The way the 3rd wheel is meant to be.

      😉

    • NeoForever
    • 6 years ago

    [quote<]As always, the best combinations of price and performance will be situated closer to the top right corner of the plot, and the less attractive ones will be closer to the bottom left.[/quote<]

    Anyone notice this used to be different? Is this a typo? I thought it should be best price/performance on top-left and less attractive ones on bottom right.

    Edit: To me top-right is just more expensive and better performance and bottom-left is cheaper and lower performing... which doesn't say anything new.

      • Milo Burke
      • 6 years ago

      I was about to comment on this, but you beat me to it. Fix it, Scott! =]

        • superjawes
        • 6 years ago

        I have a feeling Scott finished writing up this review, published it, then promptly passed out.

        Quick! Someone air drop him some coffee!

          • Milo Burke
          • 6 years ago

          He’s a day late, but I’d [b<]much[/b<] rather read it here even three days late than at Tom's Hardware, for example. Keep up the good work, Scott!

      • Damage
      • 6 years ago

      Doh, my bad. Fixed!

        • Milo Burke
        • 6 years ago

        Get back to your coma! You need the rest!

    • drfish
    • 6 years ago

    As someone who bought a 780 in May, I’m actually feeling pretty good right now. Of course I would have liked to spend less, but at the time it was a solid high-end offering, and compared to the 290 the performance is good with superior thermals and acoustics. Given the choice between the 290, 290X, 780 and 780 Ti at their current prices, I honestly think I would still go for the vanilla 780. At $50 less it would be a no-brainer. I think the build quality is worth it until AMD’s vendors start offering better cooling solutions.

    • weaktoss
    • 6 years ago

    Minor typo:
    [quote<]You can image that Nvidia has been sorting its GK110B chips...[/quote<]

    I would image that that should read "can imagine."

    [quote<]I continue to be, er, a fan of the swanky aluminum-and-magnesium cooler[/quote<]

    I, too, continue to be [i<]blown away[/i<] by that cooler.

      • Amgal
      • 6 years ago

      You punny.

    • geekl33tgamer
    • 6 years ago

    Love how everyone’s skipping over all the technical aspects of either the 780 Ti or R9 290X, and focusing 100% on an over-exaggerated cooler war.

      • willmore
      • 6 years ago

      You skipped over the technical issues you’re accusing people of skipping over.

    • flip-mode
    • 6 years ago

    Were I interested in any of these, I’d get a 290X and use the savings on a water cooler, or I’d simply realize that with headphones or speakers on, I don’t notice the 290X cooler anyway.

    Having said that, AMD is hopefully properly chastised regarding acoustic performance criteria, and will be responsive about it.

      • pohzzer
      • 6 years ago

      If aftermarket cards are released in the next couple of weeks, on top of the incipient in-depth Mantle reveal and benchmarks and the in-depth Kaveri reveal and benchmarks, ‘coolergate’ is likely to rapidly fade away.

        • HisDivineOrder
        • 6 years ago

        pohzzer? spigzone! IS THAT YOU? Thought I saw that over at hardforums.

        Haha, I did:

        [url<]http://hardforum.com/showthread.php?t=1789813[/url<]

          • superjawes
          • 6 years ago

          It sounds like him. Rule #2: [s<]Double[/s<] Triple Tap!

            • jihadjoe
            • 6 years ago

            You cannot kill that which has no life!

      • Farting Bob
      • 6 years ago

      The standard 290 is easily the best-value high-end card right now; it’s not even that close. Pretty much an unnoticeable performance difference at hundreds less than its rivals.

        • flip-mode
        • 6 years ago

        Truetrue. I should have said the r9-290.

      • Kaleid
      • 6 years ago

      “Having said that, AMD is hopefully properly chastised regarding acoustic performance criteria, and will be responsive about it.”

      That will probably take years. Some of us have been criticizing the default coolers for years… they really are not efficient enough for high-performance graphics cards.

    • pohzzer
    • 6 years ago

    “…you could have dual R9 290s, which should presumably outperform the GTX 780 Ti handily”

    Seems preposterous on the face of it. I’d have to see hard numbers to make that jump.

      • HisDivineOrder
      • 6 years ago

      spigzone, spigzone, spigzone… why? Just why?

      [url<]http://hardforum.com/showthread.php?t=1789813[/url<]

      EDIT: I remembered your handle for two reasons. 1) Everyone on a completely different forum was having the same reaction the comments section here had: "spigzone is under another name, but we all know it's him!" 2) pohzzer = poser? I thought to myself when I saw it at hardforums, "Heh, at least he didn't try to pose on TR." Imagine my surprise at you being here. You poser you.

        • pohzzer
        • 6 years ago

        Do you believe in lizard overlords?

        • ClickClick5
        • 6 years ago

        Scott….SCOTT!!! He is back. Confirm!

          • Meadows
          • 6 years ago

          At least he’s consistent.

    • shank15217
    • 6 years ago

    I’ll say it again: AMD’s Hawaii needs a die shrink to shine.

      • HisDivineOrder
      • 6 years ago

      I’m kinda hoping that if they go to the trouble of a new process they don’t give us more Hawaii. Plus, that’d make the heat produced per unit of die area even worse if they don’t get their leakage under control. They’d need a better cooler that could get rid of heat in a small space to deal with that.

      Better if they go back to the drawing board, I think. Damn shame that they’re anchored to their GCN architecture for years now, what with Mantle making it essential.

        • derFunkenstein
        • 6 years ago

        They had VLIW5 units for a very long time, too, from the 2900XT up through most of the 6000 series. I think AMD views this as a maturation in a way. Just like they’re sticking with Bulldozer-ish CPUs for a long time, they’ll stick with GCN-ish GPUs for a while too. That’s a big part of why those 4000, 5000, and 6000 cards were a big hit – the drivers matured over time.

          • the
          • 6 years ago

          Actually, the Radeon 3000 and 4000 series each added something over the previous generation in terms of hardware. The big thing is that AMD dropped the horrid ring-bus-based memory controller early on and went back to a conventional crossbar topology. Not only did this improve performance, it also reduced power, as memory controllers that were not being used could go into a low-power state (effectively impossible with a ring-bus design).

          I do think we’re seeing a bit of maturing with regards to GCN. HSA is around the corner, with a well-laid-out road map. Crossfire has dropped the bridges and moved to pure PCIe.

          I would throw out a prediction that AMD’s future server CPUs and select GPUs will do the daring thing and share a common socket to exploit HSA in HPC workloads. Unfortunately, AMD seems to be throwing in the towel in the server market, since Bulldozer made them take a step backwards there.

        • shank15217
        • 6 years ago

        You realize that a die shrink will bring power benefits right?

          • clone
          • 6 years ago

          AMD and Nvidia use the same process tech… zero net gain.

        • clone
        • 6 years ago

        AMD copying Intel’s tick-tock product launches would probably decimate Nvidia the same way it decimated AMD on the desktop… it’s a success story you should consider before saying AMD should throw the baby out with the bathwater.

        The R9 290 has been tested using a custom R9 280 cooler; the GPU peaked at 63°C under full load.

        Given that the R9’s power consumption is almost identical to the Ti’s, and its performance scales surprisingly well with each rise in MHz, I don’t understand why people want it gone, or why people are so determined there is something wrong with it.

      • willmore
      • 6 years ago

      Actually, the math tells us that the die is already too small for the power it uses. That’s what the nVidia FUD slide is talking about. The more power you dissipate per mm², the harder it is to keep cool. Since the TSMC 20nm generation isn’t likely to decrease power consumption, any shrink will only hurt the power/area situation for AMD on this design.

        • Herem
        • 6 years ago

        According to TSMC, their 20nm process technology can provide 30 percent higher speed, 1.9 times the density, or 25 percent less power than their 28nm technology.

        As TSMC are going into mass production in a couple of months’ time, I would hope they know what they’re talking about.

        [url<]http://www.tsmc.com/english/dedicatedFoundry/technology/20nm.htm[/url<]

          • mesyn191
          • 6 years ago

          You’re not saying anything that contradicts willmore though, since those are all “or” claims. You won’t get all of those benefits at once with a die shrink on their process. You’ll have to choose one, or settle for some compromise between them.

          Trade offs, trade offs, trade offs.

            • Herem
            • 6 years ago

            I didn’t claim the new process would deliver all of those benefits either; it was Willmore who claimed a die shrink would not change power usage, and he didn’t mention anything about clocking higher.

            If an existing design is shrunk to the 20nm process at current clock speeds, the power consumption will drop fairly noticeably.

            • mesyn191
            • 6 years ago

            Power consumption isn’t power density though. You guys are talking past each other…

            • Herem
            • 6 years ago

            I’m aware power consumption and power density are not the same thing; however, as the die size and power consumption are both being reduced, the density will not change significantly.

            If 20nm really offered no benefits and only made things worse, do you really think all of the foundries would be investing so much to migrate to an inferior solution?

            • mesyn191
            • 6 years ago

            If you reduce die size more than you reduce total power consumption, then power density will still go up.

            Generally speaking, power density is increasing with each new process shrink for high-end CPUs/GPUs. That is part of the reason why they tend to be so heat/power limited.

            20nm offers benefits, but it’s no longer the clear-cut win-win-win scenario that previous shrinks were. As processes continue to shrink, the benefits will require even more trade-offs, which will muddy the issue even further. Most likely you’ll see AMD/nV stop trying to push for the latest/greatest process and do more to make the most of the then-current or “old” processes.

            To some extent that is already happening now.
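
            To put rough numbers on that, here is a minimal sketch, assuming a straight shrink and using only TSMC's "1.9x density or 25% less power" marketing figures quoted upthread as a hypothetical best case; the function name and scenario are mine, purely for illustration, and none of this is a measurement of any real GPU:

            # Minimal sketch of the "shrink the die more than you cut the power" point above.
            # Inputs are TSMC's advertised 20nm "or" claims, treated as a hypothetical best
            # case for a straight shrink -- not measured figures for Hawaii, GK110, or any
            # other real chip.

            def power_density_ratio(area_scale, power_scale):
                """New power density relative to old: (P_new / P_old) / (A_new / A_old)."""
                return power_scale / area_scale

            area_scale = 1 / 1.9   # die area falls to ~53% of the original
            power_scale = 0.75     # total power falls by the full 25%

            print(power_density_ratio(area_scale, power_scale))  # ~1.43, i.e. ~43% more W/mm^2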

        • Airmantharp
        • 6 years ago

        I liked that FUD slide. Always fun when a company uses its design disadvantage as a marketing point!

        Huzzah, our giant GPU has stupid amounts of gaming-specific circuitry and compute specific circuitry, and if we only use it for one of those purposes, it’s actually large enough to dissipate the heat!

        As if having a larger die was actually a market advantage, yields being equal.

          • Klimax
          • 6 years ago

          Easier to get rid of heat. See Intel’s CPU…

      • jihadjoe
      • 6 years ago

      Hawaii is already smaller than GK110, a die shrink would just increase the thermal density even more.

      My guess is it will end up just like what happened to Intel when they went from Sandy to Ivy and Haswell.

        • Airmantharp
        • 6 years ago

        Where they cheapened out on the TIM?

        If the thermal density of a smaller die is a problem, a better cooling solution is all that’s needed.

      • BlackDove
      • 6 years ago

      For even higher power density and hotspotting?

    • vargis14
    • 6 years ago

    For those not wanting to build a custom water loop for a 290/290X:

    I would like to see whether a 290/290X cooled by an Accelero Hybrid from Arctic Cooling improves performance, since a lot of a GPU card’s heat comes from the VRMs and memory chips. The 120mm rad on the Accelero Hybrid would cool the GPU core, while its near-silent fan and smallish aluminum heatsinks would cool the VRMs and so on. I am pretty positive that it would keep the GPU core well under 94°C… as for the VRMs, I cannot be sure. But it is available now.

    One thing is for sure: once we see the cooling results with the Accelero Hybrid on a 290, at $400 for the card plus $125 for the Accelero Hybrid you’d still come in about $175 under the 780 Ti. If the VRMs and other components stay cool along with the GPU core, I would go with the 290 any day, since the performance is so close and an Accelero Hybrid-equipped 290 would be quieter.
    Now someone go get an Accelero Hybrid and let us know :)

      • Tar Palantir
      • 6 years ago

      Already done.

      Toms hw did it in their review.

      [url<]http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-19.html[/url<]

        • pohzzer
        • 6 years ago

        Looks like an Accelero Extreme 3, not the Hybrid.

          • Tar Palantir
          • 6 years ago

          My bad, I misread that.
          Still, OCing with the fan cooler gives you +12% FPS. Not bad. And it’s so quiet!

            • pohzzer
            • 6 years ago

            True and the water cooled Hybrid should up that a fair amount.

    • Tar Palantir
    • 6 years ago

    Well that was…surprising!
    I expected this card to wipe the floor with the R9 290X (you’d expect nothing less for that price).

    The R9 290 IS the winner here!
    But I wish people would just stop with the waterblock talk already!
    Has EVERYONE forgotten that VGA Air Coolers are a thing?
    Arctic and Gelid offer stand-alone coolers from $40 (dual-fan) to $75 (triple-fan). Slap them on the R9s and they will still be cheaper than even the 780-without-Ti, cool, quiet and finally overclockable.

      • willmore
      • 6 years ago

      I agree with the first half of your post, but WRT water cooling, if you already have it, it doesn’t take much effort to add another tap in the loop to cool the GPU. And a GPU water block isn’t much more than an aftermarket air cooler.

      It’s sort of a “if you have a hammer, everything looks like a nail” problem. You don’t own a hammer and that’s fine. Some of us do. 🙂

        • JustAnEngineer
        • 6 years ago

        The corollary to your “if your only tool is a hammer, all of your problems start to look like nails” aphorism is “if you give a small boy a hammer, [b<]everything[/b<] needs pounding."

    • DeadOfKnight
    • 6 years ago

    Haha, this is what I was waiting for, but now that it’s come so late I’m probably just gonna wait for Maxwell.

    • puppetworx
    • 6 years ago

    It’s extremely close on frame variance, power consumption and FPS. Green definitely wins the sound levels test though, I guess that cooler really is as good as it looks.

    I wonder how the Crossfire/SLI tests will turn out with two R9 290’s costing just $100 more than a single GTX 780 Ti. If you’ve already got a water-cooled rig it could be a no-brainer. The R9 290 is looking extremely strong at this price both for the water-cooled enthusiast crowd and the bang for buck enthusiast crowd.

    • Klimax
    • 6 years ago

    I guess we see the full price of an (almost) fully unlocked GK110-B. (I use that last variant of the name as it is less prone to misreading, and I saw it more often on chips.) That power consumption is high.

    Pity there’s no Arma III; that could be interesting. (Also, with some care one could get a fully repeatable benchmark.)

    Also, I’d sometimes like to see a much lower resolution with absolutely maxed-out settings in games like Crysis 3 (1080p or 1280×1024).

    • Meadows
    • 6 years ago

    See, [i<]this[/i<] is how you make a high-end GPU and cool it, AMD fanboys. The only redeeming feature of the R9 290X is the lower price, allowing a potential buyer to use the extra cash to get a better GPU cooler. However, that requires some know-how and I don't know whether it voids the warranty, so that's a pretty murky redeeming feature.

      • BlondIndian
      • 6 years ago

      Shiny cooler + less fan noise + 10% more perf = 175% of the 290’s price.
      Makes a ton of sense now that you have pointed it out… A great deal! </sarcasm>

      I would pay $60-70 more than stock for a great cooler, but the 780 Ti is plain overpriced. The 780 at least makes some sense at its current price if you like PhysX or CUDA. The 780 Ti is a joke at $699.

      [quote<]There are no bad GPUs, Only GPUs at a bad price[/quote<]

    • madgun
    • 6 years ago

    What about overclocking results?

      • Klimax
      • 6 years ago

      Unless something has changed, I expect the best results from playing with fan speed. (And under Gigabyte OC you don’t even need to touch base/boost clocks, as it will more or less ignore them.)

      Note: Based upon my experience with Titan (original rev. which can go up to 1046MHz on stock V)

    • Arclight
    • 6 years ago

    It was interesting how it traded places with the R9 290X in different titles in the frame-time metrics. The performance it sports will certainly justify the price for the people who intend to install water cooling and OC even further; for the rest of us, a custom-cooled R9 290 will bring us very close to the stock GTX 780 Ti at a much lower price.

    One thing I did notice, though, and it made me let out an “OHHHHHHH,” was the power consumption at load. I knew from the start that it was a non-issue for this segment of the market, but what will people say now, when they moaned so much about the R9 290X’s power consumption?

    • albundy
    • 6 years ago

    The R9 290 is $300 cheaper? ROFL! There goes NV’s next-quarter earnings. You can’t seriously think an FPS or two is worth that much more.

      • Klimax
      • 6 years ago

      No.

      ETA: For too many reasons. Hint: What is largest market by income/profit for NVidia.

      • BlondIndian
      • 6 years ago

      You are forgetting brand value.
      A lot of guys bought a $1000 Titan for gaming. Then they did SLI Titans.
      This will sell even if the 290 matches its perf within 10% for half the price.

      • travbrad
      • 6 years ago

      Nvidia continued to sell twice as many cards this entire last generation despite AMD having slightly better deals on their cards overall (especially when you factored in the “never settle” bundles). For whatever reason Nvidia still has a lot more brand loyalty and name recognition.

      Neither company sells a ton of these high-end cards anyway, and it’s easy to see why when you consider only 1% of Steam users have a display above 1080p or 1200p. They do have good profit margins on the high-end cards, but the real value of the high-end cards is as a flagship/halo product that makes less informed consumers think “Nvidia has the fastest card, so I’ll buy this $150 Nvidia card”

        • JustAnEngineer
        • 6 years ago

        [quote=”H.L. Mencken”<] No one in this world, so far as I know — and I have searched the records for years, and employed agents to help me — has ever lost money by underestimating the intelligence of the great masses of the plain people. Nor has anyone ever lost public office thereby. [/quote<]

    • Bensam123
    • 6 years ago

    lol, foam ear plugs indeed. XD

    I’m not sure how I feel about Nvidia waiting this long to fully deploy its Kepler tech. It makes me wonder if there was more to it than yield issues, like milking their tech for as long as possible. I suppose if you have such ‘reserves’ it makes sense, but I don’t think such behavior is conducive to constantly pushing things forward. Sure, it gives AMD a chance to catch up… but what would’ve happened if AMD didn’t push them to release this? We may have been sitting at current performance levels for quite some time.

    “AMD’s newest graphics card lacks: a clearly stated base or minimum clock frequency that acts as a guarantee of performance. For the 780 Ti, the base clock is 876MHz. Unless something goes entirely wrong, the GPU shouldn’t run any slower than that during normal use.”

    Aye… I assume they’ll eventually add that in a driver update, where the fan will ramp up if the clock drops too far due to heat, but it’s silly they overlooked such a thing in the first place.

    I dig the comparison table. That should help newbies that don’t know how to compare the 7xxx and the R9 series.

    For your XDMA writeup, have you guys considered pairing up older-gen midrange cards to take on some of the new stuff in addition to the XDMA testing? So people could compare adding an additional video card versus buying outright new shininess. Like the 7850, 7870, or 7950 vs an R9 290 or 290X. If people already own one of the above cards, they could offer a better value… in some cases a much better one (such as 2x 7870s, if that performs on par with an R9 290). Of course this all depends on how well they test out.

    Ideally, adding in Nvidia SLI too would be a great idea, such as the 660 or 670 (and/or Tis depending on price points) and then comparing them to a 780 or 780 Ti. You could knock out two birds with one stone.

    Good review. ^^

      • BlackDove
      • 6 years ago

      The older generation AMD GPU’s still don’t work in CrossFire half the time, since their frame metering is only in software.

      The new XDMA CrossFire is better, but it’s not nearly as good as SLI for frametimes. PC Perspective already did those benchmarks like a week ago.

        • MFergus
        • 6 years ago

        Does anyone have an idea if Nvidia will have to go to an XDMA-like solution eventually and get rid of the SLI bridge due to bandwidth constraints? AMD’s solution is pretty solid for a first effort, and it’s future-proofed bandwidth-wise at least.

          • BlackDove
          • 6 years ago

          The bridge is just for synchronization as far as I know. It lets the two GPUs communicate with low latency, rather than using the PCIe bus, so it would probably be a disadvantage to remove it.

          • Bensam123
          • 6 years ago

          I’m guessing they will. Nvidia usually makes competitive solutions and then lock them down once they’ve been beat in a certain area. I’m sure even if AMD makes a solution that makes cross vendor cards compatible, Nvidia would still go for a proprietary solution and then claim some sort of weird benefit simply so their cards can’t work together.

          • the
          • 6 years ago

          nVidia already has an IOMMU on their Kepler-based cards. In fact, the Quadro and Tesla lines can communicate directly with other PCIe cards. For HPC applications, this is used to quickly send information off to an InfiniBand card in the same system.

          For gaming cards though, it is used only for SLI.

        • Airmantharp
        • 6 years ago

        I’m still waiting for TR’s review, but I did go look up what PCPer had to say. Apparently the 290 series doesn’t have the dropped frames fix for DX9 and could use some frame-time adjustments, but otherwise, it’s damn fast at 4k resolutions.

          • BlackDove
          • 6 years ago

          It leads me to believe that their frame metering is still done in software though. Can someone from Tech Report confirm that?

            • Bensam123
            • 6 years ago

            If it’s done in software or hardware, what does it matter as long as it works correctly?

            • BlackDove
            • 6 years ago

            Doing it in software means overhead.

            People are also assuming that removing the bridge was a better solution than having one for synchronization. It really seems like it’s not, and they did it because they’re doing the frame metering in software, and don’t need the synchronizing data to be passed from card to card, separately from the majority of the data going back and forth through the PCI-E lanes.

            It also seems to work much better when done in hardware (as pretty much everything does).

            If you look at the R9 290X benchmarks for CrossFire, it’s not as bad as the previous-generation GPUs, but it’s nowhere near as good as SLI (which uses hardware frame metering).
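
            For anyone wondering what "metering in software" even means, here is a toy sketch of the general idea: pace presents from the CPU side so frames reach the display at an even cadence. The function and stubs below are hypothetical illustrations only; they have nothing to do with AMD's or Nvidia's actual driver code.

            import random
            import time

            def paced_present(render_frame, present, target_interval, frames=5):
                """Toy software frame-metering loop: render as fast as the GPU allows,
                but delay each present so frames go out at an even cadence. The pacing
                work (timing, sleeping) runs on the CPU, which is the overhead being
                discussed above."""
                next_present = time.perf_counter()
                for _ in range(frames):
                    frame = render_frame()                # stand-in for GPU work
                    now = time.perf_counter()
                    if now < next_present:
                        time.sleep(next_present - now)    # CPU-side pacing delay
                    present(frame)                        # stand-in for the buffer flip
                    next_present = max(now, next_present) + target_interval

            # Hypothetical usage: "rendering" takes a variable amount of time, but
            # presents come out roughly every 1/60th of a second.
            paced_present(lambda: time.sleep(random.uniform(0.005, 0.020)),
                          lambda f: print(f"present at {time.perf_counter():.3f}s"),
                          target_interval=1 / 60)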

        • Bensam123
        • 6 years ago

        Hmm… guess I’ll have to wait for TRs benchmarks. From what I saw last time they did a review on frame metering it was all fixed.

        Not as good isn’t the same as broken though.

    • Blink
    • 6 years ago

    After reading through TR’s review, as well as a number of other sites’ reviews, I am disappointed. Only in the truest sense of the word. Maybe my expectations were too high; regardless, the Ti did not meet them. However, the 290 didn’t meet my expectations either. The thing I see here is both outfits left 3rd parties to finish the job for them. The Ti needs more memory and the 290 is obviously too loud. Both of which can be fixed 3rd party. Unfortunately, what 3rd parties won’t fix is the retail price of the Ti.

    In addition to what I said above: if you look at Anand’s review (I expect it will be verified by TR in their upcoming review), they compare the XFX 280X DD ($310) in CrossFire to a single 780 Ti. The result is that, with the exception of COH:2, the 280Xs take it at the resolutions the Ti was designed for. Which is what really makes me question the VRAM. And SLI scaling is horrible for the Ti, though I imagine it can only get better with improved drivers. It’s really hard for me to see value in the Ti.

    P.S. I see a lot of company loyalty here on TR and it makes the comments hard to read. A lot of smart people invalidate good information with strange and unwarranted bias. AMD/NVidia only care about you to the minimum degree they’ve evaluated you’ll accept as a consumer. My GPU past: ATI 128 Pro, 4600 Ti, 7900 GSO in SLI, 8800 GTS 512, ATI 5830. Whoever puts out the best product in your price range deserves your money; that’s the only determination that matters.

    Edited: Because I used the wrong your. 🙁

      • mnemonick
      • 6 years ago

      I think it’s because Nvidia views this as a stopgap solution while they prep for Maxwell’s release in Q1 2014. We could be seeing consumer Maxwell parts as early as next April, and I’m honestly curious to see what they’ll do with the on-chip ARM CPU – PhysX? Audio? Something else?

      (On a side note, I wouldn’t be at all surprised to see the Titan quietly transition to a “Titan Ti” at the same price point in the near future.)

        • JustAnEngineer
        • 6 years ago

        Isn’t Maxwell coming in the summer with TSMC’s next process shrink?

    • Unknown-Error
    • 6 years ago

    $700? Yes, it is fast, it runs cool, and it’s not that noisy, but $700? Come on nVidia, be a bit more generous. How about $600?

      • FuturePastNow
      • 6 years ago

      Generosity? Nvidia?

      You’re right, though. This thing is a hundred bucks too high. I expect we’ll see a quiet price cut in just a couple of weeks.

      I also expect we’ll see R9 cards without the stock cooler soon, too.

        • Klimax
        • 6 years ago

        Doubtful. NVidia is not AMD; it doesn’t have a value brand. Unless AMD does something, there won’t be any price cut IMHO.

        Interestingly, the bundle doesn’t count?

        • Milo Burke
        • 6 years ago

        I’m not sure there will be a price drop that soon. Sure, it’s expensive, but it also has the performance crown, runs cooler, uses less power, and is quieter too. Premium products don’t have to offer the same value proposition to get people to buy them.

        I wouldn’t buy it, but I can’t afford hardware simply for bragging rights. =]

          • Klimax
          • 6 years ago

          Just a note: Not less power under load. (Unless I missed something.)

        • jihadjoe
        • 6 years ago

        Nvidia’s pricing was briefly addressed by Anand near the start of their 2nd live show.

        The end logic was very apple-ish. Basically Nvidia acknowledges that their stuff is indeed priced higher than the competition because they believe their brand is able to command that premium.

        Since they’re likely going to sell every 780Ti they produce even at $700, there’s no reason at all for them to reduce the price.

        Edit: pricing talk is about 39 minutes into the Anandtech show.

          • Krogoth
          • 6 years ago

          Nvidia execs are drinking some fine Kool-Aid. The GPU market is very volatile and branding doesn’t work that well. The people who tend to get discrete video cards do some research and often get what is the best deal (price, performance, noise/heat) at the time of purchase. Nvidia and AMD/ATI have switched places in these tiers so many times over the past five years.

          Only the die-hard fanboys fall for the brand nonsense, and they only make up a tiny minority of the discrete video card market.

      • Bensam123
      • 6 years ago

      Free PhysX!

        • Amgal
        • 6 years ago

        Free Shadowplay!

          • Bensam123
          • 6 years ago

          Free negatives?

            • derFunkenstein
            • 6 years ago

            Free Willie!

            • Amgal
            • 6 years ago

            FREEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEDOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOM!!!!!!!!11one1!!!eleven

      • Meadows
      • 6 years ago

      They don’t want to alienate Titan buyers *that* much.

        • Klimax
        • 6 years ago

        Why? No full DP…

        • Krogoth
        • 6 years ago

        Not really; the people who got the Titan either wanted a “failed K20” on the cheap* or wanted the best single-GPU gaming solution no matter the cost.

        The latter group will just pawn off their Titan and opt for the 780Ti if they are so inclined.

      • BlackDove
      • 6 years ago

      Don’t buy it if you don’t like the price. Nvidia isn’t a charity. They’re not supposed to be generous.

    • moose17145
    • 6 years ago

    Think I would rather have the 290, and use the saved 300 dollars for a water loop, which is quieter AND will easily allow me to OC the 290 well past what the 780 Ti can do with its stock air cooler. Don’t get me wrong… The 780 Ti is a very good card with one heck of a top-quality air cooler. But the price tag definitely takes away much of its appeal when compared to the competition. At least for me it does.

    From what little I have seen, the 290’s overclock very well when put under water.

      • Prestige Worldwide
      • 6 years ago

      Pretty much how I feel. If I had $750 to kill on a GPU I’d go for the 780ti, but it just doesn’t make any sense compared to the 290.

      I already have a full water loop for my CPU so even $500 for the 290 and $100 for a full card waterblock is a much better value position.

      I’m hoping NV drops prices on the vanilla 780 given how hard the 290 outclasses it in performance / dollar. A 780 on water could probably be overclocked to 780ti performance while keeping completely silent!

      • slowriot
      • 6 years ago

      No thanks on the water. Too much annoyance. However, I am patiently awaiting the release of R9 290 with OEM supplied coolers. If Nvidia were to shock me and drop another $75-$100 off the GTX780 I might jump, but I don’t see them doing it.

        • Prestige Worldwide
        • 6 years ago

        I understand, water isn’t for everyone. But it’s a nice hobby to get into and I’ve gone through 2.5 water builds over the last few years and it has been a cool challenge to tackle.

        On the other hand, there’s also the option of an aftermarket air cooler like the Accelero…. at $77 it would still come in cheaper than a 780. But as you said, OEM’s will likely tackle this challenge with custom coolers and render the temps and noise a non-issue.

        That said, I would buy a 780 in a second if it dropped $50-100.

          • Waco
          • 6 years ago

          Not for everyone…and you’ll hate yourself for doing it.

          That said I love my water setup. Silent no matter what I’m doing. 🙂

            • TwoEars
            • 6 years ago

            Do you have any recommendations for quiet pumps? I find pump noise in general to be more irritating than fan noise.

            • f0d
            • 6 years ago

            if you isolate the pump well then you shouldnt hear it at all – i cant hear my pmp450s (laing d5 style pump they are all similar) at all when isolated when on its rubber/foam mount

            edit: forgot to answer the question (even though it wasnt directed at me)
            personally i wouldnt use any other pumps other than the laing D5 style pumps (koolance pmp450 and 450s / alphacool vpp655 – there is a lot of rebrands of the same thing)

            • f0d
            • 6 years ago

            hate yourself for doing it? i always thought it was the best thing i have ever done pc wise

            also love my water setup
            i have
            rx360 rad with ultra kaze 3k fans that are controlled by a water temp sensor that connects to a scythe “server kaze” fan controller (fans are at min rpm unless the system is being worked hard or its a hot day)
            pmp450s (the strong one that works best at 24v) @24v
            xspc raystorm waterblock
            ek x2 450mm reservoir – its huge and holds about 1ltr of water but i have a nice big case to fit it in (900D) and i like the buffer of having lots of water in the system
            1/2″ primochill primoflex tubing
            bitspower fittings

            • BlondIndian
            • 6 years ago

            How much did your cooling setup cost?
            I’ve been wary of DIY water cooling so far. It’s a bit intimidating. Is there a noob DIY kit?

            • f0d
            • 6 years ago

            it did cost a bit as i went through a fair few changes before ending up with the parts i have and i purchased them when they were released and they were much more expensive at first

            good news is there is a great kit thats pretty much the same as what i have that you can get from xspc that is pretty awesome for the price

            heres a link to it (its an aussie store but it shouldnt be too hard to find an american store that has it) [url<]http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_45&products_id=24224&zenid=fb363bd91d5c38ffb5922a954d8ed5c2[/url<]

            or another one is [url<]http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_45&products_id=25520&zenid=fb363bd91d5c38ffb5922a954d8ed5c2[/url<]

            imo (everyone has their opinion on these things) if you get the xspc kit make sure it has a d5 pump and a raystorm block - both are pretty good

            a good place to see reviews of watercooling gear is [url<]http://martinsliquidlab.org/[/url<] you can see how good the components of that kit is there too

            even if you dont get the kit i would really REALLY recommend the d5 pumps, they are rock solid and imo the most reliable pumps you can get - as long as you make sure it is ALWAYS lubricated with fluid they are rock solid, i havnt had one die on me yet

            • BlondIndian
            • 6 years ago

            Ok , thanks . I would appreciate it if you could tell me how much you spent on it total ?
            No plan ever goes 100% . So If I had a ballpark figure of cost , It’d be helpful .

            • f0d
            • 6 years ago

            i honestly dont know – i just threw money at it until it was how i liked it (nothing died i just diddnt like some parts), i have been through 3 pumps 2 radiators 2 reservoirs and a fair bit on fittings (some fittings cost $20 each), at a guess i would say close to $1000 but i purchased those items when they were released and they were much more expensive at release than they are now

            the reason i quoted those kits is that they are pretty much the same as what i have (same block same pump same rad) but in a kit form and it has everything you need to make a watercooled system, if i was to make another watercooled system i would just get one of those $300 kits and be done with it they are freakin awesome (they wasnt around when i built my system years ago)

            have a good look at those kits as they are all you need and do some research at the site i linked, teach a man to fish and all that

            • BlondIndian
            • 6 years ago

            kk , thnx

    • chuckula
    • 6 years ago

    [quote<]When I asked Nvidia where it found the dark magic to achieve this feat, the answer was more complex than expected. For one thing, this card is based on a new revision of the GK110, the GK110B (or it is GK110b? GK110-B?). The primary benefit of the GK110B is higher yields, or more good chips per wafer. Nvidia quietly rolled out the GK110B back in August aboard GTX 780 and Titan cards, so it's not unique to the 780 Ti. Separate from any changes made to improve yields, the newer silicon also benefits from refinements to TSMC's 28-nm process made during the course of this year.[/quote<]

    That's probably the most interesting nugget from the whole article beyond the benchmarks. It's interesting to see that Nvidia tweaked a comparatively old design for the consumer market in competition with AMD's new silicon. Both sides got some advantages, since 28nm at TSMC is now a mature process, unlike late 2011.

    Edit: BTW, if the die size numbers from this article and your original Titan article are correct, then the GK110B die is a little bit smaller than the GK110 die from Nvidia's early K20 and Titan parts, since the Titan review lists the GK110 at 551mm^2 and the press slide lists the die size as 533mm^2. Not sure if that is 100% on point though.

    [url<]https://techreport.com/review/24381/nvidia-geforce-gtx-titan-reviewed[/url<]

      • willmore
      • 6 years ago

      They bothered to tweak it because they use a ton of these dies in their high-end GPGPU products, which have insane margins, but it still makes sense to maximize yield.

      • swaaye
      • 6 years ago

      NV did the same with Fermi though.

        • Airmantharp
        • 6 years ago

        This is version three, though- GK100 never saw the light of day :).

    • f0d
    • 6 years ago

    it seems like now whatever the price point you have lots of options from both sides

    290 great price and great card best price/performance card if you have a budget but still has the noise issues (for some its a dealbreaker for others not so much so its up to you)

    780 not quite 290x performance and close to the cheaper 290 but its quiet and overclocks well at a nice price

    290x while im not sure that this is worth it over the 290 with the stock cooler i think it definitely would be with aftermarket cooling or watercooling and some overclocking

    780ti fastest thing out there and for some people thats all that matters despite the price

    so whatever brand you are a fan of both camps have a card for you

    • TwoEars
    • 6 years ago

    I wonder if this was what Nvidia had planned for the GTX 880 or GTX 870 if AMD hadn’t introduced the 290X?

    I guess we’ll never know. Extremely impressive card nonetheless.

      • Prestige Worldwide
      • 6 years ago

      Pretty sure the plan was always for Maxwell to be the 880…. making Kepler span 3 generations of flagship GPU’s would be ridiculous, even for nVidia.

        • TwoEars
        • 6 years ago

        You’re probably right. I also believe Maxwell will be the 880.

        Perhaps the 780 ti is what will become the 870 or 860 ti. That would kind of make sense from the way things have been going.

        In any case the Titan card is in a very peculiar place right now – how long are they going to keep that around? I love that they made it but they can’t exactly upgrade it and name it the “Titan ti” can they? Or how about the “Titan II”. I don’t know – sounds kind of cheesy.

          • Melvar
          • 6 years ago

          I think the fact that the GK110 is such a large die, and the fact that only the best ones can be used as a 780ti may prevent Nvidia from using it as a second or third tier next gen product. It may well be more cost effective to fab a new, smaller chip on the 20nm process and get twice as many chips per wafer area.

          That is, of course, assuming the new 20nm node actually shows up. And works.
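
          Just to put a rough number on "twice as many chips per wafer area," here is a sketch using the classic first-order dies-per-wafer estimate. The 551 mm^2 figure is the GK110 number chuckula quotes from TR's Titan review elsewhere in these comments, the half-size die is hypothetical, and yield, scribe lines, and 20nm wafer pricing are all ignored:

          import math

          def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
              """First-order estimate of die candidates per wafer (ignores yield and scribe lines)."""
              r = wafer_diameter_mm / 2
              return math.floor(math.pi * r ** 2 / die_area_mm2
                                - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

          print(gross_dies_per_wafer(551))   # GK110-class die: ~99 candidates per 300 mm wafer
          print(gross_dies_per_wafer(276))   # hypothetical half-size die: ~215, more than 2x as many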

            • TwoEars
            • 6 years ago

            Another good point. I guess it all depends on how well 20nm works and what kind of yields they can get from it out of the gate.

            GK110 is very large, and uses a lot of wafer, but on the other hand it would seem that they’re starting to get the hang of fine tuning the manufacturing process now. I don’t think they’d be able to launch a card like the 780 ti if they weren’t getting good yields on it.

          • JustAnEngineer
          • 6 years ago

          Titan II was very successful.
          [url<]http://en.wikipedia.org/wiki/Titan_%28rocket_family%29[/url<]

            • willmore
            • 6 years ago

            Missile names for high-end graphics cards? Love it!

          • Klimax
          • 6 years ago

          Titan has full DP unlike any other top card right now.

            • Prion
            • 6 years ago

            Full DP is so over, I’m holding out for the card with full DVDA.

            • Klimax
            • 6 years ago

            Err, DVDA?

            • superjawes
            • 6 years ago

            DON’T GOOGLE THAT

            • Klimax
            • 6 years ago

            You do know how to cause Maximum Curiosity?

            • Klimax
            • 6 years ago

            Searched it, meh. Apparently just some short for particular set of movies 18+. Note: Just saw in results expanded name, got idea, moved on…

            • Amgal
            • 6 years ago

            It’s funnier because of your name…

            • Klimax
            • 6 years ago

            <Facepalm> I failed to anticipate this… or notice this.

            Thanks…

            • Prestige Worldwide
            • 6 years ago

            You sir, have just won the internet.

          • Skullzer
          • 6 years ago

          Maybe they name the next flagship Titanic, and Amd name their next flagship Iceberg. But then Amd would need cooler temps for it to make sense!

        • Arclight
        • 6 years ago

        Dude, they used basically the same architecture in 3 generations before. Don’t you ever doubt them in that department.

          • Klimax
          • 6 years ago

          Which one was that?

            • Meadows
            • 6 years ago

            GeForce 8800 GT -> 9800 GT -> GTS 240 (OEM).
            GeForce 8800 GTX -> 9800 GTX -> GTS 250.

            • Klimax
            • 6 years ago

            The first one is more or less still kept around by both companies. (The low end from both NVidia and AMD is still ancient, and I am not sure if it is just for OEM parts.)

            I forgot about the 8800, because it was released quite a bit later than the rest and used the chip that was the basis of the 9000 series. I missed this sequence that way.

            Just a note: the GTS 250 has different parameters than the 9800 GTX, but the architecture is the same.

            I doubt it happens this time. (Outside of the mostly-OEM low end.)

            • willmore
            • 6 years ago

            The 250 is a die shrink as used in the 9800GTX+.

            • Klimax
            • 6 years ago

            I know, just didn’t finish the thought.

        • the
        • 6 years ago

        The big chip of the Kepler generation did span three versions. The first, GK100, didn't even make it to market, as it was cancelled. GK110 is what is used in the Titan and vanilla GTX 780. The GTX 780 Ti gets a revised GK110 core to help improve yields, fix some errata and reduce its power profile.

      • Skullzer
      • 6 years ago

      Maybe they call it Titanic, and AMD name theirs Iceberg. But amd would need to have cooler temps for it to make sense!

      • BlackDove
      • 6 years ago

      No, the 580 was the 480 refreshed, and the 680 was the successor to the GF114, not the GF110 chip. The GK104 was so competitive against AMD's big chip that they released it as the 680 rather than the 660 Ti, which is what it would have been if they had followed their pattern.

      It's like Intel's tick-tock. The 400 and 500 series were the same chips, and the 600 and 700 series are the same chips. The 800 series will be the new chips. The 900 series will likely be a refresh of the 800, and then after that they get the on-chip memory package with 1TB/s bandwidth.

        • the
        • 6 years ago

        True, but nVidia did wind up clocking the GK104 higher and letting it consume more power than they originally planned in order to make it competitive at the high end. The revised plan did work out rather well for them, though.

        The GTX 400 and GTX 500 series were from the same technology generation, but they did use different chips. The GF100 and GF110 are slightly different die sizes, in fact.

        The GTX 600 and GTX 700 series do have some overlap with the same chips, though. The GTX 680 and GTX 770 use the same chip, just clocked higher on the GTX 770.

    • superjawes
    • 6 years ago

    Yeah, I’d say the overall winner right now is the 290 unless you really really want the bragging rights of a 780 Ti. The value of anything more expensive drops considerably.

    Still, since the Hawaii GPUs have some heat and noise issues, and the GK110 is almost a year old now, there is definitely room for improvement. Namely AMD tuning (and properly cooling) their chip while Nvidia should release a new chip of their own. Hopefully the real winner ends up being the consumer, because I do love competition!

      • HisDivineOrder
      • 6 years ago

      I think the overall winner is debatable. I think the 780 Ti is the new high end at the new high end price. No improvement there.

      But overall, I think this just goes to show you:

      The more things change, the more they stay the same.

      When I was looking at the 670 vs 7970 last year, it seems like I faced much the same choice as the person considering the R9 290/290X vs the 780 (non-Ti).

      This year’s roles are:

      2012 670 – 780
      2012 680 – 780 Ti
      2012 Radeon 7970 – R9 290X
      2012 Radeon 7950 – R9 290
      Keith David – Himself

      Raw performance favored the 7970, performance per dollar favored the 7950, but performance per decibel and performance per watt favored the 670. The reference cooler for nVidia was good enough to actually be used, but open air coolers favored the 670 because the cooling was not as difficult for the nVidia cards as opposed to the AMD cards. The 7970/7950 had a higher memory bus, which gave them some headroom at higher resolutions than the 670, but not so much that it mattered in gaming at that moment. A gaming bundle was in play for both sides, though it did little more than give a rebate of 20-30 dollars for either side. I never considered the 680 because it was far too much for far too little gain, especially with overclocked cards approaching its performance.

      That’s the way I still see the market. The R9 290 and 290X have great performance, but their acoustics and performance per watt are far below both the 780 and 780 Ti. As 780B’s get out there and you see GHZ cards start to filter in for marginal increases in price, you’re going to see a scrapping battle for the $400-500 space much like last year’s battle for the $300-400 space.

      The reason someone would choose a 780 is so they don’t have to buy earplugs or sweat to death in the summer. Certainly, they won’t choose the 780 Ti for that, even though it’s superior to the R9 290 and 290X in that. It is only marginally so.

      I know I’m considering a 780, assuming they get overclocked cards out there that keep the performance per watt not too much worse than the standard 780’s they’ve been selling. I like the Titan blower, the bundle’s not bad, and if there is any inching down from the price, we could see $450 in our future in time, like I saw with the $350 670 I bought last year.

      All of this happens because AMD finally chose to compete. Thanks to AMD for competing, but it’s a reluctant thanks, since it feels like a country that waited all year to come to war rather than immediately come to our aid. They sat on their hands and let nVidia fleece us for a year because they calculated a new bundle offer (or two) would be a cheaper way to continue to fleece their own users, to ensure every single last one of them who would upgrade to a 7970 or 7950 had done so before they introduced something newer. And then they took the opportunity to rename the 7xxx series just to ensure they catch every last one afterward.

      I mean, I just don’t know if I feel so grateful when I realize that AMD played us all for fools while letting nVidia beat us around on the high end for a year. I’m glad AMD decided to stop, but what took them so long? Why wait? The only reason I can derive is the same reason that motivated nVidia to fleece us when there was no competition. It’s the same reason AMD released the Radeon 7970 at a higher price than the last generation for an only equivalent increase in performance to the increase in cost.

      Greed. I just don’t know if I’m grateful for them to decide at long last to be LESS greedy. It seems like that’s backwards somehow. Great, they did something that worked out for us, but it’s hardly for us. It just so happens to work out.

      But I am glad their greed is quarrelling with nVidia’s greed. At least in that, we all win. No matter if you’re deaf, if you’re water cooling, if you’re willing to spend upwards to have the absolute highest end of all the Big Keplers, if you want the most RAM, if you want gimmicks, or if you just want a great deal on SLI/Crossfire from BOTH vendors that truly work in the way they were supposed to for the first time since the original SLI.

      Yay… greed and competition gives us better prices. I just wish AMD’s greed had led to it showing up at the appointed time in Feb/March when they did their, “Wut? We never said nothin’ bout no Sea Islands” shrug and mock innocent expression. Imagine what the prices on these products would have been, imagine all those 770 out there being 780’s, imagine how much farther our gaming computers would be in advancement (per dollar) if they’d have released the cards they had then rather than stockpile them and sap us all for cash.

      That annoys me.

        • f0d
        • 6 years ago

        well said and i think i agree with pretty much all of it
        this is why nobody should really be nvidia or amd fanboys and hope the other dies out because the second one gets the advantage over the other then thats when the consumer loses out and we have very little performance improvements

        both amd and nvidia have made great cards here and because they are competing with each other so fiercely us the consumer will reap the rewards

        both cards are great and its only minor differences and features that separate them

        • BlondIndian
        • 6 years ago

        Hey man, please be less verbose. We all know the script: a whole lot of “Nvidia is really not to blame and somehow it’s AMD’s fault in the end.” If you were more succinct, I would read more than a few paragraphs.
        It’s toooooo loooonnnggg…….

          • HisDivineOrder
          • 6 years ago

          I know it’s hip today to make things simple, but sometimes reality is more complex. 😉

          Short enough?

            • willmore
            • 6 years ago

            Brevity is the paragon of whit.

        • tahir2
        • 6 years ago

        Greed? Come now, AMD needs to make some money. Imagine a world without AMD – no competition across the x86 spectrum keeping Intel honest (Intel honest… an oxymoron if ever there was one), and NVIDIA would just milk their fan base dry.

        AMD make more money dammit.

        • swaaye
        • 6 years ago

        I think our wonderful duopoly has been going back and forth, uh, since the Radeon 8500 came out and NV had real competition again. ATI/AMD has also usually been the one pushing relatively lower pricing, because they typically have a slight (or not so slight) competitive disadvantage. At some points I think they priced aggressively out of some desperation to gain market share.

    • ThorAxe
    • 6 years ago

    I have finally found my next GPU (or two). 🙂

    • spuppy
    • 6 years ago

    Still liking the R9 290 the best, but I only have one monitor

      • derFunkenstein
      • 6 years ago

      I have only one, and only at 1080p resolution, so I’d probably be best served by a 760 or a 270X. I’m going to ride it out until the inevitable price drops. My 460 1GB is pretty old but I am still hanging on!

    • Krogoth
    • 6 years ago

    780Ti replaces the 780 in its spot as the fastest gaming GPU you can get no matter the cost.

    Titan is still a better deal if you’re up for GPGPU-related stuff, since the 780 Ti’s DP performance is crippled like its 780 cousin’s.

      • Meadows
      • 6 years ago

      Hi Captain Obvious, I see you’ve read the review?

        • Krogoth
        • 6 years ago

        Trying to get a rise out of me?

        You got to do better than that.

          • JustAnEngineer
          • 6 years ago

          Turn the other cheek.

          That may be what we need around here to restore some civility to the discussions. It’s either that or banning all of the trolls, fanboys, shills and other sources of anti-social rudeness.

            • Meadows
            • 6 years ago

            The reason for my snarky comment is that “comments” which essentially just repeat parts of the review cannot (and should not) be considered meaningful.

            • Krogoth
            • 6 years ago

            Nah, it is just a chip on his shoulders.

            Trying to disguise it as being “clever” and “snarky”.

            • derFunkenstein
            • 6 years ago

            Then there wouldn’t be anybody here.

      • Klimax
      • 6 years ago

      Also unlike 290x. (1/4)

      • BlondIndian
      • 6 years ago

      I was expecting a

      [quote<]Krogoth not Impressed [/quote<]

    • ClickClick5
    • 6 years ago

    This noisy fan issue makes me laugh!

    It is called third party coolers, or, third party coolers already installed.

    *mind blown*

      • Voldenuit
      • 6 years ago

      [quote<]This noisy fan issue makes me laugh! It is called third party coolers, or, third party coolers already installed. *mind blown*[/quote<]

      Sounds like you blew yourself prematurely, to paraphrase Tobias Fünke.

      Until the OEMs come out with custom coolers in the market, and until they are reviewed by reputable sites, there's no guarantee that the OEMs won't run into the same problems that AMD ran into with their stock cooler.

      If they do successfully manage the feat, then the R9 290, which is clearly head and shoulders above every other high end card on the price/performance curve, will become very desirable indeed. But until then, I'm not ready to call it yet.

        • ClickClick5
        • 6 years ago

        I did not intend to sound snarky, but when buying a $600 card, most of those who buy it will intend to overclock the beast. So they either buy a different cooler to install or end up liquid-cooling the machine.

        When I bought my 6970, I replaced the cooler with the Arctic Cooling Accelero Xtreme III, and my temps went from 88°C to 64°C with the fan at 100% speed. And because the fans aren't a caged blower design and the cooler spans the full length of the card, you can't hear it at all (low-RPM fans, too).

        Imagine the 780 Ti and the 290X have the exact same performance, acoustics, power draw, etc. What would the forums fight over? The 780 Ti having a 25MHz-faster RAM clock? Probably.

        My point is, it seems no matter what, people have to pick and fight over anything they can point at and go, “SEE! It is better!!!”. The console wars also amuse me. The PS4 kicking out 1080p, the One kicking out 720p, upscaled to 1080p. May the rage begin. :p

        • mno
        • 6 years ago

        Or you know, you could look at the [url=http://www.techspot.com/review/736-amd-radeon-r9-290/page8.html<]evidence[/url<].

        Techspot transplanted HIS's IceQ X2 cooler from a 280X onto their 290, and temperatures never exceeded 76°C in FurMark and 63°C in Crysis 3 and BF4.

        Alternatively, there are already coolers on the market, such as the Arctic Cooling Accelero Xtreme III and Prolimatech MK-26, that are compatible with the 290/290X, if you don't mind doing it yourself.

          • Voldenuit
          • 6 years ago

          Thanks for the link.

          It was an interesting data point, to be sure, but I'm a bit disappointed that Techspot didn't include the fan speeds and noise levels those temperatures were achieved at (after all, the stock Radeon 290 results were obtained with the stock cooler set at 47% and were already unacceptably loud for some).

          Still, AC makes great GPU coolers. My old AC Accelero S2 transformed my 4850 from a furnace (the first video card to ever burn my fingers) into something that was not only cool to the touch, but could be passively cooled in marathon sessions of Fallout 3.

          It’s promising stuff. Like I said already, the 290 is head and shoulders above everything else in price/performance. All it needed was a more effective and quieter cooling system, and it does sound as if that’s achievable.

    • cheerful hamster
    • 6 years ago

    I’m not a gamer, but it’s nice to see some competition.

    • codinghorror
    • 6 years ago

    You don’t need to have a noisy fan — one easy tin snip or Dremel cut and you improve fan outflow on the R9 290X by 100%:

    [url<]http://i.imgur.com/aSzp66V.jpg[/url<]

    I have two 290X cards in CrossFire, and both of 'em have this easy mod. *Massively* improves cooling performance, reduces noise, and keeps the cards at 1000MHz even over long play sessions. Done and done.

      • chuckula
      • 6 years ago

      I like that hack. I'm not sure I'd like what it would do to the warranty, but life's too short to worry about warranties!

        • shank15217
        • 6 years ago

        It's a metal bracket that screws off, hardly a warranty issue. More importantly, it shows the cooling is a bigger culprit than the chip itself.

      • Airmantharp
      • 6 years ago

      I’m wondering if you couldn’t figure out how to use the bracket’s screw holes to attach the card to the enclosure, setting aside the unmolested bracket for warranty purposes…

        • Firestarter
        • 6 years ago

        I’m guessing a bracket of an older double-wide card would do the trick.

      • TwoEars
      • 6 years ago

      You took a Dremel to your new 290X card?

      Ballsy, I like it.

        • spuppy
        • 6 years ago

        Should be easy to remove the bracket before cutting it

      • NeelyCam
      • 6 years ago

      That is Awesome!

      • tbone8ty
      • 6 years ago

      Waiting for someone to throw an aftermarket cooler on these to test while we wait for them to go on sale.

      • tomc100
      • 6 years ago

      Well, I think of my PC and all of its parts as being on lease. Meaning, I'm just renting it for now, and when it's time to upgrade I sell off the parts I no longer need. It makes upgrading a whole lot cheaper when you only pay the difference. Altering it will void any warranty and make it much more difficult to sell later. But that is a cheap and effective solution, short of buying aftermarket coolers or going to water cooling.

      • iatacs19
      • 6 years ago

      What about opening up the plastic intakes by the fan in the back?

        • puppetworx
        • 6 years ago

        Unfortunately, blower fans intake air from the side, not the back.

        I’ve never actually understood how dual-card setups manage to get enough air.

          • JosiahBradley
          • 6 years ago

          Actually, the R9 series coolers have small vent intake ports on the back. See image 4 at this link:

          [url<]http://us.msi.com/product/vga/R9-290X-4GD5.html#4[/url<]

          Not that they can do much, but they are there.

          • Airmantharp
          • 6 years ago

          They do it by not putting them next to each other!

          Usually the x16 slots are at positions 1 and 4, or 2 and 5. With two-slot blowers, the top card occupies slots 1 and 2 (or 2 and 3), slot 3 or 4 stays open, and the bottom/second card occupies slots 4 and 5 (or 5 and 6), with the last one or two slots open.

          Given that the cards are pretty long and intake toward the front of the case, those open slots can be used for almost anything: WiFi NICs, sound cards, RAID cards, tuners, and the like fit just fine without impeding airflow.

          Now, if you want to use open-air coolers, you're going to want more than one slot of buffer space, hopefully two or three. Boards that put the last live x16 slot at position 6 or 7 are better here, especially with expansive enclosures that have plenty of space beyond the expansion slot area of the motherboard, as well as space 'above' the expansion slot area, and lots of intake and exhaust fans.

      • Disco
      • 6 years ago

      That’s awesome (and easy)! Next time I have my computer in pieces I might try that out on my 7970.

      • HisDivineOrder
      • 6 years ago

      Today, it is great.

      Six months from now when one fails, codinghorror weeps because they won’t take it back because of physical damage. No warranty.

      “So sorry, clearly the exterior damage to the card has caused your failure. Feel free to buy another R9 290X at participating retailer. If you do by June 2014, you’ll have access to the Never Settle Not Ever Bundle including your choice of THREE exciting games including Angry Birds Star Wars 2, Farmville XD: Revenge of the Bovines, and Barbie’s Funtime Playtime Goodtime Adventure for Sparkly Glitter.”

      The next day, a man is seen wandering into the wilderness in nothing but a pair of Crysis 3 underoos with what witnesses claim seemed like “two pieces of a computer” and a wild look in his eyes. Forensic teams could never truly ascertain what happened after he passed beyond the treeline and no sign of him was ever recovered.

      For years after, kids would avoid those woods because campers would from time to time report the sounds of a man wailing and trees shaking as if the man were beating himself to death with the tree. If anyone dared approach, the angry spirit would hiss, “NEVER SETTLE! DAMN IT! NEEEVEEER SETTLE!” and lunge at them. If it caught them, they’d feel suddenly hot, sweating, feverish, and the loudest, most shrill sound imaginable (like a hair dryer) would deafen them.

      Or so say the stories. “Them woods be haunted,” an old man named Jen would say nearby. “That’s not The Way you mean To Be goin’.”

      Epilogue:

      In reality, codinghorror went out into them woods to bury his R9 290X’s in peace when he accidentally dug up a chest with a billion dollars in it. He said, “Screw this,” and bought a tropical island much like Hawaii and lived out his days hot and sweaty, surrounded by a lot of loud.

      The truth is usually less exciting than fiction.

        • esterhasz
        • 6 years ago

        The guy founded Stack Overflow, though. The weeping will thus be short.

        TR’s a celebrity hangout now!

        • Meadows
        • 6 years ago

        Good lord, you have to be noticed.

        • D@ Br@b($)!
        • 6 years ago

        Yeah, he would take the backplate off one of his 7970s, slap it on, and RMA his 290X.

        ps.: tnx for the fairy tale (Y)

      • f0d
      • 6 years ago

      While this is a great move, I personally would have just ripped the bracket off something else (an older card you're not using anymore), tried to mod it on, and kept the original just in case you need to return it.
      That's what I did on my 6950.

      • Bensam123
      • 6 years ago

      I assume this mod also works on the 780 and ti… 😛

      • clone
      • 6 years ago

      Are you serious? A restrictive backplate is the primary issue?

      Send that image to AMD with a comment saying "get your head out of your asses and do it proper."

      If it's really that simple, that's just ridiculous.

      P.S. Bravo.

      P.P.S. Tech Report should be testing this minor mod by pulling a backplate and seeing for themselves.

      • BlondIndian
      • 6 years ago

      Cool… How much performance gain did you get?

      • anotherengineer
      • 6 years ago

      Indeed.

      It's too bad they don't stamp the whole bracket like this first
      [url<]http://www.pcstats.com/articleimages/200508/aopena070012ALN_side2.jpg[/url<] and then stamp out the connection holes.

        • oldDummy
        • 6 years ago

        Rigidity is a concern, I would think, along with anchoring the connections.
        But you're right: it would seem the fewer restrictions, the better.

          • Airmantharp
          • 6 years ago

          JohnC noted in the forum thread that EVGA makes some 'high air flow' brackets for the GTX 680, and a few others, that can replace the stock bracket.

          Still, you’d think that a little creative engineering might provide a solution that retains the necessary rigidity/stability that the stock brackets provide while impeding exhaust airflow as little as possible.

      • itachi
      • 6 years ago

      That's a nice idea. Did you just remove the screws or cut the metal? I should do that with my HD 5870.

      • dale77
      • 6 years ago

      TR should test this, hardwarehorror. The original backplate already has significant gaps for airflow; how much temperature benefit do you get from your Dremel surgery?

        • Airmantharp
        • 6 years ago

        Hell, it'd be easy to test the potential of a bracket mod on an open bench: just take it off!

        The hard part is applying the mod in a non-destructive manner; as I mentioned above, I’m sure an industrious enthusiast could find a way to remove the bracket and use the existing screw mounts to brace the card against the enclosure.

        And hell, while they’re at it, they could cut out the case’s expansion slot guides as well.

      • indeego
      • 6 years ago

      NSA loves the fully-readable fingerprint in that image.

        • Krogoth
        • 6 years ago

        Which is smeared into a worthless pattern.

      • deb0
      • 6 years ago

      How sad that you have to void the warranty on your >$500 card to get it to perform as expected.

    • chuckula
    • 6 years ago

    Thanks for the review guys! I have to agree with your conclusion about “controversial split decision” too.

    Both the 290X and the GTX 780 have their strengths and their flaws. The twisted irony is that if it weren't for outside factors,* we'd probably be complaining about how the GTX 780 Ti runs hot and is somewhat loud under load… oh well. It's nice to see Nvidia taking a page out of AMD's playbook with the bundle, too. I'm sure we'll see some AMD bundles before too long as well.

    * Frankly, AMD shot itself in the foot by not dumping the stock cooler and telling its partners that they would be responsible for implementing coolers for the 290X. If they had done that, then the biggest heat/noise issues would be moot, and we'd only be comparing these cards on price/performance, where the 290X certainly holds its own.

    ** Oh, and as an addendum to the whole GPU war of the last month or so: as long as Nvidia and AMD both get chips fabbed by TSMC on basically the same assembly line, don't expect one side to dominate the other in the foreseeable future. Instead, it's going to be one set of tradeoffs vs. another set of tradeoffs.

      • Modivated1
      • 6 years ago

      I am sure the partners don't mind, since people have been claiming most of the sales will come their way anyway. Now it all depends on who can implement the best cooler for the best price; good competition breeds unusually good results.

    • BoBzeBuilder
    • 6 years ago

    I think I should read the article first before commenting.

      • chuckula
      • 6 years ago

      It’s a good read!

        • HisDivineOrder
        • 6 years ago

        The audiobook is better, though. How did they get Liam Neeson and Tom Hanks to take turns narrating it?

          • Milo Burke
          • 6 years ago

          I'm thinking Scott's words and Russell Crowe's voice. Yes, please!
