Nvidia’s GeForce GTX 980 and 970 graphics cards reviewed

2014 has been a strange year for graphics chips. Many of the GeForce and Radeon graphics cards currently on the market are based on GPUs over two years old. Rather than freshening up their entire silicon lineups top-to-bottom like in the past, AMD and Nvidia have chosen to take smaller, incremental steps forward.

Both firms introduced larger chips based on existing GPU architectures last year. Then, two weeks ago, the Tonga GPU in the Radeon R9 285 surprised us with formidable new technology that’s still somewhat mysterious. Before that, this past spring, Nvidia unveiled its next-gen “Maxwell” graphics architecture on a single, small chip aboard the GeForce GTX 750 Ti. We could tell by testing that card’s GM107 GPU that Maxwell was substantially more power-efficient than the prior-gen Kepler architecture. However, no larger Maxwell-based chips were forthcoming.

Until today, that is.

At long last, a larger Maxwell derivative is here, powering a pair of new graphics cards: the flagship GeForce GTX 980 and its more affordable sibling, the GTX 970. These cards move the needle on price, performance, and power efficiency like only a new generation of technology can do.

The middle Maxwell: GM204

The chip that powers these new GeForce cards is known as the GM204. Although the Maxwell architecture is bursting with intriguing little innovations, the GM204 is really about two big things: way more pixel throughput and vastly improved energy efficiency. Most of what you need to know about this chip boils down to those two things—and how they translate into real-world performance.

Here’s a look at the basic specs of the GM204 versus some notable contemporaries:

|        | ROP pixels/clock | Texels filtered/clock (int/fp16) | Shader processors | Rasterized triangles/clock | Memory interface width (bits) | Estimated transistor count (millions) | Die size (mm²) | Fab process |
|--------|------------------|----------------------------------|-------------------|----------------------------|-------------------------------|---------------------------------------|----------------|-------------|
| GK104  | 32      | 128/128 | 1536 | 4 | 256       | 3500 | 294       | 28 nm |
| GK110  | 48      | 240/240 | 2880 | 5 | 384       | 7100 | 551       | 28 nm |
| GM204  | 64      | 128/128 | 2048 | 4 | 256       | 5200 | 416 (398) | 28 nm |
| Tahiti | 32      | 128/64  | 2048 | 2 | 384       | 4310 | 365       | 28 nm |
| Tonga  | 32 (48) | 128/64  | 2048 | 4 | 256 (384) | 5000 | 359       | 28 nm |
| Hawaii | 64      | 176/88  | 2816 | 4 | 512       | 6200 | 438       | 28 nm |

Nvidia did well to focus on energy efficiency with Maxwell, because foundries like TSMC, which makes Nvidia’s GPUs, have struggled to move to smaller process geometries. Like the entire prior generation of GPUs, the GM204 is built on a 28-nm process. (Although TSMC is apparently now shipping some 20-nm silicon, Nvidia tells us the 28-nm process is more cost-effective for this chip, and that assessment is consistent with what we’ve heard elsewhere.) Thus, the GM204 can’t rely on the goodness that comes from a process shrink; it has to improve performance and power efficiency by other means.

Notice that the GM204 is more of a middleweight fighter, not a heavyweight like the GK110 GPU in the GeForce GTX 780- and Titan-series cards. Nvidia considers the GM204 the successor to the GK104 chip that powers the GeForce GTX 680 and 770, and I think that’s appropriate. The GM204 and the GK104 both have a 256-bit memory interface and the same number of texture filtering units, for instance.

Size-wise, the GM204 falls somewhere in between the GK104 and the larger GK110. Where exactly is an interesting question. When I first asked, Nvidia told me it wouldn’t divulge the new chip’s die area, so I took the heatsink off of a GTX 980 card, pulled out a pair of calipers, and measured it myself. The result: almost a perfect square of 20.4 mm by 20.4 mm. That works out to 416 mm². Shortly after I had my numbers, Nvidia changed its tune and supplied its own die-size figure: 398 mm². I suppose they’re measuring differently. Make of that what you will.

The GM204’s closest competition from AMD is the new Tonga GPU that powers the Radeon R9 285. We know for a fact that not all of Tonga’s capabilities are enabled on the 285, though, and I have my own crackpot theories about how the full Tonga looks. I said in my review that I think it has a 384-bit memory interface, and after more noodling on the subject, I strongly suspect it has 48 pixels per clock of ROP throughput waiting to be enabled. Mark my words so you can mock me later if I’m wrong!


Functional block diagram of the GM204 GPU. Source: Nvidia.

One reason I suspect Tonga has more ROPs is that it just makes sense to increase a GPU’s pixel throughput in the era of 4K and high-PPI displays. I believe the GM204’s ROPs are meant to be represented by the deep blue Chiclets™ surrounding the L2 cache in the fakey diagram above. At 64 pixels per clock, the GM204 has a third more per-clock ROP throughput than the big Kepler GK110 chip—and double that of the GK104. That’s a sizeable commitment, an enormous increase over the previous generation, and it means the GM204 is ready to paint lots of pixels.

At 2048KB, the GM204’s L2 cache is relatively large, too. The GK104 has only a quarter of that cache, at 512KB, and even the GK110’s 1536KB L2 cache is smaller. Caches have been growing by leaps and bounds in recent graphics architectures, as a means of both amplifying bandwidth and improving power efficiency (since memory accesses burn a lot of power).


Functional block diagram of the Maxwell SM. Source: Nvidia.

The larger cache is just one way Nvidia has pursued increased efficiency in the Maxwell architecture. Many of the other gains come from the new Maxwell core structure, known as the streaming multiprocessor or SMM. The GM204 has a total of 16 SMMs. Each of them is broken into four “quads,” and each of those has a single 32-wide vector execution unit with its own associated control logic. Threads are still scheduled in “warps,” or groups of 32 threads, with one thread occupying each “lane” of a vec32 execution unit and all 32 executing in lockstep. Nvidia says the SMM’s new structure makes scheduling tasks on Maxwell simpler and more efficient, which is one reason this architecture uses less energy per instruction than Kepler. Maxwell’s efficiency improvements come from several sources, though, and I hope to have time to explore them in more depth in a future article.
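
As a quick sanity check on those figures, the shader-processor counts in the spec tables fall straight out of the SMM organization just described. Here’s a minimal sketch (the layout constants come from Nvidia’s published specs; the helper function is just for illustration):

```python
# Shader-processor counts implied by the Maxwell SMM layout described above.
LANES_PER_VECTOR_UNIT = 32   # one 32-wide vector execution unit per "quad"
QUADS_PER_SMM = 4

def shader_processors(smm_count):
    """Total ALU lanes ("CUDA cores") for a GPU with the given number of SMMs."""
    return smm_count * QUADS_PER_SMM * LANES_PER_VECTOR_UNIT

print(shader_processors(16))  # full GM204 (GTX 980): 2048
print(shader_processors(13))  # GTX 970, which gives up three SMMs: 1664
```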

For now, let’s look at the new GeForce cards.

The GeForce GTX 970 and 980

Nvidia’s new silicon has spawned a pair of video cards, the GeForce GTX 970 and 980. Pictured above is the 980, the new high end of Nvidia’s consumer graphics card lineup (excepting the ultra-expensive Titan series). Here’s the lowdown on the two new GeForce models:

|         | GPU base clock (MHz) | GPU boost clock (MHz) | ROP pixels/clock | Texels filtered/clock | Shader processors | Memory path (bits) | GDDR5 transfer rate | Peak power draw | Intro price |
|---------|----------------------|-----------------------|------------------|-----------------------|-------------------|--------------------|---------------------|-----------------|-------------|
| GTX 970 | 1050 | 1178 | 64 | 104 | 1664 | 256 | 7 GT/s | 145W | $329 |
| GTX 980 | 1126 | 1216 | 64 | 128 | 2048 | 256 | 7 GT/s | 165W | $549 |

At $549, the GeForce GTX 980 ain’t cheap. What you’ll want to notice, though, is its lethal combination of clock speeds and power rating. The GTX 980’s full-fledged GM204 runs at a “boost” speed of over 1.2GHz—and that’s a typical, not peak, operating frequency in games. The card’s 4GB of GDDR5 memory runs at a nosebleed-inducing 7 GT/s, too. That’s one way to squeeze the most out of a 256-bit memory interface. Meanwhile, the GTX 980’s TDP is just 165W—well below the 250W rating of the GeForce GTX 780 Ti or the 195W rating of the previous-gen GTX 680. That’s quite a testament to the efficiency of the Maxwell architecture, especially since all of these chips are fabbed with 28-nm process tech.
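
For reference, the 224 GB/s of memory bandwidth you’ll see quoted later falls straight out of that 7 GT/s transfer rate and the 256-bit interface. A quick back-of-the-envelope calculation:

```python
# Peak memory bandwidth from per-pin transfer rate and bus width.
def memory_bandwidth_gbps(transfer_rate_gtps, bus_width_bits):
    """GB/s = (GT/s per pin) x (bus width in bits) / (8 bits per byte)."""
    return transfer_rate_gtps * bus_width_bits / 8

print(memory_bandwidth_gbps(7, 256))  # GTX 970/980: 224.0 GB/s
print(memory_bandwidth_gbps(7, 384))  # GTX 780 Ti:  336.0 GB/s
```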

Thanks to its frugal power needs, the GTX 980 requires only a pair of 6-pin aux power inputs—and it could darn near get by with just one of them. Although this card has the same familiar, aluminum-clad reference cooler as the last crop of GeForces, its port configuration is something new: a trio of DisplayPort outputs, an HDMI port, and a dual-link DVI connector. Given the ascendancy of DisplayPort for use with 4K and G-Sync monitors, this is a welcome change.

GeForce GTX 980 cards in the form you see above should be available from online retailers almost immediately, as I understand it. Nvidia had the first batch of cards produced with its reference cooler, and I expect custom designs from board makers to follow pretty quickly. Many of those are likely to be clocked higher than the reference board we have for testing.

I’ve gotta admit, though, that I’m more excited about the prospects for the GeForce GTX 970. This card has a much lower suggested starting price of $329, and rather than produce a reference design, Nvidia has left it up to board makers to create their own GTX 970 cards. Have a look at what Asus has come up with:

This is the Strix GTX 970 OC Edition, and it’s pretty swanky. The headline news here is this card’s 1114MHz base and 1253MHz boost clocks, which are quite a bit higher than what Nvidia’s reference specs call for. Heck, the boost clock is even higher than the GTX 980’s and could go a long way in making up for the loss of three SMMs in the GTX 970. Since the GTX 970 has the same 4GB of GDDR5 memory at 7 GT/s, this card’s delivered performance should be within shouting distance of the GTX 980’s. The price? Just $339.99.

Asus has tricked out the Strix with a bunch of special features, which I’d be happy to talk about if I hadn’t just received this thing literally yesterday. I have noted that the cooler’s twin fans only spin when needed; they go completely still until the GPU temperature rises above a certain level. For some classes of games—things like DOTA 2—Asus claims this card can operate completely fanlessly.

On the downside, I’m a little disappointed with the move back to dual DVI outputs and a single DisplayPort connector. I suppose the more conventional port setup will appeal to those with existing multi-monitor setups, but it may prove to be a frustrating limitation in the future.

On the, er, weird side, Asus has elected to give the Strix 970 a single aux power input of the 8-pin variety. That’s unusual, and Asus touts this config as an advantage, since it simplifies cable management. I suppose that’s true, and perhaps 8-pin power connectors are now common enough that it makes sense to use them by default. Still, I was surprised not to see a dongle in the box to convert two 6-pin connectors into an 8-pin one.

Here’s another version of the GTX 970 that just made its way into Damage Labs. The MSI GTX 970 Gaming 4G has the same clock speeds as the Strix, but its cooler is even flashier. MSI says this card will sell for $359.99. I haven’t yet managed to test this puppy completely, but we’ll follow up on it in a future article.

Nvidia is trimming its lineup to make room for these new GeForces. The firm is so confident in the Maxwell cards that it’s ending shipments of GeForce GTX 770, 780, and 780 Ti cards, effective now. Meanwhile, the GeForce GTX 760’s price is dropping to $219. That should be a pretty good clue about how the newest GeForces alter the landscape.

Sizing ’em up

Do the math involving the clock speeds and per-clock potency of the GM204 cards, and you’ll end up with a comparative table that looks something like this:

|                    | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Peak shader arithmetic rate (tflops) | Peak rasterization rate (Gtris/s) | Memory bandwidth (GB/s) |
|--------------------|----|---------|-----|-----|-----|
| Radeon R9 285      | 29 | 103/51  | 3.3 | 3.7 | 176 |
| Radeon R9 280X     | 32 | 128/64  | 4.1 | 2.0 | 288 |
| Radeon R9 290      | 61 | 152/76  | 4.8 | 3.8 | 320 |
| Radeon R9 290X     | 64 | 176/88  | 5.6 | 4.0 | 320 |
| GeForce GTX 770    | 35 | 139/139 | 3.3 | 4.3 | 224 |
| GeForce GTX 780    | 43 | 173/173 | 4.2 | 4.5 | 288 |
| GeForce GTX 780 Ti | 45 | 223/223 | 5.3 | 4.6 | 336 |
| GeForce GTX 970    | 75 | 123/123 | 3.9 | 4.7 | 224 |
| Asus Strix GTX 970 | 80 | 130/130 | 4.2 | 5.0 | 224 |
| GeForce GTX 980    | 78 | 156/156 | 5.0 | 4.9 | 224 |

The rates above aren’t destiny, but they do tend to be a pretty good indicator of how a given GPU will perform. Since the GM204 can run at higher clock speeds than the GK110, the GeForce GTX 980 is able to give even the mighty GTX 780 Ti a run for its money in terms of shader arithmetic—with a peak rate of five teraflops—and rasterization. The 980 trails a bit in the texture filtering department, but look at that pixel fill rate. Nothing we’ve seen before comes all that close.
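
If you’d like to reproduce the numbers in the table, the math is simple enough. Here’s a minimal sketch for the GTX 980’s row, using the boost clock as the operating frequency; the factor of two in the flops line counts a fused multiply-add as two operations.

```python
# Theoretical peak rates for the GTX 980 from its per-clock resources and boost clock.
boost_clock_ghz = 1.216     # typical boost clock, GHz

rops = 64                   # pixels per clock
texture_units = 128         # texels filtered per clock (int8)
shader_processors = 2048
tris_per_clock = 4

pixel_fill = rops * boost_clock_ghz                      # ~78 Gpixels/s
texel_rate = texture_units * boost_clock_ghz             # ~156 Gtexels/s
flops = shader_processors * 2 * boost_clock_ghz / 1000   # ~5.0 tflops (FMA = 2 ops)
raster_rate = tris_per_clock * boost_clock_ghz           # ~4.9 Gtris/s

print(round(pixel_fill), round(texel_rate), round(flops, 1), round(raster_rate, 1))
```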

Contrast that prowess to the GTX 980’s relatively modest memory bandwidth, which is no higher than the prior-gen GTX 770’s, and you might ask some questions about how this new balance of resources is supposed to work. The answer, it turns out, is similar to what we saw with AMD’s Tonga GPU a couple of weeks back.

Nvidia Senior VP of Hardware Engineering Jonah Alben revealed in a press briefing that Maxwell makes more effective use of its memory bandwidth by compressing rendered frames with a form of delta-based compression. (That is, checking to see whether a pixel’s color has changed from a neighboring pixel and perhaps only storing information about the amount of change.) In fact, Alben told us Nvidia GPUs have used delta-based compression since the Fermi generation; Maxwell’s compression is the third iteration. The combination of better compression and more effective caching allows Maxwell to reduce memory bandwidth use substantially compared to Kepler—by 17% to 29% in workloads based on popular games, according to Alben.
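
Nvidia hasn’t disclosed the details of its compression scheme, so the snippet below is only a toy illustration of the general delta idea, not the hardware algorithm: when neighboring pixels differ by small amounts, the differences can be packed into far fewer bits than the raw color values.

```python
# Toy illustration of delta-based color compression (not Nvidia's actual algorithm).
def delta_encode(row):
    """Store the first value raw, then each pixel's difference from its left neighbor."""
    return [row[0]] + [b - a for a, b in zip(row, row[1:])]

def delta_decode(encoded):
    out = [encoded[0]]
    for d in encoded[1:]:
        out.append(out[-1] + d)
    return out

row = [200, 201, 201, 203, 202, 202, 204, 205]   # 8-bit channel values across one row
deltas = delta_encode(row)
print(deltas)   # [200, 1, 0, 2, -1, 0, 2, 1] -- mostly tiny values, cheap to pack
assert delta_decode(deltas) == row
```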

So what happens when we try 3DMark Vantage’s color fill test, which is limited by pixel fill rate and memory bandwidth, on the GTX 980?

Yeah, that works pretty darned well. The GTX 980 paints over twice as many pixels in this test as the GK104-based GTX 770, even though the two cards have the same 224 GB/s of memory bandwidth.

On paper, the GTX 980’s other big weakness looks to be texturing capacity, and in practice, the 980 samples textures at a lower rate than its competition. The GTX 970 even falls slightly behind the GTX 770 in this synthetic test, just as it does on paper. We’ll have to see how much of a limitation this weakness turns out to be in real games.

The GM204 cards have some of the highest rasterization rates in the table above, and they make good on that promise in these tests of tessellation and particle manipulation. The GTX 980 sets new highs in both cases.

In theory, the GeForce GTX 780 Ti has more flops on tap and higher memory bandwidth than the GTX 980, so it should perform best in these synthetic tests of shader performance. In reality, though, Maxwell delivers on more of its potential. Even with a big memory bandwidth handicap, the GTX 980 outperforms the GTX 780 Ti in both benchmarks. Only AMD’s big Hawaii in the Radeon R9 290X is more potent—and not by a huge margin.

Maxwell’s other innovations

In addition to the performance and efficiency gains we’ve discussed, Nvidia has built some nifty new features into Maxwell-based products. I’ve been awake for five days straight on a cocktail of pure Arabica coffee, Five Hour Energy shots, methadone, ginkgo biloba, and anti-freeze. The hallucinations are starting to get distracting, but I’ll attempt to convey some sense of the new features if I can. To make that happen, I’m resorting to an old-school TR crutch, the vaunted bulleted list of features. Here’s what else is new in Maxwell:

  • Something called Dynamic Super Resolution — Some of us graphics nerds have been bugging Nvidia for years about exposing supersampled antialiasing as an easy-to-access control panel option or something along those lines. They’ve finally found a way to make it happen, and they’ve taken the concept one step further. Supersampling generally involves rendering two or more samples per pixel and then combining the results in order to get a higher-quality result, and it was in use in real-time graphics as far back as the 3dfx days. Multisampled AA, which more efficiently targets only object edges, has largely supplanted it.

    DSR brings supersampling back by letting users select higher resolutions, via in-game menus, than their monitors can natively support. For instance, a gamer with a 1080p display could choose the most popular 4K resolution of 3840×2160, which is exactly four times the size of his display. The graphics card will then render the game at a full 3840×2160 internally and scale the output down to 1920×1080 in order to match the display. In doing so, every single pixel on the screen will have been sampled four times, producing a smoother, higher-quality result than what’s possible with any form of multisampling.

    DSR goes beyond traditional supersampling, though. Rather than just sample multiple times from within a pixel, it uses a 13-tap Gaussian downsizing filter to produce a nice, soft result. The images it produces are likely to be a little softer and more cinematic-feeling. This filter also has the advantage of being able to resize from intermediate resolutions. For instance, the user could select 2560×1440, and DSR would downsize to 1080p even though it’s not a perfect 2:1 or 4:1 fit. (There’s a toy downsampling sketch just after this list of features.)


    Sounds good in theory, but I’ve not had the time to attach a lower-res monitor to my Maxwell cards to try it yet. (The images above come from Nvidia.) I’m sure we’ll revisit this feature in more detail later. Nvidia says DSR will begin its life as a Maxwell exclusive, but the company expects this feature to make its way to some older GeForce cards via driver updates eventually.

  • MFAA — M-F’in’ AA? No, tragically, the name is not that epic. It’s just “multi-frame sampled antialiasing,” apparently. I think every new GPU launch requires a novel AA method, and we keep getting some interesting new attempts, so why not? MFAA seeks to achieve the quality of 4X multisampling at the performance cost of 2X multisampling. To do so, it combines several elements. The subpixel sample points vary from one pixel to the next in interleaved, screen-door fashion, and they swap every other frame. The algorithm then “borrows” samples from past frames and combines them with current samples to produce higher-quality results—that is, smoother edges.

    Nvidia showed a demo of this feature in action, and it does seem to work. I have questions about exactly how well it works when the camera and on-screen objects are moving rapidly, since borrowing temporally from past frames probably falls apart with too much motion. Unfortunately, Nvidia wasn’t willing to say exactly how the MFAA routine decides what samples to borrow from past frames, so it’s something of a mystery. One wonders whether it will really be any better than pretty decent methods like SMAA, which are already widely deployed in games and offer similar promises of 4X MSAA quality at 2X performance.

    MFAA isn’t yet enabled in Nvidia’s drivers, so we can’t test it. One plus of MFAA, once it arrives, is that it can be enabled via a simple on-off switch; it doesn’t require integration into the game engine like Nvidia’s TXAA does.

    More interesting than MFAA itself is the fact that Maxwell has much more flexibility with regard to AA sampling points than Kepler. On Maxwell, each pixel in a 4×4 quad can have its own unique set of subpixel sample points, and the GPU can vary those points from one frame to the next. That means Maxwell could allow for much more sophisticated pseudo-stochastic sampling methods once it’s been in the hands of Nvidia’s software engineers for more than a few weeks.

  • Substantial new rendering capabilities for DX12 — Yes, DirectX 12 isn’t just about reducing overhead. It will have some features that require new GPU hardware, and Maxwell includes several of them. In fact, Direct3D Lead Developer Max McMullen from Microsoft delivered the news at the Maxwell press event. What’s more, a new revision of Direct3D 11, version 11.3, will also expose these same hardware features. The highlights included rasterizer ordered views (ROVs), typed UAV loads, volume tiled resources, and conservative rasterization.

    I’d like to explain more about what precisely these features are and what they do, but that will have to wait for a future article. Interestingly enough, many of these features are not present in the GM107 chip that debuted earlier this year. GM204 contains some significant new technology.

  • Accelerated voxel-based global illumination — There’s some overlap with the prior point and this one, but it’s worth calling out the fact that Nvidia has built hardware into Maxwell—some of which won’t be exposed via DX12—to accelerate a specific method of global illumination the company has been developing for some time. Maxwell can “multicast” incoming geometry to multiple viewports in order to facilitate the conversion of objects into a low-res series of blocks or 3D pixels known as voxels. Once the voxel grid is created, it can be used to simulate light bounces in order to create high-quality, physically correct indirect lighting. That could prove to be a huge advance for real-time graphics and gaming, and it deserves more attention than I can give it right now.
  • VR Direct — This is essentially a suite of features Nvidia has been implementing in its drivers to better support virtual reality headsets like the Oculus Rift. Most of those features have to do with reducing the latency between user input (head movements, usually) and visual output (when images reflecting the input reach the screen). Nvidia has even implemented in its drivers an improved version of Carmack’s “time warp” method of repositioning a frame post-rendering. At least, that is the claim. We’ve not yet been able to try a Rift with this feature enabled.
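
As promised in the DSR item above, here’s a toy sketch of the downsampling step. It renders nothing; it just shows a 3840×2160 buffer being reduced to 1920×1080, so that each output pixel averages four rendered samples. Keep in mind that Nvidia’s filter is a 13-tap Gaussian and can handle non-integer ratios like 2560×1440 → 1920×1080; a plain 2×2 box average is used here purely for brevity.

```python
import numpy as np

# Toy stand-in for DSR's downsampling step: a 4x supersampled frame -> native resolution.
supersampled = np.random.rand(2160, 3840, 3)   # pretend this is a frame rendered at 4K

h, w, c = supersampled.shape
native = supersampled.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

print(native.shape)   # (1080, 1920, 3): each output pixel is the mean of a 2x2 sample block
```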

Our testing methods

We’ve tested as many different competing video cards against the new GeForces as was practical. However, there’s no way we can test everything our readers might be using. A lot of the cards we used are renamed versions of older products with very similar or even identical specifications. Here’s a quick table that will decode some of these names for you.

| Original             | Closest current equivalent |
|----------------------|----------------------------|
| GeForce GTX 670      | GeForce GTX 760 |
| GeForce GTX 680      | GeForce GTX 770 |
| Radeon HD 7950 Boost | Radeon R9 280 |
| Radeon HD 7970 GHz   | Radeon R9 280X |

If you’re a GeForce GTX 680 owner, Nvidia thinks you may want to upgrade to the GTX 980 once you’ve seen what it can do. Just keep in mind that our results for the GTX 770 should almost exactly match a GTX 680’s.

Most of the numbers you’ll see on the following pages were captured with Fraps, a software tool that can record the rendering time for each frame of animation. We sometimes use a tool called FCAT to capture exactly when each frame was delivered to the display, but that’s usually not necessary in order to get good data with single-GPU setups. We have, however, filtered our Fraps results using a three-frame moving average. This filter should account for the effect of the three-frame submission queue in Direct3D. If you see a frame time spike in our results, it’s likely a delay that would affect when the frame reaches the display.
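
For the curious, that filtering step is nothing exotic. The exact windowing isn’t terribly important, but a three-frame moving average over per-frame render times looks something like this sketch:

```python
# Three-frame moving average over per-frame render times (in milliseconds).
def moving_average_3(frame_times):
    smoothed = []
    for i in range(len(frame_times)):
        window = frame_times[max(0, i - 2):i + 1]   # the current frame plus up to two prior
        smoothed.append(sum(window) / len(window))
    return smoothed

raw = [16.5, 16.7, 45.0, 16.2, 16.4, 16.6]
print([round(t, 1) for t in moving_average_3(raw)])
# roughly [16.5, 16.6, 26.1, 26.0, 25.9, 16.4] -- a single spike is spread across the queue depth
```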

We didn’t use Fraps with BF4. Instead, we captured frame times directly from the game engine itself using BF4’s built-in tools. We didn’t use our low-pass filter on those results.

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

| Component       | Details |
|-----------------|---------|
| Processor       | Core i7-3820 |
| Motherboard     | Gigabyte X79-UD3 |
| Chipset         | Intel X79 Express |
| Memory size     | 16GB (4 DIMMs) |
| Memory type     | Corsair Vengeance CMZ16GX3M4X1600C9 DDR3 SDRAM at 1600MHz |
| Memory timings  | 9-9-9-24 1T |
| Chipset drivers | INF update 9.2.3.1023, Rapid Storage Technology Enterprise 3.6.0.1093 |
| Audio           | Integrated X79/ALC898 with Realtek 6.0.1.7071 drivers |
| Hard drive      | Kingston HyperX 480GB SATA |
| Power supply    | Corsair AX850 |
| OS              | Windows 8.1 Pro |
|                      | Driver revision      | GPU base core clock (MHz) | GPU boost clock (MHz) | Memory clock (MHz) | Memory size (MB) |
|----------------------|----------------------|------|------|------|------|
| Radeon HD 7950 Boost | Catalyst 14.7 beta 2 | 925  | —    | 1250 | 3072 |
| Radeon R9 285        | Catalyst 14.7 beta 2 | 973  | —    | 1375 | 2048 |
| XFX Radeon R9 280X   | Catalyst 14.7 beta 2 | 1000 | —    | 1500 | 3072 |
| Radeon R9 290        | Catalyst 14.7 beta 2 | 947  | —    | 1250 | 4096 |
| XFX Radeon R9 290X   | Catalyst 14.7 beta 2 | 1000 | —    | 1250 | 4096 |
| GeForce GTX 760      | GeForce 340.52       | 980  | 1033 | 1502 | 2048 |
| GeForce GTX 770      | GeForce 340.52       | 1046 | 1085 | 1753 | 2048 |
| GeForce GTX 780      | GeForce 340.52       | 863  | 902  | 1502 | 3072 |
| GeForce GTX 780 Ti   | GeForce 340.52       | 876  | 928  | 1750 | 3072 |
| Asus Strix GTX 970   | GeForce 344.07       | 1114 | 1253 | 1753 | 4096 |
| GeForce GTX 980      | GeForce 344.07       | 1127 | 1216 | 1753 | 4096 |

Thanks to Intel, Corsair, Kingston, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Also, our FCAT video capture and analysis rig has some pretty demanding storage requirements. For it, Corsair has provided four 256GB Neutron SSDs, which we’ve assembled into a RAID 0 array for our primary capture storage device. When that array fills up, we copy the captured videos to our RAID 1 array, comprised of a pair of 4TB Black hard drives provided by WD.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Thief

For this first test, I decided to use Thief’s built-in automated benchmark, since we can’t measure performance with AMD’s Mantle API using Fraps. Unfortunately, this benchmark is pretty simplistic, reporting only an FPS average (along with maximum and minimum numbers, for whatever that’s worth).

Welp, this is a pretty nice start for the GTX 980. Nvidia’s newest is faster than anything else we tested at both resolutions, and the GTX 970 isn’t far behind. The fastest single-GPU Radeon, the R9 290X, can’t quite keep up, even when using AMD’s proprietary Mantle API.

The generational increase from the GK104-based GTX 770 to the GM204-based GTX 980 is enormous.

Watch Dogs


Click the buttons above to cycle through the plots. Each card’s frame times are from one of the three test runs we conducted for that card. Most of these cards run Watch_Dogs pretty well at these settings, with no major spikes in frame times.

The GTX 980 is the overall champ in the FPS average sweeps, and it backs that victory up by taking the top spot in our 99th percentile frame time metric. That means in-game animations should be generally smooth, not just a collection of high frame rates punctuated by slowdowns. Amazingly, even the Asus GTX 970 outperforms the GTX 780 Ti.


We can better understand in-game animation fluidity by looking at the “tail” of the frame time distribution for each card, which shows us what happens in the most difficult frames. As you can see, the GTX 970 and 980 perform well right up to the last few percentage points worth of frames.
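
Both the 99th-percentile figure and these “tail” curves come straight from the raw per-frame render times. Here’s a minimal sketch of the computation (the frame-time list is just example data):

```python
import numpy as np

frame_times_ms = np.array([15.2, 16.0, 16.4, 17.1, 16.6, 35.0, 16.2, 16.8])  # example data

# 99th-percentile frame time: 99% of frames complete in this many milliseconds or less.
p99 = np.percentile(frame_times_ms, 99)

# The "tail" plots simply chart frame time against percentile across the whole test run.
percentiles = np.arange(50, 100)
tail_curve = np.percentile(frame_times_ms, percentiles)

print(round(float(p99), 1))
```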


These “time spent beyond X” graphs are meant to show “badness,” those instances where animation may be less than fluid—or at least less than perfect. The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you’re not rendering any faster than 20 FPS, even for a moment, then the user is likely to perceive a slowdown. 33 ms correlates to 30 FPS or a 30Hz refresh rate. Go beyond that with vsync on, and you’re into the bad voodoo of quantization slowdowns. And 16.7 ms correlates to 60 FPS, that golden mark that we’d like to achieve (or surpass) for each and every frame.
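
The basic idea behind the metric: for every frame that takes longer than the threshold, add up the time spent past it. A quick sketch of that accounting (example data only; the exact tooling behind our graphs may differ in the details):

```python
# "Time spent beyond X": total milliseconds of rendering time past a given frame-time threshold.
def time_beyond(frame_times_ms, threshold_ms):
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

frames = [16.1, 16.4, 52.0, 17.0, 34.5, 16.7]   # example per-frame render times (ms)
for cutoff in (50.0, 33.3, 16.7):
    print(cutoff, round(time_beyond(frames, cutoff), 1))
# 50.0 -> 2.0 ms, 33.3 -> 19.9 ms, 16.7 -> 53.4 ms
```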

The GTX 980 stays almost entirely below the 16.7-ms threshold here, which means it’s not far from perfectly matching a 60Hz monitor’s desire for a new frame every refresh interval. When you slice it this way, the GTX 980’s lead over the competition looks even larger.

Overall, this is a nice set of results in that the frame-time-based metrics all seem to correspond with the FPS average. None of the cards are exhibiting the sort of bad behavior that our time-sensitive metrics would highlight. That said, the new GeForces perform very well, and the $339 Asus Strix GTX 970 very nearly matches the performance of AMD’s fastest single-GPU product, the Radeon R9 290X.

Crysis 3


A look at the frame time plots will show you that the Radeons encounter a couple of slowdowns during our test session. By the seat of my pants, I know that’s the spot where I’m shooting exploding arrows at the bad guys like one of the Duke boys. Nvidia cards used to slow down similarly at this same spot, but a driver update earlier this year eliminated that problem. As a result, GeForce cards take the top three places in our 99th percentile frame time metric, despite the R9 290X having the second-fastest FPS average.


Those slowdowns on the Radeons are evident in the last two to three percentage points worth of frames.


Our “badness” metric captures the difference most dramatically. The GeForces spend no time beyond our 50-ms cutoff and very little above the 33-ms mark, while the Radeons spend many tens of milliseconds waiting for those long-latency frames.

Battlefield 4

We tested those last few games at 2560×1440. Let’s switch to 4K and see how the new GeForces handle that.


I have to admit, I was barely able to play through our test sequence on the slower cards here. The Radeon R9 285’s Tonga devil magic didn’t do much for it at 4K in BF4 at its Ultra settings.



In fact, none of the cards handle this scenario particularly well. The GTX 980 is the least objectionable, followed by the GTX 970, which is a testament to these solutions’ pixel-painting prowess. You’d probably want to double-up on video cards or reduce the image quality settings in order to play this game at 4K for any length of time.

Tomb Raider




Although the GTX 980 has a slight edge in terms of average FPS, the Radeon R9 290X performs a bit better overall when running Tomb Raider at 4K according to our time-sensitive metrics. This isn’t the sort of difference one would tend to perceive, though—and even the fastest cards would be struggling to provide a frame to a 60Hz display on every other refresh cycle. In other words, this ain’t the smoothest animation.

Borderlands 2




Borderlands 2 isn’t an especially challenging game for GPUs of this class to handle at 2560×1440, but I’d hoped we could learn something interesting here—and I think we have. Nearly every card has a 99th percentile frame time below 16.7 ms, which means all but the last 1% of frames are produced at a silky-smooth 60 FPS or better. The GeForces struggle just a little more than the Radeons in that last 1% of frames, though, as indicated by our “badness” metric.

Also, notice that I’ve added a new wrinkle to our “badness” results for this review: time spent beyond the 8.3-ms threshold. If you can stay below that threshold, you can pump out frames at 120Hz—perfect for a fast gaming display. Going with a fast graphics card does help considerably on this front, and the GTX 780 Ti gets the closest to achieving that goal.

A caveat: when you’re looking at frame times in such tiny intervals, things like CPU bottlenecks and run-to-run variations tend to play a larger role than they otherwise would. I think we may need to conduct more than three runs per card in order to get really reliable results on this front.

Power consumption

Please note that our “under load” tests aren’t conducted in an absolute peak scenario. Instead, we have the cards running a real game, Crysis 3, in order to show us power draw with a more typical workload.

We already know the GeForce GTX 980 generally outperforms the GTX 780 Ti and the Radeon R9 290X. Now we know that it does so while consuming substantially less power—virtually no more than the prior-gen GK104 GPU does aboard the GeForce GTX 770. That’s a remarkable achievement.

Noise levels and GPU temperatures

Don’t get too hung up on the noise levels at idle (and with the display off) here. The noise floor in my basement lab tends to vary a bit depending on factors I can’t quite pinpoint. I think the speed of the CPU fan may be the biggest culprit, but I’m not sure why it tends to vary.

Anyhow, the bottom line is that all of these cards are pretty quiet at idle. The only one that seems really different to my ears is the GTX 760, whose cooler is cheap and kind of whiny. The fact that the Asus Strix card completely stops its cooler at idle is most excellent and would count for more in an utterly silent environment.

The Nvidia reference cooler on the GTX 980 performs well here, almost exactly like it does on the GTX 770. Meanwhile, the dual-fan cooler on the Asus Strix GTX 970 has just as much heat to remove as Nvidia’s stock cooler—our power results tell us that—but it does so while making less noise and keeping its GPU at a much cooler temperature.

Conclusions

As usual, we’ll sum up our test results with a couple of value scatter plots. The best values tend toward the upper left corner of each plot, where performance is higher and prices are lower.


By either measure, the GeForce GTX 980 is the fastest single-GPU graphics card we tested. With the possible exception of the pricey Titan Black, it’s also the fastest single-GPU graphics card on the planet. Although $549 is a lot to pay, the GTX 980 manages to deliver appreciably better value than the GeForce GTX 780 Ti, which it replaces. The new GeForce flagship outperforms AMD’s Radeon R9 290X, as well. If you want the best, the GTX 980 is the card to get.

That’s an assessment based on price and performance, but we know from the preceding pages that the GTX 980’s other attributes are positive, too. The GM204 GPU’s power efficiency is unmatched among high-end GPUs. With Nvidia’s stock cooler, that translates into lower noise levels under load than any older GeForce or current Radeon in this class. I’m also quite happy with the suite of extras Nvidia has built into the Maxwell GPU, such as DSR for improved image quality and a big ROP count for high performance at 4K resolutions. This graphics architecture takes us a full generation beyond Kepler—and thus beyond what’s in current game consoles like the PlayStation 4 and Xbone.

All of that’s quite nice, if you insist on having the best and fastest. However, the GeForce GTX 970 is what really excites the value nexus in my frugal Midwestern noggin. For $339, the Asus Strix GTX 970 card we tested is astonishing, with overall performance not far from the GeForce GTX 780 Ti at a fraction of the price. This thing checks all of the boxes, with good looks, incredibly quiet operation, and relatively cool operating temperatures that suggest substantial overclocking headroom. As long as you can live with the fact that it has only one DisplayPort output, the Strix 970 looks like a mighty tempting upgrade option for anyone who has that itch.

Then again, MSI’s GTX 970 Gaming seems pretty nice, too. I’m not sure there are any bad choices here.

The folks who have tough choices to make are Nvidia’s competitors in the Radeon camp. What do you do when your single-GPU flagship has been bested by a smaller chip on a cheaper card? Cut prices, of course, and a well-placed industry source has informed us that a price reduction on the R9 290X is indeed imminent, likely early next week. We expect the R9 290X to drop to $399 and to receive a freshened-up Never Settle game bundle, too.

The revised price would certainly improve the 290X’s place on our value scatter plot. AMD would then be in the position of offering slightly better performance than the GeForce GTX 970 for a little more money—and with 100W of additional power consumption and associated fan noise. Does a game bundle make up for the extra power draw? I’m not sure what to make of that question. I’m also not sure whether a fully enabled Tonga variant could do much to alter the math.

I am happy to see vigorous competition and innovation giving PC gamers a better set of choices, though. Feels like it’s been a long time coming, and I’m still wondering at the fact that Nvidia was able to pull off this sort of advance without the benefit of a process shrink. We’ll have to dig deeper into some of Maxwell’s features, including multi-GPU operation, in the coming weeks.


Comments closed
    • Prestige Worldwide
    • 5 years ago

    Great review as always, but I think reviewing with Battlefield 4 only at 4K resolution makes no sense to 99% of single-gpu buyers. Nobody is going to have a good time playing a multiplayer game with 35fps.

    I just don’t see this as being a typical use case of those who will purchase this GPU. 1080p and 1440p results would be very useful to the rest of us.

      • itachi
      • 5 years ago

      on top of that it’s tested on a single mission 😉

    • kamikaziechameleon
    • 5 years ago

    970 looks like a steal!!!

    • 2x4
    • 5 years ago

    has anyone ordered evga 980 with tiger direct?? these people seem to be so lost – i ordered mine on 9/19 and item showed in stock. almost 2 weeks later, it’s either on back order or out of stock!

    • brothergc
    • 5 years ago

    from what I am reading their is a lot of coil whine on alot of the GTX 970 cards . ( just google “gtx970 coil whine “) Me I am going with the Asus strix

    • Sewje
    • 5 years ago

    Am I the only one that notices that the 970 gtx reports 32 ROPS in gpu-z and not the 64 all reviews are quoting in their specifications?, I can confirm in my 970gtx its 32, however i noticed some reviewers have a gpu-z shot and theirs shot says 32 ROP’s also but they specified 64 ROPS.

    • 2x4
    • 5 years ago

    980 is so hot – my card is on back order status until middle of next week!!

    • wizpig64
    • 5 years ago

    Given the numbering scheme and the die sizes listed on page 1, we probably won’t see GM210 until 20nm and a gtx 9xx series successor. That said I’m happy we’re still getting more value out of 28nm even if the manufacturing side has hit a bump in the road to 20.

      • Pwnstar
      • 5 years ago

      They won’t be going 20nm though. Perhaps you are thinking of 16nm FinFETs?

    • GrimDanfango
    • 5 years ago

    I got my GTX 980 today.
    It is very extremely fast.
    I am pleased.

    …Elite Dangerous, full everything, 2560×1440, minimum 75fps, often 120 (well, minimum 50 in the galaxy map, pointing towards the galaxtic center… they gotta tighten that bit up!)
    Extra wonderfully nice with G-Sync on the ROG Swift 🙂

      • TwoEars
      • 5 years ago

      Nice combo you’ve got there!

    • excession
    • 5 years ago

    Now then. (as they say here in the North of England)

    That Asus card does look bloody brilliant and I’d love one as my HD6850 is getting a bit old. Yes, I’m more than willing to swap from red to green.

    My issue is this:

    UK price: £300 inc VAT (sales tax), ex VAT it’s £250.
    Today, £250 = $410
    US price as stated in the article: $340 (which is before tax I believe?)
    $410 – $340 = $70
    Where are my $70 / £42?

    Cheeky gits.

    In fact, one bunch of thieving buggers want £360 inc VAT!
    [url<]http://www.dabs.com/products/asus-geforce-gtx-970-4gb-pci-express-3-0-directcu-ii-oc-strix-9RXD.html[/url<]

      • Airmantharp
      • 5 years ago

      But you don’t complain about all of your beloved social services then, do you?

        • CeeGee
        • 5 years ago

        He’s not complaining about the tax.

          • Airmantharp
          • 5 years ago

          The tax is only a part of what goes into pricing…

            • CeeGee
            • 5 years ago

            That’s right, it is however the part that goes directly into social services and he specifically removed it from the equation…

            • Airmantharp
            • 5 years ago

            There’s the whole ‘cost of doing business’ thing that goes into the price before the taxes…

            • CeeGee
            • 5 years ago

            The ‘cost of doing business’ has a tenuous link at best to the provision of good social services…

            • Airmantharp
            • 5 years ago

            It’s not a direct link, but in general, any place that spends quite a bit on social services also has a relatively high cost of living, and generally speaking everything is more expensive.

            Europe and Commonwealth nations vs. US, California vs. Texas, etc.

            • CeeGee
            • 5 years ago

            Much more likely factors in this case are things like exchange rate, as most business like to insure they don’t catch a cold from exchange rate fluctuations they charge a bit more to places that don’t use dollars. Also it’s more difficult to operate in lots of small European countries where logistics is more challenging than in the US and providing support in local languages may incur further costs which has to be covered by initial sale.

            There are also economies of scale at work here with the likely market much smaller in the UK than in the US and lest we forget there’s likely some price gouging going by the local online stores.

            • excession
            • 5 years ago

            Correct. Not complaining about tax at all (just for reference, 33% of my income goes directly to tax, and that’s before VAT, council tax, road fund licence, fuel duty, or TV licence.)

            I just don’t get why small electrical items like this are consistently around 20% more expensive in the UK. We’re not THAT much smaller a market; there are 64 million of us.

            • sschaem
            • 5 years ago

            … deleted … (post included VAT vs non VAT)

      • Freon
      • 5 years ago

      Is this really new or specific to these two cards?

    • stmok
    • 5 years ago

    I’m currently using a pair of Gainward GTX 580 (3GB) for rendering in Blender.

    Anyone know how the GTX 970 perform in this scenario?

      • Airmantharp
      • 5 years ago

      Better.

      • Krogoth
      • 5 years ago

      GTX 970 would faster than a single 580 due to clockspeed if anything else.

      Dual 580s will outrun the single GTX 970 by a good to somewhat small margin depending on the GPGPU application. In power efficiency, the 970 wins hands down.

    • ronch
    • 5 years ago

    Funny how Intel is beating AMD up in the x86 CPU industry while Nvidia continues to be the more respected GPU provider (even before AMD asked fans to crash NV’s party). So what does AMD do? Combine a weak CPU and weak GPU in one package! Behold! The realization of everyone’s dreams: the APU!!!

    Except Intel and Nvidia can both easily provide the same performance levels offered by these APUs at practically the same prices for those who stubbornly look the other way and insist APUs are just what every OEM is asking for to put inside PCs sold at the flea market.

    But I’m sure AMD will continue to beat the APU drum until they come up with a more competitive CPU to prop up their Radeons, which could also use some efficiency tweaks in light of Maxwell. To deny that Maxwell doesn’t worry them one bit is bordering on unwarranted pride or lying.

      • badpool
      • 5 years ago

      Can’t deny the performance/value advantage you get with intel and nvidia in most cases…but I’ve had a mini-ITX board with a passively cooled AMD APU serving me for 3 years and it’s been tops. For me, there’s still a place for the CPU+GPU die, so long as it adequately leverages it’s best attributes – low cost, low power, low noice, with “good enough” performance.

    • Cannonaire
    • 5 years ago

    I’m happy to see nVidia endorse downsampling in the form of a supported feature. I’m curious about the downsampling filter they use though – a 13-tap Gaussian filter should produce a decently sharp image without ringing, but is there any word on whether or not it is gamma-aware? That last detail is important when downsampling and particularly for high-contrast details.

    • Klimax
    • 5 years ago

    Looks like we know what’s happening in AMD:
    [url<]http://www.kitguru.net/components/graphic-cards/anton-shilov/amd-and-synopsys-to-co-design-14nm-10nm-apu-gpu-products/[/url<] Outsourcing...

    • Rza79
    • 5 years ago

    I don’t understand how the 980 is a 165W TDP card according to nVidia? Tom’s tested the card’s actual power usage. 185W under gaming and 285W under stress testing.
    57W and 20W less than the 290X respectively who’s TDP is 290W.

    This sentence caught my attention from Tom’s: [quote<]the new graphics card’s increased efficiency is largely attributable to better load adjustment and matching[/quote<] It seems nVidia's TDP is more like Intel's SDP. Or am I wrong?

      • Klimax
      • 5 years ago

      Wrong. TDP is for cooling solutions (amount of heat), only partially related to consumption. (Flipping bits still requires some power, which doesn’t produce heat)

      And second, so long on short term average it doesn’t exceed TDP, cooling solution will be able to cool it within parameters like temperature even if spikes of produced heat occur.

        • Rza79
        • 5 years ago

        In computer chips, anything that uses power will become heat.
        Tom’s stress test takes longer than one minute. That’s not short term.
        It spikes above 350W on occasions. That’s short term. 350W!
        It averages 285W. That means the card’s cooling solution needs to be able to handle 285W, as do your PSU and case cooling.
        There’s a reason nVidia uses the cooler from the GTX Titan Black.

          • Klimax
          • 5 years ago

          Still wrong. TDP is not consumption. There is no such thing in chips as 100% conversion to heat. A lot of power will get stored in electrons. and such.

          TDP is about cooling. PERIOD. It has only partial relation to consumption (because certain fraction of consumption will convert into heat obviously, but heat is only part of whole thing)

          As for my point about short term, that was for measured heat output to remove from system (GPU).

            • Rza79
            • 5 years ago

            You’re right to say that TDP is related to the heat produced and not consumption. It’s the typical heat production and doesn’t represent the worst case scenario.
            But … !
            On a GPU (or CPU for that matter), all the energy is converted to heat.
            Charge has to be moved from one place to another and it’s this current through resistance which causes heat. P=I²×R
            Chips do nothing more than changing from 0 to 1 and back. As such, TDP and power consumption are very related. TDP just doesn’t represent the peak power usage. Still 165W and 285W are very far apart in my opinion.

            • jihadjoe
            • 5 years ago

            100% efficient heating, like a dielectric heating element assumes the chip has no outputs and converts all input solely into heat. Processors are pretty far from that, because unlike dedicated heating elements they do output electricity, otherwise how would we get the results of all those computations?

            Chips are more like inefficient transformers than heaters.

            • Meadows
            • 5 years ago

            Sure, sure, but considering electronics have no moving parts, the conversion ratio to heat will still be pretty high and for all everyday intents and purposes you can consider it to be 100%.

            • Freon
            • 5 years ago

            Despite the continued trend of abhorrent up/down votes on TechReport article comments, this is factually correct.

            • NewfieBullet
            • 5 years ago

            You are correct that TDP is used to determine the cooling requirements and in many cases is not the maximum power that a device can dissipate. However the rest of your post makes no sense.
            Electronic devices do not do any mechanical work so nearly 100% of the power drawn by the card is lost as heat. If “A lot of power will get stored in electrons. and such” were true then this card would make a very good battery.

      • Meadows
      • 5 years ago

      It looks like Tom’s has measured motherboard power draw separately and they’ve added it to the GPU power draw to arrive at their total numbers.

      Even disregarding that, it would seem like the “SDP” notion looks correct for torture tests, but in real world usage the TDP seems right as it is.

        • Rza79
        • 5 years ago

        PCIe total = 2 x 6-pin connectors
        Motherboard 12v & 3.3v = PCIe slot

        So you need both.

        What’s worrying is that it’s pulling 230W from the two 6-pin connectors which together are rated for 150W. Could be an issue for some (cheaper) PSU’s I recon.

          • Meadows
          • 5 years ago

          Looks odd. Something’s not right with their measurements.

          • NovusBogus
          • 5 years ago

          …minus the power supplied by the PCIe bus itself, which I believe is either 50 or 75 watts. Some of the 750s don’t even need a 6 pin at all. I do agree that their numbers sound fishy though, it seems like they may have mixed up total power load for everything and load for just the card.

            • Meadows
            • 5 years ago

            Was my first reaction too, but then I noticed that’s what they labeled as “Motherboard”. The power the slot itself gives.

      • anotherengineer
      • 5 years ago

      That’s why I like techpowerup’s measurements. They measure the pcie slot and the pcie power connectors, for an actual or true card consumption.

      [url<]http://www.techpowerup.com/reviews/ASUS/GTX_970_STRIX_OC/23.html[/url<]

        • Rza79
        • 5 years ago

        Yeah good comparison. Checked out this review:
        [url<]http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/24.html[/url<] One thing noticable on Tom's review is that the stock card uses more power than the Gigabyte OC card. This points to a worse sample. But realistically, this also happens in real life. Not all chips are created equal. So their stock card represents a bit worse sample and their Gigabyte card represents a good sample but a bit OC'ed. So the idle seems more or less comparable (9 & 15W vs 8W). Gaming seems also comparable (174 & 185W vs 156W). This difference can easily be caused by the game used and/or the slight OC. Maximum 280 & 285W vs 190W. Big difference here but also tested differently. Tom's uses an unnamed GPGPU app and TPU uses Furmark. I guess Tom's GPGPU app is custom made so nVidia's driver can't have a trap for it. It's been known that both nVidia and AMD detect Furmark in the driver and adjust the card's settings accordingly. I think this is really worth looking into by TR. I'm not trying to bash the 980 here. I think it's a wonderful product but this raises some questions. I don't think Tom's made any mistake in the measurement. It's not like it's their first time. 230W from the PCIe power connectors, while they're rated for 150W, is a bit strange. This might be a driver bug.

      • MadManOriginal
      • 5 years ago

      Tom’s hardware guide…lol.

      • Zizy
      • 5 years ago

      Nah. It is just NV’s card runs near TDP all the time.
      TDP = cooling = max average power consumption. It isnt equal to max instantaneous power consumption, nor actual power consumption for any task.
      AMD is simply requiring better cooling for their cards even if their power consumption for most tasks is not that much higher as TDP would suggest.

      So, TDP is actually mostly irrelevant for consumers (matters only for board makers and their cooling systems), yet the number all see. MAX is somewhat relevant for PSU (but more relevant to board makers), AVG is for electricity bill and PSU.

      • exilon
      • 5 years ago

      Looks like to me the Gigabyte custom ones just have a raised TDP of 250W/300W

      The reference 980 is throttling at TDP just fine.

    • snook
    • 5 years ago

    are these test against a reference 290X?

    please, temper some of the joy with AMD has an answer, just like it did for the GTX780.

      • Airmantharp
      • 5 years ago

      A response is on the way, but then again so is Nvidia’s response to that.

        • snook
        • 5 years ago

        3+ as well as AMD’S. Tis the way it goes I guess

    • rpjkw11
    • 5 years ago

    A VERY nice upgrade for my GTX 760: The Asus Strix GTX 980! Now I’m waiting for a GTX 970Ti or 980Ti to replace my Asus GTX780Ti DCU II OC.

    BTW: I’m bowled over at the price for these cards, especially the GTX 970. That’s gotta be the best bang-for-buck in a long, long time.

    • rogue426
    • 5 years ago

    The 970 GTX is the first piece that gives me that feeling of ” I have to retire my Phenom X2 box now”. Combine that with the mid range Haswell-E or the Devil’s Canyon chip running at 4 ghz , 1TB SSD and the upgrade and replace itch might not go away .

    • Captain Ned
    • 5 years ago

    Well, that made it very easy to pick which card would join the parts already purchased after BBQ for the next rig. I fully expect to get a good 5 years out of a 970, just like I did with a Ti4200 (with mad OC; I called it a Ti4700) and an 8800 GTS 640MB.

    Now to pick the new monitor.

    Oh, and with a P182 with all 6 case fan slots filled (might have to yank one to squeeze a 970 in there) I’m just fine on cooling. Cat hair is the larger problem.

      • Voldenuit
      • 5 years ago

      [quote<]Oh, and with a P182 with all 6 case fan slots filled (might have to yank one to squeeze a 970 in there) I'm just fine on cooling. Cat hair is the larger problem.[/quote<] Time to upgrade your cat.

        • Captain Ned
        • 5 years ago

        The short-hair is fine. The other one leaves Stimpy-sized hairballs everywhere. It’s so enjoyable to find one with a bare foot while sleep-walking to let the dog out at 3:30AM.

      • JustAnEngineer
      • 5 years ago

      [quote=”Captain Ned”<] Now to pick the new monitor. [/quote<] If you don't wait for Adaptive-Sync, pick up one of the Korean WQHD IPS LCD monitors. They're just $400-$450 for a 30" 2560x1600 model or $300-$340 for a 27" 2560x1440 model. Don't settle for a puny 1920x1080 display.

    • sschaem
    • 5 years ago

    nvidia must be feeling some pressure. New faster cards at a lower price.

    GTX 780 TI -> GTX 980 ~5% faster, $50 price drop
    GTX 780 -> GTX970 ~5% faster, $150 price drop

    If AMD doesn’t counter with a $350 R9-290x, things are going to get really ugly for amd…
    If they cant cover their CPU division losses with GPU sales, they wont last until 2016 🙁

      • moose17145
      • 5 years ago

      They wont last until 2016??? Rofl wut?

      You do know they are making the chips for the PS4, XBone, and WiiU right? Pretty sure those three companies do not want AMD going under since they are kind of the ones making the chips that are powering their consoles…

      How long has everyone been saying AMD is done for? And yet here they still are…. And very very likely there they will still be in 2016…

        • sschaem
        • 5 years ago

        Look at their financial. even with 15 million Ps4/xb1 sold in the past year, AMD went 300 million further into debt.

        “As of the end of June, AMD had $948 million in cash and equivalents and $2.21 billion in total debt, compared with $1.12 billion in cash and $2.05 billion in debt a year earlier, according to data compiled by Bloomberg.”

        On September 4th AMD CEO warned about upcoming “bumps in the road” until they can deliver their new architecture. The stock has been dropping everyday since that statement,
        10% in 2 weeks so far.

        You can also see an acceleration of insider share selling in the past 3 month…

        They also have no more asset to borrow against.

        So yea, Sony and MS do want their chip made, but AMD can provide that even if they end up firing 50% of its workforce.

        For me done for is : R&D department so far behind they have no chance of a comeback.
        As Intel demonstrated in the past month , this is now a fact.

        We also see how little to no impact 15 million console sold is doing to AMD bottom line, they are still loosing money.

        And their 2014 APU lineup is a complete loss… Things are getting worse, much, MUCH worse … not better. And this nvidia new lineup is a death blow.

        • Deanjo
        • 5 years ago

        [quote<]You do know they are making the chips for the PS4, XBone, and WiiU right?[/quote<] AMD are not really making the chips however. Yes, AMD did design them according the the wishes of MS and Sony however if AMD went belly up, chances are that MS and Sony still have rights to those designs. All that would be needed is the blessing from intel to do so and with AMD out of the picture, they would get it. Consoles are not ever evolving within a console generation like a pc is. The chip that was available at launch is the same chip that is used at the end of the consoles life. If AMD folded, MS and Sony would likely have permission to continue on producing what is already available. That will last them easily 4-5 years until they can simply grab an intel cores and nvidia graphics and continue on with their next gen. Keep in mind as well that there are no guarantees that the next gen consoles are going to be x86 either. They could be just as easily ARM cores with PowerVR or Nvidia graphics cores.

      • jihadjoe
      • 5 years ago

      Last I knew the 780Ti had MSRP at $700, so that’s a $150 price drop going to the 980.

    • Dygear
    • 5 years ago

    Love how The Verge links to the final page of this article. The Tech Report has been name-dropped a lot in the past 24 hours as well. JJ did a video with NewEggTV on the ASUS SWIFT monitor, and they mentioned The Tech Report. That's always nice :). Keep up the good work, guys. Although you seem to be alluding to future articles about these subjects more and more, and it's starting to feel like you're phoning it in. The article feels incomplete for that reason. Just my 2 cents.

      • Captain Ned
      • 5 years ago

      As Damage said, the 2 970s hit his front porch while he was testing the 980. This is the kind of market-changing product that you need to have on your front page 30 seconds after the NDA drops, and some mfgs were slow with the review samples. Age-old problem in journalism of any kind: lead with less than you'd like, or follow with everything and miss the wave.

      On another note, the winner’s prize for next year’s cornhole tournament at the BBQ might have a few more people practicing, though I’m pretty sure I know where the 980 will reside. Toss us a bone with one of the 970s, maybe?

    • juampa_valve_rde
    • 5 years ago

    Sweet cards, interesting TDP/efficiency, but I don't like the way these are getting promoted on the low TDP, because actual power consumption, although improved, ain't a deal breaker. About the new features, just meh… like the 285. Anyway, I'm waiting for the 285X (full Tonga); that's probably my upgrade path.

    • brucethemoose
    • 5 years ago

    What about overclocks? The 780 Ti has a ton of OC headroom because it's clocked so low, but the 980 is already at ~1200MHz out of the box… I love TR GPU reviews, but I've never understood why they leave the cards at stock clocks.

    Also, for anyone wondering about AMD's response, the latest rumors point towards a mid-sized 20nm GPU with on-package “HBM” memory, though it's probably a while away.

      • Airmantharp
      • 5 years ago

      Things get left at stock to produce baselines; factory overclocked products and products customized for the sake of overclocking (coolers, power phases, etc.) need to be considered individually.

      • laggygamer
      • 5 years ago

      Don't quote me, but from what I know the 980 can push 1400MHz on average, and 1500MHz if you get lucky, so with both cards at max OC the 980 should still be 5-10% faster. The TDP starts getting high when you push the 780 Ti, though; well over 300W.

        • brucethemoose
        • 5 years ago

        That’s amazing. 1300MHz is a high OC for most Kepler cards, 1400MHz+ is pretty remarkable for a GPU using the same process.

          • Ninjitsu
          • 5 years ago

          Yeah AT’s seen results like that, see:
          [url<]http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/22[/url<]

    • laggygamer
    • 5 years ago

    Can anyone explain to me why the 770 outperforms the 780 in 3dmark vantage color fill?

      • derFunkenstein
      • 5 years ago

      GPU has higher clocks but less of everything. Very curious.

        • Meadows
        • 5 years ago

        My best guess is the colour fill test does not saturate the link to VRAM but benefits from its speed regardless. The VRAM on the 770 runs at 7 Gbps while the 780 “only” does 6 Gbps.

        The GPU clock speeds seemed suspect to me too at first, but the 780 has 50% more ROPs, so it should theoretically deliver more pixels even at a significantly lower clock speed.
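
        Rough back-of-the-envelope numbers, treating the reference boost clocks (about 1085 MHz for the 770 and 900 MHz for the 780) as assumptions: peak fill rate is roughly 32 ROPs x 1085 MHz ≈ 35 Gpixels/s for the 770 versus 48 ROPs x 900 MHz ≈ 43 Gpixels/s for the 780, and memory bandwidth is 224 GB/s (256-bit x 7 Gbps) versus 288 GB/s (384-bit x 6 Gbps). On paper the 780 wins both, so whatever is holding it back in this test isn't raw throughput.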

          • laggygamer
          • 5 years ago

          I would really like TR to get to the bottom of why that happened.

            • derFunkenstein
            • 5 years ago

            Well, it's 3DMark, and it doesn't seem to carry over into actual games. Must be just another way that 3DMark isn't like actual games.

          • derFunkenstein
          • 5 years ago

          The thing about memory is that the 780 has a 384-bit bus, whereas the 770 has a 256-bit bus. So the clocks are higher, but the overall bandwidth of the 780 is higher.

            • Meadows
            • 5 years ago

            That's why I suggested the test might not saturate the link; it probably moves only a little data but still benefits from the higher transfer speed. Then again, this is coming from someone who has no idea.

      • Krogoth
      • 5 years ago

      It is probably a statistical fluke, or there's some kind of strange bottleneck going on with the normal 780. The 780 does operate at lower clock speeds than the 770, but it has a wider bus and more shading units. If the application depended entirely on clock speed, then the 290/290X, 780 Ti, and 970/980 shouldn't be pulling that far ahead in it.

      • Damage
      • 5 years ago

      Sometimes, when a synthetic test really beats on one set of resources, the GPU with the higher clock frequency can outperform a “wider” GPU at a lower speed. We ran this same test in our R9 290X review and also saw the 770 outperform the 780 in that case, with an entirely different set of drivers and such:

      [url<]https://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/6[/url<]

      The possible reasons include:

      1) A single blocking resource in the GPU that's limited by clock speed.
      2) A memory access pattern that somehow benefits more from higher RAM speeds than a wider interface.
      3) Thermal behavior that causes the larger GPU to throttle more readily than the smaller one.
      4) Something else about the architecture of these chips that we don't (and probably can't) know.

      We've seen this sort of thing in the past in directed tests. I wouldn't get too hung up on it. The GTX 780 performs better in games than the 770, even in ROP-limited scenarios like BF4 at 4K w/4X MSAA.

        • laggygamer
        • 5 years ago

        Thank you for the response. I just found it interesting from the standpoint of understanding what the GPU is doing fundamentally in this particular benchmark. Testing with an overclocked 780 might shed some light on some of those theories if you ever get the chance. Of course, it's no big deal in gaming scenarios; this is just curiosity.

    • Ninjitsu
    • 5 years ago

    Looking at compute performance on AnandTech, Maxwell roughly ties the R9 290X (including in OpenCL), except in double-precision work, and that only because GM204's double-precision rate is deliberately gimped.

    Nvidia’s just taken everything with this one, haven’t they?

    • Klimax
    • 5 years ago

    For those wondering what is coming to DirectX:
    [url<]http://www.anandtech.com/show/8544/microsoft-details-direct3d-113-12-new-features[/url<]

    Intel got its tech in (PixelSync), aka ROVs. It looks like DX 11.3 and 12 will exist in parallel, because 12 will be much harder to use correctly due to its much lower-level nature. One thing is still missing: how code using these features will behave when new cards like the R9 285 come out.

    • BoBzeBuilder
    • 5 years ago

    Meh. Still see no reason to upgrade from my 8800 GTX.

      • Krogoth
      • 5 years ago

      Unless you are gaming at some low-end resolution (1280×1024 or 1024×768) or running a very old CPU, there's plenty of reason. The 970 is almost three to four times faster than the 8800 GTX.

        • Meadows
        • 5 years ago

        Fermi was 3-4 times faster, four years ago. A GTX 970 today is roughly six times faster than an 8800 GTX.
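
        As a rough sanity check on paper specs (reference clocks assumed, counting a multiply-add per clock): the 8800 GTX's 128 shaders at 1.35 GHz work out to about 0.35 TFLOPS, while the GTX 970's 1664 shaders at a ~1.18 GHz boost are about 3.9 TFLOPS, an 11x gap in theoretical shader throughput. Six times faster in practice sits comfortably inside that envelope.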

          • Mikael33
          • 5 years ago

          An 8800 GTX would be brutally slow in today's games, even at 1280×1024. I had to run my 4850 for a bit when I attempted to flash my 7850, and it was quite slow…

            • Meadows
            • 5 years ago

            Like I said, if a GTX 970 can do 50-60 fps in some game at some given settings, then the 8800 GTX will at most do 8-10 fps using the same settings.

            Edit: that doesn’t take into account the massive difference in DirectX version support.

            • swaaye
            • 5 years ago

            None of that matters to some people. There are still plenty of games that an 8800GTX would get the job done for. These reviews with Ultra settings on the latest AAA releases at 2560×1440 paint a different picture for the intended audience of course.

            • jihadjoe
            • 5 years ago

            Hey, he could be playing games with a [url=http://xkcd.com/606/<]5 year lag[/url<]. An 8800GTX still did acceptably in MW2 [s<]and Uncharted2[/s<]. Edit: My bad, was checking out some old reviews for this and my brain saw Uncharted in a sidebar.

            • tipoo
            • 5 years ago

            How did an 8800GTX play Uncharted 2?

            • NovusBogus
            • 5 years ago

            I played Rage on a GT 230 once. Resolution is everything.

            • sweatshopking
            • 5 years ago

            UNCHARTED ISN’T ON PC.

            • tipoo
            • 5 years ago

            Besides the point I was making…Uncharted is a PS3 exclusive.

            • swaaye
            • 5 years ago

            I played Bioshock Infinite on a 4850 and Rage on an 8800GT. Both ran acceptably at 1680×1050. The biggest issue is with 512MB RAM. It’s not enough for textures in recent games so you end up on low texture detail if you want a stable frame rate.

            Though I’m not saying those are my go-to gaming cards. I mainly use a 560 Ti or 6950. I’ve just had the chance to experiment with older cards. It’s interesting and gives some perspective compared to staring at the latest hype all day.

            • Rakhmaninov3
            • 5 years ago

            Never had an 8800, but my 9800 GTX plays Just Cause 2 with mostly high settings quite smoothly at 1080

            • Krogoth
            • 5 years ago

            I don’t understand the down-voting.

            People forget that the 8800 GTX was a powerhouse for a long time. The GTX 280 was nothing more than an 8800 GTX on 'roids. It wasn't until the GTX 480 that Nvidia released a GPU with over double the performance of the 8800 GTX, and the GTX 580 pushed it a bit further. The 970 is roughly in the ballpark of a vanilla 780, which puts it at roughly three to four times the performance if memory capacity isn't a bottleneck.

            However, if you factor in power efficiency, the 970 is easily eight times more efficient than the 8800 GTX, which was a power hog in its heyday.

            • flip-mode
            • 5 years ago

            Doing it wrong, you are. You have to multiply the performance factor (4X) by the coefficient of power consumption (8X). So the 970 is really 32 times better than the 8800GTX.

            • Krogoth
            • 5 years ago

            You are doing it wrong. The 970 consumes a fair amount of power at load, not quite the level of the 8800 GTX, but close to 50-60% of it.

            Which means you could power almost two 970s with the same power consumption as a single 8800 GTX, and each yields 3-4 times the performance. Double that and you get roughly 6 to 8 times the power efficiency.
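
            Spelling out that back-of-the-envelope math (the 50-60% load-power figure is an eyeball estimate, not a measured number): performance per watt scales as (3 to 4) / (0.5 to 0.6), which works out to roughly 5x to 8x, the same ballpark as the 6-8x above.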

            • flip-mode
            • 5 years ago

            You don’t sarcasm very well.

            • Nictron
            • 5 years ago

            I am still running my 8800 GTX in my son’s PC. 17″ screen 1280×1024. The following games are running like clockwork:

            – Bastion,
            – Grid & Grid 2,
            – Entire C&C Series,
            – Battlefield 3 & 4,
            – Sonic Racing,
            – Castlestorm,
            – can’t recall the rest.

      • MadManOriginal
      • 5 years ago

      Am I the only one who took this post as a meme-ish joke?

        • derFunkenstein
        • 5 years ago

        I did as well, but we might be the only ones.

        I bet if he’d said “I see no reason to upgrade from my Radeon 4850” the AMD fans would have him +11 though.

          • Meadows
          • 5 years ago

          Probably not. That card is the poo as well.

        • jihadjoe
        • 5 years ago

        I saw the joke, but I don't think “Meh” and “Wake me up when…” are funny anymore.

    • Yeats
    • 5 years ago

    Great review, thanks TR.

    These cards are definitely better than what I need. Sometimes, though, hardware comes along that you want to own simply because it's so good, and the GTX 970 is striking that chord with me.

    • ipach
    • 5 years ago

    it’s moments like this that make PC building so exciting and so damn frustrating. how is a customer not supposed to feel lame when they spend ~$300 on something like a 770 only to have it massively eclipsed by a similarly priced product just a couple months later?

      • MadManOriginal
      • 5 years ago

      Either:

      1) Realize that you got some utility out of the purchase you already made, and that it doesn’t suddenly start performing worse.

      2) Stay on top of information about product releases and don't buy something (unless at a substantial discount, perhaps) when the next generation is imminent. Just don't turn this into an "I'll get the next thing" syndrome where you don't buy anything for years.

        • ipach
        • 5 years ago

        well, like many of us, my gpu is a bit emotional. it displayed this article and immediately got warmer by 10 degrees C. been trying to talk it through its anxiety issues, but what can you do? i’ve even tried lying to it and telling it that even scott thinks it is beautiful and everything. but it can’t help but feel a bit unpretty now. damn new in town prom queens show up in their fancy cars… what’s a card to do?

      • LoneWolf15
      • 5 years ago

      Can I feel good about my two R9 280x cards for less than one 970?

      • Vaughn
      • 5 years ago

      It's called being an informed customer. If you don't want this to happen, you have to keep up with GPU roadmaps and pay attention to what is being released when. It's the only way to stay ahead of the buyer's remorse curve.

      • FuturePastNow
      • 5 years ago

      I avoid it by simply not spending that much on this stuff. When I build a PC, my budget for a video card is more like $150, tops.

      While the GTX970 looks awesome and looks like it hits a performance/dollar sweet spot in the comparison, at $300+ it’s priced well out of the affordability range for me.

      Buy a formerly high-end card when it is a generation or two old for far less than new. Or a mid-range card. And be happy with that.

      • Airmantharp
      • 5 years ago

      I know people that bought the last gen Mustang when it came out with the 4.6L, not realizing (as Vaughn mentions above about being an ‘informed customer’) that the 5.0L with another 100HP was coming.

      But they still enjoyed the crap out of their cars, and in the same way that someone would have enjoyed their GTX770.

      In either case (or any other), buy for a targeted capability. Something better is always around the corner.

    • Nation
    • 5 years ago

    Wow, simply amazing. The 970 definitely has my attention, and it only draws about 5 watts more than my 660. I am thinking an upgrade might be in my near future.

    • HisDivineOrder
    • 5 years ago

    I read Anandtech’s review of the 980, saw the performance numbers, and thought, “That’s a nice progression of technology for a product that did not get a dieshrink. Wonder if the 970 is going to be close enough to really impress…”

    Then I came here and saw.

    And all I can say is, “Wow.” The 970 is like the 670 or the 4200 of old. You lose a tiny bit of performance for a LOT of saved money.

    970 SLI looks incredibly compelling as an alternative to the 980 or any other product on the market, especially for 4K and beyond. I'm really shocked that nVidia didn't do more to differentiate the 970 and 980 beyond what they cut, because as it is, with no difference in memory or… something, the 980 is completely pointless in all but the most space-limited of scenarios.

    Assuming SLI works properly, of course.

      • Airmantharp
      • 5 years ago

      I have two GTX 670s, so I can confirm that SLI does indeed work quite well, though I do believe the experiences with SLI and CrossFire are roughly equivalent these days.

      Also, this same situation played out with the release of the R9 290 and R9 290X; they also provided excellent competition to existing Nvidia offerings and forced an across-the-board price adjustment.

        • Krogoth
        • 5 years ago

        SLI/CF are still mixed bags and are heavily dependent on the drivers/software in question. You still have the issue of micro-stutter and other interesting quirks with multi-card rendering to contend with as well.

          • Airmantharp
          • 5 years ago

          I cannot say that such quirks have been eliminated, but as a general rule, the last few years of reviewers focusing on frame times, and the accompanying elbow grease that AMD and Nvidia have applied to large frame-time variations, have made the issue largely academic.

          Generally speaking, in those instances where multi-GPU rendering is desired or essentially needed, it works very well.

            • laggygamer
            • 5 years ago

            My experience with two 670s in SLI is that it is smooth but input-lagged. However, if you are gaming at low fps you feel lagged anyway, so it cancels out. Overall it improves the experience dramatically. For example, at 120Hz, going from 45 fps with one card to 80 fps with two is like night and day. It's going to be absolutely essential for anyone who actually needs the horsepower. For 120Hz at 1440p, or anyone using 4K, GTX 970 SLI is going to be the king for the vast majority of enthusiasts who can't afford to be too extravagant.

            You will occasionally have a game here and there that hates SLI, but those tend to be games where either one card is already excessive or the game is broken regardless of how many cards you have. And you have to get a good bridge: if you get unlucky with the one you receive, it will ruin the whole experience until you replace it.

      • NovusBogus
      • 5 years ago

      My guess is that they see the 980 as a temporary flagship until they get 20nm technology.

    • Takeshi7
    • 5 years ago

    I think for your power tests you should run them at max load instead of in a real game. I like to use my graphics cards to run CUDA programs and they use a lot more power than when they are in games.

      • NewfieBullet
      • 5 years ago

      I think power tests over a range of workloads would be a good addition. Obviously the cards are not drawing maximum power in this game, since the 980, 970, and 770 have roughly the same power draw in this test whereas their TDPs are 165W, 145W, and 230W, respectively.

    • Klimax
    • 5 years ago

    BTW: the DirectX blog has some interesting bits:
    [url<]http://blogs.msdn.com/b/directx/archive/2014/09/18/directx-12-lights-up-nvidia-s-maxwell-editor-s-day.aspx[/url<]

    It looks like Microsoft cooperated heavily with Nvidia on DX and associated things.

    [quote<] Developing an API requires working in a graphics stack where many pieces are constantly changing: the graphics kernel, hardware specific kernel drivers, the API, hardware specific user-mode drivers, and the app itself. Adding new features and fixing bugs in such an environment requires the owners of each piece to work together in real-time to solve problems together. For several months, NVIDIA’s engineers worked closely with us in a zero-latency environment. When we encountered bugs, NVIDIA was right there with us to help investigate. When we needed new driver features to make something run, NVIDIA set an aggressive implementation date and then met that date. [/quote<]

    • El_MUERkO
    • 5 years ago

    What kind of Display Port and HDMI connections are they? DP 1.3, HDMI 2.0?

    • sweatshopking
    • 5 years ago

    I guess I should throw my 290x in the garbage.

      • chuckula
      • 5 years ago

      That’s not good for the environment SSK!

        • sweatshopking
        • 5 years ago

        GOOD POINT! I’LL SHIP IT TO CHINA AND SOME CHILDREN CAN MELT IT DOWN WITHOUT PPE

          • NovusBogus
          • 5 years ago

          Overseas shipping can be rough so be sure to wrap it with a bunch of those little plastic things that strangle ducks.

      • geekl33tgamer
      • 5 years ago

      I’m feeling a little burned by my R9 290X purchase too, dw… 🙁

        • ipach
        • 5 years ago

        word. i was complaining about my recent 770 purchase, but yeah, you have it worse. my apologies. i think i’ll go play some ps4 now.

        • Terra_Nocuus
        • 5 years ago

        Well you’re not supposed to [i<]touch[/i<] it

          • geekl33tgamer
          • 5 years ago

          lol 😀

      • NewfieBullet
      • 5 years ago

      If I already owned a 290X or 780, I wouldn't see a big enough increase in performance to make me want to upgrade to a 980. I like to get at least 50% more performance when I upgrade. That said, if I were buying a new card today, I would get the 970. The 290X and 290 would have to drop in price considerably before I would consider either.

      • ipach
      • 5 years ago

      oh the joys of pc building.. sigh.

      • MadManOriginal
      • 5 years ago

      I thought you got it for free or something?

        • sweatshopking
        • 5 years ago

        i got it for 250$, and made 400$ selling my old computer, so it was a net gain of 150$. I was being sarcastic. i’ll keep this baby till like 2017.

      • dragontamer5788
      • 5 years ago

      Don’t feel that bad… people are still buying [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16814127770&cm_re=GTX_780_Ti-_-14-127-770-_-Product<]NVidia GTX 780 TIs[/url<] for $600+.

      • UnfriendlyFire
      • 5 years ago

      You should use the TARDIS to go back a year and sell the 290X to a crypto-coin miner.

    • Bomber
    • 5 years ago

    My wife told me this morning, after she saw me reading this article last night, to simply buy a GTX 970. For that money, the performance makes it a no-brainer.

    I'm moving up from a 560 Ti 448 Core, so this is a pretty substantial jump for me. I've had every other nVidia product going back to the original Riva 128. I was actually looking at jumping on the 770 with the new price drop coming (which would have maintained my every-other-generation purchase pattern) until reading this. Thanks, TR!

      • MadManOriginal
      • 5 years ago

      You married well.

        • anotherengineer
        • 5 years ago

        Indeed, wonder if he wants to trade?

          • Bomber
          • 5 years ago

          Unless you're married to Scarlett Johansson, I think I'll keep her. She's pretty great. After the discussion on the GTX 970, she told me to order a 5820K, X99 board, and memory too. I didn't even ask!

            • Prestige Worldwide
            • 5 years ago

            WAT

            • sweatshopking
            • 5 years ago

            ugh. no way i’d go for SJ. she’s not my type. you can do better.

            • Redocbew
            • 5 years ago

            Now you’re just gloating…

        • Bomber
        • 5 years ago

        I did marry well but don’t tell her that 😉

      • Milo Burke
      • 5 years ago

      Is she into polygamy?

        • MathMan
        • 5 years ago

        You want to burden Bomber with an extra wife? I think he already has what he needs.

        My wife always encourages me to buy stuff like this, it makes her feel less guilty about her buying fancy dresses…

        (You were probably thinking about polyandry, which is just plain gross…)

          • MadManOriginal
          • 5 years ago

          mmmm….codependent consumerism….

            • MathMan
            • 5 years ago

            It’s not working: I gleefully ignore her pleadings. She still buys her stuff…

          • Milo Burke
          • 5 years ago

          Indeed, yes. And I wasn’t suggesting … relations … just permission, nay, the recommendation to order such a product. =]

          • Ninjitsu
          • 5 years ago

          Polygamy is ok but polyandry isn’t?

            • MathMan
            • 5 years ago

            (Uh oh, the imaginary-statement police have arrived.)

            Let me check. Did I say somewhere that polygamy was OK? Hmm… can't find anything.

            I did say that polyandry was gross, I'll give you that.

            But nowhere did I claim that polygamy was in any way morally more acceptable than polyandry.

            Now go find another place to judge people for things you imagine they said. Like a church or something.

            • Ninjitsu
            • 5 years ago

            Wasn’t judging, man. Just wanted to know what you were saying, not policing. It’s the internet, not my job to police.

            • MathMan
            • 5 years ago

            It’s all good then.

            • shidairyproduct
            • 5 years ago

            Polygamy is multiple mates (gender-neutral term); polygyny is multiple female mates and polyandry is multiple male mates. But polygamy is used interchangeably with polygamy.

            • SoberAddiction
            • 5 years ago

            Well, I guess It’s ok for polygamy to be used interchangeably with polygamy. 😉

      • anubis44
      • 5 years ago

      I’d at least wait until the Radeon 390 is released later this month.

        • dragontamer5788
        • 5 years ago

        Don’t hold your breath. Rumors I’m seeing are “early 2015” for the Radeon 390. NVidia is probably going to start releasing 16-nm cards by then.

        Expect price cuts while AMD tries to keep the R9 290X relevant as an upper mid-range card.

      • derFunkenstein
      • 5 years ago

      I kinda quit showing this stuff to my wife about 10 years ago because her eyes glaze over. She's only interested in whether she'll see an appreciable gain in a particular application that she uses, and it's almost always because her system is mostly built of my hand-me-downs (CPU and mobo excepted, at this time, because I didn't want to spend enough to get unlocked Haswell when she needed a new CPU/mobo; she got the 4430 and an H87 board, and I kept my Z77 + 3570K).

        • Bomber
        • 5 years ago

        My wife is a gamer. Semi-casual, but we have upgraded her PC more than once to play games first and foremost. Discounting my overclocking, hers is faster: she has a 2500K, 12GB of RAM, and a 6950 video card in need of an update (probably soon, when she sees the 970). I am a very lucky man. She might gloss over on the technical side of things, but she understands the need for bigger, better, faster!

          • derFunkenstein
          • 5 years ago

          My wife only knows the need for bigger/better/faster when it actually does something for the games she plays. Right now it’s Diablo 3 (predominantly) and The Sims 3. Her system is plenty fast for that. Money priorities have shifted over the years – staying super current used to be a big deal, and now it’s a bigger deal to do Girl Scouts and dance class and get a puppy and this and that and the other. lol

          edit: and your wife is in dire need of an upgrade:

          [url<]http://anandtech.com/bench/product/510?vs=660[/url<]

            • sweatshopking
            • 5 years ago

            I ONLY UPGRADE WHEN GAMES BECOME UNPLAYABLE. AN EXTRA 25% MEANS NOTHING TO ME. WHEN I FUNDAMENTALLY CAN’T GET GAMES TO PLAY, THEN I BUY. OTHERWISE, I’D BE BUYING ALL THE TIME, THEN I’D HATE MY PC LIKE SO MANY PEOPLE.

            • derFunkenstein
            • 5 years ago

            Yeah, the last few hardware upgrades were related to dead hardware. My PC’s GTX 460 died after about 3 years of use, so I got my 760. Her PC’s mobo from the Trinity launch died, so I replaced the APU with a discrete video card and Haswell setup. When I upgraded my CPU from a Sandy Bridge i3 to an Ivy Bridge i5 it was because I started working from home and I needed more hardware cores to run the VMs and stuff I use. I used to upgrade frequently, but I don’t want to anymore.

            • dragontamer5788
            • 5 years ago

            I’m only really looking at this generation because FreeSync / GSync intrigue me so much. I don’t really plan to make a purchase until FreeSync comes out however (I can wait till March or later next year).

            My NVidia 560 Ti still works, but it is definitely feeling old at this point.

            • Bomber
            • 5 years ago

            Shh, she doesn't know she needs an upgrade yet. She has an H60 cooler, I haven't overclocked her yet, and she's getting my 16GB Vengeance kit. She will probably want a video card when she sees my 970, like I said, though I should be able to put a $200 video card in and she'll be happy.

      • GatoRat
      • 5 years ago

      I have a 560Ti too, though with a noisy fan. My case barely handles the heat. If the 970 runs cooler, it’s a buy for me.

    • Krogoth
    • 5 years ago

    Just curious: I'm wondering if there are any image quality/rendering differences between Fermi, Kepler, and Maxwell.

    I'm aware that ever since the DX10 era, the output differences between GPUs have become homogenized, but it would be an interesting academic exercise.

    • Ninjitsu
    • 5 years ago

    [quote<] NVIDIA first introduced color compression on the GeForce FX series, where it could compress data at up to a 4:1 ratio. [/quote<] -AnandTech

      • Mikael33
      • 5 years ago

      So is color compression where you hack the shit out of games to squeak out more performance from a flawed architecture while making the game look like poop? Pretty sure Maxwell uses something much more advanced 😉

        • Airmantharp
        • 5 years ago

        It's texture compression, it's built into DirectX now, and it's analogous to using JPGs instead of BMPs. It doesn't have to be lossy, but even if it is, you're not likely to notice the difference.

        What Nvidia is using in Maxwell as mentioned in the article here at TR (haven’t read the one at Anandtech yet) relates to compressing graphical information that is stored in RAM in order to reduce bandwidth usage, which has a number of benefits.

          • Klimax
          • 5 years ago

          AnandTech has a very nice description of the entire thing:
          [url<]http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/3[/url<]
          (subchapter “Color Compression”), including how Nvidia has improved it over time.

            • Airmantharp
            • 5 years ago

            Yeah, I got it wrong; I was talking about what you mention below. Apparently they actually implemented memory-bandwidth-saving technology a rather long time ago too :).

        • Freon
        • 5 years ago

        Frame buffer compression on NV and AMD is supposed to be lossless.

        We’ve had lossy texture compression for a very long time.

          • Klimax
          • 5 years ago

          S3TC (that long…)

    • dextrous
    • 5 years ago

    On the Thief page:
    “The generational increase from the GK104-based GTX 770 to the GM204-based GTX 980 is enormous. The average frame rate doubles at 2560×1440.”

    GTX 770 – 49 FPS
    GTX 980 – 79 FPS
    49 x 2 != 79

    I’m totally confused by the “…average frame rate doubles…” part of that statement.
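
    (For reference, 79 / 49 ≈ 1.6, so the actual jump at 2560×1440 is about 60%.)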

      • aggies11
      • 5 years ago

      Scott was probably looking at the 760 right below it (39 FPS). Remember, he's been up for 5 days and has been drinking anti-freeze (it's gotten cold this last week, but not THAT cold!… 😉 ), so a few discrepancies are to be expected.

      Great work though, Scott, and I especially appreciate the extra bit of punchy humour found in these marathon reviews. Definitely worth reading the entire thing.

      • Klimax
      • 5 years ago

      But it's closer to a 60% increase. Sadly, not an uncommon error.

      • Damage
      • 5 years ago

      Yeah, I was looking at the 760 number. Doh.

      Anyhow, fixed now.

        • dextrous
        • 5 years ago

        Ahh, makes sense. Easy to do on such little sleep! Thanks for the good review!

    • anotherengineer
    • 5 years ago

    Do these cards support DP 1.2a or just 1.2?

      • Aerugo
      • 5 years ago

    • TwoEars
    • 5 years ago

    SLI Tests:

    [url<]http://uk.hardware.info/[/url<]

      • Krogoth
      • 5 years ago

      Triple-SLI and Quad-SLI are still pointless outside of synthetics. The games are all CPU-bound unless you are driving 4K with extreme levels of AA/AF.

        • Airmantharp
        • 5 years ago

        Triple and quad GPU setups still have a place in very high-end configurations; attempting to run three of the new ASUS 1440p 144Hz G-Sync monitors is one example I can think of, but many more exist, and even more are coming. We're not going to run out of multi-GPU use cases anytime soon.

          • Krogoth
          • 5 years ago

          You realize that triple- and quad-SLI only yield a 10-20% improvement over dual-SLI in most cases? Not worth the extra cost or hassle even for the ultra high-end crowd, unless you have far more budget than sense.

            • Airmantharp
            • 5 years ago

            If one has the budget and wishes to apply it toward an ultra high-end gaming experience, why wouldn’t they try? A potential gain of 20% could easily be the difference between a playable experience and a frustrating experience.

            I realize that such individuals aren’t frequent posters on TR, but they absolutely do exist and are catered to.

            • Krogoth
            • 5 years ago

            It is a vanishingly small niche, which is becoming smaller and smaller as the headaches and costs associated with such setups make people jump from SLI/CF right back to single-card land once a single-card solution can handle that level of performance.

            • Airmantharp
            • 5 years ago

            I don’t think it’s really growing or shrinking; for a moment, it looked like it was no longer necessary, but the drive to 4k and then to 8k will keep demand up for the near future.

      • 3SR3010R
      • 5 years ago

      Also many Multi-GPU tests on PCper.com

      Battlefield 4 - Multi-GPU: [url<]http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-980-and-GTX-970-GM204-Review-Power-and-Efficiency/Battle-0[/url<]
      Bioshock Infinite - Multi-GPU: [url<]http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-980-and-GTX-970-GM204-Review-Power-and-Efficiency/Biosho-0[/url<]
      Crysis 3 - Multi-GPU: [url<]http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-980-and-GTX-970-GM204-Review-Power-and-Efficiency/Crysis-0[/url<]
      GRID 2 - Multi-GPU: [url<]http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-980-and-GTX-970-GM204-Review-Power-and-Efficiency/GRID-2-M[/url<]
      Metro: Last Light - Multi-GPU: [url<]http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-980-and-GTX-970-GM204-Review-Power-and-Efficiency/Metro--0[/url<]
      Skyrim - Multi-GPU: [url<]http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-980-and-GTX-970-GM204-Review-Power-and-Efficiency/Skyrim-M[/url<]

        • Mikael33
        • 5 years ago

        So how long till a dual GPU card comes out? Perfect candidate for it!

        • Airmantharp
        • 5 years ago

        So two of these cards can almost play BF4 at 4k? Good news :).

    • anotherengineer
    • 5 years ago

    Hmmm, the Asus GTX 970's power draw at load isn't too far off from the Radeon 285's. From all the hype, I was expecting 50W+ less under load.

    That said, make me a GTX 970 with 3 DisplayPorts that will do 3D Surround flawlessly, price it at $275, and you've got yourself a deal!

    Edited as per Krogoth’s comment.

      • superjawes
      • 5 years ago

      An Nvidia card running AMD’s Eyefinity?!? That [i<]WOULD[/i<] be impressive!

        • anotherengineer
        • 5 years ago

        Well they would have to make up their own name, perhaps Jen-HsunVision?

          • Krogoth
          • 5 years ago

          Nvidia already has their version of Eyefinity. It is called 3D Surround.

    • 2x4
    • 5 years ago

    Any word on when the GTX 980 will hit the stores, Newegg in particular?

      • Krogoth
      • 5 years ago

      The GTX 980 already hit most e-tailers when the NDA was lifted. The only catch is that the 980/970 are probably sold out at this point, and the next batch is going to have a price hike in response to demand.

    • kamikaziechameleon
    • 5 years ago

    This is great news. To be honest, prices for GPUs have stagnated for some time. It's nice to finally see them “reset” by undercutting their own offerings with a new generation. The only reason I know of that would compel Nvidia to do that would be better margins.

    • Deanjo
    • 5 years ago

    [quote<]The middle Maxwell: GM204[/quote<] Lol, so we are finally admitting that the Nvidia Gx-x04 GPUs are the midrange offerings. My how things change.... [url<]https://techreport.com/discussion/22653/nvidia-geforce-gtx-680-graphics-processor-reviewed?post=623366#623366[/url<]

      • Ninjitsu
      • 5 years ago

      I’m surprised it wasn’t well known, already. I guess people can’t separate marketing and engineering at times.

      You may want to remove the #623366 at the end of the link, though.

      • Milo Burke
      • 5 years ago

      You're still sore about getting downvoted two and a half years ago?!

        • Ninjitsu
        • 5 years ago

        Well, he [i<]was[/i<] right...

          • derFunkenstein
          • 5 years ago

          It was widely speculated at the time, but nVidia didn’t need the full power of Kepler to trounce AMD.

          • Milo Burke
          • 5 years ago

          I think it’s all about point of view.

          I personally consider a $300 GPU high end because I don’t know anyone who has one, and I don’t know anyone with a monitor high enough resolution to require one. And I can’t afford one anytime soon.

          By my definition, $300 is high end, $600 is very high end, and I start to run out of grandiose adjectives to describe the superlative nature of the exceedingly high end that I consider the $1000-2000 GPUs to belong to.

          That said, I wouldn’t downvote someone for saying $500 is midrange. To each, their own. =]

          Edit: If you convince my boss to double my pay, I’ll agree that the GTX 980 is midrange. =D

            • Ninjitsu
            • 5 years ago

            Deanjo’s point (and my point too) is that, from an engineering or technical pov, it’s a x04 part, which is a mid range part in Nvidia’s stack.

            Now Nvidia can price it however they like, but it’s not going to be their most powerful consumer offering in the end.

            This is basically the slot the GTX 560 used to occupy, if you were on Nvidia's engineering team. I remember the GTX 680 had a die size comparable to a GTX 560 Ti's (360 mm²). The GTX 580 is still bigger than the 980, at 520 mm².

            So this is "mid range" performance from Nvidia. If AMD were actually competitive in terms of transistors, die area [i<]and[/i<] performance, you'd probably find GM204 in a GTX 960 Ti. That said, I don't think Fermi and Kepler+ GPUs can be compared directly; Nvidia sacrificed compute for gaming performance (that doesn't appear to be the case with Maxwell, though, so maybe it can be compared).

            It's sort of like how a Core i7-4790K is a "high end" part as far as most of us are concerned, but for Intel, it's just the fastest "mainstream" consumer part in their product tiers, with the "true" high-end flagship being the 5960X.

            Nvidia is probably making a huge profit on both the 970 and the 980; larger dies have been sold for less in the past, and this is a mature, high-volume 28nm process.

            • Milo Burke
            • 5 years ago

            No argument from me on any point. =]

            • Deanjo
            • 5 years ago

            I get what you are saying; however, determining what "class" a card is by price is really a poor indicator, as it all depends on the POV of the person looking at it. A person who has, for example, only used crappy IGPs throughout their life can see even the lowest of dedicated cards as "high end."

      • derFunkenstein
      • 5 years ago

      SITYS gets you -3 now, too. 😉

      • laggygamer
      • 5 years ago

      I went back and upvoted you bro.

      No, but in all seriousness, he is just highlighting how flavor-of-the-month people's opinions can be. Just because Nvidia decided to release its midrange GPUs significantly sooner than the high end of the exact same architecture does not make the midrange suddenly high end. It's not a matter of point of view, nor is it in any way debatable.

      It's like taking silver, pricing it as platinum, and then selling it as platinum. A bunch of people who don't have the knowledge to tell what it is will buy it and adamantly claim that it's platinum, despite what you try to explain to them.

      Nowadays people know better, but there is still a lingering population that will claim that the GTX 980 is a high-end gaming card.

        • derFunkenstein
        • 5 years ago

        They didn’t position any single GPU solution as a higher-end solution than GTX 680 in the 600 family. It wasn’t until Titan, and later the 780 and 780Ti, came out that you saw full-fat GPUs used for anything other than Tesla. The previous high-end was GTX 580. Therefore, nVidia positioned the GPU as their high-end offering in 2012. I don’t see any problem with that, and everyone knew at the time that the chip used in Tesla cards would wind up in a higher-than-high-end card eventually.

          • jihadjoe
          • 5 years ago

          We're probably going to have to get used to seeing the big chip only once every generation now. GTX 1080, anyone?

      • Airmantharp
      • 5 years ago

      You’re missing something: the Gx-x04 GPUs are mid-range offerings, but the GTX680 was positioned as a high-end product.

        • laggygamer
        • 5 years ago

        Positioned as a high-end product how? By name or price? Both are irrelevant. As I explained above, what you call something, and how you price it, doesn't determine what it is.

        The Kepler GPUs that comprise most of the 6xx and 7xx "families" are one architecture. Within that architecture, the GTX 680 is a mid-range product, period. It performs in the middle. Being released early doesn't change that. People act like they have no knowledge of Nvidia's publicly available roadmap.

          • laggygamer
          • 5 years ago

          For the record, I have no problem with what Nvidia is doing; I'm just letting you know what Nvidia is doing. To call the GTX 980/680 a high-end card would be a misrepresentation of it.

          • Airmantharp
          • 5 years ago

          Positioned by way of name, price, and performance: basically every way the product could be evaluated in comparison with existing products in the competition's lineup and Nvidia's own. Same as how the GTX 980 is being positioned today.

          So Nvidia has produced GM204 as a mid-range part, but has released it as a high-end product by naming it as such and pricing it alongside products with competitive performance, such as the R9 290/X and GTX 780/Ti.

            • laggygamer
            • 5 years ago

            It's a new architecture. Nvidia is taking advantage of the fact that the difference between high end and midrange within one architecture is about the same as the jump between the same class of cards in a new architecture and the one before it. They tend to underclock the true high-end cards to pull this off, so that the new midrange parts can win when clocked accordingly.

            We're basically saying the same thing, though: released as a high-end product, but it really isn't one.

            You are somewhat undervaluing how arbitrary name and price are, though. $550 will be the standard midrange someday when you factor in inflation, etc. All of those cards won't automatically be high-end products. Furthermore, if they release the 980 for $200 or $1000, it's still a midrange product. If they released the entire Maxwell architecture at once, the GTX 980 would still be midrange.

            What AMD is doing has no influence over what type of product the 980 is; they've just managed to make it look more kickass than many of us would like. It's not a high-end product, though. Many high-end GPU buyers will pass on this offering in favor of waiting for true high-end cards, while other high-end buyers will intend to have the strongest card at any given moment, whether it be a high-end part or not.

            • Airmantharp
            • 5 years ago

            I agree that we're saying the same basic thing. The only remaining point of contention I see is that you're claiming the 980 GTX isn't a high-end product because it contains a mid-range GPU, while my position is that while the GPU remains a mid-range part, the 980 GTX is definitely Nvidia's high-end offering, as supported by its performance relative to current competing offerings and its branding within Nvidia's lineup.

            • swaaye
            • 5 years ago

            The Maxwell lineup looks like it will end up like this:
            GM200 -> GM204 -> GM206 -> GM107 -> GM108

            I’m not sure what these companies call all of their little market segments these days, but GM204 is certainly on the high-end side of the spectrum.

            I wonder how much of a gain GM200 will have considering it will surely be loaded with HPC hardware that does little for gaming. GM204 is already at about 400 mm^2.

            • Airmantharp
            • 5 years ago

            It'd be fair to call a full GM204 implementation "upper mid-range" once the GM200-based product is released, but one could also expect a "GTX 960" in the future based on cut-down or harvested GM204 dies. Anything lower in the spectrum is likely to sport a much smaller die, etc.

        • Krogoth
        • 5 years ago

        The 680 and 980 are both modern incarnations of the 8800 GT: mid-range GPU designs pitched as the second-fastest tier.

        The confusion comes from the fact that Nvidia is no longer releasing its high-end architecture as a consumer-tier product first. They got burned too many times by yield issues. They now release their mid-range design to the consumer market first, since it yields most of the performance of the big design while being far more feasible to produce in sufficient quantities. They saw this first-hand with GF104 in the 460, which outsold and out-produced the GF100-based units by a good margin. Nvidia decided to release GK104 before GK110, and that stratagem proved highly successful. They are continuing it with GM204, while GM210 comes later (because the 20nm process is still being sorted out at TSMC).

        The big designs are going to start being marketed as Titan or Ti editions of the same product line.

          • Airmantharp
          • 5 years ago

          But what you're basically saying is that their competition isn't strong enough to force them to lead with their high-end architecture, such that they can get away with releasing next-gen mid-range parts as high-end products, thus allowing them to ease into new architectures and processes :).

          • laggygamer
          • 5 years ago

          You think GM210 will be 20nm? Or am I just thinking you are meaning GM200? I ask because I wasn’t expecting any sort of shrink from 28nm until Pascal.

            • Airmantharp
            • 5 years ago

            Going to 20nm really only makes sense because fabbing GM210 at 28nm would mean that it’d have to be a larger part than GK110, which is likely beyond the point of feasibility given how die sizes and yields are intertwined.

            And Nvidia is likely chomping at the bit to jump to 20nm anyway, as Maxwell was originally designed for 20nm and tweaking it for 28nm probably increases their R&D costs.

            • Ninjitsu
            • 5 years ago

            Nvidia seems to be switching to a tick-tock like model, though that could just be a one-off because of delays.

            BTW, GM210 and 200 were supposed to be two speculated names for the same chip, I think.

      • MadManOriginal
      • 5 years ago

      Performance is what matters, not the codename of the chip.

        • Ninjitsu
        • 5 years ago

        From a consumer/marketing perspective, maybe. Not from an engineering perspective.

        Anyway, performance relative to [i<]what[/i<]? To Nvidia's own chips within the same generation, of course.

        The fact that a 750 Ti can match the 480 in some benchmarks doesn't make the 750 Ti a high-end part, nor does it make the 480 a lower-midrange or upper-entry-level part. However, it does suggest that today's mid range delivers what used to be yesterday's high end, and that's exactly what we're saying.

      • swaaye
      • 5 years ago

      The die size isn't exactly what I would call mid-range. It's not far off of Hawaii's. I'm surprised by the pricing of the 970, though.

        • Ninjitsu
        • 5 years ago

        Why compare to AMD's "high end"? It's closer to the 330-370 mm² of Nvidia's usual mid range than to the 500 mm²+ of its high-end parts.

        • chuckula
        • 5 years ago

        Uh… 398 mm^2 on an extremely mature (some would say geriatric) process is definitely not in the big-boy die size league compared to other parts we’ve seen from both AMD and Nvidia.

          • swaaye
          • 5 years ago

          What’s wrong with 416 mm^2?

          I don’t know if the process is all super now or not, but this is a bigger chip. It’s not wee tiny and $500 like GK104 was. Of course little GK104 was plenty competitive with big Tahiti (esp pre-GHz edition) so they could run with $500 and laugh with glee. I suppose it’s about the same deal with mid-upper GM204 beating up big Hawaii. Except NV feels they are good for $550 instead of $500 this time.

      • Freon
      • 5 years ago

      $330 and $550, the new “midrange”?

        • ultima_trev
        • 5 years ago

        Adjusting for inflation, apparently so.

          • shidairyproduct
          • 5 years ago

          Wow, some inflation that is. This is a little steep, I’d say. I assumed the variance was a lot lower (maybe around the 250-400 range).

        • NovusBogus
        • 5 years ago

        I’d say that $300-350 is the upper end of midrange, and it’ll trickle down into the $150 and $250 brackets in short order. The GTX 980 reeks of being a late-game product-management solution to the unexpected 20nm supply problem.

          • swaaye
          • 5 years ago

          I don’t think it’ll drop in price like that unless it’s old stock being liquidated down the road after its replacement has arrived.

          • Freon
          • 5 years ago

          $330 as upper end of midrange is not completely unreasonable.

          I think trickle down is purely up to AMD putting on competitive pressure. NV has to be making a killing now with the narrower bus, smaller size, and smaller power regulation needs, though.

        • Deanjo
        • 5 years ago

          Considering I used to piss that amount away in the pubs on a Friday night, while still considered below the poverty line years ago, I would consider that entry level.

          • sweatshopking
          • 5 years ago

          it’s not entry level. your poor life decisions don’t decide what’s a realistic price.

            • Deanjo
            • 5 years ago

            My poor life decisions? I have zero complaints or regrets about my life and I still can easily afford the hardware.

            Seems to me if you can’t afford $330-$500 for some hardware then maybe it is you that should start taking a look at your life decisions as it is obviously keeping you below the poverty line.

            • sweatshopking
            • 5 years ago

            I know you’re rich. You love to talk about it.
            You don’t get how life is for the rest of us. Failing to connect to humanity in general is a common malady inflicting people with money. Work hard, you might be able to overcome it.

            • Krogoth
            • 5 years ago

            It is called priorities my friend.

            Not everyone can justify spending that much on a gaming GPU even if they can easily afford it.

            Likewise, you may feel that spending thousands on hobby X might be a tad excessive, while other people are more than willing to go for it.

          • Krogoth
          • 5 years ago

          $300 is considered to be entry-level for a gaming-tier GPU?

          What are you smoking bro?

      • Convert
      • 5 years ago

      I logged in to down vote you, complain about it in a couple more years.

    • Rakhmaninov3
    • 5 years ago

    I want one of these with GSync. May get to do it next year……

    • derFunkenstein
    • 5 years ago

    Holy cats. Tiger Direct has samples of both cards up for sale and they’re not egregiously marked-up or anything like that.

    GTX 970: [url<]http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9191503&CatId=7387[/url<]
    GTX 980: [url<]http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9191502&CatId=7387[/url<]
    [url<]http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9188863&CatId=7387[/url<]

    Yay, no paper launch. And as for Tonga, they've got some Radeon R9 285s as well:
    [url<]http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9179543&CatId=7387[/url<]

      • Prestige Worldwide
      • 5 years ago

      Nice price on TD US.

      Prices in Canada are kind of high, 970 starting at 379.99 and 980 at $619.99 on NCIX.

      But they’re not really listed on any other Canadian site yet.

        • derFunkenstein
        • 5 years ago

        Given the current exchange rate of 0.92 USD per 1.00 CAD, that's not so bad. $379.99 comes out to about $349.59 US.

          • anotherengineer
          • 5 years ago

          Then add $10 for shipping and 13% sales tax (if you are in Ontario):

          ($380 + $10) x 1.13 = $440.70 for the GTX 970

          Edit: for $316.40 (tax & shipping incl.), $124.30 less than the GTX 970, I can get an R9 285 plus 3 games:

          [url<]http://www.ncix.com/detail/xfx-radeon-r9-285-black-49-101599-1343.htm[/url<]

            • derFunkenstein
            • 5 years ago

            You have to pay for shipping in the US, too, so I don’t feel too bad about that. And in Illinois I’m supposed to pay state sales tax of 6.5%, but etailers don’t charge it and the state just charges me $400 outright at filing time unless I itemize all my sales tax payments online (or something along those lines).

            Just think, at Best Buy the GTX 770 will be $400 + sales tax so I’ll pay the same as you.

            • anotherengineer
            • 5 years ago

            Well, that makes me feel a lot better. Thanks.

            Still, I'm probably going to stick with my HD 6850, which I got new for $135 with a free game (Deus Ex: Human Revolution, IIRC).

            I will upgrade, though, once there is a card that doubles the performance of my HD 6850 for the same price as I paid for it 🙂 Unless it dies before then!!

            • derFunkenstein
            • 5 years ago

            You’re probably not too far off. If they do clearance sales on GTX 660s, getting them down in your price range, that’d be a good buy. Radeon 280X down that low would be close too. Depending on the game, a 280 is really close:

            [url<]http://anandtech.com/bench/product/539?vs=549[/url<]

            Yeah, it's an old benchmark, but the 6850 didn't make the GPU13 cut.

    • 3SR3010R
    • 5 years ago

    I watched the live Twitch stream of the reveal last night, and the moon landing proof was astounding.

    And without a doubt disproved three of the conspiracy theories.

    Proved why Buzz was brightly lit even though he was in the shadow of the L.E.M.

    Proved the bright light near the ladder bottom as Buzz was coming down the ladder from the video camera mounted on the L.E.M. was actually the sun reflecting off of Armstrong as he was taking Buzz’s picture coming down the ladder.

    Proved that stars were not visible because the camera’s exposure had to be turned way down because of the extreme brightness of the moon surface and the L.E.M.

    I really hope that Nvidia releases that segment (Moon landing did happen) for all to see.

    EDIT: It can be seen here: [url<]http://www.twitch.tv/twingalaxieslive/b/569845701?t=7h39m30s[/url<] Starting at 8:31:00 through 8:54:30

      • chuckula
      • 5 years ago

      I expect ridiculous hype at product launches, but if Nvidia wants to take the time to prove those conspiracy idiots wrong* while promoting their new card, I’ll give them props.

      * Not that it hasn’t been done a thousand times before as well.

    • Ninjitsu
    • 5 years ago

    Wow, awesome cards! I just might be tempted to get the 970 in a few…months. :/

    BTW, Scott: When you guys compare the various 970 models, could you please try and include 1080p results? Most of us are on 1080p, after all.

      • superjawes
      • 5 years ago

      I predict that all 970s will be overkill for 1080p, at least for some time. The good news is that if you are waiting a few months, you might be able to catch a new monitor deal and get something with more pixels and/or adaptive-sync tech.

      I’ve been eyeing a new monitor for a while now. A GTX 970 + 1440p IPS monitor looks delicious…

        • Ninjitsu
        • 5 years ago

        Nah, I just bought a new 1080p monitor this year. I’ll probably not replace it till Adaptive Sync becomes prevalent; even then, it’ll just be cheaper to get a new GPU and maintain 60 fps.

        • Prestige Worldwide
        • 5 years ago

        120/144 Hz says hi.

        • My Johnson
        • 5 years ago

        It’s not overkill for a 1080p monitor. For example, it can hold a solid 60 Hz with very good detail enabled, where at 1440p it may only manage half to 75% of that.

          • Ninjitsu
          • 5 years ago

          Exactly what I want to see before I upgrade my GTX 560.

        • Ninjitsu
        • 5 years ago

        I checked AT’s review; the 980 can’t do 60 fps minimums in at least Company of Heroes 2, though they haven’t provided minimum fps for all games. According to Tom’s Hardware, it falls short in Assassin’s Creed IV: Black Flag and Watch_Dogs.

        The 970 falls short in: BF4, Thief, Black Flag.

        All results are close enough to 60 that with a few driver enhancements, the 970 should do 60 fps in almost all games.

      • Meadows
      • 5 years ago

      Yeah, TR might want to take their poll into consideration when designing future reviews.

    • green
    • 5 years ago

    page 4, too much coffee:
    [quote<] To make that happen, I've resorting to an old-school TR crutch[/quote<]
    and while the sample images are from nvidia, i'm hoping dsr can deliver on it's promise

      • Sunking
      • 5 years ago

      The always-classic post that flags a typo using incorrect grammar.

    • Rza79
    • 5 years ago

    GPU Computing:
    [url<]http://www.computerbase.de/2014-09/geforce-gtx-980-970-test-sli-nvidia/9/[/url<]

    SLI testing:
    [url<]http://www.computerbase.de/2014-09/geforce-gtx-980-970-test-sli-nvidia/15/[/url<]

    Seems a pair of GTX 970s could be a sweet deal for 4K gaming.

    • Arclight
    • 5 years ago

    Cool stuff and interesting pricing. My take is that they won’t be worth it if you can wait, but for people looking to buy a gaming card as soon as possible, there is no reason to search elsewhere.

    • Pez
    • 5 years ago

    £260 from Overclockers for the 970 looks an absolute bargain after this review!

    Thanks for the review Scott!!

      • JustAnEngineer
      • 5 years ago

      Note that the £260 GeForce GTX 970 graphics card available at Overclockers runs at 1050 MHz, while the Asus Strix GTX 970 OC Edition that was tested in this article is factory overclocked to 1114 MHz.
      [url<]http://www.overclockers.co.uk/showproduct.php?prodid=GX-001-GX&groupid=701&catid=1914&subcat=1010[/url<]

      The Asus card will be £300 or more when it is in stock.
      [url<]http://www.overclockers.co.uk/productlist.php?prodids=GX-352-AS,GX-353-AS,GX-354-AS[/url<]

        • Pez
        • 5 years ago

        It runs at 1088 MHz:

        [url<]http://www.overclockers.co.uk/showproduct.php?prodid=GX-039-IN&groupid=701&catid=1914&subcat=1010[/url<]

        And it’ll be overclocked further regardless 🙂

    • chuckula
    • 5 years ago

    Interestingly… go over to Anand’s review and you’ll see the GTX-980 gets pretty high marks in compute… with the very big exception of double precision, where it is way down the list.

    It looks like one of Nvidia’s tricks to keep the die size small was to remove most of the double-precision execution hardware, in a similar manner to what AMD did with the R9-290X.

    In short, the R9 and the new mid-range Maxwell parts are great at games, but serious compute users will still be looking elsewhere.

    [Typical fanboyism: I make one of the most critical comments about the new Nvidia GPUs in this entire thread and AMD fanboys downvote me instead of other posters…]

      • peaceandflowers
      • 5 years ago

      Where would they look though? Titan doesn’t fare much better (if at all), being based on the Kepler architecture. Would you just use CPUs instead?

        • chuckula
        • 5 years ago

        You would stick with the high-end Keplers for GPU compute until the Tesla versions of Maxwell (the big chips) show up in 2015.

      • Ninjitsu
      • 5 years ago

      Wait, what? The 290X almost ties the 780 Ti in DP compute…

      GM204 has deliberately gimped DP performance just like GK104, and seems to effectively perform a lot like GF114.

      Hawaii seems to be better at “Optical Flow”; I wonder what’s happening there? Both GM204 and GK110 are tied…

        • Klimax
        • 5 years ago

        IIRC most/all of the tests are OpenCL. There might still be insufficient optimization effort there (like Intel’s ICD). That is, we see “raw” HW ability with weaker driver support.

    • Chrispy_
    • 5 years ago

    I’m minutes away from clicking buy on a £215 Zotac 970.

    I’ve hit a thermal/noise/size threshold in my HTPC and there’s been nothing more powerful than a 7870/660Ti that will fit in there in the last couple of years.

    To have 290X/GTX 780 performance in a card that draws even less power than my existing card is something of a shock, and I very much doubt that “full” Tonga will even come close.

    The only thing causing me to hesitate is the lack of games worth buying a graphics card for right now. It’s not as if the existing card is really being pushed at 1080p….

      • Pez
      • 5 years ago

      Where’s that, or are you quoting the ex. VAT price?

        • Chrispy_
        • 5 years ago

        ex VAT
        I rarely buy stuff outside the company account. All my computers are going to get used for work at some point 😉
        But 95% of the readership on TR is US anyway. VAT is just kerfusing.

    • chuckula
    • 5 years ago

    And now we know why AMD was so nervous the other day….

    Anyway, while the performance is solid, the biggest shocker is that Nvidia isn’t charging inflated prices. They’ve had the highest-performance chips in the past, but this is the first time I can remember where they have the highest-performance chips and you don’t feel like you are paying through the nose to get the performance.

    Excellent review as always TR.

      • 3SR3010R
      • 5 years ago

      How about when Nvidia released the GTX 680 for $499?

      On that day it was AMD who had to drastically reduce the price of 7970 (which was priced through the nose).

      Seems like history is repeating.

        • chuckula
        • 5 years ago

        I agree that there are some parallels with the GTX-680 but there are also important differences:

        1. The GTX-680 and the 7970 were both in roughly the same “class” of GPU. That is to say, similar die sizes, similar memory, etc. etc. The GTX-680 was not an uber-chip ala the GK110 used in Titan & friends.

        2. The 7970 had a reign of a few months as an uncontested performance leader while Nvidia worked on getting Kepler ready. It makes sense that Nvidia would be aggressive with the 680 launch prices since it was playing catchup in some ways.

        In this case, we are seeing a situation where Nvidia already had performance leading (albeit pricy) parts out on the market and is actually beating its own parts with lower-priced items. It’s also generally beating a higher-end type part from AMD (R9-290x) with its new mid-range parts, but not charging the commensurate high prices we have seen in the past for GTX-780Ti and other similar parts.

    • superjawes
    • 5 years ago

    Ouch…the 285[s<]X[/s<] enjoyed about two weeks of being a good deal before the 970 came in and ruined its day. The price drop on the 760 makes that card an excellent deal as well. I hope AMD have something in the works, or it's going to be a long winter.

      • willmore
      • 5 years ago

      What 285X? You mean the 285?

        • superjawes
        • 5 years ago

        Yeah, that one. This was a pre-coffee post.

          • willmore
          • 5 years ago

          Up the dosage, STAT!

      • geekl33tgamer
      • 5 years ago

      No no, it will be a warm one with my R9 290X.

        • superjawes
        • 5 years ago

        True! They could just borrow one of those crypto-mining rigs to heat the building.

          • geekl33tgamer
          • 5 years ago

          I actually have 3 of them, for games. I’m not scared of 100C anymore – almost seems normal now. :-/

      • dragontamer5788
      • 5 years ago

        [quote<] I hope AMD have something in the works, or it's going to be a long winter.[/quote<]

        It’s in the article:

        [quote<]Cut prices, of course, and a well-placed industry source has informed us that a price reduction on the R9 290X is indeed imminent, likely early next week. We expect the R9 290X to drop to $399 and to receive a freshened-up Never Settle game bundle, too. [/quote<]

        $400 for the R9 290X.

        • superjawes
        • 5 years ago

        I didn’t miss that. It’s just speculation, though, and not an official AMD action.

        More importantly, Maxwell does things faster than AMD’s best chip, all while using LESS power. If you recall, one of the problems facing the Hawaii chips was running into thermal limits (which cut performance). Maxwell doesn’t seem to have that problem at all.

        I’m guessing that a more powerful 980 Ti or 990 card is already in the works using the same chip, allowing Nvidia enough room to stretch ahead more if they need to.

          • geekl33tgamer
          • 5 years ago

          [quote<]the Hawaii chips was running into thermal limits[/quote<]

          This. The magic number is 95°C, and I can vouch from experience that it aggressively clocks down to barely 250-300 MHz on the core within seconds of hitting it when you use CrossFire. Sigh.

          I’m looking at water cooling, or moving to the green side’s new toys. 🙂

          • dragontamer5788
          • 5 years ago

          [quote<]I didn't miss that. It's just speculation, though, and not an official AMD action.[/quote<]

          It’d be stupid if AMD didn’t react to this. It’s clear that the R9 290X is nowhere near worth $550 anymore. I agree that the number quoted in the article is most definitely unconfirmed, however.

          [quote<] More importantly, Maxwell does things faster than AMD's best chip, all while using LESS power. If you recall, one of the problems facing the Hawaii chips was running into thermal limits (which cut performance). Maxwell doesn't seem to have that problem at all.[/quote<]

          This was true of the GTX 780 Ti as well. The [b<]real[/b<] benefit is that Nvidia is now able to drop the price to $550.

          [quote<] I'm guessing that a more powerful 980 Ti or 990 card is already in the works using the same chip, allowing Nvidia enough room to stretch ahead more if they need to.[/quote<]

          Isn’t Nvidia’s schedule for 20nm or 16nm chips? In any case, that’s months away.

            • superjawes
            • 5 years ago

            Yeah, looking again, I expect maybe a new Titan based on the GM210 (the Maxwell equivalent of Kepler’s GK110). I don’t expect that to happen for at least six months, but I do expect it to happen.

            And still, Nvidia could easily push the GM210 if they wanted to. My point was that Nvidia is (still) in a safe position with some clear ways to compete. AMD can cut prices, but that also means lower margins for a company that’s been short on cash for years. Ideally they would have a surprise GPU to debut before the holidays, but that doesn’t seem to be the case.

            • Arclight
            • 5 years ago

            If I were to speculate, I’d say that Nvidia will hold back the big chips until some time after AMD comes out with new chips, which I think will be considerably faster than the GTX 980, considering the GTX 980 is only slightly faster than last year’s cards and is, despite the price, a mid-range card.

            • superjawes
            • 5 years ago

            Probably. Nvidia can, after all, wait to release chips whenever they want to.

            That’s why I suspect a GM210 will arrive in a Titan with a $1,000 price tag. Something that will show off the raw power of a fully capable Maxwell GPU without directly competing with your 290X and 980 cards. It’ll generate buzz, but Nvidia won’t introduce it into their proper GPU line until AMD can catch up with the GM204.

            • dragontamer5788
            • 5 years ago

            What’s the point of showing off cards that I can’t afford? The cards don’t really exist unless they create pricing pressure towards cards that I do care about.

            For example, the GTX 980 and 970 are going to create pressure even in the $300 and below range. So those are cards I care to hear about, even if I can’t afford them. A $1000+ price card however is just so far out there that it doesn’t concern me at all.

            Titan-class cards are so far removed that they don’t even affect the pricing of high-end $500+ cards.

    • Deanjo
    • 5 years ago

    Perfect cards for steam machines.

      • sweatshopking
      • 5 years ago

      yay! then your steam machine will only cost about 75% more than a console!

        • Deanjo
        • 5 years ago

        And still about 900% more useful than a console.

          • sweatshopking
          • 5 years ago

          A Steambox?!?!?! Did they suddenly get a new operating system??!! A Windows-powered Steam machine, but not SteamOS. SteamOS is still a sub-par OS. You’ll notice it’s almost entirely forgotten.

            • Deanjo
            • 5 years ago

            Lol, SteamOS is nearly infinitely more flexible and capable than the OS that any console offers.

            Of course you can switch to another if you wish, can you do that on your console?

            • JustAnEngineer
            • 5 years ago

            You used to be able to on your PlayStation 3.
            [url<]http://en.wikipedia.org/wiki/Yellow_Dog_Linux[/url<]

            • Deanjo
            • 5 years ago

            Yes, you [b<]used to[/b<] be able to install Linux on the PS3. Items like 3D acceleration, however, were not available, limiting its usefulness to cluster computing (which was quickly rendered obsolete by PC GPUs).

            • moose17145
            • 5 years ago

            RENDERED obsolete with GPUs! HAHa I get it! 😛

            • anotherengineer
            • 5 years ago

            My steambox

            [url<]http://www.mustknowhow.com/wp-content/uploads/2010/03/OD8x7-interior-300.jpg[/url<]

            Hey Neely what you doing Saturday night?? Wanna smell the cedar with me and SSK?? 😉

        • Liron
        • 5 years ago

        Or it will cost about 35% of the total cost of the console plus (the same) games over its lifetime.

          • derFunkenstein
          • 5 years ago

          Only because so many people buy the high-budget games at 25% of their cost when they’re a year old. At that rate, I kinda have to wonder when those games will shift from shipping a year late to no longer shipping at all on PC.

          • snook
          • 5 years ago

          My GPU alone cost $150 more than the PS4. 75% is closer to the truth than you want to admit. Mine is a “mid-range card.” lol

          typed on a PS4. gave my PC away!

        • anotherengineer
        • 5 years ago

        Not really. If these things do crypto-currency mining 2x better than the Radeons, they could go way up in price; then they will cost 200% more than a console 🙂

        [url<]http://cryptomining-blog.com/3503-crypto-mining-performance-of-the-new-nvidia-geforce-gtx-980/[/url<]

        We will know soon enough!!

          • MadManOriginal
          • 5 years ago

          Good thing the GPU miner rage crashed because of both coin prices and custom ASICs.

            • NeelyCam
            • 5 years ago

            So… who’s the poor sap that was left holding the bag?

            • MadManOriginal
            • 5 years ago

            The only real losers were anyone who bought an AMD GPU for gaming when the prices were inflated because of the miners.

            • JustAnEngineer
            • 5 years ago

            Not just AMD. The NVidia price cuts that we’re seeing now could have come sooner if not for the mining bubble.

      • Krogoth
      • 5 years ago

      Drives up cost too much and is overkill.

      A 750 Ti or a more crippled “GM204” chip marketed as a “960” would make more sense. Remember, the Steambox is going to be running in the living room and be connected to a 1080p HDTV. I suppose that it could be connected to a 4K HDTV, but the 980 cannot maintain a smooth framerate at such resolutions, especially if you want AA/AF with modern titles.

        • Deanjo
        • 5 years ago

        [quote<]A 750Ti or a more crippled "GM204" chip marketed at "960" would make more sense[/quote<]

        For older stuff the 750 Ti is fine, I agree, but games such as Metro (the Redux versions), etc., can very much use the additional horsepower. Unlike a console, SteamOS/PC games can require more powerful hardware over time to maintain satisfactory performance. You do not even have to venture into 4K resolutions to get a benefit from faster hardware, even at 1080p.

          • Krogoth
          • 5 years ago

          FYI, my 660 Ti effortlessly handles 1920×1080 in modern titles with AA/AF. The 750 Ti is roughly in the same league, while a “960” would be somewhere in the 770-to-780 range.

          The benches for the 970/980 were run at 2560×1440, which is roughly double the pixels, and they were able to handle it without too much fuss.

          The problem is that the Steambox in most cases is going to be driving a 1080p HDTV, so that extra power goes to waste, and it just drives up unit cost and power consumption.

            • Deanjo
            • 5 years ago

            FYI, there is a very noticeable difference between a 750 Ti and a more powerful card such as the Titan, even at 1080p. I would much rather have my gameplay experience take advantage of 1080p @ 60 Hz capabilities instead of hovering around at half the frame rate.

            [url<]http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-750-and-750-ti-review,15.html[/url<]

            • Krogoth
            • 5 years ago

            I’m not debating that higher-end chips can yield higher levels of performance at 1080p. The problem is that such units drive up cost and power consumption. These are luxuries for HTPC/gaming-rig hybrids like the Steambox.

            If you cared enough about gaming performance, you wouldn’t be looking at a Steambox in the first place.

            • Deanjo
            • 5 years ago

            [quote<] The problem is such units drive-up cost and power consumption. These are luxuries for HTPC/gaming rig hybrids like the Steambox.[/quote<]

            Many of the Steambox reference systems were outfitted with K-series Ivy Bridge i7s and Titans or 780 Tis and do not have any issues whatsoever. Now consider that Haswell and the likes of the 980 and 970 draw even less power, so this is even less of a concern.

            [quote<]If you care enough about gaming performance. You wouldn't be looking at a Steambox in the first place.[/quote<]

            Why not? Many Steambox configurations put the average gamer system to shame.

            • Krogoth
            • 5 years ago

            The higher-end models (Falcon NW/Alienware/Origin) are just mini-ATX/ITX cases with high-end gaming components. You might as well get a regular-ATX case and go with a multi-monitor setup (screw the HDTV/Living Room) if you have that kind of budget for a gaming system.

            The lower-end models are more likely what the official units will be deployed with (mid-range to lower-mid-range GPU/CPU combos).

            • Deanjo
            • 5 years ago

            Why go multi-monitor when you can fill your field of vision with just one screen (not to mention the comfort-level factor as well)?

            I have a kick-ass workstation pushing 11 megapixels, but there is a lot to be said for being able to comfortably game on the couch in front of a large single screen. I would only watch a movie on a tablet as a last resort, for example, despite it having a better resolution than the TV. The comfort and immersion are still better on the big screen. The other advantage is that the Steambox is able to fully utilize the Bryston/Yamaha/Paradigm home-theater audio capabilities, which blows the doors off of anything that can typically be stuffed into an office.

      • moog
      • 5 years ago

      They’re great for coughs, I have a Vicks Vaposteam.

    • Dezeer
    • 5 years ago

    Does Dynamic Super Resolution (DSR) only support preset resolutions, or is it user-configurable like GeDoSaTo?

    • Risme
    • 5 years ago

    Thanks for the article.

    I haven’t yet read any other reviews, nor have I had the time to analyze the data more comprehensively, but based on first impressions this chip seems to be an impressive feat of engineering by the technical personnel who worked on it.

    Although, as my interests are shifting more and more towards technical design, I’d be most interested to see how this chip would perform in a Quadro implementation in CAD and CAE environments.

    • tanker27
    • 5 years ago

    Shut up and take my money!

    • Laykun
    • 5 years ago

    Alright, now just bring on that die-shrink. *rubs hands together in anticipation*

    • Klimax
    • 5 years ago

    1) So much for any remaining shred of credibility for AMD’s claims about DX.
    DX12 exists, and we already have shipping silicon supporting new HW features. (No idea if all of them, but it looks like most are there.)

    2)
    [quote<] The result: almost a perfect square of 20.4 mm by 20.4 mm. That works out to 416 mm². Shortly after I had my numbers, NVidia changed its tune and supplied its own die-size figure: 398 mm². I suppose they're measuring differently. Make of that what you will. [/quote<]
    Measurement error (the instrument alone will make it problematic) + packaging/gap/extra border... = difference.

    3) We all knew that there must have been some kind of compression at least since Kepler; nice to get it confirmed.

    4) Now to get a 4K display, and a 980 would make a nice backup to the Titan on an i7-5960X... 😉

    5) Just missing the compute stuff.

    • RdVi
    • 5 years ago

    Very impressive. I’m very tempted to pick up a GTX 970, my first card from the green camp since a 9800 GT. To be honest, though, I’m really not thrilled about being locked into a proprietary standard like G-Sync for something that lasts as long as a monitor, so I’ll probably wait and see what AMD has to offer… if anything, before the end of the year.

    • bfar
    • 5 years ago

    Slightly cheaper than the last gen, but I'm still disappointed with the price of the GTX 980. Without a die shrink and with a 256-bit memory interface, these have got to be way cheaper to produce than GK110. I suspect there is potential for dramatic price drops if AMD puts up a fight.

    The GTX 970, on the other hand, is priced very fairly, and I take my hat off to Nvidia for that. There’s such a huge gap in price, but not much in performance. How do they expect to sell the GTX 980s at all?

    As for GTX 780 owners like myself, not much to see here, I think. I could look for a second-hand GTX 780 for SLI and then leave it there until next gen. But only if games arrive that need it.

      • travbrad
      • 5 years ago

      [quote<]The GTX970, on the other hand, is priced very fairly, and I take my hat off to Nvidia for that. There's such a huge gap in price, but not much in performance. How do they expect to sell the gtx980s at all?[/quote<]

      You would think they’d have a hard time, but there are always people willing to pay just about any cost for “top of the line” performance. The gap between the 780 and 780 Ti was similar, in both price and performance.

        • bfar
        • 5 years ago

        There was no Ti when the 780 launched. It was 770 -> 780 and the performance gap was larger.

      • Krogoth
      • 5 years ago

      The 980 will sell like hotcakes among the high-end crowd. It has 780 Ti performance at a lower price point and power consumption. A 980 SLI setup is a steal for what it is.

        • bfar
        • 5 years ago

        True enough. And on that note, current 780 owners also have the option of going SLI until a product with big Maxwell or a die shrink lands on the shelves. There should be a nice glut of Keplers arriving on the second-hand market now 🙂

        • travbrad
        • 5 years ago

        Yeah compared to the last generation it’s a steal. I would just say the 970 is even more of a steal.

    • puppetworx
    • 5 years ago

    I did not expect such an improvement over the last generation, especially on the same 28nm process. The 970 looks damn good.

    • USAFTW
    • 5 years ago

    Is there any way to watch Jen-Hsun’s keynote in full?

      • Mocib
      • 5 years ago

      [url<]http://www.twitch.tv/twingalaxieslive/b/569845701?t=7h39m30s[/url<]

    • rahulahl
    • 5 years ago

    I can’t find the Asus Strix 970 for sale anywhere yet.
    Has it even been released?

    I want to get them in SLI, but since in Australia it’s probably gonna end up costing me $550 each, I would rather get them from eBay, Newegg, or something.

      • f0d
      • 5 years ago

      PC Case Gear has an MSI 970 for $519:
      [url<]http://www.pccasegear.com/index.php?main_page=product_info&products_id=29086[/url<]

      You're probably better off getting one from Amazon (Newegg delivery to Australia is expensive).

      • juzz86
      • 5 years ago

      New listings at PCCG: a stock “Galax” 970 @ $419, Super Overclock @ $449.

      If they weren’t a completely unknown brand (Galaxy subdivision?) I’d probably jump on a pair of the SOC ones, but I’ll wait and see what the waterblocks play nice with. As there’s no reference 970 we’re probably going to need to be a bit careful blockwise.

        • Klimax
        • 5 years ago

        Looks more like a typo.

          • green
          • 5 years ago

          galax.net/US/story.html

          In short, a merger between Galaxy and KFA2, rebranded as Galax.

        • rahulahl
        • 5 years ago

        I am hoping I can get something for about $400 from eBay soon.
        If so, I’ll get a couple of them, and that would be much cheaper than buying in Australia.

      • Antias
      • 5 years ago

      I hear ya, mate.
      PC Case Gear is my go-to shopping place, but the “Aussie Tax” on our PC gear is often crippling…
      If PCCG can rein in their AU tax, then they may just get a sale from me, as long as it’s under $450 AU…

      Edit – looks like they might actually surprise us.
      There are some there now for $419 (stock), and the EVGA Superclocked is only $469.

    • albundy
    • 5 years ago

    “Substantial new rendering capabilities for DX12”
    “GM204 contains some significant new technology”

    This is why I’m hesitant. Once DX12 comes out next year, I have a feeling that hardware requirements will “substantially” change again, like a dreaded new shader model.

    • wnstitw
    • 5 years ago

    Can you please normalize those frame time graphs?

    Presumably they are used to compare the GPUs across different scenes which are delineated by time. Time to render a given frame number is no longer an apt comparison when one test system may be rendering an entirely different scene from another with that particular frame. It would make more sense for x-axis to measure running time (or just test completion percentage). In other words, stretch or compress the individual graphs along the x-axis to be the same length.
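
    A minimal sketch of the kind of normalization being suggested, assuming per-frame render times in milliseconds exported to CSV (the file names and data layout here are hypothetical):

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical per-frame render times (ms) for two cards over the same
    # benchmark run; the arrays differ in length because a faster card
    # renders more frames in the same scenes.
    runs = {
        "GTX 980": np.loadtxt("gtx980_frametimes_ms.csv"),
        "GTX 970": np.loadtxt("gtx970_frametimes_ms.csv"),
    }

    for name, times_ms in runs.items():
        # Plotting against cumulative elapsed time puts every card on the same
        # x-axis, so a demanding scene lands at roughly the same x position.
        elapsed_s = np.cumsum(times_ms) / 1000.0
        plt.plot(elapsed_s, times_ms, label=name, linewidth=0.7)

    plt.xlabel("Benchmark time (s)")
    plt.ylabel("Frame time (ms)")
    plt.legend()
    plt.show()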

      • Terra_Nocuus
      • 5 years ago

      The length of the frame time graph illustrates how “good” the card is at running that particular benchmark: i.e., if the card is better, it renders more frames and the graph is longer.

      The key thing to look for is the longest frame time graph w/ the lowest peaks.

        • wnstitw
        • 5 years ago

        You don’t need a graph for that. If you want a general sense for how bad frame time spikes are and their frequency there are literally multiple charts that show exactly that (99th percentile, time spent beyond 33ms etc.) and do it more succinctly and effectively. Overall graph length is equivalent to avg fps. Can you clearly see the difference between the lengths of frame time graphs of the 970 and the 980? Avg fps numbers and charts do this better.

        The whole purpose of putting frame times on a graph is so you can directly compare GPUs at any given moment of the benchmark to find relative strengths and weaknesses. If an intensive scene is rendered at a given point in time all GPUs should experience an increase in frame times. With time on the x-axis, these spikes will appear on the same x value and thus you could quickly see how the GPUs compare at rendering that particular scene. Using frame numbers as a metric offsets the graphs so they can no longer be directly compared.

        Going a step further, it would be nice to be able to hover your mouse over any part of the time normalized graph and see a small screen cap of what is being rendered. This way you could see exactly what sort of effects on screen cause the spikes.

          • Terra_Nocuus
          • 5 years ago

          That mouse-over idea sounds awesome, but as a web developer, please God, no! 😀

    • Meadows
    • 5 years ago

    [quote<]Our "badess" metric captures the difference most dramatically.[/quote<]

    Crysis 3 has a badass metric.

    After all this talk of the GTX 980 not bringing the architecture's ultimate potential to the fight, I was a bit surprised to see it beat the 780 Ti in all but one of the tests, also making the price seem reasonable in the process.

    <Tyrael> I finally understand. </Tyrael>

    • USAFTW
    • 5 years ago

    AMD’s silence is deafening. I wish we had more information on what they plan to do to counter Maxwell.
    Maxwell’s biggest bullet point for me is the power efficiency. GTX 780 Ti- and 290X-beating performance at that power consumption? Hell yeah!

      • exilon
      • 5 years ago

      Silence? Pfft, they leaked a 400W single-GPU CLC and announced more freesync!

        • USAFTW
        • 5 years ago

        Quite honestly, after seeing what Maxwell has to offer, I don’t want a 300 W+ graphics card anywhere near me, no matter how low the price. Part of me wishes for AMD to succeed, but frankly it’s a tough situation for them.
        Kinda reminds me of the GT200 vs. RV770 days. I hope AMD pulls another RV770 one day.

          • sschaem
          • 5 years ago

          I would be fine with Nvidia releasing a 350 W+ graphics card.

          Like a larger Maxwell with a 512-bit bus.
          Frankly, the last thing I want in a high-end PC or workstation is to save 50 to 100 W at load by reducing the performance by 25%. Makes no sense to me.

          And yes, AMD has been fully cornered by Intel and nvidia this year…

      • ronch
      • 5 years ago

      [quote<]AMDs silence is so defeaning. I wish we had more information on what they plan to do to counter Maxwell. [/quote<] Wait for 20nm, of course.

      • tomc100
      • 5 years ago

      Yeah, I’ve been using ATI/AMD GPUs for probably over ten years. The last Nvidia GPU I had was the first TNT. I then switched over to the red team when they came out with the 9700 Pro, which blew everything away at the time, and I have never looked back. I even supported them when they came out with that stinker 2800, only because it supported AGP. If they don’t come out with a truly revolutionary card with better energy efficiency, lower TDP, and faster frame rates, then the next GPU I get will probably be a GTX 980. Rumor is that AMD will continue using the same GCN architecture but use more transistors for their R9 390X, and that it will require liquid cooling, which is the wrong approach. Also, they need to release drivers more frequently instead of every 6 months.

    • yogibbear
    • 5 years ago

    I love this review! One thing that could make it even better: either show how much VRAM is being used in each game at the settings chosen, or, where it’s likely that the lower FPS numbers are due to cards hitting VRAM limits, test at a lower resolution to see whether the drop-off comes from the 2GB and 3GB VRAM limitations that the 970 & 980 aren’t held back by. This is mainly because I don’t own a 1600p monitor yet 🙁 (see the hardware survey you did recently and check the most common monitor resolution). I.e., I’m glad you’re testing these games at a good resolution, but I wouldn’t mind also seeing a few results at a lower resolution just to confirm yes/no whether you’re hitting VRAM chokes. Thanks.

      • Airmantharp
      • 5 years ago

      Here’s what you need to know- for today’s crop of games, 2GB is plenty for 1080p and scraping for 1440p/1600p, 3GB is enough for 1440p/1600p and scraping for 4k, and 4GB is enough for 4k.

      Also note that none of these games have been designed with just the new generation of consoles and PCs in mind. Future games that make use of the new consoles’ expansive available memory may very well be VRAM limited on PCs when run on their highest settings and at higher resolutions.

    • ronch
    • 5 years ago

    Considering how Nvidia must surely have been planning to talk about their new cards at their next event, if I were AMD I wouldn’t want any AMD fans to go there, see them, and be amazed and possibly swayed to the green camp. Yet that’s exactly what AMD did.

    • NovusBogus
    • 5 years ago

    I’d been holding off upgrading my stopgap 650 Ti for about a year to see what the next generation would look like. It would appear I chose wisely. I was expecting legendary efficiency but that 970 utterly dominates on price/performance too. A $300 Titan slayer masquerading as a high-midrange card, lol wut? At this rate I bet the 960 will be a cute-size card that toasts a 770/280X for half the price.

    • swampfox
    • 5 years ago

    I hope NVidia jumps at the chance to come out with a 999X… now that’s the card I need!

      • MEATLOAF2
      • 5 years ago

      They could make a dual 999X and call it the “GTX 2000”.

        • willmore
        • 5 years ago

        Rounding error?

          • MEATLOAF2
          • 5 years ago

          Floating point error, you’ll have to wait for the “Titan 1998”.

            • willmore
            • 5 years ago

            That’s an integer. 🙂

      • gigafinger
      • 5 years ago

      And AMD counters with a 6000 SUX.

      I’d buy that for a dollar!

    • Pantsu
    • 5 years ago

    Titan-class performance has reached $329 levels – just think about that for a moment. It’s great to see Nvidia finally push prices down, something I really didn’t expect from them. The 970 is a really tough blow for AMD. It almost matches the 290X, so what are they going to do? Release a full Tonga that will likely lose to the 970 in every category? Looks like we’re in for some nice price drops and actual competition for once. I can’t think of the last time Nvidia had such a good deal going – I guess it goes all the way back to the 8800 GT.

    Can’t wait for the big chips to come from both teams. Might be finally time to let go of my 280X CF.

      • swampfox
      • 5 years ago

      Just looking at the graphs, I’d guess the full Tonga would compete reasonably with the 970, but I’d be surprised if it surpassed it and hit into 980 territory.

        • Mikael33
        • 5 years ago

        And do so using a ton more power.

    • TwoEars
    • 5 years ago

    It is going to be a long Friday before the weekend arrives at AMD headquarters.

    • HurgyMcGurgyGurg
    • 5 years ago

    Anyone know how to relate the specs back to CUDA performance? I run a lot of models for work that can take over a week on a 780, so any performance improvements are easy to justify 😛

    Will it outperform a 780 Ti, for instance, on GEMM / general BLAS workloads? FLOPS is a good start, but how to factor in other metrics and their relation to matrix ops / FFT / convolutions seems difficult. I don’t know other review sites well and can’t find any reviews that have CUDA benchmarks.

    Really appreciate any advice, thanks!

      • f0d
      • 5 years ago

      The x04-class chips (GK104/GM204) are not really built for CUDA performance, so I don’t think it would do as well as the 780 Ti on CUDA.

      I could be wrong, but usually the “mid range” chips (seems weird calling it mid range) are not as good as the bigger GPU models for CUDA.

        • cobalt
        • 5 years ago

        That’s true for double precision, as it’s uncapped on variants of the big chips. But if you’re okay with single precision, the CUDA performance has been growing along with gaming performance.
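
        As a rough way to relate the specs to single-precision throughput (a back-of-the-envelope sketch only; the core counts and base clocks below are nominal published figures and should be treated as assumptions, and achieved GEMM/FFT throughput also depends on memory bandwidth and architectural efficiency):

        # Peak single-precision rate: 2 FLOPs (one FMA) per CUDA core per clock.
        # Clocks are nominal base clocks (assumptions); boost and partner cards differ.
        cards = {
            "GTX 780 Ti": (2880, 875e6),   # (CUDA cores, base clock in Hz)
            "GTX 980":    (2048, 1126e6),
            "GTX 970":    (1664, 1050e6),
        }

        for name, (cores, clock_hz) in cards.items():
            tflops = 2 * cores * clock_hz / 1e12
            print(f"{name}: ~{tflops:.2f} TFLOPS peak (single precision)")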

    • codedivine
    • 5 years ago

    I guess AMD has been sitting on GCN laurels for far too long, with only minor evolution in ~3 years. Even when it debuted, GCN was not the most power-efficient architecture. So no wonder we now have a situation where Nvidia is besting AMD’s flagship card at 2/3rd the power.

      • lycium
      • 5 years ago

      I’d really like to see some OpenCL benchmarks against Intel’s iGPU, since it also seems to be quite power efficient and benefits from the large CPU cache.

        • Mikael33
        • 5 years ago

        If you look back at older generations, they’ve both had cards that were on par with each other where one needed a larger TDP to match performance.
        But yeah, Nvidia sure opened up a giant can of whoop-ass with these cards. I wonder how AMD can possibly counter this, given Tonga isn’t any more power-efficient than older GCN designs.

    • Krogoth
    • 5 years ago

    It is the 680 launch redux.

    I expect prices to jump for 970/980 (demand from early adopters) and a price war with the existing Kepler stock going against AMD’s offerings. I’m more curious to see how well Mid-Maxwell handles GPGPU related stuff.

    • odizzido
    • 5 years ago

    I am seriously tempted to get one of these right away. Anyone know how long till the bigger version of tonga comes out?

    • TwoEars
    • 5 years ago

    Looks like Nvidia is bustin a move from AMD’s repertoire with that 970!

    And the performance/watt is also spectacular.

    Hard to believe it was all done on 28nm! Nvidia has some very talented engineers working for them, that much is certain.

    • [+Duracell-]
    • 5 years ago

    That 970 is a beast at that price bracket! I’m really tempted to buy one tomorrow…

    But my monitors are stuck in 1680×1050 land, so there’s no point in me upgrading. Maybe buying a new video card will help me be rid of these monitors? 🙂

    EDIT: Just bought a GTX 970. Whee!

      • TwoEars
      • 5 years ago

      1680×1050? Ugh. How can you live with yourself?

      Just joking of course… but seriously! You need a monitor upgrade!

        • [+Duracell-]
        • 5 years ago

        Right? Just too stubborn to drop $550+ on 1920×1200 monitors. I refuse to go 16:9.

        What’s sad is that I’m stuck with a pair of 1440×900 monitors at work. At least I have two monitors… 🙁

          • swampfox
          • 5 years ago

          1080p isn’t that bad, particularly if you have a couple of them (says the guy reading your comment on a 1920×1200 Dell…)

            • My Johnson
            • 5 years ago

            Mine’s 1920×1200, but it’s TN. Paid *only* $350 for it back in the day.

      • Meadows
      • 5 years ago

      You don’t need to. Use Motherf* AA on the 970 and it’ll be like new.

        • [+Duracell-]
        • 5 years ago

        Very true! Although I’m actually more bugged that my monitors don’t match currently, so I’ll still want to upgrade.

      • ultima_trev
      • 5 years ago

      I’m still running a 1600×900 monitor, but I have no need to upgrade that particular device. The GPU is a different story, however; even at this pedestrian resolution, Crysis 3 with very high settings + 8x MSAA runs like a snail on my HD 7850. I can’t budget a PSU upgrade, so I’m hoping some seriously powerful GPUs that only require a single six-pin connection are on the horizon.

        • JustAnEngineer
        • 5 years ago

        With quality power supplies routinely available for less than $50, the idea that you would consider this a hindrance to purchasing a $400 graphics card is bizarre.

      • Jason181
      • 5 years ago

      1680×1050 @ 120 Hz requires 96% of the 2560×1440 @ 60 Hz pixel-pushing horsepower, so I’m interested even though my monitor is stuck in 1680 land as well.

      I’m sure you would’ve mentioned that your monitor is 120+ hz if it was, though
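
      As a quick check of that 96% figure (a naive pixels-per-second comparison; real-world performance doesn't scale perfectly linearly with pixel rate):

      # Pixels per second needed at each resolution/refresh combination.
      rate_1680_120 = 1680 * 1050 * 120   # 1680x1050 at 120 Hz
      rate_1440_60 = 2560 * 1440 * 60     # 2560x1440 at 60 Hz
      print(rate_1680_120 / rate_1440_60)  # ~0.957, i.e. roughly 96%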

        • [+Duracell-]
        • 5 years ago

        They’re about 6 years old, so they’re not 120+ Hz 🙁

    • ultima_trev
    • 5 years ago

    The GTX 970’s release is pretty much the same as what AMD did last year with the R9 290X: a new card with GTX Titan-matching performance at roughly half the price. Definitely gotta feel for those poor bastards who bought a GTX 780 in recent weeks, though.

    The GTX 960 rumored to be launching next month is what I’m looking forward to. Roughly R9 290 performance that will draw <130 watts and only require a single six-pin connector. However, if that doesn’t deliver, I guess I’ll have to wait ’til 16nm (since 20nm seems to be an Apple exclusive). 🙁

    • Freon
    • 5 years ago

    Wonder how much profit they have baked into that ~400 mm² die and 256-bit-bus card at $550. I have to imagine they can survive some serious pressure from AMD by just cutting into a bit of that fat, buttery margin they left for themselves. Current cards with 384-bit buses and larger dies cost less, after all.

    I’ll take a slight drop in price and an increase in performance (vs. the 780 Ti) as a win for the consumer either way for now, but if AMD can get their act together with a 384-bit-bus Tonga it could be even more interesting. Here’s hoping for the hobbled underdog.

    • UnfriendlyFire
    • 5 years ago

    Your move, AMD.

    A larger Tonga launch would be preferred, and please give mobile GPUs some love.

      • Krogoth
      • 5 years ago

      IMHO, a “larger” Tonga would make little difference.

    • blitzy
    • 5 years ago

    Sure took a while, but it’s good to see 780-class performance finally hitting mid-tier prices. I’m due for an upgrade, so I'll probably pick up a 970. Upgrading from a Radeon 5700, so it should be a big jump.

    Forgot to mention: nice review. I appreciate the extra attention that goes into explaining things in an easy-to-understand way for those of us who don’t keep super up with the play when it comes to graphics cards. I must be getting old 🙂

      • juzz86
      • 5 years ago

      Echoes my sentiments exactly, right down to the top-notch article. Great work Scott, thank you.

      Looks like a couple of 970s is the way to go for me, from a 690. Unfortunately, the Australia Tax is in full effect for now, so I’ll hold off a little:

      [url<]http://www.pccasegear.com/index.php?main_page=product_info&products_id=29086[/url<]
      [url<]http://www.pccasegear.com/index.php?main_page=product_info&products_id=29083[/url<]

        • f0d
        • 5 years ago

        Yeah, I saw those prices and went “awwwwww.”
        I was going to get one (PC Case Gear is just around the corner) until I saw those prices.

        Hopefully Amazon has some decent prices and importing one won’t cost as much.

          • juzz86
          • 5 years ago

          Yeah, you’re not wrong. I’d expected about USD400 on the 970, so thought about 450 or so here. Margins even worse lol.

          EDIT: Seems PCCG woke up, lol. New listings a bit more reasonable.

    • rxc6
    • 5 years ago

    That 970 looks pretty sweet. It feels like we waited a long time to get new stuff at decent prices. If AMD doesn’t come up with something interesting soon, it will be time to sell my 7970. That would make it my first Nvidia card in 12 years (9700 Pro -> X800 AIW -> 4950 -> unlocked 6950 -> 7970).

    Edit: I just made the rounds to a couple of places and noticed that neither Anand nor HardOCP has a review of the 970 as of now. I want to give you guys props for giving us a full review. Great job!

    • entropy13
    • 5 years ago

    Now if only I have the money for a 970…

      • entropy13
      • 5 years ago

      If each “+1” (currently +9 at the moment of posting) gave me $50, I’d be able to buy two 970s! LOL

      Says a lot about me when all I could afford as a new video card in the past 12 months is a GT 630 1GB…

        • superjawes
        • 5 years ago

        If I downvote you, does that mean I get your $50? >:-)

          • entropy13
          • 5 years ago

          Just make sure you send me £31 first before I give you the $50.

    • Ochadd
    • 5 years ago

    Looks like a great time to finally retire my 7970. Nearly double the performance for the same price I paid new, no power supply upgrade required, less noise, less heat, and I get to see how big green performs for the first time since my 7800 GTX.

      • Airmantharp
      • 5 years ago

      That was my jump from a pair of HD 6950s to a single GTX 670, though at the time AMD’s CrossFire was so borked that it felt like a pretty big upgrade even when the pair of AMD cards was technically faster :).

    • Orb
    • 5 years ago

    Meh! It’s like 10% faster than 290X. Gonna wait for what AMD has to offer.

      • trek205
      • 5 years ago

      Lol, have fun with that wait. By the time AMD comes up with something, Nvidia will be ready with the actual high-end Maxwell cards on a smaller process.

        • Orb
        • 5 years ago

        Like Titan ZzzZzzzz? Lmao!

    • Contingency
    • 5 years ago

    When’s the release date?

      • Ninjitsu
      • 5 years ago

      Today?

        • derFunkenstein
        • 5 years ago

        Not seeing them on Amazon or Newegg yet, but Tiger Direct has an eVGA GTX 970 for $340 in stock and ready to go.

        [url<]http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=9191503&CatId=7387[/url<]

          • Ninjitsu
          • 5 years ago

          Do you know what the launch prices are, here in India?

          980 = INR 46,000 = USD 755
          970 = INR 28,000 = USD 460

          🙁

            • derFunkenstein
            • 5 years ago

            well you’re getting shafted, not much I can do about that.

    • exilon
    • 5 years ago

    How is AMD going to respond? $270 R9 290? $300 R9 290X?

    Talk about getting Conroe’d.

      • Freon
      • 5 years ago

      384bit Tonga.

      • Krogoth
      • 5 years ago

      Nah, this is the 680 launch again.

        • Airmantharp
        • 5 years ago

        Yup, very familiar though I didn’t expect them to pull it off again.

          • Ninjitsu
          • 5 years ago

          Graphics scales well. Watch them pull this off [i<]again[/i<] on 20nm.

      • NovusBogus
      • 5 years ago

      Launch event coverage: [url<]http://www.youtube.com/watch?v=hjE2sxCQ_rU[/url<]

      • ronch
      • 5 years ago

      How? Send 15,000 fans to crash Nvidia’s party, of course.

        • rahulahl
        • 5 years ago

          And those 15,000 fans are so impressed by Nvidia’s new GPUs that they switch to the green team. 🙂

          • jihadjoe
          • 5 years ago

          “Go crash Nvidia’s party. It’ll be awesome”

          GJ, AMD Marketing Team!

      • Klimax
      • 5 years ago

      Almost identical situation.
      (Apparently dropped word… 😉 )

    • DancinJack
    • 5 years ago

    Yep, take my money for that 970. AMD looks badddddd compared to that 970.

      • The Dark One
      • 5 years ago

      That’s what we expected, right? The 750 Ti was such a kickass little card that its bigger brother was sure to beat a two-year-old AMD design.

      What was interesting to me was that the 970 was a better deal than the 285, according to the scatter plot. Yes, they’re in different price brackets, but the cheaper cards are generally the ones that give the most bang for the buck. The 285X is going to have to be priced well to compete with anything besides other AMD cards.

        • DancinJack
        • 5 years ago

        Yeah, I think it was expected but it’s still nice to see things be delivered upon once in a while. Pretty impressed with Nvidia right now. Crazy to imagine what they could have done with 20nm if Apple wasn’t hogging all the capacity.

      • huge
      • 5 years ago

      Agreed.

      • JumpingJack
      • 5 years ago

      Wowser, you nailed it. The 970 takes the crown for best bang for the buck. The 290X will need to drop down to the $350 range to really compete. Aside from being only slightly behind, the 970 is a much cooler and quieter card, which might force the 290X down to the $300-320 range. Naturally, the entire product stack needs to waterfall behind that.

        • beck2448
        • 5 years ago

        Still see latency issues with 290X. Nvidia ftw.

    • Jason181
    • 5 years ago

    Don’t know about anyone else, but I’d rather see the 8.3 ms results at a lower resolution. High-resolution 120 Hz monitors are still pretty expensive, and I’d rather have a 120 Hz 1920×1080 than a 60 Hz 4K.

      • Airmantharp
      • 5 years ago

      I’m eyeing 1440p at 144Hz with some form of adaptive V-Sync, but I understand the sentiment :).

    • Anvil
    • 5 years ago

    Looks like a very decent set of video cards there. I’m set with my 290 but it definitely seems like the 970 is the new Card to Get. Hopefully Nv can keep up supplies of that one.

    • jessterman21
    • 5 years ago

    Haha, TR is FIRST!

      • PrincipalSkinner
      • 5 years ago

      But you are not.

    • willmore
    • 5 years ago

    I knew I stayed up late for a reason! Yes, thank you, TR!

      • Klimax
      • 5 years ago

      Morning breakfast for me. (Comments delayed by work…)

      Note: CEST (Central European Summer Time)
