Nvidia’s GeForce GTX 980 Ti graphics card reviewed

You knew it was coming. When Nvidia introduced the GeForce GTX Titan X back in March, it was only a matter of time before a slightly slower, less expensive version of that graphics card hit the market. That’s pretty much how it always happens, and this year’s model is no exception.

Behold, the GeForce GTX 980 Ti:

Drawing on my vast reserve of historical knowledge, I can tell you that the “Ti” at the end of that name ostensibly stands for “Titanium.” Look a little closer at the specs for this product, though, and you’ll notice that it might as well stand for “Titan.” The GTX 980 Ti is more of a slightly de-tuned Titan X than a hopped-up GeForce GTX 980.

The GTX 980 Ti is based on the GM200 graphics chip, just like the Titan X, and the spec sheet lists the same base and boost clock speeds for both cards. The 980 Ti comes with two modest reductions: only 22 of the GM200’s possible 24 shader multiprocessor units are enabled, and the card has “only” 6GB of GDDR5 memory onboard. That’s it for the cuts, and they’re mostly painless. The 980 Ti still has the same polygon throughput and memory bandwidth as a Titan X, with only a tad less texture filtering and computational power.

Well, kinda. You see, Nvidia has further tuned the GM200 GPU on the 980 Ti, and it expects slightly higher operating clock speeds (roughly 20MHz out of ~1000MHz) as a result. So the difference between the first run of Titan X cards and this newcomer is even smaller in practice than the spec sheet suggests.

Not to worry, rich kids: brand-new Titan X cards now ship with this same tuning, so you can still be ultimate by ordering a new Titan X. (Or, you know, pushing the little slider around in an overclocking utility.)

| | GPU base clock (MHz) | GPU boost clock (MHz) | ROP pixels/clock | Texels filtered/clock | Shader processors | Memory path (bits) | GDDR5 transfer rate | Memory size | Peak power draw | E-tail price |
|---|---|---|---|---|---|---|---|---|---|---|
| GTX 960 | 1126 | 1178 | 32 | 64 | 1024 | 128 | 7 GT/s | 2 GB | 120W | $199.99 |
| GTX 970 | 1050 | 1178 | 56 | 104 | 1664 | 224+32 | 7 GT/s | 3.5+0.5 GB | 145W | $329.99 |
| GTX 980 | 1126 | 1216 | 64 | 128 | 2048 | 256 | 7 GT/s | 4 GB | 165W | $499.99 |
| GTX 980 Ti | 1002 | 1075 | 96 | 176 | 2816 | 384 | 7 GT/s | 6 GB | 250W | $649.99 |
| Titan X | 1002 | 1075 | 96 | 192 | 3072 | 384 | 7 GT/s | 12 GB | 250W | $999.99 |

The table above shows the revised GeForce lineup, and you’ll notice that the GTX 980 Ti lists for $649.99. That’s a nice discount from the one-grand price of the Titan X, especially considering how similar the two products really are. The GTX 980 Ti will come with a copy of Batman: Arkham Knight, as well. At $650, this card isn’t exactly a bargain, but it’s a way better deal than the $1K flagship.

Speaking of which, to make room for the 980 Ti, Nvidia has also dropped the price of the vanilla GeForce GTX 980 by 50 bucks to $499.99.

Beyond that basic info, there’s not much more to say about this new GeForce. The board is rated for 250W of peak power draw, so Nvidia recommends a 600W PSU for the host system. You’ll need one eight-pin PCIe aux power lead and one six-pin lead in order to power the card.

As you can see, our review unit comes with Nvidia’s standard silver-and-black reference cooler with light-up green lettering across the top. I still like the looks of it, and the cooler’s performance is pretty solid, although the Titan X’s black-out paint job is easier on the eyes, in my estimation.

Nvidia is releasing a little bit of other news today to go along with the GTX 980 Ti’s introduction. Among those tidbits is an update on G-Sync and some new software tech for virtual reality game development, which I’ve covered separately. Now, let’s see how this puppy performs.

Our testing methods

Most of the numbers you’ll see on the following pages were captured with Fraps, a software tool that can record the rendering time for each frame of animation. We sometimes use a tool called FCAT to capture exactly when each frame was delivered to the display, but that’s usually not necessary in order to get good data with single-GPU setups. We have, however, filtered our Fraps results using a three-frame moving average. This filter should account for the effect of the three-frame submission queue in Direct3D. If you see a frame time spike in our results, it’s likely a delay that would affect when the frame reaches the display.
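For the curious, here’s a minimal sketch of the sort of filtering described above, written in Python. The exact window handling in our tooling isn’t spelled out here, so treat the trailing three-frame window as an assumption rather than as our production code.

```python
# A minimal sketch (not production tooling) of a three-frame moving average
# applied to per-frame render times, approximating the smoothing effect of
# Direct3D's three-frame submission queue. The trailing window is an assumption.

def filter_frame_times(frame_times_ms, window=3):
    """Apply a simple trailing moving average to a list of frame times (ms)."""
    filtered = []
    for i in range(len(frame_times_ms)):
        start = max(0, i - window + 1)
        chunk = frame_times_ms[start:i + 1]
        filtered.append(sum(chunk) / len(chunk))
    return filtered

# Hypothetical trace: a lone 45-ms spike gets spread across three frames,
# much as the GPU's submission queue would soak it up in practice.
print(filter_frame_times([16.7, 16.7, 45.0, 16.7, 16.7]))
```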

We didn’t use Fraps with Civ: Beyond Earth or Battlefield 4. Instead, we captured frame times directly from the game engines using the games’ built-in tools. We didn’t use our low-pass filter on those results.

As ever, we did our best to deliver clean benchmark numbers. Our test systems were configured like so:

| Processor | Core i7-5960X |
| Motherboard | Gigabyte X99-UD5 WiFi |
| Chipset | Intel X99 |
| Memory size | 16GB (4 DIMMs) |
| Memory type | Corsair Vengeance LPX DDR4 SDRAM at 2133 MT/s |
| Memory timings | 15-15-15-36 2T |
| Chipset drivers | INF update 10.0.20.0, Rapid Storage Technology Enterprise 13.1.0.1058 |
| Audio | Integrated X79/ALC898 with Realtek 6.0.1.7246 drivers |
| Hard drive | Kingston SSDNow 310 960GB SATA |
| Power supply | Corsair AX850 |
| OS | Windows 8.1 Pro |

| | Driver revision | GPU base core clock (MHz) | GPU boost clock (MHz) | Memory clock (MHz) | Memory size (MB) |
|---|---|---|---|---|---|
| Asus Radeon R9 290X | Catalyst 15.4/15.5 betas | — | 1050 | 1350 | 4096 |
| Radeon R9 295 X2 | Catalyst 15.4/15.5 betas | — | 1018 | 1250 | 8192 |
| GeForce GTX 780 Ti | GeForce 352.90 | 876 | 928 | 1750 | 3072 |
| Gigabyte GeForce GTX 980 | GeForce 352.90 | 1228 | 1329 | 1753 | 4096 |
| GeForce GTX 980 Ti | GeForce 352.90 | 1002 | 1076 | 1753 | 6144 |
| GeForce Titan X | GeForce 352.90 | 1002 | 1076 | 1753 | 12288 |

Thanks to Intel, Corsair, Kingston, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Also, our FCAT video capture and analysis rig has some pretty demanding storage requirements. For it, Corsair has provided four 256GB Neutron SSDs, which we’ve assembled into a RAID 0 array for our primary capture storage device. When that array fills up, we copy the captured videos to our RAID 1 array, made up of a pair of 4TB Black hard drives provided by WD.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Sizing ’em up

Do the math involving the clock speeds and per-clock potency of the latest high-end graphics cards, and you’ll end up with a comparative table that looks something like this:

| | Peak pixel fill rate (Gpixels/s) | Peak bilinear filtering int8/fp16 (Gtexels/s) | Peak rasterization rate (Gtris/s) | Peak shader arithmetic rate (tflops) | Memory bandwidth (GB/s) |
|---|---|---|---|---|---|
| Asus R9 290X | 67 | 185/92 | 4.2 | 5.9 | 346 |
| Radeon R9 295 X2 | 130 | 358/179 | 8.1 | 11.3 | 640 |
| GeForce GTX 780 Ti | 37 | 223/223 | 4.6 | 5.3 | 336 |
| Gigabyte GTX 980 | 85 | 170/170 | 5.3 | 5.4 | 224 |
| GeForce GTX 980 Ti | 95 | 189/189 | 6.5 | 6.1 | 336 |
| GeForce Titan X | 103 | 206/206 | 6.5 | 6.6 | 336 |

Those are the peak capabilities of each of these cards, in theory. Our shiny new Beyond3D GPU architecture suite measures true delivered performance using a series of directed tests.

The GTX 980 Ti lands squarely in the middle between the GTX 980 and the Titan X in terms of pixel fill rate, which is what we’d expect given the theoretical rates in the table above. Notice that the 980 Ti’s peak rate is lower than the Titan X’s even though it has the same ROP count (96 pixels per clock) and clock speed. That’s because, on recent Nvidia GPUs, fill rate can be limited by the number of shader multiprocessors and rasterizers. The GTX 980 Ti’s 22 SMs can only transfer 88 pixels per clock to the ROPs, so its peak throughput is a bit lower than the Titan X’s.
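To make the arithmetic concrete, here’s a quick back-of-the-envelope sketch of that fill-rate limit in Python. The four-pixels-per-SM transfer rate is an assumption chosen to reproduce the 88-pixels-per-clock figure above, not a number from Nvidia’s documentation.

```python
# Back-of-the-envelope fill-rate math for the GTX 980 Ti, using figures from
# the spec tables above. The pixels-per-SM rate is an assumption that simply
# reproduces the 88-vs-96 pixels-per-clock limit described in the text.

boost_clock_mhz = 1075         # GTX 980 Ti boost clock
rop_pixels_per_clock = 96      # ROP throughput
sm_count = 22                  # enabled shader multiprocessors
pixels_per_sm_per_clock = 4    # assumed SM-to-ROP transfer rate

sm_limited = sm_count * pixels_per_sm_per_clock      # 88 pixels per clock
effective = min(rop_pixels_per_clock, sm_limited)    # the SMs, not the ROPs, limit here

fill_rate_gpix = effective * boost_clock_mhz / 1000
print(f"{fill_rate_gpix:.1f} Gpixels/s")             # ~94.6, vs. ~95 in the table
```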

This test nicely illustrates the impact of color compression on memory bandwidth. Newer GeForces based on the Maxwell architecture are able to extract substantially more throughput from the easily compressible black texture than the Kepler-based GTX 780 Ti does.

Meanwhile, as Andrew Lauritzen pointed out to us, the Radeon R9 290X doesn’t show any compression benefits in this test because it’s primarily limited by its ROP throughput. We may have to rejigger this test to sidestep that ROP limitation. I suspect, if we did so, we’d see some benefits from color compression on the Radeon, as well.

Most of these GPUs come incredibly close to matching their peak theoretical filtering rates in this test, the GTX 980 Ti included.

The GeForce cards all somewhat exceed their theoretical peaks in the polygon throughput test. My best guess is that they’re able to operate at higher-than-usual clock speeds during this directed test—either that or they’re warping the fabric of space and time, but I don’t think that feature has been implemented yet.

Looks to me like the GTX 980 Ti is also exceeding its “GPU Boost” clock in our ALU math test, where it scores slightly higher than its 6.1-teraflop theoretical max. Nvidia’s Boost clock is just a typical operating speed, not a maximum frequency, so this isn’t a huge surprise. Notice, though, that the Titan X tops out right around its 6.6-teraflop theoretical max, no higher than expected. The 980 Ti’s lower GPU voltage probably gives it an edge over our early-model Titan X.

Project Cars
Project Cars is beautiful. I could race around Road America in a Formula C car for hours and be thoroughly entertained, too. In fact, that’s pretty much what I did in order to test these graphics cards.


Click the buttons above to cycle through the plots. Each card’s frame times are from one of the three test runs we conducted for that card. You’ll see that the frame rendering times for the GTX 980 Ti are nice and consistently low, always below about 25 milliseconds, even though we’re testing at 4K with some pretty intensive image quality settings. The plots for most of the GeForce cards look similar.

Switch over to the Radeon plots, and things look a little rougher. That’s true even though I tested this game with the latest Catalyst 15.5 beta drivers with specific optimizations for Project Cars. The R9 290X doesn’t look bad—just a bit slow overall, really—but the R9 295 X2 is another story. The R9 295 X2 is a dual-GPU monstrosity with water cooling that is arguably the GTX 980 Ti’s closest competition from AMD. The X2’s dual GPUs make it generally faster than the 290X, but this card somehow runs into some trouble in the middle of our test run. In fact, we encountered the same issue across multiple test runs.

Granted, we’re testing with Fraps, which measures early in the frame production pipeline, not with FCAT, which measures frame delivery to the screen. (I’d prefer to test with FCAT but haven’t been able to get it working at 4K resolutions.) AMD’s frame-pacing solution might smooth the delivery of frames to the screen and produce a bit smoother line than you see in the plot above, but it can’t fix those larger delays.

In fact, you can feel this slowdown while playing on the 295 X2. It happens in a specific section of the track, a long straight where the car is pointed into the sun.

The R9 295 X2 matches the GTX 980 Ti in average FPS, but we know that the 980 Ti’s frame delivery is much smoother overall. That fact is captured by our frame-time-sensitive 99th percentile metric.

We can understand in-game animation fluidity even better by looking at the entire “tail” of the frame time distribution for each card, which illustrates what happens with the most difficult frames.

The 980 Ti and the 295 X2 generally perform the same, but the Radeon struggles with the last five to 10 percent of frames that prove challenging for whatever reason.


These “time spent beyond X” graphs are meant to show “badness,” those instances where animation may be less than fluid—or at least less than perfect. The 50-ms threshold is the most notable one, since it corresponds to a 20-FPS average. We figure if you’re not rendering any faster than 20 FPS, even for a moment, then the user is likely to perceive a slowdown. 33 ms correlates to 30 FPS or a 30Hz refresh rate. Go beyond that with vsync on, and you’re into the bad voodoo of quantization slowdowns. And 16.7 ms correlates to 60 FPS, that golden mark that we’d like to achieve (or surpass) for each and every frame.
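If you’re curious how a “time spent beyond X” figure falls out of a frame-time trace, here’s a minimal sketch. It’s an approximation of the idea, not necessarily our tools’ exact accounting: for each frame that runs past the threshold, count only the excess time.

```python
# Minimal sketch of the "badness" metrics: total time spent past a frame-time
# threshold, plus a simple 99th-percentile frame time. This approximates the
# idea described above; it is not necessarily the exact accounting used here.

def time_beyond(frame_times_ms, threshold_ms):
    """Milliseconds of rendering time spent past the threshold."""
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

def percentile_99(frame_times_ms):
    """Frame time that 99% of frames meet or beat (nearest-rank style)."""
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

# Hypothetical trace (ms): one frame blows past 50 ms, two pass 33.3 ms.
trace = [16.7, 18.2, 52.0, 16.9, 35.1, 17.0]
print(time_beyond(trace, 50))     # 2.0 ms beyond the 50-ms mark
print(time_beyond(trace, 33.3))   # 20.5 ms beyond the 33.3-ms mark
print(percentile_99(trace))       # 35.1 ms
```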

The 295 X2 spends some time beyond the 33-ms mark, as does the R9 290X. None of the GeForces do.

The Witcher 3

Performance in this game has been the subject of some contention, so I tried to be judicious in selecting my test settings. I tested the Radeons with the just-released Catalyst 15.5 beta drivers, and all cards were tested with the latest 1.04 patch for the game. Following AMD’s recommendations for achieving good CrossFire performance, I set “EnableTemporalAA=false” in the game’s config file when testing the Radeon R9 295 X2. And, as you’ll see below, I disabled Nvidia’s HairWorks entirely in order to avoid the associated performance pitfalls.


The Maxwell-based 900-series GeForce cards all perform well here, but the GTX 780 Ti and the Radeons struggle a bit. Notice that we’re not even testing at 4K.

Once again, the R9 295 X2 would seem to perform well based on its FPS average, but the frame-time plots and the 99th percentile tell a different tale. As you can see in the curve below, the 295 X2 struggles with the last five percent or so of frames. We know from the plot and from play-testing that those slowdowns are distributed throughout the duration of our test session, with the most trouble coming in the first third or so. The 295 X2’s performance isn’t horrible, but it doesn’t deliver smooth gaming to match the GTX 980, let alone the 980 Ti.

The GTX 980 Ti remains a very close match for the Titan X. In fact, their frame time curves are nearly right on top of one another in the image above.


The R9 295 X2 does perform well generally, as indicated by the fact that it spends less time beyond the 16.7-ms threshold than the 980 Ti. But it does spend time beyond the 50-ms threshold, and the Maxwell-based GeForces don’t.

GTA V

Forgive me for the massive number of screenshots below, but GTA V has a ton of image quality settings. I more or less cranked them all up in order to stress these high-end video cards. Truth be told, most or all of these cards can run GTA V quite fluidly at lower settings in 4K—and it still looks quite nice. You don’t need a $500+ graphics card to get solid performance from this game in 4K, not unless you push all the quality sliders to the right.



Finally, we have a game where the R9 295 X2 seems to live up to its considerable potential. The X2’s lead over the GTX 980 Ti isn’t as dramatic as the FPS average would seem to indicate, though, if you look at the 99th percentile frame times. Also, the GTX 980 Ti somehow edges out the Titan X here by a smidgen, perhaps due to slightly higher operating clock speeds.

Far Cry 4



The R9 295 X2’s performance in Far Cry 4 is much improved from the basket case it was in our Titan X review.

As for the 980 Ti, it’s almost an exact match for the Titan X; its orange curve is entirely covered by the Titan X’s in the plot above.

Alien: Isolation



No surprises here. The cards all perform well, and they finish pretty much in the order one might expect.

Civilization: Beyond Earth

Since this game’s built-in benchmark simply spits out frame times, we were able to give it a full workup without having to resort to manual testing. That’s nice, since manual benchmarking of an RTS with zoom is kind of a nightmare.

Oh, and the Radeons were tested with the Mantle API instead of Direct3D. Only seemed fair, since the game supports it.



This is an incredibly close match-up between the top three cards, with no clear winner. The 295 X2 is using the Mantle graphics API and a load-balancing method called split-frame rendering in order to divvy up the work between its two GPUs. SFR is preferable to the usual method, AFR, for a number of techie reasons. SFR doesn’t yield abnormally high FPS averages, but it can produce a better user experience, with more instant responses to user inputs. Interestingly, when the X2 teams its two big Hawaii GPUs together, it delivers approximately the same performance as one GM200 GPU aboard the GTX 980 Ti.
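For readers unfamiliar with the two schemes, here’s a toy illustration of the difference, written in Python. It’s purely conceptual and nothing like AMD’s actual driver logic: AFR hands out whole frames to the GPUs in turn, which pads FPS averages, while SFR splits every frame across both GPUs, so each individual frame can finish sooner.

```python
# Toy illustration (not AMD's driver code) of two multi-GPU load-balancing
# schemes: AFR assigns whole frames in turn, SFR splits every frame so that
# both GPUs contribute to each one.

def afr_schedule(frame_count, gpu_count=2):
    """Alternate-frame rendering: frame N is rendered entirely by GPU N % gpu_count."""
    return {frame: frame % gpu_count for frame in range(frame_count)}

def sfr_schedule(frame_count, gpu_count=2):
    """Split-frame rendering: every frame is divided into one slice per GPU."""
    return {frame: [f"GPU{g} renders slice {g}" for g in range(gpu_count)]
            for frame in range(frame_count)}

print(afr_schedule(4))   # {0: 0, 1: 1, 2: 0, 3: 1}
print(sfr_schedule(2))   # each frame lists work for both GPUs
```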

Battlefield 4

We tested BF4 on the Radeons using the Mantle API, since it was available.



Once again, with Mantle and proper load-balancing, the 295 X2’s two Hawaii GPUs almost exactly equal the performance of a single GM200 GPU aboard the GTX 980 Ti or Titan X.

Crysis 3



This is an odd one. The R9 295 X2 is clearly the fastest card overall, yet it spends the most time beyond the 50-ms threshold thanks to a few distinct slowdowns during the test session. My subjective sense is that the GM200-based GeForces feel smoother overall, although we’re kind of splitting hairs at this point.

Power consumption

Please note that our “under load” tests aren’t conducted in an absolute peak scenario. Instead, we have the cards running a real game, Crysis 3, in order to show us power draw with a more typical workload.

Here’s where we figure out why the R9 295 X2 isn’t really a very good foil for the GTX 980 Ti. When equipped with the X2 and its two big Hawaii GPUs, our test rig draws more than twice the power at the wall socket as it does with a GTX 980 Ti. We’re measuring power draw in Crysis 3—and as we’ve just noted, the GTX 980 Ti and the 295 X2 deliver very similar performance in this game. The power efficiency implications are pretty clear.

Noise levels and GPU temperatures

These video card coolers are so good, they’re causing us testing problems. You see, the noise floor in Damage Labs is about 35-36 dBA. It varies depending on things I can’t quite pinpoint, but one notable contributor is the noise produced by the lone cooling fan always spinning on our test rig, the 120-mm fan on the CPU cooler. Anyhow, what you need to know is that any of the noise results that range below 36 dBA are running into the limits of what we can test accurately. Don’t make too much of differences below that level.

The GTX 980 Ti draws a bit less power under load than the Titan X, and as a result, it doesn’t push the Nvidia reference cooler quite as hard. The result is a small reduction in noise levels under load. The 295 X2 isn’t that much louder than the 980 Ti, at the end of the day, but it manages that feat by combining a longer card with an external radiator for its water cooler.

Conclusions

As usual, we’ll sum up our test results with a couple of value scatter plots. The best values tend toward the upper left corner of each plot, where performance is highest and prices are lowest. We’ve converted our 99th-percentile frame time results into FPS, so that higher is better, in order to make this work.
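The conversion itself is trivial: FPS is just the reciprocal of the frame time. Here’s the one-liner version, for reference.

```python
# Converting a 99th-percentile frame time into an FPS-style figure so that
# "higher is better" on the scatter plots: FPS = 1000 / frame time in ms.

def frame_time_to_fps(frame_time_ms):
    return 1000.0 / frame_time_ms

print(frame_time_to_fps(20.0))   # 50.0 FPS
print(frame_time_to_fps(16.7))   # ~59.9 FPS, right at the 60-FPS "golden mark"
```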


The GeForce GTX 980 Ti fits neatly into Nvidia’s current lineup, offering nearly the same performance as the Titan X at a considerable discount. If you were salivating over a Titan X but decided to wait for the less expensive version, your patience has been rewarded.

The matchup between the GTX 980 Ti and its closest competitor, the Radeon R9 295 X2, is a strangely close mismatch. The R9 295 X2 clearly has more raw GPU horsepower, as demonstrated by its commanding lead in terms of FPS averages across the eight games we tested. Yet the X2 can’t always turn that additional power into a fluid gaming experience, which is why the 980 Ti ever-so-slightly surpasses it in our 99th percentile frame time metric. The dual-GPU Radeon’s performance is somewhat brittle, and it’s too often troublesome in recently released games.

The R9 295 X2 also consumes a heckuva lot of power, more than double that of a GTX 980 Ti when installed in the same system. To give you a sense of the disparity, have a look at this scatter plot of power efficiency in Crysis 3. The most efficient solutions will tend toward the top left portion of the plot.

These results are from just one game, but they illustrate how effectively Nvidia has managed to improve power efficiency with its Maxwell architecture. The GTX 980 Ti continues that tradition and leaves AMD at a distinct disadvantage. I can’t imagine choosing an R9 295 X2 over a GTX 980 Ti right now.

Then again, prospective buyers may want to wait a few weeks, because AMD is preparing its own next-gen GPU, code-named Fiji, to do battle with the GM200. We already know Fiji will feature an innovative memory type, known as HBM, that promises quite a bit more throughput than any of Nvidia’s current products. That advantage could make Fiji rather formidable when all is said and done. At the very least, things are about to get interesting. Nvidia has played a strong hand, and now it’s up to AMD to counter.


Comments closed
    • bhappy
    • 4 years ago

    Just wondering why a non reference R9 290X card/cooler with higher clocks was used for this review vs a stock gtx 980? Was it done because of the price difference between the two cards and if this was the case why wasn’t a non reference gtx 970 included in this review for comparison’s sake?

    • brothergc
    • 4 years ago

    wonder how it measures up to a plain GTX970 ?

    • ultima_trev
    • 4 years ago

    After taking this in, I think the most surprising thing about this review is not GTX 980 Ti giving 98% of Titan X’s performance, but vanilla GTX 980 (a mid range chip) beating out the R9 290X (AMD’s highest end chip) by roughly 30% overall.

    If there was any doubt before, it is there no longer. Maxwell is clearly nVidia’s Nehalem, the advancement that cements their dominance over AMD, much like Intel did back in 2008 with Bloomfield.

    There’s little hope of AMD closing the gap between 290X and 980 Ti with Fiji. Even if they do, it will be a $850 card that’s so hot/power hungry that it requires water cooling versus nVidia’s far more efficient, streamlined $650 technological wonder.

    Sorry AMD, hate to see you go but you’re never going to fend off the imminent bankruptcy at this rate. GG.

      • K-L-Waster
      • 4 years ago

      [quote<]There's little hope of AMD closing the gap between 290X and 980 Ti with Fiji. Even if they do, it will be a $850 card that's so hot/power hungry that it requires water cooling versus nVidia's far more efficient, streamlined $650 technological wonder.[/quote<] And we know this how? The only one of those things we know is it will be water cooled -- the rest is unconfirmed at this point.

      • Takeshi7
      • 4 years ago

      It was Conroe that cemented Intel’s dominance over AMD. AMD hasn’t been able to match Intel’s top tier since then.

      • Meadows
      • 4 years ago

      I hope you washed your hands after pulling out that price figure.

      • Krogoth
      • 4 years ago

      Maxwell is not a “Nehalem”.

      Maxwell is just an evolution of the Fermi family that builds upon the work Kepler started. Maxwell isn’t really that game changer either. Maxwell isn’t that much faster than their Kepler predecessors. This is mostly due to being stuck on TSMC’s 28nm process.

      Maxwell hasn’t offered any game-changing SKU yet. The 970 is close, but no cigar.

        • techguy
        • 4 years ago

        You’re way off. Maxwell in the form of GM204 is approximately 9-17% faster on average than Kepler in the form of GK110. All that whilst consuming 68-80W less power. And on the same process node. Maxwell is the most impressive architecture from either IHV since G80.

          • Krogoth
          • 4 years ago

          Take off the green-tinted shades. It is making you silly.

          GM204 die is almost as large as a GK110 die and the power consumption difference is actually closer to 20-40W at load. Maxwell doesn’t offer anything game changing to the market. It is just another evolution of Fermi dynasty.

            • techguy
            • 4 years ago

            it’s not like this information is secret…

            [url<]Http://techreport.com/review/27067/nvidia-geforce-gtx-980-and-970-graphics-cards-reviewed/12[/url<] [url<]https://techreport.com/review/27067/nvidia-geforce-gtx-980-and-970-graphics-cards-reviewed[/url<] for the record I have no allegiance in the GPU wars, I buy whatever makes sense for my purposes at the time I'm ready to buy. I have a 290 in one system (which I was just using to play a game with a friend before I made this post) and a 980 Ti in another system.

            • Krogoth
            • 4 years ago

            Fermi was the last game changing GPU that came from either IHV, despite the rough start. Tahiti comes in at second place. Their successors so far have been evolutions. Fiji and Pascal are going to be next big thing from either IHV.

            Maxwell is closer to being the “Haswell” of its family.

            • techguy
            • 4 years ago

            I see the argument you’re making, but since Maxwell and Kepler are on the same process node and Maxwell is vastly more efficient per watt, it’s a hard argument to sell. Maxwell isn’t just a power-optimized Kepler; it puts out greater performance with fewer ALUs as well. If it were the same architecture, the second point would be impossible to achieve. Also, the reduction in power consumption wouldn’t be possible to the degree which it is currently. You can get a few watts from process maturation, though at the time of Maxwell’s release relative to TSMC’s 28HP process all the low-hanging fruit had already been picked, so even that is hard to believe.

            I agree somewhat about Fermi, it was a very different architecture than its predecessors but the end result wasn’t nearly as impressive relative to the competition as Maxwell.

            • Chrispy_
            • 4 years ago

            When arguing Maxwell against Kepler, you have to remember that some of it is the application of lessons learned, genuine improvements in efficiency and performance…

            …and most of it is just a redistribution of transistor budgets. Maxwell is better [i<]at gaming[/i<] than Kepler, but they sacrificed other features and performance to get there. Not only is Maxwell a huge downgrade in some departments over Kepler, Kepler was an even greater downgrade over Fermi. Maxwell is Nvidia's most gaming-focused architecture yet. It's an exercise in min-maxing, and it's why Maxwell hasn't appeared in any form as a compute card, workstation card, double-precision card, or anything to do with Tesla. I am not for one minute defending AMD's gaming performance and efficiency deficit, but Hawaii cards are far more capable overall than anything in the Maxwell lineup, and it's AMD's stubborn refusal to trim the workstation, DP, and compute functions that keeps them in competition with the GK110 and not the GM204.

            • K-L-Waster
            • 4 years ago

            From that perspective, though, would it not make more sense to have 2 separate chip lines? Build one chip that is designed exclusively to perform well in games, and an entirely different architecture for compute that is designed exclusively for double precision?

            The advantage would be that you would not have to compromise in either arena. True, you wouldn’t be able to sell one card to a customer and deliver high performance gaming + compute, but really, the venn diagram of people who need both probably has only a small amount of overlap. (Are there people who need both on one machine? I’m sure there are — but they aren’t likely to be very numerous.)

            Of course, the down side is that you no longer can leverage the economies of scale from the gaming market to keep the price of your compute chips down. Not sure how much of a deal breaker that is, though, when Tesla cards sell for thousands or even tens of thousands — customers in that market aren’t afraid to spend money when they need to.

    • tootercomputer
    • 4 years ago

    Very nice review. Thanks.

    • USAFTW
    • 4 years ago

    Just finished reading the review. I think the combination of near Titan X performance for 350 bucks less is really enticing.
    A mildly upclocked version should be able to beat the X for around 300 bucks less.
    The power consumption on the 295X2 is stupid high, so that’s the end of it.
    AMD appears to have taken the mismanagement of their CPU division and applied it to their up-until-now well-to-do GPU division. Let’s hope they get their act together.

      • chuckula
      • 4 years ago

      There’s still Fiji (“Radeon Fury”) coming in the relatively not-too-distant future.

      I’m not that worried about Fiji’s theoretical performance [it’ll take on a Titan X just fine], but I am worried about AMD’s ability to actually get it delivered in quantity in a reasonable timeframe.

      [Edit: For the record, that downthumb means that AMD fanboys obviously think that Fiji is NOT able to take on Titan X. Thanks AMD fanboys, with sycophants like you, AMD doesn’t need enemies.]

        • USAFTW
        • 4 years ago

        Well, what I’ve seen from the leaks tell me:
        1. Watercooled: It’s going to be a warm and power hungry so and so.
        2. Dual 8-pin connectors: The card could theoretically chew up to 375 watts. However, if the 295X2 is anything to go by, PEG power regulation hasn’t stopped them before.
        3. Officially confirmed to be limited to 4 Gigs. AMD might try to sugarcoat it but that’s not enough. And we don’t know anything about how and if they will be able to utilize the framebuffer more efficiently.
        Either way, this has been one freaking long delay. Nvidia rolled out an entire line up of Maxwell GPUs while AMD’s Tahiti GPU from December of 2011 is still on sale!

          • JustAnEngineer
          • 4 years ago

          [quote<]AMDs Tahiti GPU from December of 2011 is still on sale![/quote<] It's a magical place. Tahiti and Pitcairn (GCN 1.0) need to be put out to pasture, but Hawaii & Bonaire (GCN 1.1) and Tonga (GCN 1.2) should still have life in them until 14/16 nm GPUs fill out the entire product line next year.

      • K-L-Waster
      • 4 years ago

      We’ll have to see how Fiji performs in the silicon when it arrives – but I think we should at least do AMD the courtesy of allowing the card to arrive before burying it.

      The built in water cooling angle is potentially worrying, but one explanation may be the fact that the chip and memory module are very compact, meaning that the heat production is concentrated in a small area. If that’s the case, it may not be so much that it’s a hot-running card like the 290x, but rather that the heat is in too compact an area for air cooling to work well.

      But yes, compared to the 980 TI, AMD’s previous generation cards aren’t very compelling.

    • cldmstrsn
    • 4 years ago

    Thank you! you guys are literally the only site I have seen that tested The Witcher 3 which is the most demanding game out there right now. Really nice to see that.

    • TopHatKiller
    • 4 years ago

    I’m pleased Nv priced it lower than expected. With a bit of luck there’ll soon be a price war. [The only nice kind of war.]

    Excuse me; something is concerning me. Why are there threats of banning for people expressing unpopular opinions or behaving in an uncivil manner? Shouldn’t we just put up with that? Unless a post contains violent threat or actively racist or homophobic language banning people seems too much like censorship.

    Very few people appear to agree with me on anything and many replies to my posts have been rude [or very rude & even worse unfunny] I find it a bit hurtful at time [yes, I’m a cry baby] but I just put up with it. [Mostly]
    But if Mr.Damage or who-so-ever is handing out bans, then please ban the following:
    [a] everyone who doesn’t like me
    [b] everyone who disagrees with me
    [c] just make that… everyone… just leaving me… then I’ll ban myself.

      • chuckula
      • 4 years ago

      So when the highest-modded post in the entire thread is from one Krogoth who isn’t exactly impressed with this card, what exactly constitutes an “unpopular” opinion?

      Are you capable of distinguishing between a post that isn’t overly enthusiastic about a product vs. a post that’s a personal attack on this website and its editors for failing to goose-step to the tune played by the AMD marketing department*?

      Are you aware of the fact that Damage has probably barely gotten any sleep in the last week since he not only knocked out this review but flew halfway around the world to Taipei just to cover Computex in person?

      * Hell, there’s a non-trivial subset of koolaiders around here that would attack TR for being paid anti-AMD shills if TR literally copy & pasted AMD’s press kits instead of reviewing AMD’s own products. They literally think that official on-the-record statements from AMD aren’t pro-AMD enough.

        • TopHatKiller
        • 4 years ago

        oh dear. please re-read my post. everyone should be able to feel free to express their opinion in the way they want. i’m not attacking anyone, and i’m sure mr.damage works hard.
        bye.

    • chuckula
    • 4 years ago

    It’s no big deal. AMD will soon release the Radeon FURY that is paired with a more affordable yet still very powerful sidekick: The Radeon Mild Annoyance.

      • JustAnEngineer
      • 4 years ago

      Radeon R9-390 (sans “X”) could be an interesting card. We’ll have to wait and see (except for the mindless fanboys that have already decided the card’s fate based on the expected color of the box).

        • chuckula
        • 4 years ago

        Actually both the R9-390X and R9-390 are rebranded Hawaii parts.

        So far, the only truly new part from AMD this year is the “Fury” (which is what we were calling the R9-390X for lack of a better name earlier).

        • Airmantharp
        • 4 years ago

        I’ve already decided the card’s fate in my box- it will be banished. By the color of the drivers :D.

    • TwoEars
    • 4 years ago

    Things I’d like to see:

    1) Other resolutions than 4k. WQHD or 1080p. Not everyone has (or wants) a 4k display and cards scale differently.

    2) Some older cards. Maybe a 680 for instance, it’s good reference to see if it’s time to upgrade or not.

    3) At least one synthetic benchmark.

    To compensate you don’t need that many gaming tests, 3-5 will suffice.

    • TheSeekingOne
    • 4 years ago

    It seems that Nvidia has managed to make GameWorks a success. AMD’s cards are underperforming nicely in these GameWorks titles. With the exception of FarCry4, which I’m thinking Ubisoft decided to give away info and perhaps some relevant source code to AMD to avoid further public embarrassment, AMD is doing very badly. I know that the 980 used in the review is overclocked, but the difference between AMD and Nvidia becomes relatively big in those GameWorks titles.

    Well… it seems that the only way out of this is for each company to make their own proprietary technologies, a vision TR were very uneasy about in their Mantle articles. Strangely though, I don’t see them treating GameWorks with the same criticism they did with Mantle.

      • chuckula
      • 4 years ago

      Good evening Sir.
      I’d like to present you with a selection of hammers: [url<]https://www.google.com/search?q=Sledge+Hammers&source=lnms&tbm=isch&sa=X&ei=10ZsVdOUNMzIogSnmoC4Cw&ved=0CAgQ_AUoAg&biw=1070&bih=1077&dpr=0.9[/url<]

      • killadark
      • 4 years ago

      At the end of the day people see what they want to see, and perceive what they want to…

    • dodozoid
    • 4 years ago

    Interesting how 780Ti generally sucks… seems like Radeons age with much more grace than GeForces

      • geekl33tgamer
      • 4 years ago

      Yeah, I was going to say it’s the VRAM until I saw the Titan Black performing the same. Maxwell changed the chip design an awful lot, and it became faster at some things and slower at others (just).

      Chalk it up to architecture improvements over the 2 years between the releases…

        • dodozoid
        • 4 years ago

        If my memory is correct, 780Ti did beat the crap out of 290x when it was released, which just doesent seem to be the case right now

          • Klimax
          • 4 years ago

          Standard order of driver optimizations. First this gen then last gen… Whatever inefficiency is there will get addressed. (We have seen it with Kepler/Fermi too)

      • dragosmp
      • 4 years ago

      Dunno if it’s wishful thinking, but I think I noticed that too. My last 3 cards have been AMD so I may be biased, but it seems like AMD cards age better. It could just be that when new their drivers are just so woefully inadequate to exploit the full speed of the card and slowly but surely they get there :\

      • bfar
      • 4 years ago

      Wasn’t always the way..

        • dodozoid
        • 4 years ago

        Sorry, I dont get it. What did you mean to say?

      • Krogoth
      • 4 years ago

      Nvidia has placed Kepler family into legacy status where they only perform critical updates. Just like how AMD has rendered anything older than Tahiti into legacy status.

        • dodozoid
        • 4 years ago

        but isnt that particular card a year and half old? Tahiti is more than twice its age

          • Krogoth
          • 4 years ago

          It shares the same underlying architecture as famed “GTX 680/GK104” just scaled-up.

          GK1xx family is a little older than the Tahiti family.

            • dodozoid
            • 4 years ago

            I am aware of that, but I reckon it should be judged by the newest product sharing the same fundamental architecture, not the oldest.
            It just doesn’t make me feel comfortable recommending their products knowing they might be left behind as soon as shiny new ones appear… On the other hand I am afraid AMD might not be around in two years (if their next line of GPUs and/or Zen both don’t bring a major success) and that would definitely stop any driver support…

    • JustAnEngineer
    • 4 years ago

    The first GeForce GTX 980Ti cards have appeared at Newegg this morning, priced between $660 and $700.
    [url<]http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600358543%20600565061&IsNodeId=1&bop=And&Order=PRICE&PageSize=30[/url<] P.S.: Newegg does have a few models listed for $200,000 for those folks that thought that the GeForce GTX Titan X was too plebeian.

      • chuckula
      • 4 years ago

      [quote<]P.S.: Newegg does have a few models listed for $200,000[/quote<] I smell an opportunity for audiophile-esque reviews! Do those $200,000 Titan Xs come with the special oxygen depleted PCBs per chance?

      • travbrad
      • 4 years ago

      Those will go great with my gold plated ethernet cables.

        • cmrcmk
        • 4 years ago

        Your ethernet cables are only gold [i<]plated[/i<]?? I'll bet all your digital music sounds like it's played by grade schoolers!

      • Leader952
      • 4 years ago

      GREAT NEWS there is a [b<][u<]50% OFF SALE[/u<][/b<] and they now are priced at $99,999.00. Be sure to order the one(s) with the $5-$7 shipping and not the one(s) with $99,999.00 shipping.

        • JustAnEngineer
        • 4 years ago

        The $200K listings are gone from Newegg this evening, but they’re now out of stock of every single GeForce GTX 980Ti model.

    • Klimax
    • 4 years ago

    Nice review. I would love to see something like DCS World there too.

    BTW: I think we have just seen sharp limit on AMD’s pricing. Unless AMD gets 15-20% on this card, they won’t be able to go any higher in price. (And they would love to do so)

    Price wars have begun…

    • sweatshopking
    • 4 years ago

    GUIZE! THIS REVIEW WAS CLEARLY DONE TO SUPPORT THE LIZARD PEOPLE RUNNING THE GOVERNMENT AND SPYING BY THE ILLUMINATI AT NVIDIA #CRAZYPEOPLE

      • Laykun
      • 4 years ago

      Nothing to [url=http://s3.amazonaws.com/rapgenius/1311949541_ILLUMINATI-ALL-SEEING-EYE.jpg<]see[/url<] here, move along.

    • NarwhaleAu
    • 4 years ago

    Thank you! What a glorious ending to a Sunday. 🙂 I shall read this with vigor before heading to bed.

    …what kind of FPS do you think it would get with DOTA 2?

      • sweatshopking
      • 4 years ago

      All of them.

      • Klimax
      • 4 years ago

      What Sunday. It was my first Monday morning reading… 😉

    • Krogoth
    • 4 years ago

    Not impressed

      • sweatshopking
      • 4 years ago

      THAT’S WHAT SHE SAID

      • Klimax
      • 4 years ago

      Unimpressed by unimpress.

      • chuckula
      • 4 years ago

      Nvidia wants to put a new logo on the box: Officially Krogothed.

    • DancinJack
    • 4 years ago

    o.O i just got a new driver. 353.06. Assuming there are 980Ti enhancements? Not sure it’d really make any difference to your results.

    [url<]http://www.nvidia.com/download/driverResults.aspx/85823/en-us[/url<]

    • Thresher
    • 4 years ago

    Just bought a 980 a month ago. Crud.

    • Wild Thing
    • 4 years ago

    What…no compute benchmarks?
    Guess the complete crippling of DP performance gets a pass… it’s NV, right?

      • chuckula
      • 4 years ago

      [Sniffs the air]

      Yes…. the fear is real in fanboys who couldn’t code a bash script much less do real development that utilizes a GPU.

        • derFunkenstein
        • 4 years ago

        There is literally nothing else to complain about .

      • Damage
      • 4 years ago

      GPU computing benchmarks have not been a feature of our GPU reviews for a long, long time.

      I’ve had very little demand for it, honestly.

      I’ve also had a tough time finding two things:

      1) Popular, non-gaming consumer applications that use GPU computing to good advantage where there’s a performance difference you can perceive and quantify between GPUs.

      2) The subset of such applications that requires robust double-precision floating-point performance.

      Do you have suggestions of applications of this sort that a good proportion of our readers would be likely to use?

      Also, can’t say I appreciate the insinuation that my failure to test consumer GPGPU computing applications with a deep need for IEEE754-compliant double-precision FP datatypes is “giving Nvidia a pass.” That’s pretty rich. I just… think people generally buy video cards to play games on them. You would do well to be civil if you want your feedback to be taken in earnest around here.

        • anotherengineer
        • 4 years ago

        “I just… think people generally buy video cards to play games on them.”

        Well there are some that use them for HTPC’s, video stuff, especially if they don’t have quicksync or a slower cpu. That would probably be something better than a DP or GPGPU compute benchmark.

        But first would probably be gaming.

          • MathMan
          • 4 years ago

          Yeah. And none of those examples you gave would require DP anyway.
          I think OP was just looking for an excuse to make the trolling less obvious.

            • DancinJack
            • 4 years ago

            I don’t think he was looking to make it less obvious. Wild Thing has consistently behaved the same way for a while now. Honestly not sure why there hasn’t been a ban, yet. Wild Thing adds nothing to the discussion, that I have ever seen.

            • Laykun
            • 4 years ago

            He’s too Wild to ban.

        • w76
        • 4 years ago

        A lot of distributed computing projects like their double-precision FP math. That said, that use case isn’t all that common even with tech enthusiasts, and certain projects change their apps or the data being run through them somewhat often in such a way that historical performance data might get stale quickly. Better to just leave that to individual enthusiasts in individual project forums.

        I used to think GPU’s might make big gains in accelerating video editing and whatnot, back when QuickSync’s output was mediocre at best. Now it’s much faster these days and with, last I heard, pretty good quality. It’s also included in every processor 90% of enthusiasts are likely to buy. So, that blows away that use case.

          • Klimax
          • 4 years ago

          IIRC it is or it was possible to sort of freeze BOINC client and compute application in such way to have repeatable test. (Prevent any network communication for both after test start and have full back up of relevant folders)

        • dragontamer5788
        • 4 years ago

        I mostly agree with you Damage.

        Although, I’ve been using my GPU (R9 290x) to accelerate the effects I made on Sony Vegas. I honestly don’t know if different GPUs would make a difference because… I’ve only tested very very different GPUs here (NVidia 560 Ti and R9 290x).

        The main reason why I haven’t brought it up is that Video Editing is a relatively niche application. So I’m not sure how many of your readers would find it interesting.

        But I would? Anyone else out there a video editor?

        • Klimax
        • 4 years ago

        Sony Vegas Pro, but it would need some longer video + GPU intensive and processing-wise expensive filters to get something good.

        Otherwise maybe some raytracing (like the one used for CPUs).

        • Rectal Prolapse
        • 4 years ago

        Only thing I can think of to kind of test compute speeds would be to run MPC Home Cinema with MadVR’s GPU scaling/processing/frame-interpolation, using the NNEDI3 algorithms. Those can really challenge even a top-end GPU, especially when upscaling 1080p to 2160p! The great thing is that MadVR includes performance statistics to aid benchmarking.

          • brucethemoose
          • 4 years ago

          +1, that’s what I use my GPU for.

        • Freon
        • 4 years ago

        You’re feeding the trolls…

        • Anovoca
        • 4 years ago

        HOW WELL DOES THIS MINE BITCOINS???????1

          • chuckula
          • 4 years ago

          Not very.

    • anotherengineer
    • 4 years ago

    That GTX 780 Ti seems to struggle in witcher 3 a bit too, glad I didn’t jump on that expensive bandwagon when it came out.

    Seeing benchmarks like that kinda makes me happy I stick to $225 ish and lower for cards.

      • JustAnEngineer
      • 4 years ago

      That much would get you a bargain Radeon R9-290 4GB these days.

      • HisDivineOrder
      • 4 years ago

      Apparently, going by the nVidia and The Witcher 3 forums, all Kepler cards are currently suffering from driver bugs that are hindering performance (e.g. 780 Ti’s losing to 960’s). Supposedly, nVidia is working on a fix. It may even be part of the new driver they released tonight (and that is not the driver that is being used to review these cards).

      If one were thinking it a conspiracy, one might imagine it a great way to convince people who own 780 Ti’s to buy 980 Ti’s to improve their performance, right? 😉 Give reviewers a driver that emphasizes how much better 980 Ti’s are by reducing Kepler performance, then switch up the driver to a newer driver and fix the problem before consumers see it for too long in real life usage.

      Anyways, there’s a new driver up for 980 Ti’s exclusively and released on a Sunday no less.

        • anotherengineer
        • 4 years ago

        Of course a new game. I would expect them to analyze it and fix it eventually.

          • Terra_Nocuus
          • 4 years ago

          And if you wait the normal 2-3 months before buying a game (so as to miss out on those delightful game-breaking day 0 issues) you’re more likely to be running it with proper drivers

    • chuckula
    • 4 years ago

    At $650 the GTX-980Ti is a worthwhile swansong for 28nm from Nvidia.

    It’s still not as much of a jump in technology as the upcoming Fiji parts BUT.. there’s a plus side to not being “revolutionary” as well as a minus side.

    Having said all that: Pascal or Bust.

      • September
      • 4 years ago

      I’m really surprised that Maxwell 2 wasn’t a die shrink after Maxwell 1 was supposed to be but didn’t have fab space. No big deal to make a small chip (750ti) on the old process, but these medium and big chips? Luckily Maxwell was just that good from a performance and power standpoint that staying on 28nm didn’t kill them. I just kept thinking that even as the GM204 came out at 28nm that surely the big chip will be on a shrink, but no! So that’s why prices are high.

      Blame it all on Apple sucking up all the wafer production at TSMC. Or blame it on low yields and problems with the new nodes. But eventually there should be plenty of space on something better and smaller than 28nm for the GPUs to move to.

      Will I jump on whatever AMD releases? NO WAY. It’s great they are moving the technology forward with HBM but there are so many changes in this one generation of chip, and it is their big chip, it’s extremely risky that something is going to be wrong.

    • Prestige Worldwide
    • 4 years ago

    Looking forward to seeing custom boards tested and overclocked.

    • Westbrook348
    • 4 years ago

    I just bought dual 970s a month ago because a single 980 wasn’t much of a performance increase for the price. This review implies that I should’ve waited for the 980 Ti (can’t know for sure because 970s in SLI weren’t tested, even though I think it’s a pretty common setup these days). With dual cards I get awesome performance in theory, but getting the drivers to work with SLI and especially with 3D vision is such a headache. In fact, in multiple games now I haven’t been able to get both working at once.. What a waste.

      • Flying Fox
      • 4 years ago

      Get it and sell the 970s while they still have decent value?

    • l33t-g4m3r
    • 4 years ago

    I know the 960 wasn’t included in these benchmarks, but if it was, it would most likely show that it is also beating the 780 in several of these new gameworks titles, like TW3.

    Also, now the 290 is handily beating the 780Ti in several titles. How is that right?

    So, in summary, If you bought a 780Ti last year, throw it away and buy a 980Ti, because NV just quit optimizing for your card. Also, when you plunk down for the 980Ti, do not think about Pascal, and if NV treats Maxwell like they did Kepler. Because they probably will, then AMD’s cards running uncrippled drivers will be beating your expensive flagship. But hey, that 980Ti is fast right now, so what does it matter if NV quits supporting your card next year. You’ll just have to buy the new 1080Ti then. All to play console ports that really shouldn’t be needing this level of hardware to get acceptable framerates.

    PS. The reason why TW3 runs so slow on Kepler is that PhysX is actually slowing it down, and the game runs faster (lol) in CPU mode.
    [url<]https://www.reddit.com/r/witcher/comments/37o6sl/how_i_went_from_35_fps_average_on_low_to_45/[/url<] [url<]https://www.youtube.com/watch?v=XnWkSFqo5A4[/url<] But hey, it's the way it's "meant" to be played, right?

      • exilon
      • 4 years ago

      [url<]http://www.nvidia.com/download/driverResults.aspx/85817/en-us[/url<] [quote<]optimizations and bug fixes which can provide increased performance for Kepler-based GPUs. [/quote<] Wow it's like Nvidia took note of the issues and were working on a fix. -1 talking point. So sad.

        • Klimax
        • 4 years ago

        Aka prioritization. First current gen, then last gen. Like we have seen it for long time.

      • anubis44
      • 4 years ago

      @l33t-g4m3r: “But hey, it’s the way it’s “meant” to be played, right?”

      More like the way we’re meant to be played. I’d just say no to nVidia, period.

        • Klimax
        • 4 years ago

        Nope.

      • Damage
      • 4 years ago

      Oh it’s a shiller, shiller night
      ‘Cause he can shill you more than any fanboy
      Would ever dare try

        • derFunkenstein
        • 4 years ago

        In case this tech writer thing never takes off you have a long career in parodies ahead of you. I’ll buy the accordion.

        • jensend
        • 4 years ago

        l33t-g4m3r has been registered at techreport almost as long as you have, has been more active in the forums than you have, and shows zero signs of being a corporate shill.

        I know you’re just trying to be flippant, but as the editor of the site, for you to lightly toss around accusations of shilling (which implies the threat of a ban) just because you disagree with someone’s opinion is a blatant abuse of power, even if it’s meant humorously. You can’t afford to do that to your readership.

        Especially when you’ve already actually banned several users for “shilling” despite good evidence they were not being paid, in a manner that made the [url=https://techreport.com/forums/viewtopic.php?f=32&t=83924#p1138303<]original notice[/url<]'s reassurance that your new policy wouldn't change things for "hopeless fanboys" seem like it only applies to those hopeless fanboys who happen to be rooting for the same teams as the majority of this site's hopeless fanboys. Your example also affects the tone of discussion on the site, as the fragments of rational discussion are increasingly displaced by accusations, recriminations, and insults. Leave the over-the-top demeaning humor to someone who doesn't wield a banhammer.

          • Damage
          • 4 years ago

          He’s incessantly rude, accusatory, committed to an extreme agenda in favor of a single vendor, and unwilling or unable to consider facts and evidence. He consistently lowers the level of conversation around him.

          I can nudge him into bounds or ban him for being a cancer, but one way or another, it’s clear to me something should be done. It’s not bullying to respond to this guy, with this track record, playfully like that. Wouldn’t be bullying to ban him. Context is important. The best thing I can do for my readership is counter his influence, regardless of whether there’s a particular violation of any forum rules happening.

          I appreciate that you’re trying to help me avoid going too far in some fashion, but I should have banned him long ago. I suppose I should just go ahead.

            • chuckula
            • 4 years ago

            [quote<]He's incessantly rude, accusatory, committed to an extreme agenda in favor of a single vendor, and unwilling or unable to consider facts and evidence.[/quote<] Oh Damage, flattery will get you NOWHERE.

            • Meadows
            • 4 years ago

            Did you just do what I think you did? His “last visited” date no longer displays anything but his profile doesn’t explicitly show status either.

            • Damage
            • 4 years ago

            Prolly just his privacy options.

            • Meadows
            • 4 years ago

            Oh.

            • l33t-g4m3r
            • 4 years ago

            Which I use, so I don’t have to deal with certain stalkers.

            • Meadows
            • 4 years ago

            I haven’t noticed any such people. At any rate, it’s good that you’re still here. We need more wrong people in this world.

            • USAFTW
            • 4 years ago

            Just because one has its own views on various matters doesn’t make one “wrong”.
            Especially when you can’t prove their wrongness.
            I think Scott might want to pop over to the wccftech comments section to see how the real shills and mindless fanboys type.
            The evidence for his claims is backed up by the internet (reddit, freshly launched game benchmarks at gamegpu.ru, techspot.com, computerbase.de and pcgameshardware.)
            So his words are not completely baseless.
            The comment section can be used to praise the author (rightfully), to praise the product if it’s any good, or to share a more differentiated and nuanced viewpoint. If you’re prohibited and harassed for doing anything other than the first two, I question the bias of the moderator.
            Scott, don’t make your comment section a dictatorship where, unless people say things you like, the banhammer descends.
            Don’t accuse your readership willy-nilly of shilling. You depend more on them than they do on you.
            Here’s to free speech.

            • Meadows
            • 4 years ago

            You do realise that “free speech” means Mr Wasson, too, has every right to say those things?

            • derFunkenstein
            • 4 years ago

            Not to mention that “free speech” doesn’t apply in this context.

            • l33t-g4m3r
            • 4 years ago

            Don’t worry Meadows. I support both sides here, because this isn’t a black and white issue. The real problem is that some people aren’t bothering to see the other side.

            The only person I really disagree with 100% is you, and that’s because anytime you “discuss” anything with me, you are trolling with no regard for truth. You take the opposing side of anything I say, just to oppose what I say. I could say grass is green, and you’d say it’s purple, because that’s just how you are. So in general, people who think I have a valid point should just completely ignore all your posts, because it’s trolling, and they shouldn’t feed the troll.

            Or at least until you can admit grass is green when I say it’s green, and not argue about how it could be purple or some other ridiculous nonsense.

            • Meadows
            • 4 years ago

            The only point I steadfastly opposed you on (recently) was your assertion that The Witcher 3 was not optimised when, in fact, it was. Worth noting that you’re pretty wrong about this particular point, you just don’t see it yet.

            • l33t-g4m3r
            • 4 years ago

            [quote<]"I oppose almost everything you say"[/quote<] No, you admittedly oppose everything I say just for kicks, and it doesn't matter if I'm right or not. That's pretty accurate too, because you've been doing it for over 5 years. Here, let me remind you. I made a thread years ago about a retro PC I was building. Here's your comment: [quote<]Still doesn't explain why you started a pretentious thread bragging about scrap metal from the last century.[/quote<] also:[quote<] "last visited" date no longer displays anything[/quote<] The second quote proves you are stalking me in the forums. So yeah, you have been going around trolling me every time I post anything, and your attitude is insufferable. Oh, and no TW3 is NOT optimized. I think the NUMEROUS PATCHES rather prove that it wasn't. Very Buggy Game. PS HAIRWORKS ALSO USES INSANE LEVELS OF AA BY DEFAULT, AND IS ONLY CONFIGURABLE BY EDITING AN INI. [url<]https://www.reddit.com/r/witcher/comments/36jpe9/how_to_run_hairworks_on_amd_cards_without/[/url<] [url<]https://www.reddit.com/r/witcher/comments/36jjoz/psa_change_hairworksaa_in_renderingini_for_a/[/url<] [quote<]Go to: isntallation directory\The Witcher 3 Wild Hunt\bin\config\base → rendering.ini search for: HairWorksAALevel and change it to 2 or 4 for a significant boost.[/quote<] Optimized MY FOOT. Not with AA set to max. That's straight up sabotaging performance. But of course, evidence and facts mean nothing to you because, and I quote, "I oppose almost everything you say", so you will in fact continue to ignore and oppose real evidence that in fact proves TW3 has NOT been optimized well for PC.

            • Meadows
            • 4 years ago

            Notice the word “almost”. Pretty key word. You’re also wildly mistaken about the definition of “stalking”.

            I’ll give you points regarding the 8x AA of HairWorks; I remarked on that myself in another comment thread.

            However, HW is not turned on by default and when it is, it only takes up one square inch of your screen to process most of the time, unless you start a conversation with someone in the game and the camera zooms in on your character.
            For this reason, during actual gameplay, I measured a performance delta of only around 10-15% between 8x and 1x (none) HairWorks AA, the latter option being unusable for being so ugly and fuzzy. This means if one doesn’t want to deal with HW, then it’s pointless to tweak to 4x AA or something and they should just turn it off altogether instead.

            As for why the particular degree of 8x AA was used, I can only guess that maybe the developers worked on 1440p or 2160p monitors during production and wanted to make sure the protagonist’s hair never shimmers or anything, but I don’t really know.

            • USAFTW
            • 4 years ago

            What it means, though, is that he can’t prohibit others from saying different things that he perhaps dislikes.

            • derFunkenstein
            • 4 years ago

            “Free speech” is a constitutional idea applied to the government, not morons on a privately owned website. The government can’t shut l33t-g4m3r down but Scott surely can.

            • puppetworx
            • 4 years ago

            I think, quite obviously, he’s talking about the principle of free speech rather than the constitutional law.

            • derFunkenstein
            • 4 years ago

            Even then, that doesn’t apply. If Scott deems somebody’s speech too stupid to allow them to continue posting here, that’s fine. The market will decide if he’s right, because the market will either stay, leave, or possibly even grow.

            • puppetworx
            • 4 years ago

            Agreed, that’s how the world works. There’s no need to mischaracterize what he says though, that’s cheap.

            • Voldenuit
            • 4 years ago

            Thank you, I agree.

            The principle is much more important than the constitutional law, because one underlies the principles and values of a society, while the other merely tries to codify parts of it at a certain point in history.

            • slowriot
            • 4 years ago

            TR, or any other personally owned institution, is not a platform for your “free speech.” You’ve failed to understand both the constitutional law AND the principle/spirit of it in my opinion.

            • CaptTomato
            • 4 years ago

            So unbiased techreport should silence a knowledgeable pc gamer that happens to be critical of NV?

            • K-L-Waster
            • 4 years ago

            Being critical of a vendor certainly should be allowed.

            But there are constructive ways to do that, and ways that are not so constructive. Spamming every single article about NV with the same message about Kepler drivers in TW3 and P-Cars is not so constructive (really, what do they have to do with the 980 Ti? Not much…). Stating that “everyone knows it’s intentional and malicious and if you don’t agree 112.4% you’re a fanboi” is much less constructive.

            Is calling NV on driver issues or problems with Game Works a legitimate complaint? Absolutely — as long as the article in question is actually on that topic.

            • derFunkenstein
            • 4 years ago

            no, he’s still here, just blathering the same thing on the nVidia driver update post.

            • Andrew Lauritzen
            • 4 years ago

            > He’s incessantly rude, accusatory, committed to an extreme agenda in favor of a single vendor, and unwilling or unable to consider facts and evidence. He consistently lowers the level of conversation around him.

            Well said Damage.

            Regardless of whether someone is being paid or not, at a certain point any forum/site needs to decide on a minimum level of conversation that is acceptable. This sort of post (and critically, a complete unwillingness to learn) accomplishes nothing except actively driving away intelligent conversation from the site.

            • K-L-Waster
            • 4 years ago

            [quote<]This sort of post (and critically, a complete unwillingness to learn) accomplishes nothing except actively driving away intelligent conversation from the site.[/quote<] ... which may well be the intent.

            • l33t-g4m3r
            • 4 years ago

            NV admitted there were issues with Kepler. The real controversy here is that the users strongly suspect these issues weren’t accidental, given all the nasty tricks NV plays on AMD. It’s not something you can turn a blind eye to, especially after you spend a good chunk of money on their high end products. People expect NV drivers to be the “gold standard”, and yet NV let its users down for months without acknowledging the problem until after the customers had enough and the issue went viral.

            This issue particularly affected me here, because I bought a 780, and I strongly feel that NV has personally wronged me with these drivers. That said, nothing I’ve said about these issues hasn’t already been said by other users on other websites. Go look at the NV forums. Every single thread is from people complaining about the poor Kepler drivers, and it’s also all over Reddit. Meanwhile, it doesn’t help that NV continues to push GameWorks into games in ways that sabotage AMD cards, and in both of the new titles that Kepler is having issues with, AMD users are having issues too. Both sides combined here, and it exploded, because everyone is pretty much tired of NV’s tactics.

            The only way this doesn’t affect any of you, is if you have completely disposable income, and you can easily afford to buy a whole new system and Titan video card every single year. If so, good for you, but I have a big problem with the elitist snobbery, where I’m looked down on for having a 780, and expecting decent framerates. It’s not that old, and it should be several orders of magnitude faster than a console, so I am expecting playable framerates from console ports. I think I have the right to complain when said 780 doesn’t perform up to snuff, especially after what I paid for it.

            Quite frankly, other than my “attitude”, I don’t think I’ve said anything particularly egregious. It’s more an issue of NV fanboys not liking a non-fanboy’s opinion of shady NV practices, and said NV fanboys won’t own up to the shady practices, so they pretend like it’s somehow all my fault. Well, sorry to burst the bubble, but it’s not. NV put themselves into this mess, and I’m merely reflecting the trouble they’ve caused. It’s not my fault if you guys outright refuse to see it, like how people refused to acknowledge the Titanic was sinking. Well, sorry for disrupting your fancy dinner by saying the boat is sinking with an attitude. I truly am sorry that my attitude upset your appetite. Now man the lifeboats.

            • derFunkenstein
            • 4 years ago

            So your tinfoil hat conspiracy theory is nVidia decided “let’s cripple our previous high-end card so that people buy our current high-end card”? With Fiji being literally right around the corner? Are you daft?

            • l33t-g4m3r
            • 4 years ago

            No, NV is. Which is why they finally released these new drivers at the last minute. Problem is, there’s about a billion people complaining in the support forums who have already said they are switching to fiji when it comes out.

            Quite frankly it doesn’t matter if NV did it on purpose. It was really bad timing, and they didn’t react quickly enough to convince the users otherwise. So, regardless of what NV fanboys think, or what NV actually did or didn’t do, a big chunk of disgruntled Kepler users are in fact going to be looking at buying Fiji.

            NV constantly gimps PC gaming with their bloatware, and we all know they do it on purpose. They don’t have the credibility that you’d like them to have, so their word really doesn’t mean much to disgruntled Kepler users. All we really know is that there were problems, that it took too long to address them, and that the timing of this update seems more like desperation than actual support. Would Kepler users have gotten this fix if Fiji wasn’t right around the corner? I dunno.

            • derFunkenstein
            • 4 years ago

            So you just debunked your own tinfoil hat conspiracy theory. Everyone including nV knows Fiji is coming.

            • l33t-g4m3r
            • 4 years ago

            Debunked what? Nothing is debunked just by you saying so without elaborating. NV has a reputation for shenanigans. They’re not saints that deserve blind trust, and right now they’re getting the skepticism that they deserve.

            It doesn’t matter whether or not NV did cripple Kepler. The fact stands that the Kepler drivers had performance issues, and it took a full-on internet revolt for NV to issue an update, which by all appearances only happened because Fiji was coming out and NV didn’t want Kepler users to jump ship. The only problem is that the update is way too last-minute for the controversy to completely fade away, and disgruntled users will probably be considering Fiji regardless.

            Still, any update is better than nothing, so I’m sure there’s been some damage control. But it should have happened sooner if they were actually serious about supporting Kepler users.

            • sweatshopking
            • 4 years ago

            i’m kinda with you on this one. you can sometimes have opinions that i think are abrasively and tactlessly spoken, but there does seem to be something behind the 780 problems, enough that you shouldn’t be accused of making it up. You may want to work on how you write, though. We both know we’ve talked on opposite sides of a discussion MANY times, but this one doesn’t seem to be close to the worst thing you’ve done on here, so I’m not sure why it’s causing more issues this time.

            • l33t-g4m3r
            • 4 years ago

            There are two viewpoints on this issue:
            1: The fanboy.
            NV driver update = “Gold Standard”. No need to buy Fiji, because games now get acceptable framerates.
            -Will buy 980Ti when games start needing more than 3GB of Vram.

            2: The Skeptic
            Driver update proves beyond a shadow of a doubt the performance issues were driver related, and not solely due to hardware differences. After all, does the 285 beat the 290? No, it does not. The 960 should likewise never beat the 780Ti.
            -Would have immediately bought amd card if not for driver update. Now will wait for fiji benchmarks and price drops.

            The driver update was ultimately damage control. It proves NV screwed Kepler users, but also keeps them from immediately jumping ship. If NV didn’t think there was a real threat of users leaving because of performance issues, they likely would have never released the update, and quite frankly I think they could have done better, but that won’t happen because Maxwell is the priority. Like I said earlier, the 285 does NOT beat the 290, and the 280/960 should NOT beat the 780Ti. There’s something very wrong if it does, and the users know it.

            • K-L-Waster
            • 4 years ago

            Dropping a high end card because 2 new games have driver issues in the first week or two after release is a little drastic.

            Going with “It must have been on purpose!!” as the first and only explanation seems even more drastic.

            Claiming that every single Kepler owner is looking to move to AMD over these two games — yeah, right. (I own a 780, and I’m sure not selling it over this…)

            • l33t-g4m3r
            • 4 years ago

            Protip: It wasn’t just those two games. Those two games are just the straw that broke the camel’s back, because it confirmed everything that we suspected NV was doing. Basically, they just quit optimizing for Kepler.

            Oh, and I am definitely grateful for people like you, because who else is going to buy my now discontinued and unsupported used graphics card at inflated prices that will more than cover the cost of the faster 290 that I just bought?

            • sweatshopking
            • 4 years ago

            yeah, i don’t think it was a plan to screw users so they’d upgrade, and i’m not really sure whether the issues were legit or not, but it was likely just a question of driver engineering man-hours.

            • l33t-g4m3r
            • 4 years ago

            Man-hours make more sense as an explanation for AMD, not NV. Like I said earlier, it doesn’t matter if they did or didn’t plan on screwing the users. It still appeared like they did.

            1. NV should have never stopped optimizing for Kepler. Never. Kepler is more dependent on drivers for performance, so bad drivers = bad performance. Remember Tomb Raider? That’s what happens when Kepler doesn’t get its driver optimizations.
            2. It took too long to address the issue, with no timeframe given for a fix. Maxwell users got plenty of updates though, making it appear that Kepler really was being neglected.

            • Meadows
            • 4 years ago

            Enjoy your space heater.

            • l33t-g4m3r
            • 4 years ago

            There is barely any difference in power use between the 290 and 780. Very minor.
            Also:
            [url<]https://www.youtube.com/watch?v=fBeeGHozSY0#t=541[/url<]

            HURR DURR DEBUNKED DEBUNKED BEDURKED

            Really. If I already have a 780, then what's a few watts more for a 290? Nothing. My performance goes up too, so it's kind of like overclocking the 780. Also, even though it truly IS a space heater, I DON'T SEE THIS ARGUMENT BEING MADE AGAINST SLI, and a lot of you have it. So yeah, obvious troll is obvious.

            • Meadows
            • 4 years ago

            Of course you don’t see that argument, because barely anyone has SLI. Off the top of my head I couldn’t tell you a single name without cheating and looking at forum signatures in the graphics sub-forum, and even there it’s scarce.

            • sweatshopking
            • 4 years ago

            i have a 290. it’s not a space heater.

            • l33t-g4m3r
            • 4 years ago

            I think geek does, and Djeano has a Titan, which would be a bigger power hog than my 290. Not that any of this is relevant, other than to serve as some sort of stupid insult on your part.

            The space heater comment is a joke for more than one reason, one in particular being that none of these cards run full throttle at the desktop, which is where the card spends most of its time. People only game for a couple of hours a day, outside of WOW addicts.

            Second, the weather has been rather cold lately, and I would sometimes prefer if this card was a space heater. But since delusional fantasies can’t become reality, that’s not going to happen. I’ll have to run a REAL space heater, if I need heat. Either that, or I could use your mouth, since it’s definitely full of useless hot air.

            • geekl33tgamer
            • 4 years ago

            Yes, I have SLI. Works fine for the most part.

            Downsides are heat, noise and power use. The first two don’t bother me at all as long as it works without throttling (check!). The power use isn’t too shabby either: about 80W more than a Titan X under load, but also 12-15% faster at a much lower purchase price.

            The pair of 970’s were £560, and the Titan X is £900.

            • l33t-g4m3r
            • 4 years ago

            Yup. If I ever go NV again, that’s probably how I’m going to do it. The 970s will hold their value better too, as flagship cards will only resell for about half of what you pay for them.

            • geekl33tgamer
            • 4 years ago

            Yup, and thanks to the whole VRAM debacle on the 970s, they are pretty cheap here (almost £200 less than a 980).

            • Chrispy_
            • 4 years ago

            When you two have a discussion my eyeballs hurt;
            Can one of you change your name please? 😛

            • K-L-Waster
            • 4 years ago

            Thanks, but I don’t buy used cards….

            • sweatshopking
            • 4 years ago

            I DO. IF THEY’RE CHEAP ENOUGH.

            • Terra_Nocuus
            • 4 years ago

            [quote<]...and I strongly feel that NV has personally wronged me with these drivers...[/quote<] Never attribute to malice that which is adequately explained by stupidity. I'm pretty sure we're all far too unimportant for NV to make an effort at wronging us.

            • bfar
            • 4 years ago

            The latest models always get support first; that’s always been the way. I suspect the issues with Kepler cards have more to do with negligence than anything malicious, but either way, it wasn’t good enough.

            When cards were cheaper back in the day, it was feasible to upgrade every year or two, but now, if you’re going to charge $600 – $1,000 for a card, your customers will expect meticulous driver support for at least three years, if not longer.

            The wider issue for PC gaming is that GPUs have become far too reliant on driver and software optimizations for acceptable performance. I wonder whether DX12 or Vulkan will help to address this, or just make it worse?

            • Klimax
            • 4 years ago

            For good reason. You do not want game devs doing that level of optimization. It will be worse, because devs can’t have that kind of knowledge, they won’t have that kind of time, and their support will be even worse than AMD’s… (plus general competence is bad).

            We have been there multiple times, but people apparently refuse to learn, so we get another round of learning why game developers shall not be trusted to do the correct things and optimize well. (And why they shouldn’t even be managing the placement of allocations.)

            DX12 will strongly amplify already bad things. (GCN, Kepler and Maxwell are very different architectures, with not much in common.)

            • ImSpartacus
            • 4 years ago

            If you deem it important to rein in commenters, that’s justifiable.

            However, it doesn’t need to be in public. It’s just as effective (if not more effective) to privately converse with someone and the infrastructure for a private discussion already exists.

            It’s hard to do it publicly without inadvertently setting an example that other commenters may treat each other like that.

            • geekl33tgamer
            • 4 years ago

            Please don’t confuse me with him 😉 Not all l33t gamers are created equal!

          • Bensam123
          • 4 years ago

          I basically have no ability to participate in said discussions because of the above. I mentioned this before as well, but I’d rather read and comment on things like hard drives (which don’t seem to have fanbois at all) than be banned from the site.

          Getting downvoted isn’t the same thing as being wrong either. You’re just on the bottom side of popular opinion.

            • chuckula
            • 4 years ago

            [quote<]Getting downvoted isn't the same thing as being wrong either. You're just on the bottom side of popular opinion.[/quote<]

            Oh totally, in fact, in honor of this week's launch of socketed Broadwell desktop processors, here's a fine example of how I was 100% correct but got downvoted for taking an unpopular... albeit completely logical and factually correct... position:

            [url<]https://techreport.com/news/24191/trusted-source-confirms-soldered-on-broadwell-cpus?post=700957[/url<]

            And as a counterpoint, here's an example from the same thread of how you were completely dead wrong and let your own irrational hatred and biases carry you away when you sensed that you could play people:

            [url<]https://techreport.com/news/24191/trusted-source-confirms-soldered-on-broadwell-cpus?post=700884[/url<]

            Don't worry Bensam, I'm not going to forget about you later this week. An apology would go a long way towards me being civil.

        • ImSpartacus
        • 4 years ago

        Don’t bully your readerbase like that.

        I get that you’re capable of humor, but it’s really just not worth the joke. It doesn’t matter if you’re responding to the worst comment ever. It’s never worth it.

        Be a good example for everyone else.

      • the
      • 4 years ago

      Or it could be something that nVidia added to Maxwell that wasn’t in the Kepler architecture that the developers are actually using.

      AMD did something similar with Tonga, as the R9 285 can outrun the R9 290X in a few select scenarios by a good margin.

        • l33t-g4m3r
        • 4 years ago

        No, not when you get faster framerates by disabling GPU PhysX. It was absolutely a driver problem, and 780Ti’s shouldn’t be getting worse framerates than a 290. Not unless NV had stopped optimizing for Kepler.

        Oh, and people are still complaining about TW3 performance, so no the new driver didn’t fix everything. Maybe a minor improvement. I’d like to see benchmarks with the new driver, and if the 960 is still beating the 780Ti, then there’s still a problem. That shouldn’t be happening under any circumstance, outside of driver crippling.

          • geekl33tgamer
          • 4 years ago

          You saying Nvidia’s deliberately crippling performance on older cards via drivers?

          Sure, whatever.

          The Maxwell GPU contains a lot of extra dedicated processing ability for certain tasks that Kepler lacked. I’m not shocked it’s quicker in cases where that’s clearly being utilized.

          • the
          • 4 years ago

          Actually it could. Kepler and earlier generations would have to context switch between compute and graphics queues, as they could only do one type at a time. Maxwell enables graphics and compute workloads to run simultaneously for an efficiency gain.

          [url=http://www.anandtech.com/show/9124/amd-dives-deep-on-asynchronous-shading<]Anandtech had a bit to say on this matter:[/url<]

          [quote<][i<]On a side note, part of the reason for AMD's presentation is to explain their architectural advantages over NVIDIA, so we checked with NVIDIA on queues. Fermi/Kepler/Maxwell 1 can only use a single graphics queue or their complement of compute queues, but not both at once – early implementations of HyperQ cannot be used in conjunction with graphics. Meanwhile Maxwell 2 has 32 queues, composed of 1 graphics queue and 31 compute queues (or 32 compute queues total in pure compute mode). So pre-Maxwell 2 GPUs have to either execute in serial or pre-empt to move tasks ahead of each other, which would indeed give AMD an advantage.[/i<][/quote<]
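
          A toy model of why overlapping the two queue types helps (made-up per-frame numbers; real GPUs share execution units, so any actual gain is workload-dependent and smaller than this ideal case):

              # Hypothetical per-frame workloads, in milliseconds (illustrative only).
              graphics_ms = 12.0
              compute_ms = 5.0

              # Pre-Maxwell-2 style: the graphics and compute queues run one after the other.
              serial_frame_ms = graphics_ms + compute_ms

              # Idealized concurrent queues: the shorter workload hides behind the longer one.
              overlapped_frame_ms = max(graphics_ms, compute_ms)

              print(serial_frame_ms, overlapped_frame_ms)  # 17.0 vs 12.0 ms in this toy case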

      • Krogoth
      • 4 years ago

      Planned obsolescence, don’t you love it?

      • techguy
      • 4 years ago

      What you fail to account for is the fact that AMD has NEEDED to continue to focus on performance through driver optimization because they haven’t launched a new flagship single GPU card in 21 months. Since May 2013 (when Nvidia launched the GTX 780), AMD has released only one flagship single GPU card, the R9 290x. During that timeframe, Nvidia has released the 780 Ti, the 980, and the 980 Ti, surpassing the performance of the R9 290x with 3 successive products (not named Titan).

      So you knock NV for not focusing on performance for a product which has been surpassed within their own product stack twice now, but give AMD credit for not having released a successor product in 21 months. Great logic. Have any submarine screen doors or solar-powered flashlights for sale while you’re at it?

    • JustAnEngineer
    • 4 years ago

    Something about your “E-tail price” table…

    Newegg hasn’t sold any GeForce GTX Titan X cards for $999.99. They’ve been at least $50 higher than that since launch. The least expensive one is currently $1050 +15 shipping = $1065. The EVGA version is $1090 +13 shipping = $1103.

      • the
      • 4 years ago

      The local MicroCenter tends to be a bit overpriced when it comes to GPUs, but they’re pretty much on par with Newegg on this, with an EVGA unit at $1100.

      Perhaps e-tailers won’t price gouge but I suspect it may be a week or two before prices settle close to the MSRP. (Basically around the time when competition arrives.)

    • anotherengineer
    • 4 years ago

    Hmmm with the regular 980 about $750 up here, I guess that would put the 980 Ti closer to 1k up here.
    [url<]http://www.newegg.ca/Product/Product.aspx?Item=N82E16814487067&cm_re=gtx_980ti-_-14-487-067-_-Product[/url<]

      • juzz86
      • 4 years ago

      US$817 here (Australia) this morning, pre-order. Welcome to our world!

      🙁

        • anotherengineer
        • 4 years ago

        Min wage here is about $11/hr. Also, whatever the ticket price is, add an additional $10-15 for shipping, then add another 13% sales tax on that.

        What is min. wage there??

        No pre-orders I notice for Canada yet.

        Pretty typical for Aus; most things are more expensive there due to the location and the min. wage rate being closer to $20/hr?? A lot of Canadians get sticker shock when visiting, but the few I have known who have moved there for work have never returned to Canada, so it must be better than here overall lol

          • the
          • 4 years ago

          That’s just because everything is upside down over there. 😛

          • juzz86
          • 4 years ago

          I should’ve clarified a bit, that price includes 10% sales tax but excludes shipping, which would bring the total to AUD$1100 (US$843). For comparison, the Titan X is AUD$1599 (US$1225) excluding shipping from the same retailer. They currently have the best price in Aus.

          Minimum wage here does indeed approach 20 bucks an hour. That’s reflected in everything from fuel to groceries to take-out to utilities. I pay ~AUD$1600 a year in electricity for wifey and I, and we’re damn tight with it.

          Australia is a mess financially though. I’d highly recommend not emulating our economy. Keep your wages respectable and reap the rewards at the check-out. There’s a lot of truth to the’s comment below 😉

          EDIT: Added US dollar rates for you guys 🙂

            • anotherengineer
            • 4 years ago

            Indeed.

            Electricity for me is about $1800/year and that is with zero AC. If I had to run my AC 6 months of the year like most people in Aus, I don’t even want to think of what my electricity bill would be.

            My brother is on electric heat (I’m on gas); he just pays a flat monthly rate of $400, or $4800/year.
            Petrol at the pumps here is $1.27/L, a typical 540mL can of soup is about $3, and min. wage is far below the poverty line.

            I agree we all have our crosses to bear, but no -20C (with dips to -40C) for 6 months of the year is almost incentive enough to bail outa here lol Heck, this morning it was 5C and my heat was on, and it’s June 1st!!! lol

      • f0d
      • 4 years ago

      $1049 in aus
      [url<]http://www.pccasegear.com/index.php?main_page=product_info&products_id=32156[/url<]

        • yogibbear
        • 4 years ago

        Pretty cheap considering us Aussies have to pay $1500 for the Titan X… 🙂 so it’s still 2/3rds of the price, like it is for our US pals.

    • Rza79
    • 4 years ago

    nVidia’s achievement aside, I’m amazed how good AMD’s Crossfire is working. Especially on the 16.7ms tests.

      • Damage
      • 4 years ago

      To be clear, spending less time above the 16.7-ms threshold isn’t much of a victory if you also spend more time beyond the 33- and 50-ms thresholds. That just means you’re generally fast but riddled with intermittent or specific slowdowns. You need to meet those higher thresholds before claiming victory at 16.7-ms or below.

        • ImSpartacus
        • 4 years ago

        I don’t think it’s that simple.

        We all remember the feedback when tr started reviewing in a proper frame time-based style. It takes about three different methods of displaying the exact same results to capture most of the nuance involved. We can’t forget why all of those methods were requested in the first place. Frame time distributions can’t really be watered down very well. It’s never as simple as we want it to be.

          • morphine
          • 4 years ago

          What’s this, audiophile stuff? Math is math. The results are the results, and they’re plainly visible in the games.

            • ImSpartacus
            • 4 years ago

            If “the results were the results,” then why are they effectively repeated a couple times in different forms?

            The bottom line is that the results [i<]aren't[/i<] the results and it's complicated. There are a lot of different methods of analysis that need to be taken into account (and they thankfully are, by and large). But these are all objective methods of analysis, "math" as you call it. This isn't some subjective audiophile-esque stuff.

            In the real world, you sometimes need to approach things from a lot of different angles to get the whole picture. And it's hard sometimes. Scott is probably mostly right, but it's for the wrong reasons, and that's almost as good as wrong in my book.

            When your business literally boils down to teaching an eager audience something they are ignorant about, you owe it to them to do your best to avoid misleading them.

            • morphine
            • 4 years ago

            You are still going audiophile. “the results aren’t the results”, seriously?

            Look at the frame time graphs. Just look at them. You see spikes above 33 and 50ms in some games. The spikes correlate to stalls when you’re playing the game. It’s done. The method works. There’s nothing more to argue about.

            You’re trying to throw FUD onto the testing methods to justify… something. Trying to twist words, invent factors, and muddy the issue with platitudes such as “get the whole picture” just makes it look like you’re in denial about the results. Your favorite card doesn’t come out on top every time? Deal with it.

            • Milo Burke
            • 4 years ago

            Why all the audiophile hate?

            I’m a board member for the Audio Society of Minnesota. But I don’t buy obscenely priced junk from the nonsense store, and I don’t blather on endlessly about things I can’t actually hear.

            Sure, some audiophiles are fools. But so are some car enthusiasts, wine enthusiasts, even *gasp* hardware enthusiasts.

            • morphine
            • 4 years ago

            My apologies. I should have qualified it properly with “audiophile”.

            I’m an audiophile myself, and yes, I [i<]can[/i<] hear things others can't. But they're actually there 😛

            • Milo Burke
            • 4 years ago

            Precisely. You have “good ears” either from ear training or from years of “active listening”. So do I. We can tell a lot about a stereo or mix that a lot of people couldn’t, including faults.

            I’m just as sick of the audio nut-jobs as you are. And all their stupid products with stupid marketing, and the stupid people that buy into it. The same stupid people that invent faults that aren’t audible, and “solutions” that aren’t logical. (E.g. “There’s no such thing as too much 2nd order harmonic distortion!”)

            But great stereos sound great, particularly when positioned properly in a properly treated room and when playing well-made music provided in a high quality format. It should be whatever music the listener likes, be it Diana Krall or Coldplay or Ariana Grande or ZZ Top. It just sounds better when played through a great stereo.

            I encourage everyone to hear a song they like on a great stereo, to see if it tickles them more than usual. If so, they can upgrade in a way that meets their tastes, their budget, and their desire for complexity/simplicity; and nothing more.

            There’s nothing wrong with wanting to hear your favorite music better than ever. But for some reason, people who care about good sound are persecuted. Let’s hope it ends.

            • Deanjo
            • 4 years ago

            Ahhh, audio societies…. where the better audio the self-proclaimed audiophile claims to perceive is proven, time and time again, to be in most cases purely the power of suggestion. There have been a ton of blind A/B tests run on “audiophiles” by various organizations proving that in most cases it is all in their head. (I remember one blind A/B test meant to show that audiophiles were talking pure BS when it came to being able to tell the difference between various grades of speaker wire. The results: 50% found the speaker wire sounded best, and the other 50% found that the wire clothes hangers used to make the connection sounded better.)

            • Milo Burke
            • 4 years ago

            I took part in a cable shootout a couple of years back. I came thinking they made a small difference and left thinking they made next to no difference. I used my ears, as I encourage everyone to do.

            But one DAC over another is clearly audible, even to novices. Which is better may very well come down to preference, but they sound different. (I even did this experiment with my aging father last time I wired up his home theater: I ran digital and analog cables from Blu-ray player to receiver and put in a CD, then flipped between the two inputs. He had no trouble hearing the change but couldn’t decide which he liked more.)

            But again with the assumption that all audiophiles are idiots lying to themselves and everyone else. What gives? Might as well say that all people who occasionally drink are alcoholics.

            I don’t have an axe to grind here on the 980 Ti review’s thread. But I’m compelled to point out bigotry when I see it.

          • derFunkenstein
          • 4 years ago

          I’m going to say something that, based on your score, isn’t too popular. I kind of agree, with a caveat. In Crysis 3 with the R9 295 X2 there are spikes that go on for 60-70 ms, and for sure those need to be eradicated (because I’m sure you’ll feel them), but otherwise it seems to do well until you get to the power consumption. For every 3 hours of gaming in a month you’re using another kWh just by virtue of the card you chose. Given that, and how well the 980 Ti does, I think I’d pick the nVidia card over the Radeon dual-GPU setup.
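
          The arithmetic behind that energy figure, assuming roughly a 0.33 kW difference in load power between the two setups (a round number implied by the claim itself, not a measured value):

              # Assumed load-power delta between the two cards, in kW (not a measured figure).
              extra_power_kw = 0.33
              hours_of_gaming = 3
              extra_energy_kwh = extra_power_kw * hours_of_gaming
              print(f"{extra_energy_kwh:.0f} kWh")  # ~1 kWh per 3 hours of gaming, as claimed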

        • Kretschmer
        • 4 years ago

        I don’t understand this comment…isn’t a solution that’s superior on the 16.7-ms graph automatically better on 33-ms and 50-ms by definition? E.g. it’s impossible to spend 1 second above 16.7ms and 5 seconds above 50-ms? Thanks in advance for the clarification!

          • Damage
          • 4 years ago

          No, because you can generally be faster overall while still having more, larger hiccups and slowdowns during the course of the test run. A nice example is The Witcher 3 here:

          [url<]https://techreport.com/review/28356/nvidia-geforce-gtx-980-ti-graphics-card-reviewed/5[/url<]

          The R9 295 X2 spends the least time above the 16.7-ms threshold in total:

          [url<]https://techreport.com/review/28356/nvidia-geforce-gtx-980-ti-graphics-card-reviewed/5[/url<]

          But its frame time plot includes some nasty hitches into the 30-60ms range:

          [url<]https://techreport.com/r.x/geforce-gtx-980ti/w3-r9.gif[/url<]

          As a result, it spends more time beyond the 33-ms threshold than the GTX 980:

          [url<]https://techreport.com/review/28356/nvidia-geforce-gtx-980-ti-graphics-card-reviewed/5[/url<]

          ..since the GTX 980's frame time plot includes fewer, smaller spikes, despite having a higher average frame time:

          [url<]https://techreport.com/r.x/geforce-gtx-980ti/w3-titan.gif[/url<]

          This is complicated stuff, but as usual, our 99th-percentile frame time metric turns out to provide a very relevant summary that shows the 980 Ti in the lead:

          [url<]https://techreport.com/r.x/geforce-gtx-980ti/w3-99th.gif[/url<]

          At the end of the day, playing this section of the game in testing feels noticeably smoother on the GTX 980 Ti, even though the 295 X2 has a higher average FPS. That's the reality we're attempting to capture.

            • Meadows
            • 4 years ago

            Not complicated stuff, it’s merely average vs. maximum frame times. Same as average vs. minimum fps, except in reverse.

            It’s possible to have a worse extreme and a better average at the same time.

            • Kretschmer
            • 4 years ago

            Thank you all for the clarification! This community is the best!

          • cobalt
          • 4 years ago

          I’ll rephrase what Damage says: yes, a card cannot spend more time beyond a more stringent threshold than beyond a looser one, so your intuition is right. However, your example comparison is within a card, not across cards. Even though what you say is true, when comparing two cards it’s possible for one to be better at one threshold but worse at another.

          I’ll pull out the actual numbers from that page:

          295X2 spends 3206ms above 16.7ms and 162ms above 33ms.
          980 spends 12119ms above 16.7ms and 9ms above 33ms.

          So again, your intuition is correct that the amount of time above 16.7ms cannot be smaller than the amount of time above 33ms per card.

          But: you’ll note that the 980 spends more time above 16.7ms than the 295X2, but less time above 33ms than the 295X2.

          So while the 295X2 spends a bigger proportion of its runtime at a faster-than-60FPS rate than the 980 does, it spends more of its time with “bad” spikes into slower-than-30FPS territory. (As Meadows says, it’s similar to the difference between average and maximum frame time values; the latter must be higher than the former, but two cards can differ in opposite directions on each.)

          (edit: lost most of a sentence somewhere…. fixed)
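
          As a rough sketch of how figures like these can be computed from a log of per-frame render times (assuming “time spent beyond X ms” sums only the portion of each long frame past the cutoff, and using made-up frame times rather than TR’s actual capture data):

              import numpy as np

              # Hypothetical frame times in milliseconds -- not TR's measured data.
              frame_times_ms = np.array([14.2, 15.1, 16.9, 18.4, 35.0, 14.8, 16.0, 52.3, 15.5, 17.2])

              def time_beyond(frame_times, threshold_ms):
                  # Sum of each frame's time in excess of the threshold (the assumed
                  # reading of the "time spent beyond X ms" metric discussed above).
                  excess = frame_times - threshold_ms
                  return excess[excess > 0].sum()

              for cutoff in (16.7, 33.3, 50.0):
                  print(f"time beyond {cutoff} ms: {time_beyond(frame_times_ms, cutoff):.1f} ms")

              # 99th-percentile frame time: the frame time that 99% of frames come in under.
              print(f"99th-percentile frame time: {np.percentile(frame_times_ms, 99):.1f} ms")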

          • JustAnEngineer
          • 4 years ago

          Take it to the extreme. Card A renders every single frame at 30 ms. Card B renders most frames at 10 ms, but every 100th frame takes 1000 ms. Which is better?
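
          Putting rough numbers on that extreme example (illustrative figures only): Card A averages about 33 FPS with no frame ever crossing the 50-ms line, while Card B averages about 50 FPS yet racks up 950 ms beyond 50 ms out of every 100 frames.

              import numpy as np

              # Card A: every frame takes 30 ms. Card B: 99 frames at 10 ms, one 1000 ms stall.
              card_a = np.full(100, 30.0)
              card_b = np.array([10.0] * 99 + [1000.0])

              for name, ft in (("Card A", card_a), ("Card B", card_b)):
                  avg_fps = 1000.0 / ft.mean()              # average FPS over the run
                  beyond_50 = (ft - 50.0)[ft > 50.0].sum()  # time spent past the 50-ms cutoff
                  print(f"{name}: {avg_fps:.0f} FPS average, {beyond_50:.0f} ms beyond 50 ms")

              # Card B "wins" the FPS average yet is the one you'd actually feel stuttering.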

        • Rza79
        • 4 years ago

        I get that but if I look at GTA V, FC4, Alien:I, Civilization & Battlefield 4, it really beats all other cards on every metric (50, 33 & 16ms). Sometimes by a big margin.
        Project Cars – only 102ms over 33ms
        Witcher 3 – only 162ms over 33ms
        Crysis 3 – not good but this game doesn’t run smooth on any card

        I’m not pro 600w gaming setups. But all I’m saying is that this new PCIe based CF really really works on 4K. That’s really a big change compared to the HD 7990 (or any CF bridge based card).

        If I compare this to an old review of yours for the HD 7990.
        Crysis 3 (3840×2160 on 295X2 vs 2560×1440 on 7990)
        50ms: 128 vs 5372
        33ms: 402 vs 17744
        16ms: 4275 vs 34671

      • jihadjoe
      • 4 years ago

      The improvement in FC4 from the Titan X review to this is amazing. From something like 30% it improved to over 100% scaling.

      OTOH, this does support the point that when crossfire underperforms it’s almost always down to AMD’s drivers.

        • Klimax
        • 4 years ago

        I thought it was common knowledge that it was (almost) always driver related. (On a smaller scale we can see it with SLI too.)

      • chuckula
      • 4 years ago

      [quote<] I'm amazed how good AMD's Crossfire is working. [/quote<] No, you're amazed at how [i<][b<]well[/b<][/i<] AMD's Crossfire works.

        • Rza79
        • 4 years ago

        I assume your mother tongue is English and it’s probably the only language you speak fluently, right? Think about that before you go grammar Nazi on us poor mainland Europeans (who BTW often (have to) speak 2, 3 or 4 languages).

    • drfish
    • 4 years ago

    Very nice! It would be great to see what VRAM usage looks like in your test scenarios though… 3GB, vs 4GB, vs 6, 8, 12… All very interesting at 4k…

      • Meadows
      • 4 years ago

      I agree, and a few people mentioned this need after the last such review too.

      Unfortunately almost no games do what GTA has been doing since GTA 4, which is telling you upfront how much memory you’ll need.

        • travbrad
        • 4 years ago

        So GTA 4 did something right after all.

          • yogibbear
          • 4 years ago

            Yeah, but that locked you to it without an .ini tweak. GTA V presents an option out of the box to exceed the VRAM limit if you want to.

            • Meadows
            • 4 years ago

            It “felt” restrictive back then, but the gameplay actually suffered plenty if you exceeded your card’s limits. For example, after driving in a straight line for a while, if you then suddenly turned around the game would be choppy for a good few seconds and you’d literally see the buildings behind you loading up their textures again.

      • anotherengineer
      • 4 years ago

      Well, here is VRAM usage from The Witcher 3:

      [url<]http://www.techpowerup.com/reviews/Performance_Analysis/The_Witcher_3/3.html[/url<]

        • drfish
        • 4 years ago

        Just leaves me shaking my head at DayZ using 2.9GB on my 780 (4K DSR, max textures, and high MSAA)…

          • anotherengineer
          • 4 years ago

          Well who knows, it could be because the witcher was coded/written better/more efficiently, or maybe it’s the game engine or maybe they possibly dulled the graphics a bit??

          If DayZ is under 4GB at 4K, then really a 4GB card should be good for most things. Games that use more than 4GB might not be optimized as well??

          And 2GB good for 1920×1080.

      • Freon
      • 4 years ago

      A follow-up with this would be nice, along with pitting it back against some SLI setups. 970 SLI is about the same cost and, on paper, has 33% more memory bandwidth and ~15% more general GPU grunt. Would love to see GTA5 and Witcher 3 benchmarks for these “older” SLI setups.

    • LocalCitizen
    • 4 years ago

    Unusual release time

      • drfish
      • 4 years ago

      Yeah, makes me wonder if Fiji has Nvidia a little scared…?

        • Meadows
        • 4 years ago

        How can you be scared of something you’ve never seen?

          • sreams
          • 4 years ago

          I guess you’ve never met a child.

          • TopHatKiller
          • 4 years ago

          Ohoo-oo. They have seen it: industrial spying is alive and kicking, and they already have a really good idea [much better than any site] of what ‘Fiji’ really is.

          ps. Just realised; not sure what industrial spies kick – but it’s probably something hard and exciting.

          pps. rephrasing my ps might have been a good idea.

      • Damage
      • 4 years ago

      Not for Taipei!

        • LocalCitizen
        • 4 years ago

        7 AM is too early for gamers 🙂
        It’s just right in NZ.

        • chuckula
        • 4 years ago

        Is somebody testing out his fitbit by running all over Computex Taipei??? 😉

        P.S. –> Looking forward to all your hard work this week. Thanks for starting it out with the big GTX-980Ti reveal, and I’m pretty sure there will be some other products [cough]Broadwell and maybe Carrizo[/cough] that we’ll be hearing about too.

      • bfar
      • 4 years ago

      Fiji must be good.

    • yogibbear
    • 4 years ago

    Witness me! O’ 980Ti come be mine as we proudly march towards Valhalla! Do I wait for custom coolers though? Presumably there’s already the standard eVGA ACX cooler option floating around?

    • sweatshopking
    • 4 years ago

    #ripati

    finally a card less than 1k that can do moderate resolutions.

      • Meadows
      • 4 years ago

      Moderate my arse.

        • Laykun
        • 4 years ago

        I was unaware your arse required moderation.

      • ultima_trev
      • 4 years ago

      It is a great card for 1080p for people with 144Hz monitors; unfortunately, 4K and 1440p still require something more potent.

        • geekl33tgamer
        • 4 years ago

        A GTX 980Ti is a great card for 1080? I disagree.

        A GTX 960 will pump out fluid frames in modern titles at 1080p, while the GTX 970 will easily take on 120+ Hz at 1080p or 60Hz at 1440p. The GTX 980 is only a fraction faster than the 970, but can almost pull off 4K at the magic 60 all by itself.

        The GTX 980 Ti is complete overkill for 1080p.

          • jihadjoe
          • 4 years ago

          Not if you’re aiming for 120-144Hz.
          /devilsadvocate

            • ImSpartacus
            • 4 years ago

            Maybe there were some edits, but he literally said that a 970 can do high fps 1080p work and I think that’s a perfectly reasonable stance.

            If we’re talking about high-fps 1440p, then that’s a different story. After paying a ton for one of those monitors, you probably aren’t terribly cost conscious in the GPU department, so it doesn’t really matter.

            • Jason181
            • 4 years ago

            If you’re spending $500+ on a video card, you’re probably wanting fast [i<]and[/i<] pretty, and even a 980 can't hit 100 Hz consistently on some newer games. Granted, a lot of games will be held back by the CPU at some point.

            • ImSpartacus
            • 4 years ago

            I guess if you’re hell-bent on having the highest settings and the most consistently low frame times, then something like a 980 Ti might help at 1080p, but it feels weird to get a GPU that’s intended for much higher resolutions.

            • geekl33tgamer
            • 4 years ago

            I’m being downvoted everywhere for suggesting it, but the 980 Ti really is overkill for 1080p, and I really can’t get my head around the whole “you’re wrong, it’s finally a single card good enough for 1080p” argument.

            Either I’m being trolled or really missed something?

            • Melvar
            • 4 years ago

            Maybe it’s overkill most of the time, but for games like GTA V it definitely isn’t. I was playing that game on a 1080p HDTV with a 980, and I was running medium-high settings with no AA in an attempt to get a constant 60FPS.

            If you really crank the settings unreasonably high (including 8x MSAA but no DSR), a 980 isn’t always fast enough to maintain playable framerates, let alone 60FPS in demanding games at 1080p.

            • geekl33tgamer
            • 4 years ago

            8 x MSAA with everything cranked and ppl wonder why a 980 tanks? Moving on, I benched GTA V on single and dual 970’s at 1080, 1440 and 4K (Native, no DSR here) all cranked to the max, and even included 2xAA so I could enable TXAA across all resolutions.

            Single-card wise, I got average frame rates of 82 @ 1080p, 66 @ 1440p and 35 @ 4K.

            (Alongside a stock i7-4790K and 16GB of DDR3 @ 1866MHz.)

            • sweatshopking
            • 4 years ago

            people don’t care why it tanks. it tanks. i want fast and pretty, and we’re finally starting to get there @ 1080p.

            • geekl33tgamer
            • 4 years ago

            Well, the same people don’t actually need to use 8x MSAA either, so they should care. They are the reason why nothing is ever going to be good enough if you set the bar to crazy-high.

            We’ve been able to make any card struggle at a given res since the modern PC was a thing. Don’t see why it’s a problem all of a sudden when you use the card in such a way that’s inefficient for minimal (see: non detectable) gains in visuals.

            Smh. Whatever – I give up.

            • ultima_trev
            • 4 years ago

            While I do think 8x MSAA is a waste, I would rather game at 1080P (or even 900P) with Ultra settings + 4x MSAA or 2x SSAA than 1440P or 4K at medium settings + 2x MSAA. It’s not like games actually have detailed enough textures to scale that high anyway.

            • sweatshopking
            • 4 years ago

            which people? why are you deciding what settings i should use? I like high settings, and i want high fps. i’d rather have ambient occlusion and 1080p vs 1440p without it.

            • geekl33tgamer
            • 4 years ago

            I wasn’t deciding what you do with your settings, I was stating an opinion.

            • sweatshopking
            • 4 years ago

            then we agree that we’re both sharing subjective opinions. I think this card isn’t powerful enough for 1440p, nvm 4k.

            • puppetworx
            • 4 years ago

            I can help them both out: use FXAA and DSR, it [url=http://www.hardocp.com/article/2015/05/26/grand_theft_auto_v_image_quality_comparison_part_5/2<]looks[/url<] and [url=http://www.geforce.co.uk/whats-new/guides/grand-theft-auto-v-pc-graphics-and-performance-guide<]performs[/url<] better. Higher settings don't always mean greater aesthetics; games often include bottlenecking features, and it doesn't mean your card is guff. If you can't figure out what better looks like, then there is always GeForce Experience, which worked pretty well when I used it.

            • Melvar
            • 4 years ago

            Constant 60FPS means minimum, not average. And maxed is maxed. If you don’t think people should pick their settings that way, fine. I agree. I run most games at 2160p with hand-picked settings. That doesn’t change the fact that if someone is looking for maximum visual quality at 1080p the 980 Ti should provide a significantly better experience than the 980 in some games.

            • jihadjoe
            • 4 years ago

            Pretty much this, and doubly so if doing 3D because if each eye wants 60FPS to itself, the GPU has to push 120FPS minimum.

            • chuckula
            • 4 years ago

            [quote<]I'm being downvoted everywhere[/quote<] While I know that you two are definitely not the same person, the unfortunate similarity of your nick to one "l33t-g4m3r" might have something to do with auto-downvotes.

            • sweatshopking
            • 4 years ago

            PLUS HE’S DISAGREEING WITH THE KING.

            • geekl33tgamer
            • 4 years ago

            Requesting a username change in 3…2… 😉

            • Jason181
            • 4 years ago

            In context, trev was referring to 144 hz monitors, and your benchmarks below show that you’re only getting an average of ~80. You’d need something like triple that as an average if you never wanted to go below 144 fps.

            Some people value resolution, others value high framerates, and still others value eye candy. Some want all three and are willing to pay for it, but don’t really want to deal with SLI/crossfire.

            That doesn’t mean your opinion is wrong, but it’s no more “right” than those with different goals.

            • geekl33tgamer
            • 4 years ago

            No edits, but apparently he didn’t read my entire OP… 😉

          • Aquilino
          • 4 years ago

          Well, let me tell you something. I’ve jumped from a HD 3850 to a 980GTX (PNY oc’ed model) and man, was I disappointed. I expected much more bang for the buck, more performance for the actual state of top class videocards.

          Far Cry 4 (on a 1080p monitor with a small downsample, 2048 x 1152) on a 2600K, and when the soft shadows are on, the framerate goes down the sewers. Ok, everything is on ultra and the game looks great at short distance (not so great from the gyrocopter), except for the grass, which is still quite crappy. And there are a lot of textures that look quite last gen.

          So yes, a 980Ti may be great for 1080. And maybe even not.

            • geekl33tgamer
            • 4 years ago

            You based your performance on a Ubi game? Hold your head in shame!

            No, seriously – That’s not the best game in terms of optimization out there. Look up benchmarks of your 980 at 1440p and 4K running say GTA V, Project Cars, Witcher 3 and Hardline. They run much faster and look great.

          • sweatshopking
          • 4 years ago

          Imo a 980ti is finally enough for 1080p. I wouldn’t upgrade until much more gpu is available.

        • Jason181
        • 4 years ago

        I think you’re right on the money. With games where your cpu can actually push 144 hz, you need a pretty potent GPU to keep up (I like speed and eyecandy, and it sounds like you do too).

      • Kurotetsu
      • 4 years ago

      It probably should be pointed out that SSK’s standard for acceptable performance from a video card is: “Can render any game at 60fps, consistently, at 1080p with no changes to the default graphics settings whatsoever.”

      Honestly, and I can’t believe I’m typing this, I kind of agree with him.

        • sweatshopking
        • 4 years ago

        WE’RE IN LOVE!!!!!! <3<3<3<3<3<3<3<3<3<3<3<3<3<3<3 XOXOXOXOX

        • Chrispy_
        • 4 years ago

        Most people only fiddle with the graphics settings if it runs badly.

        Default resolution, default settings, and no stuttering are what most people use and want.

    • deruberhanyok
    • 4 years ago

    Man. The performance is practically within the margin of error for the Titan X’s numbers, and the power consumption is noticeably down. Kudos to NVIDIA for their continued focus on efficiency.

    Calling it a “slightly slower” version of the Titan X is practically academic. I’m wondering if there’s a legit reason anywhere to choose the Titan over the 980 Ti if you’re in the market for a high-end card.

    Now, we sit back and wait to see what AMD has up its sleeve.

      • Klyith
      • 4 years ago

      I really hope that whatever AMD has up its sleeve, it can move down to the midrange price brackets faster than Nvidia’s glacial pace. The stall-out at the $300 price point has been really awful. Between AMD being forced to sell the old, loud, power-hungry 290 as a midrange card, and Nvidia’s obnoxious “F you, we’re not dropping prices, we’re Nvidia” stance with the 970, it’s been a real slog.

      I’ve kept my 7870 way longer than anticipated, which is a problem since I want to get a 1440p monitor sometime soon.

        • ultima_trev
        • 4 years ago

        Considering the rumored price point for Fiji XT is $850, I won’t hold my breath. The GTX 970 is more or less this generation’s “960 Ti” (like the GTX 770 before it). Shame it sells for $50 more than it should.

          • kalelovil
          • 4 years ago

          The same rumor sites were reporting the 980 Ti would be $799. In any case, now that Nvidia has revealed its price, AMD will take that into consideration.

          • ImSpartacus
          • 4 years ago

          The 970 provides a ridiculous amount of performance while maintaining a remarkable power envelope. It’s been pretty popular for Nvidia. They don’t need to drop price right now.

        • ImSpartacus
        • 4 years ago

        To Nvidia’s credit, we all knew the 970 would be an absolute champion for quite a while. Nvidia cut all the right corners to produce a pretty spectacular card for the money, so I understand why they would want to get some ROI on it at its release price.

          • Klyith
          • 4 years ago

          At *launch*, yeah, it was great. Six months later, with no major price drops, and with a 290 cheaper by $60 (or more if you count rebates)? I look at the 970 and say that’s the card I want, because noise is important to me and Maxwell has major advantages there. But it’s a hard sell at 25% more money just for less noise at load. Also, I have Oculus in 2016 in the back of my head…

          Maxwell is the superior product right now, and Nvidia is charging superior-product prices. I much prefer it when both players have a superior product, because I’m fine with flipping a coin between two equal options. Brand loyalty is the delusion of young nerds who don’t have more useful things to spend their money on.

            • Klimax
            • 4 years ago

            Six months is nothing when the refresh cycle has lengthened considerably. Your time scale is no longer rational or relevant.

            • ImSpartacus
            • 4 years ago

            I don’t understand the problem with charging more for the “superior” product.

            I get that it’s better when there’s more dynamic competition, but when that isn’t happening, we can’t exactly blame Nvidia for wanting to make money. They have to pay for that R&D somehow.

            • Klyith
            • 4 years ago

            I don’t blame them, and I think it’s a natural thing for them to do. I don’t think it’s “unfair” or anything. I also thank all the [s<]fools[/s<] [i<]generous souls[/i<] who buy thousand-dollar video cards, because they pay for the R&D that I get to enjoy for far less money by being patient. (Of course, in other times it was unfair: Intel in the mid-’90s had a real R&D advantage, but they were also using anti-competitive and illegal tactics to [b<]keep[/b<] that advantage.)

        • Klimax
        • 4 years ago

        Forget about any significant underpricing from AMD. They can’t afford it, they won’t be able to do it (HBM will cost more than a traditional memory setup), and they won’t want to. AMD is a corporation just like Nvidia; the only reason they have low prices is the performance of their products compared to Nvidia’s. Remember the original 7970 prices before the 680…

      • ALiLPinkMonster
      • 4 years ago

      The only reason would be if you absolutely needed twice the memory. I guess if you’re running quad SLI out to six 4K displays or something crazy like that. Maybe. idk…

      • puppetworx
      • 4 years ago

      Yeah, it’s really Titan X performance for 2/3[super<]rds[/super<] of the price.

      • MathMan
      • 4 years ago

      The $300 price range got a very nice boost just this past September with the 970. And you can get a 290 even cheaper. Not that glacial, IMO…

      • the
      • 4 years ago

      I suspect that the power consumption difference is partially due to the change in memory capacity. GDDR5 draws relatively high power for a memory technology.

      As for picking a Titan X over the 980 Ti, I’d say memory capacity, but 6 GB is enough for pretty much everything out there today at 4K. Perhaps if you’re running a 5K display, or are crazy enough to run multiple 2560x1440 or larger displays in surround, the extra memory capacity would pay off. Either way, that’s a niche of a niche scenario and wouldn’t apply to anyone in the general sense.

      • beck2448
      • 4 years ago

      [url<]http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_980_ti_review,36.html[/url<] Overclocked, this beast smokes the 295. Why no overclocking, Tech Report?

      • ImSpartacus
      • 4 years ago

      The Project Cars results showed a tangible benefit for the Titan X. If that’s a big game for you, then maybe a Titan X might be worth it.

      Other than that game, the two cards were effectively identical, so unsurprisingly, most people would be better served by a 980 Ti.

      Although in general, most people are best served by the cheaper card until you hit that perf/$ peak somewhere in the $200-300 range.

      • Klimax
      • 4 years ago

      Only if one can extract every last bit of performance out of those two missing SM blocks and isn’t limited elsewhere. (Some GPU compute workloads would be good candidates; maybe AnandTech has them.)

      • Krogoth
      • 4 years ago

      Titan X = e-peen extender.

      12GiB makes no sense for the Titan X. It is too crippled for the GPGPU-related stuff that could utilize it, and by the time games are using 12GiB of VRAM for all of the bells and whistles, the Titan X will be hopelessly out of date.

        • MathMan
        • 4 years ago

        In addition to not being impressed, you possess the capability of judging the usefulness (or lack thereof) of 12GB for FP32 compute for all mankind. That is… impressive.

        If only you were right.

          • Krogoth
          • 4 years ago

          You realize that the Titan X’s GM200 has 1/32-rate DP performance and nothing in common with the Kepler-based Titans (a.k.a. “failed” Quadros/Teslas), which had uncrippled DP performance?

          The Titan X should have been called the “980 Ti” or “985 Ti”.

            • MathMan
            • 4 years ago

            Here’s the thing: after years of fragmented applications for GPU compute via CUDA, some of which required FP64 but many of which did not, there’s now finally one that is taking the world by storm: deep neural networks. If you’re a regular on Hacker News, you must have noticed. Obviously the GTC keynotes were all about it, and even at Google I/O they brought it up as a breakthrough technology.

            Like it or not, it’s going to sell a lot of GPUs in the coming years. AMD announced GPU investments for DNNs as well at its investor day, which is very much needed, because right now all the popular libraries are CUDA-based.

            Anyway: the beautiful thing about DNNs is that training can be done in FP32, and inference often in FP16. No FP64 required.

            One more thing: some networks require more memory than what is currently being offered.

            Cue the Titan X with 12GB.

            With the 980 Ti out, it will lose its appeal for most enthusiast gamers, but expect it to remain a serious seller for universities and tech companies doing DNN R&D. There is nothing in its price range that’s more suited.
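
            For a rough sense of where 12GB matters, here’s a minimal Python sketch of a back-of-the-envelope VRAM estimate; the 300M-parameter network and the 3x training-overhead multiplier are illustrative assumptions, not figures from this review or from Nvidia:

            # Crude VRAM estimate: weights plus a rough allowance for gradients,
            # optimizer state, and activations.
            def training_vram_gb(params_millions, bytes_per_value=4, overhead=3.0):
                # bytes_per_value: 4 for FP32 training, 2 for FP16 inference.
                # overhead: crude multiplier for gradients/optimizer/activations.
                values = params_millions * 1e6
                return values * bytes_per_value * (1 + overhead) / 1e9

            # Hypothetical 300M-parameter network:
            print(f"FP32 training:  ~{training_vram_gb(300):.1f} GB")          # ~4.8 GB
            print(f"FP16 inference: ~{training_vram_gb(300, 2, 0.0):.1f} GB")  # ~0.6 GB

            Larger batches and bigger networks push the training number up quickly, which is where a 12GB card earns its keep over a 6GB one.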

            • Krogoth
            • 4 years ago

            Did you get the memo that the Titan X is crippled? The 980 Ti works just about as well for GPGPU stuff for less $$$$.

      • beck2448
      • 4 years ago

      Now there are custom 980 Tis 20% faster than stock Titans, and they don’t need WATER COOLERS.
      That’s sad, really.

    • Meadows
    • 4 years ago

    Ah, so that guy was correct in seeing the accidental link to this article. Time to read some, then.

      • Meadows
      • 4 years ago

      The card seems great, although power consumption’s a bit too high for my comfort.
      The best achievement of this card is knocking 50 dollars off the price of the 980, but then again, that still doesn’t put either card in a place where I’d buy them.
