Five GeForce GTX 960 cards overclocked

I hate to brag, but I kinda know what I’m doing here. I’ve been reviewing PC components since the dawn of human history, or at least since the last century, which is pretty much forever in Internet time. I’ve reviewed a lot of stuff, and a big chunk of that stuff has been video cards.

What I’m saying is that I should have the basics of this gig down pretty well by now. One would think.

Yet my attempt to cover a bunch of GeForce GTX 960 cards has left me flummoxed. I can’t seem to get my head around how to approach it. Part of the problem is that I already looked at these five different flavors of the GeForce GTX 960 in my initial review of the GPU. I tested their power draw and noise levels, and I compared their performance. I then resolved to do a follow-up article to look at the individual cards in more detail, along with some overclocking attempts.

Seems simple, right? Yet as I sit here and attempt to pull together this article, I’m struggling to make it work.


What am I supposed to make of this bunch?

Part of the problem has to do with the nature of the GeForce GTX 960 and the video cards based on it. You see, with its Maxwell architecture, Nvidia has sought to make its GPUs much more power-efficient than in the past. The result is a chip that doesn’t consume much more power than the old GeForce GTX 660 while offering tremendously more performance. Meanwhile, the video card makers have all been hard at work refining their coolers to evacuate lots of heat with very little noise. They’ve added more copper, more heatpipes, and more heatsink area. Twin fans are the norm, and one of these cards has triple fans on an extra-long cooler. Beyond that, all of these coolers have a nifty, semi-passive cooling policy where the fans don’t spin up when the GPU is at idle or lightly loaded.
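
For anyone who hasn’t run into a semi-passive card before, the idea is simply a fan curve with a dead zone at the bottom and a bit of hysteresis so the fans don’t flutter on and off. Here’s a minimal sketch of that idea in Python; the thresholds and duty cycles are invented for illustration and aren’t pulled from any of these cards’ firmware.

```python
# Illustrative sketch of a semi-passive fan curve. This is NOT any vendor's
# actual firmware logic; the thresholds and duty cycles below are invented.

SPIN_UP_C = 60.0     # hypothetical: fans start once the GPU passes this temp
SPIN_DOWN_C = 50.0   # hypothetical: fans park again once it cools below this
MAX_TEMP_C = 90.0

def fan_duty(gpu_temp_c: float, fans_spinning: bool) -> float:
    """Return a fan duty cycle (0-100%) for the current GPU temperature.

    Below the spin-up threshold the fans stay parked (silent idle), and the
    two thresholds form a hysteresis band so they don't toggle on and off.
    """
    if not fans_spinning and gpu_temp_c < SPIN_UP_C:
        return 0.0                    # idle or light load: fans stay off
    if fans_spinning and gpu_temp_c < SPIN_DOWN_C:
        return 0.0                    # cooled back down: park the fans again
    # Once spinning, ramp linearly from 30% near the threshold to 100% at 90C.
    ramp = (gpu_temp_c - SPIN_DOWN_C) / (MAX_TEMP_C - SPIN_DOWN_C)
    return min(100.0, max(30.0, 30.0 + 70.0 * ramp))
```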

Both of these trends are good ones. Rising power efficiency is always welcome, as is more effective, quieter cooling. The convergence of these trends is a good thing, too. All of these GTX 960 cards—from the likes of EVGA, Gigabyte, MSI, and Asus—are truly excellent. The contrast with comparable offerings from just a few short years ago is striking.


Just look at this EVGA card with an ACX 2.0 cooler. Look at it!

As a reviewer, though, these cards present me with a real problem: they’re too darned good. In the past, five different video cards based on the same GPU might perform about the same, but the cooling solutions and such would give me something to talk about, something to compare. When I tested this group of GTX 960s, though, they were all so quiet, they didn’t exceed the noise floor of my tranquil basement lab—not even under full load running Crysis 3.

At the same time, none of the GPU temperatures reached the 70°C mark. As I said before, these coolers are complete freaking overkill—in the best possible way.

What am I supposed to complain about now?


Probably not these coolers. They’re spectacular.

Some people seem to be disappointed that these GTX 960 cards don’t ship with 4GB of RAM onboard. Perhaps I could muster some concern about that fact. But it’s hard to do so when higher-end cards with 2GB have served me well for the past few years while gaming at 2560×1440—and the latest TR Hardware Survey tells me that over two-thirds of our readers still have monitor resolutions of 1920×1200 or lower. Also, no other GTX 960 cards out there have 4GB of RAM, nor does the competing Radeon R9 285. I could be persuaded that spending more for a faster video card with more RAM is a good idea. Heck, I’m all about dat GPU power. But I’m still convinced these GeForce GTX 960 2GB cards are best-in-class offerings.

| Card | GPU base clock (MHz) | GPU boost clock (MHz) | GDDR5 clock speed (MHz) | Power connector | Length | Height above PCIe slot top | Intro price |
| --- | --- | --- | --- | --- | --- | --- | --- |
| GTX 960 reference | 1126 | 1178 | 1753 | 6-pin | N/A | N/A | $199 |
| Asus Strix GTX 960 | 1253 | 1317 | 1800 | 6-pin | 8.5″ | 0.75″ | $209 |
| EVGA GTX 960 SSC | 1279 | 1342 | 1753 | 8-pin | 10.25″ | 0.25″ | $209 |
| Gigabyte Windforce GTX 960 | 1216 | 1279 | 1753 | Dual 6-pin | 10″ | 0.25″ | $209 |
| Gigabyte G1 Gaming GTX 960 | 1241 | 1304 | 1753 | Dual 6-pin | 11.25″ | 0.3125″ | $229 |
| MSI GTX 960 Gaming 2G | 1216 | 1279 | 1753 | 8-pin | 10.75″ | 1.125″ | $209-219 |

I guess I could spend some time worrying about installation requirements. After all, some of the boards are pretty long, and the MSI card in particular has heatpipes that sprout up over an inch beyond the top of the expansion slot covers. If the guts of your target PC case are too cramped, you may want to avoid the larger cards.


The Strix is the lightweight of the bunch

Heck, there’s a case to be made that the very best product in this crowd might be the Asus Strix GTX 960. It’s the smallest of the lot, and it’s alone among the group in requiring a single six-pin aux power input. When all of the options are pretty much equally whisper quiet, there’s no need to go larger.

But then the Strix costs just as much as the boards with beefier hardware attached. If your case and PSU won’t be strained by something more formidable, why not indulge?


I’d grab this MSI card anytime

See, I don’t know. Pretty much all of these things are over-engineered. Do I take points off for being more over-engineered than the next guy? At the same price?

That said, there is one more way I can squeeze and strain these GTX 960 cards in order to bring out the differences between them.

Overclocking

Yep, I can overclock these babies. Surely that will bring out some differences between them, right?

Well, maybe. Overclocking a GeForce-based video card these days is a complicated business. Don’t get me wrong. Pushing the little sliders around in the applications isn’t hard, but the clock speed and voltage sliders exposed in most tweaking applications are only two inputs in a pretty complex equation. Nvidia’s GPU Boost algorithm reacts to a host of variables when considering how hard to push the GPU, and it works differently in response to different workloads.
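
To give a sense of why those two sliders are only part of the story, here’s a deliberately simplified, hypothetical model of a Boost-style clock arbiter. This is not Nvidia’s actual GPU Boost logic: the voltage/frequency points, limits, and back-off behavior are invented for illustration, and the real algorithm weighs more inputs than this.

```python
# A toy model of a GPU Boost-style clock arbiter -- NOT Nvidia's actual
# algorithm. The voltage/frequency table, limits, and back-off behavior
# are invented purely to show why the sliders are only two of many inputs.

BOOST_STEP_MHZ = 13   # Kepler/Maxwell boost bins are roughly this size

# Hypothetical (clock MHz, voltage V) operating points for a GTX 960-like part
VF_TABLE = [(1126, 1.012), (1228, 1.075), (1317, 1.150), (1418, 1.212)]

def boost_clock(power_draw_w, power_limit_w, temp_c, temp_limit_c,
                voltage_limit_v, requested_mhz):
    """Pick the highest clock allowed by the voltage cap, then back off in
    boost-bin steps while the power or thermal limiter is tripped."""
    candidates = [mhz for mhz, volts in VF_TABLE
                  if volts <= voltage_limit_v and mhz <= requested_mhz]
    clock = max(candidates) if candidates else VF_TABLE[0][0]
    while clock > VF_TABLE[0][0] and (power_draw_w > power_limit_w or
                                      temp_c > temp_limit_c):
        clock -= BOOST_STEP_MHZ
        power_draw_w *= 0.97   # crude stand-in: lower clocks draw less power
    return clock

# A light game load and a FurMark-style load trip the limiters differently,
# so the same card settles at different clocks for each workload.
print(boost_clock(power_draw_w=95, power_limit_w=120, temp_c=60,
                  temp_limit_c=80, voltage_limit_v=1.212, requested_mhz=1418))
print(boost_clock(power_draw_w=140, power_limit_w=120, temp_c=72,
                  temp_limit_c=80, voltage_limit_v=1.212, requested_mhz=1418))
```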

For example, consider how these GTX 960s perform at their stock speeds while running the most GPU-intensive workload we know, MSI’s Kombustor app, which is based on FurMark.

| Card | GPU base clock (MHz) | GPU boost clock (MHz) | Memory speed (MT/s) | Kombustor GPU voltage (V) | Kombustor GPU clock (MHz) |
| --- | --- | --- | --- | --- | --- |
| Asus Strix GTX 960 | 1253 | 1317 | 7200 | 1.212 | 1417 |
| EVGA GTX 960 SSC | 1279 | 1342 | 7012 | 1.181 | 1418 |
| Gigabyte Windforce GTX 960 | 1216 | 1279 | 7012 | 1.200 | 1468 |
| Gigabyte G1 Gaming GTX 960 | 1241 | 1304 | 7012 | 1.187 | 1468 |
| MSI Gaming GTX 960 2G | 1216 | 1279 | 7012 | 1.187 | 1392 |

Although the EVGA GTX 960 SSC (Super Superclocked, ladies and gentlemen) has the highest base and boost clocks of the bunch, its operating frequency in Kombustor is within 1MHz of the Asus Strix. The Strix has markedly lower advertised clock speeds, but it gives the GPU a bit more voltage by default. In the end, the result is virtually the same clock speed in Kombustor.

And Kombustor is only one workload—kind of a peak thermal worst case. The cards will operate at different speeds when running games.

Although each of these products ships with its own branded tweaking utility, I decided to use EVGA’s Precision app for overclocking each of them. Precision is one of my favorite GeForce tweaking utilities, along with MSI’s Afterburner.

My approach was to max out the power and voltage sliders for each card. From there, I raised the GPU and memory clocks while running Kombustor and checking for three things (sketched in rough code after the list):

  • Stability — Does it crash?
  • Visual artifacts — Do Kombustor’s images render correctly?
  • Delivered speeds — Does turning up the slider actually mean increased clock frequencies?
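
Expressed as rough pseudo-Python, the loop looked something like this. Every helper below is a hypothetical stand-in for dragging sliders in Precision and eyeballing Kombustor; none of it is a real Precision API.

```python
# Rough sketch of the manual tuning loop described above. The stub helpers
# are hypothetical stand-ins for dragging sliders in EVGA Precision and
# watching Kombustor by eye -- they are NOT a real Precision API.

from dataclasses import dataclass

@dataclass
class StressResult:
    crashed: bool               # did the run lock up or drop to the desktop?
    visual_artifacts: bool      # did Kombustor render incorrectly?
    observed_clock_mhz: int     # clock actually delivered during the run
    previous_clock_mhz: int     # clock delivered at the prior offset

def set_power_and_voltage_max(card):          # stand-in: max out both sliders
    ...

def set_gpu_offset(card, offset_mhz):         # stand-in: drag the GPU clock slider
    ...

def run_kombustor_and_observe(card, minutes) -> StressResult:
    ...                                       # stand-in: run Kombustor, watch it

def find_stable_offset(card, step_mhz=10, max_offset_mhz=200):
    """Walk the GPU clock offset upward until one of the three checks fails.
    The same walk applies to the memory slider."""
    set_power_and_voltage_max(card)
    best = 0
    for offset in range(step_mhz, max_offset_mhz + step_mhz, step_mhz):
        set_gpu_offset(card, offset)
        result = run_kombustor_and_observe(card, minutes=10)
        stable    = not result.crashed                                  # 1) stability
        clean     = not result.visual_artifacts                         # 2) correct rendering
        delivered = result.observed_clock_mhz > result.previous_clock_mhz  # 3) real gains
        if stable and clean and delivered:
            best = offset                     # keep the last known-good offset
        else:
            break
    return best
```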

Here’s how far I was able to push each of the cards.

| Card | GPU clock offset (MHz) | Memory speed (MT/s) | Kombustor GPU voltage (V) | Kombustor GPU clock (MHz) | Kombustor GPU temp. (°C) |
| --- | --- | --- | --- | --- | --- |
| Asus Strix GTX 960 OC | +80 | 8000 | 1.212 | 1497 | 57 |
| EVGA GTX 960 SSC OC | +75 | 7840 | 1.225 | 1518 | 68 |
| Gigabyte Windforce GTX 960 OC | +40 | 8000 | 1.243 | 1533 | 64 |
| Gigabyte G1 Gaming GTX 960 OC | +50 | 8000 | 1.231 | 1543 | 66 |
| MSI Gaming GTX 960 2G OC | +110 | 8000 | 1.231 | 1528 | 63 |

All of them were able to tolerate memory speeds of about 8 GT/s. The one exception was the EVGA, which showed some artifacts in Kombustor, forcing me to lower its memory clocks slightly. That’s almost surely just bad luck on EVGA’s part, since each card comes with a number of memory chips onboard. Not all of those chips are going to tolerate overclocking well.

Although GPU overclocking also depends to some extent on luck, in terms of what speed and voltage combinations your individual GPU can handle, there’s a clear trend in these results. The cards with higher voltages are able to achieve higher stable GPU clock speeds in Kombustor. We’ve seen this before with recent Nvidia GPUs. The main limiting factor in overclocking is the amount of voltage the card can supply to the chip.

The two Gigabyte offerings fare best in terms of delivered, stable clock speeds in Kombustor. Gigabyte’s press materials strongly stressed stable clocks with FurMark, and the firm supplied us with a firmware update for each of its two cards that’s expressly intended to help with overclocking. (The company tells us these firmware updates will be available to the public, and they do appear to be.) My sense is that Gigabyte knows reviewers use FurMark (and its derivatives like Kombustor) for testing and thus has tweaked its firmware to handle this workload well. Clever girl.

That said, Gigabyte also offers some of the highest GPU voltages we’ve seen, so its cards appear to come by these clock speeds honestly. (In fact, for each and every GTX 960 card we tested, the GPU Boost “reasons” flag that limited overclocked speeds was either VOp, VRel, or a combination of the two.)

The other thing to notice in the table above is the GPU temperatures under load. Once again, they never reach the 70°C mark. What’s freaky is that none of the coolers have to work particularly hard to keep the GPUs cool. I didn’t notice any of these cards, while overclocked, audibly increasing their fan speeds beyond the lowest level. I opted not to measure fan noise when overclocked because I’m pretty sure none of the cards would surpass the noise floor in Damage Labs. Jeez.

I’d like to call out the Asus Strix in the table above. That card didn’t allow us to increase its peak GPU voltage beyond the stock setting, and as a result, its peak GPU speed was a little lower than the rest. Notice, though, that the Strix also has the lowest GPU temperature in Kombustor, in spite of having the shortest cooler of the group. Asus obviously made different choices with this product than the other brands have.

Here’s how the GTX 960s perform at their stock speeds and while overclocked.

In actual games, the two leaders among the overclocked GTX 960 cards are the Gigabyte GTX 960 G1 Gaming and the MSI GTX 960 Gaming 2G. Apparently there is some magic in that “gaming” label after all. The rest of the pack kinda reshuffles its order depending on the test.

The truth, though, is that we’re looking at awfully minor differences between the fastest and slowest overclocked cards—a few frames per second at most. I’m afraid my quest to find meaningful differences between these products via overclocking has come to a humbling conclusion. I have failed.

Conclusions

What’s a guy to do when all five products in a group perform pretty much the same in every key respect?

I had some ideas related to price, based on the table of list prices on page one. Since all of these GTX 960 cards are cooled more than adequately, I figured I’d penalize the products that cost more because of their exotic coolers. MSI’s Twin Frozr is massive, and MSI gave us an initial price range of $209.99-219.99. There’s no reason to overpay, right? Then I went to Newegg, just now, and the MSI GTX 960 Gaming 2G is selling for $209.99. You can’t count the MSI out of the running based on price.

With a starting price of $229.99 and a massive, triple-fan cooler, Gigabyte’s G1 Gaming seemed like an easier target. After all, the dual-fan Gigabyte Windforce is an outstanding product, and there’s no reason to pay more than its $209.99 list. But guess what? The G1 Gaming is also selling for $209.99 at Newegg as I write these words. That longer cooler and third fan may not be necessary, but it also won’t cost you anything more.

In fact, all five of the cards we’ve tested are currently selling for $209.99 at Newegg, including the EVGA SSC, the Gigabyte Windforce and the Asus Strix.

So I’m still basically lost here.

Oh, sure, there are additional things that set these cards apart from one another. For instance, the two Gigabyte offerings have a second DVI port that some folks might appreciate. The MSI Gaming 2G’s cooler has an LED-illuminated dragon that will appeal to folks with a particular sense of style in a way that nothing else here can. EVGA has a strong reputation for U.S.-based service and support. And so on.

But all of these cards are fantastic products. They all come with three-year warranties. All ship with more cooler than the GPU will ever need. They all cost the same. Some are larger, require more space and power, and perform slightly better. Others are smaller, require less space, and run ever so slightly slower when overclocked to the max. Which one you should choose depends on your particular tastes and requirements. That is perhaps the most boring, undramatic conclusion possible for a review like this one. But in this case, it’s also the inescapable truth. Take your pick.


Comments closed
    • rika13
    • 4 years ago

    I like boring comparisons like this. There is no “wrong” choice or the shiniest of two turds like some things have been.

    • marraco
    • 5 years ago

    There is something not reviewed here that is important to me:

    How easy is it to clean the radiators when they get covered in dust?

    Is it necessary to remove the radiator (and then reapply the thermal paste), or is it simpler: just remove the cover and clean the radiator?

    I had some MSI cards which required so much force to remove the cover that you risked damaging the PCB. It was also impossible to tell how clogged the radiator was without removing the cover, and that caused some unnecessary maintenance.

    Also, no card comes with instructions on how to remove the cover, so to clean it, you risk forcing the wrong place and breaking it.

      • Firestarter
      • 5 years ago

      Remove the cover? What for? Just blow out the radiator with compressed air, use a toothpick to keep the fans in place, and you’re done.

    • sschaem
    • 5 years ago

    How does power & efficiency scale?

    Also, I don’t get the recommendation. The stock R9 285 beats the most overclocked 960 and is $40 cheaper to boot.

    If a max-overclocked 960 now uses more power than a stock R9 285, is slower, and costs $40 more, how can that product not even be mentioned in the conclusion?

    • ronch
    • 5 years ago

    Lower temperatures, a shorter length, and only a 6-pin aux power connector required: if the Asus sold for $20 more, the choice would’ve been a little more difficult, but at the same price as the others at Newegg, I wouldn’t even think twice. Plus, Asus is a brand that’s as respectable as all the others in terms of quality, making it even more of a no-brainer.

    I don’t know what all the dilemma is about, TBH.

      • NovusBogus
      • 5 years ago

      Yeah, that Strix is reeeeallly sexy. Odd that a GPU that efficient is being so massively overengineered by most of the manufacturers, even EVGA, who usually doesn’t do that stuff. I’m holding out for a bit more cowbell though, and the $250 bracket isn’t doing it for me right now. Maybe there’s a 965 or 960 Ti on the way; Nvidia left a massive gap between the 960 and 970 in both price and performance, and surely they’re going to do something about that.

      • Visigoth
      • 5 years ago

      Agreed. The Strix is hands down the best card of the bunch. I mean, look at that beautiful temperature @ full load!!! Some of us live in very hot climates, so having a (relatively) cool running GPU is an extremely important metric.

    • Klimax
    • 5 years ago

    That’s one hell of an amusing rant by a lost reviewer who can’t find anything bad to go after…

    😀

    • Meadows
    • 5 years ago

    [quote<]"I'd like to call out the Asus Strix in the table above. That card didn't allow us to increase its peak GPU voltage beyond the stock setting, and as a result, its peak GPU speed was a little lower than the rest."[/quote<] A notable fact is that with the 600 series (and 700? I'm not sure) NVidia has enforced a maximum voltage of 1.212 V, right in the GPU hardware, that couldn't be bypassed with either GPU BIOS tweaking tools or regular overclocking tools. They did this to cap the number of fools who'd send burnt cards back for RMA, in the opinion of MSI Afterburner's creator. It would be worth checking with NVidia whether they've shifted this limit since or whether those other manufacturers are actually treading forbidden ground.

      • Damage
      • 5 years ago

      The way Nvidia put it to me, the card makers who push on voltage are taking on extra risk themselves.

      • auxy
      • 5 years ago

      It was possible on the Titan to push past 1.212v with a BIOS modification, but it was unstable and generally a bad idea. Gigabyte created a board mod with a custom power delivery circuit to overclock a Titan to the absolute limits, but they couldn’t sell it because of NVIDIA’s restrictions on that product (and it was a gruesome hack in any case).

    • south side sammy
    • 5 years ago

    There is a card with 4 gigs on it. I saw it yesterday. Don’t know what site it was on (on a different computer now/no links)… there was also mention of “if you look on the backs of the 2-gig cards” you will see where you “could” put additional RAM… do you see those? So you will have something to retest in the near future.

      • Ninjitsu
      • 5 years ago

      You mean 3.5 gigs? 😉

        • south side sammy
        • 5 years ago

        They won’t do this twice… not so close anyways.

        The next card will have 4 gigs usable, but it will also have a 256-bit memory interface. So I guess it won’t be a 960 (Ti) but a different card altogether. Release date: after AMD’s new stuff hits the web for purchase.

        This talk about the 290: would I buy one? Yes. Very good card for the money. The only deterrent for me is 4K. If running that, I would wait to see what’s next in line. Any res under that… very good to overkill.

          • Ninjitsu
          • 5 years ago

          I don’t know, if it’s a design feature in Maxwell then it’ll be pretty common, I suppose.

    • I.S.T.
    • 5 years ago

    Curious: what does the 960 benefit more from, a GPU clock increase or a memory clock increase?

    • LoneWolf15
    • 5 years ago

    While I haven’t seen the 960 version, I’ve owned ASUS’ 970 and now the 980. The coolers on them are quite possibly the highest quality units I’ve seen, with full metal shrouds and full metal backplates for the cards. A plus for me is that I like a solid metal plate for the heatsink contact areas as opposed to exposed heatpipes.

    If part of what you want is well-built, I think you’ll get it.

    EDIT: P.S. Both have been ultra-quiet.

    • Krogoth
    • 5 years ago

    Protip: Overclocking GPUs has always been silly, and it is not worth the hassle or time unless you got a part that was artificially binned (for example, the famous 9500 PRO to 9700 PRO conversion, GeForce 6800NU to 6800GT, 8800GT to 8800GTS, etc.).

    You are better off just saving up your pennies and getting a larger or more capable GPU. The architecture and components of a GPU are much more important factors for performance than clockspeed. The recent 970 debacle proves this more than ever.

      • torquer
      • 5 years ago

      Oh man, I remember unlocking my 6800. Those were the days…

      I do kind of feel like GPU overclocks are only good for benchmark e-peening anymore. I *feel* like in the past it made a bigger difference because the improvement was better as a percentage. But with a GPU already running at 1.5GHz at boost, another 50-100MHz just isn’t that much.

      Now back in the day running a Ti 4200 at Ti 4800 speeds… that was something

        • Krogoth
        • 5 years ago

        That is because GPUs a decade ago were typically binned by clockspeed rather than by disabling portions of the silicon. The few chips that were binned by disabling parts of the silicon were only disabled at the firmware level. It was possible to unlock those portions via a hacked firmware.

        Unlike today, where the silicon is disabled at the fab by burning the affected portions with a laser, which makes unlocking impossible.

      • hansmuff
      • 5 years ago

      The 970 debacle is fairly unique. Overclocking a GPU is not going to yield crazy improvements, then again it’s ‘free’ save for your time but as hobbyists, time hardly counts.

      Paying less and overclocking can oftentimes get close to the next higher part without the cost. Maybe it’s not so much fps, but rather $$ saved assuming you compare to a non-oc higher card. There are always edge cases where the higher card is better, but you can get close. And that is what it’s about.

        • Krogoth
        • 5 years ago

        It is becoming less true as AMD/ATI continue to bin GPUs by disabling portions of the silicon rather than lowering the clockspeed.

        You have to achieve a 15-20% clockspeed gain to break even with a higher-bin part at stock, and said parts are running close to their clockspeed ceiling.

      • auxy
      • 5 years ago

      I got a 50% clock gain on my GTX TITAN. Naff off with your baseless rhetoric. ( `ー´)ノシ

        • Meadows
        • 5 years ago

        No wonder you’re so full of yourself, you bought a waste of money.

          • auxy
          • 5 years ago

          Cute, but I didn’t buy it. JohnC from these forums gifted it to me purely for the asking. You can look in the forum archives and see the conversation yourself! (*‘∀‘)

        • Krogoth
        • 5 years ago

        40-50% gains are rare these days unless you got a golden egg. Nvidia/ATI already push their silicon hard at stock and there isn’t really that much headroom.

        It is not worth it either, as you have to pump a ton of volts, and this tends to overtax the MOSFETs and voltage regulators on the GPU board. You usually end up with a premature death. I have seen so many horror stories in recent years.

        Again, GPU overclocking has regressed into being nothing more than epenis olympics.

          • auxy
          • 5 years ago

          You need to learn that the plural of “anecdote” is not “data”. You’re correct in that, just like overclocking CPUs, overclocking GPUs depends a lot on the sample you get, although in the case of the TITAN specifically, most of them do ~1.2Ghz without a lot of drama. I only put +78mV into mine, and temps were just fine on the stock cooler, although I did re-paste it.

          Saying something like[quote<]Again, GPU overclocking has regress into being nothing more than epenis olympics.[/quote<]is just conscientious ignorance.

            • Krogoth
            • 5 years ago

            It is more like you need to learn that yourself.

            Making bold claims that a 50%+ gain is easy makes you look silly. Even with said gain, your highly overclocked unit isn’t that much faster than a 780Ti or Titan Black at stock. You couldn’t tell the difference between your unit and a 780Ti or Titan Black without resorting to epenis benchmarks, outside of graphical artifacts and stability issues.

            • auxy
            • 5 years ago

            I had about 15% on a 780Ti at stock clocks, per Dark Souls II 4K gameplay. No graphical artifacts or stability issues.

            I went from 837Mhz to 1202Mhz by flashing a new VBIOS and then flipping some toggles in EVGA PrecisionX. All total, about 20 minutes of work. (I spent more than that trying to take it higher, but it wasn’t going to happen without more voltage than I was willing to apply.)

            Your argument is dead in the water. Give it up, Krogoth.

            • Krogoth
            • 5 years ago

            You may not experience problems for now. Try again in six or more months down the road.

            That’s typically when silicon, memory, MOSFETs, and voltage regulators start to fail under the stress.

            GPU overclockers typically have to kick things back down to keep the card stable.

            • auxy
            • 5 years ago

            I ran that card that way for over a year, and it was still working just fine when I sold it. I’m using a 290X at 1150Mhz right now.

            • Firestarter
            • 5 years ago

            3-year-old HD7950 at 1050MHz says hi. And that’s a conservative overclock compared to some. Point is, by overclocking it I’ve had the pleasure of 31% increased performance for going on 3 years now. Even if it blows up right now because of that higher clock, that’s still a lot of time that I enjoyed the benefits of overclocking. But chances are that it won’t actually explode and that it will serve me for years to come until its technology is completely obsolete.

            • kuraegomon
            • 5 years ago

            I’m going to guess that auxy is well aware of the risks of overclocking. The voltage hit she applied wasn’t excessive, and I’m quite sure she’s capable of monitoring gpu temps, tyvm.

            But shifting your point of contention without acknowledging that she made a comprehensive response to your last objection is childish, and I’m going to go ahead and call you on it. You’re better than that. It wouldn’t have killed you to start your response with “fair enough”.

      • Firestarter
      • 5 years ago

      my overclocked HD7950 disagrees

        • Krogoth
        • 5 years ago

        There are a large number of other HD 7950 owners who are not as fortunate.

          • Firestarter
          • 5 years ago

          Are you making things up now or are you going to strengthen your argument by providing a reputable source for that?

      • vargis14
      • 5 years ago

      I am sorry, Krogoth, but I and many others have been overclocking our graphics cards with very good gains in performance, from 780s and 780 Tis down to the lowly HD 7750, making the difference between running at 45fps at stock speeds and 60fps OCed.

      With you being anti-OC, have you actually OVERCLOCKED your cards to reap the benefits?

      Also, before you condemn overclocking, I think a survey in the forums on people’s experience in the matter might shed more light on the whole “should I or shouldn’t I overclock?” question.

    • HisDivineOrder
    • 5 years ago

    When cards are all priced the same and the performance is identical, it seems you’d just have recommendations for:

    1) Best Heatsink/Cooler for overclocking
    2) Quietest Heatsink/Cooler at stock
    3) Smallest card
    4) Most extras (an extra DVI could come in handy for the budget gamer)

    Seems like for most of those, Gigabyte wins. For the one that it doesn’t win, Asus wins.

    • MadManOriginal
    • 5 years ago

    Newegg increased the price of the MSI Gaming 2G by $5. Bastards!

    • DPete27
    • 5 years ago

    Since I don’t feel in the mood to complain like many others here, I’d like to say “Thank you Scott, for taking your valuable time to further test these cards.”

    Also, as mentioned in the article, finding little/no differences between the cards may not make for an exciting article, but it certainly gives the buyer flexibility and confidence that there aren’t many/any “poor” choices out there.

    • mjallan123
    • 5 years ago

    You may want to change the award at the end…this is Feb 2015 😉

    • ALiLPinkMonster
    • 5 years ago

    So basically what you’re saying is… get the Asus because Asus.

    • Bensam123
    • 5 years ago

    Hey look, an R9 290 for $240…

    [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814127774[/url<] Interesting, this entire article is focused on efficiency and noise levels (against stock-cooler competition), and doesn’t have any price/performance charts. Maybe we should just skip the platitudes and make our way to dB/performance charts.

      • Damage
      • 5 years ago

      This is a follow-up article about the individual GTX 960 cards and how they overclock. The price/perf analysis is in the original review, linked in the article.

      Explain exactly what you are implying with your comment, so everyone here is clear, please.

        • derFunkenstein
        • 5 years ago

        Don’t worry, I’ll translate Bensam123 to English:

        “Clearly you’re in nVidia’s pocket because you’re ignoring bargain basement Radeons, using performance/watt as more important than just performance”

          • Damage
          • 5 years ago

          Yeah, and he’s way off point. I am tired of these accusations. We’re under no obligation to provide a forum for him. Considering a ban.

            • derFunkenstein
            • 5 years ago

            As you should be. I think TR does a pretty great job of being unbiased.

            $250-ish R9 290s are worth considering, but that goes beyond the scope of the article. If you’re reading this looking for “which GTX 960 should I buy?”, then you’ve already decided the 960 is for you.

            • Bensam123
            • 5 years ago

            I remember there was a time when TR didn’t abide by artificial product segmentation that companies want to stick to because that’s how they sell their cards.

            If an R9 290 provides 1.5x the performance for 20 dollars more, regardless of what sort of card it’s considered, then it would be recommended by TR. There was an entire article written on this and how Scott was pissed off at it over a decade ago, which I clearly remember him writing and talking about.

            [url<]https://techreport.com/review/5403/missing-the-point-of-products-and-market-segments[/url<]

            [quote<]But is it only an "enthusiast's choice" to pay an extra couple bucks to get a product that doesn't suck? Or to get a product that is a true thoroughbred rather than a one-trick pony? Oftentimes, a company's proclamation that its product is not targeted at enthusiasts is an attempt to avoid the scrutiny that comes with the attention of technically savvy consumers—and reviewers.[/quote<]

            A GF960 at the time wouldn’t even be considered a good deal or a worthwhile purchase (let alone awarded a TR recommendation) because they’re just too expensive for the performance they offer. Even putting aside an extra $20, you can get an R9 280X for $200, which is right in line with these cards (or, on a rare occasion, an R9 290X).

            Sure, you can review 960s and compare them to each other; there is nothing wrong with that. But the article was about more than just comparing the cards. It recommended them too, which I assume is in light of everything else on the market. Not just one of them, but all of them. This isn’t like ‘pay a little bit extra for a quieter PC’; even putting aside something like an R9 290, which can have a very quiet cooler, their cost/performance ratios are quite out of whack.

            Even if the article is niche and focused on one specific thing, the conclusion or summary ties that up. It doesn’t even mention other options available, which TR has done in the past even in roundups, especially if there are much better options available. For instance, “XYZ card is great if you’re bent on purchasing these, but I definitely recommend you consider other options currently available”. There wasn’t anything like that in the article.

            I’m not going so far as to say TR is biased, because it’s clear from the past that TR isn’t biased, but this article (and the last) is definitely pretty weird put into perspective. Other people in the comments here, as well as in the last article, have also pointed this out, although they were met with warmer receptions.

            • Ninjitsu
            • 5 years ago

            Cheapest 280X at the moment, without rebate is $240.

            [url<]http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600473876%204814&IsNodeId=1&bop=And&Order=PRICE&PageSize=30[/url<]

            • auxy
            • 5 years ago

            $219 final purchase price at this time. ($239 minus $20 giftcard).

            [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814125726&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=3938566&SID=[/url<]

            • derFunkenstein
            • 5 years ago

            still just a $20 difference between it and the 290. It’s like they don’t actually want anyone to buy a 280X.

            • Meadows
            • 5 years ago

            Are you kidding me. The R9 290 is the loudest card they’ve tested in recent memory. It also doubles the power consumption of the entire system.

            I wouldn’t take that even if it meant 1.5 times the performance.

            • auxy
            • 5 years ago

            This is a troll, right? Like you’re just trying to start an argument?

            I mean, you know that non-reference coolers for Hawaii are basically silent, right? And you know Hawaii easily has “1.5 times the performance” of GM206, right? [b<]Right?[/b<] (; ・`д・´)

            • derFunkenstein
            • 5 years ago

            not the second batch that had a non-reference cooler.

            • JumpingJack
            • 4 years ago

            [quote<]Are you kidding me. The R9 290 is the loudest card they've tested in recent memory. It also doubles the power consumption of the entire system. [/quote<] This is also a major reason AMD has been losing discrete market share to NVidia the past two quarters, and why the R290 cards are as cheap as they are. The market demands and responds to value vectors that conflict with Bensam123's and he appears to be both upset by it and ignorant of the reasons.

            • deruberhanyok
            • 5 years ago

            Why not state your point like this in the first place? Your opening post looks snarky and trollish, at best, especially in comparison.

            • Bensam123
            • 5 years ago

            Because if I did that, it would look like I’m berating the author for no apparent reason out of the blue, and it seems like every time I make a post like the above, my balls are on the line. Compare that with the original post, which isn’t so different from what other people are writing. The comments section has turned into Russian roulette for me, only I don’t win anything if I don’t get shot.

            I thought that post was more likely to offend than the first.

            • Damage
            • 5 years ago

            Look, this review was about GTX 960s, like I said. I didn’t recommend *against* the R9 290 here. I just treated the two products separately.

            That seems appropriate. We measured a 170W system power draw difference under load between them. The PSU requirements are massively different, as are the case cooling reqs. The custom-cooled R9 290s are also physically much larger than most GTX 960s, too. Many systems that will accept a GTX 960 just can’t host an R9 290.

            You seem to think we’re obligated to recommend against the GTX 960 because AMD has severely discounted the R9 290. I don’t think it’s that simple, given the vast differences involved. But I’m happy to recommend the R9 290 to those who want what it offers. It’s a great deal right now.

            Folks will choose one or the other based on their needs and wants. They aren’t far off in price, but they are different classes of solutions.

            Speaking of consistency, your M.O. clearly is more consistent than ours. We try to adjust to the dynamics of a changing market. You complain and insinuate bias whenever we say something good about AMD’s competition–even if it’s warranted–or aren’t as excited as you are about whatever AMD is doing. It gets old, and it has worn on my patience over time. If you can’t get it into your head that we can have an opinion different than your own (unwaveringly rah-rah Radeon) opinion without being “biased,” then I will drop the banhammer.

            • torquer
            • 5 years ago

            Boom!

            Give ’em the ol’ what for.

            Seriously, accusing the site of bias over and over with no reasonable basis seems like a great case for the banhammer. We all have our disagreements, but trying (even subtly and gently) to insinuate a breach of journalistic ethics just because you’re so consumed by fanboyism that you don’t want to read ANYTHING positive about other products is crap.

            Bansam is bad and should feel bad.

            • VincentHanna
            • 4 years ago

            Bleh, The dude should have the right to be wrong. Banning someone who isn’t being outright obnoxious or toxic because they happen to disagree with you, or because you lack the maturity to just ignore them is not a “good case for the ban hammer.”

            Everything posted since Damage’s second piling on goes in his own karmic laundry hamper, not Bensam’s.

            • Bensam123
            • 5 years ago

            Reading over the article I linked, which is what you wrote a decade ago, the two products wouldn’t be regarded as anything more than GPUs, unless you’re specifically looking for a corner of the market to recommend (like low noise and low heat requirements for an HTPC).

            Both you and I know that 170W doesn’t really mean anything. Maybe eventually on your power bill, someday, but for a PSU you’re likely to have a 550W-600W unit, which can be had for like $50 on Newegg. There are plenty of other people who also have small enclosures with powerful graphics cards and do fine; either way, that wasn’t argued, tested, or stated as the direction of the article: cards for small form factor PCs.

            You didn’t recommend against, but recommending something else (especially in great number; when has this ever happened?) seems quite odd. I don’t know how you can recommend them without adding a lot of context as well. If I were to paraphrase, I think in the past something like this would’ve gone: “These cards are technological marvels, we love what they’re doing with power efficiency, they’re great for small form factor PCs and if you have a discerning ear; but the market in its current form puts hard pressure on me to recommend these cards based on that alone. You can spend a few more bucks and pick up something that completely outclasses this card.” Maybe then give the cheapest one an award, as that is really what it’s all about at that point (since they’re all good cards, obviously price would win out).

            “But I’m happy to recommend the R9 290 to those who want what it offers. It’s a great deal right now.” Yet they weren’t talked about at all in the summary or at the beginning of the article to add context to things. Given how cheap they are, it seems pretty ridiculous that it wasn’t mentioned at all. Heck, even the deal on that 290X for $230, which isn’t ‘that’ rare, should give someone an idea of how often the prices drop on these cards, which is relevant to PC purchases. Both this article and the initial 960 review were like that.

            I get excited about deals; I was excited about the 970 when it was released, for the month before AMD dropped prices. If it looks like I am an AMD fanboi, that’s only because AMD has definitely been winning on prices, sometimes a little bit, sometimes a lot. Recently it has started turning into a lot, like with the R9 290.

            Really, everyone gets excited about deals at better price points. I assume that’s what the price/performance metrics are based off of and why they exist. Conversely, it does seem as though you don’t get excited at all when AMD does something that most people would find exciting, like a 290X at $230 or an 8310 at $90 (which I now own, and it does indeed have a turbo to 4.2 and also OCs easily to 4GHz). These are clear winners in almost everyone’s eyes (unless you make a niche case) and, even reading some of the comments, they’re persuading long-term Nvidia fans to buy them, which is definitely an indicator they’re worth pointing out.

            While it does appear you love hardware and love Intel and Nvidia for what they do, I would definitely say you don’t love AMD as much, which is really where the bias vibes are coming from. Whether because you hate AMD or love Nvidia/Intel, or maybe AMD just rubbed you the wrong way and you don’t like going out of your way to help them, it appears as though fair treatment isn’t being given to all vendors anymore. You report on AMD’s findings and on happenings in the market, but never actually go out of your way to talk about them or point them out without doing it begrudgingly so. This article had a cheery ‘happy’ vibe to the opening page and it was all about Nvidia, regardless of the current state of the market. The article was all about 960s, but the conclusion or the opening page should’ve had something regarding that.

            You know, I never once called you a fanboi, said you may be receiving kickbacks, said that you hate AMD, or threatened any sort of treatment on my end (I do indeed stream and make recommendations on my stream), yet what you give me on this side for making thorough arguments with points is a threat of a ban. I can’t even argue on here with other people without looking to get my balls chopped off because I carry the unpopular opinion (or popular in this case). I’ve been around this website for over a decade, and I don’t think I’ve ever once gone out of my way to gutter-stomp it.

            The terms you’ve thrown at me and argued for are exactly what you argued against in the article I quoted a decade ago (which I think you really should go back and read). And while maybe you see me as getting old for telling you how you used to do things and why it’s better, I thought I’d remind you as maybe you’d go back to them as that’s what I liked about TR.

            [url<]https://techreport.com/review/5403/missing-the-point-of-products-and-market-segments[/url<]

            • Meadows
            • 5 years ago

            Two fatal errors there.

            One: you continue to insist on shoehorning a product into a review that’s written about something completely different.

            Two: you actually recommended buying a power supply for $50. Of the >500 W variety, no less. I mean, yuk, man.

            • torquer
            • 5 years ago

            Don’t come off as a pedantic tool and you won’t be treated like one.

            You like AMD. Great. We get it. TR doesn’t need to buff your emotional ballsack with every article they write by including a reference to your favorite brand. There’s a reasonable assumption here that a buyer of video cards would read more than a single article before making their decisions, especially when that single article is a comparison of the variations of a single GPU. It’s not TR’s responsibility to make every damn article a comprehensive review of all current competing products just to pander to overly sensitive fans of one thing or another.

            • JumpingJack
            • 4 years ago

            [quote<]You like AMD. Great. We get it. TR doesn't need to buff your emotional ballsack with every article they write .... [/quote<] You had me on the floor laughing at this one... 🙂 funny.

            • NovusBogus
            • 5 years ago

            [quote<]Both you and I know that 170w doesn't mean really anything.[/quote<]

            Let’s assume someone uses the card 4 hours a day for three years, at an average cost of [url=http://www.npr.org/blogs/money/2011/10/27/141766341/the-price-of-electricity-in-your-state<]12 cents per kWh[/url<]: that 170W difference comes out to $89, which means the total cost of ownership on that 290 is going to be close to $200 more than the 960--that puts us in a different price tier, i.e. the 970 (though it admittedly draws more power than the 960). That’s why efficiency is so heavily touted these days.

            Don’t get me wrong, I’m not happy about AMD’s current state of affairs either. I bought AMD CPUs for 20 years, and sometimes I wonder if things would have turned out differently for my favorite underdog if I’d gotten one of those R&D jobs I applied for (or Larrabee, but that’s another matter). But there isn’t anything I can do about that except accept the current reality.

            edit: Amended an incorrect comparison. Also, for those following along at home, the TCO figure is $0.525/watt (4 x 365 x 3 x 0.001 x 0.12). Multiply that by total and/or differential power draw to find out how much your favorite GPU *really* costs.

            • Jason181
            • 4 years ago

            [quote<]Both you and I know that 170w doesn't mean really anything.[/quote<] As the owner of 6970s in crossfire, I beg to differ. The heat they generate is insane, and a real problem for me. My computer room will hit 90+ degrees Fahrenheit in a matter of minutes with them at full load. I am seriously considering purchasing something different solely because it'll be less of a heat problem.

            • VincentHanna
            • 4 years ago

            This isn’t so much a review of the 960 as it is a review of MSI, ASUS, EVGA, and Gigabyte. This is about 5 cards, all with near-identical specs, being compared on the design of their spiffy custom heatsinks (plus stock).

            For that reason alone I see no reason why TR should be comparing them directly to the 290 beyond doing what they did, which is to include that card right there in the chart as a reference point. The actual review of the 960 goes right into all that price/performance/dollars/watt 3d scatterplotting stuff. Why should they be recycling their old material for no reason?

            Imo I would personally prefer that recycling old articles to bump up page views be avoided. That’s just my opinion however.
            _______

            all that said, this article is fairly [s<]useless[/s<] benign due to the fact that the ability to overclock a properly cooled GPU is more about winning the silicon lottery than it is how many fans it has and whatnot, although comparing noise levels could(?) be useful to some people I suppose(?)

            • auxy
            • 5 years ago

            Yeah, I can’t get behind Bensam123’s methodology or presentation, but I also can’t deny that he has a point. I really can’t approve of giving these cards an award in the face of their price/performance proposition. (´・ω・`)

            [b<]Of course, my opinion doesn't really matter[/b<], but I hope you're aware that even some of the most fair-minded and level-headed among us are scratching our heads at this article. I don't accuse TR of bias, but rather, I just don't think what you value in a graphics card is necessarily what the enthusiast community at large values. (-。-)y-゜゜゜ Maybe I'm wrong; maybe the "performance at what cost" ideal is outdated and out of vogue, but I've never been one to care about trends anyway... (*‘∀‘)

            • auxy
            • 5 years ago

            Just to elaborate on my point further, I do remember when GPU reviewers started talking about power consumption as a major point of contention in graphics cards; it was around the time of Tesla and early Terascale.

            It had come up before, of course, with the 3dfx Voodoo5 requiring an internal power connector (and the 6000 requiring an external one, the Voodoo Volt), and later, the Radeon HD 2900XT having a massive power draw causing stability problems, but it was always an amusing footnote, or a “keep in mind” sidebar, or whatever — it was never something that really decided the result of the review. Not that I remember, anyway.

            But now, these days, it seems like it’s one of the main things that’s considered in a GPU review. I remember a lot of reviewers slating the HD 4870 and HD 4890 versus the more expensive GTX 260/270 cards because they were less power-efficient, and that’s not unlike what we’re seeing with GCN vs. Kepler and now Maxwell. (Full disclosure: I owned a GTX 260 core 216 at that time.)

            This isn’t just the case at TR, either; most tech sites are like this now, and it really bewilders me. Who even really cares that much about it? Yeah, you want to [i<]know[/i<], it [u<]does[/u<] need to be [i<]tested[/i<], if for no other reason than to keep these companies honest with their TDP ratings (and I have a lot of doubts about those even still) and so that you know how much power supply you need, but most enthusiast rigs are running 500W or more on the supply anyway, which will handle basically any single GPU (and if you're running dual-GPU, the power draw is probably the least of your worries behind motherboard compatibility, heat management, and so on.)

            When I see a reviewer say something like "yeah, the performance is good, but it sure does suck down the power", I don't understand it. I really don't! I don't know why this is relevant to the enthusiast hardware purchaser. For someone building a datacenter, for corporate and enterprise customers, sure, that's a big deal; save 50W per cluster and you save hundreds of dollars a month. But for enthusiasts? [url=https://www.youtube.com/watch?v=WDgq-K2oYLo<]No, man, no.[/url<]

            Maybe I'm "old-school", or behind the times, or maybe my use case is just not typical anymore, or maybe I was always mistaken, but I keep going over the facts and the data and I just don't see it. I don't see why I should care if my GPU draws 150W or 250W while gaming. I really don't. And more to the point, I don't see why I should be impressed that the GTX 960 can run off a single six-pin connector. I bought into the hype and got a 750Ti for my wife's PC and sure, it's neat how little it is and how cool it runs, but ... it's ... slow! Really slow! Impressive, perhaps, compared to integrated graphics, sure, but for what I paid I could have picked up a used GTX 580 that would blow it out of the water, or even a 670 or similar.

            Basically, what Bensam123 is not-so-eloquently expressing is the same frustration a lot of us feel and which coincidentally gets us labeled AMD fans (when that is absolutely not the case, in my case certainly -- I do think ol' Ben is a bit partial to the red team) -- this feeling that we're being sold a product on false pretense. We're being told "ah, this thing is great" when it's really not catering to our needs, demands, desires, or interests. It's like when an authority figure says they have your best interests in mind, but do they really? In this case, I don't really think so.

            I know there are guys among the crowd here who are going to fling poo at me for this post; derFunkenstein and probably Meadows and HisDivineOrder and maybe ol' chuckie, but these are my sincere feelings. I know why the emphasis at the big tech companies is on low-power; I know all about market forces and the extremely forced narrative of the great exodus from desktops; I know. [b<]I just don't care.[/b<]

            [sub<]I want 3dfx back. ( ;∀;)[/sub<]

            • RazrLeaf
            • 5 years ago

            I’d like to provide some counterpoint to your perspective.

            Back in the day, raw performance was all that mattered because it was a given that you had to deal with more power, heat, and noise to get more performance. Raw performance is also easily captured in benchmarks and real world testing, be it in FPS or in frame times. Price as well, as sites like PCPartPicker make it all too easy to see both current prices and pricing trends.

            And for an enthusiast that is a “old-school,” all that information is out there for them to make the decision (and I wager old-school enthusiasts have the know-how to do so). Find a review site that has a test bench similar to your setup, and go from there.

            However, the biggest improvements in recent generations have not been in raw performance. Sure, we get a bit more each generation, but the biggest change has been in efficiency. It’s no longer a given that high performance requires high power, heat, and noise. And there are many enthusiasts whose expectations have changed. It’s not just about getting the most raw performance per dollar anymore. The expectations and priorities of reviewers (and consumers) change with each generation of products, and now that the option of having a high-performance, peaceful, and cool computer exists, more and more users will strive to achieve that, and it brings in the other criteria that reviewers use to assess new products.

            It also doesn’t help that AMD products offer better raw performance per dollar while Nvidia has products that have an efficiency level that trounces AMD. But I love it because it gives us, the end consumer, a choice in what we want to pay for. We can choose with our dollars what we value most.

            And by the way the last quarter (and year) has gone for AMD and Nvidia, it seems to be that more people are valuing efficiency and silence. So it would only make sense that reviewers care about those things too.

            • auxy
            • 5 years ago

            I would argue that the reason people are valuing efficiency has more to do with the fact that they’re being told to do so. (‘ω’)

            • I.S.T.
            • 5 years ago

            Myself? I value it because I try to create a PC that is high performance without burning too much power. I’ve been concerned about electricity usage for years, and if I can get equal or more with less… Well, I’m gonna do it.

            • Bensam123
            • 5 years ago

            Also, yup… If a website just throws a bunch of graphs up and makes people look at them and says ‘this is why these cards are awesome!’, that’s what people are going to spit back out. I’ve had people come to my stream and say the same thing. The most recent example was with the 970, where people were buying it based off DSR alone, which was super hyped up (and which AMD now has on their cards).

            People generally do what they’re told, especially if it’s been told by a popular figure or some sort of reputable source. They won’t question it, and they’ll assume everyone else that’s not reputable or unpopular has the wrong opinion. That’s just human nature unless you point it out and add perspective.

            • NeelyCam
            • 4 years ago

            I value efficiency because it reduces noise.

            • derFunkenstein
            • 5 years ago

            Thing is, it *is* great. At different stuff. Whether or not that stuff matters is already decided before going into this particular article.

            Also Nvidia bought out 3dfx. You can have them back, by buying a GeForce. :p

            • auxy
            • 5 years ago

            I still don’t understand why NVIDIA doesn’t make use of that Voodoo brand name! They could gain so much cred within the industry.

            • derFunkenstein
            • 5 years ago

            Maybe, I dunno. Does that brand still have any fond memories for a lot of present-day gamers?

            I loved my Voodoo Banshee so much that when I upgraded I got a Voodoo 3 2000 PCI that died a premature death and got replaced by a Savage 4 – a huge mistake – so when I upgraded again I got a Voodoo 5 5500 AGP. For me, I’d buy a Voodoo for irrational reasons.

            • auxy
            • 5 years ago

            I know I’d be real hard-pressed not to irrationally overspend on a card branded with “3dfx” or “Voodoo” these days. ( ;∀;)

            • JustAnEngineer
            • 5 years ago

            NVidia did not buy out 3Dfx. They bought most of the intellectual property (and engineering staff) after 3Dfx entered bankruptcy, leaving not much behind.

            • derFunkenstein
            • 5 years ago

            [url<]http://www.x86-secret.com/articles/divers/v5-6000/letter_to_customers.htm[/url<] They own the brand.

            • Krogoth
            • 5 years ago

            I’m kinda surprised that Nvidia hasn’t resurrected the brand name outside of SLI.

            • MathMan
            • 5 years ago

            They own the brand… which is a major piece of IP.

            • jihadjoe
            • 4 years ago

            That was actually a very smart way of “buying out” 3Dfx. They managed to get the good bits (the IP and the engineers) while leaving out the bad (debt).

            So technically they didn’t “buy out” the entire company, but arguably they did so, and in a brilliant manner to boot.

            • PixelArmy
            • 5 years ago

            If 150W GPU1 performs the same as 250W GPU2, you are short-changing yourself the equivalent of 100W worth of GPU1 performance.

            For a more concrete example: GTX 970 SLI vs. the rumored R9 380X (2 x 145 W < the rumored 300 W). The R9 380X had damned well better be 45% faster than a single GTX 980, or else you should be running the GTX 970 SLI setup, which [url=https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_970_SLI/20.html<]really is 45% faster than a GTX 980[/url<]. If not, why aren't you running the faster config!? I thought we cared about performance!?!

            So in reality, you don't care about performance either; you care about [i<]price[/i<]/performance at your own personal performance threshold. Keep in mind, though: the market will affect prices, but the efficiency won't change.
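
            To make the arithmetic above concrete, here's a minimal perf-per-watt sketch in Python (the 1.45x scaling figure is from the linked TechPowerUp review, the 300 W figure is just the rumor cited above, and all the values are illustrative placeholders rather than measurements):

                def perf_per_watt(perf, watts):
                    return perf / watts

                # Performance normalized to a GTX 980 = 1.0 (illustrative numbers only)
                sli_perf, sli_watts = 1.45, 2 * 145   # GTX 970 SLI: ~45% faster, ~290 W combined TDP
                rival_watts = 300                     # rumored board power of the hypothetical competitor

                # How fast the 300 W card must be to merely match the SLI setup's efficiency
                required_perf = perf_per_watt(sli_perf, sli_watts) * rival_watts
                print(f"Needs to be {required_perf:.2f}x a GTX 980 to match perf/W")  # -> ~1.50x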

            • auxy
            • 5 years ago

            You don’t understand SLI very well, do you? (‘ω’)

            • PixelArmy
            • 5 years ago

            I was speaking in generalities. If it makes you feel better, imagine scaling Maxwell up to a single-GPU 300 W part.

            But by all means, enlighten me.

            • Bensam123
            • 5 years ago

            Yup… I don’t normally read other tech websites, so I didn’t know they were also doing this.

            • MathMan
            • 5 years ago

            I don’t understand those who want 3dfx back.

            At one point, they were king of the hill, but the architecture of their chips was pretty terrible. They had fixed DRAM allocations for the frame and Z buffers and for the texture units instead of a unified memory pool. They made the inane business decision to start making GPU boards themselves. They missed the DirectX boat. They were chronically late with their new products.

            The only reason they were successful at first is because others were worse. Once the others cleaned up their act, they were toast.

            • auxy
            • 5 years ago

            You’re not looking at things from a historical perspective. They were THE FIRST to bring that kind of 3D hardware down to a price point where it made sense for gamers, and at that time, a non-unified memory model was the norm!

            [b<]Let me be clear: nobody sane, rational, and informed wants the 3dfx of the Voodoo5 era back.[/b<] But before that, especially in the Voodoo2 era, there was just nothing that even came close. If you wanted the best performance, you got a 12MB Voodoo2 -- and if you wanted to go nuts, you got a second one. Lots of games [i<]required[/i<] a 3dfx card to run, so having one was sort of a badge of pride and the mark of a serious, hardcore gamer.

            In that same vein, 3dfx's marketing was legendary. They were one of the first companies to openly push 60fps as the gold standard, and ludicrous ads proclaiming "So fast, it's kind of ridiculous" were ground-breaking for their time. It didn't hurt that the performance of 3dfx's products (even including the somewhat-outdated-before-its-time Voodoo3) was always stellar.

            What people like me wish for is not 3dfx's technology or 3dfx's business but 3dfx's [i<]personality[/i<]. 3dfx wasn't afraid to make an end-around existing technologies and techniques to produce an interesting result (read about the T-buffer, or Analog SLI on the Voodoo5, or the post-processing box filter applied to the Avenger's 16-bit output.) More than that, though, we miss the attitude that nothing was ever fast enough; the focus on raw performance over anything else, no matter how ludicrous it might seem. (Sure, I'll add two more PCI boards that are larger than anything else in my machine and only work while gaming, with some games.) It's a wistful feeling not unlike the old men who still pine for the over-500-cubic-inch big-block engines of the heyday of muscle cars; the "the only solution is more power" attitude.

            • MathMan
            • 5 years ago

            You're looking for a company with [url=http://blogs.nvidia.com/blog/2014/01/05/salinas-crop-circle-and-project-192/<]edgy[/url<], [url=https://techreport.com/review/25712/are-retail-radeon-r9-290x-cards-slower-than-press-samples<]aggressive marketing[/url<]. One that doesn't take things lying down. One that pushes high frame rates, and provides monitor overclocking features in their driver, or, even better, isn't afraid to make an end-around existing technologies and invent stuff like variable frame rate monitors? One that has never backed down from making ridiculously large dies even when the competition retreated to the sweet spot? One that has a 600mm2 die waiting in the wings yet pays enough attention to detail to outfit them with high-quality reference coolers? One that worked around the dead end that was the original SLI (because of render-to-texture) and invented a completely different kind of multiple GPU rendering which had the competition scrambling? One that wasn't ashamed to put not one but [url=https://techreport.com/review/20629/nvidia-geforce-gtx-590-graphics-card<]two power-hogging dies on a GPU[/url<], even though it's completely ridiculous?

            Is it just me, or did you just describe Nvidia?

            It's great that their current GPUs are power-efficient as well, but I've never associated Nvidia with the gentleman doofus image of AMD. It's probably not a coincidence that Nvidia has a lot of 3dfx DNA in its veins.

            • auxy
            • 5 years ago

            [super<][i<]edit:[/i<] I wanted to edit this post to clarify in the beginning the whole point of it. MathMan's point is that 3dfx fans should accept NVIDIA as the 'new 3dfx', and I'm pointing out that NVIDIA is really nothing like 3dfx, for many reasons. I'm also being very silly at the end. The original post, unedited, follows:[/super<]

            NVIDIA didn't invent "variable frame rate monitors", and they weren't the first to put two ridiculous power hog chips on one board. (See: Voodoo5 5500, and the 6000) NVIDIA didn't invent "a different kind of multiple GPU rendering" -- they actually borrowed the idea from the people who invented it: ATI. NVIDIA Geforce Experience optimizes games to hit 30 fps, not 60. NVIDIA's marketing is almost nonexistent compared to 3dfx's, who took out full 2-page magazine spreads and even TV spots.

            NVIDIA isn't a "gentleman doofus", no, but then I don't really know where that came from? I didn't say that, and I wouldn't say it about AMD. AMD is more like an idiot savant 12 year old at this point -- full of spunk and gusto, but short on ideas and incredibly obnoxious to actually spend time with.

            To me, NVIDIA is a dudebro, with his nice job and fast car and spiked, platinum blonde hairdo with frosted tips, and his wraparound shades, who nobody thinks is as cool as he does. He's Johnny Bravo; he's the man that he thinks every man wants to be, and while he might be attractive on a first glance, he quickly loses his lustre once you talk to him.

            "I've got some great stuff going on with Tegra, you know?" he says wistfully, trying to force the image of being thoughtful and romantic, but you know nobody cares about Tegra. "What about GeForces," you ask quietly? "Yeah, heh. I got GeForces. Check this out," he responds, opening the hood of his heavily tuned car, to reveal a 1.7L twin turbo I4. "It gets 32 miles to the gallon. AND it does 0-60 in six seconds. But even better?" He grins. "My stereo goes to 11."

            You smirk outwardly and sigh internally, knowing this douchebag will never change.

            • MadManOriginal
            • 5 years ago

            #InferiorityComplex

            • torquer
            • 5 years ago

            So that makes you the jealous hater next door who talks crap about his rich neighbor. Maybe you should try harder to achieve something instead of tearing down others who are successful.

            Sad to see someone have such blind hatred for a company over nothing. Why all the venom? AMD and Nvidia can coexist and we need them to.

            • auxy
            • 5 years ago

            … are you being serious right now?

            I don’t have any “blind hatred”, or any hatred at all, for either AMD or NVIDIA. What are you talking about?

            Frankly, I think you’re projecting. There’s nothing to be found of anything you’ve mentioned here.

            • torquer
            • 5 years ago

            I’m sorry, I guess douchebag is a term of endearment on your planet.

            • auxy
            • 5 years ago

            Welllll, yeah, sorta! (‘ω’)ノシ When used in a facetious manner, as I did above. People do stuff like that all the time; they call their friends jerks, idiots, and various other, ruder things I can’t enumerate on this forum.

            I mean, everyone knows this guy, right? The guy who has nice things through no talent or merit of his own; things he fell into, like a job he lucked into because he knew a guy, which got him the money he needed to buy things he doesn’t really understand the value of; he just got it because he heard it was ‘the best’. He values things that don’t really matter and he considers himself “alpha” because he doesn’t understand the concept of humility but readily cedes leadership to people with real self-confidence.

            I’m not saying NVIDIA has no talent or merit of their own, mind you; the above paragraph was not a characterization of NVIDIA but of a personal archetype I’ve come into contact with a lot over the years. (A lot of these guys seem to have Asian fetishes…) I do sort of see NVIDIA that way — obsessed with glitz and flash, enthusiastic about gimmicks nobody cares about — but I was really just being silly.

            If you sincerely thought any of my prior post was genuine vitriol, you REALLY need to step back from the keyboard for a while and examine your priorities. (´・ω・`)

            • MathMan
            • 5 years ago

            Let me guess: Nvidia invented multi-monitor surround because AMD was the first one to announce it?

            • auxy
            • 5 years ago

            What?

            • MathMan
            • 5 years ago

            It’s the only logical conclusion: Nvidia showed SLI and GSYNC first, so AMD must have invented it. And vice versa…

            • auxy
            • 5 years ago

            … what?

            • MathMan
            • 5 years ago

            Let me spell it out for you…

            > NVIDIA didn’t invent “variable frame rate monitors”,

            If they didn’t, then who did? Do you think it’s a coincidence that it took a year between Nvidia’s announcement and the availability of the first samples of scaler chips that support adaptive sync? And another six months after that until production scalers are expected to be available? That’s exactly the kind of timeline that’s required to create new silicon after an ‘oh shit’ moment.

            > and they weren’t the first to put two ridiculous power hog chips on one board. (See: Voodoo5 5500, and the 6000)

            Precisely my point. And since Nvidia bought the 3dfx assets and personnel, it’s no surprise that they continue this way.

            > NVIDIA didn’t invent “a different kind of multiple GPU rendering” — they actually borrowed the idea from the people who invented it: ATI.

            Nvidia announced SLI in 2004, and it was fully supported in silicon. ATI’s Crossfire showed up in 2005 with really bad support, and it required the hack of an external dongle. Looks like another ‘oh shit’ moment to me.

            > NVIDIA Geforce Experience optimizes games to hit 30 fps, not 60.

            Yes. And they also add options in the driver to allow people to overclock their monitors beyond their maximum rated specs. What’s your point? (People actually use GeForce Experience?)

            > NVIDIA’s marketing is almost nonexistent compared to 3dfx’s, who took out full 2-page magazine spreads and even TV spots.

            Magazines? TV spots? Hello? We’re 2015 now. They will miss out on octogenarian PC gaming fans by not wasting money on magazines and TV spots, but I don’t think that’s such a huge loss. And one company that makes ads like The Fix3r is already too much for the world.

            As for the way they project themselves, call it dudebro if you want, but it has at least one thing most dudebros don’t have: it’s based on the confidence of having real technical superiority.

            • auxy
            • 5 years ago

            [url=http://patents.justia.com/patent/8878825<]Qualcomm invented variable refresh.[/url<] Check the date. [b<]You are wrong.[/b<]

            NVIDIA was not innovating by using multiple chips on one board. They also weren't specifically continuing any 3dfx legacy, since other companies had done the same thing: Scalable Link Interface uses Alternate Frame Rendering, which was pioneered on the ATI RAGE FURY MAXX back in 1999. Whose 'oh shit' moment is it, again? [b<]You are wrong.[/b<]

            [quote="you"<]You're looking for a company with edgy, aggressive marketing.[/quote<]Where? You linked to a couple of GPU reviews. I don't see any marketing going on there. NVIDIA has never run TV advertisements nor done much direct marketing at all. [b<]You are wrong.[/b<]

            [quote="you"<]You're looking for a company (...) that pushes high frame rates[/quote<][quote="me"<]NVIDIA Geforce Experience optimizes games to hit 30 fps, not 60.[/quote<]They encourage their users to run games at 30fps, not 60fps. [b<]You are wrong.[/b<]

            [quote="you"<]Magazines? TV spots? Hello? We're 2015 now.[/quote<]Yes, taking things out of context makes you look like someone who can't understand context. [b<]You are wrong.[/b<]

            I could go on but there's really no point. You are literally wrong on every point and you probably won't even admit it. You look like a noob and your bias is evident. And [i<]learn to use the damn forum software![/i<] Quoting with > makes you look like a chantard.

            • tap22sf
            • 5 years ago

            Just FYI…US 20140092113 A1

            • MathMan
            • 5 years ago
            • tap22sf
            • 5 years ago

            I read the forums here often (I assume Scott knows this:). Just thought I would say thanks. There is a lot of crazy thinking in this thread and sometimes it depresses me. But then there is your comment, and a moment of clarity.

            TAP

            • NovusBogus
            • 5 years ago

            I actually do agree that giving the 960 an award in a contest between a bunch of 960s sounds like something El Presidente would do, and both the 960 and the rest of the $200ish price point are a little underwhelming compared to what’s going on both above and below it. But, as noted, the tech version of creative accounting offers nothing useful.

      • PixelArmy
      • 5 years ago

      *clicks link*
      sees $260 card after MIR
      continues to ignore Bensam123

    • Milo Burke
    • 5 years ago

    I notice all of these received the TR Recommended award for February 2016. That’s pretty bold considering we haven’t seen the R9-300 series yet.

      • Milo Burke
      • 5 years ago

      Four downvotes in as many minutes? Let me provide a direct quote from the conclusion page, right beneath the recommended logo:

      [quote<]Asus Strix GTX 960
      EVGA GTX 960 SSC
      Gigabye Windforce GTX 960
      Gigabyte G1 Gaming GTX 960
      MSI GTX 960 Gaming 2G
      February 2016[/quote<]

      I'm just recommending he change it to February 2015. =]

        • Philldoe
        • 5 years ago

        The nvidia shills are out in force today.

          • Milo Burke
          • 5 years ago

          Only the ones who rolled a low reading comprehension check.

        • geekl33tgamer
        • 5 years ago

        That always happens these days. The TR community isn’t what it used to be.

          • derFunkenstein
          • 5 years ago

          It didn’t use to have downthumbs.

        • derFunkenstein
        • 5 years ago

        It was probably just two guys – one with a gold sub, and the other Bensam123. :p

      • Meadows
      • 5 years ago

      What’s the worst thing that can happen? The R9 300 will get the award for March.

        • auxy
        • 5 years ago

        [url=http://i.imgur.com/c9BkaLp.gif<][i<]didnt_read_lol.gif[/i<][/url<]

    • torquer
    • 5 years ago

    Amusing that so many people complain about RAM and post-1080p performance when the vast majority of people game at 1080p these days.

    Nvidia has rarely been the most cost-effective in terms of price-to-performance ratio. But for the last couple of years they’ve had the best overall product in most of their product categories – meaning good performance per dollar, great performance per watt, excellent cooling, low noise, etc.

      • derFunkenstein
      • 5 years ago

      A good rule of thumb for a while now has been to buy a GPU that costs roughly as much as, or maybe a bit more than, the monitor. With 1080p IPS displays being ~$175-200, I’d look at a 960 (among other things, including discounted Hawaii cards) for 1080p gaming.

      For 2560×1440, I paid $300 for the monitor and many of them push up towards $400. People really expect a $200 GPU to drive it at max detail? Hardly. A 290 would do it for less than the monitor’s price, and a 970 would be basically as fast, or maybe a touch faster, for a lot more money. I’d probably go with the 290 here.

      That also highlights just how low AMD has to price Hawaii-based silicon right now. Nvidia is dominating a market where AMD really has better price and performance, for some reason.

        • torquer
        • 5 years ago

        That is an interesting point. Somehow Nvidia was able to gain a majority of the market mindshare, but to be honest I don’t really know how. When you have a good percentage of the enthusiast community that is unimpressed with all of their custom tech (PhysX, G-Sync, etc.), vilifies their TWIMTBP program, and can’t understand design choices like the 970’s memory access, it’s a wonder AMD isn’t doing better than it is.

        I personally care about heat and noise so I have a GTX 980 with the Gigabyte Windforce cooler. I don’t really think the market at large cares that much though.

        If you look at the market-share graphs, you see Nvidia and AMD have been playing tug of war over less than 10% of the total market for a long time. Nvidia always has 55-65%, AMD has the rest, and they go plus or minus about 5% with each refresh cycle.

          • derFunkenstein
          • 5 years ago

          I think the explanation could be as simple as the community that has all the gripes you mention must be the extremely vocal minority. It could also be they’re not particularly bright.

            • torquer
            • 5 years ago

            The most well deserved thumbs up so far this year!

          • Klimax
          • 5 years ago

          Or maybe the general market cares about noise, heat, and power consumption, be it due to the cost of electricity, a lack of AC (many homes here don’t have AC), or running on a UPS. ETA: Or they just like silence. After all, many cities and job environments are quite noisy. It seems people want a calm environment to relax in.

            • torquer
            • 5 years ago

            True. It is important to me personally. I guess I’m just surprised by that level of reasoning by people who seem to value fanboyism over just about anything real.

          • DrCR
          • 4 years ago

          It’s due to the increasingly exclusive use of Linux, and Nvidia’s far better Linux driver.
          ..
          or something like that 😉

        • Ninjitsu
        • 5 years ago

        Hmmm, I don’t know, I still feel that the 970 is just enough for non-adaptive refresh gaming at 1080p.

    • Jambe
    • 5 years ago

    That Asus unit is pretty small; it’d even fit in some 10-liter cases. Given that these things seem pretty efficient, I’d be even more interested in purposefully diminutive offerings (e.g. these two: [url=http://www.evga.com/Products/Product.aspx?pn=02G-P4-2962-KR<]EVGA's[/url<] and [url=http://www.gigabyte.com/products/product-page.aspx?pid=5369<]Gigabyte's[/url<], 173 and 181 mm long, respectively, compared to the 216 mm of the Asus, which is the shortest card in this review by a considerable margin). I wonder how they'd fare noise-wise, since they scale down to a single fan.

      • llisandro
      • 5 years ago

      SPCR just reviewed the 960 Strix in a Fractal R5 with fans on low. It’s 20 dBA with fans on full, but they show that the dual fans spinning at 1120 RPM achieve the same cooling temps during a Prime95 + FurMark run, and the card is basically silent in that config. So unless you’re going for dead silent, I think you have a good shot with one fan.

      I own the 750 Ti version of that EVGA card you linked, and the fan is really quiet. Obviously it’s a different TDP class, but the 960 bumps up to a much beefier heatsink than the 750 Ti while keeping what looks like a similar fan/shroud.

      [url<]http://www.silentpcreview.com/Asus_Strix_GTX_960[/url<]

      edit: the 173 mm card is hilariously small -- my HR-02 is bigger: [url<]http://i60.tinypic.com/2duhxn4.jpg[/url<]

    • USAFTW
    • 5 years ago

    The only thing I can find myself complaining about is that Nvidia chose to go the same-performance-at-lower-power route rather than what I ideally want to see, which is more performance at the same or slightly higher power.
    I mean, the cards, even when clocked much higher, are barely any faster than a GTX 760. Why not take the power efficiency and use it for moar powah?

      • Damage
      • 5 years ago

      Who says they didn’t?

      [url<]https://techreport.com/review/27067/nvidia-geforce-gtx-980-and-970-graphics-cards-reviewed[/url<]

        • auxy
        • 5 years ago

        While I find NVIDIA’s emphasis on efficiency to be a bit overbearing at times, there can be no denying that the performance of GM204 is explosive. GM200 will be a sight to behold; I wonder how it will withstand Mt. Fiji’s apocalyptic eruption…? (´・ω・`)

        • Firestarter
        • 5 years ago

        I think he means higher clocks, instead of a bigger GPU. Both are valid ways of achieving higher performance and both come at the cost of power draw, but all else being equal the graphics card with the smaller GPU will be cheaper, as the GPU itself will be cheaper to manufacture.

        The answer to USAFTW’s question is probably “they couldn’t”, as in the GM206 hits a wall where higher clocks require disproportionately more power, necessitating a bigger GPU. I have no idea whether that actually happens with these cards, but I guess some extreme overclockers will find out soon enough.

        • MadManOriginal
        • 5 years ago

        I think the power efficiency is impressive as well, but I was hoping that it would translate into huge performance gains at the same power draw as Kepler GPUs, rather than smaller increases at lower power draw.

    • auxy
    • 5 years ago

    Okay, I’ll be that girl: [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814125726[/url<] It's not the 285 this thing competes with... Even a Hawaii chip is just $40 more at $249...

      • JustAnEngineer
      • 5 years ago

      Radeon R9-290 4GB at $250 is a better value than any NVidia-based graphics card.

        • Milo Burke
        • 5 years ago

        It is, definitely. Irritatingly so because it’s loud and hot, and I want a TR review of Adaptive Sync before I’d consider a new non-Nvidia GPU. Not saying it will be bad, just saying we don’t know.

          • sweatshopking
          • 5 years ago

          Not all of them are. My MSI 290 Twin frozr isn’t loud or hot. Had a reference 290x on a computer I sold and that thing was LOUD. I’d highly recommend the 290TF though. Very nice card.

          • Damage
          • 5 years ago

          Yeah, the R9 290X cards we reviewed with custom cooling are quiet and don’t get too hot. They generate a lot of heat, but that’s to be expected with a high-end card.

          [url<]https://techreport.com/review/26092/custom-cooled-radeon-r9-290x-cards-from-asus-and-xfx-reviewed[/url<] The R9 290 is just a crazy-good deal right now. It's also seriously not the same class of solution as the GTX 960 at all. Weird time for the market, but not a bad time to buy.

            • Milo Burke
            • 5 years ago

            Thank you, but I remember that article of yours. Aftermarket coolers killed the reference version for cooling and noise, no question.

            Yeah, the 290 is a scary good deal. If more Adaptive-Sync monitors were out/cheap, and if you had an article proving Adaptive-Sync to be a good experience, I’d buy one up this week. I’ll wait for more developments and your following articles before pulling the trigger.

            Thanks for all you do. You provide clarity to computer hardware and gaming.

            • cynan
            • 5 years ago

            I’m still gobsmacked that AMD released their reference Hawaii cards with such woefully inadequate coolers. It’s like there’s some unspoken rule that all reference coolers must be blower style (maybe there is some secret arrangement with third-party card manufacturers?). Yes, sure, everything else being equal, a blower-style, rear-exhausting solution is a safer reference bet, as it is more compatible with all usage models. But when the bloody thing doesn’t do the job, that logic becomes mighty hard to follow.

            • Chrispy_
            • 5 years ago

            I hope AMD spends more effort on reference coolers with the 3xx series.

            The 970 and 980 are 4K-ready cards that can be put in a variety of cases, ranging from mITX shoeboxes to low-profile HTPCs using a riser, because their blower versions are cool, quiet, and easy to manage airflow-wise. Cool air goes in at one end and hot air gets discarded straight out the back with no fuss, no internal case-temperature drama, and no complicated planning required.

            By contrast, many of the compact or unusual form factors popping up all over the place these days are woeful when it comes to dealing with open coolers. mITX, HTPC, Steambox, and portable LAN box are words that are here to stay. In fact, many of the best/most popular standard ATX towers from Antec, Nanoxia, Fractal, Corsair, and NZXT focus on foam soundproofing, airflow control, baffles, and general noise reduction at the expense of airflow. It’s REALLY HARD to cool a 250W card quietly if the case you’re using is full of foam and low-rpm fans and has only one exhaust above the processor.

            As much as it sounds like I’m singing the praises of Nvidia’s reference coolers, I’m not enamoured with Nvidia’s GPUs. They’re low value for money, I have serious issues with their drivers on multiple fronts (the limited-range 16-235 issue has only recently been fixed after something like 20 years!), and of course I don’t want to be limited to G-Sync screens when FreeSync is likely to be the more prevalent standard – DisplayPort *is* a standard, Nvidia’s expensive FPGA solution isn’t, and the G-Sync ASICs we were promised still haven’t materialised (I’m not entirely sure they ever will, given the licensing costs).

            Please, AMD, don’t make a lame cooler next time. GPUs are all about managing power, heat, and noise these days, so why go to huge lengths to design the silicon and then forget the other half of the equation?

            • sschaem
            • 5 years ago

            I have a reference R9 290X. No issues with the cooler, noise, or heat.
            I actually got the reference cooler because my PC has always been set up in “exhaust” mode.

            On my end, could the AMD exhaust cooler be better? Yes. Does it matter? No.

            Reality is that for power efficiency in anything single-precision, I think Nvidia will keep a significant edge, and so they can deliver quieter cards using less power, regardless of the cooling situation.

            Yet for an HTPC today I would want an R9 285, but sadly none come with an exhaust blower design.
            So I would go with my second choice, a GTX 960. In this case Nvidia wins my $$$ on the strength of the cooler, so I do agree that the cooling solution can matter.

            • thecoldanddarkone
            • 5 years ago

            They are totally a good deal. I got one at $250 without any rebates. It will be $220 after the rebate (I never count those until I get them).

            It definitely generates heat. My 3820 @ 4.3 GHz + the 290 = 510 watts under Prime95 + Kombustor.

            It’s pretty loud under that kind of load, but under actual gaming it’s totally reasonable. I was close to getting a GTX 960, but at $250 and then a $30 rebate I couldn’t resist.

            edit:

            I find GTX 960s amazing.
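
            As a rough sanity check on PSU headroom for a build like the one above, here's a minimal sketch (assuming the 510 W figure is an at-the-wall reading, and assuming ~90% PSU efficiency and a 650 W unit purely for illustration; none of those assumptions come from the comment itself):

                wall_draw_w = 510        # reported draw under Prime95 + Kombustor
                psu_efficiency = 0.90    # assumed efficiency at this load (not measured)
                psu_rating_w = 650       # hypothetical PSU capacity

                dc_load_w = wall_draw_w * psu_efficiency   # power actually delivered on the DC rails
                headroom_w = psu_rating_w - dc_load_w
                print(f"DC load ~{dc_load_w:.0f} W, headroom ~{headroom_w:.0f} W "
                      f"({dc_load_w / psu_rating_w:.0%} of rating)")
                # -> ~459 W load on a 650 W unit (~71% of rating), inside the usual comfort margin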

      • Philldoe
      • 5 years ago

      I agree. The only reason a person would buy a 960 over a 280X (or even a 290X, for that matter…) is if they have a hard-on for “low power usage.”
      Meh. Power usage is not even a metric for me. FPS/frame time and cost are the only real metrics. Power consumption be damned.

        • swaaye
        • 5 years ago

        Going to a 280 or 290 can also entail a PSU upgrade. And if anyone buys a 290 that has the original AMD cooler on it, they are probably going to be disappointed no matter how cheap it is. I’ve been there…

          • Philldoe
          • 5 years ago

          I doubt the PSU upgrade is really needed. TR’s own poll shows the VAST majority of TR users have more than enough PSU output for a 290X.

            • swaaye
            • 5 years ago

            That would be somewhat surprising, because whenever PSUs come up, people come out of the woodwork to push the adequacy of lower-output units. Adequate until someone wants a 300-watt video card…

            • Philldoe
            • 5 years ago

            It seems people talk more about “future-proofing” than anything. “Always buy a bigger PSU than you need, just in case,” etc., etc. I know I did for my last two PSUs. I bought an Antec 650 W way back when most people were buying 400 W units. I had that PSU for many years. Heck, it still works today. I’m sure my 850 W unit will last me just as long.

            • swaaye
            • 4 years ago

            That’s not the sentiment I’ve seen over the years on here. I suppose it varies by news article or something like that.

            Considering the anonymous voting weasel gang has spoken in your favor, I’m sure that almost every gamer dude out there is good to go for a 300W GPU!!!!

        • Milo Burke
        • 5 years ago

        Or noise. Or proven adaptive-sync that’s available today.

        I will be an extremely happy camper when the dust settles around adaptive-sync. Assuming AMD does it right, either nobody will buy Nvidia or Nvidia will have to make G-sync monitors more affordable.

          • tap22sf
          • 5 years ago

          NVIDIA does not control the price of monitors. ASUS/Viewsonic/Acer….monitor guys do.

            • tap22sf
            • 5 years ago

            BTW…who sets the price? Is it the monitor provider…the etailer?

            Maybe the customer? Actually, I am pretty sure it is the customer. Things priced too high don’t sell. Things priced too low sell out. In market economies (like ours), price is set by the willingness of consumers to buy. Variation from that price leads to excess or shortage.

            Just saying.

            • auxy
            • 5 years ago

            That still doesn’t excuse the fact that NVIDIA forces monitor manufacturers to include a $200 G-SYNC module for literally no reason.

            • torquer
            • 5 years ago

            I look forward to your article as you have clearly performed an exhaustive analysis of the differences between Adaptive Sync and G-Sync.

            I also look forward to a ride in your time travel machine since clearly you went back to the early part of last year and got ahold of some non-gsync adaptive display technology before it had been released.

            Whether coincidence or by design, Nvidia’s design came first and prior to it NOTHING existed on the market to achieve the same goal. Was it possible? Maybe. Was it in development? Probably. Did it exist on the market for sale? No.

            Nvidia’s solution is proprietary and that’s bad. But it works amazingly well and I’ve been enjoying it on my ROG Swift since that monitor launched in the US late last Summer. To date I have yet to see any FreeSync/Adaptive Sync monitors for sale though I know they are coming. I’m sure the future will be comprised of primarily non-hardware based solutions, but that is not the case at this very moment in the open market and hasn’t been up until this point.

            So, you are wrong. Your statement that Nvidia forces anyone to do anything that isn’t necessary to support adaptive sync technology *as of this moment* is wrong. In addition, the suggestion is absurd. While I’m sure Nvidia enjoys a profit on the modules required for G-Sync, it is hardly a huge source of revenue for them. You can count on both hands the number of currently available monitors with G-Sync, and across the greater market at large, I’d wager that the percentage of monitors utilizing it, even compared to the larger number of users with G-Sync-compatible Nvidia graphics cards, is extremely small.

            I would say the more likely cause for things happening the way they did was that Nvidia had the chutzpah and R&D dollars available to work up their own solution either in ignorance of adaptive sync being added to displayport standards or knowing it was far enough off that there was an opportunity to make this a “value add” for users who would end up choosing a graphics card on features rather than or in addition to performance.

            An analogy could be made with Mantle. AMD possibly knew, or possibly didn’t, that DX12 would likely solve or mostly solve the whole CPU-overhead “problem” in DirectX. Either in ignorance of or in anticipation of the more open standard, they created their own. It’s proprietary and that’s bad, but it works and is a value add for their users. Explain to me how those two things are different?

            In the end hopefully the open standards will be as good or better than their proprietary forebears, but I would remind all of the dear readers here that proprietary solutions very often precede (and hopefully positively impact) the development of open standards. I’ll point you to the old days of graphics cards where games had to be specifically coded for an individual 3D accelerator chip (PowerVR, 3Dfx, etc) prior to the advent of DirectX.

            • auxy
            • 5 years ago

            At this point it seems more like you were projecting with your remark about how I like to hear myself speak. (*´∀`*)

            But no, no, I’m going to be serious. Look, mobile G-SYNC and FreeSync exist; we KNOW that the G-SYNC module is NOT required for G-SYNC. NVIDIA says that there will be ‘experience differences’ between ‘real’ and mobile G-SYNC; PC Perspective tested it and found only one minor difference.

            You seem like you’re arguing FreeSync vs. G-SYNC (“open standards” vs. “proprietary”) rather than for the relevance of the G-SYNC module, which is what I was attacking. Let me make something clear to you, torquer — [b<]I like NVIDIA.[/b<] I also like AMD, and Intel, and most other tech companies. I just don't like some of their business practices, and to be clear, in this sentence, "their" refers to all of the above, and others.

            I get that you paid a lot for your ROG Swift and you like it and that's great! Really! I wish I could afford one, I really do, and frankly speaking, that's part of the reason I'm so embittered over the pointless expense of the G-SYNC module. If NVIDIA would come forward with a really detailed (technically speaking) explanation of why the G-SYNC module is necessary, I would probably be satisfied. The reality is that they haven't done such a thing because it is demonstrably UN-necessary and that makes me angry.

            You have a vendetta against me and you want to pick a fight, but I'm just not up for it; the person you want to fight is someone you've imagined who isn't me. You might want to see a proctologist, though. (*´艸`*)

            • torquer
            • 5 years ago

            I figured if you wouldn’t listen to reason condensed into one or two paragraphs that perhaps you would with a more lengthy response. Clearly, I was mistaken 🙂

            You don’t know that the G-Sync module is “pointless.” As an expert you surely know that the standards used for eDP and standard DP are not exactly the same. Laptop monitor technology is not identical to desktop technology. I also read the PCPer piece and they are using beta drivers, not final technology. Again, we’re talking laptops not desktop displays. Similar, but different.

            Why on earth would a company like Nvidia piss away valuable R&D budgets on unnecessary hardware to solve a problem that could be solved, even in a proprietary manner, with software? They already have proprietary software solutions for PhysX, so they are no stranger to it. It’s not like they went out and created daughterboards to solve the problem instead (Ageia did this for them, and Nvidia bought them and promptly cancelled the hardware solution).

            You may believe that it is unnecessary, but Nvidia doesn’t agree. Logic would also disagree that it was profit-motivated, as it would be beyond comprehension that they’d make enough money off of such a small market to justify the expense. Nvidia may have more money than AMD these days, but most companies don’t like pissing away dollars unnecessarily.

            So, if we’re in a court of public opinion here, I don’t think you’ve established motive for them to do this thing that’s irritating you.

            I don’t have a vendetta against anything but untruth and FUD.

            • auxy
            • 5 years ago

            The problem with your whole argument is that you’re begging the question a lot — that is, you’re doing a lot of circular reasoning:

            [quote<]Why on earth would a company like NVIDIA piss away valuable R&D budgets on unnecessary hardware[/quote<]

            You're presuming that they actually did. You make the unfounded assumption that they did, and work from there, which kinda brings down your whole argument. "Logic would disagree", "pissing away dollars", etc -- it's all based on the assumption that G-SYNC is legitimate. Basically, you're assuming that you've won the argument before you even make an argument. Obviously, this is no way to have an argument.

            • torquer
            • 5 years ago

            Ok so we live in a fantasy world where R&D is free, chips fall from the sky like bird crap, and G-Sync is a manifestation of El Diablo himself conspiring to make monitors more expensive than your average girl-on-a-tech-site can afford.

            See, you have to tell me that we’re disregarding all reason and logic before we start, then we can have a better argument. Now that I know the rules, I can tell you that G-Sync is actually forged from the tears of Swahili children in Mr Burns’ factory-o-evil. You see, it’s a plot by the Illuminati, in concert with the reverse vampires, in a fiendish scheme to provide Nvidia with a completely-free-to-them way to jack up monitor prices for about 1% of the entire monitor market for… actually I still can’t think of a good reason why they’d do it even if it didn’t cost them anything *to* do it. Maybe you can fill in the rest.

            If you want to hop back on the unicycle of reality, however, I think we can disregard silly notions like R&D being “free” and not costing them anything. I don’t know how many medium-to-large enterprises you’ve worked for, but even internal IT groups have to fight for every budgetary dollar. Hell, even at Microsoft they watch every penny. Engineering resources and time aren’t free, and Nvidia has to contract out the creation of this chip to someone else. It’s even based on an ASIC and not a custom in-house design, so there had to have been a fair number of people involved.

            I’m not sure what point you’re trying to make. I find it hard to believe that you actually think that Nvidia was able to design this evil contraption for free, with unknown motives, just to somehow manipulate or otherwise meddle with such an impossibly small subset of the overall monitor market and an only slightly larger subset of their own users.

            Step 1. Invent G-Sync hardware, for free, that is totally unnecessary because it can be done entirely in software, according to auxy

            Step 2. Somehow convince a small number of monitor manufacturers to include overpriced didn’t-need-it-anyway hardware in order to make their monitors more difficult to sell

            Step 3. ?

            Step 4. Profit?

            • auxy
            • 5 years ago

            See, what you’ve done now is called a “strawman argument”, where you misrepresent my argument as something silly or foolish, and then attack the misrepresented argument rather than my actual argument.

            The actual argument is not that the G-SYNC module was “free” or that R&D is “free”, but rather that it was relatively simple and cheap to develop, and that the very high premium of $200-$300 per monitor is not justified.

            The G-SYNC module uses not an ASIC but [url=http://www.anandtech.com/show/7582/nvidia-gsync-review<]an Altera FPGA[/url<], which, by its nature is much faster and cheaper to design and produce than a classical ASIC. It also has 768MB of cheap DDR3, which is purportedly used for overdrive calculations among other things. FPGAs are not cheap and I don't really doubt that the BOM on a G-SYNC module is fairly high.

            The problem with this line of thinking is the idea that the customer is responsible for the cost, just as tap22sf said above. It's true that in the end, the customer is responsible for the final cost, and that determines sales, and there is an argument to be made that we, the enthusiast purchasers, should be 'thankful' to NVIDIA that they brought the idea to market at all. However, it's difficult for me to be 'thankful' for something that costs me $200 when, again, it is demonstrably unnecessary.

            Now, let me step to the side real quick and note once more that I'm not saying that the G-SYNC module "does nothing". I'm saying it is [i<]unnecessary.[/i<] Sure, it's true that we have not seen a good scientific comparison of G-SYNC on a compatible monitor and mobile G-SYNC; I'll grant that -- but judging based on the PCPerspective article and the thread in their forums, I'm not impressed with what it does, and I certainly don't think it's worth $200.

            "Sure," you say, "well, that's, just, like, your opinion, man." And yeah, sure, I can get behind that; it's true, that's 'just' my opinion. Well, 'just' my opinion and all those other enthusiasts who are looking real hard at adaptive refresh displays right now. If I can get G-SYNC for $monitor+200$, or G-SYNC*0.95 for $monitor, I'm sure you can guess which one I'm going to pick.

            I am not a rich woman. By most counts, in fact, I'm actually fairly poor. (I have to spend a lot of money on medical bills.) If I had the money to buy a G-SYNC monitor as a frivolity, we might not be having this conversation, as while it would irk me, I probably wouldn't care enough to comment on it. But I don't have the money, and that's why it frustrates me. You COULD characterize my argument as "waah, I'm too poor for G-SYNC", but from my point of view, even that kinda makes NVIDIA look bad. They've positioned G-SYNC as an exclusive, prohibitively expensive feature (most of my monitors cost less than $200 total cost), and from what I have seen and read it just does not justify that cost.

            You're happy with your G-SYNC monitor and like I said above, I think that's great. I'm sure you'd be a lot happier with essentially the same monitor and $200 in your pocket, though, huh?

            I've made my argument and reading over it again, all you've done in the last 3 posts is misrepresent (misunderstand?) and move the goalposts, while my logic has remained fairly consistent. "G-SYNC isn't worth a $200 premium, and I blame NVIDIA for pricing it that way" is a fairly simple argument, so I don't really know why I had to spend all this time explaining it. I guess I'm nicer than I thought! (´▽`) I won't be posting in this thread again, though. Even saints have their limits.

            • VincentHanna
            • 5 years ago

            As far as the wall-o’-text battles go, I had trouble following both your and torquer’s points, but I’m pretty sure torquer is right.

            In your last post, you basically implied that your “point” was that the G-sync module was overpriced (e.g. not worth $200). That was clearly not your point.

            You began this argument by explaining that G-sync can be done, for free, in laptops, and therefore (and here is where you lost both of us obviously) G-sync should be free, and adaptable to [b<]any[/b<] monitor. I mean sure, Nvidia could have come out with a proprietary connector AND module that replicates the way laptops use dynamic refresh, but I'm pretty sure that isn't your point...

            You then proceed to insist, categorically, that the Gsync module is unnecessary, and yet it seems pretty necessary to me. I would love to see how you got Gsync working on your desktop without it. Is there a firmware patch somewhere? The sheer fact that there are no free-sync monitors on the market tells me that some changes are required above and beyond flipping a switch.

            You then insist that the guy is "question begging" though the quote you provided isn't self referential at all...

            [quote<]Why on earth would a company like NVIDIA piss away valuable R&D budgets on unnecessary hardware[/quote<]

            This is a perfectly valid point. There are some assumptions built into this of course ("Did nvidia invest money into gsync?", and "is this hardware unnecessary?" being the two that jump out at me.) The latter being YOUR assumption to prove, not his. Either way, it's not question begging.

            Now, had you said, from the outset, hey, Nvidia is charging waaaaaaay more $$$ than this tech is worth, we'd have an entirely different discussion on our hands. A discussion centered on the [i<]VALUE[/i<] added by an extremely low-overhead alternative to anti-aliasing(that doesn't add blur, isn't wasteful, doesn't cause jitter or tearing...), or at best, a technical discussion on the marginal price/net profit per chip, and "fairness."

            And here is what I have to say about that: I'm sorry you can't/don't want to spend 400 bucks for a gsync monitor, really (I don't either!). Here is the deal. Its new tech. New tech is always expensive. Once AMD releases their competitor to Gsync, and once Gsync modules have achieved a certain level of production, I'm quite sure that their cost will fall dramatically. That's tech.

            • auxy
            • 4 years ago

            Okay, I know, I said I wasn’t going to post in this thread again but this is so stupid I had to come back.
            [quote="VincentHanna"<]In your last post, you basically implied that your "point" was that the G-sync module was overpriced (e.g. not worth $200). That was clearly not your point.[/quote<]Really? Even though I said it in every post?

            [quote="VincentHanna"<]You began this argument by explaining that G-sync can be done, for free, in laptops, and therefore (and here is where you lost both of us obviously) G-sync should be free, and adaptable to any monitor.[/quote<]Not quite. I began by remarking that the ability to replicate the vast majority of G-SYNC's functionality without a G-SYNC module proves that the G-SYNC module is unnecessary. There's no "proprietary connector and module" required.

            [quote="VincentHanna"<]You then proceed to insist, categorically, that the Gsync module is unnecessary, and yet it seems pretty necessary to me. I would love to see how you got Gsync working on your desktop without it. Is there a firmware patch somewhere?[/quote<]This is a logical fallacy. I don't have to have it running on the desktop to prove that G-SYNC can be done without a G-SYNC module.

            [quote="VincentHanna"<]The sheer fact that there are no free-sync monitors on the market tells me that some changes are required above and beyond flipping a switch.[/quote<]Yes, DisplayPort 1.2a has to come to market. However, this is a black-or-white fallacy. "It can't be done by flipping a switch" is not the same as "it requires a $200 add-in module".

            [quote="VincentHanna"<]You then insist that the guy is "question begging" though the quote you provided isn't self referential at all... [/quote<]It absolutely is. "Begging the question" is a logical fallacy related to circular reasoning. His circular reasoning is "the G-SYNC module must be required, because NVIDIA spent a lot of money to make it, because it is required, because NVIDIA spent a lot of money to make it", etc. The assumption built into his statement is that it is correct. That is begging the question. Go [url=http://afterdeadline.blogs.nytimes.com/2008/09/25/begging-the-question-again/?_php=true&_type=blogs&_r=1<]here[/url<] to learn what begging the question is.

            [quote="VincentHanna"<]Now, had you said, from the outset, hey, Nvidia is charging waaaaaaay more $$$ than this tech is worth, we'd have an entirely different discussion on our hands. A discussion centered on the VALUE added by an extremely low-overhead alternative to anti-aliasing(that doesn't add blur, isn't wasteful, doesn't cause jitter or tearing...)[/quote<]Now, look -- I'm trying to be nice here but this part is the reason I got out of bed to make this post and the reason I posted again. This is so mind-bogglingly cretinous that it, well, boggles my mind. Not only do [i<]you insist that I didn't say what I was saying the entire time[/i<], but [b<]G-SYNC HAS NOTHING TO DO WITH ANTI-ALIASING.[/b<] You don't even understand the issue at hand. Why am I even talking to you? Why are you even posting on this forum?

            [quote="VincentHanna"<]Here is the deal. Its new tech. New tech is always expensive. Once AMD releases their competitor to Gsync, and once Gsync modules have achieved a certain level of production, I'm quite sure that their cost will fall dramatically. That's tech.[/quote<]G-SYNC is going to fall by the wayside. G-SYNC monitors in the wild are still a rarity, and it's because of the price and the vendor-locked nature. The idea that people will continue to pay an extra $(more than the rest of the monitor) for something that can be done without the expensive hardware is just ridiculous.

            Frankly, I'm tired of arguing with you people. This is stupid. Here is my entire argument. I will summarize it for you in short words:

            [quote<]G-SYNC can be done without a G-SYNC module.[/quote<]
            [quote<]This proves that G-SYNC does not require the module.[/quote<]
            [quote<]G-SYNC modules add ~$200 to a monitor's price.[/quote<]
            [quote<]Ergo, G-SYNC monitors are $200 overpriced.[/quote<]

            Done. That's it. There. That's the whole argument. Pack up and go home, because there's nothing else to say here.

            • torquer
            • 4 years ago

            So, while you refuse to accept it, your argument has been thoroughly defeated and now you’re getting angry. No need for that. The lesson here is not to make baseless statements wrapped in so many words and fancy-debate-terms that you give the appearance of intellectual authority.

            You don’t get to decide when the discussion is over. You only get to decide when your participation in it is over. As I’ve said before, you’ve stated opinion as fact. You have limited data to work from, basically one piece by PCPer on non-production hardware and software, to “prove” that the module isn’t necessary, even though mobile display technology is very different from that of desktop PCs, a fact you completely disregard and ignore because it doesn’t agree with your opinion.

            You’re picking and choosing “facts” to support your opinion, then demonizing those who disagree and inflating the whole argument to be something that literally cannot be proven – specifically whether G-Sync would work ON THE DESKTOP without hardware, as it APPEARS to on this specific LAPTOP in the PCPer article. To make that case you’d actually need to do a side by side comparison in a controlled environment with production hardware/software. Unfortunately today that doesn’t exist and thus cannot be done or proven yet.

            This whole thing started because you presented your opinion as irrefutable fact and then defended it like religious canon. Don’t do that. It makes you look less reasonable and less logical than I am sure you really are.

            • VincentHanna
            • 4 years ago

            [quote<][quote="VincentHanna"<]In your last post, you basically implied that your "point" was that the G-sync module was overpriced (e.g. not worth $200). That was clearly not your point.[/quote<]Really? Even though I said it in every post?[/quote<]

            Yes, really. That is not the point you are defending. That is not what the subject of this argument is about. Your central thesis is that G-sync doesn't require any additional hardware, and that the module is "unnecessary." That it is included [i<]simply[/i<] for the purpose of driving up the price and for no other reason.

            [quote<][quote="VincentHanna"<]You began this argument by explaining that G-sync can be done, for free, in laptops, and therefore (and here is where you lost both of us obviously) G-sync should be free, and adaptable to any monitor.[/quote<]Not quite. I began by remarking that the ability to replicate the vast majority of G-SYNC's functionality without a G-SYNC module proves that the G-SYNC module is unnecessary. There's no "proprietary connector and module" required.[/quote<]

            Except this is a completely false statement. No entity has yet been able to get Gsync working without specialized hardware. Not Nvidia, not AMD, not Russian hackers, nobody. You seem to believe that, because the modules used in laptop displays are older, they aren't specialized. In fact though, laptops have always used a refresh system that is completely different than the one employed by desktops. In order to replicate the results of the laptop-only patch on the desktop, one would need to completely reconfigure proprietary or industry standard hardware, or both. You think the Gsync premium is heavy now; imagine if every factory in the world had to retool just to make monitors compatible with the measly 200k units per year driven by the sale of Nvidia GPUs, and had to maintain those production lines in parallel with their VESA/DisplayPort standard ones. One of the beautiful things about modules is that they are modular... meaning "easy to install into some other existing thing".

            [quote<][quote="VincentHanna"<]You then proceed to insist, categorically, that the Gsync module is unnecessary, and yet it seems pretty necessary to me. I would love to see how you got Gsync working on your desktop without it. Is there a firmware patch somewhere?[/quote<]This is a logical fallacy. I don't have to have it running on the desktop to prove that G-SYNC can be done without a G-SYNC module.[/quote<]

            You have really got to stop using the word "logical fallacy" in informal settings. To date, I don't think you have correctly identified a single one. First of all, any time a person says X seems this way [b<][u<]to me[/u<][/b<], the thing that follows cannot be a logical fallacy. The reason being it is not an argument, it is a statement. Statements cannot be fallacies. "All cats are green" is not a logical fallacy, it is merely an incorrect statement. Second of all, if a version of G sync *arguably, could potentially* be accomplished with no additional hardware, then I don't think it is unreasonable to point out that [b<]nobody[/b<] has been able to crack that nut yet. Meanwhile, the irony police are coming to arrest you right now because, as we have discussed, Gsync does require specialized hardware.

            [quote<][quote="VincentHanna"<]The sheer fact that there are no free-sync monitors on the market tells me that some changes are required above and beyond flipping a switch.[/quote<]Yes, DisplayPort 1.2a has to come to market. However, this is a black-or-white fallacy. "It can't be done by flipping a switch" is not the same as "it requires a $200 add-in module".[/quote<]

            Okay, now you are literally making up fallacies. I'm going to assume you mean "false dilemma"; however, in this case, you are wrong. If Gsync requires additional hardware, then the module is necessary, and whether or not it is worth $200.00 becomes a value judgement. There literally is no middle ground between Gsync requiring additional hardware not present in (desktop) monitors made prior to its release and its not requiring additional hardware.

            [quote<][quote="VincentHanna"<]You then insist that the guy is "question begging" though the quote you provided isn't self referential at all...[/quote<]It absolutely is. "Begging the question" is a logical fallacy related to circular reasoning. His circular reasoning is "the G-SYNC module must be required, because NVIDIA spent a lot of money to make it, because it is required, because NVIDIA spent a lot of money to make it", etc.[/quote<]

            By that logic, anything anyone has ever said is question begging. The Gettysburg address is question begging. Apparently all you have to do is pull a quote out of context, copy it 2x and add the word "because". "The G sync module is unnecessary, because I can replicate most of the features of gsync without the module, because the g sync module is unnecessary, etc" oh, wait, that one actually works : /

            [quote<][quote="VincentHanna"<]Now, had you said, from the outset, hey, Nvidia is charging waaaaaaay more $$$ than this tech is worth, we'd have an entirely different discussion on our hands. A discussion centered on the VALUE added by an extremely low-overhead alternative to anti-aliasing(that doesn't add blur, isn't wasteful, doesn't cause jitter or tearing...)[/quote<]Now, look -- I'm trying to be nice here but this part is the reason I got out of bed to make this post and the reason I posted again. This is so mind-bogglingly cretinous that it, well, boggles my mind. Not only do you insist that I didn't say what I was saying the entire time, but G-SYNC HAS NOTHING TO DO WITH ANTI-ALIASING. You don't even understand the issue at hand. Why am I even talking to you? Why are you even posting on this forum?[/quote<]

            You got me, I meant to say "alternative to Vertical sync", which should have been obvious because aliasing doesn't cause tearing, jitter, and dropped frames. But yes, I was technically incorrect. I now feel "so mind bogglingly cretinous that it boggles the mind."

            [quote<]Frankly, I'm tired of arguing with you people. This is stupid. Here is my entire argument. I will summarize it for you in short words:

            G-SYNC can be done without a G-SYNC module. (on laptops)

            This proves that G-SYNC does not require the module. (on any display)

            G-SYNC modules add ~$200 to a monitor's price. (this is non-negotiable and has nothing to do with supply/demand economics)

            Ergo, G-SYNC monitors are $200 overpriced. (because even if it were a software only feature, I wouldn't be willing to pay for dynamic refresh, a feature that makes lower FPS games feel like 60 FPS games. This feature has no marginal value whatsoever.)

            Done. That's it. There. That's the whole argument. Pack up and go home, because there's nothing else to say here.[/quote<]

            *please, formatting gods

            • torquer
            • 4 years ago

            Here’s the problem. You, like probably the majority of people who post in comment threads and forums on the internet, aren’t satisfied with stating at the outset what you now say is your thesis: I, auxy, believe that G-Sync is overpriced and unnecessary.

            No one can really challenge you on that. You’ve decided it isn’t worth the money to you, and that is totally fine and acceptable. But you didn’t stop there. You went on to do what most people do – try to convince the world that your truth is a universal truth. Essentially, not only is G-Sync too expensive and not needed, but that is the truth for EVERYONE out there, and anyone who feels otherwise is wrong. You then take it a step further and speak of it in terms that subtly imply there’s some grander Machiavellian purpose behind it all. It sounds very tin-foil-hat and erodes your credibility.

            The truth is, none of us know what G-Sync modules cost to develop or manufacture. Any reasonable person agrees it’s not $0. Whether the current cost is “worth it” is up to the individual consumer and the greater market. If no one buys it, it won’t exist. If a better solution comes out, hopefully it will replace it. Mobile G-Sync is a different animal, and thus it is not an apples-to-apples comparison.

            Summary: If you don’t like it, fine, don’t buy it.

            • Jason181
            • 4 years ago

            This article explains why variable refresh works on laptops without the G-Sync module but would not work on desktops (generally speaking). I remember that when they did this demo, they talked about the unique nature of the connection between a laptop’s display and its graphics card being the key to showcasing FreeSync without additional hardware.

            The article even talks about the fact that additional hardware would be necessary for most desktop monitors to be compatible with variable refresh. If this weren’t the case, AMD would’ve come out with FreeSync drivers that don’t require the expensive module included on G-Sync-enabled monitors, and would’ve wiped the floor with them.

            [url<]http://www.pcper.com/reviews/Graphics-Cards/AMD-Variable-Refresh-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync[/url<]
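
            To make the timing difference concrete, here’s a toy Python sketch contrasting fixed-refresh vsync with variable refresh. It isn’t an implementation of G-Sync, FreeSync, or anything running on the module; the frame times, the 16.7 ms refresh interval, and the present_times helper are all invented for illustration.

                # Toy model only: when does each rendered frame actually appear on screen?
                def present_times(render_ms, mode, refresh_ms=16.7):
                    """Return on-screen times (ms) for frames with the given render times (ms)."""
                    shown, done, last_flip = [], 0.0, 0.0
                    for frame in render_ms:
                        done += frame
                        if mode == "fixed":
                            # Vsync: wait for the next fixed refresh tick after the frame is ready.
                            flip = refresh_ms * -(-done // refresh_ms)   # ceiling division
                        else:
                            # Variable refresh: the panel refreshes when the frame is ready,
                            # but no faster than its maximum refresh rate allows.
                            flip = max(done, last_flip + refresh_ms)
                        last_flip = flip
                        shown.append(round(flip, 1))
                    return shown

                frames = [20, 25, 18, 30, 22]   # render times in ms, i.e. roughly 33-50 FPS
                print("fixed 60 Hz:", present_times(frames, "fixed"))      # uneven 16.7/33.4 ms gaps
                print("variable:   ", present_times(frames, "variable"))   # gaps track render times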

        • f0d
        • 5 years ago

        Also, in some countries the price difference between a 960 and a 280X/290X is much larger than in the US.

        edit: why can’t I have those 290X prices in Australia 🙁
        DAMN YOU AMD and your US-only price cuts!!!

      • anotherengineer
      • 5 years ago

      You’re a girl?!?!?!

      Thought you were a woman 😉

        • auxy
        • 5 years ago

        I wasn’t going to reply to this because it’s immaterial (I only put ‘girl’ because in my head it was the natural gender-change for ‘guy’ in the common idiom “I’ll be that guy”) but then it’s really sort of sad and creepy if I don’t so I’ll just say … uhh, well, nothing, I already said it. (‘ω’)

          • willmore
          • 5 years ago

          I’m with anotherengineer on this one. When my wife talks about her female coworkers as ‘girls’ it’s like fingernails on a chalkboard.

          Oh, and the female version of ‘guy’ is ‘gal’.

            • auxy
            • 5 years ago

            ‘Gal’ sounds hopelessly dated to me. Or possibly it’s just that my wife’s redneck family uses it a lot. Either way, I don’t like the word. ( `ー´)ノシ It grates on my ears in much the same way “swell” (as in, “that’s swell!”) or “groovy” does.

            For most young women and particularly in my case (for reasons I don’t really want to explain on a public forum) ‘girl’ is perfectly acceptable, I think, in casual usage. You wouldn’t describe a lady that way in a professional setting, naturally, but speaking casually I don’t think it’s a big deal. A lot of older women would probably get a kick out of being called or included in a group of ‘girls’. And surely I’m no mewling maiden, but likewise I’m neither a matronly missus; far from it! I get called ‘girl’ almost every day; if I started taking offense to it I’d spend all my time being offended, and I’m not that kind of person. It’s actually pretty hard to offend me! (*‘∀‘)

            I’m also not really one for protocol — which is to say I actively avoid power structures and rules of behavior, because I have [i<]extreme[/i<] problems with authority and I don't like being told what I can and can't do. Obviously, as an adult in our society you have to deal with some of that (I lost my last job over it) and I'm getting better about it, but the point of all this is that [i<]I don't really care what someone 'should' be or 'wants' to be called.[/i<] Ergo, all this business about the proper way of naming things is sort of meaningless to me! (´・ω・)

            • torquer
            • 5 years ago

            Boy howdy, your posts are annoying. It’s like the written-word equivalent of someone who speaks just to hear their own voice. I realize you’re trying to come off as trendy/unique/anti-establishment/hip/whatever, but between your breathtakingly pedantic posts, contrarian nature, and condescension, it’s really just an affront to sense. Fortunately for all of us, those qualities are not gender-specific.

            • auxy
            • 5 years ago

            Well, how negative. ( *´艸`)Pfufufu. Why so mean? [url=http://assets.head-fi.org/f/fc/1000x500px-LL-fc57ec1f_U-MAD.jpg<]Oh, I know.[/url<]

            • torquer
            • 5 years ago

            We get it. You’re a girl, on a tech site. But you don’t want to be “treated like a girl,” whatever the hell that means. How novel. You make comments intended to get a strong response and you bury people in words trying to trick them into saying something just short of 100% accurate so you can skewer them and feel superior.

            I think we’ve all seen the same song and dance before. It’s not our first day on the Internet.

            • auxy
            • 5 years ago

            Ehh? When did I say I didn’t ‘want to be treated like a girl’ (whatever that means)? (´・ω・`) I actually said I don’t [i<]mind[/i<] being called a 'girl'. With that said, you mischaracterize me, good sir! (*‘∀‘) I'm nowhere near as Machiavellian as you say. Most of the comments I make are merely correcting some factual error or pointing out some error in logic. I really have no interest in 'skewering' anyone! [super<]Isn't that kind of a masculine thing? Hohoho. ( *´艸`)[/super<] I'll acknowledge that my tone is inflammatory at times, but I really don't mean it, usually. I'm just not good at ... playing nice, I guess.

    • geekl33tgamer
    • 5 years ago

    To everyone moaning about the price, it’s £173.99 here for the cheapest card used in this article (EVGA GTX 960 SSC). Straight conversion is $267 USD / $334 CAD, excluding any shipping. We also don’t have MIRs here, so that’s the price whether you like it or not.

    The other four cards are more expensive than the EVGA one, starting at £179.99.

    You guys should stop whining that it’s $200, seriously. 😉

      • Pez
      • 5 years ago

      Yeah we get shafted on price here in Blighty 🙁

      • excession
      • 5 years ago

      Our £173.99 does include 20% tax… but that still makes it £144.99 before tax, which is $223… a 10% markup for no good reason!
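
      A quick sketch of that back-of-the-envelope math, for anyone following along: the £173.99 price and the 20% VAT rate come from the comment above, while the 1.54 GBP-to-USD rate is an assumption inferred from the ~$223 figure quoted there, not an official rate.

          # Back UK VAT out of the sticker price, then convert to USD.
          # The exchange rate is an assumption inferred from the ~$223 quoted above.
          uk_price_inc_vat = 173.99   # GBP, cheapest card in the article (EVGA GTX 960 SSC)
          vat_rate = 0.20             # UK VAT
          gbp_to_usd = 1.54           # assumed early-2015 exchange rate

          pre_vat_gbp = uk_price_inc_vat / (1 + vat_rate)   # divide out VAT; don't subtract 20%
          pre_vat_usd = pre_vat_gbp * gbp_to_usd

          print(f"£{pre_vat_gbp:.2f} ex-VAT ≈ ${pre_vat_usd:.0f}")   # £144.99 ≈ $223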

    • sweatshopking
    • 5 years ago

    in before people whine about bias.

      • anotherengineer
      • 5 years ago

      I will whine about dollar value instead.

      MSI card in Can is $260 + $10 shipping + 13% tax = $305.10

      There is a $10 MIR though!!

      [url<]http://www.newegg.ca/Product/Product.aspx?Item=N82E16814127843&cm_re=MSI_GTX_960_GAMING_2G-_-14-127-843-_-Product[/url<]

        • 3SR3010R
        • 5 years ago

        1 Canadian Dollar equals $0.8016 US dollars

        [url<]http://www.bloomberg.com/quote/CADUSD:CUR[/url<]

        [quote<]MSI card in Can is $260 + $10 shipping + 13% tax = $305.10[/quote<]

        So the REAL numbers (US-based) are: MSI card in Can is $208.42 + $8.02 shipping + 13% tax = $244.58. And of that: the 13% tax is because of Canada, the $8.02 for shipping seems reasonable, and the $208.42 is less than the $209.99 here in the US.

        So what exactly do you have to whine about (except not understanding exchange rates)?
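
        For anyone who wants to redo that arithmetic, here’s a minimal sketch using only the figures quoted in this exchange: the 0.8016 CAD-to-USD rate cited above, the $260 + $10 CAD pricing, the 13% tax, and the $209.99 US price. None of these numbers come from anywhere else.

            # Redo the CAD-vs-US comparison with the figures quoted in this thread.
            CAD_TO_USD = 0.8016          # rate cited from Bloomberg above

            card_cad, shipping_cad, tax_rate = 260.00, 10.00, 0.13
            us_price_usd = 209.99        # US price cited in the comment above

            total_cad = round((card_cad + shipping_cad) * (1 + tax_rate), 2)    # 305.10
            card_usd = round(card_cad * CAD_TO_USD, 2)                          # 208.42
            shipping_usd = round(shipping_cad * CAD_TO_USD, 2)                  # 8.02
            total_usd = round((card_usd + shipping_usd) * (1 + tax_rate), 2)    # 244.58

            print(f"Total in CAD: ${total_cad:.2f}")
            print(f"Card alone in USD: ${card_usd:.2f} (vs. ${us_price_usd} in the US)")
            print(f"All-in USD: ${total_usd:.2f}")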

          • anotherengineer
          • 5 years ago

          That our dollar was at about par 6 months ago. I understand exchange rates perfectly well; that’s basically what I am complaining about.

            • Westbrook348
            • 5 years ago

            They’ll be on par once more, as soon as the U.S. Federal Reserve wants the stock market to go up again and starts QE4…

            • sweatshopking
            • 5 years ago

            You don’t understand our economy and its ties to the commodity market.
            Also:
            [url<]http://www.thestar.com/opinion/commentary/2015/01/17/monetarism-is-dead-finally.html[/url<]

            • jihadjoe
            • 5 years ago

            Isn’t the current exchange rate actually good for your economy though?

            • JustAnEngineer
            • 5 years ago

            That depends on whether you are a manufacturer exporting goods abroad or a consumer importing them.

            • jihadjoe
            • 5 years ago

            I mean you could be a consumer importing stuff, but your job is still likely supported by the manufacturing or service industry. Imported stuff might cost more, but overall you’d have more money to go around.

            • JustAnEngineer
            • 5 years ago

            Devaluing the currency makes Canadian goods cheaper and foreign goods more expensive. If you’re mostly spending your locally-generated income on products and services also produced locally (rent, daycare, Tim Horton’s, etc.), then your purchasing power is less affected. If you’re spending your Canadian dollars on imported items like Nvidia GeForce GTX 960 2GB graphics cards, then it takes more of them to buy what you want.

        • Ninjitsu
        • 5 years ago

        Tell me about it. We’ve gone from Rs.45/$ to Rs.62/$ in the last 2 years, and graphics cards are taxed around 30%.

        So the cheapest 960 here is effectively $280, and the cheapest 280X is $384.

        At least the 960 does well on price/performance in my local context…

        • DrCR
        • 4 years ago

        Couldn’t you hedge via a Forex play?

      • NeelyCam
      • 4 years ago

      No GPU article shall escape from those claims. Ever.
