GeForce GTX 1060 joins the battle with the RX 480 at $249

Nvidia unveiled the GeForce GTX 1060 this morning. The specs of this $249 Pascal card basically conform to what the rumor mill has been churning out over the past couple of weeks. Here are the basics of the GTX 1060, compared to the GTX 1080:

                  GPU base   GPU boost   Shader       Memory       Memory    PCIe aux   Peak power   Suggested
                  clock      clock       processors   config       clock     power      draw         price
GeForce GTX 1060  ???        1700 MHz    1280         6GB GDDR5    8 GT/s    1x 6-pin   120W         $249.99
GeForce GTX 1080  1607 MHz   1733 MHz    2560         8GB GDDR5X   10 GT/s   1x 8-pin   180W         $599.99

Going by those numbers, the GTX 1060 appears to be about half of a GTX 1080, save for its 192-bit memory bus. Anandtech confirms that the GTX 1060 is powered by a new, smaller GPU called GP106. Considering that Nvidia claimed that the GTX 1080 offered up to twice the performance of the GTX 980 before it, the GTX 1060 is logically positioned to deliver the performance of a GTX 980 for about half of that card's launch price.
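
As a rough sanity check on that comparison, peak shader throughput scales with shader count times boost clock. Here's a back-of-the-envelope sketch (not an official spec-sheet calculation; it assumes the usual two FLOPs per shader per clock for fused multiply-add):

# Back-of-the-envelope peak FP32 throughput, using the boost clocks and
# shader counts from the table above (2 FLOPs per shader per clock for FMA).
def peak_tflops(shaders, boost_mhz):
    return shaders * 2 * boost_mhz * 1e6 / 1e12

gtx_1060 = peak_tflops(1280, 1700)
gtx_1080 = peak_tflops(2560, 1733)
print(f"GTX 1060: {gtx_1060:.2f} TFLOPS")   # ~4.35
print(f"GTX 1080: {gtx_1080:.2f} TFLOPS")   # ~8.87
print(f"Ratio: {gtx_1060 / gtx_1080:.2f}")  # ~0.49, i.e. about half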

The eagle-eyed out there will notice that the GTX 1060 lacks an SLI connector. Nvidia told PCWorld that "very few gamers build SLI machines out of mainstream GPUs," and we've always recommended that gamers purchase the fastest single graphics card they can afford instead of pairing up two cheaper cards. Given those facts, the death of SLI on an affordable graphics card like this one may not be a big deal. Still, AMD's RX 480 lets the ambitious gang cards together in CrossFire if they so choose, and VR SLI may have been a promising application of the technology.

The Founders Edition GTX 1060 uses a short PCB that looks a lot like the one underneath the Radeon RX 480's cooler shroud. This view of the long-and-thin GP106 GPU confirms that it's a new chip. Strangely, the card's PCIe six-pin power connector isn't directly integrated onto the reference PCB—it's part of the cooler shroud itself. This stubby PCB design might pave the way for Mini-ITX-friendly GTX 1060s in the future.

The GTX 1060 offers three DisplayPort 1.4 outputs, an HDMI 2.0b out, and a dual-link DVI-D output.

Unlike the GTX 1080 FE, the GTX 1060 reference card doesn't feature a backplate, but we guess it doesn't really need one, either, thanks to its short PCB. The blower-style cooler doesn't have a cutout on its top side for extra air intake, but that design hasn't appeared on recent Nvidia reference cards to begin with.

The GeForce GTX 1060 will launch July 19. Partner cards should sell for $249, while the Founders Edition card will sell for $299 from Nvidia's website. It's an exciting time for the midrange graphics market.

Comments closed
    • ronch
    • 3 years ago

    So GTX 980 performance for $250? Expect RX 480 prices to drop. Either way, it’s a good time to go for a new midrange card.

    • Umbral
    • 3 years ago

    As much as I want to avoid spending ~$400 on GTX 1070, this is the product that would make me do it. Well played Nvidia!

      • Theolendras
      • 3 years ago

      Why spend $400 on a $250 product?

        • tipoo
        • 3 years ago

        Maybe he’s a Canucker; conversion rate and tax get me pretty close to it :(

        • rahulahl
        • 3 years ago

        He said 1070. Not 1060.
        Apparently somehow the 1060 convinced him to purchase the 1070?

          • Krogoth
          • 3 years ago

          It is most likely because the 1070 offers a considerable boost over the 1060 for a $100-150 difference.

            • CScottG
            • 3 years ago

            This is exactly my thinking. It’s almost exactly Titan X (for $390 once the gouging has cooled down in a few months) vs. regular 980 (plus some extra RAM) for $250.

            It’s still a large premium, but once you are up to $250, that extra cost of $140 doesn’t seem cost prohibitive, at least not in a way that the 1080’s extra $350 does when comparing it to the 1060’s price.

            If the 1060 had been $210.. then I probably wouldn’t consider the 1070.

            • bfar
            • 3 years ago

            That’s what I thought. Then I looked at the charts again and saw that the 1080 offered a considerable boost over the 1070 for $200. I might have gone a bit mad :o

          • Theolendras
          • 3 years ago

          That’s called re-edition magic. But yeah, I wasn’t sure if it was a Canadian thing or a would-be 1070 buyer.

    • Meadows
    • 3 years ago

    This card looks very interesting to me with regards to replacing my venerable 660 Ti OC, but I’m with sweatshopking in waiting for a few months. Hopefully the queen’s currency won’t be entirely toilet paper by then.

    A GTX 1060 should realistically serve me for about 3 years with some solid overclocking and sub-4K resolutions, just like my previous mid-range purchases have done. If previous experience is anything to go by, it should allow high graphics settings (not maximum, but high) in new titles for at least a year and a half, as long as you don’t have multiple monitors and haven’t been bitten by the 4K mosquito.

      • sweatshopking
      • 3 years ago

      HAHAHA MY ANSWER!

        • Meadows
        • 3 years ago

        I love you, sweatshopking. In a manly way. Still not buying that you have a wife, though.

          • sweatshopking
          • 3 years ago

          My kids were immaculate conception, as well as from a man! I AM AMAZING

          PS <3

    • rogue426
    • 3 years ago

    At this point I still don’t have a good idea which product would be best to replace a GTX 670 with a $300 ceiling.

      • tipoo
      • 3 years ago

      Likely this. If you want to use up all of your $300, then this with a fancy cooler/fancy VRMs and a factory overclock (not the FE, screw those).

    • hiki
    • 3 years ago

    Being required to log in to use the card destroys its value.

      • ThatStupidCat
      • 3 years ago

      What do you mean? What happens if you put this on a computer without internet connection?

        • tipoo
        • 3 years ago

        He means to download the driver I believe. New thing Nvidia did. If you had no connection you’d be boned there anyways, but I’m sure there are driver mirrors if you don’t want to log into nvidia.

    • watzupken
    • 3 years ago

    As Nvidia moves down to more budget-friendly GPUs, the Founders Edition nonsense is getting more unbearable. A USD 50 premium over a $249 card is almost a 17% hike for what looks, from the pictures of the front of the card, like a very basic build. For USD 50, I expect Nvidia to not skimp on the heatsink, in the absence of more information. I am pretty sure the 3rd-party ones will produce a card with a better build than this.

    • HERETIC
    • 3 years ago

    Anyone here game enough to admit paying money to Charlie and let us know about this-

    SemiAccurate is hearing of a serious problem affecting Nvidia’s GPU supply. If this turns out to be what SemiAccurate’s sources are describing, it may change the GPU market for the rest of the year.
    [url<]http://semiaccurate.com/2016/06/24/serious-manufacturing-problem-hits-nvidia-consumer-gpus/[/url<]

    Is NV going to have supply problems for the rest of the year??? Does this mean prices are going to remain high for the rest of the year???

    Edit: Why the downvotes? Is it the mention of SemiAccurate? Or does some butthurt NV fanboy not like anything bad said about NV? If NV has supply problems, this also means AMD prices will stay high as well. Just looking for information...

    • Krogoth
    • 3 years ago

    Here is an interesting detail that people overlooked on reference 1060 cards.

    The six-pin PCIe power connector is actually on the stock HSF. The PCB uses a “floppy molex” for those 12V rails.

    In other words, you cannot use third-party HSF solutions without resorting to wire mods. Good luck on figuring out the pinout on that “4-pin floppy molex”. I doubt third-party HSF guys are going to go out of their way to make a reference-1060 friendly power adapter. AIBs are going to opt for the standard 6-pin to 8-pin PCIe connector on the PCB.

      • smilingcrow
      • 3 years ago

      By reference do you mean the FE cards?
      Not sure that will be a big seller considering that they are only available direct from Nvidia at a premium price and are released at the same time as the custom boards.

      • tipoo
      • 3 years ago

      If the higher-end FE/reference boards are any indication… don’t get the reference boards. Hot and loud.

        • Krogoth
        • 3 years ago

        The Founders Edition is just the first batch of reference cards.

        Nvidia is just taking advantage of impatient early adopters who have no self-control. They are just as bad as those who pre-order “special edition” games.

      • psuedonymous
      • 3 years ago

      [quote<]In other words, you cannot use third-party HSF solutions without resorting to wire mods. [/quote<]

      False! The [url=https://i.imgur.com/aVIrCjI.jpg<]PEG connector is captive to the PCB itself (soldered to 'rail' points for 12V and GND, plus the Sense pin)[/url<]. So for a 3rd-party HSF you just need to decide where to put the connector, or just leave it dangling if you want. The pinout is still standard PEG, so I have no idea where you got the '4-pin floppy molex' idea from.

      This is a huge boon for SFF cases, as it means you can move the PEG connector off-vertical to reduce case Z-height, rather than leaving a huge dead space just to accommodate the connector sticking up.

    • Firestarter
    • 3 years ago

    Now the wait is on for the biggest, baddest, most power-hungry slab of <20nm silicon dedicated to pushing a record number of pixels. The releases of the RX 480 and GTX 1060 are exciting for a lot of people, no doubt, but there are a lot of people out there who already have several-year-old GPUs that come within shouting distance of the performance these GPUs offer, and upgrading to these cards would not be a big step for them. I’m anxious to see what kind of performance a 600 mm^2, 300+ watt card of this generation can achieve.

    • TwoEars
    • 3 years ago

    Nvidia is charging an arm and a leg for the high-end because there’s no competition, and now they’re squeezing AMD just because they can. I saw it coming a mile away and now AMD is in trouble, no doubt about it.

      • Airmantharp
      • 3 years ago

      They’re not really ‘squeezing’ AMD; AMD is in full control of their product line, and they chose to put out the RX 480 at this price and performance level.

      They could have also built a high-end competitor to be released alongside the GTX 1080, but chose not to.

        • smilingcrow
        • 3 years ago

        That seems a very rose-tinted view of things.
        If I could run one of these companies I’d much rather be in Nvidia’s position.
        They have the higher performance, the better efficiency, the market share, and the mindset among buyers that sees them as the premium manufacturer.
        Add all that together and they have a lot of flexibility with pricing, whereas AMD are typically stuck aiming at being the best value.
        This impacts their profitability, which impacts their ability to invest in future products.
        So I don’t think they are in control of their product releases in the way Nvidia are, due to lack of resources.
        They have bet big on the $200-$250 market segment with nothing else due soon, and it’s a very wobbly start.
        If that’s control, it’s not convincing me.

      • smilingcrow
      • 3 years ago

      Too early to say, let’s see how all the permutations play out first in terms of custom boards, sorted drivers.
      It looks fairly promising for Nvidia, but this is a boxing match, so losing a round badly isn’t always fatal. The 480 looked on the ropes in the 1st round, but the new driver has it coming out for the 2nd with renewed vigour.

    • f0d
    • 3 years ago

    Some extra information on the FE cards in the PC Perspective video:
    [url<]https://youtu.be/W8TMROV_Dog[/url<]

    FE only on nvidia.com. FE limited run.

    • HisDivineOrder
    • 3 years ago

    Great, great.

    So, more of the same old, “We announce for one price, but add a surcharge for a barebones Founder’s Edition and then every OEM asks themselves, ‘Is our card really worth THAT much less than that barebones version?’ and then sets a price either above or just below the surcharged Founder’s Edition pricing” song and dance.

    I mean, I’ve seen it happen with the 1080. I’ve seen it happen with the 1070. So I find myself guesstimating that the 1060 will in fact be closer to the $275 pricing I figured on for any card I’d buy rather than $250.

    I’m not saying it’s a horrible value (for 980-level performance) at that price. It’s just disingenuous to say they’re going to be $250 when I highly doubt it. I haven’t seen a $380 1070, I haven’t seen a $599 1080.

    I doubt I’ll see a $249 1060.

    I also don’t particularly like how die shrinks that used to bring the next segment’s performance down to the same price bracket are turning into an opportunity to gouge pricing and shift x60 cards (used to be the $150 bracket, then the $200-225 bracket, now the $250-275 bracket) upward again.

    I remember the good old days when competition was rife and nVidia was actually battling to introduce performance improvements without raising prices to increasing levels of absurdity.

    A 670 would be supplanted by a 770. A 460 would be supplanted by a 560. Now nVidia looks for the excuse to make the 960’s replacement be a 970-priced GPU. And that’s amazing?

    No. I’m sorry, I don’t think it is. Price increases for the sake of increasing profits is of little value to me or to most any consumer wanting mid-rangish high performance for $300-ish. nVidia’s aware of this desire or they wouldn’t have bothered to play the pricing games they did with the 1070 (“It’s well below $400! So not that much higher than the 970’s $330!” Except it never hit that.)

    Since nVidia knows it’s an issue at launch, they must know it remains an issue.

      • mnemonick
      • 3 years ago

      The GTX 660 was $230 at launch and the GTX 760 was $250. The GTX 960 was the first time in a while that NV dropped the MSRP on a new GPU, so going back to $250 for the GTX 1060 isn’t that big a deal.

      It’s not what I’d prefer, but it’s not unreasonable. :)

        • HisDivineOrder
        • 3 years ago

        At $250, I’d be fine. That’s a mild increase. But you have to know that it’s not going to be $250. It’s going to be somewhere between the promised MSRP and the Founder’s Edition pricing, if not more than the FE pricing.

        That’s my problem. That’s what I stated was my problem. That $250+, probably $275, pricing is a substantial increase from the 960’s pricing.

        And remember: the Founder’s Edition is launching for $299. OEMs look at that and say, “Hey, that’s just barebones. Our card has a better cooler. It’s surely worth more than that.”

        $299+ anyone? You might say, “That’s ridiculous. They wouldn’t do that.” They already did with the 1070 and 1080.

      • Pancake
      • 3 years ago

      Do you not know how the free market works? In all likelihood the 1060 is cheaper to make than the RX480, so board manufacturers and retailers will have a decent margin. So, NVidia can peg it at $249. However, if demand is bonkers, as it likely will be, then the price will rise to what the market will bear. The market will soon speak – people will bear an increased price for the 1060 because they think it’s worth it. That’s the beautiful, raw judgement of capitalism. I love it.

      AMD aren’t pricing the RX480 at $240 because they’re trying to offer consumers the best possible deal because they’re nice people. It’s what they think they can sell it for. I imagine when 1060 production ramps and tests come in showing its superior performance and energy efficiency (and just plain better construction and drivers), that’s going to put downwards pressure on the RX480 price. As a consumer, you may rejoice in that, but it’ll be the usual misery for AMD – disgruntled partners and another mediocre financial quarter.

        • HisDivineOrder
        • 3 years ago

        I know that nVidia promised the MSRP of the 1070 would be $379 and the 1080 would be $599, except for the Founder’s Editions.

        They didn’t hit those lofty ambitions.

          • Pancake
          • 3 years ago

          MSRP = manufacturer’s suggested retail price. How is it NVidia’s fault if their cards are so desirable that, in a competitive market, the prices are pushed way up? It’s not like in Soviet Russia where a price is centrally planned. So, we have a system where those who can afford will pay.

          I’m in the can’t afford camp looking dreamily on. I have a GTX970 that I paid 500 dollaroos for and it won’t be until next year at earliest when I can justify a system refresh – if there even will be one – I might go all in on mobile form factors. So I miss out. People that can afford the asking price now win. Winners and grinners.

          • DancinJack
          • 3 years ago

          I think you’re confused about what MSRP is.

      • jessterman21
      • 3 years ago

      It’s true – and it makes me sad. I’ve been waiting for a 980-level GPU as my next upgrade (currently rocking an overclocked GTX 960 2GB) scheduled around my tax refund next year, and I was hoping to get a discounted card for less than $200 at that time. There’s a lot of months in between now and then, but the way pricing has trended in the past on team green’s cards, I don’t expect it will happen… I suppose there will be a 4GB GTX 1050Ti by that time, but I’m considering waiting another refresh cycle now.

        • travbrad
        • 3 years ago

        Right now both Nvidia and AMD seem to be selling cards faster than they can make them so yeah I wouldn’t expect prices to drop in the near future. There is just no reason for retailers or Nvidia/AMD+board partners to lower prices when they are already selling out.

        Hopefully the supply will improve and eventually force prices down towards the MSRP at least. This was sort of to be expected moving to a new manufacturing node though. People shouldn’t have been expecting prices/yields/supply to instantly be great.

    • Kretschmer
    • 3 years ago

    This looks really nice (not replace a 290X nice) and makes me lament buying into Freesync.

    Hopefully Nvidia will enable adaptive sync without an ASIC in the future…

    • USAFTW
    • 3 years ago

    I look at those empty memory pads and wonder if that chip is 256-bit capable. Do we know the die size? I’m too lazy to count the pixels :P

      • mnemonick
      • 3 years ago

      I think chuckula’s scientifically derived 187 mm[super<]2[/super<] estimate above is pretty close. I saw those empty pads too, and now I wonder if the GP106 still has some unused silicon tucked away. :D

      • chuckula
      • 3 years ago

      It’s possible that they re-used a PCB layout that has pads for a 256-bit memory controller but there’s nothing that’s actually connected to those traces.

    • CScottG
    • 3 years ago

    ..when asked about pricing; Nvidia’s director of marketing responds:

    [url<]https://www.youtube.com/watch?v=g7-tskP0OzI[/url<]

    • YukaKun
    • 3 years ago

    I just read this…

    [url<]https://semiaccurate.com/2016/07/06/nvidias-gp104-based-gt1060-real/[/url<]

    Anyone with some insight on that serious accusation? Cheers!

      • chuckula
      • 3 years ago

      Well for once Charlie is right!

      The GP104 based GTX-1060 isn’t real.

      Because the GTX-1060 is based on GP106 and there are plenty of photographs of completely operational cards with the GP106 silicon.

      He is just as right in that accusation as he was when he accused Nvidia of failing to launch a Polaris 11-based GTX-1050 too.

      So, Charlie is “right”.
      He’s a complete idiot, but he’s “right”.

        • CScottG
        • 3 years ago

        Plus:

        “Semi-accurate” is spin-speak for NOT accurate. AKA laughably inaccurate. “Journalism” these days seems to be more of a joke.

        • September
        • 3 years ago

        Nvidia can always release a GP104 based GTX-1060 Ti down the road. I think it depends on whether they can sustain the high prices on the 1080/1070 which creates a price gap for the Ti to slot into. Probably on chip yields as well for GP104.

        I still want a 1070 for $330 or less to replace my 770, is that too much to ask? These prices just keep inching up!

    • mnemonick
    • 3 years ago

    Based on that ‘naked’ reference board shot, I can see 8 GB cards in the near(ish) future.

      • tipoo
      • 3 years ago

      On a 192 bit bus? It’s why the two rumor options were 3 and 6GB. Unless they go up to 12 I think, but that seems silly for this performance class, or even the current top end.

        • mnemonick
        • 3 years ago

        Considering that AIBs make 4GB GTX 960s on a 128 bit bus now, I don’t think it’s unrealistic to expect them to use those two empty RAM mount points for ‘premium’ gaming cards.

        Whether it offers any real benefit is obviously debatable. :)

          • Waco
          • 3 years ago

          That’s a 128 bit bus, so it’s easy. On a 192 bit bus, the only easy sizes are 1.5 GB, 3 GB, and 6 GB.

            • the
            • 3 years ago

            nVidia has done a 2 GB card using a 192-bit wide bus before. See the [url=https://techreport.com/review/23419/nvidia-geforce-gtx-660-ti-graphics-card-reviewed<]GTX 660.[/url<]

            As the GTX 970 fiasco has shown, nVidia has a lot of flexibility in how it can configure its memory controllers. 4 GB and 8 GB cards are likely possible with the GTX 1060, though I suspect nVidia will just stick with 6 GB and maybe a 3 GB card for OEMs or specific markets (*cough* China *cough*).

            • Waco
            • 3 years ago

            Right, and that was a very odd configuration where half of the memory controllers ran at half speed. :P. I’d rather not see that repeated!

          • tipoo
          • 3 years ago

          Beaten to it, but the 960 was 128, not 192. 192 forces the use of these in-between capacities, 1.5, 3, 6, theoretically 12, etc.

            • mnemonick
            • 3 years ago

            Yeah, guess I was thinking their new delta compression voodoo could overcome the addressing limitation somehow. Or maybe it’d be like the GTX 970 with ‘fast’ and ‘slow’ VRAM.

            Maybe USAFTW (below) is on to something - could it be that the GP106 used in the 1060 has had some of its silicon disabled? Maybe there’s a 1060 Ti coming with more ROPs and a 256-bit bus? :D

          • ImSpartacus
          • 3 years ago

          What do you mean by “empty RAM mount points”?

          Are you talking about how a 192-bit bus would require 2 more “RAM” chips than a 128-bit bus?

          From my understanding, a 128-bit bus generally implies 4 32-bit memory controllers that each get one memory chip and a 192-bit bus generally implies 6 32-bit memory controllers that, again, each get one memory chip.

          And since we’re generally working with 8Gb (i.e. 1GB) GDDR5 chips, that means a card with a 128-bit bus can only support a total of 4GB of VRAM and a 192-bit bus can only support a total of 6GB of VRAM. Again, that’s with today’s 8Gb chips. We might get 12Gb or 16Gb chips eventually ([url=http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory<]Micron has stated that they will support those capacities in GDDR5X[/url<]), but we've only got 8Gb chips today.
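
A minimal sketch of that arithmetic, assuming (as described above) one 8Gb chip per 32-bit controller:

# VRAM capacity implied by bus width, assuming one 8Gb (1GB) GDDR5 chip
# per 32-bit memory controller, per the explanation above.
def vram_gb(bus_width_bits, chip_gbit=8):
    controllers = bus_width_bits // 32
    return controllers * chip_gbit / 8  # GB

for bus in (128, 192, 256):
    print(f"{bus}-bit bus -> {vram_gb(bus):.0f} GB")
# 128-bit -> 4 GB, 192-bit -> 6 GB, 256-bit -> 8 GB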

    • sweatshopking
    • 3 years ago

    NOW WHAT DO I DO?!?! THIS CARD’S PERFORMANCE SHOULD BE CRAZY, BUT IT’LL BE MORE THAN I’M ALLOWED!

    NOOOOOOOOOOOOO

      • tipoo
      • 3 years ago

      Time to start putting together your toothfairy money for a few months!

      • Meadows
      • 3 years ago

      Sell your wife’s horses.

        • sweatshopking
        • 3 years ago

        I did that already! SHE’S HORSELESS.

        • sweatshopking
        • 3 years ago

        Since I don’t need it until the end of August, imma wait and see how the market looks. If AMD follows with some solid price drops I’ll look at them. If not, I’ll look at NVidia.

          • HERETIC
          • 3 years ago

          NEGOTIATION
          There must be something you can “sacrifice” over several weeks to get extra.
          Last resort: unless NV is getting amazing yields (doubtful on a new process), a 1050 can’t be far away - probably 75% of the performance...

            • sweatshopking
            • 3 years ago

            Yeah. Like feeding my children

            • NTMBK
            • 3 years ago

            You only need one kidney, right?

            • Chrispy_
            • 3 years ago

            You could auction off your CAPSLOCK key; It would be worth enough to buy you a GTX 1060 and a box of tissues to wipe away your tears at losing such a sentimental part of your e-manhood.

            Life would be hard, holding down the shift key, but you’ll adjust in time.

            • anotherengineer
            • 3 years ago

            He already has a custom Caps key.

            It’s called Cruise Control, maybe it will auction for more $, possibly?

            [url<]http://www.wasdkeyboards.com/index.php/products/printed-keycap-singles/custom-text-cherry-mx-keycaps.html[/url<]

            • Concupiscence
            • 3 years ago

            Little things add up. Before I got a handle on my spending, I could buy a Radeon RX 480 in less than two months on the amount of money I blew buying coffee and lunches out during my work week.

          • anotherengineer
          • 3 years ago

          I have a Radeon 4850 you can have for free; you’ll just have to pay for shipping... if Canada Post isn’t on strike ;)

          • tipoo
          • 3 years ago

          Yeah, wait and see how the horse market looks in August.

      • End User
      • 3 years ago

      Get a job and start weaning yourself off of your wife’s teat.

        • sweatshopking
        • 3 years ago

          Why would I do that? So I can pay for child care for three children? So my wife can clean and cook on her days off? Nobody wins in that scenario.

          • anotherengineer
          • 3 years ago

          Ya no doubt. $50/kid per day here for daycare. Unless you’re making $35/hr+ it’s not really worth it to go to work and pay for daycare for 3 kids and then probably a maid on top of that.

          • End User
          • 3 years ago

            Exactly! You made your choice. Keep it to your fracking self and stick with discussing technology. Your personal situation is none of our concern.

            • sweatshopking
            • 3 years ago

            Your posts are getting less and less enjoyable.

      • End User
      • 3 years ago

      I thought you already thought this through when the 480 was released. You were going to stick with the R9 290 you already had.

      You really are high maintenance.

    • Chrispy_
    • 3 years ago

    Performance leaks are out and everyone’s seen them, not that it was even remotely a surprise.

    It’s exactly half a 1080 where it counts, and so the performance will be half the 1080’s in most games. That’s going to put it a bit quicker than an RX 480. Even though it has more than half of the memory bandwidth/TMUs/ROPs of the 1080, that won’t matter, because those aren’t the limiting factors for any of the resolutions the 1060 is targeting.

    What I’m waiting for is the power and noise testing:
    1) I want to see how quiet/small/cool the 1060 can go.
    2) I want to see the efficiency scaling to a smaller chip, as an indicator of laptop models to come

    • chuckula
    • 3 years ago

    Good catch by PrincipalSkinner who was… [b<][i<]first[/i<][/b<] with the link confirming that the GTX-1060 is indeed on the GP106 silicon and is not merely another cut-down GP104 die like what is used in the GTX-1080 & 1070.

    Die size estimation fun time! Here’s a link to the official dimensions of a GDDR5 package, which I am using as a reference to measure the die size of the GP106:
    [url<]https://www.micron.com/products/datasheets/65c410ee-af9c-4d6f-b35b-595ce11150c4[/url<]

    Here’s the photo I analyzed that includes both the GP106 and a GDDR5 chip:
    [url<]http://cdn.videocardz.com/1/2016/07/Geforce-GTX-1060-28.jpg[/url<]

    And here’s my (obviously estimated) die size measurement:
    Pascal width: 17.6 mm
    Pascal height: 10.6 mm
    Pascal area: 186.56 mm^2 (!)

    Admittedly that wasn’t the best shot, but at only 187 mm^2 [let’s assume a +/- 10% error margin] the GP106 is a decent bit smaller than its Polaris 10 competitor’s die size [232 mm^2 according to TR: [url<]https://techreport.com/review/30328/amd-radeon-rx-480-graphics-card-reviewed[/url<]].
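
For anyone who wants to repeat the estimate, the method is straightforward scale-reference measurement. A sketch of the arithmetic (the pixel counts below are hypothetical placeholders, not chuckula's actual measurements):

# Scale-reference die sizing: measure an object of known physical size
# (the GDDR5 package from the Micron datasheet) in the photo, derive a
# mm-per-pixel scale, then apply it to the GPU die.
def die_area_mm2(ref_mm, ref_px, die_w_px, die_h_px):
    scale = ref_mm / ref_px  # mm per pixel
    return (die_w_px * scale) * (die_h_px * scale)

# Placeholder pixel counts (NOT real measurements), just to show the call:
print(f"example: {die_area_mm2(10.0, 100, 176, 106):.2f} mm^2")

# chuckula's stated result, with his +/- 10% error margin:
area = 17.6 * 10.6  # 186.56 mm^2
print(f"{area:.2f} mm^2 (range {area * 0.9:.0f} to {area * 1.1:.0f} mm^2)")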

      • PrincipalSkinner
      • 3 years ago

      Principal means ‘first’ in High Valyrian.
      edit:
      stupid typo

    • PrincipalSkinner
    • 3 years ago

    Charlie must be choking on his own bs right now.
    [url<]http://semiaccurate.com/2016/07/06/nvidias-gp104-based-gt1060-real[/url<] [url<]http://videocardz.com/61942/nvidia-pascal-gp106-and-gtx-1060-pcb-pictured-up-close[/url<] In other news : [url<]http://hexus.net/tech/news/software/94249-microsoft-makes-multi-gpu-support-easier-dx12-devs[/url<] Seeing this and seeing how high end is around 2X performance and 3X the price I can't help but think if midrange cards will keep high end prices in check in the future.

      • chuckula
      • 3 years ago

      Charlie has a lifelong allergy to this thing called “reality”.

        • PrincipalSkinner
        • 3 years ago

        I remember how he was claiming for months that one of Ngreedias GPUs was ‘unmanufacturable’. Big Kepler or Maxwell, can’t remember.
        A lot of people did call him out on his bull****.

    • jokinin
    • 3 years ago

    So to get a little more performance and more memory than the 4GB RX 480, you have to pay $80 more? I don’t see how this is a great deal.

      • pranav0091
      • 3 years ago

      How does about ~4% more cash for ~10-15% more perf sound then?
      I’m not even talking about the TDP yet.

      Ah, yes, wait until the reviews are out for exact performance numbers.

      <I work at Nvidia, but my opinions are personal>
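
Presumably that "~4%" compares the GTX 1060's $249 MSRP against the 8GB RX 480's $239 MSRP; that baseline is my reading, not the poster's stated figure. A quick sketch:

# Hypothetical reading of "~4% more cash": $249 GTX 1060 MSRP vs. an
# assumed $239 8GB RX 480 baseline; perf deltas per the poster's claim.
price_1060, price_480 = 249, 239
extra_cash = price_1060 / price_480 - 1
print(f"extra cash: {extra_cash:.1%}")  # ~4.2%
for perf in (0.10, 0.15):
    ratio = (1 + perf) / (1 + extra_cash)
    print(f"+{perf:.0%} perf -> {ratio:.2f}x perf per dollar")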

        • jokinin
        • 3 years ago

        You’re talking about the 3GB VRAM version? I don’t see 3GB as a future-proof amount of VRAM. I think 4GB will become the next minimum requirement.
        If I were to get a new Nvidia card I’d have to go for the $299 6GB GTX 1060, and yet I’d still get a little more performance (maybe), with less memory, for $30 more than the 8GB RX480.
        Anyway, it will be interesting to see how AMD reacts to this.

          • tipoo
          • 3 years ago

          There is no 3GB version

          • pranav0091
          • 3 years ago

          It’s $249 MSRP for the 6GB. It’s $299 for the 6GB FE. Take your pick - both available on the same day.

          Since you are so concerned about geebees and not about game performance, you’ll get better deals if you buy DRAM chips that aren’t attached to boards with GPUs.

          <I work at Nvidia, but my opinions are only personal>

            • tipoo
            • 3 years ago

            I hope higher availability means that partners won’t price it closer to the FE, like the 1080 and 1070

            • pranav0091
            • 3 years ago

            I’d think so too; the supply should be better since this is a mainstream part, but supply-demand is a two-factor equation - and mainstream parts will have higher demand too.

            I’m an engineer, I don’t have much of an idea how the markets work…

            <I work at Nvidia, but my opinions are only personal>

          • Chrispy_
          • 3 years ago

          If the AMD product ends up being the less desirable one again, the 4GB RX 480 will probably sell for less than the $200 MSRP. Probably down to $175 or lower if you use MIRs.

          Nvidia won’t be selling very many $300 1060s if that happens.

        • Meadows
        • 3 years ago

        What’s it like to work at NVidia? Serious question.

        Is it an extravagant hipster environment like at Google, or rather just business as usual?

          • pranav0091
          • 3 years ago

          Where I work, in India, it’s a very “normal” atmosphere - normal as in how a normal set of office cubicles (partitions?) would look. There is decent privacy due to the high-ish walls, but you can look around if you stand up, and every person you meet seems to know something that interests you as an engineer. Nobody has any better-than-standard cubicles, not even the site leads. A newer building has motorized standing desks if that interests you. Free lunch and free office transport. No fixed timings, just get the work done. Very open culture, and very down-to-earth people. Some of us play (football/volleyball/cricket/foosball/ping-pong) at the office.

          This is my first job, so I might be a bit lacking on perspective, but with what I know from friends working in other silicon companies, Nvidia is the more fun place to be – considerably more so compared to the east-Asian companies. And it helps to have some diverse work to do – performance to modelling to bringups – though I have been luckier than many on that front.

          I guess that puts it as “business as usual” ?

          <I work at Nvidia, but my opinions are purely personal>

            • tipoo
            • 3 years ago

            I sure want to see their low-poly office when it’s done.

            [url<]http://media.bestofmicro.com/nvidia-campus,C-W-373712-13.jpg[/url<]

            • DrDominodog51
            • 3 years ago

            I drove past it this week. It will be a while before you can see it complete.

            • tipoo
            • 3 years ago

            Yeah, they had hit the pause button on it until recently

            [url<]http://www.bizjournals.com/sanjose/news/2015/03/31/nvidias-futuristicsanta-clara-campus-is-back-on.html[/url<]

          • DPete27
          • 3 years ago

          If we’ve got a resident Nvidia employee, why not ask when Nvidia will be adding FreeSync support.

            • f0d
            • 3 years ago

            Multiple Nvidia people have already said they have no plans to add FreeSync.

            • pranav0091
            • 3 years ago

            Well, just because I am an employee doesn’t mean that I am the official spokesperson. So I cannot make official statements, obviously. :)

            <I work at Nvidia, but my opinions are purely personal>

    • selfnoise
    • 3 years ago

    I will buy the first of either the RX480 8GB or this card that is available in AIB configuration at the MSRP of $250.00.

    I’ll see y’all in 2017.

      • tanker27
      • 3 years ago

      I think you are confused by the terminology. [b<]All[/b<] graphics cards are AIB.

        • ThatStupidCat
        • 3 years ago

        For me it’s just recently that I’m seeing AIB so I kept thinking this must be something special. Finally googled “AIB video card” and I get “add-in board”. Jaw drop. Face-palm. SMH

          • chuckula
          • 3 years ago

          It was like when I figured out that flammable and inflammable mean the same thing!

    • unclesharkey
    • 3 years ago

    Supposed leaked benchmarks puts the 1060 ahead of the 480.

    [url<]http://www.forbes.com/sites/jasonevangelho/2016/07/06/nvidias-gtx-1060-outperforms-radeon-rx-480-matches-gtx-980-in-leaked-benchmarks/#1e9e992848a7[/url<]

      • Tristan
      • 3 years ago

      Nothing new. Review is planned probably for 9.07, ten days before release, just like for 1080

    • Ninjitsu
    • 3 years ago

    [quote<] Considering that Nvidia claimed that the GTX 1080 offered up to twice the performance of the GTX 980 before it, the GTX 1060 is logically positioned to deliver the performance of a GTX 980 for about half of that card's launch price. [/quote<]

    The 1080 was 70% faster on average, IIRC. So this is going to be closer to the 970, I think.

      • terranup16
      • 3 years ago

      Early performance leaks suggest otherwise.

        • rxc6
        • 3 years ago

        Personally, I consider a 3dmark benchmark to be useless.

        • geniekid
        • 3 years ago

        Early performance leaks suggested the RX480 would be on par with the 980 as well and that didn’t pan out.

        We’ll just have to wait for actual reviews from TR and AT to see.

    • ronch
    • 3 years ago

    Yes, you can CF a couple of RX 480s.

      Why kill just one slot when you can kill two? :D

      • DancinJack
      • 3 years ago

      How many comments are you going to make on one story? Sheesh man.

        • ronch
        • 3 years ago

        Why? If people don’t like it when I post more than one comment, just tell me. I’m beginning to get tired of people around here who think they’re the only ones who can express their opinion, quite honestly. I’ll just be a lurker and avoid all the downthumbs and rude replies from people defending their favorite companies.

          • DancinJack
          • 3 years ago

          Not sure where you conjured all that up from. Sorry you’re having a bad day /shrug

          IMO, the three comments you made, seconds apart, could have been one comment. I just wasn’t sure why you had to make separate ones. I didn’t say anything about not wanting you to post comments at all. If you’re that sensitive to my “rude” comment though, maybe you are right that you shouldn’t be posting comments on TR articles.

    • ronch
    • 3 years ago

    Like I’ve said, this round in the GPU wars won’t be any different from how it’s been for the last 4 years. And it’s starting to look like Polaris is nothing more than a last-gen part with a few new funny bits tacked on. Reminds you of Bulldozer’s later iterations, doesn’t it?

    • ronch
    • 3 years ago

    Like I’ve said, Pascal will be more efficient than Polaris. That’s why it’s important to design a superior product because it allows you to easily chase your competitor wherever they go, until they’re out of breath and you catch them and eat them up.

    • wingless
    • 3 years ago

    The “death of SLI” on this card is a BIG DEAL. People buy this class of card with the hope of having the money to add another for double the performance. I’ve done this several times in my upgrade cycles over the last decade.

    This seriously bothers me….

      • chuckula
      • 3 years ago

      Well, if DX12/Vulkan are really the big deals that they claim to be, then traditional SLI and Crossfire are on the way out anyway.

        • willyolio
        • 3 years ago

        Are the 1060s fully DX12 compatible though?

        edit: lol to the downvoter who doesn’t understand. DX12 multi-adapter uses asynchronous compute AFAIK. nVidia’s implementation doesn’t really do asynchronous compute, just their own version that sort of does it halfway.

          • chuckula
          • 3 years ago

          I think it’s you who doesn’t understand the DX12 multi-adapter setup considering it’s been shown working perfectly fine using so-called “non-DX12” Maxwell hardware, much less Pascal which doesn’t seem to have any problem with DX12 at all.

          [url<]http://hexus.net/tech/news/software/94249-microsoft-makes-multi-gpu-support-easier-dx12-devs/[/url<]

        • hiki
        • 3 years ago

        Nope. It is tied to Windows 10.

      • Chrispy_
      • 3 years ago

      These will still likely work in DX12 multi-adapter mode, so it’s not really “death of SLI”. Additionally, AMD has been doing Crossfire over XDMA for a while now.

      More relevant to the point though, is that you really don’t get double the performance. At best you get 1.8x the performance, and even then that’s if your measure of performance is “average FPS”.

      The minimum FPS doesn’t improve as much as you’d hope, the frame latencies aren’t as good as you’d hope, the input lag isn’t as good as you’d hope, and the heat+noise+power consumption isn’t as good as you’d hope. I’m not particularly sad that Nvidia is slowly dropping SLI support. It’s a bodge that doesn’t double the quality of the user experience, even if it might in some benchmarks.
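
To put rough numbers on that argument, here's a sketch of perf-per-dollar using the 1.8x best-case scaling quoted above and the "half a 1080" relationship from the article; all figures are approximations, not benchmarks:

# Rough perf-per-dollar at suggested prices: one GTX 1060, two in SLI
# (1.8x best case, per the comment above), and a GTX 1080 (~2x a 1060).
options = {
    "GTX 1060":     (1.0, 249),
    "GTX 1060 SLI": (1.8, 498),  # best-case average-FPS scaling
    "GTX 1080":     (2.0, 599),
}
for name, (perf, price) in options.items():
    print(f"{name}: {perf:.1f}x perf, ${price}, "
          f"{1000 * perf / price:.1f} perf per $1000")

Even on that generous average-FPS basis, the SLI pair only edges out the single big card on value, before the frame-time caveats above are taken into account.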

      • DancinJack
      • 3 years ago

      Considering the past statistics on the amount of people that use CFX/SLI, I’m not really inclined to believe that people, more than a few, buy mid-range cards with the intention of pairing them with another down the road. Sure, you might be the exception to that rule, but you’re definitely not in the majority.

        • the
        • 3 years ago

        I fathom that SLI adoption rates would be higher if the cards were on the market for longer. The GPU upgrade space works in year long cycles. If you don’t get that second card within that first year of launch, something newer/faster will be out in the same price range. Attempting to get a second card two or three years after they launched is difficult and not that great of deal compared to newer cards released in that time frame.

      • blahsaysblah
      • 3 years ago

      When sites tested SLI/CF recently, it [b<]still[/b<] only worked in 6 of the 16 games tested. Even there it wasn’t always perfect scaling; it was something like a 20% improvement on average across the games. SLI/CF as it is, is only valuable for a small number of gamers who play a specific game a lot. If you truly want to spend $300 twice, wait a year and spend $600 on a GTX 1080. Did you not read that Nvidia now only supports 2-way SLI, and only on the high-end cards? Unreal Engine can’t even support SLI because some of the advanced effects rely on previous-frame information...

      SLI is not a simple magic bullet. It was never solved. It’s no different with multi-CPU systems: each CPU has its own locally attached memory, and you can’t willy-nilly just run things randomly. You will easily saturate the shared bus if one CPU tries to work on data in another CPU’s RAM. SLI is not magic.

      • BoilerGamer
      • 3 years ago

      Don’t.

      The latest Steam hardware survey didn’t even have a CFX/SLI percentage because they were too low to be worth tracking. With DX12 Multi-adapter mode, bridge CFX/SLI will be obsolete soon.

      • tipoo
      • 3 years ago

      Not sure why so many are defending its removal. DX12 multi-adaptor may be cool, but games have to explicitly use it, SLI would have been the band aid in the mean time. And even if 1 more powerful card is better, for budget builds people may want to get 1 now and another when it’s dirt cheap in 2 years for example.

      Why defend artificial market segmentation? It’s plain anti consumer.

        • Leader952
        • 3 years ago

        Maybe SLI on mid-range cards is not needed because of this:

        Microsoft plans DirectX 12 improvements
        Wants to simplify Multi-GPU support

        [url<]http://www.fudzilla.com/news/41069-microsoft-plans-directx-12-improvements[/url<]

          • tipoo
          • 3 years ago

          Isn’t that exactly what I addressed? Simplifying is cool, but it still needs explicit support, SLI would be a band aid until then. Why defend its removal at any rate.

            • Leader952
            • 3 years ago

            No it isn’t what you addressed. You seem obsessed about the missing SLI on the GTX 1060, whereas this:

            [quote<]Microsoft plans to simplify Multi-GPU support in DirectX 12 with a new abstraction layer that will help developers implement support without the need for deep coding.
            [url<]http://www.overclock3d.net/articles/gpu_displays/microsoft_plans_to_simplify_multi-gpu_support_in_directx_12/1[/url<][/quote<]

            shows that Multi-GPU will be more readily available, thus negating the NO SLI rant.

            • tipoo
            • 3 years ago

            You seem to be getting more upset about this than anyone; me simply lamenting the loss of SLI isn’t a “rant” nor does it make me obsessed. Not sure why any of this makes you feel the need for provocative words; it certainly hasn’t for me.

            How many DX12 games are coming out today, vs DX11? SLI would patch things over until more games support DX12 as well as multiadaptor, regardless of the abstraction layer that makes MA easier. That’s what I’ve said three times now, if you miss the point and talk past me again I give up on your reading comprehension.

        • wingless
        • 3 years ago

        Buying my second GTX 760 was an AMAZING experience. In fact, the pair were faster than my current GTX 980, but the 2GB framebuffer and a hookup price on the GTX 980 made me pull the trigger. Honestly if I had had two 4GB cards, I’d still be using them in SLI.

        I think the folks that are discounting SLI performance are people that have never had dual GPU setups or had them back in 2005 when dual GPU configs were terrible. Heck, folks with lower end cards likely SLI/Crossfire them at a higher rate than the ultra high-end GPU users.

          • Chrispy_
          • 3 years ago

          BS detector is going madcrazy right now.

          [url=http://www.anandtech.com/bench/product/1745?vs=1716<]Quick sanity check[/url<]

          Yep, in most cases a GTX 980 looks to be about 3X faster than a 760. If you can get >100% scaling from SLI, some people at Nvidia want to make you a millionaire; go give them a call!

          • f0d
          • 3 years ago

          I have tried dual GPU with every model of card I have had since my 4850 (GTX 260, 6950, GTX 670) and it has never worked flawlessly.

          Even in a lot of AAA games there are issues. When I tried my dual 670s in pCARS the framerate was about 75% higher, but there was a stutter and the frametimes weren’t as smooth as with a single 670. Admittedly it DID work early on without issue, but after a few months of patches and driver updates SLI made the experience worse than a single card.

          In Planetside 2 I get much higher frame latency with SLI than I do with a single card, and again, while the framerate is high it is not very smooth; the experience with a single card was better.

          No doubt I will try CrossFire again with my R9 290 soon, but I think that yet again it will disappoint me when it comes to smooth frame delivery.

      • designerfx
      • 3 years ago

      It’s definitely a direct shot against the value proposition of the card but I have no idea how big the real market for SLI on a card like this from Nvidia would be.

      • Krogoth
      • 3 years ago

      The loss of “SLI” support is not a big deal in the grand scheme of things.

      SLI/CF has always been a gimmicky feature that only made sense if you needed to reach a high framerate at your monitor’s native resolution and a single-card solution didn’t cut it. The framerate is then high enough to effectively mask the “micro-stuttering” from AFR.

      I suspect Nvidia removed it because the bulk of their SLI users(a tiny minority) used high-end cards. They want to save on bottom-line costs. AMD does the same thing with their lower-end stuff as well. Nobody cries about that.

        • the
        • 3 years ago

        nVidia also nerfed SLI on the high end, though, by limiting it to 2-way SLI.

        I fathom the reason for removing SLI support on the GTX 1060 stems from price creep on the GTX 1080. Previously, the cost of two midrange cards would equal that of a single high-end card and would provide similar performance when SLI worked well. Now that high-end cards carry a higher premium, especially with the Founder’s Edition, there is a fiscal reason to keep SLI a high-end feature.

          • Krogoth
          • 3 years ago

          SLI/CF never made any sense beyond two cards outside of epenis benchmarks.

          The CPU overhead just eats away any meaningful gain in real-world applications, and it amplifies all of the teething issues with multi-card rendering solutions.

      • Flapdrol
      • 3 years ago

      You’d be better off getting a 1070 and selling the 1060 in that case.

      Multi-GPU was somewhat reliable in the past, but it’s been pretty bad since last year. The 1060 is the 960 replacement, and I would never recommend putting 960s in SLI. Better off with a 970.

        • travbrad
        • 3 years ago

        Yep a lot of people seem to forget you can sell your old card and recoup a lot of the cost, and if you put that money towards a single more powerful GPU you will end up with a better gaming experience in the vast majority of games. Suddenly “doubling up” doesn’t seem like such a great deal when you realize this.

        You might get some lower average framerate numbers in a few games, but even most of those will be smoother with a faster single GPU (not to mention less heat/power usage).

    • anotherengineer
    • 3 years ago

    So $300 for the Founders Edition, and it looks like a 3+1 phase setup?

    [url<]https://www.techpowerup.com/223961/nvidia-geforce-gtx-1060-founders-edition-pcb-pictured[/url<]

      • chuckula
      • 3 years ago

      Well if you want a lower price on a beefy voltage regulation system — and you want to use it for all its worth — then go ahead and get an Rx 480.

        • blahsaysblah
        • 3 years ago

        That’s just mean.

          • DancinJack
          • 3 years ago

          no it’s not. just true.

            • chuckula
            • 3 years ago

            It’s actually both ;)

        • tipoo
        • 3 years ago

        Harsh, but true. AMD’s really are overdesigned; it was overkill for the Fury X and they ported it down to the 200-dollar mark. I wonder why? Do economies of scale just mean it’s less costly to keep using the same ones? Or the engineering cost? Or did they just run out of time? Nvidia doesn’t go so overkill, not by a long shot. At 120°C these can still deal out ~400A; that’s crazy town banana pants.

          • chuckula
          • 3 years ago

          In AMD’s defense, the over-engineering came in handy when they had to tweak the drivers to push more current through only 3 of the phases to solve the PCIe power slot issue.

            • tipoo
            • 3 years ago

            I think even using three phases, 120°C delivering 400A still has leagues of margin :P

        • anotherengineer
        • 3 years ago

        Well, here are some 4+1 designs on $200 GTX 960s:

        [url<]https://www.techpowerup.com/reviews/ASUS/GTX_960_STRIX_OC/5.html[/url<]
        [url<]https://www.techpowerup.com/reviews/MSI/GTX_960_Gaming/5.html[/url<]

        For $300 I would expect a bit more decent PCB/components/setup/whatever. My old $130 (when it was brand new) HD 6850 has a 3+1 setup.

          • terranup16
          • 3 years ago

          Here’s a 6+1 GTX 1060-

          [url<]http://wccftech.com/geforce-gtx-1060-g1-gaming-gigabyte-is-official/[/url<]

    • rudimentary_lathe
    • 3 years ago

    GTX 980 performance for $250? If so, AMD is in serious trouble. I wouldn’t buy a card with GTX 970 performance and negligible overclocking ability when I can buy a card with GTX 980 performance and 15-20% overclocking ability for only $10 more. Not to mention better drivers and accompanying software.

    I think we’re seeing AMD’s diminished R&D resources coming home to roost. I was really rooting for them due to their support of open standards, but I may have to support the dark side this go ’round.

    Edit: If only they supported FreeSync :(

      • benedict
      • 3 years ago

      Keep dreaming. You won’t be able to buy this for $250 for at least 6 months.

        • rudimentary_lathe
        • 3 years ago

        Perhaps, but I also can’t buy the RX 480 for its MSRP at the moment, either. We may not see the same price gouging with the 1060 as we’re seeing with the 1070 and 1080 since the AIB cards are launching at the same time as the FE cards, and FE cards will only be available for sale on the Nvidia website.

        I stand by my original statement though. If I can get 20-25% more performance after overclocking for just $10 more – and use less power in doing it – why would I purchase the other product?

          • anotherengineer
          • 3 years ago

          Well, you can get them at their MSRP:

          [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814202222[/url<]

          ...once they are in stock, that is.

        • chuckula
        • 3 years ago

        No, I’ll get a custom version for $250.

        • CScottG
        • 3 years ago

        My guess is late October at the latest, and by late November you should see DISCOUNTS (from retail).

      • Demetri
      • 3 years ago

      I think they’re in pretty good shape with the 4GB model @ $200. I don’t know how useful the extra 2GB on the 1060 would be. Also, I’d wager the 480 will be just like every other AMD release in recent years: half-baked at launch, but it will gain ground against the Nvidia offering over time.

      • N3M3510
      • 3 years ago

      Where exactly did you see a 1070 or 1080 getting a 15-20% jump in performance with overclocking? Because of all the sites I visit, the largest increase from OC was 12%.

        • ColeLT1
        • 3 years ago

        Mine is a factory-overclocked card (MSI Gaming X), and I got a 13% gain on top of that, so I could see 15-20% for a stock clock vs. an overclock.

        Stock GTX1070 = 1952mhz core 1.062v stock 4000mhz memory (8000 mt/s)
        Overclock = +100core(2076mhz) +700memory(4700mhz) (9400 mt/s)

        Metro 2033 very high 2560×1440
        970oc 37fps
        1070 54fps
        1070oc 61fps

        Unigine Heaven Maxed 2560×1440
        970oc 34.7fps 873
        1070 55.7fps 1402
        1070oc 62.3fps 1570

        BioShock Infinite bench (Average, Min, Max)
        970oc  Preset 2 2560×1440:  83.60, 29.41, 155.73
        1070   Preset 2 2560×1440: 111.36, 31.54, 204.33
        1070oc Preset 2 2560×1440: 124.40, 66.12, 239.18
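
Working the stock-vs-overclock percentages from those averages (a quick sketch using only the numbers quoted above):

# Percentage gains implied by the stock-vs-OC averages above.
results = {
    "Metro 2033":        (54.0, 61.0),
    "Unigine Heaven":    (55.7, 62.3),
    "BioShock Infinite": (111.36, 124.40),
}
for name, (stock, oc) in results.items():
    print(f"{name}: +{(oc / stock - 1) * 100:.1f}%")
# Metro: +13.0%, Heaven: +11.8%, BioShock: +11.7% -- in line with the
# poster's quoted 13%, short of 15-20%.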

      • designerfx
      • 3 years ago

      The sky has not fallen, thanks. I don’t expect this to even hit GTX 970 performance.

        • CScottG
        • 3 years ago

        An engineer from Nvidia below (comments) is also suggesting 980 performance (derived from improved performance over the RX 480) – and of course that’s “stock-to-stock”.

      • JumpingJack
      • 3 years ago

      I don’t think AMD is in trouble here. What I think happened in this case is that NVidia answered AMD’s push in the midrange earlier than they expected.

      What we are observing here is competition for market share at work. AMD priced the 480 to move, specifically to establish a higher share in the sweet spot of the market. Nvidia had one of two choices — answer it or ignore it. They chose to answer it.

      AMD may counter with a lower price to entice more users over to their side. Who knows?? However, one thing is clear: the consumer is the clear winner here. You are going to be able to get performance that last year cost > 500 bucks for roughly half that (after the demand side wanes enough to stop the gouging :) )

      • bfar
      • 3 years ago

      Polaris 10 was their first crack at a new process in a new foundry. They were wise to go for a small chip. I reckon AMD’s next couple of chips will come with pretty steep improvements as they learn the process.

        • sweatshopking
        • 3 years ago

          It wasn’t a process problem. This chip is basically the 390, but consumes hundreds of watts less. The problem is the architecture; it’s the old-ass GCN. NVidia has made huge changes to its chips. AMD hasn’t.

          • f0d
          • 3 years ago

            What I find disappointing is that the RX 480 is slower than the 290 at the same frequency. My 290 runs at 1200MHz and gets a higher score than the 480 at 1266MHz.

            My 290, with a graphics score of 14104: [url<]http://www.3dmark.com/fs/8172377[/url<]
            An RX 480 review with a graphics score of 12195: [url<]http://www.guru3d.com/articles-pages/amd-radeon-r9-rx-480-8gb-review,26.html[/url<]

            • sweatshopking
            • 3 years ago

            Yeah. It isn’t a speed demon. If you want fast, go NVidia. AMD is cheaper, and it works, but it isn’t sparkle-magic good.

    • ptsant
    • 3 years ago

    I approve of the pricing. It remains to be seen if it matches the performance and if it applies to actual products (+$100 markup -> huge fail).

      • Concupiscence
      • 3 years ago

      I’m more concerned about retail channel availability. If the 1070 and 1080 are any indication, Nvidia’s been unable to scale to meet demand on their new parts. The 1060’s bound to be cheaper to manufacture and have better yields, but that doesn’t guarantee the yields can meet the greater demand at $250 either. If rumors are true and Nvidia pushed the timetable a month ahead to try and head off the 480, don’t expect these to flood the marketplace in earnest for a while either.

    • xeridea
    • 3 years ago

    I will never understand why people pay a premium for the "Founder's Edition" when all it is is a reference card that will surely be outdone by third parties. It's like buying a special edition car or a collector's edition game, except in those cases you actually get something extra. In this case, you get something subpar compared to the third-party solutions that arrive shortly after.

      • ptsant
      • 3 years ago

      Because it’s available. The other versions are magically absent.

        • xeridea
        • 3 years ago

        If they didn't do the FE, some of the others would be available. The only reason the FE is available is that many don't want to play this game. Currently, the $699 FE cards are all out of stock, but some at $886 are in stock; I wonder why. Really, what it comes down to is that Nvidia knew it would be out of stock for several months with this limited launch, so it is joining in on the price gouging.

          • JustAnEngineer
          • 3 years ago

          “The way it’s meant to be gouged.”

            • CScottG
            • 3 years ago

            LOL! Good one, a post worthy of many up-votes!

          • Ifalna
          • 3 years ago

          Capitalism in a nutshell.
          Tune the prices as high as you can get away with.

      • davidbowser
      • 3 years ago

      Instant gratification is a real thing. Nobody really seems immune to it.

      I too am fascinated by people's desire to have something NOW, when even a modest research effort on their part would show that a faster, more cost-effective, or cooler-looking version will be available in a matter of days.

      I have only pre-ordered a piece of tech a couple times, and had to RMA one of them, so I won’t be doing that again if I can help it.

        • Krogoth
        • 3 years ago

        There are plenty of people who are immune to instant gratification. It is called having willpower. ;)

          • pranav0091
          • 3 years ago

          You seem immune to any kind of gratification. Krogoth is never impressed or gratified ;)

          • DrCR
          • 3 years ago

          That's not immunity, that's self-control. ;)

        • Laykun
        • 3 years ago

        Unless a difference of 50-100 dollars is meaningless to these people, which is generally the attitude of people who buy top-tier hardware.

        • anotherengineer
        • 3 years ago

        Indeed.

        I am still running an old AMD 955 and a Radeon HD 6850, which is on legacy driver support now. Unless I get a minimum 2x performance increase across the board for under $200 CAD, I will run the system until it dies.

        Now, if there were tons of good-paying jobs within commuting distance of my home, and I wasn't paying for a vehicle, a mortgage, 2 children, and a pension, then even $1000 would be trivial money. However, that's not the case, in my case.

          • davidbowser
          • 3 years ago

          Yep. Running an HD7750 and probably will look at an upgrade next year. I am the target audience for the price/performance charts at TR :)

      • blahsaysblah
      • 3 years ago

      Tons of people get non-K 6700 and 6500. Same reason.

        • xeridea
        • 3 years ago

        The 6700K vs. the 6700 gets you a big clock boost and unlocked overclocking. What do the FE cards get you? I fail to see the reason; can you explain?

      • slowriot
      • 3 years ago

      Nvidia is literally charging more for the inferior model. And they’re able to get away with it by artificially controlling the supply.

      It's not cool. Nvidia makes good GPUs, generally the best on the market. But they're also jerks, and with each launch AMD screws up, Nvidia will squeeze our throats tighter. It's not a joke. Founders Editions? Forced sign-ups to get current driver releases? G-Sync monitors carrying a $100+ premium over the same panel with FreeSync? This is just the start.

        • chuckula
        • 3 years ago

        Nvidia is doing this to be nice to their customers.

        But you and I aren’t their customers.

        AIB manufacturers are their customers, and those manufacturers don’t mind a bit when the “reference” model is actually more expensive than most of their custom models outside of the most exotic OC’d versions of the product. It gives them more wiggle room to customize cards at different price points.

        So there’s an early adopter tax, but hey, given some of the prices of the Rx 480 before it went unavailable, there was an early adopter tax on those cards too.

          • Leader952
          • 3 years ago

          What exactly is Nvidia's Founders Edition?

          [url<]http://www.pcgamer.com/what-exactly-is-nvidias-gtx-1080-founders-edition[/url<]

          [quote<]There's more to this than just making an alternative card, however. Nvidia has other partners, system vendors, who use their GPUs. Many of these will qualify specific hardware to work in their systems, and some of these vendors really like the old reference cards. They use blowers in place of open air coolers, which means they often work better in SLI configurations or small chassis - Falcon Northwest for example told me they won't put a non-blower GPU in their compact Tiki, due to heat concerns.

          If the reference design is only available for a short time before transitioning all graphics card production to the AIB partners, it can make things more difficult for system integrators. Now they can qualify the Founders Edition and rest easy knowing the cards will be available for purchase for a year or more. And that's basically the whole story: the [b<]Founders Edition will be an Nvidia manufactured card, just like the old "reference" models, only it will continue to be manufactured and sold throughout the life of the GTX 1080/1070 cards. It will carry a price premium, but that appears to be mostly a case of avoiding too much direct competition with their AIB partners[/b<].[/quote<]

          • xeridea
          • 3 years ago

          AIB manufacturers have been doing custom boards and coolers for as long as they've existed, with no complaints. If the cooler or board is better, customers don't mind spending a small premium over reference. The FE is supposedly meant to increase availability, but the cards have been nonexistent since launch.

          The early adopter tax on the RX 480 is strictly from eBay and Amazon gougers. Stocks were massive. There was decent availability for most of launch day, and given that the card is $500 cheaper and mainstream rather than ultra-high-end, it's reasonable for it to sell out.

          • NovusBogus
          • 3 years ago

          Fair point. That $650 flagship smartphone doesn’t actually make many sales at $650, either. The sporting goods industry is notorious for MSRP shenanigans.

            • MDBT
            • 3 years ago

            Sure it does, just on contract. $25/mo for 24 months is $600, and some places will bump you up to $27 and change, which ends up being $650.

            • Ninjitsu
            • 3 years ago

            Depending on your country, it could actually sell at $800+ without contract.

        • terranup16
        • 3 years ago

        But is it the inferior model? Is RGB not worth more than monochrome LED? Is electric blue not worth more than black? Does a metal phone not get its rating boosted by its material over a plastic one?

        Honestly, I think the main reason NVidia is charging more and “justified” in doing so stems almost purely from aesthetics.

          • slowriot
          • 3 years ago

          You can certainly prefer the aesthetics of the reference cooler. I think it looks good as well.

          However, it also doesn't cool as well or as quietly as the cards with open-air coolers. The FEs are also clocked lower than all the other models near the same price, and they have less robust power delivery than other 1080s with similar pricing. $679 is the price of EVGA's current top-end GTX 1080, and it's better in every way except, arguably, looks. Same with the Asus, Gigabyte, Zotac, etc. models.

      • wizardz
      • 3 years ago

      I guess mostly because watercooling manufacturers usually offer waterblocks for reference designs only. That most likely makes the blocks cheaper, because they won't have to make 4-5 different versions, which reduces the total cost of all the tooling required.

      Well, that's the scenario in my head and I'm sticking to it :)

      • Chrispy_
      • 3 years ago

      Because you're not part of the demographic that wants a Founder's Edition. Even before they were branded as "FE" cards, there was a distinct market for what used to be called Nvidia's TTM cooler, and for the most part there was a premium to pay for them (I'll admit, some places still sold them at reference price, but I think that was only a couple of stores, and only in the US).

      1) People who are impatient and must have the first card. Nvidia knows they’re impatient, so they can capitalize on this.
      2) People who want exhausting blowers, not open coolers. HTPC market, Workstations, Boutiques, multi-GPU users.

      I own two "Founder's Edition" GTX 970s and I paid a €50 premium for each of them, just like owners of the GTX 770 with the Titan cooler - there was a premium for that too. If you want a single-fan blower design, then quality matters, because the cheap ones are terrible. All of AMD's blowers have been bad for years, and Nvidia's cheaper reference blowers have also been bad (GTX 660, 760, 960). A good blower *is* worth paying for.

      If that's not you, don't worry: since you're not in group 1, you can deal with waiting for the vendors to release open coolers, and since you're not in group 2, you don't need to look at the FE or its premium price.

      The only real question is whether Nvidia’s GTX1060 “FE” cooler is actually high quality or not. If it’s a cheap extrusion with a rough-sounding fan, then people won’t be forgiving of the premium just for the GTX1060 FE.

        • slowriot
        • 3 years ago

        I sincerely doubt 2x FE 1080s will be cooler or quieter than, say, 2x EVGA FTW 1080s or 2x Asus STRIX 1080s. The FEs will absolutely be slower and won't have the same overclocking headroom.

        The case you're putting the cards in would have to have severely poor ventilation for the blower styles to come out ahead. I think one would be much better off addressing that issue than going with a card that's inferior in every way besides appeasing your desire for a blower-style cooler.

        Same goes for any GTX 1070 or 1060 that will come.

          • Chrispy_
          • 3 years ago

          What, am I talking to a wall?

          If your case has enough ventilation that two open coolers perform better than two exhausting coolers, then [b<]YOU ARE NOT PART OF THE TARGET DEMOGRAPHIC, SO STOP WHINING THAT THESE CARDS AREN'T FOR YOU.[/b<]

            • xeridea
            • 3 years ago

            I used to mine crypto, and I can attest to this. Two non-blower cards sandwiched next to each other in a case with the side off have major heat issues. I had to have an 8- or 10-inch house fan blowing between the cards so I didn't have to downclock them substantially just to keep under 80°C. For sandwiched multi-GPU, blowers are absolutely better (and that's the only time I would consider water cooling).

            My point actually wasn't about blower vs. custom. I was just talking about the craziness of now paying an extra $50-100 for the FE (which gets you a blower), when historically this has never been a thing. There have always been reference and custom coolers, and prices generally don't vary too much, though some exotic coolers fetch a small premium (more like $20).

            • slowriot
            • 3 years ago

            OK, ultimately people are beholden to their current "situation," i.e., what case they have, fans, etc.

            But ideally… if you’re maintaining good positive pressure in your case by taking in more than you’re exhausting… taking your side panel off would increase your temperatures.

            Even if you’re going the other route and going for negative pressure… again taking the side panel off would increase temperatures. If your temperatures improved by taking off the side panel then there were greater issues at play with your case cooling strategy.

            • Chrispy_
            • 3 years ago

            With two open coolers, the hot exhaust from one card gets sucked directly into the second card, no matter how good your airflow is.

            Insane airflow from a bajillion case fans can reduce the effect of this, but a blower design completely eliminates this issue.

            Also, not everyone has a bajillion case fans, because [b<]NOT EVERYONE HAS WHAT YOU HAVE.[/b<]

            • slowriot
            • 3 years ago

            I have 3 case fans, all low-speed Noctuas. You don't need insane case fans to maintain positive pressure.

            • slowriot
            • 3 years ago

            You're missing the point. You're choosing a card that is inferior in every other way simply because you refuse to use a better case. That's… silly, to say the least. And the extra you're paying for the cards could cover the cost of a new, better case.

            • Chrispy_
            • 3 years ago

            [quote<]You're missing the point[/quote<]

            No, [i<]you're[/i<] missing the point. For the [i<]n[/i<]th time, not everyone has the same use case as you; there are [b<]OTHER DEMOGRAPHICS.[/b<] (Hmmm, I'm getting a sense of deja vu. Feels oddly like I'm repeating myself.)

            In *my* example, I can't use a different case; the case was chosen because of its dimensions and layout. The case fits the airflow and size requirements of the environment it's in, and the blower complements the case.

            In other examples, people bought gaming PCs from Best Buy and they have one intake, one exhaust. They're [i<]not[/i<] the demographic that will buy a new case, strip the old one down, and transplant everything.

            I know it's going to disappoint you, but the PC Master Race contains a vast spread of different people using different hardware. Not everyone has a full-ATX, custom-built PC with multiple 140mm intakes and exhausts and a motherboard that provides adequate spacing between the two PCIe x16 slots. If I said that particular demographic was as large as 10% of the PC Master Race, I'd probably be considered overly hopeful, because for every person who bought a boutique gaming PC or built one themselves, there are tens, maybe hundreds, who didn't.

            • slowriot
            • 3 years ago

            [quote<]Ok. ultimately people are beholden to their current "situation" i.e. what case they have, fans, etc.[/quote<]

            Words I've said already. It's pointless talking to you, because you don't listen at all. You characterize others' arguments wrongly and then run with them. Nowhere did I state you need an army of fans; quite the opposite, actually. I have simply, repeatedly told you that you need positive pressure.

            I have also addressed your nonsense about Best Buy buyers. You're telling these people to BUY THE INFERIOR CARD FOR MORE MONEY THAN THE BETTER ONE INSTEAD OF SIMPLY GETTING A BETTER CASE FOR THE SAME MONEY.

            And that's it. I'm done here. None of this has anything to do with some BS "PC Master Race" or however you want to manipulate my statements. It's about pushing people in the right direction instead of building on stupid decisions, like you keep telling people to do. Good luck with that. Keep paying more for less. Enjoy.

            • Ninjitsu
            • 3 years ago

            I would point out that AIB partners almost always release blower versions too…

      • brucek2
      • 3 years ago

      These cards have a shelf life of relevancy. The clock is already ticking from the moment they are released, and purchasing the card later in the period does not extend its time of relevance.

      Think of it as someone offering you a 10% discount if you’re 15 minutes or more late to the movie. Sure you’re paying 10% less for the ticket, but you’ve also missed 15 minutes of the movie.

      The period of ready availability of these cards has not even begun yet, but we are already six weeks after launch of the 1080 – or more than 10% of the year in between new releases.

      • psuedonymous
      • 3 years ago

      [quote<]I will never understand why people actually pay a premium for the "Founder's Edition", when all it is is a reference card that will surely be outdone by third parties. [/quote<] Have you seen the rear-exhaust coolers OEMs slap on cards? They're shit. Pure shit. If you want a decent blower, it's reference/FE or nothing.

      • End User
      • 3 years ago

      The FE was available launch day, so I bought one. I had no interest in waiting around to play the silicon/cooler lottery just to get another 200 MHz out of my overclock. New cards are still coming out, so I'd still be waiting for the best one.

      I wanted a GPU with a blower fan.

      Temps and audio are not an issue. I’m a happy camper.

      Which 1070/1080 did you end up getting?

    • anotherengineer
    • 3 years ago

    1700 MHz boost; TSMC's silicon seems to be good stuff. Now to see whether e-tailers will tack on more $$ or leave the price as is.
    edit – misread too early in the morning

      • Jeff Kampman
      • 3 years ago

      That’s the boost clock, we don’t know the base speeds yet.

      • tipoo
      • 3 years ago

      That’s boost, base is ???MHz

      • Krogoth
      • 3 years ago

      Probably ~1.4-1.5 GHz, depending on how conservative Nvidia is with thermal and noise output.

        • Leader952
        • 3 years ago

        1506 MHz base, same as the GTX 1070.

        [url<]http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-1060-Preview-Pascal-GP106[/url<] Ryan made a boo boo and posted the BASE number.

    • tipoo
    • 3 years ago

    An innocent “1x 6-Pin” now seems mocking

      • DPete27
      • 3 years ago

      It’s fine if the card ACTUALLY sticks to its claimed <150W TDP.

        • Krogoth
        • 3 years ago

        Which is highly plausible, since the 1070 and 1080 only eat around 200W when fully loaded. A design that's effectively half of a GP104 should easily consume ~100-120W under full load.
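
        A quick budget check against the PCIe connector limits (75W from the x16 slot plus 75W from one 6-pin connector, per the spec) shows why a single 6-pin isn't alarming if the 120W figure holds:

        [code<]
        # Power budget for a 6-pin card, per PCIe connector limits.
        slot_w    = 75    # W deliverable through an x16 slot
        six_pin_w = 75    # W from one 6-pin auxiliary connector
        board_w   = 120   # W, Nvidia's quoted GTX 1060 board power
        print(f"Budget: {slot_w + six_pin_w} W, "
              f"headroom: {slot_w + six_pin_w - board_w} W")  # 150 W, 30 W
        [/code<]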

      • ImSpartacus
      • 3 years ago

      For better or worse, I kinda trust Nvidia to actually deliver. Poor AMD…

      • rudimentary_lathe
      • 3 years ago

      This chip is rumoured to overclock to over 2GHz like the 1070 and 1080. Hopefully the partner boards have an 8-pin.

        • Ninjitsu
        • 3 years ago

        Did GP104 actually hit 2GHz on air, in the “wild”?

        EDIT: oooh, that’s quite nice! thanks for the links!

          • terranup16
          • 3 years ago

          Yes:
          [url<]http://www.hardocp.com/article/2016/06/20/geforce_gtx_1070_1080_fe_overclocking_review/3[/url<]

          • jihadjoe
          • 3 years ago

          Yep. Every single 1070 and 1080 tested by TPU has hit at least 2GHz on air.
          Here's the latest card they tested; scroll down a bit for the Max OC comparison table.

          [url<]https://www.techpowerup.com/reviews/Palit/GeForce_GTX_1080_GameRock/28.html[/url<]

      • Ninjitsu
      • 3 years ago

      As does the 120W TDP :D

    • Krogoth
    • 3 years ago

    On paper, it looks like the 1060 will probably land somewhere between the 970 and 980. The 980 has more ROPs, shader units, and memory bandwidth, which will let it pull ahead as AA is added and resolution increases. The 1060's performance depends mostly on its massive boost speed.
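
    For a rough paper comparison (reference boost clocks and memory specs from public sources; FP32 rate = shaders × 2 ops × clock, and real-game results won't map 1:1):

    [code<]
    # Theoretical shader throughput and memory bandwidth at reference clocks.
    cards = {                # shaders, boost GHz, bus width (bits), GT/s
        "GTX 970":  (1664, 1.178, 256, 7.0),
        "GTX 980":  (2048, 1.216, 256, 7.0),
        "GTX 1060": (1280, 1.700, 192, 8.0),
    }
    for name, (shaders, ghz, bus, gts) in cards.items():
        tflops = shaders * 2 * ghz / 1000
        bw = bus / 8 * gts   # GB/s
        print(f"{name}: {tflops:.2f} TFLOPS, {bw:.0f} GB/s")
    [/code<]

    That puts the 1060's raw shader rate (~4.35 TFLOPS) between the 970 (~3.92) and the 980 (~4.98), but with less memory bandwidth than either, which fits the argument above.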

      • tipoo
      • 3 years ago

      Or does Pascal do anything similar to primitive discard acceleration on the AMD side? Then AA plus tessellation could be less of a big deal on it than on the 980, which relies more on brute force.

    • DPete27
    • 3 years ago

    I would recommend testing the review-sample RX 480 in both its 4GB and 8GB configurations for the GTX 1060 review.

      • akiraaisha
      • 3 years ago

      That would be awesome. I might switch from RX 480 to 1060.

      • anotherengineer
      • 3 years ago

      Here is a bit of a 4GB vs. 8GB comparison:
      [url<]https://www.techpowerup.com/223913/amd-retail-radeon-rx-480-4gb-to-8gb-memory-unlock-mod-works-we-benchmarked[/url<]

        • DPete27
        • 3 years ago

        ermergerd!! 6% improvement for 20% higher cost!!!!

        All kidding aside, that TPU “benchmark” is a joke. Looks like they spent a whole 5 minutes on their benchmarking and article write-up. The world deserves a more in-depth analysis.
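
        To put numbers on the value question (using the article's ~6% gain and the rough 20% price gap between the two configurations):

        [code<]
        # Perf-per-dollar of the 8 GB card relative to the 4 GB card.
        perf_gain  = 1.06   # ~6% faster, per the TPU numbers above
        price_gain = 1.20   # ~20% more expensive
        print(f"Relative perf-per-dollar: {perf_gain / price_gain:.2f}x")  # ~0.88x
        [/code<]

        In other words, the 8GB card is about 12% worse value at those prices, at least until games actually need the extra VRAM.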

          • anotherengineer
          • 3 years ago

          It was never really meant as a benchmark article; rather, it was to test whether consumer 4GB cards on the market could be unlocked to 8GB.

          And the test was a quick-and-dirty run at 4K to use up VRAM.

          “To confirm that this mod works, we first tested our 8 GB review sample with its untouched 8 GB BIOS, and used that as control. Next, we tested the retail 4 GB card with the BIOS it shipped with. Lastly, we flashed this 4 GB card using ATIFlash with the 8 GB BIOS, which we extracted from our 8 GB card using ATIFlash. We ran “Call of Duty: Black Ops III,” on the three. This game can consume dedicated video memory beyond 4 GB at 4K Ultra HD (3840 x 2160). “

    • Tristan
    • 3 years ago

    AMD must release more custom 4GB 480s with good OC potential. That's the only way to compete with the 1060.

      • Krogoth
      • 3 years ago

      The 1060 isn't going to be that much faster, if it's faster at all.

      Nvidia just needs to match performance and let their mindshare do the rest.

        • rudimentary_lathe
        • 3 years ago

        It's rumoured to be 10-15% faster than the RX 480 at stock. And if it overclocks like the 1070 and 1080, you can tack on another 15-20% (assuming the RX 480 doesn't overclock well with better cooling/power, which we don't know yet). That would be a huge difference in performance.
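
        Compounding those rumored ranges (rumors only, not measurements) gives a sense of the possible gap:

        [code<]
        # Stock lead x OC headroom, both taken from the rumored ranges above.
        stock = (1.10, 1.15)
        oc    = (1.15, 1.20)
        print(f"Combined: +{(stock[0]*oc[0] - 1)*100:.0f}% "
              f"to +{(stock[1]*oc[1] - 1)*100:.0f}% over a stock RX 480")
        [/code<]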

    • tipoo
    • 3 years ago

    I'm liking this. It looks like the rumored 3GB version isn't what fills the 250-dollar slot, and 250 dollars itself isn't too bad of a premium over the RX 480. Some were thinking Nvidia could get away with even $300, at least for the 6GB model.

    Yay for market competition; think what it would be like if no RX 480 had launched. I hope for AMD's sake the 480 is still a hit, but I'd be tempted up to the 1060 myself.

      • bfar
      • 3 years ago

      True :)

      However, if I were shopping in this price segment, I think I'd look at a 4GB RX 480 over this (especially if the rumors about unlocking the extra 4GB are true).

        • tipoo
        • 3 years ago

        They're more than rumors; retail cards in the wild have already been modded. However, it's probably a limited stock of 8GB boards that were converted. You have to take the card apart to check the chips first, or ??? happens if you flash a card with only 4GB of physical RAM to 8GB.

        • DPete27
        • 3 years ago

        I would NOT suggest buying a 4GB RX 480 expecting it to unlock to 8GB. VRAM is mature enough not to need binning/disabling, so it's counter-economic to include extra for no reason.

        The [url=https://www.techpowerup.com/223913/amd-retail-radeon-rx-480-4gb-to-8gb-memory-unlock-mod-works-we-benchmarked<]two[/url<] [url=https://tweakers.net/nieuws/113169/eerste-rx-480-gpus-met-4gb-hebben-8gb-geheugen-waarvan-helft-actief.html<]examples[/url<] I've seen are both German orders, and although I have no knowledge of retailers there (whether the retailer is legit), they're likely a pull from a VERY small batch of remaining review samples (which were built to give reviewers the ability to test two cards using one physical product... smart). Your chances of getting such a card at retail are VERY small and will only get smaller as the days go on. When custom-cooled boards come out [with proper 8-pin power, BTW], don't expect board partners to include an extra 4GB of VRAM for free.

        • pranav0091
        • 3 years ago

        What use is an extra 4 GB (+2 GB over the 1060) if the games run slower?

        <I work at Nvidia, but my opinions are only personal>

      • rechicero
      • 3 years ago

      We need to wait for the reviews. Maybe it's not 980 performance (half the 1080's shaders… but with slightly lower clocks) but 970-level. In that case, the Nvidia "premium" would justify the lower memory capacity and that's it. Anyway, we'll know soon enough. But I seriously doubt Nvidia would position the 1060 at the same price point as the 480 if it had better performance. They simply don't need to.

      • ImSpartacus
      • 3 years ago

      I think the general consensus is that we might get a cut-down "1050" with 3GB of VRAM. That's the card that will keep AMD up at night, because it'll be at or under $200.

      • slowriot
      • 3 years ago

      I'm surprised by the $250 price point for a 6GB model. It's better than I expected. We'll see where actual products end up and how difficult finding one without a ridiculous markup will be, though.

        • nanoflower
        • 3 years ago

        Based on what happened with the 1080, I wouldn't be shocked to see AIB cards coming out at $300-350, given that the partners believe their cards are better than the FE version and so worthy of a premium over the 1060 FE's $299 price. That won't be a permanent thing, but it may last through the summer.

    • chuckula
    • 3 years ago

    Incidentally, the NDA for the product announcement has lifted but the NDA for actual reviews lifts later.

    Phoronix has one in-house with some photos but no benchmarks: [url<]http://www.phoronix.com/scan.php?page=article&item=nvidia-gtx1060-soft&num=1[/url<] I eagerly await the overwatting results.

      • tipoo
      • 3 years ago

      I love that that’s a term now

      • maxxcool
      • 3 years ago

      OVERWATTS!!

        • Bomber
        • 3 years ago

        That’s 190 OVERWATTS!

        For some reason I’m picturing Doc Brown right now…

          • maxxcool
          • 3 years ago

          Hell yes on Doc Brown .. ;)
