Rumor: reference-cooled GeForce GTX 1060 breaks cover

Get your saltcellars ready, folks. Now that the GeForce GTX 1080 and GTX 1070 are on the market, it would only make sense that Nvidia has plans to begin filling out its lower-end lineup with Pascal GPUs, as well. A poster on the r/nvidia subreddit apparently got in touch with someone in possession of a reference "GeForce GTX 1060" card recently, and they were able to grab a high-resolution snap of the purported card. Feast your eyes:

Source: "samlhs" on Reddit

The poster didn't offer any other information on the card beyond this photo, but given what we know about Pascal chips so far and the past progression of Maxwell through the marketplace, it seems likely this card will use a new, cut-down GPU paired with some amount of GDDR5 RAM. It also seems likely to play in the $200-$300 range that AMD seems to be targeting with its Radeon RX 480. Until we hear something official, however, that's all just speculation. Enjoy the eye candy in the meantime.

Comments closed
    • maroon1
    • 3 years ago

    If the GTX 1060 has 1280 shaders and a 192-bit bus as rumored, then it will probably be faster than, or at least as fast as, the RX 480.

    The GTX 1070 has 50% higher shader power and only 33% higher memory bandwidth than the GTX 1060 (at the same clock).

    So, if we assume the GTX 1060 has the same clock as the GTX 1070, then the GTX 1070 will be somewhere between 33% and 50% faster.

    The GTX 1070 is already around 50% faster than the RX 480.

    Also, the GTX 1060 will probably be more power efficient. We already know that GP104 has better performance per watt than Polaris 10.
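
    For what it's worth, those ratios fall straight out of the specs. A quick back-of-envelope sketch (the 1060 figures are rumors; the 1070's 1920 CUDA cores and 256-bit bus are its known specs, and identical clocks are assumed):

```python
# Rough ratio check for the comparison above.
# GTX 1060 figures (1280 cores, 192-bit bus) are rumored, not confirmed.
gtx1070 = {"cuda_cores": 1920, "bus_width_bits": 256}
gtx1060 = {"cuda_cores": 1280, "bus_width_bits": 192}  # rumored

shader_ratio = gtx1070["cuda_cores"] / gtx1060["cuda_cores"]             # 1.50
bandwidth_ratio = gtx1070["bus_width_bits"] / gtx1060["bus_width_bits"]  # 1.33 (same memory clock assumed)

print(f"1070 shader advantage:    {shader_ratio - 1:.0%}")     # 50%
print(f"1070 bandwidth advantage: {bandwidth_ratio - 1:.0%}")  # 33%
```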

    • Ryhadar
    • 3 years ago

    Looks kinda fake.

    • Krogoth
    • 3 years ago

    Nvidia is trying to downplay the impending 480 launch in the $200-$299 tier, but the real question is whether they can get the 1060 out in numbers. The same goes for the 480 and 470.

    $200-299 is where the real money comes from in gaming GPUs. The high-end market has always been about halo products that barely cover their own R&D costs.

    • Theolendras
    • 3 years ago

    If only Nvidia would also support freesync…

      • Voldenuit
      • 3 years ago

      [quote<]If only Nvidia would also support freesync...[/quote<]

      I was on the G-Sync hate train before, but after doing some research I decided that G-Sync was the way to go for me. It does have some advantages over Freesync.

      For one thing, G-Sync works across the whole refresh range of the monitor (0-144 Hz for most G-Sync displays, 0-60 for 4K, 0-165 for some newer displays, and 4K models above 60 Hz later this year), whereas you have to be really careful before buying a Freesync monitor and look at the supported VRR range for each display. Some only do VRR between 30-90 Hz, some only 45-75 Hz, etc. Meanwhile, G-Sync has no minimum refresh rate because it uses the monitor's internal framebuffer to refresh the pixels before they decay.

      Also, G-Sync currently supports both fullscreen and windowed applications, whereas Freesync (afaik) still only supports fullscreen, although I would imagine this is something that could be fixed in drivers.

      The price premium on G-Sync is coming down (some models are only $50-70 above the Freesync counterpart from the same manufacturer). It sucks that there have to be competing and incompatible standards between the two GPU makers, but going Freesync limits you to one discrete GPU maker in the same way that going G-Sync limits you to Nvidia, even if it's mostly Nvidia's fault for wanting to keep their technology proprietary in the first place.

        • Theolendras
        • 3 years ago

        I’m not the one who downvoted you; your point of view makes sense. For me, though, buying a new monitor is a tempting prospect, but putting around $1,000 into one isn't a value buy to me, be it Freesync- or G-Sync-compatible. Spending that kind of money to lock myself into one vendor doesn't do it for me. Intel is going to support Freesync at some point, and it will become the de-facto standard afterward. Nvidia could keep developing G-Sync to add more and more features, but most of them probably won't be compatible with current monitors…

        Eventually, I'm quite convinced that Nvidia will support both technologies. At that point I'll view them more favorably than the competition if they maintain their current lead; otherwise, I can get a GPU with comparable performance for about 25-30% less money, with higher power consumption, and with the performance difference masked by variable-refresh display technology. Since I have an Intel 6700K, the only part I'm really likely to upgrade is the GPU, say every two GPU generations, while I'd expect the monitor to be good for 4-5 years. That would probably span three GPU purchases, locking me into one vendor along the way, and that's not the way I want to play it.

          • Voldenuit
          • 3 years ago

          That’s a perfectly reasonable decision. Most users’ monitors outlast the GPU, and it’s sensible not to lock yourself into a single vendor.

          It’s too bad that whichever way you go with VRR, you end up with some kind of vendor lock-in, because nvidia doesn’t want to play ball.

          But I’ve been very happy with my G-Sync display, as it’s given a noticeable improvement in gaming smoothness and video/desktop use, and I would recommend VRR (of whichever stripe) to anyone looking to buy a new monitor today.

        • NoOne ButMe
        • 3 years ago

        G-Sync is not a standard.
        Freesync is AMD's branding of the VESA Adaptive-Sync standard, shortened to A-Sync (oh the irony…). Nvidia actually has support for it; notebook G-Sync is really a rebranding of A-Sync.

        Also, what do you mean down to 0 Hz? That doesn't match any review or news I've seen. It operates down to 30 Hz or so, and for framerates below that it repeats frames to keep the refreshes seamless. Freesync can do that too, but, as you point out, only on a subset of monitors.

        G-Sync certainly has its advantages over Freesync, the largest being not having to research every single monitor's refresh range, but I'm lost when you talk about down to 0 Hz.

          • Voldenuit
          • 3 years ago

          [quote<]G-Sync certainly has its advantages over Freesync, the largest being not having to research every single monitor's refresh range, but I'm lost when you talk about down to 0 Hz.[/quote<]

          I was trying to come up with a way to convey that G-Sync operation has no lower bound; not sure if that was the best way. As you and Freon mentioned, Freesync does have a provision to repeat the last rendered frame, and I remember reading about that a while back, but the implementation seems to be more limited (kindly correct me if I am mistaken on this)*.

          But yeah, buying a Freesync monitor does mean the user should do more research into the VRR range. Really, the manufacturers are as much to blame, as those specs should be front and center, not hidden away as they are right now.

          * EDIT: Probably also worth pointing out that once framerates get low enough, no amount of display compensation will hide the fact that the user is getting a lousy experience, so this could all be academic.

        • Freon
        • 3 years ago

        I think your explanation of the technology is fundamentally wrong, but the trick here is that you just want a Freesync monitor with at least a 2.5:1 max:min refresh ratio so you get low framerate compensation (LFC). This allows the driver to double (or triple, or quadruple…) the refresh rate, just like G-Sync, which keeps variable sync working when framerates fall below the monitor's minimum refresh rate. This was a big deal that G-Sync had from the beginning and Freesync did not; however, AMD added it to the Radeon driver many months ago.

        I think that since AMD added LFC, the two are more or less at feature parity. Freesync is a bit less regulated, though; there are lots of panels with very narrow Freesync ranges like 40-75 Hz, which kinda sucks, so you just have to watch out.
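
        If it helps, here's a minimal sketch of the LFC idea described above, assuming a hypothetical 40-144 Hz panel. The actual thresholds and heuristics in the Radeon and G-Sync implementations aren't public, so this only illustrates the general frame-repeating principle:

```python
# Minimal sketch of low framerate compensation (LFC); not the actual driver logic.
# Panel range (40-144 Hz) and frame times below are hypothetical examples.

def lfc_refresh(frame_time_s, min_hz=40.0, max_hz=144.0):
    """Pick a repeat multiplier so each refresh stays inside the panel's VRR window."""
    max_interval = 1.0 / min_hz   # slowest refresh the panel supports
    min_interval = 1.0 / max_hz   # fastest refresh the panel supports
    multiplier = 1
    while frame_time_s / multiplier > max_interval:
        multiplier += 1           # re-send the same frame 2x, 3x, ... per render
    return multiplier, max(frame_time_s / multiplier, min_interval)

# Example: a 25 fps frame (40 ms) can't be shown once at 25 Hz on a 40 Hz-minimum
# panel, so it is scanned out twice at 50 Hz, which sits inside the 40-144 Hz range.
print(lfc_refresh(0.040))  # -> (2, 0.02)
```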

        • RandomGamer342
        • 3 years ago

        What monitors actually allow a full range of refresh rates though? The ROG Swift for example can’t handle <36 FPS.

        • Flapdrol
        • 3 years ago

        gsync may have some slight advantages, but freesync screens are cheaper and available in lower price brackets. All gsync screens are either high refresh, high resolution or both.

          • Voldenuit
          • 3 years ago

          Yeah, the only reason I got my G-Sync display was because I got a refurb Asus PG278Q for $210 off list. I would not have (and still would not) pay full MSRP for one.

          The Dell S271DG has been known to go on sale for $489-509 from time to time, at which point I'd say it's a good buy. Ironically, it hit $489 the day after my Asus arrived. Typical.

          At this point, though, it makes no sense to switch GPU camps just to chase one VRR flavor or the other. If you want VRR and have an AMD GPU, get a Freesync display. If you have an Nvidia GPU, get a G-Sync one. If you'd rather wait until the two parties stop squabbling and settle on a common standard and/or cross-compatibility, that is also an option. I was in the market for a new monitor and didn't want to wait. At worst, if I switch GPU makers in the next year or so, I'll still have a 144 Hz fixed-refresh display.

        • tipoo
        • 3 years ago

        I don’t deny the technological superiority at all, but on the card side…Whynotboth.jpg. It’s an open Displayport spec, so the only reason to *only* support G-sync is to tie you down to one class of monitor, and then that monitor further ties you down for future GPU purchases.

    • Theolendras
    • 3 years ago

    The technical alignment and launch window do make it sound credible. But the photo itself might just be a 1080 with a couple hundred pixels re-edited by a junior photoshopper…

    • puppetworx
    • 3 years ago

    Nice timing.

    • Ninjitsu
    • 3 years ago

    Oh no, GTX 1060 FE?

    • anotherengineer
    • 3 years ago

    I wonder if it's going to come in cheaper than the going CDN price for the 4GB GTX 960?

    Which after taxes is over $300:
    [url<]http://www.newegg.ca/Product/Product.aspx?Item=N82E16814121928&cm_re=gtx_960-_-14-121-928-_-Product[/url<]

    If it does, there might be a bit of old inventory kicking around for a while.

      • Flying Fox
      • 3 years ago

      Not unless we suddenly have a 10+ point appreciation in the Canadian dollar.

    • CScottG
    • 3 years ago

    Hmm, must be Federation-approved based on the surrounding fan design.

    • southrncomfortjm
    • 3 years ago

    Purported. I sees it.

    • ronch
    • 3 years ago

    Short PCB, long cooler. It’s almost as though it’s done to spite the RX 480.

      • Chrispy_
      • 3 years ago

      It's an Nvidia trait all the way through 28nm that AMD has only just started copying. The GTX 660, 760, and 960 all had that short-PCB, long-cooler design, so why do you think it's 'spite' for the 1060?

    • djayjp
    • 3 years ago

    Nvidia will obviously make it just a hair more powerful than the 480 and charge a premium for the difference. My guess anyway.

      • Mat3
      • 3 years ago

      And with all the sheep out there, it’ll still sell more than the 480.

        • brucek2
        • 3 years ago

        It’d be interesting to see the data on say how many people have bought exclusively one brand or the other, vs. how many have varied their choice over the years.

        I don’t doubt there are some sheep out there, but I want to believe that on the whole most people are making the choice that makes the overall best sense to them. Price for performance is one important factor, but there are others. Personally, as someone who has bought AMD cpus and gpus in the past, I bought nvidia early this round, because it is (sort of) available now, because I believe it is likely to deliver a good experience with few hassles for me on the games I play at the time I play them, and because the price difference is ultimately small for someone who makes a good living and is making a purchase that will receive hundreds of hours of high engagement use.

          • NovusBogus
          • 3 years ago

          For what it’s worth I’ve bought from a number of OEMs on both sides over the years, but eventually got pissed off with mixed results and a rather inconvenient driver issue and said ‘screw it I’m sticking with EVGA, not worth the hassle.’

          • Voldenuit
          • 3 years ago

          [quote<] It'd be interesting to see the data on say how many people have bought exclusively one brand or the other, vs. how many have varied their choice over the years.[/quote<]

          No matter which team you root for (or whether you switch sides as needed), it's in the consumer's best interest that both vendors are competitive with each other. That way, we get better prices and better performance from both camps. We don't want to see a situation on the GPU front like Intel charging $1,700 for a Broadwell-E CPU because they can.

          To a degree, this has already happened, because AMD was on the back foot with power consumption, driver optimization, and performance over the last couple of generations. The lugubrious early-adopter tax Nvidia charged with its FE markups was the most recent example of this*. May it die a swift and timely death.

          *All that extra moolah, and they didn't even put a good cooler on the card. Shame! Shame! Shame!

      • ImSpartacus
      • 3 years ago

      If it’s GP104-based, then it’ll have to be better than the 480, and yes, it would be costlier as a result.

        • Voldenuit
        • 3 years ago

        I can imagine a situation where Nvidia uses GP104 chips for the 1060 to start, and then, as yields improve and they ramp up GP106 production, "silently" switches the 1060 over to GP106.

        It’s not like this hasn’t happened before, on both sides of the fence. It happens a lot more in OEM parts, but also in the consumer space.

        And at 314mm^2, the GP104 in the 1070/1080 is already significantly smaller than the 398 mm^2 die in the 970/980, so chip harvesting of early poor yields might still be profitable for them.

        This possibility dovetails nicely with nvidia “rushing” the 1060 to counter RX 480.

          • NovusBogus
          • 3 years ago

          The brand formerly known as Kingston called, they said that trying to pull this in a market with a vocal enthusiast vanguard is a really bad idea.

      • Hattig
      • 3 years ago

      If it is 1280 CUDA cores, then they will have a lot of difficulty making it more powerful than the RX 480, especially in DX12 titles. Unless the default guaranteed turbo clock is 2GHz!
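
      For context, a rough peak-FLOPS sketch of that point (the 1280-core and 2 GHz figures are rumor/hypothetical; the RX 480's 2304 stream processors and ~1266 MHz boost are AMD's announced specs):

```python
# Rough peak single-precision throughput: cores * 2 ops per FMA * clock.
# GTX 1060 numbers are rumored/hypothetical; RX 480 numbers are AMD's announced specs.

def tflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz / 1000.0

print(f"Rumored GTX 1060 @ 2.0 GHz: {tflops(1280, 2.0):.2f} TFLOPS")    # ~5.1
print(f"RX 480 @ 1.266 GHz boost:   {tflops(2304, 1.266):.2f} TFLOPS")  # ~5.8
```

      Peak FLOPS obviously isn't the whole story (architecture and sustained clocks matter), but it shows why a roughly 2 GHz clock would be needed just to land in the same ballpark with 1280 cores.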

        • Krogoth
        • 3 years ago

        It doesn’t need to be more *powerful*. It just needs to have similar performance at similar price points.

        Nvidia’s mindshare does the rest.

        • maroon1
        • 3 years ago

        The GTX 1070 has only 50% more shader power and 33% more memory bandwidth at the same clock. The 1060 will not have any trouble matching the RX 480.

        Also, the default clock speed of the GTX 1060 might end up being higher than the GTX 1070's.

    • Redocbew
    • 3 years ago

    I'm a bit surprised the reference cooler isn't smaller than the 1070's. It looks like the PCB is smaller though, so maybe non-reference coolers will be.

      • tipoo
      • 3 years ago

      Maybe it is? The shroud is the same size, but the heatsink inside may be smaller. Looks like there’s no glass cover and it’s just a painted on design in the middle to me, too.

    • DPete27
    • 3 years ago

    [url=http://www.fudzilla.com/news/40991-begun-the-amd-nvidia-wars-have<]Fudzilla[/url<] is saying Nvidia has moved up the GTX1060 launch from August to July to compete with the 480. Should be interesting. This is the price point for the majority of the market after all.

    • PrincipalSkinner
    • 3 years ago

    Charlie over at SA has something to say about Pascal production woes. I don’t know what exactly since he wants money for it. But if there are yield issues and GTX 1060 is a cut down GP104, it does not bode well for availability. Or the price.
    But he has been known to be anti-Nvidia so it is to be taken with a handful of salt.

      • chuckula
      • 3 years ago

      There have been plenty of other reports from actual retailers that the high-end Pascals didn’t have any supply issues. Instead, there were demand issues.

        • PrincipalSkinner
        • 3 years ago

        He has to sell his subscription somehow.

        • DPete27
        • 3 years ago

        No doubt. AMD might be launching Polaris at the pricing “sweet spot” but the 16nm jump was significant and drummed up a lot of excitement. Nvidia no doubt made a killing on their higher margin 1080/1070 cards in the past 2 months, not to mention AMD isn’t bringing any real competition to those cards in the near future either. Nvidia is probably laughing all the way to the bank.

          • slowriot
          • 3 years ago

          Made a killing? You couldn’t even buy the cards until the last week or so. If anything it feels like Nvidia wasted a good portion of their head start.

            • BoilerGamer
            • 3 years ago

            Need to work on your card buying skills(Nowinstock.net/Distill Web Monitor). Just got my 3rd 1080(FTW) in the door today from newegg and ready to send the 2nd one(Gigabyte Xtreme Gaming) packing to Singapore for profit on Ebay. Got my first one(FE) 18 days ago and within a week it was gone on Ebay for profit.

            • anotherengineer
            • 3 years ago

            How much profit are you making per card?

            • slowriot
            • 3 years ago

            Huh? You’re literally taking advantage of the poor availability… therefore highlighting my point that the “killing” Nvidia would have been making was largely negated by a lack of world wide availability.

            Like really… your little enterprise here wouldn’t be possible if the cards were “Go over to Microcenter and pick one up” levels of available and yet that flies past your head.

            • Voldenuit
            • 3 years ago

            Yeah, scalper. F#@$% you.

            • NoOne ButMe
            • 3 years ago

            please go find a way to leave the human race scalper. Thanks.

            • sweatshopking
            • 3 years ago

            Not sure why people are treating you like garbage for being a capitalist. Last time I checked it was the economic system of most of the planet.

      • NoOne ButMe
      • 3 years ago

      Seems doubtful to me. More likely Nvidia just didn't wait for inventory to build up before launching.

      I did ask someone who formerly worked [s<]at[/s<] with TSMC, but I expect any answer they get will be vague at best.

        • chuckula
        • 3 years ago

        If Pascal really was a paper launch then Nvidia would not have slashed the prices on the GTX-980Ti after the GTX-1080 first hit the market.

        There would be no reason to do so, and no, from the early benchmarks a $230 Polaris card that performs like a $250 R9 390 would certainly not have been a valid reason for a price cut.

          • NoOne ButMe
          • 3 years ago

          If I have stated in the past that it was a paper launch, and I likely have, I have to retract that and say I was wrong.

          There has, however, been much less supply than the market demands; a market which Nvidia dominates and should know how much it needs.

          At the end of the day it is probably mostly an issue of TSMC not having enough wafers for Nvidia to purchase, or yields on largish chips still being too low (see the 1070).

      • kilkennycat
      • 3 years ago

      Fyi: Overwhelming demand for the GTX1080 in any form. Have not examined GTX1070 sales in any detail, but probably the same story…

      If you take the effort to drill down on the Newegg site, you will find that virtually all of the GTX 1080 offerings from multiple vendors have verified-customer reviews, with the Founders Edition having the highest count (e.g. the eVGA 1080 Founders Edition has 29 reviews). Even the custom eVGA 1080 FTW Edition already has 2 customer reviews, and the eVGA SC Edition has 21, spread over 6/9 through 6/26.

      I personally acquired a GTX1080 Founder’s directly from eVGA on 3 June to effectively replace dual 780 in SLI. No regrets about that purchase………..

    • SuperSpy
    • 3 years ago

    Boo, no shot of the port cluster.

    This card has my name written all over it. Let's hope nVidia doesn't cut it down too far. I'd love one of these things if it came with an uncrippled memory interface and capacity.

      • DreadCthulhu
      • 3 years ago

      Well, I see a DVI port, and my eldritch powers tell me that it will also have at least one HDMI port & at least one Displayport. 😉

      • NTMBK
      • 3 years ago

      [quote<]This card has my name written all over it[/quote<] Your parents had a cruel sense of humour, Mr GTX

    • BurntMyBacon
    • 3 years ago

    I approve of this use of the word purported.
    [quote<] A poster on the r/nvidia subreddit apparently got in touch with someone in possession of a reference "GeForce GTX 1060" card recently, and they were able to grab a high-resolution snap of the purported card.[/quote<] A poster just happened to get in touch with "someone" in possession of a reference GTX 1060 who was willing to give them hi-res photos one day ahead of RX 480 availability. As usual, Well Played nVidia, Well Played.

      • tipoo
      • 3 years ago

      Nvidia plays *hard*. I always swing between admiration and resentment.

      • chuckula
      • 3 years ago

      Eh. Par for the course, and frankly there have already been plenty of rumors about the GTX-1060.

      AMD actively encouraged thousands of people on a mailing list to disrupt an Nvidia event, that’s dirty:
      [url<]http://www.pcgameshardware.de/Grafikkarten-Grafikkarte-97980/News/Game24-AMD-Fans-Team-Red-Nvidia-Veranstaltung-infiltrieren-1135966/[/url<]

        • BurntMyBacon
        • 3 years ago

        There have been plenty of rumors, yes, there always are. How many pictures have we seen to date? The timing is quite convenient, as well.

        However, you seem to have misunderstood. I'm not berating nVidia here. It's a good tactic to draw people's attention away from a competing product and put a little uncertainty in their decision. I'm just pointing out that it is both obvious and not unexpected. "If it ain't broke…"

    • tipoo
    • 3 years ago

    If it's around $200 and anywhere near the RX 480's performance, I think SMS and SMP will be big game changers for VR, taking only a small performance hit for drawing the scene twice. I'm surprised there was still nothing like that mentioned for Polaris.

    Then again, I guess it’s also unsure how many people buying 700 dollar VR kits will be pairing them to 200 dollar video cards. There’s always future cheaper VR kits though.

      • TheRazorsEdge
      • 3 years ago

      The VR market is a niche, and it will be for a while. After the prices drop and hopefully some standards emerge, it will become mainstream.

      When that happens, it would be convenient if your graphics card supported VR. But there’s a question of degree—does it barely run with minimal settings, or does it run fluidly?

      I think AMD is painting themselves into a corner. Without a competitive alternative to SMP, AMD’s chips will be relegated to second-rate VR in the long run. That can hurt their perception a lot since they are specifically marketing the card for its VR capabilities.

      But if they don’t mention VR at all, they will probably lose heavily to nVidia since most people like having the capability even if they don’t plan to use it right away.

      • ImSpartacus
      • 3 years ago

      I think you’re on the money with future kits.

      It's all about building a foundation of users who can utilize VR. Also, people are suckers for the idea of "future proofing".

      In a year or two, we'll see VR "2.0" get announced and marketed to the now-larger install base.

      • Krogoth
      • 3 years ago

      VR doesn't matter for the $199-$299 demographic, since VR headsets start at $499 or more.

      These cards are aimed at people who still game at 1080p, a.k.a. the masses, and who still want to pump out a high framerate.

        • chuckula
        • 3 years ago

        This can’t be emphasized enough. Raj got up and went on and on about VR (never demoed it though) but he never said one thing about what AMD is doing to make proper VR headsets affordable. Selling a $230 video card today in hopes that by 2019 VR headsets will be ubiquitous doesn’t sound like a winning proposition.

          • Voldenuit
          • 3 years ago

          He’s selling people on the upsell. Anyone who buys a $200 GPU now thinking that VR headset prices will go down in a couple years’ time will probably be wanting a new GPU in 2 years’ time anyway.

          You can’t blame him. It’s not like they have Vega ready to sell right now, so they have to talk up the 480, even if one of the major talking points doesn’t really make any economic sense to the prospective buyers.

            • Redocbew
            • 3 years ago

            Probably true. With VR being a buzzword at the moment he’d probably hear about it if he didn’t talk about VR. Using buzzwords for the sake of them being buzzwords always makes me nervous though. There should be enough to talk about without them, no?

    • torquer
    • 3 years ago

    Make one that doesn’t need aux power and is passively cooled with 50% or more of the 1070’s performance and it’d be a winner for me

      • ImSpartacus
      • 3 years ago

      That’d be a 1050. The 1060 will almost certainly need more than 100 watts (just like the 480).

        • DPete27
        • 3 years ago

        Since they just recently released the aux power-less GTX950, it’s doubtful that we’ll see a 1050 before the end of 2016. Nvidia is likely reserving precious fab “space” for the higher performance cards.

          • ImSpartacus
          • 3 years ago

          Yes, I think Nvidia is positioning themselves to survive without a gp106/gp107 for at least another couple months.

      • torquer
      • 3 years ago

      I didn’t say it was likely to happen, just that it’d be sweet. I have my Asus Strix 1080 arriving tomorrow, but I have lots of friends who game on a budget who’d love a high po upgrade from their 750 Tis

    • DreadCthulhu
    • 3 years ago

    It will be interesting to see how much Nvidia cuts down its 1080/1070 GPU to make this chip. Too much of a cut, and they will be wasting a lot of good dies that could have sold as 1070s. Too little, and they will kill the 1070 market. Or maybe they have a smaller die in the works, and the 1060 will be the beefiest card based on that chip. Or maybe they will end up rebadging the 970 or 980 for this; wouldn't be the first time that sort of thing has happened.

      • Magic Hate Ball
      • 3 years ago

      I'd expect a cut down to 1280 CUDA cores, with a 192-bit memory interface supporting 6GB of RAM.

      That way they can sell half-bad 1080 chips and still make money.

        • ImSpartacus
        • 3 years ago

        I don't see them doing that unless it's cheaper than the 8GB 480 ($230ish), if only because Nvidia has historically shown that they are very conscious about how graphics cards get marketed by VRAM capacity (e.g. 550 Ti, 660, 660 Ti, etc).

      • mczak
      • 3 years ago

      The rumours are saying this uses a different chip (GP106), with half the units of GP104 (2 GPCs, 1280 CUDA cores), except the memory interface (and with that, the ROPs), which is only cut down to 3/4 (so 192-bit, and most likely no GDDR5X).

      • tipoo
      • 3 years ago

      That’s probably why they’re waterfalling it this way, in addition to higher margins. With the 1080 and 1070 shipping, even if they’re hard to get, they’ll be racking up those partially working dies that didn’t make either cut.
