Nvidia brings its GTX 1080, GTX 1070, and GTX 1060 to laptops

After setting a blistering pace of Pascal releases for the desktop, Nvidia is turning its attention to the mobile market. The company is bringing its GeForce GTX 1060, GTX 1070, and GTX 1080 graphics chips to notebooks today. This time around, there's no M attached to any of these parts. Much like we saw with the mobile GTX 980 and its fully-enabled GM204 GPU, each mobile Pascal chip is provisioned similarly to its desktop counterpart.

The chips aren't identical to their desktop siblings, though. Taking Pascal mobile requires minor clock-speed reductions on the GTX 1070 and GTX 1060. In the GTX 1070's case, Nvidia has made up for the slower speeds by enabling more cores and texture units: according to AnandTech, the mobile GTX 1070 has 2048 SPs and 128 TMUs, as opposed to 1920 and 120 on the desktop card.

Here's a partial spec sheet for each mobile Pascal chip:

GPU       | Shader processors | Base clock | Boost clock | Memory
GTX 1080  | 2560              | 1556 MHz   | 1733 MHz    | 8GB GDDR5X
GTX 1070  | 2048              | 1442 MHz   | 1645 MHz    | 8GB GDDR5
GTX 1060  | 1280              | 1404 MHz   | 1670 MHz    | 6GB GDDR5
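
For a rough sense of how the mobile GTX 1070's extra shaders offset its lower clocks, multiply shader count by clock speed (times two FLOPS per fused multiply-add). A back-of-the-envelope sketch using Nvidia's rated boost clocks, not measured performance:

[code]
# FP32 throughput ~= shader processors * 2 ops/clock (FMA) * clock speed
def tflops(shaders: int, boost_mhz: int) -> float:
    return shaders * 2 * boost_mhz * 1e6 / 1e12

print(f"Desktop GTX 1070: {tflops(1920, 1683):.2f} TFLOPS")  # ~6.46
print(f"Mobile GTX 1070:  {tflops(2048, 1645):.2f} TFLOPS")  # ~6.74
[/code]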

The arrival of Pascal in mobile gaming machines means that practically every laptop with one of these cards is ready to power Oculus' Rift and HTC's Vive headsets to varying degrees. Those chips will also be able to drive new mobile G-Sync displays with refresh rates as fast as 120 Hz, up from the 75-Hz maximum available in the first generation of G-Sync gaming notebooks. Nvidia says its notebook partners will offer mobile G-Sync displays in 1920×1080, 2560×1440, and 3840×2160 flavors.

Nvidia has enlisted a wide range of partners to make Pascal-powered gaming notebooks ranging from mild to wild, and those machines are available now. Newegg already has Asus, MSI, and Gigabyte notebooks with Pascal chips inside. Going by Newegg's stock, gamers can expect to pay $1500 and up for a handsomely-equipped GTX 1060 machine, or about $2000 and up for a GTX 1070-powered mobile monster. Even if those prices are stiff, these notebooks look like they'll deliver a quantum leap in mobile graphics performance. We'll have to try to get our hands on one of these laptops soon to see how they perform.

Comments closed
    • Wonders
    • 3 years ago

    Say whaaaaaaaaaat?
    Let the record state that I didn’t see this one coming. As a VR developer, this is going to have some highly practical applications I assumed were still further off.

    • phileasfogg
    • 3 years ago

    Is it any wonder that they announced this lineup on the first day of IDF?
    JHH will never tire of poking Intel in the eye at every possible opportunity, or so it seems.

    • maroon1
    • 3 years ago

    The mobile GTX 1080 has a slightly lower base clock if you look at the AnandTech article.

    The desktop version runs at 1607 MHz while the mobile version will be 1556 MHz, but the boost clock is identical.

      • sweatshopking
      • 3 years ago

      Yeah. You’ll really have to check specific model benchmarks to see how the cooling holds up.

    • ronch
    • 3 years ago

    Gotta love how the mobile parts take the desktop product line’s naming scheme, shove it in their mouth, chew it up, and spit it out the window.

    • Bauxite
    • 3 years ago

    Hope the CPU refresh is soon; most laptops (even gaming ones) pipe the output through the Intel GPU’s ports, which means downgrading to HDMI 1.4 and DP 1.2.

      • zqw
      • 3 years ago

      No G-Sync or SLI laptops use Optimus (Intel GPU video out). This has been a huge problem for VR dev for years, and has been heavily researched and discussed.

    • TheRazorsEdge
    • 3 years ago

    This is the first time I can ever remember seeing core/clock parity between desktop and mobile.

    I have always shied away from gaming laptops because they were effectively good for a few years at most, and then had to be replaced entirely, since MXM never really took off.

    Gaming laptops might actually have a normal lifespan now, especially with 1080p and 1440p panels.

      • tipoo
      • 3 years ago

      Going to bet dollars to donuts it won’t actually be clock-speed parity. Boost clocks and thermal envelopes, for one.

      • dyrdak
      • 3 years ago

      I also find “gaming laptops” to be an oxymoron. It’s either a gaming cinder-block or a laptop. And MXM was just not enough to make graphics upgradable: the slot kept changing, and the replacement module had to fit the original cooling interface. That said, that didn’t stop me from replacing a failing Nvidia card with an ATI one in an old Acer. Some Dremel action was required, though.

    • derFunkenstein
    • 3 years ago

    That’s quite a bulge in your sack there. Are you packing Pascal or are you just happy to see me?

    • chuckula
    • 3 years ago

    I’m sure there’s a fancier way to do this under Windows, but under Linux the nvidia-smi utility lets you adjust the power limit for a GPU (within a predetermined range). For example, on my GTX 1060 the limits are 60W minimum and 140W maximum [120W default], and the GTX 1080 does a range of about 90W to 215W [180W default].

    These power limits don’t directly change the clockspeeds in the way over/underclocking does, although they do of course have an indirect effect on real-world clockspeeds, especially the ability to maintain a boost speed for a prolonged period.

    I’m willing to bet that in addition to binning and potentially some undervolting, the firmware on these chips will specify a lower power limit by default than the desktop cards to keep the power consumption sane. That’s of course in addition to potential thermal throttling, which is more dependent upon the cooling solution rather than the underlying silicon.
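
    (A minimal sketch of scripting this from Python, using only nvidia-smi’s documented -q -d POWER and -pl options; the 90W value is just an example inside the GTX 1060 range quoted above:)

    [code]
    import subprocess

    def query_power_limits(gpu: int = 0) -> str:
        # Report current/default/min/max power limits for the given GPU.
        return subprocess.check_output(
            ["nvidia-smi", "-i", str(gpu), "-q", "-d", "POWER"], text=True)

    def set_power_limit(watts: int, gpu: int = 0) -> None:
        # Requires root; nvidia-smi rejects values outside the VBIOS range.
        subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)],
                       check=True)

    print(query_power_limits(0))
    set_power_limit(90)  # e.g. cap a GTX 1060 between its 60W floor and 140W ceiling
    [/code]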

      • xeridea
      • 3 years ago

      I can save ~25% of GPU-only power draw by undervolting my 470 (not sure what the lowest stable voltage is yet), which has a high ASIC quality of 88.5%. It is unlikely they can take a 180W card, though, and have any hope of it actually boosting while being cooled in a laptop.

    • kamikaziechameleon
    • 3 years ago

    The Newegg link isn’t showing any laptops.

    I’m really curious to see if and when these make the leap into a MacBook or a Razer laptop. Those form factors are amazing, and if they can utilize the potential in these GPUs, it would be great news for mobile power users everywhere.

    Plus there is the reliability of Nvidia drivers… that is always a plus.

      • mnemonick
      • 3 years ago

      It is now; it probably took them a while to get the updated pages up. I was just looking at the MSI 15.6″ with a 6700HQ, 16GB, a 6GB 1060, and a 256GB SSD/1TB HD. [url=http://www.newegg.com/Product/Product.aspx?item=N82E16834154296]Link[/url] It’s 0.69″ thick at 4 lbs. 😀 It’s also $1800. 🙁

      • Neutronbeam
      • 3 years ago

      The links to products seem to be changing… I can only find one 1070 model now. This is what I have so far:
      [url]http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100167748%20601211659[/url]

      • sweatshopking
      • 3 years ago

      Newegg reviews of the desktop 1000-series cards are FULL of people complaining about bad drivers on the new NVidia stuff.

    • ronch
    • 3 years ago

    That’s the trouble when your competition has a more efficient architecture than you do. It’s not such a big problem in the desktop market, but in the mobile market you’re asking to drain the battery faster and make cooling fussier than your competition does. And that’s not a good way to grab design wins unless you’re willing to price cheaper and hurt your bottom line.

    • Chrispy_
    • 3 years ago

    I’m looking forward to GP108 or whatever the baby version is.

    If we can get GTX960 performance (which, let’s face it, is absolutely fine for modern games on 1080p laptop screens) we can have quieter, cooler-running laptops and perhaps (though I’m probably being far too optimistic here) we can finally game for a sensible length of time on battery power.

    Wait, what am I saying, that’s far too sensible and obvious! No, far more likely the vendors will find new and pointless ways to make the gaming laptops thinner than ever with horrendously tiny fans and genital-roasting thermal limits.

      • HERETIC
      • 3 years ago

      Yup, something like half a 1060.
      Should perform just under a desktop 960 at around 50 to 60 watts.
      Perfect compromise for a lappy…

        • Chrispy_
        • 3 years ago

        Well, with a bit of luck Nvidia can cherry-pick the better-yielding chips for mobile and dial back the clockspeeds a bit to hit the 32W target of the 940M and the 45W envelope that the old 650M/740M/745M/850M/860M occupied.

        There were some nice 13.3″, 14″, and 15.6″ laptops with those GPUs that weren’t heavy and bulky. They weren’t silly-thin like Ultrabooks that have no room for even normal-sized ports, but they were the sort of thing you didn’t mind carrying around a lot. There’s no way I’d carry most modern “gaming laptops” around, since most of them are at least 3 kg (7 lbs or so), and that doesn’t include their vast power bricks, which are often heavier than a typical ultrabook!

          • HERETIC
          • 3 years ago

          NVGreedy will most certainly bin the dies that’ll run at a lower voltage, then charge
          twice as much for them. Probably around 45 watts power usage.

    • NTMBK
    • 3 years ago

    I dislike the naming scheme. They’re parts with different TDPs, and hence probably very different base clocks. They’re not going to perform the same; they’re going to throttle more and generally run at lower clocks. Which is fine: that’s the best way to make an efficient mobile GPU! If only there were some way to make it obvious in the name. Slap a letter on the end or something.

      • kuraegomon
      • 3 years ago

      I hear you, but I also think NVidia has a point here. We’ve never seen equal or higher core counts in mobile parts before, and we tolerate clock-speed variances in factory-overclocked vs. reference cards without requiring renaming. We also really don’t know what the real-world performance of these chips looks like yet, or the actual base clocks. I think it’s appropriate to reserve judgement until we see benchmarks.

      If they hit reasonably comparable frame rates (and frame times!), then NVidia will have more than earned the right to re-use the desktop names.

      • ronch
      • 3 years ago

      You dislike the naming scheme? Here, let me give them new product names for you:

      GTX 1080 –> Papa Bear
      GTX 1070 –> Mama Bear
      GTX 1060 –> Baby Bear

      Well, what do you think?

      • Freon
      • 3 years ago

      It’s at least still a step forward. E.g., a 970M is arguably closer to a desktop 960 than a desktop 970 in raw specs, even if you ignore TDP. Certainly in actual performance the 970M is closer to the 960.

      It’ll at least be closer to parity now, even though it’s never going to be equal. And somehow they feel the extra transistors on the larger 1070 and 1080 chips are worth it over just running a 1060 faster, right? TDP versus performance isn’t a straight-line relationship; I imagine even modest drops in voltage and clock lead to large reductions in TDP.
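
      (A first-order way to see why: dynamic power scales roughly with V² × f, so modest drops in both compound quickly. A toy sketch under that textbook approximation, not measured data:)

      [code]
      # Dynamic power: P ~ C * V^2 * f (first-order CMOS approximation)
      def relative_power(v_scale: float, f_scale: float) -> float:
          return v_scale ** 2 * f_scale

      # A hypothetical 10% undervolt plus 10% underclock:
      print(relative_power(0.90, 0.90))  # ~0.73, i.e. roughly 27% less power
      [/code]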

    • xeridea
    • 3 years ago

    Is there a Founders Edition?

      • ronch
      • 3 years ago

      You get today’s Sarcasm Award™. Congrats!!!

        • Chrispy_
        • 3 years ago

        Hmmmm.
        Can’t tell if sarcastic…

        <Philip J Fry.jpg>

        Or if that’s an award I really really want.

    • Meadows
    • 3 years ago

    Hold on. If these are fully enabled Pascal GPUs, how are you going to cool the damn thing?

    A single GTX 1060 has twice the power budget of an entire all-in-one laptop, which is why I’m asking.

    That is unless the “boost clocks” are a daydream from some rainbow paradise and the effective clock speeds will be far below that in a usual situation.

      • xeridea
      • 3 years ago

      Likely they will barely manage base clocks, and boost is just a number they threw out there for someone wanting to make a 20-lb laptop with a ludicrous power brick and a mammoth battery.

        • Leader952
        • 3 years ago

        [quote]Likely they will barely manage base clocks[/quote]

        I see you enjoy getting slapped down:

        [url]http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Pascal-Mobile-GTX-1080-1070-and-1060-Enter-Gaming-Notebooks[/url]

        [quote]Using Unigine Heaven and looping it for more than 10 minutes gets us to a stable clock speed and temperature for both configurations. The ASUS G752VS provides an average clock rate of 1570 MHz on the GTX 1070. That is above the 1442 MHz base clock[/quote]

          • xeridea
          • 3 years ago

          I was referring more to the 1080, with a lot more cores and a higher theoretical boost. It is possible; I’m just saying the laptops would be heavy, more of a DTR than a “lap”top. With the battery in that laptop, it may run 30 minutes under load unless it severely downclocks. The power brick (around 200W, guessing, since they don’t say, but the GPU TDP is 115W, then add the rest of the system, plus extra so it might actually charge the battery) would not be small either.
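
          (That 200W guess works out roughly as below; every number besides the 115W GPU figure above is an assumption:)

          [code]
          gpu_tdp  = 115  # W, the GPU figure cited above
          cpu_tdp  = 45   # W, typical mobile quad-core i7 (assumed)
          system   = 20   # W, display, storage, fans, etc. (assumed)
          charging = 20   # W, headroom to charge the battery under load (assumed)
          print(gpu_tdp + cpu_tdp + system + charging)  # 200 W total
          [/code]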

            • Leader952
            • 3 years ago

            Here is a DUAL GTX 1080 gaming notebook:

            GT83VR TITAN SLI (GeForce® GTX 1080 SLI)
            [url]https://us.msi.com/Laptop/GT83VR-TITAN-SLI-6th-Gen-GTX-1080-SLI.html#hero-overview[/url]

            It weighs 10 lbs, not the 20 lbs you originally stated they would weigh:

            [quote]MSI’s GT83VR and GT73VR Titan series laptops are the company’s new premium products. At the top of the line is the GT83VR Titan SLI, which boasts two GeForce GTX 1080 graphics chips running in SLI alongside Intel’s Core i7-6920HQ processor. A mechanical keyboard, an 18.4″ IPS Full HD panel, and 64GB of DDR4 round out the specs for this 10-pound (4.5 kg) machine.[/quote]

            Can we expect more “made up numbers”?

      • Rand
      • 3 years ago

      Their definitely taking the best least leaky chips that can run at the lowest voltages and using them for the notebook variants. The power consumption of the desktop GTX 1060 variant is almost certainly far higher than that of the mobile GTX 1080. Their just hand-picking the very best dies for mobile. On the desktop it’s less important if the chips need to run at high voltages to maintain rated clockspeeds, or are very leaky, as performance is largely all that most people care about; few are worried if the chip consumes a little more power in a large case where you can easily remove the excess heat, and there are no worries about battery life.

        • Chrispy_
        • 3 years ago

        Leaky chips are not power-efficient at all, but they overclock well when given adequate cooling on large watercooled rigs. That’s their advantage and why they’re used there.

          • Meadows
          • 3 years ago

          He said “least leaky”.

            • Chrispy_
            • 3 years ago

            I dyslexia’d all over the “least leaky” and skipped the word least.

            A bit like when you put
            put two words in but with a
            a line break between them
            and then you fail to spot the
            the extra word somehow.

            • xeridea
            • 3 years ago

            I did the same thing, LEAst and LEAky look really similar, and I just woke up.

            • dyrdak
            • 3 years ago

            I bet it’s because of “best least leaky”. The mind will take a shortcut through this linguistic obstacle.

        • Narishma
        • 3 years ago

        It’s “they’re”, not “their”.

          • ronch
          • 3 years ago

          In Britain it’s ‘thea’.

      • flip-mode
      • 3 years ago

      With a fan.

    • christos_thski
    • 3 years ago

    Someone correct me if I’m wrong, but have laptop GPUs ever, historically, been as close, performance-wise, to their desktop counterparts as these seem to promise?

      • brucethemoose
      • 3 years ago

      Laptop GPUs have [i]always[/i] been desktop GPUs, but historically they’ve been branded differently. A 6970M is actually a 6870, and so on. Clocks vary, but sometimes they’re close. How much the branding and clockspeed difference differentiates the GPUs depends on your point of view.

        • Rand
        • 3 years ago

        They’re generally much more clearly differentiated from their desktop brethren, though: fewer cores, lower clockspeeds. These chips will perform near-identically to the desktop variants. That is definitely unique.

          • synthtel2
          • 3 years ago

          Yup, usually the clocks have to be dropped quite a lot more than this.

          • EndlessWaves
          • 3 years ago

          It’s never a good idea to use marketing department stuff as your reference point.

          nVidia’s laptop chips have been available with the same core count and similar clockspeeds to desktop chips for at least the last few generations:
          GM204 was 2048/1116MHz on desktop, 2048/1064MHz on laptop
          GK104 was 1536/1046MHz on desktop, 1536/993MHz on laptop
          GF104 was 336/650MHz on desktop, [b]384[/b]/575MHz on laptop

          All that’s changed is that nVidia has stopped rebranding them as a step further up; there’s (hopefully) no more calling the mobile version a 960M instead of a 750 Ti. That’s good, but if they’re anything like their desktop counterparts, power usage will increase slightly rather than decrease, which is always bad news for gaming laptops, as it means they’re louder and hotter.

      • Hattig
      • 3 years ago

      No, nor have they had TDPs so close to the desktop products either.

      These are not mobile GPUs. They are GPUs that live in a laptop that you can use in a non-laptop situation (power cord plugged in, on desk).

    • sweatshopking
    • 3 years ago

    Also not the only difference: a different number of cores, clocked lower vs. the desktop parts.

      • Jeff Kampman
      • 3 years ago

      Whoops! Changed my take to account for that.

    • sweatshopking
    • 3 years ago

    FIRST mobile GPU launch in a while worth getting excited about.
