Nvidia tops GeForce 600M series with Kepler-based GPU

Since March, the fastest members of the GeForce GTX 600M series have been older, 40-nm GPUs based on Nvidia’s Fermi architecture. No longer. Nvidia has announced the GeForce GTX 680M, a new mobile flagship featuring the same GK104 silicon as the desktop-bound GeForce GTX 670, 680, and 690.

The 680M shares its desktop namesake’s 2GB GDDR5 memory capacity, 256-bit memory interface, and 32-pixel-per-clock ROP rate, but it has fewer shader ALUs (1344 instead of 1536), can filter fewer textures per clock (112 instead of 128), and runs its core and memory at lower clocks: 720 and 900MHz, respectively, down from 1006 and 1500MHz.
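
For a rough sense of what those cuts mean on paper, here’s a back-of-the-envelope sketch of theoretical shader throughput and memory bandwidth. The 2-FLOPs-per-ALU-per-clock and 4-transfers-per-clock GDDR5 factors are standard assumptions on our part, not figures from Nvidia’s announcement:

```python
# Rough theoretical throughput comparison of the GTX 680M vs. the desktop
# GTX 680, using the spec figures quoted above.

def shader_gflops(alus, core_mhz):
    """Peak single-precision GFLOPS: 2 FLOPs (one FMA) per ALU per clock."""
    return 2 * alus * core_mhz / 1000

def mem_bandwidth_gbps(bus_bits, mem_clock_mhz):
    """Memory bandwidth in GB/s: GDDR5 moves 4 bits per pin per clock."""
    return (bus_bits / 8) * mem_clock_mhz * 4 / 1000

print(shader_gflops(1344, 720))        # 680M: ~1935 GFLOPS
print(shader_gflops(1536, 1006))       # 680:  ~3090 GFLOPS
print(mem_bandwidth_gbps(256, 900))    # 680M: ~115 GB/s
print(mem_bandwidth_gbps(256, 1500))   # 680:  192 GB/s
```

By this arithmetic, the mobile part lands at roughly 60-63% of the desktop 680’s theoretical shader and memory throughput, consistent with the clock and ALU deficits above.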

The GTX 680M will be available in some real notebooks soon. According to Nvidia, Alienware plans to add the GPU to its jumbo M17x and M18x gaming notebooks “in the next couple of weeks.” With the M18x, you’ll have the option of getting dual GTX 680Ms in an SLI setup, though that particular config might not be available until the end of the month. MSI, AVADirect, Maingear, and Origin all plan to offer the GTX 680M in their laptops, as well.

Comments closed
    • maroon1
    • 7 years ago

    HD7970m is HD7870 with a 17.6% lower clock speed

    GTX 680m is GTX 670 with 27% lower clock speed

    So, I think GTX 680m should be faster

      • DeadOfKnight
      • 7 years ago

      Disregard

      • Airmantharp
      • 7 years ago

      It will be faster, that’s easy. How much faster, and how much that will cost, remains to be seen.

      I went with the GTX 675M because I wanted Nvidia’s mobile drivers and Optimus, which works, unlike AMD’s solution. Also, I’m not paying an extra $300-500 for a 680M when my GTX 675M will still play BF3 at 1080p and high settings.

    • swaaye
    • 7 years ago

    They put Thermi in notebooks too. AMD could surely use Tahiti if they felt like it, considering all of the fancy power management tweaks/controls both companies use now.

      • crabjokeman
      • 7 years ago

      Zombie Enrico Fermi will rise from his grave and drop an atomic weapon on you if you say ‘Thermi’ again.

    • RichardLAnderson
    • 7 years ago

    [url<]http://goo.gl/FGK8b[/url<]

      • CasbahBoy
      • 7 years ago

      Oh look at that, yet another Amazon referral link to a random product, in this case a hard disk drive. Go away.

      Again, people: you can see where a goo.gl link goes without actually giving anyone a clickthrough by adding a + to the end of it.

    • Alchemist07
    • 7 years ago

    Just wondering guys, is there any reason why there was no coverage for the 7970M when it launched?

    And no mention of it here, when it’s the direct competitor? (Though it probably falls between the 675M and 680M.) Is AMD not paying any marketing moneys?

      • Deanjo
      • 7 years ago

      [url<]https://techreport.com/discussions.x/22843[/url<]

      • torquer
      • 7 years ago

      Seriously?

      My only complaint with TR has ever been that they’ve been unable to produce much in the way of GPU reviews on launch days like some of the other larger sites. I’ve NEVER felt they were biased toward any one company.

      You’re implying otherwise and you clearly haven’t been reading the site. See Deanjo’s link below.

      • Airmantharp
      • 7 years ago

      This GPU has no direct competitors- AMD can’t match Nvidia on performance per watt, and this part runs at the highest wattage available to mobile parts.

        • Duck
        • 7 years ago

        Unless you use direct compute maybe.

          • Airmantharp
          • 7 years ago

          Sure, but wouldn’t you want a Fermi-based (GF110) Quadro then? I know AMD upped the ante on their GPGPU stuff, but I also have to wonder how effective what they put into the HD 7970M (an underclocked HD 7870) would be in comparison to something designed for the job.

            • Duck
            • 7 years ago

            Well, Fermi is going to be 40nm and should lose to a 28nm GPU. I expect the 7970 to beat it at performance per watt, although I haven’t looked for any compute-based benchmarks comparing the two cards.

            These high-end GPUs like the 7970/GTX 680 are so unsuitable for going in a laptop. Something like GF110 or its Kepler-based replacement would be even bigger: an obscenely big 500+ mm² part for servers and high-end workstations. I just can’t imagine a mobile part based on it being possible. Even the GTX 680M is a big GPU; I wonder how rare and expensive they are going to be (there must be some major binning going on).

            • Airmantharp
            • 7 years ago

            I thought the GPU in the 680M (GK104) was smaller than the 560 Ti’s GPU, the GF114. And it’s more efficient all-around.

            Really, I’m not sure though; IIRC, the GTX 480M was actually a GF100, cut to bits and underclocked of course. Still, a GF100/GF110 could be made to fit in the 100W envelope, and would outperform just about anything for GPGPU in that envelope, considering that CUDA seems to be so much more efficient than DirectCompute/OpenCL at this time.

            So I guess the real question is, how good is the HD 7970M at DirectCompute/OpenCL?

            • Duck
            • 7 years ago

            GK104 is not too massive and is clearly efficient at gaming, but was not designed to be a compute monster like GF110. Radeon 7000 GCN GPUs all have the same compute abilities as each other, scaled up/down for different performance/price points. The GTX 680 is not like this, and has compromised compute performance in order to excel at gaming.

            GF110 in a 100W envelope?? 100W usually gets something like a 7770 and that’s with a 28nm process compared to a 40nm one!

            CUDA seems popular; I’m not sure about more efficient. You might need to rewrite your code to get the most out of a 7970, though.

            • Airmantharp
            • 7 years ago

            You know, you’re right- given that GCN is fully present in the HD7970m, it should be fairly competent with GPGPU- and right again, that it will only be competent when code is properly written and compiled for that, but that’s nothing new to me either.

            I agree that shoving GF110 into a 100W envelope seems crazy. I’m also sure that it can (and may already) be done, given just how much of a compute monster Fermi is; it also has much higher efficiency when running compute versus game code, which would lead me to believe that a CUDA application on a 100W GF110 might actually be faster than a 100W GCN core running the same application compiled in OpenCL/DirectCompute.

            I’m not rooting against AMD or counting them out here, it’s very nice to see them show GPGPU some love with GCN. It means that GPGPU is coming to the masses!

            • Deanjo
            • 7 years ago

            In all fairness, if you are running any kind of GPGPU code for performance reasons, you are not going to be running it on a laptop. That is just asking for trouble, as laptops are not built or designed for long-haul full loads. It reminds me of people using their old 30GB iPods for backups. Sure, you can use it in a pinch, just don’t expect it to last too long if you try to use it for regular sustained periods.

            • brucethemoose
            • 7 years ago

            The 7870 is the efficient gaming GPU, like the GTX 680, while the 7970 is designed to be AMD’s GPGPU card like GF110.

            I think the 7870 and 680 are both pretty bad at GPGPU.

            • Airmantharp
            • 7 years ago

            While I might agree with you otherwise, the idea that GCN made it unscathed from the 7900s to the 7800s does contradict that view.

    • Firestarter
    • 7 years ago

    No TDP mentioned? Then you cannot compare it to any other mobile GPU yet.

      • Arclight
      • 7 years ago

      Regardless of it, when you have this kind of GPU and you use it for gaming, you have to plug the power cable in... so I think TDP doesn’t matter. What matters is how well the cooling works and whether it needs additional help, like a notebook cooler, to keep the system stable during long periods of gaming.

      But most important would be price. Why on Earth would I pay $2,000 for a laptop with this GPU when I can build a $1,000 desktop that far exceeds it in performance? Given that I don’t require the mobility, that is.

        • jpostel
        • 7 years ago

        [quote<]Regardless of it, when you have this kind of GPUs and you use them for gaming, you have to plug the power cable in......so i think TDP doesn't matter.[/quote<]

        [quote<]Given that i don't require the mobility.[/quote<]

        Methinks your lack of requirement trumps your understanding of those that do, or those that want.

        If the GPU is powerful enough, and the TDP is low enough (i.e. efficient), then one can run games when a power plug is not readily available. I'm sure there are several people that wouldn't mind getting outside for some fresh air and a game of SC2. Besides games, some people have jobs where they have to travel, and GPUs/CPUs matter (graphic designers and the like).

          • Arclight
          • 7 years ago

          [quote<]If the GPU is powerful enough, and the TDP is low enough (i.e. efficient), then one can run games when a power plug is not readily available. I'm sure there are several people that wouldn't mind getting outside for some fresh air and a game of SC2[/quote<]

          Those that do are a minority... what gamer wants to get distracted by the outside world while playing SC2? It does not compute.

          [quote<]Besides games, some people have jobs where they have to travel, and GPUs/CPUs matter (graphic designers and the like).[/quote<]

          Performance always comes at higher power consumption, and when you put a high-performance part in a laptop... you do need to plug it in. Basically, your applications for the GTX 680M are pure fantasy. Such an efficient GPU at this level of performance cannot be built at this moment in time. I frankly can’t wait for the review to see how many minutes it lasts under high load on battery.

        • Firestarter
        • 7 years ago

        A GPU with high TDP needs good cooling, which more often than not is big, heavy and noisy.

          • Arclight
          • 7 years ago

          So far so good, I agree and don’t see any conflict with my opinion...

            • Firestarter
            • 7 years ago

            So you think it doesn’t matter how big, heavy and loud a laptop’s cooling system is? Remember that this is a [i<]mobile[/i<] GPU we're talking about.

            • Arclight
            • 7 years ago

            Nobody wants monstrosities, but they (as in Nvidia/AMD) will do anything to satisfy any potential market. Nvidia saw a potential market in high-performance mobile GPUs and they went for it.

            Performance comes at the price of TDP, but they aren’t the ones selling the laptops, of course. They can go as far as they can while making sure that reasonably sized laptop cooling will be able to dissipate all the heat; well, at least that’s the promise.

      • Silus
      • 7 years ago

      Why ?

      Also, this type of mobile GPU (high-end one) shouldn’t break the 100w mark:

      [url<]http://www.notebookcheck.net/Nvidia-takes-curtains-off-GeForce-GTX-680M.75738.0.html[/url<]

        • Firestarter
        • 7 years ago

        [quote<]Why ?[/quote<]

        Because a laptop with a 30W GPU is going to be significantly smaller and lighter than a laptop with a 100W GPU. So, anyone shopping for one will probably not even consider the other, because it just does not suit their needs. Therefore, such a press release tells us almost nothing if we do not at least have a ball-park TDP figure. Without knowing whether these numbers correspond to 80W, 100W or 120W configurations (or whatever OEMs usually use), it's no use to get all excited about it IMO.

          • Silus
          • 7 years ago

          Er... these types of GPUs only go into high-end laptops that are by themselves already brick-heavy, so I really don’t understand your point. Plus these types of GPUs also are not rated above 100w, so you already have your ballpark, even if you choose to ignore it.

          And it’s hilarious that you believe that the TDP is everything and that the press release is worthless without it! Yeah cause estimated performance, features and that sort of thing mean absolutely nothing. TDP is what matters 🙂

            • Firestarter
            • 7 years ago

            [quote<] Plus these types of GPUs also are not rated above 100w, so you already have your ballpark[/quote<]

            Oh, they aren't? Tell me how I can deduce that this GPU needs less than 100W, or any other number. I say these FPS numbers are worthless without a TDP because without a TDP we're not going to know whether this thing could fit in a bona fide gamer laptop or if we're only going to see it in monstrous desktop replacements.

      • pogsnet1
      • 7 years ago

      TDP is 100W+ (they will report it to be 100W only)

      • crabjokeman
      • 7 years ago

      These will probably be paired with Intel IGPs in a hybrid/Optimus configuration, so you use the Intel IGP when you’re on battery.

      • jpostel
      • 7 years ago

      I am not understanding the downgrade on the TDP comments by Firestarter and dpaus. We are talking about a mobile GPU, so TDP matters.

      Is this a fanboi thing that I am missing?

        • Silus
        • 7 years ago

        And BECAUSE it is a mobile GPU we already know that its TDP won’t exceed 100 watts. That’s the norm.

          • dpaus
          • 7 years ago

          Yeah, but when your IvyBridge/Trinity CPU is only 17W [i<]including[/i<] decent graphics, I think the question is perfectly valid.

            • sweatshopking
            • 7 years ago

            don’t bother talking nvidia with silus. it’s like trying to claim i’m not the cutest TR member. it does not compute.

            • DeadOfKnight
            • 7 years ago

            Those 17W parts don’t have decent graphics for gaming.

            • Airmantharp
            • 7 years ago

            Yup. Also, this GPU is going into existing enclosures that have 100W limits. The Alienware and Clevo chassis have been around for months (at least) and support previous-generation GPUs with that envelope.

            So figuring that out is actually pretty easy.

    • Myrmecophagavir
    • 7 years ago

    “The 680M shares its desktop namesake’s 1344 shader ALUs…”

    Doesn’t the desktop 680 have 1536 ALUs? In fact, 680M is more like a 670 with lower clock speeds. Same ALU count etc.

    • Deanjo
    • 7 years ago

    Should make for some nice graphics options in the new Macs.

      • sweatshopking
      • 7 years ago

      haahahahahahahahh

        • Deanjo
        • 7 years ago

        You think they are going to be powering those retina displays with integrated graphics?

          • Kurotetsu
          • 7 years ago

          They just might. Regardless, do you seriously think Apple is going to stick a 100W mobile GPU into any of their laptops?

            • Deanjo
            • 7 years ago

            Considering Apple does not make “laptops” there is zero chance. (Apple refers to them as portables and the manuals very clearly state that they are not to be used on the lap).

            • sweatshopking
            • 7 years ago

            technically, nobody makes laptops. it’s all notebooks. what’s your point? it’s irrelevant where you use them. they’re not going in.

            • Deanjo
            • 7 years ago

            [url<]http://shop.lenovo.com/ca/en/products/laptops/[/url<]
            [url<]http://www.dell.com/ca/p/laptop-deals?ST=%20dell%20%20laptops%20canada&dgc=ST&cid=3852&lid=4242950[/url<]
            [url<]http://us.toshiba.com/computers/laptops[/url<]

            And on and on and on....

      • cynan
      • 7 years ago

      That was a j/k, right? When was the last time you saw a 2-inch-thick MacBook?

      However, I suppose that maaaaybe this could show up as an upgrade option for an iMac?

        • Deanjo
        • 7 years ago

        Exactly, iMacs use mobile GPUs. On the mobile side they will probably use something like a GTX 650.

          • cynan
          • 7 years ago

          To be fair, it is kind of easy to forget that Apple sells PCs other than MacBooks…

            • Deanjo
            • 7 years ago

            I don’t know about that; there are a ton of them out there, especially in the business sector nowadays. It seems like every office you visit now, the receptionist has an iMac in front of them.

            • cynan
            • 7 years ago

            I was being a bit cheeky on that one. You got me.

    • HisDivineOrder
    • 7 years ago

    Greeeaaaat…

    nVidia, you have crushed all comers in the portable market now with your super high end for the mobility sector.

    Everyone cheers for you. Well, everyone who’s wanting to spend over $1k and not get an ultrabook, right? Now that we’re all done cheering, could you pretty please with sugar on top go ahead and release your Kepler-based 7870 and 7850 competitors? Thxkbai!

      • brute
      • 7 years ago

      Looks like that’s a divine order that will be ignored.

      Heh heh heh

      • cynan
      • 7 years ago

      I think you meant Kthxbai?

    • entropy13
    • 7 years ago

    And I am merely channeling Duck’s spirit in my comments. lol

      • crabjokeman
      • 7 years ago

      Reply fail? If not, context would be helpful…

        • entropy13
        • 7 years ago

        Context: The topic of this post

        That’s why it’s not a reply fail since I made replies in separate comments already.

    • pedro
    • 7 years ago

    They’ve pulled no punches with this one. Over at AnandTech, they compared this with other high-end Nvidia mobile GPUs, and the generational improvement is pretty staggering. It’ll be interesting to see how that paper improvement translates to the real world.

      • DeadOfKnight
      • 7 years ago

      Supposedly the new part will play every game available today at 1080p with maximum in-game settings.

      • Oberon
      • 7 years ago

      Anandtech did no such thing; they just gave you the numbers NVIDIA gave them.

    • DeadOfKnight
    • 7 years ago

    Holy crap…anyone see the performance benchmarks for this thing? It’s a true desktop replacement chip!

    [url<]http://www.geforce.com/whats-new/articles/introducing-the-geforce-gtx-680m-mobile-gpu/[/url<]
    [url<]http://www.anandtech.com/show/5914/nvidia-geforce-gtx-680m-kepler-gk104-goes-mobile[/url<]
    [url<]http://pcper.com/news/Mobile/NVIDIA-Announces-New-GTX-680M-World%E2%80%99s-Fastest-Mobile-GPU[/url<]

    Maybe the binning of these chips is why supply has been so limited, since OEMs come first always.

      • dpaus
      • 7 years ago

      what’s the TDP?

        • sweatshopking
        • 7 years ago

        10000000

      • DeadOfKnight
      • 7 years ago

      That Clevo P150EM is the sexiest laptop I’ve seen out of them in a while.

      [url<]http://www.xoticpc.com/sager-np9150-clevo-p150em-p-4341.html[/url<]

        • entropy13
        • 7 years ago

        But you can’t have them because Kepler is nonexistent.

          • HighTech4US2
          • 7 years ago

          Hey entropy it’s time to pull your head out of your (or is that char-lies) *ss.

          GTX680’s are in stock at newegg:

          [url<]http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&N=-1&isNodeId=1&Description=gtx680&x=18&y=16[/url<]

          and the GTX 670's have always been in stock since release:

          [url<]http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&N=-1&isNodeId=1&Description=gtx670&x=22&y=13[/url<]

            • entropy13
            • 7 years ago

            In this thread, call me Duck. lol

            • Duck
            • 7 years ago

            wat

            • xeridea
            • 7 years ago

            Yeah, a high-end one or two for considerably more than MSRP. Out of 22 models, 2 are in stock, and it’s been what, 2 1/2 months? That’s what you call “being in stock”, let me tell ya. HD 7970s were in stock nearly from day 1, and it’s also a bigger chip. Nvidia just has yield issues out the rear, but won’t admit it.

            • HighTech4US2
            • 7 years ago

            There were three in stock when I posted. Now there are two. The lower cost one has since Sold Out. If you snooze you lose.

            The demand for the HD 7970 is low so yea it is easy to keep it in stock whereas demand for the GTX680 is great thus the low inventory levels.

            And the GTX 680 is in stock at Newegg something that char-lie and entropy deny.

            As for admitting anything it is you (and char-lie) who believe the issue is yields whereas Nvidia has publicly stated it is wafer supply.

      • entropy13
      • 7 years ago

        [quote<]Maybe the binning of these chips is why supply has been so limited, since OEMs come first always.[/quote<]

        No, supply has been limited because of Nvidia.

        • MadManOriginal
        • 7 years ago

        Right, because every good business wants to limit the supply of their product that is sold out in order to sell fewer than possible. derp.

      • Deanjo
      • 7 years ago

      [quote<]Maybe the binning of these chips is why supply has been so limited, since OEMs come first always.[/quote<]

      One OEM in particular who is due to release some massive updates next week.

      • Goty
      • 7 years ago

      Funny; if these had been benchmark numbers supplied by AMD for the 7970M people would be jumping up and down, red in the face, yelling that they won’t believe the numbers until they see independent benchmarks.
