Nvidia bolsters mobile GPU lineup with MX models

Remember the GeForce4 MX? The low-end GPU was pretty forgettable, to be honest, but Nvidia is bringing back the MX moniker for a handful of new mobile GPUs. Instead of being budget offerings, these new parts start at the high end of the mobile GeForce lineup. The fastest of the three is the GeForce GTX 680MX, which is the first mobile GPU to offer Kepler’s full payload of 1536 shader ALUs. The previous top-of-the-line model, the GTX 680M, has only 1344 ALUs.

The two mobile 680s share the same 720MHz core clock, down considerably from Kepler’s 1GHz+ clock speed on the desktop. Nvidia has ramped up the MX’s memory speed by quite a bit, though. While the 680M’s GDDR5 memory runs at 3.6 GT/s, the MX’s RAM clocks in at 5 GT/s. That’s only 1 GT/s short of desktop GTX 680 parts.
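For a rough sense of what those spec bumps amount to, here is a minimal back-of-the-envelope sketch in Python based on the numbers above. The 256-bit memory bus width and the two-FLOPS-per-ALU-per-clock (fused multiply-add) figure are our assumptions, not figures from Nvidia’s announcement.

```python
# Back-of-the-envelope throughput estimates from the quoted specs.
# Assumptions: 2 FLOPS per ALU per clock (FMA) and a 256-bit memory bus.

def shader_gflops(alus, core_mhz, flops_per_clock=2):
    """Theoretical single-precision shader throughput in GFLOPS."""
    return alus * flops_per_clock * core_mhz / 1000.0

def memory_bandwidth_gbs(transfer_rate_gts, bus_width_bits=256):
    """Peak memory bandwidth in GB/s."""
    return transfer_rate_gts * bus_width_bits / 8.0

print(shader_gflops(1536, 720))    # GTX 680MX: ~2212 GFLOPS
print(shader_gflops(1344, 720))    # GTX 680M:  ~1935 GFLOPS
print(memory_bandwidth_gbs(5.0))   # GTX 680MX: 160 GB/s
print(memory_bandwidth_gbs(3.6))   # GTX 680M:  ~115 GB/s
```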

One step down the line, the GeForce GTX 675MX improves upon the 675M. The new chip has 960 ALUs, more than doubling the 384 ALUs in its predecessor. The 675MX’s core clock of 600MHz is 20MHz lower than its predecessor’s, but the memory speed has been increased by 600 MT/s. A similar story plays out with the GeForce GTX 670MX, which has more ALUs and a higher memory speed than the old 670M.

We haven’t seen any new notebooks announced with these MX GPUs, but Nvidia points out that the GeForce GTX 680MX and 675MX will be available as options on Apple’s new 27″ iMac. According to the Apple Store, that system won’t start selling until December. When it arrives, the new 27″ iMac could be the most potent all-in-one gaming system around. The market for all-in-one systems is growing at a much faster rate than the one for traditional desktops, which means we could see a lot more mobile GPUs moonlighting outside of notebooks. Thanks to EXPreview for the tip.

Comments closed
    • lowtide
    • 7 years ago

    The Fermi ALUs are actually twice as fast as the Kepler ones. The 675MX is about 20-30% faster at most, which corresponds to the increase in ALUs and speed.
    http://www.gaminglaptopsjunky.com/first-gtx-675mx-benchmarks-performance-gain-over-gtx-670m/
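    That rough ratio checks out against the raw numbers once you account for Fermi’s hot-clocked shaders. A minimal sketch, assuming the commonly cited 620MHz core / 1240MHz shader clock for the GTX 675M (not stated in the article) and the 600MHz core clock quoted above for the 675MX:

    ```python
    # Theoretical FP32 throughput = ALUs x 2 FLOPS/clock x shader clock (GHz).
    # Fermi (GTX 675M) runs its shaders at twice the core clock ("hot clock");
    # Kepler (GTX 675MX) runs them at the core clock.
    # The 675M's 620MHz core / 1240MHz shader clock is an assumed, commonly
    # cited spec; the 675MX's 600MHz comes from the article above.
    gtx_675m_gflops  = 384 * 2 * 1.240   # ~952 GFLOPS
    gtx_675mx_gflops = 960 * 2 * 0.600   # ~1152 GFLOPS
    print(gtx_675mx_gflops / gtx_675m_gflops)  # ~1.21, in line with the 20-30% figure
    ```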

    • chrissodey
    • 7 years ago

    Maybe Apple could make an add-on for the new mini that looks like the mini but has two GTX 680MX GPUs in SLI. Connect the device to the mini via Thunderbolt. That GPU upgrade would also be compatible with the MacBooks and iMacs.

      • Airmantharp
      • 7 years ago

      There’s not enough bandwidth in TB for that bro… it IS essentially ePCIe, but it’s nowhere close to a PCIe 2.0 x16 slot’s worth of bandwidth.

      Granted, the idea is sound: making external GPU boxes out of MXM modules would probably work well if the bandwidth were there, and hell, stacking them or making dual-module boxes might even be feasible, especially with how potent and efficient Nvidia’s mid-range silicon has become.

      Just don’t expect any of that to come cheap, or soon. Intel won’t be wiring TB up to an x16 slot until it makes sense for a much broader share of the market, and those GTX 680M modules go for over $600 each, if you can find them on the open market.

    • vargis14
    • 7 years ago

    With the high resolution of the 27-inch Apple AIO system, I bet you can game at full resolution with that 680MX in most games. I am not a Mac man, and I don’t even know if BF3 is available for Macs; if it is, I do not think it will handle ultra settings :) But it could handle high settings at native resolution.

      • Airmantharp
      • 7 years ago

      Bootcamp should make it a non-issue for those who care, and Microsoft is practically giving Windows 8 away.

    • crsh1976
    • 7 years ago

    “When it arrives, the new 27″ iMac could be the most potent all-in-one gaming system around.”

    It’s worth noting the 675MX is the stock GPU only on the higher-end 27-inch model, and the 680MX is an upgrade option only for that model as well; it starts at $1999.

    The base 27-inch model ($1799) is stuck with the 660M; no GPU upgrade option is available.

    • Chrispy_
    • 7 years ago

    Oh, and just one more thing:

    "When it arrives, the new 27" iMac could be the most potent all-in-one gaming system around."

    Uh, 'all-in-ones' are to 'gaming' what 'castration' is to 'fertility'. Even entry-level gaming these days demands at least one additional PCI-E power connector and a dual-slot cooling solution. Performance per watt is the limiting factor, and if you try and jam anything into a tiny TDP with limited cooling, the performance is going to nosedive or it is going to sound like a midget hurricane.

      • Washer
      • 7 years ago

      The HD 7750 does not require a PCIe power connector and provides a very enjoyable gaming experience on a budget.

        • Chrispy_
        • 7 years ago

        On an iMac’s four-megapixel display, the 7750 isn’t going to provide an enjoyable gaming experience with modern games.

        Sure, you can turn down the resolution, but then that just proves my point: lower TDP requires major compromises to reach acceptable gaming performance.

      • Deanjo
      • 7 years ago

      "Even entry-level gaming these days demands at least one additional PCI-E power connector"

      On a desktop with conventional slots, that is true because of the maximum current draw the PCI-E connector is specced for. When the GPU is incorporated onto a motherboard, that no longer holds, as the extra current draw can easily be compensated for with thicker traces.

      • internetsandman
      • 7 years ago

      Core i7-3770, GTX 680MX, and 32GB of RAM. This sounds capable of quite a lot more than entry-level gaming; just go get yourself a pair of good noise-isolating headphones and the noise won’t matter.

        • Airmantharp
        • 7 years ago

        I honestly think that if Apple is willing to shove one into an iMac, then the noise (and the TDP it results from) will be more than controllable.

        A 27″ iMac actually starts looking like a good deal here, as painful as those words are to type!

          • internetsandman
          • 7 years ago

          When you look at what you get in terms of the screen size, it does seem like a good deal. Gonna have to wait till we see the full customization options on Apple’s online store, but considering equivalent screens usually cost a thousand bucks, it’s not too bad.

          Other than RAM of course. That’s still insultingly expensive to upgrade

            • MadManOriginal
            • 7 years ago

            That’s one thing (the only thing?) that is easily upgradable in the new iMac.

            http://www.anandtech.com/show/6402/up-close-with-the-new-27inch-imacs-user-serviceable-memory-panel

            Apple is losing their touch, letting users upgrade their machines :p

            • Airmantharp
            • 7 years ago

            Considering that I’d want the entire package as is, and can add memory and storage cheaply, just being able to get that GPU with that panel in Apple-spec is more than worthwhile. Obviously it’d still need a nice mouse and an MX-Brown-based keyboard, as well as a copy of Windows, but there’s a Mac right there that can run BF3 in full glory! That’s cool!

    • Chrispy_
    • 7 years ago

    Whoever decides model names at Nvidia is clearly stark raving mad.

    MX = the suffix for budget, feature-disabled or old generation die-shrinks.
    Ti = the suffix for full-fat, higher-performing, higher-clocked variants.

    Why, in all hell’s name, have they done this? If some idiot is trying to imply that “add an X” is better, then perhaps we’d better remind them by introducing the back of their head to an XFX X1950XT XXX Edition (hugs and xx’es)

      • derFunkenstein
      • 7 years ago

      Well, here I think MX is for Mobile eXtreme (or something else lame). The problem is for people in the 30+ age bracket who actually remember the GeForce 2 MX and GeForce 4 MX.

        • Airmantharp
        • 7 years ago

        The MXes were jokingly labelled ‘Graphics Decelerators’ back then; the 2MX was a broken GF2, and the 4MX was also a broken GF2. It was real sad watching people buy those and expecting them to play games.

          • Prion
          • 7 years ago

          Still better than the original “3D Decelerator,” the S3 ViRGE, where playing games with software rendering was usually the better option.

            • Airmantharp
            • 7 years ago

            That is where the term came from, IIRC. Those things were truly awful.

          • derFunkenstein
          • 7 years ago

          The 2MX was actually heralded as the king of the budget cards at its release. Go back and reread the reviews of the time.

          The 4MX was a disappointment. TR’s reviews bear that out.

      • Airmantharp
      • 7 years ago

      They’ve all been stark raving mad for a long time, brother. It’s the way of things when engineering specifications meet the marketing department.

      • Meadows
      • 7 years ago

      Nvidia dropped the old “MX” years ago; they’re reintroducing it in a different role, as derFunk pointed out.

      And “Ti” is still “full-fat”, it’s still more powerful than the same vanilla model number.

    • basket687
    • 7 years ago

    Adding more confusion to an already very confusing naming scheme…

      • internetsandman
      • 7 years ago

      Actually, it’s not all that bad. 680M < 680 MX < Desktop 680, where M is the mobile denotation and the X is the extreme performance denotation, or something like that

        • Chrispy_
        • 7 years ago

        I thought the X for performance came after GT prefix. So adding another one on the suffix too is justx toox manyx x’esxx
        XXX
        XOMGX, XWHAT’SX XHAPPENNINGX XTOX XME!??X

        XHALP!!XX
        XXXXXXXXX
        XXXXXXXXXXXXXXXXX
        XXXXXXXXXXXXXXXXXXXXXXXXXXXXX………………………

    • Sahrin
    • 7 years ago

    Ah, who else remembers the legendary GeForce 2 rebadge – the GeForce 4 MX?

      • MadManOriginal
      • 7 years ago

      Everyone who read the first sentence of this news post?

      • dragosmp
      • 7 years ago

      Here’s a reason not to rant about the 4MX: it was cheap and as fast as a high-end GeForce2 (GTS? not sure). It took a while until ATI launched a cheaper card (the 9500), so I had many friends gaming on the 4MX with the better IMC. Maybe it didn’t deserve the 4, but it was still good.

    • TurtlePerson2
    • 7 years ago

    This past summer I was at Quakecon, which is the largest LAN party in North America. I saw two Apple computers out of the thousands of computers there. One of them was an iMac. I never saw anyone at the computer; I assume that’s because you lose all computer-gamer credibility when you’re playing on an all-in-one iMac.

      • NeelyCam
      • 7 years ago

      Having “MaCoLyTe” as your screen name is like putting a target on your back and forehead

    • dpaus
    • 7 years ago

    No power figures? Not that I expect to see these used extensively on battery…

      • Washer
      • 7 years ago

        I have not seen a GTX 680M without Optimus. I know the MSI and Alienware 17″ models can get near 4 hours of usable battery life using the Intel graphics.

        • derFunkenstein
        • 7 years ago

        Irrelevant to people who want to play games on the battery.

          • Beelzebubba9
          • 7 years ago

          Well enjoy your 17 minutes of fun. 🙂

          • Washer
          • 7 years ago

          Those people have no clue when it comes to PC hardware, then. Ignorance is a poor excuse for unreasonable expectations. Frankly, I think 4 hours of functional battery life on a 17″ gaming laptop shows how far the hardware has come in terms of scaling.

      • tviceman
      • 7 years ago

      Anyone who buys a laptop with a high-end discrete GPU expecting to get adequate gaming time off the battery is, as Ann Coulter likes to say, a “retard.”

        • crsh1976
        • 7 years ago

        Never seen Ann Coulter referenced in a useful way like that, you made my day.

          • Anvil
          • 7 years ago

          I thought it was going to be a joke about her freak-hands.

      • Airmantharp
      • 7 years ago

      Power figures would be nice, but it’s hard to imagine them breaking out of the power envelopes of current chassis, so figure 100W max, probably around 70W.

      I think the main takeaway here is the all-in-one potential; even at 1440p, a slightly down-clocked full-blown Kepler is more than potent enough for practically anything an AIO user might play, and most games will run at max settings barring unreasonable amounts of MSAA.
