Nvidia chipsets could power next-gen MacBooks

The latest rumor among the Mac crowd is that Apple won’t use Intel chipsets in its next-gen MacBooks. If true, that means Apple will need to either pick a suitable third-party alternative to Centrino 2 or design one itself.

Ryan Shrout over at PC Perspective has written a speculative piece on the subject, and he thinks Apple will likely adopt Nvidia’s upcoming MCP79 chipset. Why? First, he believes Apple doesn’t have the time or resources to build a mobile chipset with quality integrated graphics on its own. Apple could get a capable IGP solution from AMD, but that would entail using AMD processors—and Intel has a very clear advantage in the mobile CPU market.

That pretty much leaves Nvidia as the only likely contender. Shrout thinks Apple has every reason to choose the MCP79, too. The chipset should have a capable DirectX 10-class graphics processor, and its rumored support for HybridPower and GeForce Boost technologies could make it an excellent complement to the discrete Nvidia GPUs Apple already uses. The rumor mill also suggests the MCP79 will feature a space-saving single-chip design, support for 1066MHz front-side bus speeds, and compatibility with DDR3 RAM.

We may find out for sure by the end of September. That’s when Apple plans to unveil its latest round of MacBooks, insiders claim.

Comments closed
    • PRIME1
    • 13 years ago

    Well, they want to stick with Intel CPUs and have a good GPU, so NVIDIA was probably the only choice.

    • Silus
    • 13 years ago

    The speculation in the article makes sense, simply because Apple is committed to Intel right now. Choosing AMD would mean changing the whole platform to AMD-based products, and with Intel's pressure, NVIDIA is their best bet right now. Also, in terms of IGPs, even if the 780G has the edge overall, the gap between it and NVIDIA's 8300 IGP isn't big at all. It's win some, lose some.

    • StashTheVampede
    • 13 years ago

    One other reason why going to Nvidia for chipsets would work: with 10.6 coming around, they want a GPU powerful enough to run things like OpenCL. Intel's current on-board GPU cannot do the "general purpose" work that Nvidia/ATI can.
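
    For context, here is a minimal sketch of what "general purpose" GPU work looks like from the host side with OpenCL: a program first asks the runtime whether a GPU device is available before handing it compute kernels. This is only an illustration, assuming an OpenCL 1.x runtime and headers are installed; it is not taken from the comment or the article.

        /* Illustrative only: check whether a GPU is exposed for general-purpose
         * compute via OpenCL. Assumes an OpenCL 1.x runtime is installed.
         * On Mac OS X the header is <OpenCL/opencl.h>; elsewhere it is <CL/cl.h>. */
        #include <stdio.h>
        #include <CL/cl.h>

        int main(void)
        {
            cl_platform_id platform;
            cl_device_id   device;
            char           name[256];

            /* Grab the first OpenCL platform and ask it for a GPU device. */
            if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
                clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
                printf("No OpenCL-capable GPU found; compute work stays on the CPU.\n");
                return 1;
            }

            /* Report which GPU would run OpenCL kernels. */
            clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("GPU available for OpenCL kernels: %s\n", name);
            return 0;
        }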

    • sparkman
    • 13 years ago

    > and Intel has a very clear advantage in the mobile CPU market.

    Huh? Tech Report has run multiple stories about how AMD's Puma is kicking Centrino's ass everywhere (in the mobile market).

    > he thinks Apple will likely adopt Nvidia's upcoming MCP79 chipset

    Puma is likely to be superior to nVidia too, correct? This article makes no sense.

      • UberGerbil
      • 13 years ago

      Huh? TechReport has run multiple articles about Puma (https://techreport.com/discussions.x/14644), but TR has also run multiple articles noting that the scheduled June introduction of Puma has come and gone, and that Puma "has hardly swamped retail shelves" (https://techreport.com/articles.x/15131). Have you actually been in a store or shopped online recently? How many Puma-based machines have you seen? They're trickling out now, finally, but they're certainly a long way from "kicking Centrino's ass" in the market. (And I wouldn't hold my breath that a 65nm K8-derived design will be able to "kick the ass" of a 45nm Core 2-based Montevina system in raw CPU performance either, though it may do well in perf/watt.)

        • sparkman
        • 13 years ago

        I see. My mistake.

    • UberGerbil
    • 13 years ago

    I wonder if Apple is looking at nVidia’s Tegra for a future iteration of Apple TV.

    • DrDillyBar
    • 13 years ago

    Apple: Now with Hybrid-boost-PhysX-POWER!

      • Saribro
      • 13 years ago

      Isn’t that incredibly amazing?…
      😀

    • StashTheVampede
    • 13 years ago

    Adding another chipset to their line of supported devices is a GOOD thing. Apple is continuing to use the market the way it should be used: the better product gets their $.

      • srg86
      • 13 years ago

      That is true. The likes of Apple continuing to use the market means more competition, so we all benefit.

    • bowman
    • 13 years ago

    I highly doubt Apple would take a chance on Nvidia chipsets. Why would they risk it? If the chipsets acted up the way current Nvidia chipsets do, their brand would be buried. Useless.

    I think there’s a much higher chance they’d jump on a new AMD/ATI chipset solution with an IGP based on the RV770.

      • eitje
      • 13 years ago

      maybe they started planning the new systems more than one month ago, before the news broke about the chipset issues w/ nvidia.

      • axeman
      • 13 years ago

      Nvidia’s IGP chipsets have never really been troubled by the heat, power consumption, and stability issues that some of their other chipsets have. The modest expansion needs of the iMac and MacBooks, coupled with the fact that their IGP whips Intel’s, make this a pretty logical choice. Maybe Apple is planning some serious eye candy for OS X in the future, and Intel’s integrated garbage just isn’t going to cut it. Not to mention that the Nvidia chipset will probably save money.

    • torax
    • 13 years ago

    Anyone know why it seems like Centrino 2 is being ruled out?

      • Stijn
      • 13 years ago

      Maybe Centrino 2 is too expensive?

      • adisor19
      • 13 years ago

      Apple has never used WiFi chipsets from Intel and won’t begin to. They like to have their own brand shine, not look like just another generic laptop manufacturer.

      Adi

        • UberGerbil
        • 13 years ago

        Many of the generic laptop manufacturers didn’t use Intel’s WiFi offering either, so in that respect Apple is following the herd, not departing from it. Most of the larger OEMs put the Intel card into the default configurations for some notebooks (allowing them to collect the “Centrino” branding and co-marketing dollars) but then offer a cheap/same-cost alternative WiFi implementation in the BTO configurations. Intel’s WiFi implementation doesn’t suck as much as it did in the early Centrino years, but there are still reasons to go with an alternative.

          • A_Pickle
          • 13 years ago

          What are some of those reasons? I’m not being, or trying to be, a smart-ass here; I’m just relatively uneducated in this arena and would like to make a wiser choice of wireless card in the future. What are some of the flaws of Intel-based WiFi?

            • UberGerbil
            • 13 years ago

            Intel never adopted .11a and they were slow with .11g and .11n. Early on they had issues with range / signal strength. And while the drivers were always fine, their client software caused conflicts with the built-in XP client software.

            But they’ve been fixing these issues, and to be honest I don’t know the state of the WiFi adaptors in the latest iteration of Centrino — I’ve been using non-Intel WiFi for so long. I think it is pretty interesting that all the BTO OEMs still offer alternatives, though that may be a legacy of the earlier problems, and they’re offering them so that customers can use something that is already on an approved list or consistent with all the other WiFi adapters already in the enterprise or whatever.

    • 5150
    • 13 years ago

    Yeah, I want to see NVIDIA piss off the Macolytes. Hell will be like a five-star resort compared to what those guys will do to you.

      • A_Pickle
      • 13 years ago

      Post of the day. 😀

      • DrDillyBar
      • 13 years ago

      2nd’d

    • Scrotos
    • 13 years ago

    I like how there’s mention that it will be a DX10-capable GPU. Like OS X uses DX10. Yeah, yeah, I know, it’s just sayin’ that the GPU will be “good”; I just found it amusing that Apple and DX10 were mentioned together.

    p.s. don’t mention Parallels or Bootcamp, ok?!? I just found it amusing, is all!

      • Forge
      • 13 years ago

      DX8/DX9/DX10/DX10.1 are increasingly being used to refer to a DX level’s feature set, not to the DX level itself.

      F’r’instance, my GeForce 8800GT is a ‘DX10 card’. Since I’m running XP64, it’s really not, but that’s how the verbiage is being (mis)used.

    • Hattig
    • 13 years ago

    If this is true (and it is the obvious solution, should it be true), then let’s hope that NVIDIA has created a cooler-running solution than some of its other chipsets.

    Let’s hope that Apple uses this in addition to dropping MacBook prices to more competitive levels. I wouldn’t mind an $899 MacBook. Also, Apple could use it in the Mac Mini, which is long overdue for a replacement (and waiting for NVIDIA MacBooks could explain why there isn’t a new Mac Mini yet).

    • Forge
    • 13 years ago

    Fire? Wut? Burning laptops no sell good.

    (Frost Pist)

      • wingless
      • 13 years ago

      Nvidia has too much at risk to sell bad chipsets again. You can be damn sure their QC will make sure every mm of silicon that goes out is near perfect.

        • srg86
        • 13 years ago

        I still don’t particularly trust nVidia.

          • wingless
          • 13 years ago

          I own an HD4870 and wouldn’t really buy an Nvidia discrete GPU, but I have faith in what they would do for Apple’s notebooks, performance-wise at least. I’m sure the battery life will go down a little if Nvidia keeps on making power-hungry chipsets LOL.

          After what has recently happened, I see no way they would let themselves make the SAME mistake again. You all can be certain their silicon will be in top form this time around.

      • kilkennycat
      • 13 years ago

      Your “facts” are wrong. It was the chip-to-substrate bonding material that failed, causing the silicon to fail through localized over-temp. No fires. Try again…

        • Forge
        • 13 years ago

        The humor, it fails you.

        In case you’re too dense or angry to understand *that* as well, I was making a lame joke. A joke is a humorous phrase or story intended to evoke humor. Apparently I failed, but your rather bitter and snide response admirably overlooked the humor and smacked me around with ‘facts’ that I already know, and that I hope are clear to everyone with a clue.

        You = winnar (also humor, this one’s called sarcasm).

        • Silus
        • 13 years ago

        Forget it. There’s really no point. These are probably the same people who keep touting the INQ’s FUD, like that G84 and G86 “article”. I’m starting to think this goes beyond fanboyism and there’s some money involved. People just CAN’T be that biased for free….
