Next Mac OS to include GPGPU programming interface

Nvidia is generating a substantial amount of hype surrounding its CUDA application programming interface, which lets developers tap into GeForce 8 and 9 graphics processors for general-purpose computing tasks. Apple seems interested in the concept, too, but it apparently wants to go about the implementation in a different way.

As the iPhone 3G fervor died down yesterday, the iPod maker announced a few details about its upcoming Mac OS X 10.6 "Snow Leopard" operating system. Included among them was the Open Computing Language, or OpenCL, an API Apple says "makes it possible for developers to efficiently tap the vast gigaflops of computing power currently locked up in the graphics processing unit." The similarities with CUDA don’t end there, because OpenCL is also based on the C programming language.
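
Apple hasn't published a specification yet, but its description suggests compute kernels written in a C dialect with a handful of extra qualifiers for GPU memory spaces and parallel indexing. As a rough illustration only, a data-parallel kernel in that style might look like the sketch below; the __kernel, __global, and get_global_id names are assumptions modeled on existing C-based GPGPU languages, not confirmed OpenCL syntax.

    /* Hypothetical sketch of a C-based compute kernel: a data-parallel
     * vector add. Qualifier and built-in names are assumed, not
     * confirmed OpenCL syntax. */
    __kernel void vector_add(__global const float *a,
                             __global const float *b,
                             __global float *out)
    {
        /* Each work-item handles one element; the runtime launches
         * thousands of them in parallel across the GPU's shader cores. */
        size_t i = get_global_id(0);
        out[i] = a[i] + b[i];
    }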

However, Apple says OpenCL "has been proposed as an open standard," meaning the company may want a single GPGPU API that can drive both AMD and Nvidia graphics processors. CUDA is Nvidia-only right now, while AMD has its own, lower-level API dubbed Close to Metal. Nvidia Chief Scientist David Kirk recently revealed in an interview with Bit-tech.net that Nvidia was open to licensing CUDA to AMD, but that hasn’t happened yet. Kirk explained, "[CUDA]’s not exactly an open standard, but there’s really not very much that is proprietary about it. . . . The pieces of the tools we build are made available to whoever is interested in using them."
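
For a sense of how close the two models already are, here is the same vector add written in Nvidia's CUDA, which likewise extends C with an entry-point qualifier and per-thread index built-ins. The CUDA side is standard, shipping syntax; how closely OpenCL will ultimately mirror it is our assumption.

    /* The same vector add in CUDA C: __global__ marks a GPU entry point,
     * and each thread derives its element index from block/thread IDs. */
    __global__ void vector_add(const float *a, const float *b,
                               float *out, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)  /* guard threads launched past the end of the array */
            out[i] = a[i] + b[i];
    }

    /* Host-side launch: 256 threads per block, enough blocks to cover n.
     * (dev_a, dev_b, and dev_out are hypothetical device pointers.)
     *   vector_add<<<(n + 255) / 256, 256>>>(dev_a, dev_b, dev_out, n);
     */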

Aside from OpenCL, OS X 10.6 will feature support for greater amounts of RAM, Microsoft Exchange, and "unrivaled support for multi-core systems" through a new set of technologies called Grand Central. Snow Leopard will also include QuickTime X, "a streamlined, next-generation platform that advances modern media and Internet standards." The new OS will come out in roughly a year, Apple says.
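
Apple has said nothing about Grand Central's API, but the pitch implies a task-queue model: the system keeps a pool of worker threads sized to the machine's core count, and applications hand it small units of work instead of managing threads themselves. The sketch below illustrates that general model in plain POSIX C; every name in it is hypothetical, and none of it is Apple's actual interface.

    /* Hypothetical task-queue sketch: a fixed worker pool pulls work
     * items off a shared queue, so callers never touch threads directly. */
    #include <pthread.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    typedef struct task {
        void (*fn)(void *);     /* the work to run */
        void *arg;              /* its argument */
        struct task *next;
    } task_t;

    static task_t *queue_head;
    static pthread_mutex_t queue_lock = PTHREAD_MUTEX_INITIALIZER;
    static pthread_cond_t queue_ready = PTHREAD_COND_INITIALIZER;

    /* Applications submit work; the pool decides when and where it runs. */
    static void submit(void (*fn)(void *), void *arg)
    {
        task_t *t = malloc(sizeof *t);
        t->fn = fn;
        t->arg = arg;
        pthread_mutex_lock(&queue_lock);
        t->next = queue_head;   /* simplified LIFO queue for brevity */
        queue_head = t;
        pthread_cond_signal(&queue_ready);
        pthread_mutex_unlock(&queue_lock);
    }

    static void *worker(void *unused)
    {
        (void)unused;
        for (;;) {
            pthread_mutex_lock(&queue_lock);
            while (queue_head == NULL)
                pthread_cond_wait(&queue_ready, &queue_lock);
            task_t *t = queue_head;
            queue_head = t->next;
            pthread_mutex_unlock(&queue_lock);
            t->fn(t->arg);      /* runs on whichever core is free */
            free(t);
        }
        return NULL;
    }

    static void say(void *arg) { printf("task %ld\n", (long)arg); }

    int main(void)
    {
        pthread_t pool[4];      /* one worker per core, hard-coded here */
        for (int i = 0; i < 4; i++)
            pthread_create(&pool[i], NULL, worker, NULL);
        for (long i = 0; i < 8; i++)
            submit(say, (void *)i);
        sleep(1);               /* crude: give the demo tasks time to drain */
        return 0;
    }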

Comments closed
    • Smurfer2
    • 12 years ago

    This all sounds quite exciting! Nice to see Apple trying to tap GPU power for programs. However, I’ll buy it when I can just buy the disk, pop it into my PC, and install it without hacking…. In the meantime, WinXP and Ubuntu will suit me fine.

    P.S. I realize that will not happen with the current Apple business model…

      • Scrotos
      • 12 years ago

      Amen to that! You can pick up a refurb Dell Inspiron 530 quad-core (Q6600) system for $500. That system can take up to 8 GB of RAM. Throw in a decent video card and that’d be great as an OS X workstation. Ah, dreams…

    • Usacomp2k3
    • 12 years ago

    Why are they trying to reinvent the wheel? Because they are cool?

      • End User
      • 12 years ago

      Are you against progress?

    • bozzunter
    • 12 years ago

    Uhm, apart from answering post #1 (just check http://www.apple.com and you'll see that the iMac can have a GeForce 8800 GS), I have a general question about using GPUs for such things... Doesn't it use MUCH more power than Intel or AMD processors? I mean, when a graphics card works at full power, it draws about 170-200 W, where AMD/Intel chips use just a fraction of that... Or maybe I'm missing something about all this buzz?

      • rhema83
      • 12 years ago

      The latest crop of graphics cards is actually much more power-efficient. TR ran a comparison earlier. The HD 3870 and 9600 GT are both greener than the 8800 GT at idle and at peak. They also pack tons of computing power across many cores, so they may beat Intel processors in FLOPS per watt, for specific applications.
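
      Back-of-envelope, using spec-sheet peak numbers (figures assumed from vendor data sheets of the era, and theoretical peaks only; real application throughput lands far lower on both sides):

          /* Rough peak-FLOPS-per-watt comparison; the specs below are
           * assumed from vendor data sheets, not measured. */
          #include <stdio.h>

          int main(void)
          {
              /* Radeon HD 3870: 320 ALUs x 775 MHz x 2 ops/clock (MAD) */
              double gpu_gflops = 320 * 0.775 * 2;    /* ~496 GFLOPS */
              double gpu_watts  = 106;                /* board TDP */

              /* Core 2 Quad Q6600: 4 cores x 2.4 GHz x 8 SSE SP ops/clock */
              double cpu_gflops = 4 * 2.4 * 8;        /* ~77 GFLOPS */
              double cpu_watts  = 105;                /* CPU TDP */

              printf("GPU: %.1f GFLOPS/W\n", gpu_gflops / gpu_watts); /* ~4.7 */
              printf("CPU: %.1f GFLOPS/W\n", cpu_gflops / cpu_watts); /* ~0.7 */
              return 0;
          }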

    • ThomasDM
    • 12 years ago

    “Apple knows a lot about CUDA,” Huang said, implying the company might be ready to formally embrace Nvidia’s technology to make it easier to exploit graphics chips inside Macs. Apple’s implementation “won’t be called CUDA, but it will be called something else,” Huang said in an interview here at Nvidia’s headquarters on Wednesday.

    http://news.cnet.com/8301-13579_3-9962117-37.html

    • albundy
    • 12 years ago

    Wonder how Intel feels when GPUs are stealing the CPU's spotlight! lol.

      • Corrado
      • 12 years ago

      Well, since Intel is getting into the GPU business, I’m sure they don’t care as much.

        • albundy
        • 12 years ago

        Yeah, but I’m talking about a real GPU, not some S3 wannabe.

          • Corrado
          • 12 years ago

          Last time Intel made a GPU, it was very competitive. The i740 was 90% as fast as the Voodoos out at the time, and cost half as much. It had a real OpenGL ICD, too.

            • adisor19
            • 12 years ago

            That wasn’t even THEIR design! And that was the last time any graphics “GPU” from Intel was ever remotely competitive. What was the name of the company they bought it from, again?!

            Adi

            • Scrotos
            • 12 years ago

            http://en.wikipedia.org/wiki/Intel740 Real3D. "Starfighter" or something.

            • A_Pickle
            • 12 years ago

            You know… Intel has been designing chips for a really long time. I think it’s highly presumptuous for all of you to assume that they’re somehow gonna tank as some “S3 wannabe” despite the fact that any evidence we have (their past performance designing microprocessors) indicates that they’re not incompetent at the business they’re in.

            I hope they wipe the floor with Nvidia and ATI, and make all of you expert, forum-going, silicon engineers eat your words.

    • StashTheVampede
    • 12 years ago

    So many cores on CPUs and GPUs with yet ANOTHER interface to build on?

    Apple has to be working on getting their core apps to use all cores, since their “consumer” apps barely tap even two or four cores.

    • Grigory
    • 12 years ago

    “Apple seems interested in the concept, too, but it apparently wants to go about the implementation in a different way.”

    Biggest twist of the year. O_O

    • ssidbroadcast
    • 12 years ago

    That’s nice, but how about getting a GUI that’s as pretty as Aero?

      • Corrado
      • 12 years ago

      You can’t be serious….

        • ssidbroadcast
        • 12 years ago

        What? I am. Back when Leopard 10.5.1 came out, we got a taste of having a GUI with nice, if a bit restrained, transparency effects. A loud, annoying minority of people/”journalists” complained, and so now we’ve got a boring white GUI with barely any transparency.

        Aero is perhaps the ONE thing that Vista has over Leopard… that and Texas Hold ‘Em.

          • StashTheVampede
          • 12 years ago

          Link to a video of these Leopard effects that do NOT exist in the shipping product, please. There's plenty of eye candy in OS X (really, there could be more), so I’d like to see what was supposedly cut.

            • Scrotos
            • 12 years ago

            Well, there were complaints about transparency and whatnot when 10.5.0 came out:

            http://arstechnica.com/reviews/os/mac-os-x-10-5.ars/4 (and page 3 has more comments and screenshots)

            10.5.2 added an option to fix two of the major visual complaints: http://arstechnica.com/journals/apple.ars/2008/02/11/10-5-2-whats-new-pussycat-and-what-isnt "Apple finally added the 'Translucent Menu Bar' option to the Desktop and Screen Saver System Preferences pane."

            I don't know... did he think they took away the menu bar translucency completely? Or was there something else that magically changed from 10.5.1 to 10.5.2? Or did 10.5.1 add something so much better than the initial 10.5.0 release? I didn't see anything that stood out, graphically, in the release notes. Anyway, I hope this sheds some light on what he MIGHT be talkin' aboot.

            • ssidbroadcast
            • 12 years ago

            First off, the editor for Ars is Ars-Tarded when it comes to transparency. For some reason, he *really* dislikes the idea of the menubar having even slight transparency.

            Yes, as of 10.5.2 the menubar was given a clickable option of being transparent or not. HOWEVER, as of 10.5.2 […]

            • adisor19
            • 12 years ago

            They haven’t REMOVED transparency effects. They made the level of transparency lower, but it’s still there if you look hard enough. I personally prefer it this way. While it’s not as transparent as it was in 10.5.0, it’s also less annoying and distracting at the same time.

            Adi

            • ssidbroadcast
            • 12 years ago

            Note that I said “virtually” opaque. Meaning that it’s 90% or above opaque. Meaning that no one will notice unless they’re looking REALLY hard for it.

            For example, a prison can be “virtually” impossible to escape. Meaning that every single precaution has been taken, and no manner ever devised or foreseeable could be used to escape this prison, but it’s technically not impossible to escape.

            I’d like to add that even in pre-10.5.2 the transparency of the menus was slight enough that I didn’t even […]

            • Scrotos
            • 12 years ago

            While I agree with you that the user should have a bit more control, say a slider that can adjust the transparency to the user's liking, I have a hard time seeing how essential a feature this is if you didn’t notice it until you had two different versions of 10.5.x sitting next to each other and only THEN appreciated it.

            It was more than just the press that complained, I think. Apple’s got some feedback channels for customers to use and I believe it was more than a few articles that caused Apple to change that.

            As for Vista, I think the transparency was less of an issue as was the whole “I hate all of Vista” sentiment that most people had. Like, the entire house is on fire so why make a note of one of the tables also burning?

            While I agree with you that it’d be nice to have that feature toggleable in Leopard (and you can probably find a UI haxie that will give you that control), I also get a chuckle out of you getting worked up over a feature you initially didn’t notice. 😀

            But hey, I can see where you’re coming from with that.

            • ssidbroadcast
            • 12 years ago

            Shapeshifter is the only haxie that comes close to getting me what I want, and it’s for Tiger only at this point.

          • adisor19
          • 12 years ago

          Oh wow. Leopard does have transparency effects galore! The difference between it and Vista is that it doesn’t kill your productivity.

          Adi

            • A_Pickle
            • 12 years ago

            It doesn’t in Vista either, fanboy.

            • Scrotos
            • 12 years ago

            That reminds me of the time I got Vista and made it look as close as possible to Win2K so the UI wouldn’t get in the way of my work. As it is, the file explorer still vexes me to a degree.

            Oh, wait, that was about a week ago when I installed Vista with the intent to actually use it for the first time! Hey, I even kept UAC enabled (I’m already trained to click OK boxes without reading them, so it doesn’t slow me down)! I’m just happy I can get rid of Aero and those OS X widget ripoffs on the sidebar thing, I forget what they’re called. Bleah.

            And for the record, the latest Mac I have is a B&W G3 running 10.3, so I wouldn’t consider myself much of a Mac fanboy. Fanboy fanboy fanboy. What a worthless term to throw around in a discussion. Try adding something interesting instead of gettin’ yer panties in a bunch.

      • Hattig
      • 12 years ago

      You nearly had me there!

      • BobbinThreadbare
      • 12 years ago

        Compiz > Aero and OS X.

        • thecoldanddarkone
        • 12 years ago

        I hate compiz personally…

          • BobbinThreadbare
          • 12 years ago

          That’s a pretty broad statement given the wide variety of options compiz has.

        • adisor19
        • 12 years ago

        Due to the Exposé-like effect when moving the mouse into specific screen corners, I’d say Compiz > Aero. However, it’s not better than OS X. At least not yet. The way things are going, it looks like Compiz will catch up rather soon, and that’s a good thing.

        Adi

      • adisor19
      • 12 years ago

      That was a joke, right? RIGHT?!

      Adi

        • A_Pickle
        • 12 years ago

        Wow. Are you for real?

    • d0g_p00p
    • 12 years ago

    Now if only Apple could get decent video cards in their machines, I might care.

      • zqw
      • 12 years ago

      Yeah, that’s exactly why this is odd. It must be aimed at cluster computing.
