Nvidia to purchase Ageia Technologies

Confirming rumors that appeared last month, Nvidia has announced that it has agreed to purchase Ageia Technologies. Although Ageia’s PhysX physics accelerator cards met with only moderate success on the PC, its physics application programming interface has been used in an array of console and PC titles, including hits like Gears of War, Ghost Recon: Advanced Warfighter, Red Steel, and Unreal Tournament 3. Nvidia says there are 140 games based on PhysX technology and over 10,000 registered users of the PhysX developer toolkit.

What Nvidia plans to do with PhysX is evident, and company CEO Jen-Hsun Huang makes no secret of it in the acquisition press release. “By combining the teams that created the world’s most pervasive GPU and physics engine brands, we can now bring GeForce®-accelerated PhysX to hundreds of millions of gamers around the world,” Huang announces.

Nvidia says it will release more details about the acquisition during a conference call on Wednesday, February 13. It should be interesting to see whether the company plans to support existing PhysX-accelerated titles via its graphics cards, and especially whether it plans to license PhysX tech to AMD. Of course, with Microsoft having shown interest in adding physics processing to the DirectX API bundle, proprietary hardware-accelerated physics APIs could turn out to be a short-lived phenomenon.

Comments closed
    • Bensam123
    • 12 years ago

    I don’t believe this is the death knell for Ageia’s PPUs. Maybe its current revision won’t exist anymore, but we will more than likely see them come back around in the future, after lots of titles become based on the API AND actually make use of it, and PPUs will actually show tangible results because of it.

    • aleckermit
    • 12 years ago

    Agidia? Nvegea?

    😉

      • Meadows
      • 12 years ago

      It was /[

        • Semper1775
        • 12 years ago

        How about NVagia? Say it aloud and I think you’ll get it. Yes, crude, I know…

      • eitje
      • 12 years ago

      Nvidiageia!

    • eitje
    • 12 years ago

    I think they’re buying Ageia for their API. As an interface, they should be able to hook up the known-and-working Ageia functions to the Nvidia hardware. Voila!
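
    (A minimal sketch of that idea in C++, with made-up names rather than anything from the real PhysX SDK: the calls a game already makes stay the same, and only the backend object underneath gets pointed at different hardware.)

      // Hypothetical sketch only; these names are made up and are not the real
      // PhysX SDK. The point: callers keep using the same physics interface
      // while the backend underneath is swapped from a PPU to a GPU.
      #include <cstdio>
      #include <memory>

      struct PhysicsBackend {                    // abstract hardware backend
          virtual ~PhysicsBackend() = default;
          virtual void step(float dt) = 0;       // advance the simulation
      };

      struct PpuBackend : PhysicsBackend {       // old dedicated-card path
          void step(float dt) override { std::printf("PPU step %.4f s\n", dt); }
      };

      struct GpuBackend : PhysicsBackend {       // new GPU-accelerated path
          void step(float dt) override { std::printf("GPU step %.4f s\n", dt); }
      };

      int main() {
          // Same call from the game either way; only the backend object changes.
          std::unique_ptr<PhysicsBackend> physics = std::make_unique<GpuBackend>();
          physics->step(1.0f / 60.0f);
          return 0;
      }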

    • DrDillyBar
    • 12 years ago

    DAAMIT! … no wait, crap. Now I have to think up another one.

      • MrJP
      • 12 years ago

      Nvidgeia?…

      • ludi
      • 12 years ago

      Bad news. I tried feeding “Nvidia Ageia” through the Internet Anagram Server and it kept spitting out things related to ideas and female anatomy.

        • DrDillyBar
        • 12 years ago

        If you use each letter once, I get “gen diva” or “evading” or “and give”…. not very good…. lol

        • eitje
        • 12 years ago

        creative ridicule can sometimes be very difficult!

    • Prospero424
    • 12 years ago

    Perhaps they didn’t buy Ageia to use their tech in games at all, but to integrate their IP into their HPC “GPU Computing” push.

    Just a thought…

    • Unckmania
    • 12 years ago

    Hehe, I remember that when the rumor was that AMD was buying Ageia, everyone said, “Oh my, two bad companies merging, they’re going to hell.” But now that Nvidia is doing it, it’s “Awesome, they’ll be making an awesome product soon enough.” Why couldn’t we see that AMD had such a chance too? Because they are the losers right now.

      • indeego
      • 12 years ago

      I think it comes down to people seeing Nvidia as a well-managed company. Few mistakes, lots of wins; the company is well run from the technical and business side (even marketing). AMD, well… not so much. So any purchase after ATI will be looked over critically for AMD.

        • JustAnEngineer
        • 12 years ago

        NVidia’s marketing are evil geniuses. Technically, they have had some outstanding successes and some brutal failures, but they have remained willing to spend piles of money on development.

          • ludi
          • 12 years ago

          True, but to /[

    • computron9000
    • 12 years ago

    My true prediction March 2006–almost 2 years ago exactly–and I quote Tech Report Forums:

    *[< _[

      • kilkennycat
      • 12 years ago

      Not quite.

      The Ageia PPU is dead. Use the board as a doorstop if you have been unwise enough to buy one.

      Ageia have been very successful on the SOFTWARE side of their business, definitely NOT on the hardware side, where they have struggled with a dead-end PCI implementation and limp game-developer support. Ageia’s software expertise is the reason why nVidia is buying them. nV will incorporate their physics algorithms in libraries for their GPUs and may carry out some minor architectural changes for computational efficiency in future GPUs. There will never be stand-alone dedicated-to-physics silicon (a PPU) from nVidia.

      Do you know that nVidia has a rapidly-growing business in industrial and research-oriented desktop computing, using their GPUs and their CUDA toolset for massively-parallel processing? And nVidia’s next-gen GPUs are highly likely to be true GPGPUs with double-precision data paths, equally capable of either massively-parallel computation or spectacular graphics. Not only will the Ageia software expertise continue to be used (and expanded, thanks to nVidia’s deep pockets) for gaming, but also for industrial applications requiring efficient physics algorithms. The future is very bright for nVidia. And Intel’s Larrabee is only 2, or is that now 3, years away…..
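
      As a rough illustration of the kind of data-parallel physics work that maps onto CUDA, here is a toy sketch (purely illustrative, not Ageia or Nvidia code): one thread per particle advancing a single explicit-Euler time step under gravity, exactly the sort of bulk arithmetic a GPU chews through in parallel.

        // Toy CUDA sketch, purely illustrative -- not Ageia or Nvidia code.
        // One thread advances one particle by a single explicit-Euler step.
        #include <cuda_runtime.h>
        #include <cstdio>

        __global__ void integrate(float3* pos, float3* vel, int n, float dt)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            vel[i].y -= 9.81f * dt;        // gravity pulls velocity down
            pos[i].x += vel[i].x * dt;     // advance position by velocity
            pos[i].y += vel[i].y * dt;
            pos[i].z += vel[i].z * dt;
        }

        int main()
        {
            const int n = 1 << 20;          // a million particles
            const float dt = 1.0f / 60.0f;  // one 60 Hz frame
            float3 *pos, *vel;
            cudaMalloc(&pos, n * sizeof(float3));
            cudaMalloc(&vel, n * sizeof(float3));
            cudaMemset(pos, 0, n * sizeof(float3));
            cudaMemset(vel, 0, n * sizeof(float3));

            const int threads = 256;
            integrate<<<(n + threads - 1) / threads, threads>>>(pos, vel, n, dt);
            cudaDeviceSynchronize();
            printf("stepped %d particles\n", n);

            cudaFree(pos);
            cudaFree(vel);
            return 0;
        }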

      • 0g1
      • 12 years ago

      That red font color is annoying.

    • Prospero424
    • 12 years ago

    Yeah, if they’re gonna get game developers on board as an industry rather than on the onesy-twosey promotional basis that PhysX and other proprietary technologies have relied upon in the past, Nvidia is either going to have to leverage it in a major console first or they’re going to have to license the hardware (if there will even BE hardware) to competitors like AMD and Intel.

    As powerful as Nvidia is, they just won’t be able to get all of the major game developers to support their physics tech if it’s only available on their cards.

    I think acceleration support through a cross-hardware API like DirectX is far more likely. I’m just wondering how Ageia’s tech would give Nvidia an edge if this is how it pans out, unless they go for the whole tiered-licensing scheme like Creative did with hardware positional sound acceleration.

    Creative are on, what, EAX version 6 now? While they will only license EAX 2 for use by competitors. Maybe we’ll see a similar scheme for physics acceleration: PhysX 1.0 for competitors; PhysX 4.0 for Nvidia.

    That would kinda suck.

      • d0g_p00p
      • 12 years ago

      The physics part can also be added to the nVidia “The way it plays” or whatever they call it. I see that logo on game startups more and more recently.

        • Meadows
        • 12 years ago

        *[

      • 0g1
      • 12 years ago

      Well, I think nVidia will release drivers within a few months to turn a GPU (most likely the 8000 series) into a virtual PhysX card. I can’t see nVidia working with ATI on drivers. nVidia knows there are quite a few physics APIs out there for developers to choose from. It’s quite a mess, and more so with Microsoft looking to join. Game developers will just use the API that gives the best results for their customers. Right now there’s a lot of confusion, as the APIs support different hardware.

      One thing is clear: nVidia stands to make a lot of money selling hardware that supports existing and future PhysX titles. I for one am considering a tri-SLI setup already ;). So even if no new PhysX titles are made and CUDA isn’t used for physics, nVidia will still have made money. I think for that reason, AMD won’t get to support PhysX, or else why would nVidia even buy Ageia if they were to just hand it over? Also, PhysX is most likely dead. Same with Havok.

    • FireGryphon
    • 12 years ago

    These GPUs are becoming beasts. Imagine what old school hackers (known today as demo artists) would do if they dedicated themselves to fully exploiting one generation of cards the way they did whole computers back then.

    • StashTheVampede
    • 12 years ago

    I’m not trolling with this remark, but here goes: Nvidia is buying Ageia for the console possibilities (both current and future). Microsoft, Sony, and Nintendo are already looking to build their “next gen” stuff, and physics is going to be a HUGE advancement for realism in games when it’s dead simple to implement.

      • My Johnson
      • 12 years ago

      No, that’s not trolling. More like speculation, but it sounds spot on.

        • Saribro
        • 12 years ago

        Quite, and a physics chip makes much more sense in a console than in a PC, as a developer can now rely on its presence instead of having to provide a multitude of codepaths.

    • lucas1985
    • 12 years ago

    I see this acquisition as geared more towards Tesla than general gaming.
    Remember that Larrabee is just “around the corner”.

    • fpsduck
    • 12 years ago

    OMG! My prophetic prediction finally came true.
    Graphics cards should have PhysX integrated.

    Now let’s watch for AMD/ATI reaction.

    • PRIME1
    • 12 years ago

    Separate physics card = bad
    GPU/CPU physics = good

    Here’s to hoping their stuff just makes it into future NVIDIA GPUs and drivers.

    Although they could add an Ageia chip onto their motherboards. I’m not sure how I feel about that.

      • derFunkenstein
      • 12 years ago

      They should do both, so that the greatest possible number of PCs will have it.

      I know that in the Socket A days there were tons of nForce2 boards with Radeon 9800s, and that stretched into Socket 754/939 with nForce3s and Radeon 9800 XTs.

      If both companies did that, then you could feasibly see machines that could do either physics implementation. That’d be pretty swell.

    • wingless
    • 12 years ago

    OH SHEIST!!!! GPU PHYSICS ON THE WAY!

    I can’t wait to go back to Nvidia after this 2900XT debacle I got myself into. I can’t wait to see what Nvidia will pull off with this purchase. 2009 will be a good gaming year for the PC!

    ATI will have to develop their own GPU physics engine at this rate or depend on M$’s DirectPhysics (which may turn out to be great for them). This will usher in a whole new playing field to compete in. As I see it, Intel actually has the upper hand at the moment, and they’re tough for anyone to beat.

      • scpulp
      • 12 years ago

      GPU physics, even hardware physics in general, is a technological dead end. Given how fast hardware is scaling (especially with multicore processors), this was always a bad solution. If anything, I suspect nVidia will grab the software and let the hardware rot.

      Physics is something that can be handled by a good CPU, at least if Portal has anything to say about it.

        • BenBasson
        • 12 years ago

        … and Crysis, and the new Unreal engine. Okay, so you get better FPS with physics hardware, but it’s not substantially better in any benchmarks I’ve read, apart from titles specifically showcasing it (which run like crap anyway).

        • Anonymous Coward
        • 12 years ago

        Though I am no expert on how they are doing what they do, any sort of simple bulk calculations will always be faster on a bulk calculating processor such as a suitably generalized GPU. Presumably nVidia intends that their future graphics cards will all have the ability to run physics code through a nice API.

          • SPOOFE
          • 12 years ago

          nVidia’s market presence could build up momentum; it’s true a dedicated solution can, potentially, be a superior option, but that’s if and only if software takes advantage of it.

    • LSDX
    • 12 years ago

    Nvidia will probably put some of the multicore knowledge from Ageia’s PhysX hardware into its GPUs. Also, they could easily start selling cards with a GeForce GPU but without video connectors, as pure SLI/PhysX add-on cards.

    Back when they got hold of 3dfx, it was kind of sad, but in the end I think they made really good use of the IP they acquired.

    • delsydsoftware
    • 12 years ago

    I wonder what the per-chip cost of the Ageia PhysX processor is, anyway. I’m guessing that their PCI card has a lot of support hardware on it that jacks up the price. They might be able to piggyback a PhysX processor onto an existing graphics card design with a minimal cost increase, since the support hardware would likely be the same. Plus, with access to nVidia’s fabs, they could dramatically cut costs. It may not be ideal as a standalone card, but it could make a nice co-processor for graphics cards and motherboards. I would certainly buy a motherboard with PhysX support, just as a value-added feature.

      • Forge
      • 12 years ago

      Nvidia doesn’t have fabs.

        • Flying Fox
        • 12 years ago

        That means they are not real men? 😉

    • bdwilcox
    • 12 years ago

    I heard nVidia also just made a bid for Rendition and RRedline as well as S3 and MeTal.

    • PetMiceRnice
    • 12 years ago

    A good deal for Ageia, that’s for sure.

    • Tommyxx516
    • 12 years ago

    Ageia implemented their physics engine through hardware; Havok implements theirs through software. Both have the same result. The consoles have plenty of processing power to run the Havok engine, so yes, those $200 Ageia physics cards are obsolete.

      • toxent
      • 12 years ago

      Actually Ageia has a software API just like Havok. It’s being used right now in all PS3 systems.

        • Tommyxx516
        • 12 years ago

        Yes, I know it’s being implemented through software. I just stated that the hardware was obsolete. You don’t need to buy a PhysX card because it can all be done through software.

          • Anonymous Coward
          • 12 years ago

          You could also write graphics renderers in pure software (except for an interface to dump the image to the screen) but that approach is not about to replace hardware graphics cards.

            • SPOOFE
            • 12 years ago

            But there was also a period of several years in which GPUs weren’t a necessity for playing games; people who had one enjoyed it immensely, but those without could still easily and readily play games (as long as they had a good processor and RAM) without ’em. Then there was that point of going over the hump where it became completely nuts to even think about gaming without a GPU.

            I don’t know if PPUs will follow a similar pattern. Having nVidia behind it certainly gives the concept a lot more credibility, I suppose, rather than having some no-name approach game developers and fill ’em in on the latest Must-Have.

      • Nitrodist
      • 12 years ago

      Ok, you have no idea what you’re talking about. Sorry.

    • MrJP
    • 12 years ago

    PhysX cards obsolete overnight, then?

      • Damage
      • 12 years ago

      Keep it in your museum next to the Voodoo 2.

        • MrJP
        • 12 years ago

        There are two Voodoo 2s in my personal “museum”, so no space for real dead ends like the PhysX unfortunately. 😉

        • computron9000
        • 12 years ago

        I’ve actually still got one. Want to buy it, Damage? It is pristine. It might even still work.

        • Vrock
        • 12 years ago

        Heh. Those cards don’t deserve to sit next to the Voodoo 2. The Voodoo 2 was one heck of a product and it had a fine run… PhysX cards? Meh.

          • Bauxite
          • 12 years ago

          Yep, how quickly people forget.

          GLQuake + 3dfx, you know, the real reason there are 3D gamer cards worth a damn today, instead of only having the choice of $5k+ 3D CAD accelerators, non-realtime rendering, and probably really bad consoles?

        • ludi
        • 12 years ago

        Museum, schmuseum. I have a working 12MB Voodoo2 installed in a working P233MMX with an S3 Trio64 2MB and 64MB of system RAM…

      • Sargent Duck
      • 12 years ago

      Sorry, #1. This has nothing to do with your post.

      *[
