Nvidia supports developer porting PhysX to Radeons

Late last month, NGOHQ posted a screenshot of PhysX software apparently running on an AMD Radeon HD 3850 graphics card. Nvidia began porting PhysX to its GeForce cards using its CUDA application programming interface (API) after acquiring Ageia, but there’s no official way to run PhysX on Radeons just yet.

Many initially dismissed the screenshot as a hoax, but NGOHQ maintains that it’s actively porting the PhysX API to run on Radeons. In fact, the site claims it has received support from Nvidia to do so, and it invites readers to sign up for a beta test program.

The folks at TG Daily have contacted Nvidia to verify that story, and surprisingly, the company admitted to working with NGOHQ editor-in-chief Eran Badit. "Eran and I have been talking via email and we have invited him to join NVIDIA’s registered developer program. We are delighted at his interest in CUDA and in GPU accelerated physics using PhysX," stated Nvidia Developer Relations VP Roy Taylor. Nvidia PR chief Derek Perez chimed in, "We’ll help any and all developers who are using CUDA. That includes tools…documentation…and hands-on help."

AMD has its own general-purpose APIs for GPUs, and it announced last month that it was working with middleware developer Havok to bring physics to its Radeon graphics cards. Perhaps for those reasons, NGOHQ claims AMD has refused to send it a Radeon HD 4800-series card sample. A successful port might happen without AMD’s backing, though. In an interview with Bit-Tech back in April, Nvidia Chief Scientist David Kirk mentioned, "We do take every opportunity to discuss the ability to run CUDA with anyone who’s interested. It’s not exactly an open standard, but there’s really not very much that is proprietary about it. Really, it’s just C and there are these functions for distributing it."
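
For readers who have never touched CUDA, here is a minimal, purely illustrative sketch of what Kirk is describing: a kernel is ordinary C carrying a __global__ qualifier, and the <<<blocks, threads>>> launch syntax is the "function for distributing it" across the GPU. The kernel name and sizes below are our own example, not code from Nvidia or NGOHQ.

    #include <cstdio>

    // Illustrative only: a kernel is plain C plus the __global__ qualifier.
    __global__ void scale(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // element owned by this thread
        if (i < n)
            data[i] *= factor;
    }

    int main()
    {
        const int n = 1024;
        float *d_data;
        cudaMalloc((void **)&d_data, n * sizeof(float));   // buffer in GPU memory
        scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);  // distribute across 256-thread blocks
        cudaDeviceSynchronize();                           // wait for the GPU to finish
        cudaFree(d_data);
        printf("scaled %d elements on the GPU\n", n);
        return 0;
    }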

Comments closed
    • eitje
    • 11 years ago

    so, really, this would be a port of CUDA to AMD’s hardware. That would be significant.

    • Umbragen
    • 11 years ago

    So if I want to use this driver, I shouldn’t install Vista 64?

    • vojc
    • 11 years ago

    maby NVIDIA will buy one ATI 4000 series for that guy, becouse it is look like that ATI don t wan t giwe it to him

      • d0g_p00p
      • 11 years ago

      i tink u are wrung. mybe he shal purchade one himselfs. jut a toght.

        • DrCR
        • 11 years ago

        You sir just made my day.

    • mr_greedy
    • 11 years ago

    I just wish these CUDA accelerated apps would hurry up and come out..

    Video transcoding? 7z compression?

    Please?

    • bogbox
    • 11 years ago

    One thing is clear: AMD is in a very privileged position now.
    Intel and Nvidia need AMD more than ever: Intel for Havok and CrossFire, and Nvidia for CUDA and chipsets.
    So AMD is vital in this epic Intel-vs-Nvidia war, and it probably gets money from each of these companies (enemies) for supporting “things” of theirs.
    The irony of it!
    If Nvidia wants large-scale PhysX adoption, pay ATI!

      • WillBach
      • 11 years ago

      Not only that, but Intel and NVIDIA are also COMPETING against AMD in their respective markets. This is sort of like watching a three-way, close-range grenade fight! That said, I think we are currently seeing the best possible market setup for fostering close cooperation (important for making sure our computers actually work) and intense competition (important for making next year’s computers fast and cheap)!

      • Meadows
      • 11 years ago

      Large-scale adoption -> pay ATI?

      Bwhahahahaha, thanks for a gigantic laugh.
      ATI video cards were the minority (by far) last I checked. To initiate large-scale adoption, Nvidia only needs to release it, with no extra effort.

      As for AMD’s position, it doesn’t look as rosy as you say, so I doubt they’re getting paid that much by the competition.

        • ish718
        • 11 years ago

        AMD’s market share is undoubtedly rising; soon Nvidia won’t have the majority of the market if AMD keeps up the good work.

    • wingless
    • 11 years ago

    CONVERSATION CARRIED OVER FROM TUESDAY SHORTBREAD:

    Original Post:

    I feel that Nvidia is trying to regain some PR points with their move to support PhysX on AMD hardware. This move also makes CUDA more acceptable in the eyes of GPGPU developers. It will ultimately lead to Nvidia having control over the GPGPU software market and that probably won’t be a good thing.

    Still, it’s nice to know that the 4870 I bought yesterday and my old 2900 XT will both have a place in a new, power-hungry system.

      • flip-mode
      • 11 years ago

      Can’t believe you bought a -[

        • asdsa
        • 11 years ago

        I can’t believe people are buying GTX 280s.

          • Meadows
          • 11 years ago

          Jackasses have been, are, and will be.

            • greeny
            • 11 years ago

            While I wouldn’t buy a 280 myself, I think calling people jackasses for doing so is a bit much. They’re a jackass if they walk away thinking they got a good deal, but if you want the fastest single card on the market and have the money, then that’s the card to buy.

    • l33t-g4m3r
    • 11 years ago

    I still think it’s a hoax.
    Falling Leaf Systems’ Alky project, anyone?

    NGO hasn’t done anything that proves beyond a doubt they actually have something, and they’re keeping tight reins on their forums.
    Every post is read by a mod before it’s posted, and anything skeptical is immediately deleted.

      • Scrotos
      • 11 years ago

      Why would a mod let a message post that was skeptical in the first place, then go back and delete it? Wouldn’t it make more sense for them to not let the message post in the first place?

        • l33t-g4m3r
        • 11 years ago

        Correct, the message does not get posted.

    • wibeasley
    • 11 years ago

    Maybe Nvidia believes that the CUDA approach will be at a disadvantage on AMD’s architecture? That would exaggerate any existing superiority of Nvidia cards while increasing adoption of the CUDA language.

    • Krogoth
    • 11 years ago

    Nvidia is doing this for the impending battle against Intel.

    Better to have allies when you are going against the real overlord of the market, especially when they have plans to make a programmable GPU. 😉

      • TheEmrys
      • 11 years ago

      +1. Nvidia really wants to go after the big boy.

        • poulpy
        • 11 years ago

        I’d say it’s more the other way around, with the big boy really fancying a go at both Nvidia and ATI (well, another go really, as it failed back in the day).
        And Nvidia is bracing itself and covering all bases before this happens, which makes the incoming Intel attempt all the more credible.

      • Mystic-G
      • 11 years ago

      Agreed… but I think Intel will come with more bark than bite.

        • greeny
        • 11 years ago

        If Intel’s GPU somehow manages to beat either AMD or Nvidia, who have both been making performance GPUs for many, many years, heck, I’ll eat all of my hats lol

        Suppose it could happen, but I really, honestly don’t think so.

    • asdsa
    • 11 years ago

    “It’s not exactly an open standard…”, “…it’s just C and there are these functions for distributing it”. I bet there are some really good functions like this:

    if (hardwareName.startsWith("nvidia")) {
        processingSpeed = 1;
    } else {
        processingSpeed = 0.5;
    }

      • aleckermit
      • 11 years ago

      lol, too true.

      ps: man, I hated AP Computer Science in high school.

      • anjulpa
      • 11 years ago

      CUDA is just a language abstraction; AMD (or a guy in a garage with CTM experience) could write a compiler that makes sure processingSpeed is 1.0.

    • ApockofFork
    • 11 years ago

    I’m not sure Nvidia is thrilled about CUDA on Radeons, because I don’t think they really charge for CUDA (perhaps software as a service?), but PhysX on Radeons could only help them sell the PhysX package, so this makes sense. I’m not really sure how they are “porting” CUDA to Radeons. There was a forum thread discussing this, and it still seems kind of odd.

      • slaimus
      • 11 years ago

      A game using PhysX (Nvidia-controlled) is a game NOT using Havok (Intel-controlled), so Nvidia has more leverage against Larrabee.

    • anjulpa
    • 11 years ago

    Now, what if CUDA applications start running faster on AMD?

      • WillBach
      • 11 years ago

      If CUDA applications start running faster on AMD GPUs, then PhysX becomes more valuable and even more widely accepted. Developers would feel confident that almost all modern gaming computers could run their program, as opposed to only some. NVIDIA wouldn’t lose all of its market share overnight; it would still have a chance to release faster GPUs in the future and win back market share in a market where PhysX is much more important than it had been.

      A timeline, in simpler terms:

      Right now: NVIDIA cards power 99% of PhysX-capable computers, which represent a sizable but growing minority of computers used for gaming. That said, PhysX risks being overlooked by developers who want their games to run on as many of their customers’ computers as possible (if AMD continues to win the graphics fight with its 4850 and 4870 cards, and those cards don’t offer PhysX support, PhysX could be marginalized).

      Short-term future: AMD’s cards support PhysX, so now NVIDIA cards power slightly more than 50% of PhysX-capable computers, which represent a large fraction of computers used for gaming and will represent much more than half of computers used for gaming two years from now. NVIDIA is losing market share by percentage, but the fraction of computers capable of PhysX is growing quickly.

      Medium-term future: PhysX plays a significant role in the games industry.

      Long-term future: NVIDIA releases new graphics cards, gains back market share. Life continues.

      Either that, or NVIDIA could release a separate, more closed version of PhysX, which any developer in its right mind would avoid…

    • anjulpa
    • 11 years ago

    [repeat post] sorry!

    • A_Pickle
    • 11 years ago

    Wow. Good on Nvidia.
