PhysX 3.0 adds support for multi-core CPUs

As E3 shows us the fruits of game developers’ latest labors, Nvidia has served up a little something especially for those developers: a new version of its PhysX physics middleware SDK. The updated development toolkit, in Nvidia’s words, "has been over three years in the making and features a new modular architecture and a completely rewritten PhysX engine." It’s available now from this page.

Nvidia goes into more detail about PhysX 3.0 on its official blog. Additions in PhysX 3.0 include the ability to merge multiple actors into a single "aggregate . . . managed as a single bounding-box entity" to simplify collision prediction, more efficient streaming of actors into scenes, and improved artist tools.

Arguably more noteworthy is a new Task Manager and managed thread pool, which "allows games to take advantage of multi-core processors on all platforms." You might recall that, last year, we discovered that certain games completely fail to implement PhysX in a way that takes advantage of multiple CPU cores—or even modern instruction sets like SSE. PhysX 3.0, it seems, is tackling that issue.
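The post doesn't show what PhysX 3.0's Task Manager actually looks like, but the general idea — a shared pool of worker threads draining a queue of small simulation tasks so work scales with core count — can be sketched with nothing but standard C++ threads. `TaskPool`, `submit`, and `run_demo` are illustrative names for this article, not part of the PhysX SDK.

```cpp
#include <atomic>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Minimal fixed-size worker pool: tasks go into a shared queue and are
// executed by however many worker threads the pool was created with.
class TaskPool {
public:
    explicit TaskPool(unsigned n) {
        for (unsigned i = 0; i < n; ++i)
            workers_.emplace_back([this] { run(); });
    }
    ~TaskPool() {
        {
            std::lock_guard<std::mutex> lk(m_);
            done_ = true;
        }
        cv_.notify_all();
        for (auto& t : workers_) t.join();
    }
    void submit(std::function<void()> task) {
        {
            std::lock_guard<std::mutex> lk(m_);
            tasks_.push(std::move(task));
        }
        cv_.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> task;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [this] { return done_ || !tasks_.empty(); });
                if (done_ && tasks_.empty()) return;  // drain before exiting
                task = std::move(tasks_.front());
                tasks_.pop();
            }
            task();  // run outside the lock so workers stay parallel
        }
    }
    std::vector<std::thread> workers_;
    std::queue<std::function<void()>> tasks_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

// Demo: run 100 tiny "physics step" tasks on 4 worker threads.
int run_demo() {
    std::atomic<int> counter{0};
    {
        TaskPool pool(4);
        for (int i = 0; i < 100; ++i)
            pool.submit([&] { counter.fetch_add(1); });
    }  // destructor drains the queue and joins the workers
    return counter.load();
}
```

The point of the design — and, per Nvidia's description, of the PhysX 3.0 rewrite — is that the game submits tasks without caring how many cores exist; the pool size is the only tuning knob per platform.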

Perhaps the new multithreading goodness has to do with Nvidia’s greater ambitions for its physics middleware. The company says PhysX 3.0 targets a "broad spectrum of multi-core gaming devices – such as PCs, notebooks and gaming consoles, as well as emerging gaming platforms like handheld game devices, tablets, and smartphones." Failing to take full advantage of the CPU doesn’t make much sense when you’re dealing with handhelds with very limited computing resources.

Responses to “PhysX 3.0 adds support for multi-core CPUs”

  1. I know your expertise on this. I must say we should have an online discussion on this. Writing only comments will close the discussion straight away! And will restrict the benefits from this information.

  2. I don’t really care much for this PhysX thing. It’s not like I need 100% accurate physics in my games. I just need to have fun.

  3. Rubbish. There is no AMD bias to speak of, either from TR staff or the general TR “population.” There are fanbois of both companies but they’re easy to spot and easy to ignore.

    nVidia catches flak from most of TR’s readership (except irrational nVidia fans) for their nasty moves, like pushing the industry towards their PhysX API using what amounts to bribes, to the detriment of developers and consumers alike, and of course their competitor(s).

    You can bet that AMD would receive similar treatment if they pulled similarly underhanded manoeuvres.

  4. Right you are, sir.

    Either way, though, the nVidia PhysX page says ME1/ME2 don’t use PhysX on the PC, so no software PhysX for the PC, unless the nVidia page is incorrect.

  5. Console tards, and Nvidia and AMD trying to tear the physics game out of Ageia’s hands back when they first released PhysX. They did a damn good job of killing it off almost instantly, simply by staking out their territory and telling developers what they were and weren’t going to do.

    Look where GPU-accelerated physics got us today… enabling it destroys your framerate in the few games that use it (which was called back when TR did an article on it).

    It was supposed to be the next step in video game evolution. Big fucking GG to AMD and NVidia.

  6. Not worth commenting, potatochobit, you should know by now. Here in TR comments it’s ATI = good, Nvidia = bad, no matter what the situation is.

  7. I’m surprised how much physics implementation in games has kinda stalled in recent years.

  8. The point of AVX is not to give higher precision (more bits per “number”), but to be able to perform calculations on multiple “lower” precision “numbers” at the same time (SIMD – Single Instruction, Multiple Data).

    Since many of the physics calculations involved are inherently parallel (that’s basically why people started running them on GPUs), they should be able to benefit significantly from SIMD.
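Comment 8's point can be sketched in plain C++: the loop below applies the same operation to many independent elements, which is exactly the shape a compiler can map onto SSE/AVX registers (4 or 8 `float`s per instruction). The `integrate` function and structure-of-arrays layout are illustrative, not PhysX code.

```cpp
#include <cstddef>
#include <vector>

// One explicit-Euler integration step over many particles, stored as
// parallel arrays (structure-of-arrays). Each iteration is independent,
// so a vectorizing compiler can execute several lanes per instruction
// instead of one — the SIMD win described above.
void integrate(std::vector<float>& pos, std::vector<float>& vel,
               float accel, float dt) {
    for (std::size_t i = 0; i < pos.size(); ++i) {
        vel[i] += accel * dt;   // same instruction, many data elements
        pos[i] += vel[i] * dt;
    }
}
```

With AVX this loop can process eight single-precision particles at once; the precision per number is unchanged, only the throughput grows — which is comment 8's distinction.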

  9. I see a lot of press releases regarding PhysX; Nvidia has always been good about talking long and loud about its proprietary features regardless of popularity or lack thereof. What I don’t see are games actually using PhysX for anything PhysX was ever meant for.

    PhysX has always been a major fail but because it was Nvidia’s fail so long as they continue to talk about it then it’s not an epic fail.

  10. I want physics where I play a telekinetic who kills people by throwing shipping containers at them. Somebody make that game, if it doesn’t already exist.

  11. I was using it as a metaphor underlining how grave the difference is between perceived physics now and what physics truly is. You could write a paper on that if you wanted to.

  12. More advanced physics, such as in multiplayer, would really bog down an online game.
    Serious gamers may want their realistic physics, but it won’t work so well in group gaming until the internet changes, and you know how that goes.

  13. [quote<]Do the Mass Effect titles even use PhysX?[/quote<] Mostly no. If I interpret the nVidia PhysX page correctly, then Mass Effect 1 uses the middleware on the Xbox 360, which is a software-only thing. No PhysX, and therefore no hardware acceleration, on the PC, apparently.

  14. It is, but it’s not the kind of physics that gamers really want to see in their games. Most physics so far is eye candy, not much of it is used for actual gameplay purposes

  15. AVX probably won’t do too much. It’s more for HPC stuff, AFAIK. Yes, I know PhysX is a physics sim API, but it’s a fairly quick-and-dirty one in order to be fast. Giving it a 256-bit FPU is a bit of a waste, since they’ll probably never make use of all that precision.

  16. PhysX surely isn’t going to be relevant at all in a couple of years? If it is, then it would be nice to see a TR comment on the pros and cons of it rather than a non-critical press release. Come on, I’ve started to expect such low-quality press-release-based journalism from Ars Technica, not this site. Coverage, Investigation, Opinion, Detail, please! Don’t use the blog excuse either; I could write a blog (it would be useless too, you guys can do so much better).

  17. WTF… doesn’t anyone remember the original marketing for PhysX? It was scalable to as many cores as you had available. NVidia ripped multithreading out of it to push PhysX on the GPU, so it would completely hinder performance on anything other than its GPUs (even though, if it were multithreaded on a quad core, it could outperform a GPU-accelerated variant and give higher framerates to boot).

    Hell, there are even benchmarks comparing two different versions, from before and after NVidia took over PhysX, that show this. I’m baffled by how all of you forgot this.

    This is basically just NVidia tooting its horn because it thinks the GPU-accelerated physics war is over (after they, as well as AMD, tried so hard to kill it off by making grabs they never followed up on), so it’s re-enabling an old feature.

  18. Since it is middleware, I wonder if this can be dropped in for existing games. Did the API change?

  19. you know, if i made this comment it would have a negative eight rating already

    physx was never intended to be a gimmick. but you know, with how rushed and crappy games are these days, developers just have too much to do to get a game to market that fine tuning the capabilities physx brings to the experience just gets pushed to the back of the list.

    any game can look beautiful, it just takes time and money.

  20. Tools like Unity3D are built on PhysX, so it doesn’t take any bribes for thousands upon thousands of developers to use PhysX.

    Sure, 3D games don’t need pieces of floating cloth, but they need a solid collision system and rigid bodies.
    It’s middleware every mildly complex 3D game needs… Bonus: AAA titles get access to complex visual features.

  21. Games don’t need *.

    “There are plenty of other ways to implement physics,”
    then why aren’t content producers putting them in their games left and right without Nvidia branding? There has to be a monetary incentive all around.

    “but as long as devs take bribes I don’t see PhysX going away.”

    Short term this may (arguably) work, but think long term. If consumers don’t see a benefit to this technology, and they see a markup, then they can go with a competitor, and the marketing strategy is a loss, and a big one, as it can ruin your future sales as well as current ones.

    I guess I don’t know what your ultimate complaint is here. Nobody has a game that is 100% physx required, and never will.

  22. What’s to stop Nvidia from using all kinds of other tricks to sap performance: an fps cap, or even removing the optimizations under certain situations? The whole point of the code being inefficient was to give Nvidia/Ageia a dishonest advantage so that they could pretend their product was worth buying.

    All this announcement means is that Nvidia’s planning to grab as much market share as they possibly can with the middleware. Games don’t need PhysX, and it’s been pretty obvious from day 1 that the only reason devs even used the thing was the TWIMTBP program. (All major PhysX games are TWIMTBP titles.)

    There are plenty of other ways to implement physics, and I’d like to see them utilized instead of some obviously deceitful middleware, but as long as devs take bribes I don’t see PhysX going away. I was hoping that DX11 would kill it, and it did for a while, since PhysX completely dropped off the map when Fermi came out (PhysX was Nvidia’s stopgap gimmick vs. DX11), but alas, the temporary relief was too good to be true.

  23. You don’t need an Nvidia card, but the game publisher still needs to pay a license for it.
    Many games have been doing this for a long time; if you buy a game on Steam, you will sometimes see a PhysX redistributable download package.

  24. Will this have any effect on older PhysX games using software rendering such as Mass Effect 1 and 2 once the drivers are updated? Or does it require the game to be compiled with PhysX v3.0 in it?

  25. Nah, Intel will be just as screwed as AMD, it’ll be best (CPU wise) on ARM.

  26. It looks like they think they can have their cake and eat it, too. They are pretty much driving the high end mobile space and could potentially turn PhysX into a de facto standard for the coming generation of mobile games.

    They’ll surely use it as a selling point for the next generation of consoles, which will be powered by a single CPU+GPU SoC, likely with quite a few cores. They can’t just supply the GPU anymore, so they’ll have to offer a compelling platform, or they’ll be shut out of yet another market. Winning over just one console could mean hundreds of PhysX games.

    None of those things depend on PhysX’s success with PCs. In the past, PhysX was just a way to upsell people to a more expensive video card, which used to be a big part of their business model. Some doors closed and a lot more opened since then.

  27. You know, I think you may be right. After all, it’s this same company that artificially stopped Radeon users from having a GeForce card as a PPU alongside it.

    Makes no sense in my book. If PhysX was so great, then people who have Radeons may have been compelled to buy a GeForce card too. More sales for Nvidia = more £££ in profit…

  28. And the thousands upon thousands of iPhone games… if something collides in the game, it’s most likely PhysX code.

  29. Even weirder, it seems more and more developers are using it. It’s like it’s everywhere now…

    And I haven’t heard of any of the big players backing off; all the new announcements I’ve seen just reinforce its use all over the place.

    I guess it’s not that weird then 🙂

  30. OAS meant that they were 100% about making CPUs look bad and GPUs look awesome back in 2008. They were pushing only their GPUs with their Physx API. Now they have some ARM CPUs that they want to leverage it on too.

    On the consoles their Physx library has been much more CPU optimized though because it absolutely has to be there.

  31. What do you mean, even then? Go play it with and without PhysX enabled; it’s an obvious difference.

  32. Sure but if they properly implemented it in the first place it could be ubiquitous by now. Then when Nvidia is the only company with the ability to handle it in the mobile market place it would have been a huge selling point.

    Not to mention all the licensing money they could have made.

  33. I’ve been hearing about this PhysX crap for the last 4 years or so and I’ve yet to see anything truly tangible and accessible to consumers.

    Maybe Brian_S can fill me in

  34. Au contraire, in the ballooning market of tablets, smartphones, and soon to be ARM SoC based Windows 8 laptops, you won’t have any other choice.

  35. Nvidia will make it magically faster on Intel/Nvidia hardware while crippling the performance on AMD(ATI) hardware…yes NV is that petty.

  36. In 2008, Nvidia wasn’t building quad-core CPUs. The timing is completely self-serving.

  37. I hope they add AVX support. I would like to see benchmarks between GPU and CPU when using an 8-core (true cores, not threads) chip with AVX, from either Intel or AMD.

  38. Ha, looks like you may no longer need to buy Nvidia hardware to use PhysX properly. Any decent quad core CPU can carry out the physics and let the graphics card get on with the rendering…