A first look at Nvidia’s GPU physics

If you’d told me a year ago that my PC would have hardware PhysX support today, I’d have been a little dubious. Last summer, running hardware game physics simulations involved shelling out $150-200 for a PhysX card, and all you got for your investment was limited support in a handful of titles. Not exactly a stocking-stuffer.

That will all change next week. On August 12, Nvidia will release new graphics drivers that will allow owners of most GeForce 8, GeForce 9, and GeForce GTX 200-series cards to use PhysX acceleration without spending a dime. Along with the drivers will come a downloadable PhysX software pack containing free Unreal Tournament 3 maps, the full version of NetDevil’s Warmonger, a couple of Nvidia demos, and sneak peeks at Object Software’s Metal Knight Zero and Nurien Software’s Nurien social-networking service. Nvidia provided us with early access to the pack, and we’ve been testing it over the past couple of days.

Physics on the GPU

Before getting into our tests, we should probably talk a little bit about what PhysX is and how Nvidia came to implement it on its graphics processors. In early 2006, Ageia Technologies launched the PhysX “physics processing unit,” a PCI card with a custom parallel-processing chip tweaked for physics computations. Game developers could use Ageia’s matching application programming interface to offload physics simulations to the PPU, enabling not only lower CPU utilization, but also more intensive physics simulations with many more objects.
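
For the curious, offloading to the PPU looked something like this from a developer’s seat. The snippet below is a minimal sketch of the Ageia-era 2.x API as we recall it, so treat the exact names as illustrative rather than as SDK documentation:

```cpp
// Minimal PhysX 2.x-style setup: create the SDK, create a scene, and step
// the simulation once per frame. Names follow the Ageia-era API from
// memory; consider them illustrative.
#include <NxPhysics.h>

NxPhysicsSDK* gPhysicsSDK = NULL;
NxScene*      gScene      = NULL;

bool InitPhysics()
{
    gPhysicsSDK = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!gPhysicsSDK)
        return false;

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    gScene = gPhysicsSDK->createScene(sceneDesc);
    return gScene != NULL;
}

void StepPhysics(float dt)
{
    // Kick off the step; with a PPU (or, now, a GeForce) handling the
    // solver, this work leaves the CPU free for other game code.
    gScene->simulate(dt);
    gScene->flushStream();

    // Block until results are ready, then read back object transforms.
    gScene->fetchResults(NX_RIGID_BODY_FINISHED, true);
}
```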

We reviewed the PhysX PPU in June 2006, but we came away somewhat unimpressed by the hardware’s intimidating price tag (around $250-300) and the dearth of actual game support. Ageia displayed some neat effects in its custom tech demos, but actual games like Ubisoft’s Ghost Recon Advanced Warfighter used the PPU for little more than extra debris in explosions.

As PhysX PPUs seemed to be fading into obscurity, Nvidia announced plans to purchase Ageia in February of this year. Barely a week after the announcement, Nvidia said it would add PhysX support to GeForce 8-series graphics cards using its CUDA general-purpose GPU API. The idea looks great on paper. Running a physics API on a popular line of GPUs bypasses the need for expensive third-party accelerators, and it should spur the implementation of PhysX effects in games. Nvidia counts 70 million GeForce 8 and 9 users so far, which is probably quite a bit more than the installed base for PhysX cards.

The PhysX API is quite flexible, as well, since it can scale across different types of hardware and doesn’t actually require hardware acceleration to work:

The PhysX calculation path. Source: Nvidia.

Nvidia’s PhysX pipeline patches API calls through to different “solvers” depending on the host machine’s hardware and settings. There are solvers for plain x86 CPUs, Nvidia GPUs, PhysX PPUs, and more exotic chips like the Cell processor in Sony’s PlayStation 3. According to Nvidia, PhysX lets developers run small-scale effects on the CPU and larger-scale effects in hardware. “For example, a building that explodes into a hundred pieces on the CPU can explode into thousands of pieces on the GPU, while maintaining the same frame rate.”
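
In practice, that kind of scaling can be as simple as branching on what the runtime reports. Here’s a hedged sketch that reuses the SDK handle from the snippet above; getHWVersion() is our recollection of the 2.x hardware query, and SpawnDebris() is a hypothetical game-side helper:

```cpp
// Hedged sketch of "scale the effect to the solver": spawn far more debris
// when hardware acceleration is present.
void SpawnDebris(const NxVec3& origin, int index); // hypothetical helper

void ExplodeBuilding(NxPhysicsSDK* sdk, const NxVec3& origin)
{
    const bool hwPhysics = (sdk->getHWVersion() != NX_HW_VERSION_NONE);

    // Same building, same frame-rate target: hundreds of chunks on the
    // CPU solver, thousands when a PPU or GPU solver takes the load.
    const int pieces = hwPhysics ? 3000 : 100;

    for (int i = 0; i < pieces; ++i)
        SpawnDebris(origin, i);
}
```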

To give you an idea of the performance difference between different solvers, Nvidia claims its GeForce GTX 280 can handle fluid simulations up to 15 times faster than a Core 2 Quad processor from Intel. Check out page four of our GeForce GTX 280 review for more details.

How does Nvidia’s PhysX-on-GPU implementation actually affect graphics quality and performance, then? I used my GeForce 8800 GT-powered desktop system as a guinea pig to get a feel for PhysX’s behavior on mainstream graphics hardware.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test system was configured like so:

Processor Intel Core 2 Duo E6400 2.13GHz
System bus 1066MHz (266MHz quad-pumped)
Motherboard MSI P965 Platinum
BIOS revision 1.8
North bridge P965 MCH
South bridge ICH8R
Chipset drivers INF update 8.3.1.1009, Intel Matrix Storage Manager 7.8
Memory size 4GB (2 DIMMs)
Memory type 2x 2GB Corsair ValueSelect DDR2-667 SDRAM
CAS latency (CL) 5
RAS to CAS delay (tRCD) 5
RAS precharge (tRP) 5
Cycle time (tRAS) 15
Command rate 2T
Audio Creative Sound Blaster X-Fi XtremeGamer
Graphics Zotac GeForce 8800 GT Amp! Edition with ForceWare 177.79 beta drivers

Hard drive 2x Western Digital Caviar SE16 320GB SATA
OS Windows Vista Home Premium x64
OS updates Service Pack 1, latest updates at time of writing

The test system’s Windows desktop was set at 1680×1050 in 32-bit color at a 60Hz screen refresh rate. Vertical refresh sync (vsync) was disabled.

We used the following versions of our test applications:

The tests and methods we employ are usually publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Unreal Tournament 3

Our first stop was Epic Games’ latest multiplayer shooter, one of the biggest and most recent titles to take advantage of PhysX hardware acceleration. For our testing, we broke out the Unreal Tournament 3 Extreme PhysX mod pack, which includes three maps chock-full of fancy physics effects: a special version of Heat Ray, which we benchmarked below, as well as capture-the-flag arenas Tornado and Lighthouse.

Let’s start with Heat Ray. Epic featured this map in the original UT3 demo, but the PhysX-enhanced version in the mod pack adds plenty of destructible items, plus an ongoing hailstorm that bombards the environment with hundreds of little ice lumps. Explosions and plasma balls from the shock rifle send hailstones and other debris flying.

A screenshot doesn’t really do the hail effect justice, though. We’ve uploaded part of an Nvidia-recorded demo that showcases it in motion:

Curious to see the impact of those shiny effects on performance, we opted to run some benchmarks. We tested first with GPU physics enabled, then with software physics only, and finally in the default version of the map without added effects. Each time, we played through five 60-second deathmatch sessions against bots and recorded frame rates using FRAPS. This method likely reflects gameplay performance better than pre-recorded timedemos, although it’s not precisely repeatable. Averaging five samples ought to yield sufficiently trustworthy results, though.
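
The aggregation itself is nothing exotic. In sketch form, with made-up figures standing in for real sessions:

```cpp
// Average the per-run FPS figures from several FRAPS sessions. The five
// values below are hypothetical stand-ins, not our actual results.
#include <cstdio>

double AverageFps(const double* runFps, int count)
{
    double sum = 0.0;
    for (int i = 0; i < count; ++i)
        sum += runFps[i];
    return sum / count;
}

int main()
{
    // One average-FPS figure per 60-second deathmatch session.
    const double runs[5] = { 41.2, 38.7, 40.5, 39.9, 42.0 };
    std::printf("Mean FPS across runs: %.1f\n", AverageFps(runs, 5));
    return 0;
}
```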

These numbers say it all. Using part of the GPU to compute fancy physics effects induces a performance hit, although in this case, that hit was small enough not to seriously affect playability. The version of the map without PhysX effects did feel noticeably smoother, though. As for running the PhysX map in software mode, you can forget it—long stretches of frame rates in the single digits made that config unplayable.

We also had a stroll through the other two maps—Tornado and Lighthouse. The latter isn’t particularly interesting unless you really like destructible walls and floors, but Tornado uses PhysX capabilities in a cooler and more original way.

A tornado slowly crawls through the map and sucks in just about everything in its path: debris, rocks, wall chunks, pipes, shipping crates, and even liquid from a toxic pool. Roof plates bend like sheets of paper toward the sky, while projectiles from the game’s flak cannon fly up in circles if you fire into the tornado. Trying to play a CTF match in this map is an interesting experience, since the tornado creates new obstacles by repositioning large objects, and it can kill players with flying debris or by flinging them against walls. Personally, I thought seeing my freshly killed corpse swallowed up into the heavens made waiting to respawn more fun.

UT3’s PhysX implementation isn’t perfect, of course. We encountered a number of bugs, such as objects vibrating in place and occasionally sliding in strange patterns. Planks and stone slabs in the Lighthouse map unrealistically exploded into many pieces, kind of like giant graham crackers. That said, these maps came out before Nvidia’s acquisition of Ageia, so I’m not too surprised they weren’t polished to a mirror shine.

Nvidia’s PhysX Particle Fluid Demo

Many of us love Unreal Tournament 3, but what kind of physics eye-candy can we expect to see in next-gen games? Nvidia has whipped up a couple of demos to showcase just that. One of those is the PhysX Particle Fluid Demo, which pretty much does what you’d expect: take a gazillion particles, make them look water-y, set them loose in a sample map, and have the GPU simulate their interactions. In theory, this technique should let game developers achieve the nirvana of fully interactive volumetric water. In practice, it looked more like tapioca soup.

Yes, the water flows sort-of-realistically and fills little pools like it’s supposed to. But the liquid has a strange, almost jelly-like quality, and you can see circular “water” particles fly around every now and then. Perhaps a greater number of particles would make the effect more believable, or perhaps better-looking shader effects would do the trick. Either solution probably wouldn’t improve the demo’s already-low frame rates, though.

Volumetric, particle-based liquids may work great when everyone has GeForce GTX 200-class hardware (or better), but I’d be surprised if many developers were implementing this effect in their games today—especially when titles like 2K Games’ BioShock manage to fake volumetric liquids quite believably.

As a side note, the software PhysX implementation in this test only seemed to use one processor core. CPU utilization was paradoxically higher in the hardware physics mode, even though the GPU shouldered the simulation work.
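
For reference, describing one of these particle fluids in the Ageia-era 2.x SDK looked roughly like the snippet below. The field names are recalled from memory and the values are arbitrary, so consider this a sketch of which knobs exist rather than a recipe from Nvidia’s demo:

```cpp
// Rough shape of a PhysX 2.x SPH fluid description. Field and enum names
// are recalled from the Ageia-era SDK and may not be exact; values are
// arbitrary. An emitter or initial particle data would still be needed
// to actually spawn particles.
NxFluidDesc fluidDesc;
fluidDesc.maxParticles           = 32768;    // the "gazillion particles" knob
fluidDesc.restParticlesPerMeter  = 50.0f;    // packing density at rest
fluidDesc.kernelRadiusMultiplier = 2.0f;     // how far particles influence each other
fluidDesc.stiffness              = 50.0f;    // pressure response
fluidDesc.viscosity              = 22.0f;    // too high, and water reads as jelly
fluidDesc.simulationMethod       = NX_F_SPH; // full particle-particle interaction

// Assumption: a flag along these lines requested the hardware solver.
fluidDesc.flags |= NX_FF_HARDWARE;

NxFluid* fluid = gScene->createFluid(fluidDesc);
```

Raising maxParticles is exactly the trade described above: a more convincing liquid, paid for in frame rate.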

The Great Kulu

In this demo, Nvidia shows off soft-body physics through Kulu, a giant tentacled slug-caterpillar that chases you down corridors by hideously distorting itself like a trash bag full of Jell-O. I’ve had to sleep with the light on ever since testing this.

Gross.

Nvidia may have written this demo with its GeForce GTX 200 graphics cards in mind, but we had no trouble playing it at 1680×1050 on our lowly GeForce 8800 GT. We didn’t benchmark this particular test, because we somehow couldn’t run it with PhysX acceleration disabled. You probably get the idea by now, though—PhysX-heavy games and demos tend to run like slideshows without hardware acceleration.

The Great Kulu gives us an interesting glimpse at how games could feature more “organic” objects that bend and squeeze depending on what they collide with. I can’t be the only one tired of seeing rag-doll character corpses that behave like they’re made of cast titanium. Nvidia’s demo goes a little over the top with completely Jell-O-like objects, but the effect remains cool nonetheless.
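
The textbook way to get that kind of squash-and-stretch is a mass-spring lattice: point masses joined by springs that resist stretching. The generic sketch below illustrates the technique; it isn’t taken from Nvidia’s demo, and the unit-mass simplification is ours:

```cpp
// Generic mass-spring soft-body step: point masses joined by springs that
// pull toward a rest length. This is the textbook technique, not code from
// Nvidia's demo; unit masses are assumed for brevity.
#include <cmath>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };
struct Spring   { int a, b; float restLen, k; };

void StepSoftBody(std::vector<Particle>& p, const std::vector<Spring>& springs,
                  float dt, float damping = 0.98f)
{
    for (const Spring& s : springs) {
        Particle& pa = p[s.a];
        Particle& pb = p[s.b];
        float dx = pb.x - pa.x, dy = pb.y - pa.y, dz = pb.z - pa.z;
        float len = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (len < 1e-6f)
            continue;

        // Hooke's law: force proportional to stretch beyond rest length.
        float f = s.k * (len - s.restLen) / len;
        pa.vx += f * dx * dt; pa.vy += f * dy * dt; pa.vz += f * dz * dt;
        pb.vx -= f * dx * dt; pb.vy -= f * dy * dt; pb.vz -= f * dz * dt;
    }

    for (Particle& q : p) {
        // Damping keeps the body from jiggling forever.
        q.vx *= damping; q.vy *= damping; q.vz *= damping;
        q.x += q.vx * dt; q.y += q.vy * dt; q.z += q.vz * dt;
    }
}
```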

Update 08/19: The Great Kulu demo seems to support GPU-accelerated PhysX only on GeForce GTX 200-series cards, so physics simulations ran on the CPU in my testing with the GeForce 8800 GT. Because frame rates felt (mostly) playable, I incorrectly assumed physics acceleration was forced on when it was actually forced off. Nvidia says the following about running the demo with software PhysX:

This demo is available for free and can be installed and played without PhysX acceleration enabled. However, the minimum system requirements anticipate PhysX being accelerated, and it is likely that non-PhysX accelerated systems will experience severe performance degradation at times of high physics load (the ending room). This degradation will not be present at all moments, but should be clearly evident during standard play.

This is more or less consistent with my experience, although I attributed the slowdowns to the GPU choking under the load instead of the CPU.

Conclusions

It’s hard to dislike what Nvidia has done with PhysX. The company has taken an expensive niche product and given it to the masses for free, while at the same time giving game developers a replacement for the apparently defunct Havok FX API. Our brush with the ForceWare 177.79 driver release has shown that a sub-$150 graphics card can handle PhysX effects quite well, and Nvidia card owners can flock to the (small) library of existing PhysX-capable software without having to wait for new games to come out.

Speaking of new games, Nvidia told us about two upcoming titles that will feature PhysX hardware acceleration. One of them is DICE’s Mirror’s Edge, which will feature awesome-looking first-person free running in a futuristic dystopia. Another is NaturalMotion’s Backbreaker, a third-person football sim. Nvidia claims studios have signed on to implement PhysX in another 10 games, and that’s just in the month following the Ageia acquisition.

The downside of all this ought to be obvious to most folks with a red graphics card. PhysX currently doesn’t work on AMD GPUs, which is a shame considering the excellent performance of the firm’s new Radeon HD 4000-series products. We wouldn’t be surprised to see Radeons gain physics support one way or another, though. Nvidia claims to be supporting an independent developer who wants to port PhysX to AMD cards, and truly widespread use of advanced physics effects may hinge on whether hardware from both companies supports the technology.

With all that said, we probably wouldn’t recommend basing a graphics card purchase on PhysX support. At least not until the dust settles and more PhysX-enabled games come out.

Comments closed
    • Cyril
    • 11 years ago

    Correction: Hardware PhysX acceleration was forced off, not on, in the Great Kulu demo. I’ve updated the article with an explanation.

    • Bummer
    • 11 years ago

    Guru3D did a review on it as well, using a GTX 280 as a graphics card and a 9600 GT as a PhysX card. Looks pretty promising… :-)

    • erick.mendes
    • 11 years ago

    C’mon man… 35% is small?!? It crippled the card! I’d rather pay for a dedicated PhysX card than be left with 2/3 of my GPU’s firepower.

    At least a nicer demo would take the idea to a new level… but why would I buy into a technology that I haven’t seen working right, and that also cripples my GPU?

    That is not going to work. Get back to the drawing board, nVidia.

    • Cannyone
    • 11 years ago

    I can’t help but wonder why these “tests” were not performed with at least 2 8800GTs? I mean it doesn’t seem logical to me that Nvidia intended for this “feature” to be used with just one card.

    • WaltC
    • 11 years ago

    /[

      • eitje
      • 11 years ago

      we could get into the “human eye” argument at this point, but I don’t feel like it. 40 FPS is playable, 60 FPS is playable.

        • WaltC
        • 11 years ago

        Oh, I couldn’t agree more!…;) But you know, how often is it that we see benchmarks run on comparative hardware where differences of far less than 35% are displayed as though they make all the difference in the world, and the topic of “what’s playable and what’s not” is never broached? Every day?

          • eitje
          • 11 years ago

          an excellent point.

          though, I WOULD argue that the Tech Report editors, above most others, do an excellent job of calling out situations where it does and does not occur.

            • WaltC
            • 11 years ago

            Yes, I think TR does an excellent job…but let’s face it, if frame-rate differences make no difference in playability, then there’s absolutely no reason to chart and graph them /[

            • Dagwood
            • 11 years ago

            35% is a lot if you’re comparing two different cards, but in the big picture it is barely noticeable. Even doubling processing power is not that big of a deal. Sure, there is a difference between 30 frames and 15 frames per second, but just lower the resolution and you get those frames back. I don’t usually upgrade until the new stuff is two times faster than what I have.

            • Meadows
            • 11 years ago

            You got it all horribly wrong.
            GPU PhysX was enabled in both situations – it was the map itself that was changed. That’s the purpose of the PhysX pack for UT3: to add a few maps with extra bling, further increasing physics load.
            This whole misperception of “GPU PhysX comes with a performance penalty” needs to die right about now, because it is as untrue a claim as they come.

    • bogbox
    • 11 years ago

    If the GPU does the video processing and the physics, what will the CPU do?
    A GPU is a video processor; why saddle it with a performance penalty?

    I like Havok because it’s free, and you don’t need a second video card to have physics and 100% of the GPU’s speed.
    SLI is an option, but you need to buy the same GPU twice (one for physics and one for video). ATI has a better option with CrossFire X: buy a cheap card for physics and a powerful one for video, but no PhysX (for now).

    PS: put up some benchmarks of the 4800 and 200 series with PhysX!

      • Meadows
      • 11 years ago

      I bet you’re not experienced in the subject, but the utilization of PCs is better this way.
      There is zero performance penalty. None.
      If you remain dumb and skeptical, I’ll explain it in plain language just for you.

        • Damage
        • 11 years ago

        Meadows, if you remain boorish and rude, we’ll ban you. Dial it back and act like a civilized person, or else.

          • Meadows
          • 11 years ago

          If I have to go, I’ll take them with me!
          *yells that as he presses the big red button on his dynamite belt*

            • ludi
            • 11 years ago

            Nah, the police snipers will get you first.

            • Meadows
            • 11 years ago

            Oh sh-
            Splat.

            • A_Pickle
            • 11 years ago

            Your onomatopoeias, being so properly typed and so un-surrounded by asterisks, seem awfully dark and macabre.

            Just saying.

            • eitje
            • 11 years ago

            Have you ever searched for “Gratutious” on TR?

            • Meadows
            • 11 years ago

            Have you ever spelled _[

            • Jigar
            • 11 years ago

            Why can’t you be polite to others? Is something wrong with you, or do you feel that everyone here comes to fight especially with you?

            • eitje
            • 11 years ago

            you’re an asshole. literally.

            • Meadows
            • 11 years ago

            An asshole who can spell!

    • Klopsik206
    • 11 years ago

    Guys, I got a question:

    How about multicore CPU support?
    Does it balance the workload between the CPU and GPU?
    To rephrase: if you ran those benchmarks on a quad core, would the results be significantly higher?
    And what would the results be for running the benches on a quad core with an ATI GPU?

      • bogbox
      • 11 years ago

      This is the point of PhysX: buy two GPUs and no quad (at least that’s what Nvidia said).
      The physics works only on the GPU, so even a single core will do.

        • Klopsik206
        • 11 years ago

        Well, the chart on the first page says it can run on a multicore CPU as well as on the GPU. 😉
        My question is whether it can balance the workload between those two.

    • computron9000
    • 11 years ago

    The problem with water is that it gets things wet (almost never shown) and it usually looks totally unremarkable. The obsession with making it blue I think hampers designers from getting it to look correct. In small volumes water is clear and plain. I also don’t think they do the physics correctly. Water bounces around like it is blobby-rubber instead of being dense.

    • Aphasia
    • 11 years ago

    I’m just afraid devs will go overboard with neat effects before they finally manage to scale them down to enhance, instead of degrade, everything. But oh well, I guess a few games with blobby enemies won’t be so bad.

    • Space Bags
    • 11 years ago

    The problem I have with this is it all comes from Nvidia. Like ATi’s cinema demo, it’s heavily optimized, and doesn’t provide any unbiased tests. This entire article is Nvidia FUD marketing to me, no offense to the author.

    • Jon
    • 11 years ago

    Where do we download the particle fluid demo? I can’t find it on the nvidia site.

      • Cyril
      • 11 years ago

      As the article says, Nvidia plans to release the PhysX pack with the demos we tested on August 12.

      • indeego
      • 11 years ago

      They’ve been crosslinking for like 7+ years now.

      • Jigar
      • 11 years ago

      Nope, do not do that… PCI or PCIe x1 just won’t keep up with the fast pace of your graphics card… Better to put that money together and get a faster graphics card (Nvidia for now).

        • Meadows
        • 11 years ago

        A PhysX card gives you better UT3 performance than letting the GPU do it, but generally either way would be fine.

          • Usacomp2k3
          • 11 years ago

          Not turning physics on will give you the best performance 😉

            • Meadows
            • 11 years ago

            You can’t “not turn physics on” unfortunately, but you can offload it to other components, and you’ll find that you can squeeze more fps out of the game with a separate PPU, especially if you have a gigantic screen and run the game at unearthly resolutions (ergo the GPU is busy and yields no improvements with regards to physics).

          • Jigar
          • 11 years ago

            Just UT3 performance… anything else? Your system will be dang slow… I can bet you on that.

            • moose17145
            • 11 years ago

            Well, I meant if Nvidia pushes physics development into future games. Obviously, in (most) current games it would be of minor or no benefit.

            • Jigar
            • 11 years ago

            But as far as PhysX cards go, current ones come with a PCI or PCIe x1 interface… they will slow down your graphics calculations, because your GPU will have to wait for the PhysX card, which sits on a slower link, to finish its work…

            Well, I could agree with you only if PhysX cards came with an x8 or x16 link…

            • DrDillyBar
            • 11 years ago

            If Ageia originally said that the PCI bus was not a limiting factor, I can’t see why a PCIe x1 slot would not be sufficient bandwidth.

    • SonicSilicon
    • 11 years ago

    The first I saw of this story was on the front page, so, of course, I stared at the image before reading the title below it. My first thought was
    “That is one really strange watercooling mod.”
    I guess that supports ludi’s comment about the level appearing 6 inches tall. Is it also a mark of how many heatsinks are on most modern mainboards?

    I will echo a sentiment already stated; it just doesn’t seem to have any substantial impact on gameplay. This seems especially so for first person shooters, a realm where many competitive and semi-competitive players will strip out as many distracting visual elements as possible (eliminating fog, foliage, flares, etc.)

    • Jakubgt
    • 11 years ago

    Quick question, will my 9600gt support physx?

      • Meadows
      • 11 years ago

      Eventually… I think.

    • lethal
    • 11 years ago

    Wouldn’t it make more sense to test this with one of the quad-core CPUs? Using an older Core 2 Duo at just 2.13GHz is not going to do the software solver any favors.

    • Rza79
    • 11 years ago

    Is Tech Report now removing posts?
    That’s news to me…

      • Damage
      • 11 years ago

      Shockingly, we somehow don’t appreciate you spamming the comments with links to some German website. Please stop before we have to ban you.

        • Rza79
        • 11 years ago

        Well, if that German website answers his question, then I don’t see the issue. Whatever…

    • ludi
    • 11 years ago

    The problem with that water demo is that while water CAN behave like that, it would only do so in a model six inches tall, e.g. those little desktop fountains.

    In real life, a full-size implementation of that contraption (assuming the ladder rungs are sized for a human to climb) would have the water exiting the pipe as a boiling froth and churning up more air as it landed and splashed. Even if there were a big enough pump to achieve laminar flow, it would distort and diffract light through the column much more severely than that, it would still froth the water below upon landing, and splashes would fire out like shrapnel rather than bouncing around as blobs.

    IMO that’s why it looks cheesy. It is realistic, but at the wrong scale.

      • MadManOriginal
      • 11 years ago

      Looks like it’s a Ladder to Nowhere so it must be in Alaska, the water should really be frozen :p

        • ludi
        • 11 years ago

        Actually, the Bridge to Nowhere was intended to link Ketchikan to the airport island across the channel, and Ketchikan is dead west of B.C., on the tail end of a warm ocean current. Instead of ice, it actually has a cold-rainforest climate (12 feet of precipitation each year, but average temperatures in the 40F range outside of summer) and only rarely gets snow or frost 😉

          • MadManOriginal
          • 11 years ago

          Way to take a joke with a ‘:p’ way too seriously. You win the ‘look I’m smart today on the Internets’ award though.

            • ludi
            • 11 years ago

            :p


    • Thorburn
    • 11 years ago

    “Yes, the water flows sort-of-realistically and fills little pools like it’s supposed to. But the liquid has a strange, almost jelly-like quality, and you can see circular “water” particles fly around every now and then. Perhaps a greater number of particles would make the effect more believable, or perhaps better-looking shader effects would do the trick. Either solution probably wouldn’t improve the demo’s already-low frame rates, though”

    This raises an interesting point: surely, as you ramp up the graphical detail and resolution, the GPU physics calculations are going to rob you of increasingly valuable rendering horsepower.

    • PRIME1
    • 11 years ago

    Well I just got my 260, so this news just makes it that much sweeter.

      • Jigar
      • 11 years ago

      Nvidia fan boy officially happy now 😛

        • PRIME1
        • 11 years ago

        I’ve been happy since I got my first TNT2 😉

          • l33t-g4m3r
          • 11 years ago

          The TNT2 is what made me quit buying Nvidia.
          I had a Diamond Viper V770; 32-bit color wasn’t very playable, and Nvidia used an annoying checkerboard dither pattern in 16-bit.
          Then the Diamond custom drivers got 100+ fps in Quake 3, and the Nvidia stock drivers lowered it to 30+ fps.

          Stuff like that didn’t make me wanna upgrade to a GeForce 2; instead I bought ATI.
          Been using ATI ever since.

          Although if I were going to buy another Nvidia card, it might be a 260.

            • Dagwood
            • 11 years ago

            My Diamond Viper 770 is still in my PIII and still works.

            At the very least, PhysX is as interesting as a tessellator or 16xAA. Why is this not a selling point for Nvidia? AMD and Intel don’t have it. AMD does not have the resources, and Intel is not interested in anything it does not own. This is one thing that Nvidia is doing well and deserves a little praise for.

            • ludi
            • 11 years ago

            Mmmppphh…my upgrade history on my main machine is a laundry list of three 3dfx cards, five Nvidia cards, and three ATi. Plus a smattering of other cards for secondary machine(s). All three companies had their strengths and weaknesses at various times, but I can’t see the logic in avoiding any one of those vendors entirely, for years on end. In fact ATi didn’t really start getting a bead on good driver programming until the 9500/9700.

      • A_Pickle
      • 11 years ago

      What would happen if you and Fighterpilot met?

    • MadManOriginal
    • 11 years ago

    q[< Nvidia will release new r[

      • erick.mendes
      • 11 years ago

      Larrabee ring any bells?

    • Jigar
    • 11 years ago

    Hmm… looks like my 2x 8800 GT setup will now face real problems running new PhysX-based games… Hope I can stay with them for at least one more year, or else I am at a big loss here 🙁

      • d0g_p00p
      • 11 years ago

      It seems like every post you make, you talk about your SLI setup. Each post the cards change as well: 8800GTS, 8800GT, 8800GTX, etc. So which pair of fantasy SLI cards do you own?

        • Jigar
        • 11 years ago

        Fantasy = 4x GTX 280
        Real = 2x EVGA 8800 GT Superclocked 😉

        The reason for the reminder, or maybe why I didn’t see it as a brag, is that it’s really common to have a setup like that these days…

    • flip-mode
    • 11 years ago

    Honestly I’m not very impressed. Is it cool? Yeah, kinda. But flying hail doesn’t make a good video game. The tornado level is much more interesting because the tornado and flying objects actually impact the play of the game.

    That water, OTOH, needs to look a lot more like water.

    PhysX is still mainly just a technology with a potentially very promising future, as far as I’m concerned. I’d like to experience it for myself, but if I don’t, I’m not going to be too disappointed.

    • Scrotos
    • 11 years ago

    I would love to see an actual Ageia PhysX card run in these tests as well. Well, considering I have one. But Usacomp has a great point about a secondary video card as well.

    If you do these types of tests again in the future, it’d be great to also add the PhysX card as a “baseline,” plus a second video card, to see if that plan pans out for Nvidia/ATI in their quest to sell more cards.

      • flip-mode
      • 11 years ago

      100%

    • Usacomp2k3
    • 11 years ago

    Do you have another video card that you could throw in real quick, to let us know whether offloading the PhysX to another GPU relieves the primary one of that stress and hopefully brings the frame rate back up to where it was before? (Or if you had an Ageia PhysX card laying around, that’d be a great comparison too.)

    • wingless
    • 11 years ago

    AMD and Intel better step it up with Havok FX! Nvidia’s PhysX seems to be solid and ready for the mass market. It sucks that my HD 4870 won’t be able to play in this arena for a while but this will only promote aggressive competition on AMD/Intel’s side so no worries. Hmm, with Big Bang II drivers we may be able to get a cheap Nvidia card as a dedicated PhysX accelerator. I can’t wait for confirmation of that functionality.
