
Ageia's many challenges
Since their first press release hit the wire, naysayers have been predicting Ageia's failure—and with good reason. You may have gathered by now that Ageia is attempting to do something fundamentally difficult. They face a number of challenges, including the chicken-and-egg problem involving a dearth of PhysX support in games and the lack of an installed base of PPU hardware. I must admit that I don't have much of a taste for all of the triumphalist doomsaying we've been hearing. As a PC enthusiast, I love the idea of realistic physics simulations in games, and I'm generally favorably inclined toward new types of custom chips to make it happen. One would hope the PC gaming market would attract efforts like this one and reward them if they succeed.

That said, Ageia's prospects are undeniably cloudy. We've already talked about how Ageia is addressing the software development question, but we should probably consider some other dark clouds on the horizon, as well. Among them:

Physics is not graphics. Everyone loves to use the analogy of GPUs when thinking about the development of physics acceleration. It's almost inescapable. That analogy is sometimes helpful but fundamentally flawed, kind of like the car analogies that have plagued CPU performance discussions since the dawn of time (or the 1970s, whichever came first). There are many ways in which physics and graphics are different, but the one that matters most, I think, has to do with the way physics support can be incorporated into games.

Old-timers like me remember when the first 3D graphics cards arrived. We were able to pop in a 3dfx Voodoo card or the like and get better graphics almost instantly thanks to modified versions of existing games, like GLQuake. The image quality was better than with software rendering, and we could run games at higher display resolutions, too. This instant gratification sparked a wave of upgrades and helped 3dfx become a household name in a matter of months. Physics, however, has no easy analog to higher display resolutions and better-quality texture filtering. Gamers can't grab a PhysX card and expect an instant payoff. We'll have to wait for games to catch up, and that could literally take years.

Eye candy isn't interactivity. When PhysX support does arrive in games, it will likely take the form of improved visual effects, as it does today in Ghost Recon Advanced Warfighter. When you blow stuff up, the smithereens are legion. Bits and pieces of things are flying everywhere. But none of it affects gameplay in any meaningful way, because the game's physical world wasn't designed with hardware-accelerated physics in mind. Nifty visual effects present lots of opportunities to game developers, but if that PhysX card is going to be worth my money, I want to feel the impact of physics acceleration. Getting game developers to change their assumptions and really take advantage of physics hardware in ways that alter gameplay will probably be extremely tough, especially since, one would presume, the software-based fallback will be much slower. Playing the same game on a non-PPU-equipped system would have to be a different experience, with fewer physical objects onscreen and fewer possibilities for interaction.
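To make that distinction concrete, here is a minimal sketch, in plain C++ rather than any real PhysX or NovodeX API, of the split a developer effectively has to make: a gameplay-critical tier that must always fit a CPU-only budget so every player sees the same world, and an effects-only tier whose object count scales up when a PPU is present. All of the names and numbers below are illustrative assumptions, not anything Ageia has published.

```cpp
// Hypothetical sketch (not the actual PhysX SDK) of splitting a game's
// simulation into a gameplay tier and a cosmetic effects tier.
#include <cstdio>

enum class PhysicsBackend { SoftwareCPU, HardwarePPU };

struct SimulationBudget {
    int gameplayBodies;  // rigid bodies that can affect the outcome of play
    int effectsBodies;   // debris, particles, cloth: visual only
};

// The gameplay tier stays constant so PPU and non-PPU players get the same
// game; only the cosmetic tier grows when hardware acceleration is present.
SimulationBudget chooseBudget(PhysicsBackend backend) {
    SimulationBudget budget;
    budget.gameplayBodies = 400;  // assumed CPU-safe baseline
    budget.effectsBodies = (backend == PhysicsBackend::HardwarePPU)
                               ? 20000   // assumed PPU headroom
                               : 1000;   // software fallback
    return budget;
}

int main() {
    SimulationBudget cpu = chooseBudget(PhysicsBackend::SoftwareCPU);
    SimulationBudget ppu = chooseBudget(PhysicsBackend::HardwarePPU);
    std::printf("CPU fallback: %d gameplay bodies, %d effects bodies\n",
                cpu.gameplayBodies, cpu.effectsBodies);
    std::printf("With PPU:     %d gameplay bodies, %d effects bodies\n",
                ppu.gameplayBodies, ppu.effectsBodies);
    return 0;
}
```

The catch is exactly the one described above: as long as the gameplay tier has to fit the software fallback's budget, the PPU ends up buying you more smithereens rather than a meaningfully different game.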

There's no killer app. This one flows from the last two and is very simple. I'm not the first one to say it, either. The PhysX PPU needs at least one really good game to demonstrate its power and really sell people on the concept. So far, it's not here, and I'm not even sure a strong contender for this role is imminent.

The GPU/MPU challenge. ATI and NVIDIA have already teamed up with Ageia's rival in the physics middleware market, Havok, to get preliminary demos of a GPU-accelerated physics API up and running. The graphics guys are talking big about the number of FLOPS they can bring to physics processing and are hungry to prove GPUs can do more than push pixels. The GPU-accelerated physics API, Havok FX, is currently an eye-candy-only affair, so game-critical physics simulations must still happen on the CPU. Still, if dedicating a second or third (or fifth) graphics card to physics can achieve results similar to Ageia's in the short term, Ageia's life could get complicated.

On top of that, execution cores in microprocessors are multiplying like rabbits these days, with two cores set to begin giving way to four by this time next year. I'm convinced that a custom chip designed for physics could theoretically outdo a multi-core CPU and probably a GPU in terms of peak physics processing capabilities, and Ageia talks a lot about the joys of custom chips when this topic comes up. Best to leave the graphics to the GPU and the game AI to the CPU, they say. What they haven't convinced me of, however, is that a multi-core CPU or a GPU, to say nothing of the two in combination, isn't sufficient to deliver in-game physics that are incredibly realistic and compelling.

Of course, just above we were fretting that the gap between the PPU haves and have-nots might be too large, so who the heck knows?

Is PCI the short bus to physics? I bring this one up because a few folks in our news comments persistently cite it as a show-stopping problem. Right now, PhysX cards will only plug into a PCI slot, and ye olde PCI is known for being relatively slow. No doubt Ageia chose PCI for cogent reasons, like the fact that they started development long ago, when PCI-E was but a gleam in Intel's eye, or that they want to sell lots of cards as upgrades for existing PCs. Still, Ageia does have plans for a PCI Express version of the PhysX card at some point in the future.

Some folks seem to think PCI is impossibly slow for a really solid PPU implementation. I've asked Ageia about this issue repeatedly, and they insist that using PCI is really not a problem. Given that the cards have 128MB of fast local memory, I'm mostly inclined to believe them. Of course, we won't know for sure until we have games that truly stress a PhysX card's capabilities.
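For what it's worth, a rough, idealized calculation suggests why that local memory matters. Assume the simulation state itself stays on the card and only updated object poses cross the bus each frame; 32-bit/33MHz PCI tops out at roughly 133 MB/s in theory. The frame rate, per-object payload, and zero-overhead transfer below are assumptions for illustration, and real-world PCI throughput is lower and shared with other devices, but the numbers still land in the tens of thousands of object updates per frame:

```cpp
// Back-of-envelope arithmetic on the PCI bandwidth question.
#include <cstdio>

int main() {
    const double pciPeakBytesPerSec = 133.0e6;  // theoretical 32-bit/33MHz PCI peak
    const double framesPerSecond    = 60.0;     // assumed target frame rate
    // Assumed per-object update: position (3 floats) + orientation quaternion
    // (4 floats) = 28 bytes, ignoring command traffic and transfer overhead.
    const double bytesPerObjectUpdate = 28.0;

    double bytesPerFrame   = pciPeakBytesPerSec / framesPerSecond;
    double objectsPerFrame = bytesPerFrame / bytesPerObjectUpdate;

    std::printf("Bus budget per frame: ~%.2f MB\n", bytesPerFrame / 1.0e6);
    std::printf("Object pose updates per frame at peak: ~%.0f\n", objectsPerFrame);
    return 0;
}
```

If a game instead had to stream large chunks of collision geometry or simulation state across the bus every frame, PCI would look far more cramped; presumably that is precisely what the 128MB of onboard memory is there to avoid.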

Is there room in the market? So you drink Bawls soda by the gallon and split your free time between Counter-Strike tournament play, trying out the latest game demos, and mastering the nuances of Oblivion. On the side, you've developed an entire alternate personality in WoW. You want to prove your bona fides by building yourself the ultimate gaming rig, but you're on a generous-but-strict $1200 budget. How do you choose between paying more for a dual-core CPU, ponying up for a discrete sound card, abusing the plastic for a second video card, adding a drive for RAID 0, or going for that gorgeous new 20" LCD? Wait, now you're supposed to buy a separate card for physics, too? For almost $300?!

Even if PhysX is worthy, Ageia may find it difficult to prosper in a PC market crowded with other pricey, enthusiast-oriented goodies.

I suppose I could dream up some more potential problems for Ageia, but that about covers the major ones. I think one really good killer app that illustrates compelling potential for PhysX could cut through most of these clouds like a bolt of sunlight, but it's not here yet.

Since we're a PC hardware review site, I'm bound by law and social contract to test the PhysX card and make some graphs. That portion of the review follows.