Gelato has been developed by adhering to a fundamental principle: never compromise on the quality of the final rendered image. As a result, Gelato delivers images that meet the most rigorous demands of the film industry. The software is only a 1.0 release, of course, and it no doubt has its limitations, but it's the beginning of something big. Linux evaluation copies are downloadable, and the program ships with a Maya plug-in. We'll try to get our hands on a copy and benchmark it against Pentiums and Opterons ASAP. I expect it will be many times faster than any current CPU. If we can get it running on a GeForce 6800 Ultra, I expect scary-good things.
Key to this doctrine of no compromises is how Gelato uses the NVIDIA Quadro FX GPU. Instead of just driving the GPU's native 3D engine, as games do, Gelato also uses the hardware as a second floating-point processor. That lets Gelato take advantage of hardware acceleration without being bounded by the limits of the 3D engine. Gelato is one of the first in a wave of software applications that use the GPU as an off-line processor, a "floating-point supercomputer on a chip," and not simply to manage what is displayed on the screen.
I should also note that ATI's ASHLI seems to provide some similar hooks into apps like Maya, but it looks more like a conversion tool than a final-frame renderer.
Some folks thought I'd jumped the gun when I said in this article that GPUs would be taking over for CPUs before too long. It has taken a couple of years, but if Gelato delivers what it promises, the time has arrived.