Is the T-buffer back?


— 9:08 PM on January 28, 2003

Kyle at HardOCP has posted some very interesting info about screenshots in the recent round of GeForce FX previews. He writes:

The GeForceFX's technology applies filters that affect AntiAliasing and Anisotropic filtering before the frame buffer and after the frame has left the frame buffer. In short, this means that all of our screenshots do not accurately represent the true in-game visual quality that the GFFX can and will produce, as the screenshots were pulled from the frame buffer (in the "middle" of the AA process). We have come to conclusions about the GFFX IQ (Image Quality) that may be simply wrong.

If this info turns out to be true, the GeForce FX may use some variant of 3dfx's old T-buffer scheme, in which images from multiple color ("frame") buffers are combined as they are output to the RAMDAC. The T-buffer offered performance advantages over more conventional accumulation buffers back in the Voodoo 5's day, but it created problems with screenshots, because the final, combined image was never read back into a buffer that software could access.
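
To make the screenshot problem concrete, here's a rough sketch in Python (the buffer contents and blending are invented for illustration; this is not NVIDIA's or 3dfx's actual hardware path) of why an image that is only assembled at scan-out never appears in any single readable frame buffer:

import numpy as np

# Hypothetical T-buffer-style scheme: the scene is rendered into several
# jittered color buffers, and the anti-aliased result only exists when they
# are averaged at output time, on the way to the RAMDAC.

WIDTH, HEIGHT, SAMPLES = 4, 4, 4

# Stand-ins for the sub-sample color buffers, each a slightly different rendering.
rng = np.random.default_rng(0)
sample_buffers = [rng.integers(0, 256, (HEIGHT, WIDTH, 3), dtype=np.uint8)
                  for _ in range(SAMPLES)]

def scan_out(buffers):
    # Blend the sample buffers at output time; the result goes straight to
    # the display and is never written back to a readable buffer.
    return (np.mean([b.astype(np.float32) for b in buffers], axis=0)
            .round().astype(np.uint8))

def grab_screenshot(buffers):
    # A conventional screenshot reads back a single frame buffer (the first
    # sample here), so it captures the image *before* the output-stage blend.
    return buffers[0]

displayed = scan_out(sample_buffers)        # what the monitor actually shows
captured  = grab_screenshot(sample_buffers) # what a screenshot utility sees

# The two differ, which is why screenshots would understate the card's AA quality.
print("screenshot matches displayed image:", np.array_equal(displayed, captured))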

The fact that GeForce FX screenshots may not come out right makes it sound like the T-buffer is back. If that's the case, it also reveals a potential weakness: the final, anti-aliased images are apparently sent to the screen without ever making their way into a buffer that can be read back. Display-centric schemes like this one can create yet another problem for cinematic rendering on a VPU, since post-processing effects need to read the completed frame back as input. T-buffer-like mechanisms do have potential advantages in performance and image quality, but those advantages are probably largely negated by high-color pixel formats and a wide memory bus.

We'll be watching to see exactly what's going on here. For now, though, NVIDIA certainly seems to have put the "FX" in GeForce FX.
