Now that we've delved into the shaders a bit, we should take a step back and look at the bigger picture. The Xbox 360 GPU not only packs a lot of shader power; it's also the central hub of the Xbox 360 system, acting as the main memory controller in addition to handling graphics. The Xbox 360 has 512MB of GDDR3 memory onboard running at 700MHz, connected over a 128-bit interface to ATI's memory controller. The ATI GPU, in turn, has a very low latency path to the Xbox 360's three IBM CPU cores. This link has about 25GB/s of bandwidth. Feldstein said the graphics portion of the chip has something of a crossbar arrangement for getting to memory, but he didn't know whether the CPU uses a similar scheme.
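Those memory specs imply a peak system-memory bandwidth that's easy to check. Here's a quick back-of-the-envelope sketch using the article's figures; the doubled transfer rate is the standard property of GDDR3, which moves data on both clock edges:

```python
# Peak bandwidth of the Xbox 360's GDDR3 system-memory interface.
# GDDR3 is double-data-rate, so a 700MHz clock yields 1400 MT/s.
clock_hz = 700 * 1_000_000   # memory clock, from the article
transfers_per_clock = 2      # double data rate (GDDR3)
bus_width_bits = 128         # interface width, from the article

bytes_per_transfer = bus_width_bits // 8              # 16 bytes per transfer
bandwidth_gbs = clock_hz * transfers_per_clock * bytes_per_transfer / 1e9

print(f"{bandwidth_gbs:.1f} GB/s")  # 22.4 GB/s
```

That 22.4GB/s to system memory sits alongside the roughly 25GB/s CPU link Feldstein described; the two are separate paths through the GPU's memory controller.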
Embedded DRAM for "free" antialiasing
The GPU won't be using system memory itself quite as much as one might expect, because it packs 10MB of embedded DRAM right on the package. In fact, the Xbox 360 GPU is really a two-die design, with two chips in a single package on a single substrate. The parent die contains the GPU and memory controller, while the daughter die consists of the 10MB of eDRAM and some additional logic. There's a high-speed 2GHz link between the parent and daughter dies, and Feldstein noted that future revisions of the GPU might incorporate both dies on a single piece of silicon for cost savings.
The really fascinating thing here is the design of that daughter die. Feldstein called it a continuation of the traditional graphics pipeline into memory. Basically, there's a 10MB pool of embedded DRAM, designed by NEC, in the center of the die. Around the outside is a ring of logic designed by ATI. This logic is made up of 192 component processors capable of doing the basic math necessary for multisampled antialiasing. If I have it right, the component processors should be able to process 32 pixels at once by operating on six components per pixel: red, green, blue, alpha, stencil, and depth. This logic can do the resolve pass for multisample antialiasing right there on the eDRAM die, giving the Xbox 360 the ability to do 4X antialiasing on a high-definition (1280x768) image essentially for "free," i.e., with no appreciable performance penalty. The eDRAM holds the contents of all of the back buffers, does the resolve, and hands off the resulting image into main system memory for scan-out to the display.
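The component-processor count lines up with that per-pixel breakdown, and the resolved image handed back to system memory is small by comparison. A rough sanity check, using the article's 1280x768 figure; the 4-bytes-per-pixel color format is my assumption, not something Feldstein specified:

```python
# Check that 192 component processors matches 32 pixels x 6 components.
pixels_in_flight = 32
components_per_pixel = 6   # red, green, blue, alpha, stencil, depth
print(pixels_in_flight * components_per_pixel)  # 192

# Rough size of the resolved (post-antialiasing) image handed off to
# main memory, assuming 32-bit color per pixel (assumed, not from the article).
width, height = 1280, 768
bytes_per_pixel = 4
resolved_mb = width * height * bytes_per_pixel / 2**20
print(f"{resolved_mb:.2f} MB")  # 3.75 MB
```

The multisampled back buffers themselves are several times larger than that resolved image, which is exactly why keeping them in eDRAM and resolving on the daughter die saves so much traffic to system memory.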
Feldstein noted that this design is efficient from a power-savings standpoint, as well, because there's much less memory I/O required when antialiasing can be handled on the chip. He said ATI was very power-conscious in the design of the chip, so that the Xbox 360 could be a decent citizen in the living room.
My conversation with Bob Feldstein about the Xbox 360 GPU was quick but very dense with information. I hope that I've gotten everything right, but I expect we will learn more and sharpen up some of these details in the future. Nonetheless, ATI was very forthcoming about the technology inside its Xbox 360 GPU, and I have to say that it all sounds very promising.
For those of you wondering how the Xbox 360 GPU relates to ATI's upcoming PC graphics chips, I wish I could tell you, but I can't. Feldstein said the Xbox 360 GPU "doesn't relate" to a PC product. Some elements of the design seem impractical for PC use, like the 10MB of embedded DRAM for antialiasing; PCs don't use one single, standard resolution like HDTVs do. Still, it's hard to imagine ATI having some of this technology in its portfolio and not using it elsewhere at some point.