The chip: Large and in charge
When I say the GT200 has a whole lot of everything, that naturally includes transistors: roughly 1.4 billion of them, more than double the 681 million in the G80. Ever faithful to its dictum of never risking a new fab process and a substantially new design at the same time, Nvidia has stuck with 65nm manufacturing technology, and in fact, it says the GT200 is the largest chip TSMC has ever fabricated.

Mounted on a board and covered with a protective metal cap, the GT200 looks like so:


Holy Moses.

When I asked Nvidia's Tony Tamasi about the GT200's die size, he wouldn't get too specific, preferring only to peg it between 500 and 600 mm². Given that, I think the reports that GT200's die is 576 mm² are credible. Whatever the case, this chip is big—like "after the first wafer was fabbed, the tide came in two minutes late" big. It's the Kim Kardashian's butt of the GPU world.
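
Just for fun, here's a bit of back-of-the-envelope math on what a die that size means at the fab. The sketch below uses the standard dies-per-wafer approximation on a 300 mm wafer; the 576 mm² figure comes from the reports above, and the ~484 mm² G80 number is the commonly cited figure, not anything Nvidia confirmed for this article.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic dies-per-wafer approximation: usable wafer area
    divided by die area, minus a correction for edge losses."""
    gross = math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

print(dies_per_wafer(576))  # GT200 at the reported 576 mm^2: ~94 candidate dies
print(dies_per_wafer(484))  # G80 at its commonly cited ~484 mm^2: ~115
```

Fewer than a hundred candidate dies per wafer, before a single defect is accounted for. Which brings us to the next point.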

To give you some additional perspective on its size, here's a to-scale comparison Nvidia provided between a GT200 GPU and an Intel "Penryn" 45nm dual-core CPU.


Source: Nvidia.

Such a large chip can't be inexpensive to manufacture, since the odds of a yield-killing defect landing on any given die rise with its area, and yields fall off steeply as a result. Nvidia almost seems to revel in having such a big chip, though, and it does have experience in this realm. Its last large chip, the G80, certainly seems to have worked out pretty well. Perhaps the company isn't crazy to do this.
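
To put a number on that, here's a minimal sketch using the simple Poisson yield model, in which the fraction of defect-free dies is Y = e^(-A·D0) for die area A and defect density D0. The D0 value below is purely illustrative, and the ~107 mm² figure is the commonly reported die size for Intel's 45nm dual-core Penryn, not a number from Nvidia.

```python
import math

def poisson_yield(die_area_mm2, defect_density_per_mm2):
    """Simple Poisson yield model: the fraction of dies that
    escape with zero defects, Y = exp(-area * defect_density)."""
    return math.exp(-die_area_mm2 * defect_density_per_mm2)

D0 = 0.002  # defects per mm^2 -- purely illustrative
print(f"GT200 (576 mm^2):  {poisson_yield(576, D0):.0%}")   # ~32%
print(f"Penryn (107 mm^2): {poisson_yield(107, D0):.0%}")   # ~81%
```

Even if TSMC's real defect density is far better than my made-up number, the shape of the curve is the point: at any given defect density, a 576 mm² die throws away a much larger share of each wafer than a CPU-sized one does.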

If you're curious about what's where on the die, have a look at the helpfully colored diagram below.


The GT200's basic layout. Source: Nvidia.

Tamasi noted that the shader cores look very regular and recognizable because they are made up of custom logic, like on a CPU, rather than being the product of automated logic synthesis. He also pointed out that, if you count 'em, the GT200 has exactly the number of on-chip shader structures you'd expect, with 10 TPCs easily visible and no extras thrown in to help increase yields. Of course, nothing precludes Nvidia from selling a GT200-based product with fewer than 10 TPCs enabled, either.

You may be wondering, with a chip this large, about power consumption. Will the lights flicker when I fire up Call of Duty 4? The chip's max thermal design power, or TDP, is 236W, which is considerable. However, Nvidia claims idle power draw for the GT200 of only 25W, down from 64W in the G80, and it even says the GT200's idle draw is similar to that of AMD's righteously frugal RV670 GPU. We shall see about that.

So how did Nvidia accomplish such a thing? GeForce GPUs already have multiple clock domains, as evidenced by the fact that the GPU core and shader clock speeds diverge. Tamasi said Nvidia implemented dynamic power and frequency scaling throughout the chip, with multiple units able to scale independently. He characterized the G80 as an "on or off" affair, whereas the GT200's power use scales more linearly with demand. Even in a 3D game or application, he hinted, the GT200 might use much less power than its TDP maximum. Much like a CPU, the GT200 has multiple power states with algorithmic determination of the proper state, and those P-states include a new, presumably relatively low-power state for video decoding and playback.

On top of that, GT200-based cards will be compatible with Nvidia's HybridPower scheme, so they can be deactivated entirely in favor of a chipset-based GPU when they're not needed.
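
For a concrete sense of what "algorithmic determination of the proper state" might look like, here's a minimal sketch of a utilization-driven P-state governor in Python. Everything here is invented for illustration: the state table, clocks, wattages, thresholds, and hysteresis are my assumptions, not anything Nvidia has disclosed about the GT200's actual logic.

```python
# Hypothetical P-state table: (core MHz, approximate board draw in watts).
# These values are illustrative, not the GT200's real states.
P_STATES = {
    "idle":     (300,  25),
    "video":    (400,  35),
    "3d-light": (600, 120),
    "3d-full": (1296, 236),
}

def pick_state(utilization, current, video_playback=False):
    """Pick a P-state from recent GPU utilization (0.0 to 1.0).

    A dedicated low-power state handles video decode, and a small
    hysteresis band keeps the governor from flapping between states.
    """
    if video_playback and utilization < 0.5:
        return "video"
    if utilization > 0.85:
        return "3d-full"
    if utilization > 0.40:
        return "3d-light"
    # Hysteresis: don't drop out of a 3D state until load is clearly gone.
    if current in ("3d-light", "3d-full") and utilization > 0.25:
        return current
    return "idle"

print(pick_state(0.30, current="3d-full"))  # hysteresis holds "3d-full"
print(pick_state(0.30, current="idle"))     # from idle, it stays "idle"
```

The hysteresis band is the interesting bit: without one, a governor like this would flap between states every time the load hovered near a threshold, which is exactly the sort of behavior you don't want driving voltage and clock transitions.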

As you may have noticed in the photograph above, the GT200 brings back an innovation from the G80 that we hadn't really expected to see again: a separate, companion display chip. This chip is similar in function to the one on the G80 but is a new part with additional capabilities, including support for 10-bit-per-color-channel scan-out. GT200 cards will feature a pair of dual-link DVI outputs with HDCP over both links (for HD movie playback at very high resolutions), and Nvidia claims they will support HDMI via a DVI-to-HDMI adapter, although our sample of a production card from XFX didn't include such an adapter. GT200 boards can also support DisplayPort, but they'll require a custom card design from the vendor, since Nvidia's reference design doesn't include a DisplayPort, er, display port. (Seriously, WTH?)

Incidentally, if you're going to be playing HD movies back over one of those fancy connections, you'll be pleased to learn that Nvidia has extended the PureVideo logic in the GT200 to decode VC-1 and WMV9 video, in addition to H.264.