
The card: GeForce GTX Titan


You can tell at first glance that the Titan shares its DNA with the GeForce GTX 690. The two cards have the same sort of metal cooling shroud, with a window showing the heatsink fins beneath, and they share a silver-and-black color scheme. Nvidia seems to know one grand is a big ask for a video card, and it has delivered a card with the look and feel of a premium product.

|  | GPU base clock (MHz) | GPU boost clock (MHz) | Shader ALUs | Textures filtered/clock | ROP pixels/clock | Memory transfer rate | Memory interface width (bits) | Peak power draw |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GeForce GTX 680 | 1006 | 1058 | 1536 | 128 | 32 | 6 GT/s | 256 | 195W |
| GeForce GTX Titan | 837 | 876 | 2688 | 224 | 48 | 6 GT/s | 384 | 250W |
| GeForce GTX 690 | 915 | 1019 | 3072 | 256 | 64 | 6 GT/s | 2 x 256 | 300W |

One of our big questions about the Titan has been where the GK110 would land in terms of clock speeds and power consumption. Big chips like this can be difficult to tame. As you can see in the table above, Nvidia has elected to go with relatively conservative clock frequencies, with an 837MHz base clock and an 876MHz "Boost" clock, courtesy of its GPU Boost dynamic voltage and frequency scaling technology. That's a bit shy of the gigahertz-range speeds its siblings have reached. In the right situation, however, the Titan may run as fast as roughly 970-990MHz, thanks to a new iteration of GPU Boost. (More on this topic shortly.)
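If you'd like to translate those specs into theoretical peak rates, the arithmetic is simple enough. Here's a quick back-of-the-envelope sketch; the figures come straight from the table above, and the two-flops-per-ALU-per-clock convention and the use of the base clock follow the usual way peak throughput gets quoted, not anything Nvidia has stated specifically for this card:

```python
# Rough peak-rate math from the spec table above.
# FLOPS: each Kepler shader ALU can execute one fused multiply-add
# (two flops) per clock. Bandwidth: transfer rate times bus width.

def peak_gflops(alus, clock_mhz):
    return alus * 2 * clock_mhz / 1000.0    # two flops per ALU per clock

def peak_bandwidth_gbps(transfer_gtps, bus_width_bits):
    return transfer_gtps * bus_width_bits / 8.0    # bits -> bytes

# GeForce GTX Titan, using the base clock from the table
print(peak_gflops(2688, 837))        # ~4500 GFLOPS of single-precision math
print(peak_bandwidth_gbps(6, 384))   # 288 GB/s of memory bandwidth
```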


Those clock speeds contribute to the Titan's relatively tame 250W max power number. That's low enough to be a bit of a surprise for a card in this price range, but then the Kepler architecture has proven to be fairly power efficient. The card requires one eight-pin aux power connector and one six-pin, not dual eight-pins like the GTX 690. The Titan's power draw and connector payload fit well with its potential to be the building block of a multi-GPU setup involving two or three cards. Yes, the dual SLI connectors are there to allow for three-way SLI—you know, for the hedge fund manager who loves him some Borderlands.
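As a sanity check on that connector loadout, the standard PCI Express power allotments (75W from the slot, 75W from a six-pin plug, 150W from an eight-pin plug) leave some headroom over the card's 250W rating:

```python
# Nominal PCI Express power budget vs. the Titan's 250W rating.
slot, six_pin, eight_pin = 75, 75, 150   # watts, per the PCIe spec
budget = slot + six_pin + eight_pin
print(budget)        # 300W available
print(budget - 250)  # 50W of headroom over the card's rating
```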

Another aid to multi-GPU schemes, and to future-proofing in general, is the Titan's inclusion of a massive 6GB of GDDR5 memory. Nvidia had the choice of 3GB or 6GB to mate with that 384-bit memory interface, and it went large. Perhaps the extra RAM could help in certain configs—say, in a surround gaming setup involving a trio of four-megapixel displays. Otherwise, well, at least the extra memory can't hurt.
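The 3GB-or-6GB choice falls out of the bus width. Here's a brief sketch of the arithmetic, assuming 2Gb GDDR5 chips—our assumption, based on the densest parts widely available at the time, not something Nvidia has specified:

```python
# Why the choice was 3GB or 6GB: a 384-bit bus amounts to twelve
# 32-bit GDDR5 channels. Assuming 2Gb (256MB) chips, the natural
# options are one chip per channel or two (clamshell mode).
channels = 384 // 32            # 12 memory channels
chip_mb = 256                   # one 2Gb GDDR5 chip (assumed density)
print(channels * chip_mb)       # 3072 MB with one chip per channel
print(channels * chip_mb * 2)   # 6144 MB with two chips per channel
```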


The Titan measures 10.5" from stem to stern, putting it smack-dab in between the GTX 680 and 690. That makes it nearly half an inch shorter than a Radeon HD 7970 reference card.


Revealed: the blower and the heat sink fins atop the vapor chamber. Source: Nvidia


A bare Titan card. Source: Nvidia

Nvidia tells us Titan cards should be widely available starting next Monday, February 25th, although some may start showing up at places like Newegg over the weekend. Nvidia controls the manufacture of Titan cards and sells them to its partners, so we're not likely to see custom coolers or circuit boards on offer. In each region, only select Nvidia partners will have access to the Titan; for the U.S., those partners are Asus and EVGA.

Although those restrictions would seem to indicate the Titan will be a limited-volume product, Nvidia tells us it expects a plentiful supply of cards in the market. The firm points out that the GK110 has been in production for a while, so we shouldn't see the sort of supply issues that followed the GTX 680's introduction, when GK104 production was just ramping up. Of course, the Titan's $999.99 price tag may have something to do with the supply-demand equation at the end of the day, so we're probably not talking about massive sales volumes. Our hope is that folks who wish to buy a Titan won't find the cards out of stock everywhere. I guess time will tell.

Meanwhile, the Titan will coexist uneasily with the GeForce GTX 690 for now—you know, talking trash about micro-stuttering versus raw performance, throwing elbows when Jen-Hsun isn't looking, that sort of thing. The two cards are priced the same, sharing the title of "most expensive consumer graphics card" and catering to slightly different sets of preferences.