I said just last week that GPUs are breeding like rabbits, and here we have another example of multi-GPU multiplication. The brand-spanking-new GeForce 9800 GX2 combines a pair of G92 graphics processors onto one card for twice the goodness, like an incredibly geeky version of an old Doublemint gum commercial.
Cramming a pair of GPUs into a single graphics card has a long and familiar pedigree, but the most recent example of such madness is AMD's Radeon HD 3870 X2, which stole away Nvidia's single-card performance crown by harnessing a duo of mid-range Radeon GPUs. The folks on the green team tend to take the heavyweight performance crown rather seriously, and the GeForce 9800 GX2 looks primed to recapture the title. We already know the G92 GPU is faster than any single graphics processor AMD has to offer. What happens when you double up on them via SLI-on-a-card? Let's have a look.
Please welcome the deuce
Dressed all in black, the GeForce 9800 GX2 looks like it means business. That's probably because it does. This beast packs two full-on G92 graphics processors, each running at 600MHz with a 1500MHz shader clock governing its 128 stream processors. Each GPU has its own 512MB pool of GDDR3 memory running at 1GHz (with a 2GHz effective data rate) on a 256-bit bus. For those of you in Rio Linda, that adds up to 1GB of total graphics memory and a whole lotta bandwidth. However, as in any SLI setup, memory isn't shared between the two GPUs, so the effective memory size of the graphics subsystem is 512MB.
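For the curious, that "whole lotta bandwidth" is easy to put a number on. A quick back-of-the-envelope sketch using the figures above (256-bit bus, 2GHz effective data rate per GPU):

```python
# Rough memory bandwidth math for each G92 on the 9800 GX2,
# using the specs quoted above: 256-bit bus, 1GHz GDDR3 (2GHz effective).
bus_width_bits = 256
effective_rate_ghz = 2.0  # DDR memory transfers data twice per clock

per_gpu_gbps = bus_width_bits / 8 * effective_rate_ghz  # bytes/transfer * GT/s
print(per_gpu_gbps)      # 64.0 GB/s per GPU
print(per_gpu_gbps * 2)  # 128.0 GB/s combined across both GPUs
```

Of course, as with the memory pools themselves, that aggregate figure isn't shared; each GPU only ever sees its own 64GB/s.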
The G92 GPU may be familiar to you as the engine behind the incredibly popular GeForce 8800 GT, and you may therefore be tempted to size up the GX2 as the equivalent of two 8800 GT cards in SLI. But that would be selling the GX2 short, since one of the G92 chip's stream processor (SP) clusters is disabled on the 8800 GT, reducing its shader and texture filtering power. Instead, the GX2 is closer to a pair of GeForce 8800 GTS 512 cards with their GPUs clocked slightly slower.
Translation: this thing should be bitchin' fast.
The Johnny-Cash-meets-Darth-Vader color scheme certainly works well for it. (And before you even ask, let's settle this right away: head to head, Cash would defeat Vader, ten times out of ten. Thanks for playing.) Such color schemes tend to go well with quirky personalities, and the GX2 isn't without its own quirks.
Among them, as you can see, is the fact that its two DVI ports have opposite orientations, which may lead to some confusion as you fumble around behind a PC trying to connect your monitor(s). Not only that, but only port number 1 is capable of displaying pre-boot output like BIOS menus, DOS utilities, or the like. Nvidia calls this port "bootable." The second port will drive a display only once you have video drivers installed and are booted into a proper OS.
To the left of the DVI and HDMI ports in the picture above are a pair of LED indicators to further confuse and astound you. The lower blinkenlight turns green to indicate that all of the necessary power leads are connected to the GX2, while the upper one lights up blue to indicate which of the two GX2 cards in (ahem) a quad SLI setup owns the primary display port.
As you can see in the picture above, a black plastic shroud envelops the entire GX2, as if it were a Steve Jobs-style black turtleneck. The GX2's full-coverage shroud furthers its image as a self-contained graphics powerhouse and conceals its true, dual nature, as we'll soon find out.
A few holes in the shroud do expose key connectors, though. This card requires both a six-pin aux PCIe power connector and an eight-pin one. Take note: plugging a six-pin connector into that eight-pin port isn't sufficient, as it is for some Radeon cards. The GX2 requires a true eight-pin power lead. Unfortunately, space around this eight-pin plug is tight. Getting our PSU's connector into the port took a little extra effort, and extracting it again took lots of extra effort. Nvidia claims the problem is that some PSUs don't comply with the PCIe spec, but that's little comfort. Cutting a slightly larger hole in the shroud would have prevented quite a few headaches.
Extra exposure below the shroud, though, doesn't seem to be part of the program. For instance, just to the left of the six-pin power plug is an audio S/PDIF input, needed to feed audio to the GX2's HDMI output port. This port was concealed by a rubber stopper on this XFX card out of the box.
The GX2's SLI connector lurks under a plastic cover, as well, semi-ominously suggesting the potential for quad SLI. Those who remember the disappointing performance scaling of Nvidia's previous quad SLI solution, based on the GeForce 7950 GX2, will be relieved to hear that the 9800 GX2 should be free from the three-buffer limit that the combination of Windows XP and DirectX 9 imposed back then. Nvidia says it intends to deliver four-way alternate frame rendering (AFR), a la AMD's CrossFire X. That should allow for superior performance scaling with four GPUs, provided nothing else gets in the way.
Price, certainly, should be no object for the typical quad SLI buyer. XFX has priced its 9800 GX2 at $599 on Newegg, complete with a copy of Company of Heroes. That puts the GX2 nearly a hundred bucks above the price of a couple of GeForce 8800 GTS 512MB cards. In this case, you're paying a premium for the GX2's obvious virtues, including its single PCIe x16 connection, dual-slot profile, quad SLI potential, and the happy possibility of getting SLI-class performance on a non-Nvidia chipset.