How it works
The concept of dual-GPU teaming in a single card seems simple enough, but doing it well requires some extra hardware, especially since the ultimate goal is scaling up gracefully to Quad SLI. In order to make things work right, the GeForce 7950 GX2 has a new helper chip onboard: a custom 48-lane PCI Express switch created by NVIDIA. The switch divvies up its PCI-E lanes into three groups of sixteen: one to each GPU and one to the rest of the system. This arrangement allows for high-bandwidth communication between either GPU and the rest of the system, as well as for fast data transfers between the GPUs.
Most data transfers between the GPUs, however, should happen over the dedicated scalable link interconnect (SLI!) that bridges the GX2's two PCBs. If you look down between the card's two circuit boards, you can see the physical connection between the cards that carries both PCI-E and SLI data.
All of this plumbing results in a single-card graphics subsystem that looks something like this, logically:
Because this is SLI on a card, it has some limitations. You may see the 7950 GX2 advertised as a 1GB card, and it undeniably has that much video RAM onboard. Yet that RAM is segregated into two 512MB pools, one for each GPU. Yes, the GX2 has about twice the memory bandwidth of a normal card, but functionally, it has a 512MB memory space. Textures and other data must be uploaded to each GPU and stored in each GPU's associated memory.
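The mirrored-memory arithmetic is worth spelling out. Here's a quick back-of-the-envelope sketch using the figures from the review; the helper names are purely illustrative, not anything from NVIDIA's tooling:

```python
# Sketch of the GX2's mirrored-memory arrangement.
# Figures come from the review; function names are hypothetical.

def effective_memory_mb(pools_mb):
    """Textures and other data are duplicated into each GPU's pool,
    so usable capacity is one pool's worth, not the sum."""
    return min(pools_mb)

def physical_memory_mb(pools_mb):
    """What the spec sheet (and the marketing) counts."""
    return sum(pools_mb)

pools = [512, 512]  # two 512MB pools, one per GPU

print(physical_memory_mb(pools))   # 1024 -- the advertised "1GB"
print(effective_memory_mb(pools))  # 512  -- what applications can use
```

Bandwidth, by contrast, does roughly double, since each GPU streams from its own pool in parallel; it's only the capacity that collapses to a single pool.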
The GX2's PCI Express switch presents another problem, simply because it's unfamiliar. Having a couple of GPUs in a single slot behind a PCI-E switch can cause device enumeration problems, so most motherboards will require a BIOS update in order to work with the GX2. NVIDIA has put together a list of mobos and BIOS revisions that it's confirmed will work with the GX2 and has plans to post the list on its web site. Many of the most popular mobos are on the list already, but not all of 'em, so you'll definitely want to check before buying a GX2.
I should note, by the way, that running a GeForce 7950 GX2 does not require an NVIDIA SLI chipset or even an NVIDIA chipset at all. Intel-based mobos and the like are happily on the GX2's compatibility list.
Another artifact of SLI that mars the GX2's operation affects multi-monitor setups. As with a dual-card SLI setup, you've got to switch the GX2 manually between multi-GPU mode and multi-display mode. Here's how the option looks in NVIDIA's fancy-pants new driver control panel. (This control panel, incidentally, is focus-group tested, just like ATI's Catalyst Control Center. I hate it. Once again, focus groups prove they know nothing about good interface design.)
If you're in multi-display mode, one of the GX2's GPUs will output to two different displays concurrently and drive them like any other card would. In order to harness all of the GX2's 3D horsepower, though, you've got to switch into multi-GPU mode, at which point one of those two displays will go blank. Frustrating, but that's life with SLI.
Beyond the annoyance of having to pop in and out of multi-monitor mode manually, the GX2 generally appears to work like any other video card, with the GPU teaming operating transparently. The drivers default to multi-GPU mode, and you won't see any pop-up messages reminding you to enable SLI like you will with dual-card rigs. Of course, in the background, the usual SLI stuff is happening. If there's no SLI profile for a game in NVIDIA's drivers, the GX2 won't be able to accelerate it with two GPUs automatically. As usual, though, the savvy user may create a custom profile for an application if needed.
The deal with Quad SLI
The GeForce 7950 GX2 was designed for use in Quad SLI configurations and is fully capable of working in them. You may have noticed that the GX2 has only one "golden fingers" SLI connector on it, not two like on early Quad SLI cards. The ring topology we discussed in our early look at Quad SLI has been modified for the GX2. Apparently, one of those two SLI links was superfluous. That makes sense, if you think about it, because only two images need to be combined, one from each GX2, in the final compositing stage of a Quad SLI rendering mode.
NVIDIA says each 7950 GX2 should pull a maximum of 142 W in real-world operation, so a pair of 'em would require less than 300 W total. That should fit fairly easily within the power envelope of today's best PC power supplies.
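The power-budget math checks out with a one-liner (the 300 W figure is the ballpark from the paragraph above, not an official spec):

```python
# Quad SLI power-budget sanity check, using NVIDIA's stated figure.
per_card_watts = 142            # NVIDIA's real-world maximum per GX2
pair_watts = 2 * per_card_watts

print(pair_watts)               # 284 W for the pair, under 300 W
```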
Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there's some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the "complexity" involved.
Yeah, that's really what they said.
This "complexity" line is not just some off-the-cuff statement by one guy at NVIDIA, either; it's practically corporate policy, repeated with consistency by several of the company's representatives.
I'm not entirely sure what to make of this statement. As far as I can tell, Quad SLI requires a motherboard BIOS update, a fairly high-wattage PSU, sufficient case cooling, and a single SLI bridge connection. When explaining to your best customers why they can't purchase two of your $649 video cards for themselves without also buying a $5K PC built by someone else, it's probably not a good idea to use a shaky excuse with an embedded insult. Especially if it also subtly sends an unnerving message about the competency of your board partners' customer support organizations. Yet this is what NVIDIA is saying.
To underscore its commitment to keeping the GX2 chaste, NVIDIA declined to send us a second GeForce 7950 GX2 for testing in a Quad SLI config, and its current GX2 drivers apparently don't support Quad SLI mode, anyhow. The company says it expects to see DIYers hacking together Quad SLI systems using GX2s, but such setups won't be officially supported. There does seem to be some hope for DIY Quad SLI in the future, but NVIDIA hasn't committed to any timetable for enabling this feature for those of us who didn't pay Voodoo PC's hefty premiums.