Confused? So are we. But we do have a GeForce 7950 GX2 in our grubby little hands, and it’s still a heckuva thing, quad SLI or no.
What you see below is BFG Tech’s version of the GeForce 7950 GX2.
Each of this SLI sandwich’s two printed circuit boards carries a G71 graphics processor, 512MB of memory, and a low-profile cooler. That G71 GPU is the same chip that powers the rest of the GeForce 7900 series, and in this application, it’s clocked at 500MHz. The memory chips run at 600MHz. That makes the GX2 roughly the equivalent of a pair of GeForce 7900 GT cards, but with slightly faster GPU clocks, slightly slower memory clocks, and double the RAM per GPU.
So, uh, yeah. Powerful.
This puppy is also revised quite a bit compared to the cards that shipped in early Quad SLI systems. At 9.5 inches, the GX2 is shorter than the earlier cards, and no longer than a Radeon X1900.
Protruding from the GX2’s expansion slot cover is a pair of dual-link DVI ports and a TV-out port. These unassuming ports include something new: full HDCP support.
I know: breathtaking. Hardware copy protection as a feature!
But you’ll need it to plug into the latest HDTVs, so here it is. The board has a crypto ROM on it that works in concert with the GPU and an HDCP-ready playback application to make the magic happen. Then, all you have to do is plop down on the couch, rest your peg leg, and watch that new Blu-ray title with your one good eye. (Our BFG Tech review unit, however, did not ship with an HDMI plug adapter.)
NVIDIA says you can expect to cough up roughly $599 to $649 worth of pirate booty in order to purchase a GeForce 7950 GX2, and like many of its recent product introductions, this one should be followed by near-immediate availability of cards at online retailers.
How it works
The concept of dual-GPU teaming in a single card seems simple enough, but doing it well requires some extra hardware, especially since the ultimate goal is scaling up gracefully to Quad SLI. In order to make things work right, the GeForce 7950 GX2 has a new helper chip onboard: a custom 48-lane PCI Express switch created by NVIDIA. The switch divvies up its PCI-E lanes into three groups of sixteen: one to each GPU and one to the rest of the system. This arrangement allows for high-bandwidth communication between either GPU and the rest of the system, as well as for fast data transfers between the GPUs.
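To put rough numbers on that arrangement, here’s a minimal back-of-the-envelope sketch in Python. The three-way x16 split comes from NVIDIA; the per-lane rate is my assumption of standard first-generation PCI Express signaling, roughly 250 MB/s per lane in each direction:

```python
# Rough bandwidth math for the GX2's 48-lane PCI Express switch.
# Assumes PCIe 1.x signaling at ~250 MB/s per lane, per direction.
MB_PER_LANE_PER_DIRECTION = 250

# NVIDIA's switch splits its 48 lanes into three x16 groups.
links = {"GPU 0": 16, "GPU 1": 16, "rest of system": 16}

for name, lanes in links.items():
    gb_per_s = lanes * MB_PER_LANE_PER_DIRECTION / 1000
    print(f"{name}: x{lanes} link, ~{gb_per_s:.0f} GB/s each way")
```

Each of the three links works out to about 4 GB/s per direction, so neither GPU has to share a pipe to the host.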
Most data transfers between the GPUs, however, should happen over the dedicated scalable link interconnect (SLI!) that bridges the GX2’s two PCBs. If you look down between the card’s two circuit boards, you can see the physical connection between the cards that carries both PCI-E and SLI data.
All of this plumbing results in a single-card graphics subsystem that looks something like this, logically:
Because this is SLI on a card, it has some limitations. You may see the 7950 GX2 advertised as a 1GB card, and it undeniably has that much video RAM onboard. Yet that RAM is segregated into two 512MB pools, one for each GPU. Yes, the GX2 has about twice the memory bandwidth of a normal card, but functionally, it has a 512MB memory space. Textures and other data must be uploaded to each GPU and stored in each GPU’s associated memory.
The GX2’s PCI Express switch presents another problem, simply because it’s unfamiliar. Having a couple of GPUs in a single slot behind a PCI-E switch can cause device enumeration problems, so most motherboards will require a BIOS update in order to work with the GX2. NVIDIA has put together a list of mobos and BIOS revisions that it’s confirmed will work with the GX2 and has plans to post the list on its web site. Many of the most popular mobos are on the list already, but not all of ’em, so you’ll definitely want to check before buying a GX2.
I should note, by the way, that running a GeForce 7950 GX2 does not require an NVIDIA SLI chipset or even an NVIDIA chipset at all. Intel-based mobos and the like are happily on the GX2’s compatibility list.
Another artifact of SLI that mars the GX2’s operation affects multi-monitor setups. As with a dual-card SLI setup, you’ve got to switch the GX2 manually between multi-GPU mode and multi-display mode. Here’s how the option looks in NVIDIA’s fancy-pants new driver control panel. (This control panel, incidentally, is focus-group tested, just like ATI’s Catalyst Control Center. I hate it. Once again, focus groups prove they know nothing about good interface design.)
If you’re in multi-display mode, one of the GX2’s GPUs will output to two different displays concurrently and drive them like any other card would. In order to harness all of the GX2’s 3D horsepower, though, you’ve got to switch into multi-GPU mode, at which point one of those two displays will go blank. Frustrating, but that’s life with SLI.
Beyond the annoyance of having to pop in and out of multi-monitor mode manually, the GX2 generally appears to work like any other video card, with the GPU teaming operating transparently. The drivers default to multi-GPU mode, and you won’t see any pop-up messages reminding you to enable SLI like you will with dual-card rigs. Of course, in the background, the usual SLI stuff is happening. If there’s no SLI profile for a game in NVIDIA’s drivers, the GX2 won’t be able to accelerate it with two GPUs automatically. As usual, though, the savvy user may create a custom profile for an application if needed.
The deal with Quad SLI
The GeForce 7950 GX2 was designed for use in Quad SLI configurations and is fully capable of working in them. You may have noticed that the GX2 has only one “golden fingers” SLI connector on it, not two like on early Quad SLI cards. The ring topology we discussed in our early look at Quad SLI has been modified for the GX2. Apparently, one of those two SLI links was superfluous. That makes sense, if you think about it, because only two images need to be combined, one from each GX2, in the final compositing stage of a Quad SLI rendering mode.
NVIDIA says each 7950 GX2 should pull a maximum of 142 watts in real-world operation, so a pair of ’em would require less than 300 W total (two cards at 142 W apiece works out to 284 W). That should fit fairly easily within the power envelope of today’s best PC power supplies.
Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there’s some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the “complexity” involved.
Yeah, that’s really what they said.
This “complexity” line is not just some off-the-cuff statement by one guy at NVIDIA, either; it’s practically corporate policy, repeated with consistency by several of the company’s representatives.
I’m not entirely sure what to make of this statement. As far as I can tell, Quad SLI requires a motherboard BIOS update, a fairly high-wattage PSU, sufficient case cooling, and a single SLI bridge connection. When explaining to your best customers why they can’t purchase two of your $649 video cards for themselves without also buying a $5K PC built by someone else, it’s probably not a good idea to use a shaky excuse with an embedded insult. Especially if it also subtly sends an unnerving message about the competency of your board partners’ customer support organizations. Yet this is what NVIDIA is saying.
To underscore its commitment to keeping the GX2 chaste, NVIDIA declined to send us a second GeForce 7950 GX2 for testing in a Quad SLI config, and its current GX2 drivers apparently don’t support Quad SLI mode, anyhow. The company says it expects to see DIYers hacking together Quad SLI systems using GX2s, but such setups won’t be officially supported. There does seem to be some hope for DIY Quad SLI in the future, but NVIDIA hasn’t committed to any timetable for enabling this feature for those of us who didn’t pay Voodoo PC’s hefty premiums.
We only received drivers for the GX2 in the middle of last week, so our testing time with the card has been extremely limited. As a result, we’ve restricted our testing to a small set of competing cards and to a single resolution and quality level per game. This is, as you may know, not our usual practice, but it will have to suffice for now. In order to tease out real differences between these products, we chose game settings and display resolutions intended to push the limits of the fastest cards we’re testing. The differences between the cards might not be so great at lower resolutions, and they might be even greater at higher ones. With luck, though, our chosen settings will present a reasonably good picture of how these products compare to one another.
Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.
Our test systems were configured like so:
|Processor||Athlon 64 X2 4800+ 2.4GHz|
|System bus||1GHz HyperTransport|
|Motherboard||Asus A8N32-SLI Deluxe||Asus A8R32-MVP Deluxe|
|North bridge||nForce4 SLI X16||Radeon Xpress 3200|
|South bridge||nForce4 SLI||ULi M1575|
|Chipset drivers||ForceWare 6.85||ULi Integrated 2.20|
|Memory size||2GB (2 DIMMs)||2GB (2 DIMMs)|
|Memory type||Corsair CMX1024-4400 Pro DDR SDRAM at 400MHz||Corsair CMX1024-4400 Pro DDR SDRAM at 400MHz|
|CAS latency (CL)||2.5||2.5|
|RAS to CAS delay (tRCD)||3||3|
|RAS precharge (tRP)||3||3|
|Cycle time (tRAS)||8||8|
|Hard drive||Maxtor DiamondMax 10 250GB SATA 150||Maxtor DiamondMax 10 250GB SATA 150|
|Audio||Integrated nForce4/ALC850 with Realtek drivers||Integrated M1575/ALC880 with Realtek 5.10.00.5247 drivers|
|Graphics||GeForce 7900 GT 256MB PCI-E with ForceWare 91.29 drivers||Radeon X1800 XT 512MB with Catalyst 6.5 drivers|
|GeForce 7900 GTX 512MB PCI-E with ForceWare 91.29 drivers||Radeon X1900 XTX 512MB with Catalyst 6.5 drivers|
|GeForce 7950 GX2 1GB PCI-E with ForceWare 91.29 drivers|
|OS||Windows XP Professional (32-bit)|
|OS updates||Service Pack 2, DirectX 9.0c update (April 2006)|
Thanks to Corsair for providing us with memory for our testing. Although these particular modules are rated for CAS 3 at 400MHz, they ran perfectly for us with 2.5-3-3-8 timings at 2.85V.
Our test systems were powered by OCZ GameXStream 700W power supply units. Thanks to OCZ for providing these units for our use in testing.
Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults.
The test systems’ Windows desktops were set at 1280×1024 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
We used the following versions of our test applications:
- Quake 4 1.2 with trq4demo5 demo
- Battlefield 2 1.3
- The Elder Scrolls IV: Oblivion 1.1 beta
- FEAR 1.05
- Half-Life 2: Lost Coast with trcoast2 demo
- Futuremark 3DMark06
- FRAPS 2.7.2
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
The GeForce 7950 GX2 is really just SLI on a single card, but with a little fudging, it’s possible to express the GX2’s peak fill rate potential and memory bandwidth alongside the single-GPU cards out there.
|Card||Core clock (MHz)||Pixels/clock||Peak fill rate (Mpixels/s)||Textures/clock||Peak fill rate (Mtexels/s)||Memory clock (MHz)||Memory bus width (bits)||Peak memory bandwidth (GB/s)|
|Radeon X1600 XT||590||4||2360||4||2360||1380||128||22.1|
|GeForce 6600 GT||500||4||2000||8||4000||1000||128||16.0|
|GeForce 6800 GS||425||8||3400||12||5100||1000||256||32.0|
|GeForce 6800 GT||350||16||5600||16||5600||1000||256||32.0|
|Radeon X800 XL||400||16||6400||16||6400||980||256||31.4|
|Radeon X1800 GTO||500||12||6000||12||6000||1000||256||32.0|
|GeForce 7600 GT||560||8||4480||12||6720||1400||128||22.4|
|GeForce 6800 Ultra||425||16||6800||16||6800||1100||256||35.2|
|GeForce 7800 GT||400||16||6400||20||8000||1000||256||32.0|
|Radeon X1800 XL||500||16||8000||16||8000||1000||256||32.0|
|Radeon X850 XT||520||16||8320||16||8320||1120||256||35.8|
|Radeon X850 XT PE||540||16||8640||16||8640||1180||256||37.8|
|XFX GeForce 7800 GT||450||16||7200||20||9000||1050||256||33.6|
|Radeon X1800 XT||625||16||10000||16||10000||1500||256||48.0|
|Radeon X1900 XT||625||16||10000||16||10000||1450||256||46.4|
|GeForce 7800 GTX||430||16||6880||24||10320||1200||256||38.4|
|Radeon X1900 XTX||650||16||10400||16||10400||1550||256||49.6|
|GeForce 7900 GT||450||16||7200||24||10800||1320||256||42.2|
|GeForce 7800 GTX 512||550||16||8800||24||13200||1700||256||54.4|
|GeForce 7900 GTX||650||16||10400||24||15600||1600||256||51.2|
|GeForce 7950 GX2||2 * 500||32||16000||48||24000||1200||2 * 256||76.8|
…and this little experiment shows us what a monster the GX2 really is. All told, this thing has over twice the peak multitextured fill rate of a Radeon X1900 XT and a whopping 76.8 GB/s of memory bandwidth.
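The table’s figures fall out of simple multiplication, if you care to check them. Here’s a quick Python sketch that reproduces the GX2’s row from its per-GPU clocks and unit counts (all inputs come from the table above; the variable names are mine):

```python
# Reproduce the GX2's row in the table above from per-GPU specs.
core_mhz = 500         # per-GPU core clock
pixels_per_clk = 16    # pixel output per clock, per GPU (G71)
textures_per_clk = 24  # textures per clock, per GPU (G71)
mem_mhz = 1200         # effective (DDR) memory clock
bus_bits = 256         # memory bus width per GPU
gpus = 2               # two G71s on one card

pixel_fill = core_mhz * pixels_per_clk * gpus       # Mpixels/s
texel_fill = core_mhz * textures_per_clk * gpus     # Mtexels/s
bandwidth = mem_mhz * (bus_bits / 8) * gpus / 1000  # GB/s

print(pixel_fill, "Mpixels/s")      # 16000
print(texel_fill, "Mtexels/s")      # 24000
print(round(bandwidth, 1), "GB/s")  # 76.8
```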
Feed the thing a synthetic fill rate benchmark, and it proves those numbers are for real:
No other “single” card comes close. Of course, taking advantage of this power in the real world could prove tricky. Let’s move on to some games and see what happens.
We tested Oblivion by manually playing through a specific point in the game five times while recording frame rates using the FRAPS utility. Each gameplay sequence lasted 60 seconds. This method has the advantage of simulating real gameplay quite closely, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered.
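For the curious, here’s what that boils down to in code, a minimal Python sketch of the statistics we report; the per-run numbers below are placeholders for illustration, not our actual results:

```python
import statistics

# One average FPS and one lowest FPS per 60-second gameplay run
# (five runs per card; these values are made-up placeholders).
avg_fps_per_run = [41.2, 39.8, 42.5, 40.1, 41.9]
low_fps_per_run = [22.0, 19.5, 24.0, 21.0, 20.5]

# Reported average: the mean of the five per-run averages.
avg_fps = statistics.mean(avg_fps_per_run)

# Reported low: the median of the five per-run lows, which
# blunts the effect of a single outlier run.
low_fps = statistics.median(low_fps_per_run)

print(f"average: {avg_fps:.1f} FPS, median low: {low_fps:.1f} FPS")
```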
We set Oblivion’s graphical quality settings to “High.” The screen resolution was set to 1600×1200 resolution, with HDR lighting enabled. 16X anisotropic filtering was forced on via the cards’ driver control panel.
In order to make sure we pushed the video cards as hard as possible, we enabled Quake 4’s multiprocessor support before testing.
We’ve used FRAPS to play through a sequence in F.E.A.R. in the past, but this time around, we’re using the game’s built-in “test settings” benchmark for a quick, repeatable comparison.
All in all, the GX2 looks very potent compared to the single-GPU cards.
Half-Life 2: Lost Coast
This expansion level for Half-Life 2 makes use of high-dynamic-range lighting and some nice pixel shader effects to create an impressive-looking waterfront. We tested with HDR lighting enabled on all cards.
We test BF2 using FRAPS and manual gameplay, much like we did with Oblivion.
Pretty impressive. In Battlefield 2, the GX2 actually achieves twice the average frame rate of the GeForce 7900 GT, and more than twice its median low frame rate.
The GX2 keeps it going in 3DMark06, totally outclassing the rest of the field.
We measured total system power consumption at the wall socket using a watt meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. To keep things even, we did our power consumption testing for all cards using the Asus A8R32-MVP Deluxe motherboard.
The idle measurements were taken at the Windows desktop with AMD’s Cool’n’Quiet CPU clock throttling function disabled. The cards were tested under load running Oblivion using the game’s High Quality setting at 1600×1200 resolution with 16X anisotropic filtering.
Unsurprisingly, the GX2 consumes more power when sitting at the Windows desktop than any of the single-GPU cards. The shocking thing is the power use under load. The Radeon X1900 XT-based system draws 27 watts more power at the outlet than the otherwise-identical GX2-based rig. Craziness.
OK, maybe it’s not so crazy. We’ve seen similar achievements with dual-core CPUs, after all. And the GX2 certainly acts the part; its cooler seems a little quieter under load than the Radeon X1900 XTX’s. The GX2 isn’t as whisper quiet as the GeForce 7900 GTX, but it’s pretty darned good, considering.
Somehow, I didn’t really expect a “single” GeForce 7950 GX2 card to be a compelling product outside of a Quad SLI configuration. This puppy does have its warts, including the need for mobo BIOS updates and the SLI-like limitations for multi-monitor use that may turn some power users away. Still, the GX2 is remarkably tame overall. This card takes up no more space, draws no more power, and generates no more heat or noise than a Radeon X1900 XTX, but its performance is in another class altogether. The multi-GPU mojo generally happens transparently, too, thanks to a healthy collection of SLI profiles already established in NVIDIA’s drivers. Putting two GPUs on a card has allowed NVIDIA to overcome the limitations of its present GPU designs and of current fab process tech to achieve new performance heights in a single PCI Express slot.
Of course, such things have been possible for quite some time in the form of a two-slot SLI or CrossFire solution, but the GX2 still has much to recommend it. Doubling up on GeForce 7900 GTs would get you an SLI setup in the same price range, but with lower (stock) GPU clock speeds and only 256MB of memory per GPU. And the GX2 works in any chipset, which may prove to be a real boon if you fancy one of Intel’s Conroe processors, an Intel chipset, and uber-fast graphics. I could see that combination becoming very popular this summer, if things shake out as expected.
All that’s left now is for NVIDIA to enable, and support, Quad SLI in consumer-built systems. Let’s hope NVIDIA comes to its senses on that one sooner rather than later.