NVIDIA’s GeForce 7950 GX2 graphics card

NVIDIA HAS BEEN TALKING publicly about Quad SLI for half a year now, and Quad SLI configs have been shipping for a few months in a select number of ultra-high-end PCs. Today, at long last, NVIDIA is unveiling a consumer version of its Quad SLI component card, the GeForce 7950 GX2. A single GX2 plugs into one PCI Express slot, but it actually has a pair of printed circuit boards, two GPUs, and two sets of memory chips onboard. By itself, the GeForce 7950 GX2 is an SLI setup on a stick, a dual-GPU powerhouse that fits into the same space as any other high-end graphics card with a dual-slot cooler. Slide two of these puppies into a system side by side, and you have the potential for Quad SLI—but not the reality, apparently, if you’re just a lowly DIYer.

Confused? So are we. But we do have a GeForce 7950 GX2 in our grubby little hands, and it’s still a heckuva thing, quad SLI or no.

The card
What you see below is BFG Tech’s version of the GeForce 7950 GX2.

Each of this SLI sandwich’s two printed circuit boards carries a G71 graphics processor, 512MB of memory, and a low-profile cooler. That G71 GPU is the same chip that powers the rest of the GeForce 7900 series, and in this application, it’s clocked at 500MHz. The memory chips run at 600MHz. That makes the GX2 roughly the equivalent of a pair of GeForce 7900 GT cards—but with slightly faster GPU clocks, slightly slower memory clocks, and double the RAM per GPU.

So, uh, yeah. Powerful.

This puppy is also revised quite a bit compared to the cards that shipped in early Quad SLI systems. At 9.5 inches, the GX2 is shorter than the earlier cards—and no longer than a Radeon X1900.

Protruding from the GX2’s expansion slot cover is a pair of dual-link DVI ports and a TV-out port. These unassuming ports include something new: full HDCP support.

I know, breathtaking, hardware copy protection as a feature!

But you’ll need it to plug into the latest HDTVs, so here it is. The board has a crypto ROM on it that works in concert with the GPU and an HDCP-ready playback application to make the magic happen. Then, all you have to do is plop down on the couch, rest your peg leg, and watch that new Blu-ray title with your one good eye. (Our BFG Tech review unit, however, did not ship with an HDMI plug adapter.)

NVIDIA says you can expect to cough up roughly $599 to $649 worth of pirate booty in order to purchase a GeForce 7950 GX2, and like many of its recent product introductions, this one should be followed by near-immediate availability of cards at online retailers.

 

How it works
The concept of dual-GPU teaming in a single card seems simple enough, but doing it well requires some extra hardware, especially since the ultimate goal is scaling up gracefully to Quad SLI. In order to make things work right, the GeForce 7950 GX2 has a new helper chip onboard: a custom 48-lane PCI Express switch created by NVIDIA. The switch divvies up its PCI-E lanes into three groups of sixteen—one to each GPU and one to the rest of the system. This arrangement allows for high-bandwidth communication between either GPU and the rest of the system, as well as for fast data transfers between the GPUs.

Most data transfers between the GPUs, however, should happen over the dedicated scalable link interconnect (SLI!) that bridges the GX2’s two PCBs. If you look down between the card’s two circuit boards, you can see the physical connection between the cards that carries both PCI-E and SLI data.


The GX2’s internal board-to-board link is nestled between PCBs

All of this plumbing results in a single-card graphics subsystem that looks something like this, logically:


A block diagram of the GeForce 7950 GX2’s logical layout. Source: NVIDIA.
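
For the code-minded, here is a rough rendition of that diagram as a data structure. This is my own sketch of the logical layout as described above, not anything out of NVIDIA's documentation:

```python
# A rough text rendition of the block diagram above (an illustrative sketch,
# not NVIDIA's documentation): a 48-lane PCI-E switch carved into three x16
# links, plus the dedicated SLI bridge between the two GPUs.
gx2_topology = {
    "pcie_switch": {
        "total_lanes": 48,
        "links": {
            "upstream_to_host": 16,   # to the motherboard's PCI-E x16 slot
            "gpu0": 16,
            "gpu1": 16,
        },
    },
    "sli_bridge": ("gpu0", "gpu1"),   # board-to-board link for compositing data
}
```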

Because this is SLI on a card, it has some limitations. You may see the 7950 GX2 advertised as a 1GB card, and it undeniably has that much video RAM onboard. Yet that RAM is segregated into two 512MB pools, one for each GPU. Yes, the GX2 has about twice the memory bandwidth of a normal card, but functionally, it has a 512MB memory space. Textures and other data must be uploaded to each GPU and stored in each GPU’s associated memory.
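
To make the point concrete, here is a toy sketch (my own illustration, not actual driver behavior) of why mirroring data across two 512MB pools leaves you with a 512MB effective memory space even though 1GB of chips is soldered to the card:

```python
# A toy illustration of SLI-style memory mirroring: every texture has to live
# in *both* GPUs' local pools, so 1GB of chips behaves like a 512MB card.
POOL_SIZE_MB = 512
pools = {"gpu0": 0, "gpu1": 0}          # MB used in each GPU's local memory

def upload_texture(size_mb):
    """Mirror the same data into each GPU's pool."""
    for gpu in pools:
        if pools[gpu] + size_mb > POOL_SIZE_MB:
            raise MemoryError(f"{gpu}'s 512MB pool is full")
        pools[gpu] += size_mb

upload_texture(300)
upload_texture(200)
print(pools)                             # 1000MB consumed to hold 500MB of unique data

try:
    upload_texture(20)                   # the 513th unique MB doesn't fit anywhere
except MemoryError as e:
    print(e)
```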

The GX2’s PCI Express switch presents another problem, simply because it’s unfamiliar. Having a couple of GPUs in a single slot behind a PCI-E switch can cause device enumeration problems, so most motherboards will require a BIOS update in order to work with the GX2. NVIDIA has put together a list of mobos and BIOS revisions that it’s confirmed will work with the GX2 and has plans to post the list on its web site. Many of the most popular mobos are on the list already, but not all of ’em, so you’ll definitely want to check before buying a GX2.

I should note, by the way, that running a GeForce 7950 GX2 does not require an NVIDIA SLI chipset or even an NVIDIA chipset at all. Intel-based mobos and the like are happily on the GX2’s compatibility list.

Another artifact of SLI that mars the GX2’s operation affects multi-monitor setups. As with a dual-card SLI setup, you’ve got to switch the GX2 manually between multi-GPU mode and multi-display mode. Here’s how the option looks in NVIDIA’s fancy-pants new driver control panel. (This control panel, incidentally, is focus-group tested, just like ATI’s Catalyst Control Center. I hate it. Once again, focus groups prove they know nothing about good interface design.)

If you’re in multi-display mode, one of the GX2’s GPUs will output to two different displays concurrently and drive them like any other card would. In order to harness all of the GX2’s 3D horsepower, though, you’ve got to switch into multi-GPU mode, at which point one of those two displays will go blank. Frustrating, but that’s life with SLI.

Beyond the annoyance of having to pop in and out of multi-monitor mode manually, the GX2 generally appears to work like any other video card, with the GPU teaming operating transparently. The drivers default to multi-GPU mode, and you won’t see any pop-up messages reminding you to enable SLI like you will with dual-card rigs. Of course, in the background, the usual SLI stuff is happening. If there’s no SLI profile for a game in NVIDIA’s drivers, the GX2 won’t be able to accelerate it with two GPUs automatically. As usual, though, the savvy user may create a custom profile for an application if needed.
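
If you're curious how that profile gating shakes out in practice, here is a purely illustrative sketch; the profile names and rendering modes below are invented for the example, not pulled from NVIDIA's actual driver database:

```python
# A rough sketch of the idea behind SLI profiles. The entries here are made up
# for illustration; NVIDIA's drivers ship their own database of per-game profiles.
sli_profiles = {
    "oblivion.exe": "AFR",       # alternate-frame rendering
    "quake4.exe":   "AFR",
    "fear.exe":     "SFR",       # split-frame rendering
}

def rendering_mode(game_exe, user_profiles=None):
    """Both GPUs kick in only if a profile exists; otherwise one GPU does the work."""
    profiles = {**sli_profiles, **(user_profiles or {})}
    return profiles.get(game_exe.lower(), "single-GPU")

print(rendering_mode("Oblivion.exe"))                                # AFR
print(rendering_mode("obscuregame.exe"))                             # single-GPU
print(rendering_mode("obscuregame.exe", {"obscuregame.exe": "AFR"})) # custom profile
```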

The deal with Quad SLI
The GeForce 7950 GX2 was designed for use in Quad SLI configurations and is fully capable of working in them. You may have noticed that the GX2 has only one “golden fingers” SLI connector on it, not two like on early Quad SLI cards. The ring topology we discussed in our early look at Quad SLI has been modified for the GX2. Apparently, one of those two SLI links was superfluous. That makes sense, if you think about it, because only two images need to be combined—one from each GX2—in the final compositing stage of a Quad SLI rendering mode.

NVIDIA says each 7950 GX2 should pull a maximum of 142 watts in real-world operation, so a pair of ’em would require less than 300W total. That should fit fairly easily within the power envelope of today’s best PC power supplies.

Before you get all excited about the prospect of dedicating 96 pixel shaders, 2GB of memory, and 1.1 billion transistors to pumping out boatloads of eye candy inside your PC, however, there’s some bad news. NVIDIA says Quad SLI will, at least for now, remain the exclusive realm of PC system builders like Alienware, Dell, and Falcon Northwest because of the “complexity” involved.

Yeah, that’s really what they said.

This “complexity” line is not just some off-the-cuff statement by one guy at NVIDIA, either; it’s practically corporate policy, repeated with consistency by several of the company’s representatives.

I’m not entirely sure what to make of this statement. As far as I can tell, Quad SLI requires a motherboard BIOS update, a fairly high-wattage PSU, sufficient case cooling, and a single SLI bridge connection. When explaining to your best customers why they can’t purchase two of your $649 video cards for themselves without also buying a $5K PC built by someone else, it’s probably not a good idea to use a shaky excuse with an embedded insult. Especially if it also subtly sends an unnerving message about the competency of your board partners’ customer support organizations. Yet this is what NVIDIA is saying.

To underscore its commitment to keeping the GX2 chaste, NVIDIA declined to send us a second GeForce 7950 GX2 for testing in a Quad SLI config, and its current GX2 drivers apparently don’t support Quad SLI mode, anyhow. The company says it expects to see DIYers hacking together Quad SLI systems using GX2s, but such setups won’t be officially supported. There does seem to be some hope for DIY Quad SLI in the future, but NVIDIA hasn’t committed to any timetable for enabling this feature for those of us who didn’t pay Voodoo PC’s hefty premiums.

 

Test notes
We only received drivers for the GX2 in the middle of last week, so our testing time with the card has been extremely limited. As a result, we’ve restricted our testing to a small set of competing cards and to a single resolution and quality level per game. This is, as you may know, not our usual practice, but it will have to suffice for now. In order to tease out real differences between these products, we chose game settings and display resolutions intended to push the limits of the fastest cards we’re testing. The differences between the cards might not be so great at lower resolutions, and they might be even greater at higher ones. With luck, though, our chosen settings will present a reasonably good picture of how these products compare to one another.

Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

| | nForce4 SLI X16 system | Radeon Xpress 3200 system |
| Processor | Athlon 64 X2 4800+ 2.4GHz | Athlon 64 X2 4800+ 2.4GHz |
| System bus | 1GHz HyperTransport | 1GHz HyperTransport |
| Motherboard | Asus A8N32-SLI Deluxe | Asus A8R32-MVP Deluxe |
| BIOS revision | 1205 | 0404 |
| North bridge | nForce4 SLI X16 | Radeon Xpress 3200 |
| South bridge | nForce4 SLI | ULi M1575 |
| Chipset drivers | ForceWare 6.85 | ULi Integrated 2.20 |
| Memory size | 2GB (2 DIMMs) | 2GB (2 DIMMs) |
| Memory type | Corsair CMX1024-4400 Pro DDR SDRAM at 400 MHz | Corsair CMX1024-4400 Pro DDR SDRAM at 400 MHz |
| CAS latency (CL) | 2.5 | 2.5 |
| RAS to CAS delay (tRCD) | 3 | 3 |
| RAS precharge (tRP) | 3 | 3 |
| Cycle time (tRAS) | 8 | 8 |
| Hard drive | Maxtor DiamondMax 10 250GB SATA 150 | Maxtor DiamondMax 10 250GB SATA 150 |
| Audio | Integrated nForce4/ALC850 with Realtek 5.10.0.6060 drivers | Integrated M1575/ALC880 with Realtek 5.10.00.5247 drivers |
| Graphics | GeForce 7900 GT 256MB PCI-E with ForceWare 91.29 drivers | Radeon X1800 XT 512MB with Catalyst 6.5 drivers |
| | GeForce 7900 GTX 512MB PCI-E with ForceWare 91.29 drivers | Radeon X1900 XTX 512MB with Catalyst 6.5 drivers |
| | GeForce 7950 GX2 1GB PCI-E with ForceWare 91.29 drivers | |
| OS | Windows XP Professional (32-bit) | Windows XP Professional (32-bit) |
| OS updates | Service Pack 2, DirectX 9.0c update (April 2006) | Service Pack 2, DirectX 9.0c update (April 2006) |

Thanks to Corsair for providing us with memory for our testing. Although these particular modules are rated for CAS 3 at 400MHz, they ran perfectly for us with 2.5-3-3-8 timings at 2.85V.

Our test systems were powered by OCZ GameXStream 700W power supply units. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults.

The test systems’ Windows desktops were set at 1280×1024 in 32-bit color at an 85Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

 

Pixel-filling power
The GeForce 7950 GX2 is really just SLI on a single card, but with a little fudging, it’s possible to express the GX2’s peak fill rate potential and memory bandwidth alongside the single-GPU cards out there.
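
Here is the arithmetic behind that fudging, applied to the table below. This is my own back-of-the-envelope math from the quoted clocks and per-clock figures, not an official spec sheet:

```python
# Back-of-the-envelope math for the theoretical peaks in the table below:
# fill rate = core clock x units per clock, bandwidth = effective memory
# clock x bus width, with everything doubled for the GX2's two GPUs.
def peak_numbers(core_mhz, pixels_per_clock, textures_per_clock,
                 mem_mhz_effective, bus_bits, gpus=1):
    pixel_fill = core_mhz * pixels_per_clock * gpus                # Mpixels/s
    texel_fill = core_mhz * textures_per_clock * gpus              # Mtexels/s
    bandwidth  = mem_mhz_effective * (bus_bits / 8) * gpus / 1000  # GB/s
    return pixel_fill, texel_fill, bandwidth

# GeForce 7950 GX2: two G71s at 500MHz with 16 pixels and 24 textures per
# clock each, 1200MHz effective memory on a 256-bit bus per GPU.
print(peak_numbers(500, 16, 24, 1200, 256, gpus=2))   # (16000, 24000, 76.8)

# Radeon X1900 XTX for comparison.
print(peak_numbers(650, 16, 16, 1550, 256))           # (10400, 10400, 49.6)
```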

| | Core clock (MHz) | Pixels/clock | Peak fill rate (Mpixels/s) | Textures/clock | Peak fill rate (Mtexels/s) | Memory clock (MHz) | Memory bus width (bits) | Peak memory bandwidth (GB/s) |
| Radeon X1600 XT | 590 | 4 | 2360 | 4 | 2360 | 1380 | 128 | 22.1 |
| GeForce 6800 | 325 | 8 | 2600 | 12 | 3900 | 700 | 256 | 22.4 |
| GeForce 6600 GT | 500 | 4 | 2000 | 8 | 4000 | 1000 | 128 | 16.0 |
| Radeon X800 | 400 | 12 | 4800 | 12 | 4800 | 700 | 256 | 22.4 |
| GeForce 6800 GS | 425 | 8 | 3400 | 12 | 5100 | 1000 | 256 | 32.0 |
| GeForce 6800 GT | 350 | 16 | 5600 | 16 | 5600 | 1000 | 256 | 32.0 |
| Radeon X800 XL | 400 | 16 | 6400 | 16 | 6400 | 980 | 256 | 31.4 |
| Radeon X1800 GTO | 500 | 12 | 6000 | 12 | 6000 | 1000 | 256 | 32.0 |
| GeForce 7600 GT | 560 | 8 | 4480 | 12 | 6720 | 1400 | 128 | 22.4 |
| GeForce 6800 Ultra | 425 | 16 | 6800 | 16 | 6800 | 1100 | 256 | 35.2 |
| GeForce 7800 GT | 400 | 16 | 6400 | 20 | 8000 | 1000 | 256 | 32.0 |
| All-In-Wonder X1900 | 500 | 16 | 8000 | 16 | 8000 | 960 | 256 | 30.7 |
| Radeon X1800 XL | 500 | 16 | 8000 | 16 | 8000 | 1000 | 256 | 32.0 |
| Radeon X850 XT | 520 | 16 | 8320 | 16 | 8320 | 1120 | 256 | 35.8 |
| Radeon X850 XT PE | 540 | 16 | 8640 | 16 | 8640 | 1180 | 256 | 37.8 |
| XFX GeForce 7800 GT | 450 | 16 | 7200 | 20 | 9000 | 1050 | 256 | 33.6 |
| Radeon X1800 XT | 625 | 16 | 10000 | 16 | 10000 | 1500 | 256 | 48.0 |
| Radeon X1900 XT | 625 | 16 | 10000 | 16 | 10000 | 1450 | 256 | 46.4 |
| GeForce 7800 GTX | 430 | 16 | 6880 | 24 | 10320 | 1200 | 256 | 38.4 |
| Radeon X1900 XTX | 650 | 16 | 10400 | 16 | 10400 | 1550 | 256 | 49.6 |
| GeForce 7900 GT | 450 | 16 | 7200 | 24 | 10800 | 1320 | 256 | 42.2 |
| GeForce 7800 GTX 512 | 550 | 16 | 8800 | 24 | 13200 | 1700 | 256 | 54.4 |
| GeForce 7900 GTX | 650 | 16 | 10400 | 24 | 15600 | 1600 | 256 | 51.2 |
| GeForce 7950 GX2 | 2 * 500 | 32 | 16000 | 48 | 24000 | 1200 | 2 * 256 | 76.8 |

…and this little experiment shows us what a monster the GX2 really is. All told, this thing has over twice the peak multitextured fill rate of a Radeon X1900 XT and a whopping 76.8 GB/s of memory bandwidth.

Feed the thing a synthetic fill rate benchmark, and it proves those numbers are for real:

No other “single” card comes close. Of course, taking advantage of this power in the real world could prove tricky. Let’s move on to some games and see what happens.

 
The Elder Scrolls IV: Oblivion
We tested Oblivion by manually playing through a specific point in the game five times while recording frame rates using the FRAPS utility. Each gameplay sequence lasted 60 seconds. This method has the advantage of simulating real gameplay quite closely, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered.
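
Here is a minimal sketch of how those five runs get reduced to the two numbers we report; the frame rates in it are placeholders for illustration, not actual results:

```python
# A minimal sketch of reducing five FRAPS runs to an overall average frame
# rate plus the median of the per-run lows. The numbers are placeholders.
from statistics import mean, median

runs = [            # (average FPS, lowest FPS) recorded by FRAPS per 60-second run
    (52.3, 31.0),
    (50.8, 29.0),
    (53.1, 33.0),
    (51.6, 30.0),
    (52.0, 34.0),
]

average_fps = mean(avg for avg, low in runs)
median_low  = median(low for avg, low in runs)   # the median resists outlier runs
print(round(average_fps, 1), median_low)         # 52.0 31.0
```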

We set Oblivion’s graphical quality settings to “High.” The screen resolution was set to 1600×1200, with HDR lighting enabled. 16X anisotropic filtering was forced on via the cards’ driver control panel.

Quake 4
In order to make sure we pushed the video cards as hard as possible, we enabled Quake 4’s multiprocessor support before testing.

F.E.A.R.
We’ve used FRAPS to play through a sequence in F.E.A.R. in the past, but this time around, we’re using the game’s built-in “test settings” benchmark for a quick, repeatable comparison.

All in all, the GX2 looks very potent compared to the single-GPU cards.

 

Half-Life 2: Lost Coast
This expansion level for Half-Life 2 makes use of high-dynamic-range lighting and some nice pixel shader effects to create an impressive-looking waterfront. We tested with HDR lighting enabled on all cards.

Battlefield 2
We test BF2 using FRAPS and manual gameplay, much like we did with Oblivion.

Pretty impressive. In Battlefield 2, the GX2 actually achieves twice the average frame rate of the GeForce 7900 GT—and more than twice its median low frame rate.

 

3DMark06

The GX2 keeps it going in 3DMark06, totally outclassing the rest of the field.

 

Power consumption
We measured total system power consumption at the wall socket using a watt meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. To keep things even, we did our power consumption testing for all cards using the Asus A8R32-MVP Deluxe motherboard.

The idle measurements were taken at the Windows desktop with AMD’s Cool’n’Quiet CPU clock throttling function disabled. The cards were tested under load running Oblivion using the game’s High Quality setting at 1600×1200 resolution with 16X anisotropic filtering.

Unsurprisingly, the GX2 consumes more power when sitting at the Windows desktop than any of the single-GPU cards. The shocking thing is the power use under load. The Radeon X1900 XT-based system draws 27 watts more power at the outlet than the otherwise-identical GX2-based rig. Craziness.

Ok, maybe it’s not so crazy. We’ve seen similar achievements with dual-core CPUs, after all. And the GX2 certainly acts the part; its cooler seems a little quieter under load than the Radeon X1900 XTX’s. The GX2 isn’t as whisper quiet as the GeForce 7900 GTX, but it’s pretty darned good, considering.

 
Conclusions
Somehow, I didn’t really expect a “single” GeForce 7950 GX2 card to be a compelling product outside of a Quad SLI configuration, but it is. This puppy does have its warts, including the need for mobo BIOS updates and the SLI-like limitations for multi-monitor use that may turn some power users away. Still, the GX2 is remarkably tame overall. This card takes up no more space, draws no more power, and generates no more heat or noise than a Radeon X1900 XTX, but its performance is in another class altogether. The multi-GPU mojo generally happens transparently, too, thanks to a healthy collection of SLI profiles already established in NVIDIA’s drivers. Putting two GPUs on a card has allowed NVIDIA to overcome the limitations of its present GPU designs and of current fab process tech to achieve new performance heights in a single PCI Express slot.

Of course, such things have been possible for quite some time in the form of a two-slot SLI or CrossFire solution, but the GX2 still has much to recommend it. Doubling up on GeForce 7900 GTs would get you an SLI setup in the same price range, but with lower (stock) GPU clock speeds and only 256MB of memory per GPU. And the GX2 works in any chipset, which may prove to be a real boon if you fancy one of Intel’s Conroe processors, an Intel chipset, and uber-fast graphics. I could see that combination becoming very popular this summer, if things shake out as expected.

All that’s left now is for NVIDIA to enable—and support—Quad SLI in consumer-built systems. Let’s hope NVIDIA comes to its senses on that one sooner rather than later. 

Comments closed
    • GeForce6200
    • 13 years ago

    I have a Fortron source bluestorm 500 watt PSU (http://www.newegg.com/Product/Product.asp?Item=N82E16817104934). Will it be able to handle this graphics card?

    • Forge
    • 13 years ago

    I’m rather disappointed that the ‘SLI acceleration that is fully transparent to software’ that was rumored didn’t happen. Would have been really compelling to have an SLI ‘single card’ that showed up as one uber-GPU to everything in the system this side of the drivers.

    • My Johnson
    • 14 years ago

    Why does MSI brag about quad SLI with their cards?

    http://www.msicomputer.com/msiforms2/NX7950GX2.asp

    • d0g_p00p
    • 14 years ago

    Once again I have to laugh at the fools who spent $1000 and more to run SLI when 6 months later or so a better card that is just as fast or faster comes at half the price. I have a SLI based motherboard but would never consider going that route. This product however seems pretty nice for the price. Shopping around I have found it for a little over $500 bucks. Damn good price. If I gamed as much as I used to, I would pick up one of these puppies. However I don’t play many games anymore and my 7800GT is fine when I need to.

    *Rick James*

    SLI is a hella’ of drug.

    • VForce001
    • 14 years ago

    Tom’s shows that the GX2 and X1900XTX are neck and neck in Lost Coast at 1600×1200 (4X/8X) while TR has the GX2 well ahead at the same resolution but 4X/16X. Is this a result of the 16X or am I missing something.

    http://www.tomshardware.com/2006/06/05/geforce_7950_gx2/page8.html

      • Thebolt
      • 14 years ago

      Tom’s is garbage for the most part, just try to forget whatever you read over there and you’ll be set. I doubt the difference in AF would do too much, tom probably just cashed a check at the bank from ATI or something.

      (kind of kidding)

      • Fighterpilot
      • 14 years ago

      The test results from Anand are completely different: http://www.anandtech.com/video/showdoc.aspx?i=2769&p=8

        • MorgZ
        • 14 years ago

        The AnandTech results are not completely different, taking into account they often have different AA and AF settings than the TR benchmarks they pretty much paint a similar picture – although it is nice to see the 7900GT SLI rig comparison in there as well.

      • Convert
      • 14 years ago

      68.6% difference between the GX2 and the single 7900GTX @ 1600×1200 4xAA/16xAF (Techreport)

      76% ” ” @ 1600×1200 4xAA/NO-AF (Anand)

      85.7% ” ” @ 1600×1200 4xAA/8xAF (Toms)

      All other things being the same then anand should have the lowest % difference out of the bunch followed by Toms and then TR.

      No idea about clocks for Anand or TR because neither of them felt the need to mention them (stock or not it would be nice to know). Toms has a slightly faster GTX though (675/820).

      Toms *only* used the 91.29’s for the GX2 however, the 7900gtx used the 84.21’s. TR used the 91.29’s throughout and Anand didn’t even bother mentioning any drivers at all (thanks again anand).

      Seems like a huge mess between the tests with regards to what settings were used, the time demo used, drivers, clocks and who knows what else.

        • Bauxite
        • 14 years ago

        A huge mess…unless you don’t bother reading toms and anands anymore. Personally, both feel waaaaaaay too “corporate”/ad oriented for my taste, need a grain of salt.

        • My Johnson
        • 14 years ago

        You missed HDR or I’m blind.

      • Fighterpilot
      • 14 years ago

      Could that be because TR’s and Anand’s ATI test cards were XTs not XTXs….?

      • My Johnson
      • 14 years ago

      Scott mentions that HDR is enabled for his test and he has 16X aniso enabled. I don’t know if he had quality 16x or optimized (angle dependent). But Tom’s does not state he had HDR enabled like Scott does. So, what is noticed is that at 1600×1200 with 4xAA and 8xAniso both the 7950 and 1900 are hitting the CPU limit. That’s why they show as neck and neck.

      You may read whatever bias you want into it, but I honestly think way too many previous posters are projecting a bit too much. Step back, lay the crack pipe down slowly, and tell me how Tom’s different test settings make his review biased.

    • PLASTIC SURGEON
    • 14 years ago

    Seems the Red team also wants people to sell their collective souls to Satan for Multiple card setups. This one for Physics in a crossfire setup with 3 cards methinks.

    http://enthusiast.hardocp.com/article.html?art=MTA3OSwxLCxoZW50aHVzaWFzdA==

    • zqw
    • 14 years ago

    quad sli here, still sketchy.
    http://www.hardwarezone.com/articles/view.php?cid=3&id=1927

    • green
    • 14 years ago

    dx10 won’t matter with this card
    pretty sure i saw it on TR that balmer indicated it could be as far back as feb next year for vista
    by which time you’ll most likely see the next gen of cards popping up
    plus older gen cards will still be compatible with vista
    just not on the best setups is all
    and i haven’t heard of a game coming between now and vista release that will be dx10*
    dx10 compatible maybe, but not dx10

    *note i don’t look much up on games

    • Ryu Connor
    • 14 years ago

    q[

      • BobbinThreadbare
      • 14 years ago

      I think part of the appeal was buy one card now, and another later.

    • SnowboardingTobi
    • 14 years ago

    I’m curious just how well the GPU that’s sitting on the inside is being cooled. There doesn’t seem to be a whole lot of space for its fan to pull in any good amount of air.

    BTW, Am I the only one that can’t see how those 2 cards are linked together by looking at the picture provided on page 2?

    • ShadowEyez
    • 14 years ago

    Assuming this wouldn’t void your contracts with Nvidia, why not ask them or go to another hardware review site to get your hands on another GX2 card and see if you can do quad-SLI yourselves. Write your experiences and see if you can get it to work, what issues you run into (just heating and bios updates, or do the drivers really not support it) and give us the performance numbers. That would be awesome!!!

      • Convert
      • 14 years ago

      He did try to get another one from Nvidia.

        • MorgZ
        • 14 years ago

        the thing is i doubt quad sli offers the same benefits over sli than sli does over a single card.

        In some benchmarks sli gives nearly twice the framerate as a single card. I doubt quad sli gives anywhere near this gain – its probably 70% gimmick except on a couple of flag ship games.

      • My Johnson
      • 14 years ago

      “…its current GX2 drivers apparently don’t support Quad SLI mode, anyhow.”

        • Stranger
        • 14 years ago

        jack them from a working alien ware system(with the bios)? it seems as though you’d need an identical MB for the bios but otherwise possible….

          • My Johnson
          • 14 years ago

          Good idea.

    • Convert
    • 14 years ago

    Do you have to restart when enabling SLI for the GX2?

    If you didn’t then Nvidia should think about doing app association stuff so when a game .exe is launched sli is turned on but when the .exe is no longer being run it switches back automatically.

    • madgun
    • 14 years ago

    for just 629$ on newegg.com, thats an amazing piece of hardware with a combined 1GB Gfx ram and two monstrous GPUs.

      • AKing
      • 14 years ago

      As TR says in the review, its “only” 512mb of effective videoram.

        • BobbinThreadbare
        • 14 years ago

        But they still have to pay to put 1 gig on the card.

          • AKing
          • 14 years ago

          Of course you have to take that into consideration, but its still not 1 gig of effective GFX ram. The benefit from these SLI configs is not memory but speed.

    • Beomagi
    • 14 years ago

    It’s ok, but there’s little stopping you from running 2 7900gt’s.

    The difference at load compared to the 7900gt is 71watts. pretty much the same amount pulled by a 7900gt anyway.

      • Bauxite
      • 14 years ago

      Price is fairly close if you compare the higher clocked GTs SLI and should come closer, only takes 1 slot, double the effective ram.

      SLI with less PINTA

      Price/perm better than GTX too, if you’re going to step up from GT might as well go the bit more. 2 x GTX is reaching, 2 of these is nuts.

    • link626
    • 14 years ago

    i can see the massive amount of dust caught in the pcb sandwich already….

    • sacremon
    • 14 years ago

    In all likelihood, the deal with no Quad-SLI is so that Alienware/etc. have a headstart in the market. This would be an incentive to these system builders to use Nvidia cards rather than ATI. And as much as you might think we are Nvidia’s ‘best customers’, we’re not. It is the system builders.

      • continuum
      • 14 years ago

      What about the stability issues we’ve read on some other sites?

      It sounds like quad-SLI isn’t quite ready for prime time, at least from what I remember reading.

      • shank15217
      • 14 years ago

      I highly doubt that end users are not nvidia’s best customers for high end cards. The enthusiast crowd is atleast as large as the premade machine gamer crowd. Infact i have yet to meet a single owner of an alienware/voodoo machine.

        • Bauxite
        • 14 years ago

        If you do, beat him up and take his wallet 😉

        • sacremon
        • 14 years ago

        Alienware is owned by Dell. Do you think that the enthusiast crowd is larger than Dell? Stepping on Alienware’s toes /[

    • jbraslins
    • 14 years ago

    Would be nice to see GX2 compared to pair of 7900GT and 7900GTXs.

    Wonder if having two cards on an sli ready mobo using two PCI-E slots and more power amounts to any performance gains/losses over GX2 config.

      • kvndoom
      • 14 years ago

      Yeah that’s kinda what I was looking for too. Whether anything was lost / gained by having SLI on a stick.

      • continuum
      • 14 years ago

      Agreed… considering that you’d probably have better luck OC’ing two 7900GT’s in SLI rather than a 7950GX2… =)

    • Proesterchen
    • 14 years ago

    The GX2 seems like a pretty damn good product, the extra performance (and lower power consumption) certainly is worth the extra €100 over an XTX.

    Things that put me off buying one of these cards, though:

    – not as good AA and no HDR+AA (compared to ATi, of course)
    – as a multi-monitor user, the whole switching SLi on/off thing seems rather bothersome, also, it kills one of multi-mon’s greatest feats, being able to keep something else in view during a gaming session
    – (to a lesser extent) the physical arrangement will make watercooling a bitch

    Again, a nice product, but probably not for me.

      • Jigar
      • 14 years ago

      yap include me too in that list….

    • Freon
    • 14 years ago

    Wow, I never realized about the multi-display thing.

    No way I would go SLI just for that. I go from games to dual display desktop several times a day. I would never be willing to manually switch like that. That’s ridiculous…

    • wmgriffith
    • 14 years ago

    I had to re-read the bit on multi-display and multi-gpu mode being mutually exclusive. Does this means that, because both outputs come from GPU#1 (from the figure in the article), multi-display actually turns off GPU#2?

    Is it generally true in the “traditional” SLI configurations that SLI necessarily means only one display?

      • continuum
      • 14 years ago

      y[

    • Zenith
    • 14 years ago

    Well, hot darn! That’s one sexy card. I was expecting it to suck up more power than the amazon like most real SLI setups..but it’s rather…civilized. Although I won’t be satisfied until I’ve seen it compared to SLI and Crossfire setups, I am far more attracted to the idea of this card, rather than a real SLI setup.

    Still, no matter how cool it runs, I feel that the cooling arrangements (From the looks of them) lack any real innovation. It’s just…same ‘ole same ‘ole. Kinda sad considering you’re shelling out over $600.

    • Fighterpilot
    • 14 years ago

    Well those are sure some impressive numbers.Is that faster than 2 card SLI or Crossfire?

    • Dposcorp
    • 14 years ago

    Thanks for the hard work Scott.
    Nice review, but I gotta agree that this is no biggie.

    Reminds of the AMD 4×4 announcement.

    It’s nice, but nothing evolutionary, since we had dual core, dual Opteron systems for a while.

    We have had SLI around for a while now, and two DX9 cards in SLI are still 2 DX9 cards in SLI, just in a single slot.

    If I were Nvidia, I would have saved this announcement for my DX10 cards to really make a big splash.

    • flip-mode
    • 14 years ago

    Hmm… I bet that Nvidia forbid TR from expressly saying that the GX2 will work with ATI chipsets.

    • Usacomp2k3
    • 14 years ago

    Wow, those power numbers are something. Especially considering that it’s on the nVidia board that pulls more power.

      • Damage
      • 14 years ago

      No, it isn’t.

        • Usacomp2k3
        • 14 years ago

        Oh, my bad then.

      • Jigar
      • 14 years ago

      Seriously speaking this numbers are not that great …. and i think a fool will only buy this card because all the powerUsers or Gamers will migrate to VISTA as soon as it is released and its very soon when we see vista around … and as u know we need DX 10 card then..

        • Wajo
        • 14 years ago

        *Runs to buy Vista*… wait… I cant…

    • Jigar
    • 14 years ago

    Ok so we are almost done with DX 9… and i will repeat where are DX 10 cards…

      • MorgZ
      • 14 years ago

      Totally agree, a lot of people who are not lucky enough to have a decent knowledge of the industry are going to be buying swanky new gfx cards which wont have all the features to run a “soon to be released” OS in roughly 6 months time (fingers crossed!).

        • Jigar
        • 14 years ago

        One thing which makes me wonder .. How come a company like nvidia comes out with this type of Powerfull and Costly card when they know that they are soon going to sell DX 10 cards…..

      • flip-mode
      • 14 years ago

      Be quiet.

        • Jigar
        • 14 years ago

        oh your telling to me …. sorry i am quiet but its just my fingers ,,,,,, they are typing ….. 😉

      • Bauxite
      • 14 years ago

      Show me DX10-only games in development that will actually show up with vista pls. Or even ones that have something worthwhile in DX10 mode versus DX9 mode. (you think any other sane company will be making dumb shit like DX10 only halo? lol)

      If I was a betting man, I’d put it at 18-24 months until DX10 means anything in software even to the most rabid early adopter. 6-12 months for the hardware to even show up. What fool would factor that until their upgrade plans for right now? I might as well start waiting for [unnamed intel/amd ’08 roadmap cpu] and prebuy the DDR3 memory or whatever.

      And HDCP…bout time, considering how it was lied about in specs by both parties so long. Although you don’t need it to “plug into the latest HDTVs” because they will accept raw DVI/dongled HDMI just fine. But its nice to have the option to play either of the DRM-DVD formats eventually.

        • Jigar
        • 14 years ago

        Sorry to bother your feelings for DX 9 cards … but seriously speaking DX 10 games are going to arrive very soon….. as soon as vista comes out ,.. apart from halo … Unreal tornament 07 will also come in DX 10 flavour … and there are more games to be lined up… i guess till that time u still going to cry with you DX 9 card right…. any ways i will repeat sorry to bother your feelings for DX 9 cards 😉

          • Bauxite
          • 14 years ago

          When it means a shit, I’ll be buying a new card like everyone else. That won’t be in 6 months.

            • Jigar
            • 14 years ago

            agreed bro …. but i am just asking about the DX 10 cards nothing much neither did i say that this card is crap… this is a good card but its just that i am or other people who are intelligent enough would never buy this card…

            • TREE
            • 14 years ago

            Jigar do you mean card or cards? thats the real question 🙂

            • Jigar
            • 14 years ago

            Cards for all of us.. 😉

            • Bauxite
            • 14 years ago

            The only people that should care about DX10 right now and actually consider waiting longer then planned are those than upgrade every 5-6 years or so, which would make their current card a 9700 pro or something. (suprisingly playable though)

            Those of us on the 12/18/24 cycles don’t care, either way we’ll be ready when it matters, we sure aren’t putting off an extra year++

            The crazies on the 6 month cycles buy every new generation anyways.

            • MorgZ
            • 14 years ago

            Not many people buy brand new systems every 12 months, i think 24 months is much more reasonable as you’d really hope that a £1000 computer will play games well for atleast 2 years. If i was about to buy a new pc for luxury and my current one was just about hanging on, id wait to see how DX10 / vista / Intel Core etc pan out.

            • Jigar
            • 14 years ago

            Agreed…..

            • 0g1
            • 14 years ago

            Nah, I generally upgrade on a performance criteria rather than time requirement. Generally has to provide double the performance.

            • Bauxite
            • 14 years ago

            Systems != Upgrades

            A lot of people tackle it piecemeal, and the age of the various parts in the system might be anywhere from a few months to 5 years or more. You can ride out some parts a lot longer than others…and recycle downwards/sideways to the extra computers too. One new vidcard, but 4 upgrades 😛

            Pretty much par for the course at any gathering/event/lan/etc. I like to get a rundown on everyone’s stuff, not for e-penis wagging but to see all the variety out there. (theres some strange shit for sure)

            My longest part is a nice gleaming aluminum -[<747 at takeoff dustbuster<]- tower case that should be good for a decade or more...especially since the BTX-will-replace-ATX debacle failed and its so big I could shove anything in it regardless.

            • Chryx
            • 14 years ago

            the 9700 Pro isn’t 4 years old yet.

            s[

            • Nullvoid
            • 14 years ago

            An ati card with no X’s in its name, those were the days…

            • Beomagi
            • 14 years ago

            I think people that bought the 9500 pros and flashed them to 9700’s with some overclocks got the best video card deal just about ever.

            • Bauxite
            • 14 years ago

            If someone only upgrades every 5-6 years, they would be the only people factoring in DX10 to their next card purchase.

            By the time DX10 is ready to roll…the 9700 *[

          • Zenith
          • 14 years ago

          No they aren’t. Microsoft said DX10 was years off, last year. They meant 2007-2009.

          *EDIT* Oh, and stop being such a jerk.

            • Jigar
            • 14 years ago

            we will surely see ….. and oh stop being such a wise man ,,,, it doesnt suit your style…

          • Shintai
          • 14 years ago

          Relax abit on the doomsday prediction with DX10. Its not long ago you didn´t even knew if DX9 cards would work in Vista. Along with DX10 cards supporting DX9.

          DX10 games is also not something we gonna see in early 2007. And besides maybe 2-3 games, i doubt you see any DX10 games before 2008. Game development just takes time, and I doubt any of them got/use DX10 tools yet.

          For the cards, its not even sure if G80 gonna be a DX10 card yet (but DX9L instead), and R600 that will be is some 3 months after G80.

          So before any real amount of games comes out to DX10, both G80 and R600 will be old cards…

            • Jigar
            • 14 years ago

            ya that is true……. i was just wondering about u where were u till now. 😉

            • Beer Moon
            • 14 years ago

            The first games that support DX10 won’t make DX9 obsolete.

            Heck, look at the DX8 paths in HL2 – it’s still playable and looks good. Just not great.

            The 7800GTX for $500+ purchase a matter of weeks before the 7900GTs came out at 3 bills – now THAT would have been an ill-advised purchase.

            DX10 isn’t going to do anything to make any top of the line hardware obsolete that the next generation of graphics cores wasn’t going to do already – and we’ll have at least 2 obsolescence-making cycles of GPU core releases before DX10 becomes standard in games.

      • Stranger
      • 14 years ago

      I havn’t gotten the impression that DX10 is going to be a neccessary jump for quite a while… And it doesn’t seem like Microsofts going to start pushing it for quite a while, even after vista launchs.

      • swaaye
      • 14 years ago

      I feel that DirectX 10 is more a Vista feature bullet than a necessary change. It ties in totally with Vista’s driver overhauls. And those changes don’t all fit with the best interests of games, btw. See how they killed off audio acceleration, for example. Why not just add a Shader Model 4 to DX9? Cuz it doesn’t catch your attention as much as “10”. And that is becoming more obvious by the day.

      It’s scary to see that because MS owns the world with their OS that they can leverage so many different pieces of software to force users into their newest product.

      This is why OpenGL needed to win, but that group couldn’t get their act together well enough to even remotely compete. Of course it doesn’t help, again, that MS has total control over the environment. OpenGL had a good chance though. Direct3D sucked for a long time. But that is well over now.

      DX9 has been with us since 2002. It has lasted the longest of any DX version, by far. It works well, and it supports the latest stuff. I was beginning to feel like the D3D9 component was more of a subset of the thing lol and that it was extensible with new tech.

      MS wants to kill off all previous versions of WIndows, for the mass market, and MAKE you buy VIsta. 🙂

        • Krogoth
        • 14 years ago

        MS is just hindering the adoption of DX10 by making it Vista-only. Vista on its own merits is not enough to justify an upgrade for most gamers in the foreseeable future. Game developers in turn will not support DX10 features until the Vista userbase is large enough. It will probably take 2 to 3 years for the userbase to get large enough.

          • BobbinThreadbare
          • 14 years ago

          I thought MS was going to bring DX10 to XP with a service pack?

          It was just a rumor going around, but it makes sense to me.

            • Shintai
            • 14 years ago

            To get DX10 on XP you would need a complete overhaul of the kernel, drivermodel etc of XP. And then you can start add all the issues with non LDDM drivers. So its alot easier to just change to Vista. Thats also one of the same reasons DX10 on Vista will be about 20% faster than DX9 on XP since you can start from scratch again.

            • cAPS lOCK
            • 14 years ago

            Well, yeah, that’s the official explanation, and it sounds all right, until you start thinking about it…

            DirectX is an API, i.e. a collection of methods allowing you to draw triangles on the screen, apply textures, rotate them, etc.

            /[

            • Shintai
            • 14 years ago

            DX aint just an API, Its API+Drivers.

            • cAPS lOCK
            • 14 years ago

            Sorry, but no, you’re wrong. Just like OpenGL, DirectX is just an API. If you want to see all the gory details then visit MSDN: http://msdn.microsoft.com/directx/

            The API itself may choose to delegate work to the hardware and this of course is where the drivers come into play. MS could just ask nVidia/ATI to implement any required methods allowing the API to make use of any nifty hardware acceleration. It doesn’t require a new OS, it just requires that a few dll’s publish a few more methods.

            That you would need a new driver _model_ for this …well, sorry but that’s just bollocks. /cl

            • Shintai
            • 14 years ago

            DX10 relies on WGF 2.0, that again relies on LDDM drivers. So yes you need API+Drivers to use DX10.

            Also you can´t directly compare OpenGL to DX. DX today is alot more advanced than OpenGL, and DX includes everything from sound to network aswell. So DX10 on XP would also require new drivers of those too.

            So yes, you do need the new driver model. We have had this current one for what, 10years? Its just not fit for the job anymore. But again, how hard can it be to replace the kernel, the drivermodel, the drivers and such in a service pack or DX10 install on XP…

            MS dont need to sell Vista on DX10, Vista will sell itself via OEMs like the windows version before.

            • poulpy
            • 14 years ago

            DX10 does more than graphics but that’s not the point, we’re speaking of the graphic side here, let’s call it Direct 3D 10 then.

            D3D 10 allows more cutting edge effects than OpenGL but again that doesn’t change anything, it does have more functions built-in and what ? It’s still an API, whatever it’s size or complexity.

            D3D is relying on many layers including the driver.
            But it doesn’t have to deal with the internals of that one, that’s the whole point of a driver : providing an abstraction layer to the hardware so that APIs have an interface to rely on.

            Drivers models have changed many times in Linux, they didn’t break compatibility with previous OpenGL version.

            So from an API (which is what D3D 10 is) point of vue, D3D 10 must be able to work under XP provided the right interface !

            Now as you said DX is a swiss knife and must be tied more than we can think of with other Vista pieces, surely making the whole DX10 a nightmare to backport to XP…

            • d0g_p00p
            • 14 years ago

            Direct X is more advanced than OpenGL? Lawls, I guess someone should tell Carmack that he is a hack using a non advanced API for graphics. SUN, SGI ,Pixar, ILM, Apple, Lightwave ,Masa and others must be wrong as well.

            • Shintai
            • 14 years ago

            Read again before commenting.

            y[

    • Delta9
    • 14 years ago

    I see this thing on a shelf next to my voodoo 5500 in two years.

      • Sniper
      • 14 years ago

      It’ll probably cost a fortune… and it doesn’t have Directx10. Yay?

    • Usacomp2k3
    • 14 years ago

    Woohoo
    /me starts reading.
