So here we are, knee-deep in yet another chapter of the ongoing struggle for graphics supremacy between the Radeon camp and the GeForce camp, between the red team and the green team, between ATI (now AMD) and Nvidia. In the past few months, the primary battlefront in this war has shifted away from the traditional slug-fest between ridiculously expensive video cards and into the range between $200 and $300, an unusual development, but a welcome one. Nvidia has captured the middle to upper end of that price range with the GeForce 8800 GT, a mighty performer whose sole liability was the initial difficulty many encountered with finding one in stock at their favorite online store. Meanwhile, AMD offers a capable substitute in the form of the Radeon HD 3870, which isn't quite as quick as an 8800 GT but is still a solid value.
In short, everybody's a winner. And we can't have that, now. Can we?
Nvidia has held the overall graphics performance crown uninterrupted since the fall of 2006, when it introduced the GeForce 8800 GTX, and the only notable change since then was when the green team added a few additional jewels to its crown and called it the GeForce 8800 Ultra. The folks at AMD have no doubt been fidgeting nervously over the past year-plus, chewing on the tips of their pens, tapping their fingers on their desks, checking and re-checking the value of their stock options, waiting for the chance to recapture the performance lead. Frustratingly, no single Radeon HD GPU would do it, not the 2900 XT, and not the 3870.
But who says you need a single GPU? This is where the Radeon HD 3870 X2 comes into the picture. On the X2, two Radeon HD 3870 GPUs gang up together to take on any GeForce 8800 available. Will they succeed? Let's find out.
The X2 up close
The idea behind the Radeon HD 3870 X2 is simple: to harness the power of two GPUs via a multi-GPU scheme like CrossFire or SLI in order to make a faster single-card solution than would otherwise be possible. AMD did this same thing in its last generation with the Radeon HD 2600 X2, but the card never found its way into our labs and was quickly usurped by the Radeon HD 3870. Nvidia was somewhat more successful with the GeForce 7950 GX2, an odd dual-PCB card that was a key component of its Quad SLI scheme.
AMD's pitch for the 3870 X2 sounds pretty good. The X2 should possess many of the Radeon HD 3870's virtues, including DirectX 10.1 support and HD video decode acceleration, while packing a heckuva punch. In fact, AMD claims the X2 offers higher performance, superior acoustics, and lower power draw than two Radeon HD 3870 cards in a CrossFire pairing. That should be good enough, they say, for gaming in better-than-HD resolutions.
At 10.5", the X2 card is every bit as long as a GeForce 8800 Ultra and, in fact, is deeper than most motherboards. A card this long may create clearance problems in some systems, especially if the motherboard has poorly placed SATA ports. Unlike regular Radeon HD 3870 cards, the X2 has only a single CrossFire connector onboard, presumably because attaching more than two of these puppies together would be borderline crazy.
The X2 has two auxiliary power plugs onboard, one of the older six-pin variety and another of the newer eight-pin sort. The card will work fine with a six-pin plug in that eight-pin port, but you'll need to use an eight-pin connector in order to unlock the Overdrive overclocking utility in AMD's control panel.
Remove about 19 screws, and you can pull off the X2's massive cooler for a look beneath, where things really start to get interesting. The first thing you'll notice, no doubt, is the fact that the two GPU heatsinks are made from different materials: one copper and one aluminum. Reminds me of a girl I dated briefly in high school whose eyes were different colors.
Hey, I said "briefly."
I think it's a good thing both heatsinks aren't copper, because the X2's massive cooler is heavy enough as it is.
And the X2 has much to cool. AMD rates the X2's peak power draw at 196W. However, the Radeon HD 3870 GPUs on this beast do have some power-saving measures, including a "light gaming" mode for less intensive 3D applications. With that mode in use, the X2's rated power draw is 110W.
With all of the talk lately about hybrid graphics schemes, the X2 might seem to have a built-in opportunity for power savings in the form of simply turning off one of its two GPUs at idle. I asked AMD about this possibility, though, and they said the latency involved in powering down and up a whole GPU was too great to make such a scheme feasible here. Ah, well.