As I learned from a trip to KFC this summer, doubling down can have its risks and its rewards. Sadly, the Colonel's new sandwich wasn't exactly the rewarding explosion of bacon-flavored goodness for which I'd hoped. Eating it mostly involved a lot of chewing and thinking about my health, which got tiresome. Still, I had to give it a shot, because the concept held such promise for meat-based confections.
If there's one thing I enjoy as much as dining on cooked meats, it's consuming the eye candy produced by a quality GPU. (Yes, I'm doing this.) Happily, doubling down on a good graphics card can be much tastier than anything the Colonel has managed to serve in the past 15 years, and thermal grease isn't nearly as nasty as the stuff soaking through the bottom of that red-and-white cardboard bucket. The latest GPUs support DirectX 11's secret blend of herbs and spices, and the recently introduced GeForce GTX 460 has set a new standard for price and performance among them.
In fact, at around 200 bucks, the GTX 460 is a good enough value to raise an intriguing question: Is there any reason to plunk down the cash for an expensive high-end graphics card when two of these can be had for less?
With this and many other questions in mind, we fired up the test rigs in Damage Labs and set to work, testing a ridiculous 23 different configurations of one, two, and, yes, three graphics cards against one another for performance, power draw, noise, and value. Could it be that doubling down on mid-range graphics cards is a better path to gaming enjoyment? How does, well, nearly everything else perform in single and multi-GPU configs? Let's see what we can find out.
The case for multiple GPUs
Multi-GPU schemes have been around for quite a while now, simply because they're an effective way to achieve higher performance. Graphics is a highly parallelizable computing problem, so two GPUs have the potential to deliver nearly twice the speed of a single chip a pretty high percentage of the time. These schemes can have their drawbacks, when for one reason or another performance doesn't scale well, but both of the major graphics players are very strongly committed to multi-GPU technology.
Heck, AMD has replaced its largest graphics processor with a multi-chip solution; its high-end graphics card is the Radeon HD 5970, prodigiously powered by dual GPUs. Multiple Radeon cards can gang up via CrossFireX technology into teams of two, three, or four GPUs, as well.
Nvidia's SLI tops out at three GPUs and is limited to fewer, more expensive cards, but then Nvidia is still making much larger chips. A duo of GeForce GTX 480s is nothing to sneeze at; the mist would instantly vaporize due to the heat of hundreds of watts being dissipated. Also, they're pretty fast. The green team hasn't yet introduced a dual-GPU video card in the current generation, but it has a long history of such products stretching from the GeForce GTX 295 back to the GeForce 7950 GX2, which essentially doubled up on PlayStation 3-class GPUs way back in 2006. (Yeah, the whole PCs versus next-gen consoles hardware debate kinda ended around that time.)
Nvidia arguably kicked off the modern era of multi-GPU goodness by resurrecting the letters "SLI", which it saw sewn into a jacket it took off the corpse of graphics chip pioneer 3dfx. Those letters originally stood for "scanline interleave" back in the day, which was how 3dfx Voodoo chips divvied up the work between them. Nvidia re-christened the term "scalable link interface," so named for the bridge connection between two cards, and turned SLI into a feature of multiple generations of GeForces. Since then, Nvidia has expended considerable effort working with game developers to ensure smooth compatibility and solid performance scaling for SLI configurations. These days, Nvidia often adds support for new games to its drivers weeks before the game itself ships to consumers.
AMD's answer to SLI was originally named CrossFire, but it was later updated to "CrossFireX" in order to confuse people like me. Mission accomplished! AMD hasn't always been as vigilant about providing CrossFire support for brand-new games prior to their release, but it has recently ratcheted up its efforts by breaking out CrossFire application profiles into a separate download. Those profiles can be updated more quickly and frequently than its monthly Catalyst driver drops, if needed.
Game developers are more aware of multi-GPU solutions than ever, too, and they generally have tweaked their game engines to work properly with SLI and CrossFireX. As a result, the state of multi-GPU support is pretty decent at present, particularly for games that really need the additional graphics horsepower.
Multi-card graphics solutions can make more sense inside of a desktop gaming rig than you might first think. For instance, a pair of graphics cards can use twice the area of a single, larger card for heat dissipation, making them potentially quieter, other things being equal. Two mid-range graphics cards will draw power from two different PCIe slots, which may save you the trouble of having to accommodate a card with one of those annoying eight-pin auxiliary power connectors. And these days, the second graphics card in a pair is generally pretty good about shutting down and not requiring much power or making much noise when it's not in use. Add up all of the considerations, and going with dual graphics cards might be less trouble than some of the pricey single-card alternatives.
The value equation can tilt in the direction of multiple cards, too, in certain cases. Let's have a look at some of the options we're faced with in the current market, and then we'll consider more specifically what combinations of cards might be the best candidates for a pairing.