To take full advantage of today's high-end graphics cards, you've got to ask a lot of 'em. That's why we conducted our testing for this review on a trio of monitors, all Dell U2410s, each with a display resolution of 1920x1200.
Together, they have a collective resolution of nearly seven megapixels, roughly 70% more pixels than the 2560x1600 30" monitor we usually use for GPU testing. The increased resolution and complexity made it fairly easy to push the limits of these multi-GPU setups. We even had to go easy on the image quality settings in some cases to maintain playable frame rates.
Most of our multi-GPU pairings were built from cards we've tested before, but our GTX 680 team had one brand-new member: Zotac's GeForce GTX 680 AMP!, a product just announced today. Obviously, that's not a stock cooler, but it is very swanky. This is an AMP! edition, so its default clock speeds are quite a bit higher than a stock GTX 680's. The base and boost frequencies are 1111MHz and 1176MHz, well above the stock 1006/1071MHz speeds. Even more impressively, perhaps, the Zotac card's memory speed is 1652MHz, up from 1502MHz stock. We suspect memory bandwidth may be an important performance limiter on the GTX 680, so the higher RAM speeds are noteworthy. Zotac is asking $549 for this card, 50 bucks above the stock GTX 680's list price.
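To put that memory overclock in perspective, a quick back-of-the-envelope calculation helps. GDDR5 transfers data at four times its command clock, and the GTX 680 has a 256-bit memory bus, so the Zotac card's 150MHz bump works out to roughly 19 GB/s of extra peak bandwidth, about a 10% gain. A minimal sketch of the arithmetic:

```python
# Back-of-the-envelope peak memory bandwidth for the GTX 680's 256-bit bus.
# GDDR5 moves data at 4x the command clock; 256 bits = 32 bytes per transfer.
BUS_WIDTH_BYTES = 256 // 8

def bandwidth_gbs(command_clock_mhz):
    """Peak bandwidth in GB/s for a given GDDR5 command clock in MHz."""
    mega_transfers_per_sec = command_clock_mhz * 4
    return mega_transfers_per_sec * BUS_WIDTH_BYTES / 1000.0

stock = bandwidth_gbs(1502)   # stock GTX 680: ~192.3 GB/s
amp   = bandwidth_gbs(1652)   # Zotac AMP!:    ~211.5 GB/s
print(f"{stock:.1f} GB/s -> {amp:.1f} GB/s (+{100 * (amp / stock - 1):.0f}%)")
```

If memory bandwidth really is the GTX 680's main bottleneck, that 10% is the more meaningful of the Zotac card's two overclocks.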
For the purposes of this review, we committed the heinous crime of dialing back the Zotac GTX 680 card's base and memory clock speeds to match the other card in the SLI pairing, which was a standard-issue GTX 680. We're worried about GPUs being out of sync, after all, and we didn't want to make matters worse with a mismatch. (We did the same with the XFX Radeon HD 7970, bringing it back to stock clocks to match the other card.) The thing is, the utilities we had on hand wouldn't let us straightforwardly control the Zotac card's boost clock, so perfect symmetry eluded us.
With the GTX 680, that is kind of the way of things, though. Nvidia expects slightly variant performance from every GTX 680 card thanks to GPU Boost, which will adjust to the particulars of a card's thermals, the individual chip's properties, and such. Two GTX 680s in SLI aren't likely to run at exactly the same speed, since the thermal conditions at one spot in a system will vary from those at another. Nvidia anticipates that the frame metering capabilities in the GK104 will keep frame delivery consistent, regardless.
Oh, and please note that we tested the Radeon HD 6990 with its "AUSUM" switch enabled, raising its clock speed and PowerTune limits. We saw no reason not to test it in that configuration, given what it is.
Our testing methods
As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we've reported the median result.
Our test systems were configured like so:
|Chipset||Intel X79 Express|
|Memory size||16GB (4 DIMMs)|
|Memory type||DDR3 SDRAM at 1600MHz|
|Memory timings||9-9-11-24 1T|
|Chipset drivers||INF update with Rapid Storage Technology Enterprise 126.96.36.19920|
|Audio||Integrated, with Realtek 188.8.131.5226 drivers|
|Hard drive||Corsair F240 240GB SATA|
|Power supply||Corsair AX850|
|OS||Windows 7 Ultimate x64 Edition, Service Pack 1, DirectX 11 June 2010 Update|
|Graphics card||Driver||GPU clock (MHz)||Memory clock (MHz)||Memory size (MB)|
|GeForce GTX 590||ForceWare 301.24||608||854||3072|
|GeForce GTX 680||ForceWare 301.33||1006||1502||2048|
|GeForce GTX 680 + Zotac GTX 680||ForceWare 301.33||1006||1502||2048|
|GeForce GTX 690||ForceWare 301.33||915||1502||4096|
|Radeon HD 6990 AUSUM||Catalyst 12.4 + 12.3 CAP 1||880||1250||4096|
|Radeon HD 7970||Catalyst 12.4 + 12.3 CAP 1||925||1375||3072|
|Radeon HD 7970 + XFX HD 7970||Catalyst 12.4 + 12.3 CAP 1||925||1375||3072|
Thanks to Intel, Corsair, and Gigabyte for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.
Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.
We used the following test applications:
Some further notes on our methods:
We used the Fraps utility to record frame rates while playing a 90-second sequence from the game. Although capturing frame rates while playing isn't precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card in order to counteract any variability. We've included frame-by-frame results from Fraps for each game, and in those plots, you're seeing the results from a single, representative pass through the test sequence.
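For the curious, reducing those Fraps captures to a single reported number is simple in principle: Fraps' frametimes log amounts to a list of cumulative per-frame timestamps in milliseconds, average FPS for a run is frames divided by elapsed time, and we report the median across the repeated runs. Here's a rough sketch of that reduction, with invented sample data standing in for real log files:

```python
import statistics

def fps_from_timestamps(timestamps_ms):
    """Average FPS for one run, given cumulative per-frame timestamps (ms)
    like those in a Fraps frametimes log."""
    frames = len(timestamps_ms) - 1
    return frames * 1000.0 / (timestamps_ms[-1] - timestamps_ms[0])

def median_of_runs(runs):
    """Collapse repeated runs into the single median result we report."""
    return statistics.median(fps_from_timestamps(run) for run in runs)

# Invented sample data: three short runs with evenly spaced frames.
runs = [
    [i * 16.0 for i in range(6)],  # 62.5 FPS
    [i * 20.0 for i in range(6)],  # 50.0 FPS
    [i * 25.0 for i in range(6)],  # 40.0 FPS
]
print(median_of_runs(runs))  # prints 50.0
```

The frame-by-frame plots, by contrast, show the raw per-frame times from one representative pass rather than any such average.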
We measured total system power consumption at the wall socket using a Yokogawa WT210 digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.
The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Skyrim at its Ultra quality settings with FXAA enabled.
We measured noise levels on our test system, sitting on an open test bench, using an Extech 407738 digital sound level meter. The meter was mounted on a tripod approximately 10" from the test system at a height even with the top of the video card.
You can think of these noise level measurements much like our system power consumption tests, because we measured the noise of the entire system. In the real world, of course, noise levels will vary greatly with the acoustic properties of the PC enclosure, whether that enclosure provides enough cooling to avoid a card's highest fan speeds, the enclosure's placement in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.
We used GPU-Z to log GPU temperatures during our load testing.
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.