Another productive day yesterday. In the morning, I grabbed screenshots from each of the new games I'm testing, worked up a table of specs for our new GPU test rigs and the various graphics cards, and did the basic layout for the GPU review. Then, I looked into power/noise/temperature testing.
One of the questions there is which application to use while testing the graphics cards under load. I prefer to use a real-world app like a game from our test suite, but not every game loads up the GPU (and the rest of the system) in the same way. Not only that, but some apps scale better than others with CrossFire and SLI, which also affects power use. To determine which one to use, I tried a number of the latest games with a dual-GPU video card installed in the test system. Here's a list of the games and what the typical power draw was for the entire test system while running them:
- Call of Duty: World at War — 344 W
- Dead Space — 349 W
- Far Cry 2 — 352 W
- Fallout 3 — 356 W
- Crysis Warhead — 365 W
- Left 4 Dead — 382 W
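The selection logic here is simple enough to sketch in a few lines of Python. This is just an illustration using the readings above; the dictionary of games and wattages is a hypothetical stand-in for however the numbers actually get recorded.

```python
# Hypothetical snapshot of the full-system power readings above (watts).
readings = {
    "Call of Duty: World at War": 344,
    "Dead Space": 349,
    "Far Cry 2": 352,
    "Fallout 3": 356,
    "Crysis Warhead": 365,
    "Left 4 Dead": 382,
}

# Pick the title that pushes the whole system hardest; that's the one
# to use as the worst-case load for power, noise, and temp testing.
load_app = max(readings, key=readings.get)
print(load_app)  # Left 4 Dead
```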
Now, this is just a quick sample from each game, generally taken from the level where we tested and using the same image quality and resolution settings as in our performance benchmarks. These things can vary pretty widely in different circumstances within the same game. Still, I'm not too shocked by the outcome. Left 4 Dead uses a dated game engine and relatively simple lighting and shaders, and it runs well on just about any recent GPU. Sometimes, the simplest programs do the best job of lighting up all the transistors in a chip. Code that's more difficult to execute can result in lower actual utilization. Naturally, I decided to use L4D for power, noise, and temp testing.
After sorting that out, I started testing in earnest. I measure power, noise, and temperatures in a big batch, one GPU config after the next, because I prefer to test these things under similar conditions. I can only really test when the house is absolutely quiet—no kids playing, no washing machine, dishwasher, TV, or radio making noise, nobody walking around or talking upstairs, and no HVAC system running. I generally just wait until everyone has gone to sleep and test then. Even so, testing is time-consuming because I have to wait on several things: for our Vista-based test rig to finish booting completely and wrap up its "idle tasks" before taking idle readings, for the GPU to heat up and get its fan up to full speed before taking load readings, and for any HVAC system cycles to end before taking an acoustic reading in either case. Waiting for everything to line up for 12 different GPU configs really adds up. I was up until 3AM last night finishing up.
Even then, I have some stray cards with apparent thermal/fan-control issues I'll have to spend a couple of hours dealing with today. I need to try different firmware and driver configs before simply declaring them faulty.
Working with the new sound level meter has been rewarding, though. Not only does it have a much broader range, including a lower minimum level it can measure, but it also just feels like a more precise instrument than the old one. Not sure I can describe it entirely, but the numbers coming out of it and the way it reacts to sound in the environment (even on the "slow" setting we use) simply inspire more confidence. It doesn't hurt that I'm getting minimum readings as low as 36 dB at idle for our test systems—well below the 40-ish dB minimum of the old meter—with the mic 8" from the test rig. We have some real, substantial differences between the various cards' noise levels at idle now, so it appears my new, quieter CPU cooler is helping on that front. I think the minimum dB number would be somewhat lower with a passive cooler on the video card. I also want to measure the noise floor of the room itself, but I didn't get around to that last night.
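One reason the room's noise floor matters: decibel levels combine as powers, not linearly, so once you have a floor reading you can estimate how much of a measurement is the card itself. Here's a minimal sketch of that standard correction—the 36 dB and 30 dB figures below are made-up example numbers, not actual readings.

```python
import math

def subtract_floor(measured_db: float, floor_db: float) -> float:
    """Estimate a source's own level by removing the room's noise floor.

    Sound pressure levels add in the power domain, so the floor is
    subtracted there: L = 10 * log10(10^(m/10) - 10^(f/10)).
    """
    return 10 * math.log10(10 ** (measured_db / 10) - 10 ** (floor_db / 10))

# Example with made-up numbers: a 36 dB reading over a 30 dB room floor
# works out to roughly 34.7 dB from the test rig itself.
print(round(subtract_floor(36.0, 30.0), 1))
```

Note that the correction blows up as the measurement approaches the floor—another reason a meter with a lower minimum range is handy for quiet, passively cooled cards.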
I'm sure by talking about this stuff I'm inviting some pedantic know-it-alls to swoop in and make snide comments about how I should be spending ten grand on an isolation chamber in order to test acoustics properly. Let me preemptively offer those folks the chance to go take a flying leap. The great majority of us simply wish not to be annoyed by our PCs, and I think the readings we produce offer a decent enough gauge of such things. Besides, dB readings aren't everything, and we do offer subjective impressions when a card is especially grating or particularly quiet.
Speaking of contentious issues, I've also realized at several points while putting together this latest review that some folks will probably bristle at our selection of games (I tipped my hand above) because (as some of you may have heard) Nvidia has been pushing for folks to test five of the six games we used. I understand the concern, but look, I happen to agree that those games are worthy of attention, which is why I chose them. I don't often agree with Nvidia's PR hype on any issue you could name, but it does happen from time to time. I'd also hoped to have Race Driver GRID in the mix, but it was just too much. Testing four of those games with FRAPS for 12 different cards is time-consuming enough. On the plus side for the AMD fanboys on the verge of having a coronary right now, we did use relatively new drivers, and AMD has had some time to address the (seemingly inevitable) issues it's had with some of these games.
Anyhow, now that I've stirred up a couple of hornets' nests, I'm going to spend the next hour or three sorting out thermal and acoustic issues on these problematic cards and then maybe get photography for the review underway.