Today’s graphics cards: the value angle

The last time we attempted to quantify the value propositions of a large cross-section of competing products, we concentrated on microprocessors. Ever since then, we’ve wanted to explore the same concept with graphics cards. Thanks to our latest round of graphics card reviews, which culminated with the massive GeForce 9 series multi-GPU extravaganza last month, we’ve ended up with enough benchmark data to paint a fairly complete picture of today’s mid- to high-end GPU market.

Armed with this information, we’ve taken another crack at quantifying value, this time by looking at what sort of GPU power you get for your dollar. The results are interesting, if nothing else. Read on to see what we found.

Quantifying GPU value

This is an iffy exercise, this attempt to quantify a value proposition that often involves considerations one can’t easily boil down to numbers. We have no illusions that this little thought experiment will yield the only or best information you need in order to decide which graphics card to buy. Our graphics card reviews include a mix of subjective impressions and objective test data, and they’re probably your best starting point in these matters.

Heck, even if you want to go purely for a “raw value” approach, you may be better served by looking at one of our recent reviews, like the aforementioned GeForce 9 multi-GPU extravaganza. You could conceivably just hunt for the cheapest card that achieves playable frame rates at (or near) your monitor’s native resolution and call it a day. There’s nothing inherently wrong with that, and it’s certainly a very value-oriented approach. But it won’t get you anything much in the way of future-proofing, and it won’t tell you how much graphics processing capability you’re getting for your money.

For this article, we wanted to focus more closely on the question of GPU power per dollar. To make that work, we’ve relied on test data from (as much as possible) clearly GPU-limited scenarios—performance results obtained at very high resolutions (most often 2560 x 1600) with some level of antialiasing and anisotropic filtering enabled. That way, we can highlight questions of performance scaling. We can see how the cut-down shader power and memory capacities of the less expensive cards impact performance and observe how much multi-GPU solutions can distance themselves from their single-GPU brethren.

Obviously, concentrating on GPU-limited scenarios like these sidesteps the question of “playable frame rates” in today’s games, as we’ve noted. Vast differences in GPU power may not be readily apparent at 1680 x 1050 in currently popular titles, but they may become very important in the future as game developers ratchet up the shader effects and scene complexity, packing more richness into each pixel. Our hope is that we can help you make a more informed evaluation of the value proposition, one that considers how your video card might serve you over its entire lifespan.

Our test subjects

Anyway, let’s have a look at our subjects. We’ve included results for 20 GPU configurations in total, nine of which are single cards…

Single card                    GPUs    Price
Radeon HD 3850                 1       $154.99
GeForce 9600 GT                1       $159.99
Radeon HD 3870                 1       $164.99
GeForce 8800 GT                1       $189.99
GeForce 8800 GTS 512           1       $254.99
GeForce 9800 GTX               1       $309.99
Radeon HD 3870 X2              2       $389.99
GeForce 8800 Ultra             1       $519.99
GeForce 9800 GX2               2       $549.99

…and 11 are multi-card setups:

Multi-card setup               GPUs    Price
Radeon HD 3850 CrossFire       2       $309.98
GeForce 9600 GT SLI            2       $319.98
Radeon HD 3870 CrossFire       2       $329.98
GeForce 8800 GT SLI            2       $379.98
Radeon HD 3870 X2 + 3870       3       $554.98
GeForce 9800 GTX SLI           2       $619.98
Radeon HD 3870 X2 CrossFire    4       $779.98
GeForce 9800 GTX SLI           3       $929.97
GeForce 8800 Ultra SLI         2       $1,039.98
GeForce 9800 GX2 SLI           4       $1,099.98
GeForce 8800 Ultra SLI         3       $1,559.97

You might wonder where we sourced the prices listed above. In our CPU value article last year, we simply took bulk prices directly from AMD’s and Intel’s respective websites. Real-world CPU prices fluctuate and don’t necessarily reflect bulk pricing, but the two are often pretty close, and bulk prices have the benefit of coming from “official” sources.

Graphics card pricing is a very different matter. AMD’s and Nvidia’s official launch prices for new GPUs are only loose guidelines, and they often cover a range of possible prices—for instance, “$199-249” for the GeForce 8800 GT. Cards usually drift from their launch price ranges after a while, too, so we were left with no option but to nab prices from online retailers. In the end, we relied on a mix of prices from Newegg and our price search engine. Overall, we believe the figures we picked should do a good job of representing typical real-world pricing.

We ran into a little snag while collecting prices for the Radeon HD 3850 and GeForce 9600 GT, however, since the models we tested are both significantly marked-up “overclocked in the box” models. To avoid skewing our data too much, we looked for cheaper cards with similar clock speeds. We selected a GeForce 9600 GT with 700MHz core and 950MHz memory speeds to be our price reference for the 700MHz/1000MHz card we tested, and we picked a 720MHz/910MHz Radeon HD 3850 to do the same for our 725MHz/900MHz test model. Those slight clock speed differences shouldn’t impact performance too substantially, and the lower prices are better indicators of what you can expect to pay for a Radeon HD 3850 or GeForce 9600 GT these days.

Presenting the data

We have our reasoning, our performance results, and our prices. Here’s how we sliced and diced the data to give us some insight.

We carried over two tools from our CPU value comparo. One is the simple “performance per dollar” chart, which, in this case, ranks cards based on how many frames per second $10 will buy. The exact formula is: (frames per second / price) × 10. We relied on our good friend Alexander Hamilton rather than his buddy George Washington for this task, because ranking cards based on FPS per dollar yields hard-to-read numbers with too many leading zeros.
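For the curious, here’s a quick sketch of that calculation in Python. The card names, prices, and frame rates below are illustrative placeholders, not our measured results:

```python
# Performance per $10: (frames per second / price) * 10.
# Prices and FPS figures here are hypothetical, for illustration only.
cards = {
    "Card A": {"price": 159.99, "fps": 48.0},
    "Card B": {"price": 309.99, "fps": 71.0},
}

def fps_per_ten_dollars(fps, price):
    """Frames per second bought by each $10 spent."""
    return fps / price * 10

# Rank the cards from best to worst value.
ranked = sorted(cards.items(),
                key=lambda kv: fps_per_ten_dollars(kv[1]["fps"], kv[1]["price"]),
                reverse=True)
for name, data in ranked:
    value = fps_per_ten_dollars(data["fps"], data["price"])
    print(f"{name}: {value:.2f} FPS per $10")
```

Multiplying by ten simply shifts the decimal point so the rankings are easier to read at a glance.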

Ranking performance per dollar doesn’t give us the whole picture, though—far from it. As you’ll see in the next few pages, the card that ranks highest in the performance-per-dollar charts is usually the cheapest (and thus the slowest), often followed by slightly faster but disproportionately pricier competitors. To get a look at which cards lie in the proverbial “sweet spot” of price and performance, we used scatter plots like this one:

These scatter plots may look a little intimidating, but they’re really quite simple. The best possible offering would be in the top left corner, offering the highest frame rate for $0, while the worst possible one would be at the bottom right, yielding 0 FPS for the most money. We obviously never get such extreme cases, but the best offerings still typically lie closer to the top left corner of the plot. In the example above, the GeForce 9600 GT looks to be a better deal than either the Radeon HD 3870 or the Radeon HD 3850. The GeForce 9600 GT SLI config puts in a very good showing, as well. Conversely, the GeForce 9800 GTX three-way SLI setup is a poor value.
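Reading a scatter plot this way amounts to finding the configs that no other config beats on both axes at once—cheaper and faster. A minimal sketch of that idea, using made-up price/FPS pairs rather than our actual measurements:

```python
def undominated(cards):
    """Return the configs for which no other config is both cheaper
    (or equally priced) and faster. These are the points that sit
    toward the top-left of a price/FPS scatter plot."""
    keep = []
    for name, price, fps in cards:
        dominated = any(p <= price and f >= fps and (p, f) != (price, fps)
                        for _, p, f in cards)
        if not dominated:
            keep.append(name)
    return keep

# Illustrative numbers only, not the article's measured results.
sample = [
    ("9600 GT",          159.99, 45.0),
    ("HD 3870",          164.99, 41.0),
    ("9600 GT SLI",      319.98, 80.0),
    ("9800 GTX 3-way",   929.97, 95.0),
]
print(undominated(sample))
```

In this toy data, the hypothetical HD 3870 drops out because the 9600 GT is both cheaper and faster—exactly the kind of relationship the plots make visible at a glance.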

Now that that’s (hopefully) clear, let’s move on to the benchmarks.

Test notes

As we said on the previous page, our GeForce 9600 GT and Radeon HD 3850 were both “factory overclocked,” so their performance is a little higher than that of stock-clocked cards bearing the same name. All of our other cards run at their stock speeds, except for the GeForce 8800 GTS 512MB, whose “factory overclocked” speeds are fairly typical for models in its price range.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and the results were averaged.

Our test systems were configured like so:

(Where values are separated by slashes, they correspond to the Gigabyte X38, nForce 680i SLI, and nForce 780i SLI test systems, respectively.)

Processor: Core 2 Extreme X6800 2.93GHz (all systems)
System bus: 1066MHz (266MHz quad-pumped)
Motherboard: Gigabyte GA-X38-DQ6 / XFX nForce 680i SLI / EVGA nForce 780i SLI
BIOS revision: F7 / P31 / P03
North bridge: X38 MCH / nForce 680i SLI SPP / nForce 780i SLI SPP
South bridge: ICH9R / nForce 680i SLI MCP / nForce 780i SLI MCP
Chipset drivers: INF update 8.3.1.1009 with Matrix Storage Manager 7.8 / ForceWare 15.08 / ForceWare 9.64
Memory size: 4GB (4 DIMMs)
Memory type: 2 x Corsair TWIN2X20488500C5D DDR2 SDRAM at 800MHz
CAS latency (CL): 4
RAS to CAS delay (tRCD): 4
RAS precharge (tRP): 4
Cycle time (tRAS): 18
Command rate: 2T
Audio: Integrated ICH9R/ALC889A, nForce 680i SLI/ALC850, or nForce 780i SLI/ALC885, all with RealTek 6.0.1.5497 drivers
Hard drive: WD Caviar SE16 320GB SATA
OS: Windows Vista Ultimate x86 Edition
OS updates: KB936710, KB938194, KB938979, KB940105, KB945149, DirectX November 2007 Update

Graphics cards tested:

Diamond Radeon HD 3850 512MB PCIe (single and dual) with Catalyst 8.2 drivers
Radeon HD 3870 512MB PCIe (single and dual) with Catalyst 8.2 drivers
Radeon HD 3870 X2 1GB PCIe with Catalyst 8.2 drivers
Dual Radeon HD 3870 X2 1GB PCIe with Catalyst 8.3 drivers
Radeon HD 3870 X2 1GB PCIe + Radeon HD 3870 512MB PCIe with Catalyst 8.3 drivers
Palit GeForce 9600 GT 512MB PCIe (single and dual) with ForceWare 174.12 drivers
GeForce 8800 GT 512MB PCIe (single and dual) with ForceWare 169.28 drivers
EVGA GeForce 8800 GTS 512MB PCIe with ForceWare 169.28 drivers
GeForce 8800 Ultra 768MB PCIe (single, dual, and triple) with ForceWare 169.28 drivers
GeForce 9800 GTX 512MB PCIe (dual and triple) with ForceWare 174.53 drivers
GeForce 9800 GX2 1GB PCIe (single and dual) with ForceWare 174.53 drivers

Please note that we tested the single and dual-GPU Radeon configs with the Catalyst 8.2 drivers, simply because we didn’t have enough time to re-test everything with Cat 8.3. The one exception is Crysis, where we tested single- and dual-GPU Radeons with AMD’s 8.451-2-080123a drivers, which include many of the same application-specific tweaks that the final Catalyst 8.3 drivers do.

Thanks to Corsair for providing us with memory for our testing. Their quality, service, and support easily surpass those of no-name DIMMs.

Most of our test systems were powered by PC Power & Cooling Silencer 750W power supply units. The Silencer 750W was a runaway Editor’s Choice winner in our epic 11-way power supply roundup, so it seemed like a fitting choice for our test rigs. The three- and four-way SLI systems required a larger PSU, so we used a PC Power & Cooling Turbo-Cool 1200 for those systems. Thanks to OCZ for providing these units for our use in testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following versions of our test applications:

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

A starting point: Fill rate

Since we’re interested in GPU horsepower, a quick peek at our results for 3DMark06’s multitextured fill rate test seems in order. The number of texels (a.k.a. texture elements, or multi-textured pixels) a given GPU can crank out per second is a long-standing key performance metric. Shader power is becoming more important, but it’s more difficult to test well; we’re not convinced any one of 3DMark06’s shader tests is a good overall measure of shader power. This quick multitextured fill rate benchmark should give us a decent indication of each configuration’s texture throughput.

These numbers aren’t always good predictors of actual in-game performance, but they do give us a sense of this one important variable. The GeForce 8800 GT lands at the top of our texel throughput value rankings in both its single and dual-card configurations.

The distribution in the scatter plot highlights some realities about current GPU architectures. AMD’s RV670 GPU powers all current Radeons, and its texture filtering capacity is relatively modest. As a result, the red points on the plot don’t extend too far up the Y axis, even in three- and four-GPU configurations, where they’re reaching beyond $500. On the Nvidia side of things, the newer G92 GPU packs quite a bit more texture filtering punch than the older G80 chip in the GeForce 8800 Ultra. The three- and four-way G92 configs are just sick, although you’ll pay dearly for them. The GeForce 9600 GT, meanwhile, is based on the G94 GPU, which has roughly half the texture filtering and shader power of the G92. That blunts some of this card’s otherwise-potent value appeal.

Call of Duty 4: Modern Warfare

We tested Call of Duty 4 by recording a custom demo of a multiplayer gaming session and playing it back using the game’s timedemo capability. We enabled 4X antialiasing and 16X anisotropic filtering and turned up the game’s texture and image quality settings to their limits.

Nvidia’s GeForce 9600 GT and GeForce 8800 GT reign supreme in our performance-per-dollar rankings for Call of Duty 4, and our scatter plot shows they’re far and away better deals than anything in the same price range.

Interestingly, the plot also shows an almost-linear progression of prices and frame rates; we don’t really see a cut-off point after which prices spike and performance stagnates, as we did in our CPU charts.

Looking past the GeForce 9600 GT and 8800 GT, AMD’s Radeon HD 3850 and 3870 CrossFire configurations seem to offer the best performance in the $300-400 range, followed by Nvidia’s GeForce 9800 GX2 in the $500-600 range. We should note that the SLI configurations with the GeForce 9600 GT and 8800 GT perform abnormally poorly in Call of Duty 4 at this particular resolution, a problem Nvidia could solve in future drivers. If that happens, we’d wager that dual 9600 GTs or 8800 GTs would be a better value proposition than the 3850 and 3870 CrossFire offerings—and possibly than the GeForce 9800 GX2.

Enemy Territory: Quake Wars

We tested this game with 4X antialiasing and 16X anisotropic filtering enabled, along with “high” settings for all of the game’s quality options except “Shader level,” which was set to “Ultra.” We left the diffuse, bump, and specular texture quality settings at their default levels, though. Shadows, soft particles, and smooth foliage were enabled. Again, we used a custom timedemo.

We’ve excluded the three- and four-way CrossFire X configs here since they don’t support OpenGL-based games like this one.

The GeForce 9600 GT and 8800 GT SLI configurations don’t run into any roadblocks in Quake Wars. The 9600 GT SLI setup is second in our performance-per-dollar chart, and our scatter plot confirms its attractive positioning. For about the same price as the GeForce 9800 GTX, the 9600 GT SLI config yields a 50% higher frame rate, crushing rival AMD CrossFire setups in the process.

Unlike in Call of Duty 4, we actually have somewhat of a value cut-off point here, too. Beyond the 9600 GT SLI setup, you have to pay almost twice as much for a relatively minor frame rate gain (with the GeForce 9800 GX2 and two-way GeForce 9800 GTX), and almost three times as much for a significant one (with three-way 9800 GTXs).

Of course, running an SLI setup has noteworthy downsides, primarily in terms of space taken, power consumed, and noise produced. Those looking for the best single-GPU value can’t go wrong with either a GeForce 9600 GT or a GeForce 8800 GT. The latter’s greater shader power might make it more worthwhile in future games, but we’re not seeing a large gap between the two even at these very stressful settings.

Half-Life 2: Episode Two

We used a custom-recorded timedemo for this game, as well. We tested Episode Two with the in-game image quality options cranked, with 4X AA and 16X anisotropic filtering. HDR lighting and motion blur were both enabled.

Our results in Half-Life 2: Episode Two mirror those from Quake Wars. The GeForce 9600 GT, 9600 GT SLI, and 8800 GT take the top three spots in the performance-per-dollar chart, and our scatter plot shows dual GeForce 9600 GTs in a very favorable spot.

In the single-GPU lane, the GeForce 9600 GT and 8800 GT take the cake again. The pricier single-GPU offerings don’t compare well when you can get a much faster SLI config for roughly the same price.

Crysis

We tested Crysis in the “Recovery” level, early in the game, using our standard FRAPS testing procedure (five sessions of 60 seconds each). The area where we tested included some forest, a village, a roadside, and some water—a good mix of the game’s usual environments.

Because FRAPS testing is time-consuming and Crysis has much higher system requirements than other titles, we tested lower-end cards at 1680 x 1050 and higher-end models at 1920 x 1200. You’ll find G92 SLI and CrossFire X configs in both sets of results.

At 1680 x 1050, the GeForce 9600 GT and Radeon HD 3870 are almost tied on the value scale in Crysis. That said, our scatter plot makes the GeForce 8800 GT look better than either, considering its higher performance and only marginally higher price. Moving up to the $300-400 range, the 9600 GT SLI setup looks like the best overall deal yet again—a perhaps surprising result, considering Crysis’ apparently heavy use of shaders.

At 1920 x 1200, we tested only two single-GPU offerings: the GeForce 9800 GTX and the GeForce 8800 Ultra. The 9800 GTX is hands-down the best overall deal according to our performance-per-dollar chart, with the next step up—Nvidia’s own GeForce 9800 GX2—costing almost twice as much for a 27% increase in performance. We didn’t have the opportunity to test 9600 GT and 8800 GT SLI setups at this resolution, though, so it’s difficult to draw a definite conclusion.

Unreal Tournament 3

We tested UT3 by playing a deathmatch against some bots and recording frame rates during 60-second gameplay sessions using FRAPS. This method has the advantage of duplicating real gameplay, but it comes at the expense of precise repeatability. We believe five sample sessions are sufficient to get reasonably consistent and trustworthy results. In addition to average frame rates, we’ve included the low frame rates, because those tend to reflect the user experience in performance-critical situations. In order to diminish the effect of outliers, we’ve reported the median of the five low frame rates we encountered.

Because UT3 doesn’t natively support multisampled antialiasing, we tested without AA. Instead, we turned up the game’s quality sliders to the max and disabled the game’s frame rate cap.

Epic’s latest multiplayer shooter paints a picture similar to Quake Wars and Half-Life 2: Episode Two, in that the GeForce 9600 GT and GeForce 9600 GT SLI setups are the two best-positioned offerings, both according to our value chart and our scatter plot. We should nonetheless point out that the Radeon HD 3870 CrossFire config sits close to the dual 9600 GTs.

Above the $300-400 range, the next step up looks to be the three-way CrossFire X configuration with the Radeon HD 3870 X2 and Radeon HD 3870. This dual-card, three-GPU setup produces the highest frame rate we’ve seen in UT3 so far, and it costs almost the same as a much slower, single-GPU GeForce 8800 Ultra.

Average performance

To make our data a little easier to digest, we’ve rolled numbers from Call of Duty 4, Enemy Territory: Quake Wars, Half-Life 2: Episode Two, and Unreal Tournament 3 into a single, “average” set of results. Since CrossFire X doesn’t yet support OpenGL games like Quake Wars, we’ve left the three- and four-way Radeon configs out of this comparison. We’ve also omitted Crysis since we didn’t test all cards at a single resolution in it.
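The rollup itself is straightforward. Assuming a simple arithmetic mean of frame rates across the four games (the exact weighting is our assumption, and all numbers below are placeholders rather than our measured results), it might look like this:

```python
# Rolling per-game results into a single "average" figure per config.
# We assume an unweighted arithmetic mean across the four games;
# the FPS values here are hypothetical, for illustration only.
results = {
    "9600 GT": {"CoD4": 42.0, "ETQW": 38.0, "HL2:Ep2": 55.0, "UT3": 60.0},
    "8800 GT": {"CoD4": 48.0, "ETQW": 43.0, "HL2:Ep2": 61.0, "UT3": 66.0},
}

def average_fps(per_game):
    """Mean frame rate across all tested games for one config."""
    return sum(per_game.values()) / len(per_game)

for config, per_game in results.items():
    print(config, round(average_fps(per_game), 1))
```

Configs missing results in any game (such as the CrossFire X setups in Quake Wars) simply get left out rather than averaged over fewer titles, which keeps the comparison apples-to-apples.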

Boiling down our previous results into one graph and one plot clarifies things for us nicely. As we’ve been saying all along, the GeForce 9600 GT and the GeForce 8800 GT are the best-positioned single-GPU offerings, and the GeForce 9600 GT SLI setup is the next best step up—even with its lackluster numbers from our Quake Wars test lowering its average slightly. Among the Radeons, the HD 3850 CrossFire solution deserves particular distinction; its performance nearly matches that of the Radeon HD 3870 X2, at a much lower price.

Overall, the almost linear relationship we initially noted between price and performance remains largely intact, with some notable exceptions at the high end of the market. Any config involving a GeForce 8800 Ultra is a relatively poor value with the advent of newer G92-based cards. Things are a little rosier for the GeForce 9800 GTX-based three-way SLI, whose outstanding performance almost (well, kinda) tracks with its outrageous price. Meanwhile, the 9800 GX2-based quad SLI setup has the unique one-two combo of being slower than the 9800 GTX three-way rig while costing considerably more.

Conclusions

Over the past few pages, we’ve seen three things. The first is that scatter plots are really cool. Seriously, just look at ’em. Organizing this type of data could hardly get any better.

Our second observation is that two GeForces, the 9600 GT and 8800 GT, easily offer the best overall value in the single-GPU world. The 9600 GT is more attractive now, since it offers performance almost equal to the 8800 GT for a few bucks less. However, we suspect the GeForce 8800 GT could become more interesting in the months to come, should games start harnessing more of its additional shader processors and texture filtering capacity.

Finally, perhaps our most unexpected observation is that multi-GPU setups have the potential to deliver solid value. Mid-range cards like the GeForce 9600 GT and Radeon HD 3850 offer strong value propositions, and that effect is multiplied by pairing two of them together. Two 9600 GTs can be faster than a single GeForce 8800 Ultra, despite the fact that they cost substantially less. Similarly, two Radeon HD 3850s are a better deal than a single Radeon HD 3870 X2, if your motherboard can accommodate them. Of course, SLI and CrossFire bring with them a whole stable of caveats involving chipset compatibility, multi-monitor support, and the need for driver profiles. High-end multi-GPU configs can add additional expense in the form of higher PSU and cooling requirements, as well. But with both AMD and Nvidia now offering high-end cards with dual GPUs onboard, multi-GPU looks like it’s here to stay.
