When shopping for computer hardware, consumers tend to favor bigger numbers. I don’t blame them, either. The average consumer knows about as much about hardware as I know about needlepoint, which is to say very little. Most, I would suspect, have no idea exactly what a megabyte is, let alone a gigahertz. But they can count, so when faced with the choice between 2GHz or 2.5GHz, they’re going to go with the higher number. Because it’s better.
Except when it’s not, as was the case with the Pentium 4. Intel architected the P4 to scale to higher clock speeds than we’d ever seen before, birthing a megahertz myth that conveniently exploited consumer ignorance. Why would Joe Sixpack buy an Athlon at a mere 1.4GHz when he could get a Pentium 4 at 2GHz? Enthusiasts knew the score, but for years, mainstream consumers were easily persuaded, if they hadn’t assumed already, that the Pentium 4 was a far better processor because it had a higher clock speed.
More recently, we’ve seen a much smaller but no less absurd memory myth take hold in the graphics card industry. Budget cards equipped with ridiculous amounts of memory are the culprit here. For enthusiasts, a gig of memory on a sub-$100 graphics card makes about as much sense as putting a spoiler on a Yugo. Budget GPUs lack the horsepower necessary to run games at the kinds of resolutions and detail levels that would require such a copious amount of video memory. But what about the latest crop of mid-range graphics cards? Nvidia’s GeForce 8800 GT has considerable pixel-pushing power on its own, and when teamed in SLI, that power is effectively doubled. Perhaps a gigabyte of memory on this class of card isn’t so unreasonable.
Conveniently, derivatives of the GeForce 8800 GT are available with 256MB, 512MB, or 1GB of memory, making it easy for us to probe the impact of graphics memory size on performance. We’ve tested a collection of single cards and SLI configurations in a selection of new games, across multiple resolutions, to see where memory size matters, if it does at all. Keep reading for the enlightening results.
Bigger than yours
When Nvidia first introduced the GeForce 8800 GT, the card came equipped with 512MB of memory. Shortly thereafter, in response to AMD’s launch of the Radeon HD 3850 256MB, the green team filled out the low end of the 8800 GT lineup with a cheaper 256MB variant. The 8800 GT’s memory size didn’t scale upward until graphics board makers started fiddling with the design on their own. Palit put 1GB of memory on its 8800GT Super+, releasing it to market alongside some of the most subversively phallic promotional shirts we’ve ever seen.
To the casual observer, the Super+ doesn’t look all that dissimilar to other 8800 GT cards. Sure, it has a custom dual-slot cooler, and even three-phase power for the GPU, but nothing really screams out that this card packs twice the memory of your average GT, that is, until you turn it over.
GeForce 8800 GT cards with 512MB of memory have no problems squeezing the RAM chips on one side of the card. However, to accommodate 1GB of memory, the Super+ fills out both sides. The memory chips on the underside of the card normally lurk behind a bright blue heatspreader, but they’ve agreed to come out just this once for a picture.
Before getting started, we should probably take a moment to frame the issue at hand. Today we’re looking for two things: whether memory size has a tangible impact on in-game performance, and if it does, whether that impact comes at resolutions and detail levels that offer playable frame rates. We’re not interested in comparing the performance of one slideshow to another.
With the obvious exception of Crysis, we were actually able to get decent frame rates in all of our games with their highest in-game detail levels, and with antialiasing and anisotropic filtering to boot. That makes resolution the most obvious candidate for scaling. I should apologize in advance for not having one of those swanky 30″ monitors that goes up to 2560×1600, a fact that pains me on an almost daily basis. The best my test systems can do is 1920×1440 on an old-school CRT, which is still a higher pixel count than common 24″ displays offer at 1920×1200. If you can afford a 30″ display, chances are you can do better than a GeForce 8800 GT, anyway.
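To put some rough numbers behind the resolution scaling, here’s a quick back-of-the-envelope sketch of how much memory the render targets alone consume at our test resolutions. The model is a deliberate simplification of my own, not how Nvidia’s driver actually allocates memory: a 32-bit color buffer and a 32-bit depth/stencil buffer, both multiplied by the multisample count, plus a 32-bit front buffer. Textures, geometry, and driver overhead all pile on top of this.

```python
# Back-of-the-envelope render-target memory at each test resolution.
# Simplified model (my assumption, not the driver's real allocation):
# color + depth/stencil at 4 bytes/pixel each, multiplied by the
# multisample count, plus a 4-byte/pixel front buffer.

RESOLUTIONS = [(1280, 1024), (1600, 1200), (1920, 1440)]
MSAA_SAMPLES = 4   # we test most games with 4X antialiasing
BYTES_COLOR = 4    # 32-bit color
BYTES_DEPTH = 4    # 24-bit depth + 8-bit stencil

def render_target_mb(width, height, samples):
    pixels = width * height
    back = pixels * (BYTES_COLOR + BYTES_DEPTH) * samples
    front = pixels * BYTES_COLOR
    return (back + front) / (1024 ** 2)

for w, h in RESOLUTIONS:
    print(f"{w}x{h}: {w * h / 1e6:.2f} Mpixels, "
          f"~{render_target_mb(w, h, MSAA_SAMPLES):.0f}MB of render targets")
```

Even this crude estimate approaches 100MB at 1920×1440 with 4X antialiasing, a sizable chunk of a 256MB card before a single texture is loaded.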
We’ve tested single-card GeForce 8800 GT configurations with 256MB, 512MB, and 1GB of memory. The 512MB and 1GB cards were also tested in SLI. Since some of the cards we used are “factory overclocked,” we used nTune to normalize core, shader, and memory clocks to the GeForce 8800 GT’s reference speeds of 600MHz, 1.5GHz, and an effective 1.8GHz, respectively.
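Normalizing the clocks also pins every card to the same theoretical peak memory bandwidth. The arithmetic is simple enough, assuming the 8800 GT’s 256-bit memory interface:

```python
# Peak theoretical memory bandwidth at reference clocks. "Effective
# 1.8GHz" means 900MHz GDDR3 transferring data on both clock edges,
# and the 8800 GT pairs that with a 256-bit memory interface.

EFFECTIVE_TRANSFER_RATE = 1.8e9    # transfers per second
BUS_WIDTH_BYTES = 256 // 8         # 256 bits = 32 bytes per transfer

bandwidth = EFFECTIVE_TRANSFER_RATE * BUS_WIDTH_BYTES / 1e9
print(f"{bandwidth:.1f} GB/s")     # 57.6 GB/s for every card tested
```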
Our testing methods
All tests were run three times, and their results were averaged.
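Nothing fancy hides behind that averaging, just an arithmetic mean of the three runs, as in this trivial sketch (the frame rates below are made-up placeholders, not measured results):

```python
from statistics import mean

# Each test is run three times and the results averaged. These frame
# rates are made-up placeholders for illustration, not measured data.
runs_fps = [61.2, 60.8, 61.5]
print(f"{mean(runs_fps):.1f} fps average over {len(runs_fps)} runs")
```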
Processor: Core 2 Duo E6750 2.66GHz
Motherboard: nForce 780i SLI
Chipset: nForce 780i SLI
Chipset drivers: ForceWare 9.46
Memory size: 2GB (2 DIMMs)
Memory type: DDR2 SDRAM
Audio: Integrated audio with Realtek 1.86 drivers
Graphics: XFX GeForce 8800 GT Alpha Dog 256MB, Gigabyte GV-NX88T512HP 512MB, Palit 8800GT Super+ 1GB
Hard drive: Western Digital Caviar RE2 400GB SATA
OS: Windows Vista Ultimate x86 with KB938194, KB938979, and KB940105 updates
We used the following versions of our test applications:
The test systems’ Windows desktop was set at 1280×1024 in 32-bit color at a 60Hz screen refresh rate. Vertical refresh sync (vsync) was disabled for all tests.
All the tests and methods we employed are publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
Call of Duty 4: Modern Warfare
We tested Call of Duty 4 by recording a custom demo of a multiplayer gaming session and playing it back using the game’s timedemo capability. We cranked the in-game detail levels, including texture filtering, and set antialiasing to 4X.
Our Call of Duty 4 test teases out a tangible difference in performance depending on graphics memory size, but it’s not between configurations with 512MB and 1GB. Instead, it’s the 256MB configuration running behind the curve. At a relatively modest 1280×1024, the 256MB card lags behind by a little more than 10 frames per second. That gap only grows as we crank the resolution, with the 256MB card dropping well below a playable frame rate threshold at 1920×1440.
There’s essentially no difference in performance between our 512MB and 1GB cards here, even when they’re paired in SLI. As one might expect, overall performance drops as we scale the resolution up, but it does so at a slower rate than with the 256MB card.
Crysis
Crysis is easily the most demanding PC game on the market, but we were able to get reasonably playable frame rates with the game’s high-quality detail settings. We started at a relatively modest 1024×768 display resolution without antialiasing and scaled up from there. The scores below come from a custom timedemo recorded in the game’s first level.
Again, 256MB of memory proves to be a clear handicap for the GeForce 8800 GT. Even at 1024×768, you’re looking at a significant drop in performance. Things only get worse as the resolution goes up.
Between our single-card 512MB and 1GB configurations, we see no meaningful difference in performance up to 1600×1200. At that resolution, we’re under 25 frames per second, which is a little choppy to be considered playable, at least for a first-person shooter.
Our SLI configurations provide a little more drama, as the 1GB cards inexplicably deliver lower frame rates than their 512MB counterparts at 1280×1024. We re-ran the tests numerous times, even after multiple reboots, but got the same results. Things return to normal at 1600×1200, where only one frame per second separates our 512MB and 1GB SLI configurations, with the latter holding the slight lead.
Enemy Territory: Quake Wars
We tested Quake Wars with its highest in-game detail level settings and both 4X antialiasing and 16X aniso. We used the game’s timedemo functionality with a custom-recorded demo. Unfortunately, our 256MB card repeatedly crashed when running the game at 1920×1440, so we don’t have results for it at that resolution.
Even 256MB of graphics memory looks adequate for Quake Wars, at least until you want to run at resolutions higher than 1600×1200. Our 512MB and 1GB cards are a little faster, but not by nearly the margins we saw in CoD 4 and Crysis.
It’s no surprise, then, that there’s essentially no difference in performance between our 512MB and 1GB cards. There’s a bit of a gap when we start pairing cards in SLI, but surprisingly, it’s the 512MB configuration that comes out on top by a few frames per second.
Half-Life 2: Episode 2
Episode 2 brings higher quality textures than previous versions of Half-Life 2, and we were able to run the game with all its detail level settings maxed in addition to 4X antialiasing and 16X aniso. We used a custom demo in conjunction with the Source engine’s timedemo functionality.
Yet again, we found no difference in performance between GeForce 8800 GT configurations with 512MB and 1GB of memory. Not even SLI could coax a meaningful margin between those memory sizes at the resolutions we used for testing. It is worth noting, however, that as with Quake Wars, this system can probably run Episode 2 with playable frame rates at resolutions higher than 1920×1440.
Our 256MB 8800 GT admirably hangs on at 1280×1024, nearly matching the performance offered by 512MB and 1GB cards. This victory is short-lived, though; the 256MB card stumbles at 1600×1200 and drops well below the playable threshold at 1920×1440.
Conclusions
If you’re looking at running a single GeForce 8800 GT, the card’s default 512MB memory size is easily the best choice. Doubling the onboard memory to 1GB may make for interesting marketing, but it doesn’t improve performance a lick with the games and resolutions we tested. What’s more, a single 8800 GT runs out of steam at 1920×1440, if not at lower resolutions, indicating that the higher resolutions that might benefit from additional video memory wouldn’t yield playable frame rates with single-card configs.
The benefits of 1GB of video memory are also a bust for GeForce 8800 GT SLI configurations, at least at resolutions up to 1920×1440. However, we’ve seen the 8800 GT 512MB in SLI deliver playable frame rates at 2560×1600 in Quake Wars and Episode 2, so a 1GB SLI config may yield performance benefits there. We’ve also observed the performance of a 512MB SLI config tank spectacularly in Call of Duty 4 when moving to 2560×1600, suggesting that additional video memory could help there, as well.
As for the 256MB variant of the GeForce 8800 GT, well, there’s little hope. The 256MB card fared well enough in Quake Wars, but it couldn’t keep up at even 1280×1024 in CoD 4 and started to drop off at 1600×1200 in Episode 2. And Crysis? Forget about it.
Given the relatively minor price gap between 256MB and 512MB versions of the GeForce 8800 GT, we see little reason to settle for the 256MB card. You really do need 512MB of memory to make the most of today’s games, especially if you want to crank up the eye candy. That said, today’s games aren’t so demanding that they’ll make good use of 1GB of video memory, at least not with the GeForce 8800 GT. Not even Crysis saw a meaningful performance increase with the 1GB cards, suggesting that tomorrow’s games may do just fine with 512MB, as well.