How much graphics memory do you really need?


We test with a GeForce 8800 GT to find out
— 5:42 PM on February 27, 2008

When shopping for computer hardware, consumers tend to favor bigger numbers. I don't blame them, either. The average consumer knows about as much about hardware as I know about needlepoint, which is to say very little. Most, I would suspect, have no idea exactly what a megabyte is, let alone a gigahertz. But they can count, so when faced with the choice between 2GHz or 2.5GHz, they're going to go with the higher number. Because it's better.

Except when it's not, as was the case with the Pentium 4. Intel architected the P4 to scale to higher clock speeds than we'd ever seen before, birthing a megahertz myth that conveniently exploited consumer ignorance. Why would Joe Sixpack buy an Athlon at a mere 1.4GHz when he could get a Pentium 4 at 2GHz? Enthusiasts knew the score, but for years, mainstream consumers were easily persuaded—if they hadn't assumed already—that the Pentium 4 was a far better processor because it had a higher clock speed.

More recently, we've seen a much smaller but no less absurd memory myth take hold in the graphics card industry. Budget cards equipped with ridiculous amounts of memory are the culprit here. For enthusiasts, a gig of memory on a sub-$100 graphics card makes about as much sense as putting a spoiler on a Yugo. Budget GPUs lack the horsepower necessary to run games at the kinds of resolutions and detail levels that would require such a copious amount of video memory. But what about the latest crop of mid-range graphics cards? Nvidia's GeForce 8800 GT has considerable pixel-pushing power on its own, and when teamed in SLI, that power is effectively doubled. Perhaps a gigabyte of memory on this class of card isn't so unreasonable.

Conveniently, derivatives of the GeForce 8800 GT are available with 256MB, 512MB, or 1GB of memory, making it easy for us to probe the impact of graphics memory size on performance. We've tested a collection of single cards and SLI configurations in a selection of new games, across multiple resolutions, to see where memory size matters, if it does at all. Keep reading for the enlightening results.


Bigger than yours
When Nvidia first introduced the GeForce 8800 GT, the card came equipped with 512MB of memory. Shortly thereafter, in response to AMD's launch of the Radeon HD 3850 256MB, the green team filled out the low end of the 8800 GT lineup with a cheaper 256MB variant. The 8800 GT's memory size didn't scale upward until graphics board makers started fiddling with the design on their own. Palit put 1GB of memory on its 8800GT Super+, releasing it to market alongside some of the most subversively phallic promotional shirts we've ever seen.


To the casual observer, the Super+ doesn't look all that different from other 8800 GT cards. Sure, it has a custom dual-slot cooler, and even three-phase power for the GPU, but nothing really screams out that this card packs twice the memory of your average GT—that is, until you turn it over.


GeForce 8800 GT cards with 512MB of memory have no problem squeezing all their RAM chips onto one side of the board. To accommodate 1GB of memory, however, the Super+ fills out both sides. The memory chips on the underside of the card normally lurk behind a bright blue heatspreader, but they've agreed to come out just this once for a picture.