We don’t generally spend much time exploring the world of entry-level graphics processors. It’s not for a lack of interest, but out of a sense of propriety. For as long as we can remember, that market has been populated by the lame bastard children of more glorious designs—blessed with the right components to make a truly great GPU, yet cursed with too few of them and condemned to disappoint.
Thus entry-level GPUs hobbled about the world, taking shelter in bargain bins and luring penny pinchers with double-digit price tags. “Only $49.99,” the sticker on the box might say. “$39.99 after a mail-in rebate!” Some even attempted deception, bulking up with cheap, slow RAM, so that at a glance, uneducated shoppers might mistake them for better products capable of putting a large frame buffer to good use.
Unfortunate souls who took pity upon these misshapen atrocities—or were fooled by them—often realized all too late that, far from being upgrades, entry-level cards only perpetuated the mediocre performance to which they’d grown accustomed. Silicon might heat up, and tiny cooling fans might race noisily, but games that should have been beautiful were warped into unsightly slide shows, like faded Polaroids of what they’d have looked like… if only you’d sprung for a faster card.
Is the wind of change blowing at last? It would seem so, thanks in no small part to Intel’s Sandy Bridge processors and their shockingly capable integrated graphics. With the performance floor set at a reasonable level for the first time in, well, forever, entry-level cards have a chance to shine. Or rather, they must, lest they be rendered irrelevant for good.
AMD’s Radeon HD 6450 belongs to a new breed of entry-level graphics cards, one designed not to offer the absolute bare minimum GPU makers can get away with, but to provide enough of a step up from Sandy Bridge’s IGP to justify the asking price—in this case, $54.99. AMD’s pitch is that the Radeon HD 6450 delivers not just superior performance, but also little perks like DirectX 11 capabilities, support for GPU compute applications, and the ability to drive three displays at once. Better game compatibility ought to be somewhere in there, too.
In short, while this card scrapes the bottom of the barrel, it doesn’t seem like as much of an afterthought as its predecessors. Could it be that, in 2011, a graphics card priced at $55 finally delivers acceptable performance in current games? Surely luxuries like antialiasing and advanced shader effects must be off-limits, but could the Radeon HD 6450 set the bar for “good enough,” provided you’re more interested in having fun than soaking in eye candy? Considering the stagnating hardware requirements of games and the ever-increasing horsepower of graphics silicon, that doesn’t sound like too outlandish a premise.
Anatomy of a $55 graphics card
The Radeon HD 6450 is based on Caicos, a lilliputian graphics processor with a die area my measurements peg at around 75 mm². (Compare that to the 255 mm² GPU of the recently released Radeon HD 6790.) Caicos keeps processing resources to a minimum, with only 160 shader ALUs, or stream processors, and the ability to filter eight textures and output four pixels per clock cycle. The path to memory is only 64 bits.
Caicos comes wrapped in two variants of the new Radeon: one with 1GB of DDR3 memory and another with 512MB of GDDR5.
The GDDR5 variant is clocked slightly faster, with a 750MHz GPU and a memory speed of 800-900MHz. (Our model has 900MHz RAM, which translates into about 28.8 GB/s of peak memory bandwidth.) The slower, DDR3-based variant of the Radeon HD 6450 runs at 625MHz and clocks its memory at 533-800MHz. AMD says both versions cost the same, though.
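That 28.8 GB/s figure follows directly from the memory specs: GDDR5 effectively moves data four times per reference clock, and Caicos has a 64-bit bus. A quick sketch of the arithmetic in Python:

```python
def peak_bandwidth_gbps(mem_clock_mhz, transfers_per_clock, bus_width_bits):
    """Peak memory bandwidth in GB/s (decimal gigabytes)."""
    transfers_per_second = mem_clock_mhz * 1e6 * transfers_per_clock
    return transfers_per_second * (bus_width_bits / 8) / 1e9

# GDDR5 transfers data four times per clock; Caicos' bus is 64 bits wide.
print(peak_bandwidth_gbps(900, 4, 64))  # → 28.8
```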
As you might expect, these spartan computing resources mean equally stingy power utilization: only up to 27W for the GDDR5 card and 20W for its DDR3 sibling. Idle power is about 9W, less than one of those regular-sized swirly light bulbs. In other words, the Radeon HD 6450 might just be the sort of GPU you’d want to stick in a home-theater PC—or perhaps a small-form-factor desktop machine tuned for light gaming.
Our testing methods
In celebration of the Radeon HD 6450’s suitability for small-form-factor builds, we based our test platform on Zotac’s H67-ITX WiFi motherboard, which has a Mini-ITX form factor yet accommodates Sandy Bridge quad-core chips.
The processor under that heatsink is a Core i5-2500K, admittedly an extravagant (and power-hungry) choice for the kind of build the Radeon HD 6450 might end up inhabiting. Using the i5-2500K presented an advantage, however, in that it let us compare AMD’s new entry-level card with the best Intel has to offer: the HD Graphics 3000. You can think of the IGP results on the next few pages as sort of a high-water mark for what Intel integrated graphics can achieve. It all goes downhill from there, as they say.
We’ll also be comparing the Radeon HD 6450 to Nvidia’s GeForce GT 430, which can be had for $59.99 before rebates in the configuration we tested, as well as a pair of higher-end offerings, the GeForce GTS 450 and the Radeon HD 5770.
You might notice we only ran three games and four individual tests. We usually conduct more thorough testing with new graphics cards, but let’s face it: this is a $55, bargain-basement graphics card, and we have better uses for the precious lab time that further testing would have required.
We settled on a 1440×900 resolution for testing. That resolution is commonly found on cheap 19″ displays, which are just the kind you’d expect to pair up with a $55 GPU (or integrated graphics). 1366×768 is gaining ground as a desktop resolution, but Newegg still stocks fewer displays at that resolution than at 1440×900.
Finally, we did our best to deliver clean benchmark numbers, as always. Tests were run at least three times, and we’ve reported the median result. Our test system was configured as follows:
Processor: Intel Core i5-2500K
Motherboard: Zotac H67-ITX WiFi
North bridge: Intel H67 Express
Memory size: 4GB (2 DIMMs)
Memory type: Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz
Memory timings: 9-9-9-24 1T
Chipset drivers: INF update 220.127.116.115, Rapid Storage Technology 10.1.0.1008, Graphics Media Accelerator Driver 18.104.22.168.2321 with Realtek R2.54 drivers
Graphics: AMD Radeon HD 6450 512MB (GDDR5) with Catalyst 8.84.2_RC2 drivers
  Gigabyte Radeon HD 5770 Super OC 1GB with Catalyst 8.84.2_RC2 drivers
  Asus GeForce GT 430 1GB (DDR3) with GeForce 270.51 beta drivers
  Zotac GeForce GTS 450 1GB AMP! Edition with GeForce 270.51 beta drivers
Hard drive: Samsung SpinPoint F1 HD103UJ 1TB SATA
Power supply: Corsair HX750W 750W
OS: Windows 7 Ultimate x64 Edition, Service Pack 1
Thanks to Intel, Kingston, Samsung, Zotac, and Corsair for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.
Image quality options were left at the control panel defaults for the Intel IGP and Nvidia cards. With the Radeons, we left optional AMD optimizations for tessellation and surface format conversions disabled. Vertical refresh sync (vsync) was disabled for all tests.
We used the following test applications:
The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.
Bulletstorm
I’ve made no secret of my appreciation for Bulletstorm‘s cathartic gameplay and gorgeous environments, so it seems like a fitting start to our round of benchmarking. With these budget cards, I turned down the resolution to 1440×900 and set all the graphical options to their lowest settings.
The game has no built-in benchmarking mode, so I played through the first 90 seconds of the “Hideout” Echo five times per card, attempting to keep runs as similar as possible, and reporting the median of average and low frame rates obtained. Fraps was used to record frame rates.
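Taking the median of several manual runs is a simple way to shrug off outliers from any one playthrough. A quick illustration of the bookkeeping, using made-up numbers (the FPS values below are hypothetical, not our actual results):

```python
from statistics import median

# Five hypothetical Fraps passes: (average FPS, minimum FPS) per run.
runs = [(31.2, 22.0), (29.8, 20.5), (30.4, 21.1), (30.9, 19.8), (28.7, 21.6)]

avg_fps = median(r[0] for r in runs)  # median of the per-run averages
low_fps = median(r[1] for r in runs)  # median of the per-run minimums
print(avg_fps, low_fps)  # → 30.4 21.1
```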
The good news is that the Radeon HD 6450 manages 30 FPS at 1440×900 with the lowest detail settings, which is playable, and the results don’t look half bad. What’s the bad news? Well, it doesn’t provide that great a step up over the Intel IGP, and it’s considerably slower than cards like the GeForce GTS 450 and Radeon HD 5770. The exact variants of those faster offerings we tested have slightly higher-than-normal clock speeds and corresponding price premiums, but vanilla versions can be had for not much more than $100, and they should still be much faster than the 6450 and GT 430.
Civilization V
Civ V has several built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. We chose to average the results from the individual leaders.
The Intel IGP stutters in most of Civilization V‘s Leader scenes, while the Radeon does not. It’s worth pointing out that the $60 GeForce GT 430 is quite a bit snappier, though.
Another benchmark in Civ V focuses on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.
We see the same pattern with the gameplay scene, except the Intel IGP is even worse off. Civilization V calls for a surprising amount of graphical horsepower, especially if you want to enjoy the game’s nicer shader effects.
Left 4 Dead 2
Valve’s co-op zombie massacre extravaganza is part of the 2009 game catalog, so it’s not exactly the freshest thing out there. It does, however, represent the type of title you might want to play on a $55 graphics card—something a little bit older and not too demanding, but still fun to play. You might call it the cougar of our test suite.
We tested performance using a custom timedemo recorded during the finale of the Sacrifice campaign.
The Intel IGP appears to do well here, but it cheated—by repeatedly making Left 4 Dead 2 crash when we set the “shader detail” setting to “high” or “very high.” We had to settle for the “medium” setting, which is less taxing.
Among the non-cheaters, the Radeon HD 6450 performed okay, but it once again stood in the shadow of the faster GeForce GT 430. Both cards were left in the dust by the GeForce GTS 450 and Radeon HD 5770, which pulled frame rates in the triple digits. With those cards, you’d obviously want to raise the detail level, perhaps by enabling antialiasing and cranking up anisotropic filtering.
We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.
The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Bulletstorm at a 1440×900 resolution.
The bargain-basement-dwelling AMD and Nvidia cards are neck-and-neck at idle, but the Radeon is clearly the more power-efficient of the two under load. Interestingly, enabling the Intel IGP leads to power draw above that of the same system with a GeForce GT 430 or Radeon HD 6450. Intel’s HD Graphics 3000 silicon stays dormant when discrete graphics cards are connected, but clearly, it doesn’t sip power when active.
We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 8″ from the test system at a height even with the top of the video card.
You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.
And this, folks, is one of the notable downsides of entry-level cards. They’re the loudest of the bunch despite their low power consumption, simply because they’ve been outfitted with small heatsinks and tiny, whiny fans.
To condense the lessons we learned from our performance testing, we whipped up another one of our famous scatter plots. We laid out the different offerings on the Y axis based on their average performance across all of our game benchmarks, and we positioned them along the X axis based on the prices of the cards themselves plus a sample system build worth $522.95. That build corresponds roughly, but not exactly, to our test rig. It includes a Core i5-2500K processor, our Zotac motherboard, a 4GB DDR3-1333 dual-channel kit, a 1TB Samsung SpinPoint F3 hard drive, and an Antec EarthWatts 380W power supply. All prices were collected from Newegg.
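The plot positions boil down to simple arithmetic: total platform cost on the X axis, average frame rate on the Y axis. Here’s a rough sketch in Python using the card prices from this article and placeholder FPS numbers (the real averages are in the charts, not reproduced here):

```python
# Sketch of the value scatter plot's arithmetic. Card prices come from the
# article; the avg_fps values are placeholders, not measured results.
BASE_SYSTEM = 522.95  # sample system build, priced from Newegg

cards = {
    "Radeon HD 6450":  {"price": 54.99,  "avg_fps": 30.0},
    "GeForce GT 430":  {"price": 59.99,  "avg_fps": 36.0},
    "GeForce GTS 450": {"price": 110.00, "avg_fps": 90.0},
}

# X axis: base build plus card price; Y axis: average FPS across benchmarks.
points = {name: (BASE_SYSTEM + c["price"], c["avg_fps"])
          for name, c in cards.items()}

for name, (cost, fps) in points.items():
    print(f"{name}: ${cost:.2f} total, {fps:.1f} FPS, {fps / cost:.3f} FPS/$")
```

Framed this way, the conclusion below falls out naturally: moving from the 6450 to a GTS 450 raises the whole-platform price by only about 10%, while the frame rate roughly triples.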
I could probably stop here, because the writing is on the wall. But let’s make things crystal clear.
First, the Radeon HD 6450 isn’t a particularly good deal for the money. The GeForce GT 430 can be had for about $5 more right now, and it was faster overall across our test suite.
Second, you shouldn’t buy either of those cards if you plan on doing any proper PC gaming. Yes, you’ll double your graphics card budget by opting for, say, a vanilla GeForce GTS 450 for $110 instead of a Radeon HD 6450 for $55. If you account for the cost of the rest of the system, however, that budget increase doesn’t amount to much.
More importantly, the small overall price increase leads to a massive performance increase. You’re looking at literally three times the frame rate going from the 6450 to a GTS 450 or a 5770—perhaps a teeny bit less if you go with a vanilla card instead of one of the slightly souped-up models we tested, but still.
As we saw over the previous pages, entry-level cards like the 6450 may be faster than Sandy Bridge’s integrated graphics, but they still require serious compromises on the image quality front, and they can’t always churn out smooth frame rates even when you push all the quality sliders to the left. The Radeon HD 6450 might just be a good choice as an upgrade to a pre-Sandy Bridge system for someone who’s a very light gamer (think World of Warcraft and The Sims), who might want to use a triple-display config for whatever reason (dumpster diving, perhaps?), and who has an extremely limited budget. Anyone else should probably steer clear.