AMD’s Radeon HD 6450 graphics card

We don’t generally spend much time exploring the world of entry-level graphics processors. It’s not for a lack of interest, but out of a sense of propriety. For as long as we can remember, that market has been populated by the lame bastard children of more glorious designs—blessed with the right components to make a truly great GPU, yet cursed with an insufficient number of them and condemned to disappoint.

Thus entry-level GPUs hobbled about the world, taking shelter in bargain bins and luring penny pinchers with double-digit price tags. “Only $49.99,” the sticker on the box might say. “$39.99 after a mail-in rebate!” Some even attempted deception, bulking up with cheap, slow RAM, so that at a glance, uneducated shoppers might mistake them for better products capable of using up a large frame buffer.

Unfortunate souls who took pity upon these misshapen atrocities—or were fooled by them—often realized all too late that, far from being upgrades, entry-level cards only perpetuated the mediocre performance to which they’d grown accustomed. Silicon might heat up, and tiny cooling fans might race noisily, but games that should have been beautiful were warped into unsightly slide shows, like faded Polaroids of what they’d have looked like… if only you’d sprung for a faster card.

Is the wind of change blowing at last? It would seem so, thanks in no small part to Intel’s Sandy Bridge processors and their shockingly capable integrated graphics. With the performance floor set at a reasonable level for the first time in, well, forever, entry-level cards have a chance to shine. Or rather, they must, lest they be rendered irrelevant for good.

AMD’s Radeon HD 6450 belongs to a new breed of entry-level graphics cards, one designed not to offer the absolute bare minimum GPU makers can get away with, but to provide enough of a step up from Sandy Bridge’s IGP to justify the asking price—in this case, $54.99. AMD’s pitch is that the Radeon HD 6450 delivers not just superior performance, but also little perks like DirectX 11 capabilities, support for GPU compute applications, and the ability to drive three displays at once. Better game compatibility ought to be somewhere in there, too.

In short, while this card scrapes the bottom of the barrel, it doesn’t seem like as much of an afterthought as its predecessors. Could it be that, in 2011, a graphics card priced at $55 finally delivers acceptable performance in current games? Surely luxuries like antialiasing and advanced shader effects must be off-limits, but could the Radeon HD 6450 set the bar for “good enough,” provided you’re more interested in having fun than soaking in eye candy? Considering the stagnating hardware requirements of games and the ever-increasing horsepower of graphics silicon, that doesn’t sound like too outlandish a premise.

Anatomy of a $55 graphics card

The Radeon HD 6450 is based on Caicos, a lilliputian graphics processor with a die area my measurements peg at around 75 mm². (Compare that to the 255 mm² GPU of the recently released Radeon HD 6790.) Caicos keeps processing resources to a minimum, with only 160 shader ALUs, or stream processors, and the ability to filter eight textures and output four pixels per clock cycle. The path to memory is only 64 bits.

Caicos is wrapped in two different variants of the new Radeon: one with 1GB of DDR3 memory and another with 512MB of GDDR5, pictured below:

The GDDR5 variant is clocked slightly faster, with a 750MHz GPU and a memory speed of 800-900MHz. (Our model has 900MHz RAM, which translates into about 28.8 GB/s of peak memory bandwidth.) The slower, DDR3-based variant of the Radeon HD 6450 runs at 625MHz and clocks its memory at 533-800MHz. AMD says both versions cost the same, though.
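For those keeping score, the peak figures above fall out of simple arithmetic. Here’s a quick sanity check in Python—an illustration on our part, assuming the standard GDDR5 behavior of four data transfers per clock and AMD’s usual accounting of one multiply-add (two FLOPS) per stream processor per cycle:

```python
# Back-of-the-envelope check of the Radeon HD 6450's peak specs (GDDR5 variant).
mem_clock_hz = 900e6            # our sample's memory clock
bus_width_bits = 64
transfers_per_clock = 4         # GDDR5 is effectively quad-pumped (assumption stated above)
bandwidth_bytes = mem_clock_hz * transfers_per_clock * bus_width_bits / 8
print(f"{bandwidth_bytes / 1e9:.1f} GB/s peak memory bandwidth")   # ~28.8 GB/s, as stated

core_clock_hz = 750e6           # GDDR5 variant's core clock
print(f"{160 * 2 * core_clock_hz / 1e9:.0f} GFLOPS peak shader throughput")   # 240 GFLOPS
print(f"{8 * core_clock_hz / 1e9:.1f} Gtexels/s, {4 * core_clock_hz / 1e9:.1f} Gpixels/s")
```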

As you might expect, these spartan computing resources mean equally stingy power utilization: only up to 27W for the GDDR5 card and 20W for its DDR3 sibling. Idle power is about 9W, less than one of those regular-sized swirly light bulbs. In other words, the Radeon HD 6450 might just be the sort of GPU you’d want to stick in a home-theater PC—or perhaps a small-form-factor desktop machine tuned for light gaming.

Our testing methods

In celebration of the Radeon HD 6450’s suitability for small-form-factor builds, we based our test platform on Zotac’s H67-ITX WiFi motherboard, which has a Mini-ITX form factor yet accommodates Sandy Bridge quad-core chips.

The processor under that heatsink is a Core i5-2500K, admittedly an extravagant (and power-hungry) choice for the kind of build the Radeon HD 6450 might end up inhabiting. Using the i5-2500K presented an advantage, however, in that it let us compare AMD’s new entry-level card with the best Intel has to offer: the HD Graphics 3000. You can think of the IGP results on the next few pages as sort of a high-water mark for what Intel integrated graphics can achieve. It all goes downhill from there, as they say.

We’ll also be comparing the Radeon HD 6450 to Nvidia’s GeForce GT 430, which can be had for $59.99 before rebates in the configuration we tested, as well as a pair of higher-end offerings, the GeForce GTS 450 and the Radeon HD 5770.

You might notice we only ran three games and four individual tests. We usually conduct more thorough testing with new graphics cards, but let’s face it: this is a $55, bargain-basement graphics card, and we have better uses for the precious lab time that further testing would have required.

We settled on a 1440×900 resolution for testing. That resolution is commonly found on cheap 19″ displays, which are just the kind you’d expect to pair up with a $55 GPU (or integrated graphics). 1366×768 is gaining ground as a desktop resolution, but Newegg still stocks fewer displays at that resolution than at 1440×900.

Finally, we did our best to deliver clean benchmark numbers, as always. Tests were run at least three times, and we’ve reported the median result. Our test system was configured as follows:

Processor: Intel Core i5-2500K
Motherboard: Zotac H67-ITX WiFi
North bridge: Intel H67 Express
South bridge:
Memory size: 4GB (2 DIMMs)
Memory type: Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz
Memory timings: 9-9-9-24 1T
Chipset drivers: INF update 9.2.0.1025, Rapid Storage Technology 10.1.0.1008, Graphics Media Accelerator Driver 15.21.12.64.2321
Audio: Integrated ALC892 with Realtek R2.54 drivers
Graphics: AMD Radeon HD 6450 512MB (GDDR5) with Catalyst 8.84.2_RC2 drivers
          Gigabyte Radeon HD 5770 Super OC 1GB with Catalyst 8.84.2_RC2 drivers
          Asus GeForce GT 430 1GB (DDR3) with GeForce 270.51 beta drivers
          Zotac GeForce GTS 450 1GB AMP! Edition with GeForce 270.51 beta drivers
Hard drive: Samsung SpinPoint F1 HD103UJ 1TB SATA
Power supply: Corsair HX750W 750W
OS: Windows 7 Ultimate x64 Edition with Service Pack 1

Thanks to Intel, Kingston, Samsung, Zotac, and Corsair for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Image quality options were left at the control panel defaults for the Intel IGP and Nvidia cards. With the Radeons, we left optional AMD optimizations for tessellation and surface format conversions disabled. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications: Bulletstorm, Civilization V, Left 4 Dead 2, and Fraps for frame-rate logging.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Bulletstorm

I’ve made no secret of my appreciation for Bulletstorm‘s cathartic gameplay and gorgeous environments, so it seems like a fitting start to our round of benchmarking. With these budget cards, I turned down the resolution to 1440×900 and set all the graphical options to their lowest settings.

The game has no built-in benchmarking mode, so I played through the first 90 seconds of the “Hideout” Echo five times per card, attempting to keep runs as similar as possible, and reporting the median of average and low frame rates obtained. Fraps was used to record frame rates.
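If you’re curious how those five runs boil down to the numbers in the charts, here’s a minimal sketch of the reduction, assuming per-run FPS logs from Fraps. This is an illustration of the math, not the actual tooling we used:

```python
# Reduce per-run Fraps frame-rate logs to a reported median average and median low.
from statistics import median

def summarize(runs):
    """runs: one list of per-second FPS samples per playthrough."""
    averages = [sum(r) / len(r) for r in runs]
    lows = [min(r) for r in runs]
    return median(averages), median(lows)

# Hypothetical numbers, purely to show the shape of the data:
runs = [[31, 29, 33, 30], [28, 30, 32, 31], [30, 29, 31, 33],
        [27, 31, 30, 29], [32, 30, 28, 31]]
median_avg, median_low = summarize(runs)
print(f"median average: {median_avg:.1f} FPS, median low: {median_low} FPS")
```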

The good news is that the Radeon HD 6450 manages 30 FPS at 1440×900 with the lowest detail settings, which is playable, and the results don’t look half bad. What’s the bad news? Well, it doesn’t provide that great a step up over the Intel IGP, and it’s considerably slower than cards like the GeForce GTS 450 and Radeon HD 5770. The exact variants of those faster offerings we tested have slightly higher-than-normal clock speeds and corresponding price premiums, but vanilla versions can be had for not much more than $100, and they should still be much faster than the 6450 and GT 430.

Civilization V
Civ V has several built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. We chose to average the results from the individual leaders.

The Intel IGP stutters in most of Civilization V‘s Leader scenes, while the Radeon does not. It’s worth pointing out that the $60 GeForce GT 430 is quite a bit snappier, though.

Another benchmark in Civ V focuses on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.

We see the same pattern with the gameplay scene, except the Intel IGP is even worse off. Civilization V calls for a surprising amount of graphical horsepower, especially if you want to enjoy the game’s nicer shader effects.

Left 4 Dead 2

Valve’s co-op zombie massacre extravaganza is part of the 2009 game catalog, so it’s not exactly the freshest thing out there. It does, however, represent the type of title you might want to play on a $55 graphics card—something a little bit older and not too demanding, but still fun to play with. You might call it the cougar of our test suite.

We tested performance using a custom timedemo recorded during the finale of the Sacrifice campaign.

The Intel IGP appears to do well here, but it cheated—by repeatedly making Left 4 Dead 2 crash when we set the “shader detail” setting to “high” or “very high.” We had to settle for the “medium” setting, which is less taxing.

Among the non-cheaters, the Radeon HD 6450 performed okay, but it once again stood in the shadow of the faster GeForce GT 430. Both cards were left in the dust by the GeForce GTS 450 and Radeon HD 5770, which pulled frame rates in the triple digits. With those cards, you’d obviously want to raise the detail level, perhaps by enabling antialiasing and cranking up anisotropic filtering.

Power consumption

We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Bulletstorm at a 1440×900 resolution.

The bargain-basement-dwelling AMD and Nvidia cards are neck-and-neck at idle, but the Radeon is clearly the more power-efficient of the two under load. Interestingly, enabling the Intel IGP leads to power draw above that of the same system with a GeForce GT 430 or Radeon HD 6450. Intel’s HD Graphics 3000 silicon stays dormant when discrete graphics cards are connected, but clearly, it doesn’t sip power when active.

Noise levels

We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 8″ from the test system at a height even with the top of the video card.

You can think of these noise level measurements much like our system power consumption tests, because the entire systems’ noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

And this, folks, is one of the notable downsides of entry-level cards. They’re the loudest of the bunch despite their low power consumption, simply because they’ve been outfitted with small heatsinks and tiny, whiny fans.

Conclusions

To condense the lessons we learned from our performance testing, we whipped up another one of our famous scatter plots. We laid out the different offerings on the Y axis based on their average performance across all of our game benchmarks, and we positioned them along the X axis based on the prices of the cards themselves plus a sample system build worth $522.95. That build corresponds roughly, but not exactly, to our test rig. It includes a Core i5-2500K processor, our Zotac motherboard, a 4GB DDR3-1333 dual-channel kit, a 1TB Samsung SpinPoint F3 hard drive, and an Antec EarthWatts 380W power supply. All prices were collected from Newegg.
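To make the plot’s construction concrete, here’s a tiny, hypothetical helper showing how the X coordinates were derived; the card prices are the ones quoted in this review, and the Y coordinate for each card would simply be its measured average frame rate across our game benchmarks (not reproduced here):

```python
# Build the X axis of the value scatter plot: card price plus the $522.95 sample build.
BASE_BUILD = 522.95   # Core i5-2500K, Zotac H67-ITX board, 4GB DDR3-1333, 1TB drive, 380W PSU

card_prices = {
    "Radeon HD 6450 (GDDR5)": 54.99,
    "GeForce GT 430": 59.99,
    # The GTS 450 and HD 5770 land wherever Newegg prices them; ~$110 buys a vanilla GTS 450.
}

for card, price in card_prices.items():
    print(f"{card}: ${price + BASE_BUILD:.2f} total system cost")
```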

I could probably stop here, because the writing is on the wall. But let’s make things crystal clear.

First, the Radeon HD 6450 isn’t a particularly good deal for the money. The GeForce GT 430 can be had for about $5 more right now, and it was faster overall across our test suite.

Second, you shouldn’t buy either of those cards if you plan on doing any proper PC gaming. Yes, you’ll double your graphics card budget by opting for, say, a vanilla GeForce GTS 450 for $110 instead of a Radeon HD 6450 for $55. If you account for the cost of the rest of the system, however, that budget increase doesn’t amount to much.

More importantly, the small overall price increase leads to a massive performance increase. You’re looking at literally three times the frame rate going from the 6450 to a GTS 450 or a 5770—perhaps a teeny bit less if you go with a vanilla card instead of one of the slightly souped-up models we tested, but still.

As we saw over the previous pages, entry-level cards like the 6450 may be faster than Sandy Bridge’s integrated graphics, but they still require serious compromises on the image quality front, and they can’t always churn out smooth frame rates even when you push all the quality sliders to the left. The Radeon HD 6450 might just be a good choice as an upgrade to a pre-Sandy Bridge system for someone who’s a very light gamer (think World of Warcraft and The Sims), who might want to use a triple-display config for whatever reason (dumpster diving, perhaps?), and has an extremely limited budget. Anyone else should probably stay clear.

Comments closed
    • DavidC1
    • 9 years ago

    Cyril: Your power consumption figures seem different from other sites. Maybe it’s the particular game that’s being benched. Could you confirm the suspicions for the HD Graphics 3000 again?

    http://www.xbitlabs.com/articles/video/display/intel-hd-graphics-2000-3000_10.html

      • indeego
      • 9 years ago

      What particular bench?
      TR shows HD3000@91W@GPU/CPU load with a 2600K
      XBit shows HD3000@88W@GPU/CPU load with a 2500K

    • flip-mode
    • 9 years ago

    But I need 2GB of video RAM:
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814161357

    • rsaeire
    • 9 years ago

    One thing that some people are forgetting, or are maybe not aware of, is that Intel IGPs still don’t handle 23.976 Hz content correctly, which is what most movies and Blu-ray content are authored in, thereby excluding them when deciding to go IGP or discrete for an HTPC build; see more info here: http://www.avsforum.com/avs-vb/showthread.php?t=1313429

    What this means is that a single frame will be repeated every 40 seconds or so and will look like judder when viewing; this is mostly evident in panning shots. Some people may not notice it, but others, like myself, can be more susceptible to it, and it becomes annoying every time it happens. As I said, it might not be a big deal for some, but it’s received a lot of attention from tech sites all over the net and it’s definitely worth mentioning when comparing Intel IGPs to AMD/Nvidia’s discrete options. Also, AMD’s Brazos platform supports 23.976 Hz content fine, so this offers more incentive for Intel to get it right sooner rather than later.
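    For the curious, the roughly 40-second figure falls straight out of the rate mismatch; here’s a quick back-of-the-envelope check (my own illustration, assuming film content at 24000/1001 fps shown on a display locked to exactly 24.000 Hz):

    ```python
    # Film runs at 24000/1001 fps (~23.976); a display refreshing at exactly 24 Hz
    # slowly gets ahead of the content, so one frame has to be shown twice.
    content_fps = 24000 / 1001
    display_hz = 24.0
    drift_per_second = display_hz - content_fps      # surplus refreshes per second
    print(f"one repeated frame roughly every {1 / drift_per_second:.0f} seconds")  # ~42 s
    ```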

      • derFunkenstein
      • 9 years ago

      Wouldn’t you need a 119.88Hz display to even tell the difference? I guess I’m just not up on it, but how is it handled otherwise?

        • swaaye
        • 9 years ago

        24 Hz or 72 Hz works well too for pure 24 fps sources (or inverse telecined 30 fps NTSC). Some TVs can output those refresh rates.

        60 Hz isn’t a major problem though because some content is true 30fps (or 25 fps) and in that case you wouldn’t want 24 or 72 Hz because you would be back to stutter.

        I try not to pay attention to stutter caused by this because the last thing I want to get into a habit of is tweaking refresh rates for every other video I watch! 😀

    • zqw
    • 9 years ago

    This does 3 monitors WITHOUT requiring DisplayPort? Is it the first? AMD website seems to back that up:
    Native support for up to 3 simultaneous displays
    * Up to 4 displays supported with DisplayPort 1.2 Multi-Stream Transport

    I was sad to find out the 3rd monitor details for my 6850. Even the cheaper passive adapters make it unappealing. And decent size DP monitors seem to start at $300 instead of $150. Are there any gaming cards that can do 3 non-DP monitors?

    Thanks for any info.

      • Kurotetsu
      • 9 years ago

      [quote="zqw"]This does 3 monitors WITHOUT requiring DisplayPort? Is it the first? AMD website seems to back that up.[/quote]

      I don’t think so. First, it’s worth mentioning that Multi-Stream Transport is the ability to daisy-chain multiple monitors off a single DisplayPort output. To get that 4-monitor config, you’d be hanging 2 off the 1 DisplayPort output, and the other 2 on the remaining digital/analog outputs.

      The AMD website also lists DisplayPort 1.2 as one of the default outputs for the HD6450: http://www.amd.com/us/products/desktop/graphics/amd-radeon-hd-6000/hd-6450/pages/amd-radeon-hd-6450-overview.aspx#2

      [quote="AMD"]Cutting-edge integrated display support
      * DisplayPort 1.2
        o Max resolution: 2560x1600 per display
        o Multi-Stream Transport
        o 21.6 Gbps bandwidth
        o High bit-rate audio
      * HDMI 1.4a with Stereoscopic 3D Frame Packing Format, Deep Color, xvYCC wide gamut support, and high bit-rate audio
        o Max resolution: 1920x1200
      * Dual-link DVI with HDCP
        o Max resolution: 2560x1600
      * VGA
        o Max resolution: 2048x1536[/quote]

      In other words, there’s nothing to suggest that you can get 3 monitors without DisplayPort. Just because the card TR is reviewing doesn’t have one (and what outputs the card has is entirely up to the manufacturers) doesn’t mean you can get that without it. Though considering all the other HD6000 cards have a DP output, it’s unlikely you’ll find one without it. This is further supported by the footnotes on the same page (specifically Footnote #1): http://www.amd.com/us/products/desktop/graphics/amd-radeon-hd-6000/hd-6450/pages/amd-radeon-hd-6450-overview.aspx#4

      [quote="AMD"]To enable more than two displays, additional panels with native DisplayPort™ connectors, and/or DisplayPort™ compliant active adapters to convert your monitor’s native input to your cards DisplayPort™ or Mini-DisplayPort™ connector(s), are required.[/quote]

    • indeego
    • 9 years ago

    Drivers. Drivers. Drivers! How do they perform! This is the whole reason we avoided Vista for a year!
    How is Intel’s Dual display? On XP it is atrocious. I actively avoided anything Intel for my users for Windows 7 because of it.
    2d performance?
    Windows Experience Index?
    Flash/video/youtubers?

      • flip-mode
      • 9 years ago

      All good questions.

    • WaltC
    • 9 years ago

    [quote]Is the wind of change blowing at last? It would seem so, thanks in no small part to Intel’s Sandy Bridge processors and their shockingly capable integrated graphics. With the performance floor set at a reasonable level for the first time in, well, forever, entry-level cards have a chance to shine. Or rather, they must, lest they be rendered irrelevant for good.[/quote]

    *Shockingly capable integrated graphics*? Heh...;) I didn’t see anything here, especially Sandy Bridge, that would qualify for that moniker. I would think these days that for "entry level" 3d graphics, you’d insist on something in the HD 5500 class (or up), and that if you didn’t really want "entry level" 3d, then, yea--any old Sandy Bridge/Brazos IGP will do.

    • swaaye
    • 9 years ago

    Oh it’s more sticker-free cooler purity!

    GDDR5 has finally allowed these gimpy 64-bit cards to move to a new level but yeah this is definitely in the territory of “stick with the IGP or your old 2006-esque leftover card”, especially considering this probably sounds like a giant mosquito in your computer.

    • DancinJack
    • 9 years ago

    I’d like to commend the staff for actually listening and drawing suggestions from the comments. Not only has Cyril changed the charts that I mentioned, he has also updated the scatter plot already for those who don’t actually read the text of the review.

    BRAVO TR, BRAVO

    • bwcbiz
    • 9 years ago

    I wish you had thrown the HD 5570 or 5670 into the mix here. The 5770 isn’t exactly feasible for a live ITX build, no matter how you managed to cram it in for testing. Plus if the 6450 is that great at the entry level, it’s probably comparable to the 5570 for about $20 less.

    • kamikaziechameleon
    • 9 years ago

    I’m wondering if Sandy Bridge’s competence will drive down the cost of all GPUs to fill this void?

      • Farting Bob
      • 9 years ago

      Newly released GPUs like this one can’t really get below $50; the raw materials, board cost, packaging, and shipping limit how cheap they can get. What we can get, though, is progressively better performance for our money. But since IGPs are set for a big performance boost, with SB doing surprisingly not-as-rubbish-as-expected and AMD IGPs being not far off the low-end discretes, the sub-$80 GPU market is in danger of going extinct.
      Luckily though, these cards are the ones that populate brick-and-mortar stores to lure in unsuspecting shoppers with plenty of RAM, and that is all that many people seem to understand when it comes to buying them.

    • clone
    • 9 years ago

    I just wanted to mention a couple of things. First, I’d recommend using test settings no lower than 1080p, for a couple of reasons:

    1: it’s been standard on today’s displays for a while
    2: these aren’t gaming cards anyway
    3: it sends a message that might make headway over time.

    Secondly, history has shown that cards like the HD 6450 eventually sell for $30, so while the MSRP may be idiotic and a total ripoff, I’d just throw in the footnote that for HTPC use they’ll eventually be an OK product when their price drops to $30.

    To the first part, I just believe it’d be doing a service to everyone to raise the bar to a standard minimum and go with it so that companies might take notice.

      • Meadows
      • 9 years ago

      Dumb, dumber, and *dumberer*.

      First off, your points are nonchalant. Like aces170 pointed out, these chips need HTPC comparisons injected into reviews, rather than more and more boisterous HD gaming tomfoolery. As far as "gaming" goes, these cards are only good for several-year-old games and/or "casual" titles, and are not seriously intended as such, anyway.

      Your second bullet point illustrates your nonsense best: test with HD resolutions, because "these aren’t gaming cards anyway"? Seriously? And "it sends a message"? People, let’s sign up Priuses for truck derbies and tree trunk towing races, surely it will *send a message!* What’s the harm, they’re *not made for the purpose anyway*!

      Looking aside of your crap argument, it’s also worth noting that you can’t "just raise the bar". That has consequences, first and foremost in the power usage area. If AMD and nVidia wanted to make powerful gaming cards here, they would’ve focused on that. And oh look, they did, they made *whole other cards* *just* for that! Imagine the surprise.

      These chips are for accelerating HD video playback, casual games, and quality driver and/or multi-monitor support, particularly for people who refuse to trust intel implementations. My work computer has an HD 4350 too, and imagine this: we don’t use it for gaming, we simply need the straightforward drivers and the multi-monitor support.

        • clone
        • 9 years ago

        you are correct about 1 thing while a child with a flawed feeling of entitlement on the rest.

        my points are indeed nonchalant.

        these are $60 soon to be $30 video cards, but hey let’s compare $20,000 cars because you’re as trivial as silly.

        my nonchalant point is why bother with anything less than low-detail 1080p res nowadays when it can be done with no tangible consequences. power usage…. lol yeah the HTPC will explode in flames if it is equipped with a passively cooled HD 5570 vs the HD 6450. lol.

        the truth is the HD 6450 is deficient at $60, and Llano will signal its inevitable death, along with that of its later brethren, when launched if the bar isn’t raised.

        Meadows it’s 2011 not 2001, it’s time to raise the bar. don’t fight it, just love it.

        note: edited to change CPU from Bulldozer to Llano.

          • insulin_junkie72
          • 9 years ago

          [quote]Llano will signal its inevitable death[/quote]

          Bulldozer doesn’t come with a GPU (got to wait a year or so for its successor that does) - it’s old-school.

          • khands
          • 9 years ago

          Nah, should be 720p, cause that’s the resolution anyone will game with them on (an HTPC playing a game from 5 years ago at minimum eye candy running on a TV) unless they’ve got some nice perks at work. Plus I doubt you’d be able to even compare them to current IGPs at 1080p because they probably won’t even run.

          Also, you mean Llano, not bulldozer.

            • clone
            • 9 years ago

            my bad / appreciate the correction, gave you a +1 for that.

            an HD 5570 is an outdated card due for a refresh that can almost handle 1080p gaming today while selling for less than the HD 6450, it’s an easy fit for an HTPC, and can be passively cooled.

            so why does the HD 6450 exist at all? it’s 2011 and it’s time to raise the lowest bar so that all benefit, it’s time for the HD 6450 and its ilk to disappear entirely.

            I think it’s funny so many negative ranks on my first post like ppl are rushing to defend garbage….. “no don’t raise the bar when it’s available, no you are silly for pointing out the obvious, no I refuse to get more by paying less”…. hehe.

        • swaaye
        • 9 years ago

        I believe that most IGPs can do DVI/HDMI + VGA for dual monitor. But I wouldn’t be surprised if most of these low end cards sold for use as a secondary or tertiary output…

    • Hattig
    • 9 years ago

    Thanks for the review Cyril.

    I second that the graph for Left 4 Dead 2 should read ‘0’ for the Intel graphics, and you should have re-run the other cards at settings that the Intel graphics should have used.

    The most important result of this test is actually how appalling Intel’s graphics are in terms of power consumption under load. The results are so way out that I would initially think that something was wrong in the test – I presume all graphics cards were removed, etc…

    I do feel that the review is lacking HTPC specific testing. Low profile cards are ideal for this market. The HD54xx series had some downsides due to lacking shader resources for some functions – does Caicos resolve these issues?

    And ultimately, for a low-end gaming (with decent general computing use) rig at a fixed price, what is better?

    -> Core i3 (or lower, $100 cheaper) + HD67xx
    -> Core i3 (or similar, $55 cheaper than 2500K) + HD6450
    -> Core i5 2500K (integrated graphics)

    Fixed budgets at the low end are … fixed!

    (as an aside, the HD6450 at 750MHz should have a similar processing power to the graphics within the XBox360).

    The 28nm generation will be interesting – that should bump shaders up to around 400 in the low end.

    • dragmor
    • 9 years ago

    It would be nice if you included some older mid range cards so we could see the improvement, something 3-4 years old like 3850, 8600, 9800, etc.

    From the systems I own, I can say that ION2 on a dual-core Atom is faster than a 6600GT with a single-core A64. With that in mind, the 6450 is probably only a little slower than an 8800.

      • bimmerlovere39
      • 9 years ago

      8800GT = 9800GT = 4770 = 5670

      Roughly, I think that’s how things pan out. A 4770 can work for 1080p gaming, just.

    • Prospero424
    • 9 years ago

    But how much faster is it than the Radeon HD 5450? The big attraction of this level of card is, for me, the ability to run with a passive cooler, which the 5450 could do.

    Anyway, I thought it was kind of an oversight in the article. I still see people buying budget cards like this for well-used low- to mid-range PCs, and no one wants to wait for a cooling fan failure on a system they’re pretty much done spending money on.

      • Buzzard44
      • 9 years ago

      I believe the Radeon 5450 only has half the stream processors (80 instead of 160), and comes in different memory variants (DDR2 and DDR3), if I remember correctly. So I’d guesstimate about twice the power.

      Aside, 6450 vs. i5-2500K is kinda useless competition to me, despite being an interesting comparison. Nobody with a 2500K will be in the market for a 6450. It would be like buying a 6870 to upgrade from a 6850 – lots of money, very little relative performance difference. If you don’t have Sandy Bridge, then you get a 6450 for cheap, low-power graphics. 2500K + 6450 is just a retarded configuration unless you happen to have both lying around.

      Edit: It is really nice to see a low-end GPU review though. Thanks, Cyril!

      • swaaye
      • 9 years ago

      You could probably run a 5770 passively with a relatively cheap Accelero S1 attached. I’ve cooled a 3870 and 8800GT with these. Also, I believe there are some 5750 cards that come fanless out of the box.

      My requirement with the low end cards is definitely fanless because I don’t think any of the 3450/4350/5450/6450 cards can be expected to be quiet. I’m sure they use these mosquito fans because they are cheaper than more metal in the heatsink.

        • OneArmedScissor
        • 9 years ago

        There were even fanless 4850s. For the cards that don’t run maxed out voltage and clock speed, cooling isn’t really all that demanding. Considering just about anyone is going to have a case fan right next to the wider types of heatsinks, the fact that it’s “passive” is moot beyond the fact that the fan won’t spin up and down. It’s still effectively there.

        If it weren’t for the price of copper continually rising, there would probably be numerous passive cards.

          • swaaye
          • 9 years ago

          I wouldn’t say passive is moot when it goes from highly annoyingly audible in most cases, often the loudest fan in a modern system, to inaudible.

          But I get the impression that most people don’t care about fan noise and that’s more the reason that there aren’t many fanless cards. People aren’t buying them in volume.

      • clone
      • 9 years ago

      they sell HD 5750s passively cooled, as well as HD 5670s and HD 5570s!!! there is no attractive reason for HD 6450s to exist.

      multi-monitor support can be handled via IGP, and IGPs are passively cooled and performance is good enough for anything the HD 6450 can handle.

      it’s 2011 and it’s time to raise the bar. HD gaming can be had for minimal expense even with passive cooling; hell, the passively cooled HD 5570 sells for the same as the HD 6450 while better than doubling its performance, making it a complete load of trash.

      • rsaeire
      • 9 years ago

      As a HTPC card, the 5450 is perfectly capable of handling Vector Adaptive deinterlacing, the best deinterlacing option available across any graphics card, on video up to 1080i, so technically there is no need for a 6450. Though the 6450 is a more powerful card, as previously detailed, the only benefit I could see in buying this card over the 5450 is if a user wanted to enable Vector Adaptive deinterlacing in addition to one or more post-processing features, e.g. dynamic contrast, edge-enhancement, de-noise etc. While a user may want to enable these features, they end up causing visual anomalies and would only make the video appear worse and not better. As such, the 6450 is merely a card that fills out AMD’s graphics card range rather than adding anything of value.

        • swaaye
        • 9 years ago

        A Radeon 2400 can do vector adaptive deinterlacing and decode of VC-1 and H.264 1080p too. So they’ve had these features for several generations now.

        Actually the vector adaptive deint goes back to perhaps X800? It was only supported with MPEG2 DVD decoding back then however. ATI has been very competitive with video decoding hardware features since their Rage Pro.

        Also, it’s worth mentioning that these features only come into play with DXVA decoding. That tends to mean using software like PowerDVD, WinDVD, MPC + EVR + internal H.264, etc. Flash does do hardware decode but ignores extra features such as deinterlacing AFAIK. Silverlight only does a GPU-based scaling. Most other playback software doesn’t even try to use DXVA because apparently it’s difficult to implement and the docs are not easy to come by.

        Improvements since the 2400 have primarily been in the post processing features such as noise reduction and auto color tweaks, and these tend to be very iffy on quality benefit in my experience. So I don’t think there’s much reason to upgrade from an older GPU such as Radeon HD 2400+ or GeForce 8200+ for video playback. The 6450 here would maybe be a good option for someone wanting to move beyond Intel graphics if they are troublesome, assuming one doesn’t just want to buy a cheaper 5450 or 4350.

    • rechicero
    • 9 years ago

    Maybe I’m mistaken, but on the first page of GPU reviews, I couldn’t find one with only a full-system-cost scatter graph. And I could find just one with both it and the standard card-cost scatter graph. I could find card-cost scatter graphs for every single card (but this one, obviously).

    I suppose you didn’t notice, but it’s misleading. Very misleading. You can’t just change your criteria because it’s a cheap card and you don’t want to recommend it. There are other ways to do that.

      • dpaus
      • 9 years ago

      This……

      • grantmeaname
      • 9 years ago

      It’s being compared to an IGP. The IGP does not cost $0! Making the IGP cost $0 would be much more misleading than this, which is in no way misleading because that’s not what that word means.

        • dpaus
        • 9 years ago

        No, but as with the Left 4 Dead 2 graph, if one data point doesn’t fit, leave it off; don’t change the parameters so that it can be squeezed in at the expense of consistency/integrity/validity. I nearly passed out when I saw that replacing my 5770 card was going to take ~$675.

        EDIT: ‘data integrity’, not ‘personal integrity’. But Cyril knows what I meant.

        • rechicero
        • 9 years ago

        But the IGP is an exception in the system scatter graph too: 2500 for stand-alone cards, 2500K for the IGP. Probably a P67 mobo for stand-alone cards, H67 for the 2500K. The IGP point represents a system with another micro and probably another mobo. Misleading again.

        Why the change? Why didn’t he use the value of his test rig? The only possible reason for this is trying to compensate for the extra cost of the K version = extra cost of the better IGP.

        If you assume the 2500 is an IGP-less micro, as Cyril seems to do, then you can calculate a value for the IGP: $15 according to Newegg (the difference between the 2500 and 2500K), and there is no excuse to avoid the card-price scatter graph.

        And, anyway, if, according to Cyril, this card’s selling point is: “The Radeon HD 6450 might just be a good choice as an upgrade to a pre-Sandy Bridge system for someone who’s a very light gamer (think World of Warcraft and The Sims), who might want to use a triple-display config for whatever reason (dumpster diving, perhaps?), and has an extremely limited budget.”

        Why does he offer a scatter graph that doesn’t show this (his own) point? Isn’t that misleading? According to the system scatter graph, the good option for tighter budgets is the 430, not the 6450.

        Very misleading. I’m sure it was a mistake, but very misleading.

          • Cyril
          • 9 years ago

          I’ve updated the scatter plot in the article with pricing information for the Core i5-2500K, not the Core i5-2500, to keep things consistent with the performance data.

          The rest of the components were chosen because they’re cheaper but otherwise equivalent alternatives to our test components. Including our $150 power supply, for example, would have been far more misleading.

          Also, the notion of doing full-system scatter plots is absolutely not a new one. We’ve done it many times within the past year or so alone:

          https://techreport.com/articles.x/20562/10
          https://techreport.com/articles.x/20255/10
          https://techreport.com/articles.x/20087/10
          https://techreport.com/articles.x/20037/10
          https://techreport.com/articles.x/19871/10
          https://techreport.com/articles.x/19330/10
          https://techreport.com/articles.x/19342/8
          https://techreport.com/articles.x/19162/11
          https://techreport.com/articles.x/18799/15
          https://techreport.com/articles.x/18581/16
          https://techreport.com/articles.x/18448/17

          Yes, the information in the 6450 scatter plot could be misconstrued by someone who a) is accustomed to our single-component scatter plots and b) fails to read the text, which clearly explains the nature of this particular plot and discusses the value of faster cards in the context of a full system build. However, we simply can’t write these reviews based on the assumption that readers will only look at the graphs/pictures. There are many nuances that cannot be conveyed in graph form alone.

            • rechicero
            • 9 years ago

            Nobody said the full system scatter plot was a bad idea. It’s a bad idea if you don’t add the component scatter plot, as you did in all the examples you provided.

            What changed for this card? As it looks like this is the first time you do a full-system scatter plot alone (no component scatter plot), I’d say it’s a valid question.

            If the card is meant to be a cheap upgrade for older systems, I’d say you should add the bang per buck (card) scatter plot. Just to prove your point.

            (Thanks for the change to 2500K; now the full-system scatter plot makes more sense.)

    • CaptTomato
    • 9 years ago

    “”I’ve made no secret of my appreciation for Bulletstorm’s cathartic gameplay”””

    Time to hit the “doctors” couch champ, LOL.

    • Bensam123
    • 9 years ago

    Very strong opening. Loved the rhetoric and candor. Definitely didn’t expect you to write something like that.

    However, the benchmarking could use some tweaking, as aces said. I would also suggest adding an extra chart with 2x and 4x anti-aliasing enabled at those low resolutions. Nvidia, Intel, and AMD all deal with anti-aliasing differently, and the impact it makes on low-end cards would definitely show how efficient or inefficient their algorithms are. That is also where faster memory, more memory, and memory that isn’t even on the graphics card (turbo memory) come into play. All around, adding anti-aliasing wouldn’t linearly affect all the cards.

    You really should’ve added SC2 and, if possible, WoW / GW to the tests as well. Those are definitely games this card will see some time in, given the demographic that plays MMOs and the normal strain (or lack thereof) they put on your PC.

    It would be nice to see more price points on the pricing graph and not as a complete system (I know you guys like doing it, but I’ve never looked at the whole system price comparisons). Most people shop per component unless they plan on building a completely new system. Doing that they probably will visit the system builders guides instead of a single review though if they’re bent on assembling a whole system all at once without cherry picking the pieces.

    • HisDivineOrder
    • 9 years ago

    Now… we need a test of this card times three in Tri-Crossfire with a hexacore i7. YOU KNOW IT’LL ROCK!

    • aces170
    • 9 years ago

    Are these cards purchased with gaming in mind? I think not; most of these were purchased with HTPCs in mind. So it would be helpful if a more in-depth video performance chart were put up with the various popular codecs out there. In addition, for video performance it is fair to test videos beyond the 1080p category too.

    My two cents.

      • MrJP
      • 9 years ago

      Good suggestion, but if you’re doing that then I think you’d want to scale back the CPU as well, since the Sandy Bridge in this test easily has power to spare for video processing. However, given the relatively affordable starting prices for lower-end Sandy Bridge CPUs, doesn’t this suggest that the HTPC niche for low-end video cards is rapidly disappearing, unless video processing quality is still significantly better on the discrete cards? Could be an interesting test.

      • Meadows
      • 9 years ago

      Hear hear.

      • Bensam123
      • 9 years ago

      Agreed.

      • CaptTomato
      • 9 years ago

      Exactly……this is a sheer HTPC card…nothing more.
      People who’re tolerating really ugly GFX in the HD era need to be whipped.

      • dpaus
      • 9 years ago

      Maybe a bit of a variation on the old “If the only tool you have is a hammer…” – If the tests you’re comfortable doing are all games, then every card must be a gaming card.

      • swaaye
      • 9 years ago

      Modern IGPs can do HTPC just fine though.

        • Damage
        • 9 years ago

        Right. That rationale for low-end video cards just isn’t very compelling. It’s even less compelling for loud cards like this one.

          • Meadows
          • 9 years ago

          Well, nobody explicitly told the manufacturer to use a whiny fan, now did they? Someone should release an HTPC branded, downclocked, passively cooled mod, if the factory clocks are indeed too hot to touch.

            • Damage
            • 9 years ago

            It’s not our fault that IGPs became competent as HTPC display/video decode engines and cheap video cards keep getting loud fans, either. You seem to want to see a different review… of a different product. That’s nice and all but not very helpful feedback.

            • Meadows
            • 9 years ago

            Not to sound like an ungrateful ingrate or anything else that taunts to rhyme, but I haven’t noticed the review even *touching* on the subject. Delay the updated look at the issue until Llano comes out in numbers, if you have to, but what I believe most people here thought of is whether a discrete card of any kind is *even worth it* for non-game purposes. Whether these cards can, for example, afford you a cheaper CPU in turn. Or whether a particular brand sucks at decoding acceleration to begin with. Maybe complete with a light glance at the apparent quality of drivers on a per-brand basis.

            You know, a little fan service for the obsessive-compulsive among us, and their families and close relatives as end-of-the-chain buyers. Not saying I’m a voluminous market for the idea myself, because as far as HTPCs go, I’ve only ever made one of those things in my life and have never contemplated one for myself at home, but all these people picking on the topic have made me that much more curious.

            • Damage
            • 9 years ago

            Yeah, video playback is worth testing, and we are due for another look at it. Perhaps on a card that’s actually quiet, where we have a better time window prior to the review… or something we can do after.

            • swaaye
            • 9 years ago

            Well, a 3-4 year old Radeon 3450 or GeForce 8400 can decode 1080p. I had a 3450 that would overheat and freeze because it had an inadequate fanless heatsink attached (oh gotta love the half assed nature of these low end cards). Of course the old 780G and GeForce 8200 IGPs can do it too and they don’t have fans. In other words we’ve had years worth of low-end HD video accelerating GPUs and IGPs now.

            However, I found that an ancient Athlon 64 X2 2.5 GHz could decode Bluray just as well in pure software and it didn’t put out enough heat to spin up the fan even in the slim case I was using. Unfortunately this old CPU will dust Brazos and dual core Atoms. By avoiding gimpy CPUs that can’t handle software decoding you avoid the possible showstopper if you try to play a 720p/1080p file from the web that isn’t encoded to the restrictions imposed by the hardware decoders.

            Basically I think that buying a sad little CPU is a bad call and so is buying these low end cards instead of an IGP for video. They certainly have very little gaming value and no real benefits to video outside of being able to use some more of the typically ugly-fying video post-processing options.

            • thermistor
            • 9 years ago

            Being someone who has purchased a whiny-fan 8400GS (gave to wife for dual monitor purposes), a fanless 3450 (gave to nephew for older online games), and a fanless 4550 (since replaced by X4500 integrated graphics), I agree with your analysis.

            HTPC or TVPC as I call my quaint little bedroom rig, integrated is the way to go. And Intel in their ‘4’ series chipset finally made an integrated graphics option where the chipset can run a core 2, a USB standard-def TV stick and not drop frames while watching/recording standard def TV.

            At lower res, this little 6450 would elevate games-in-a-pinch scenarios somewhat (Half Life 2 series?). But I have a soft spot for anyone who wants to drop $30-40 to make their budget rig a tiny bit more respectable. BTW, I never paid over $30, shipping included, for any of my lower end cards.

    • jensend
    • 9 years ago

    It’s kind of comforting to see the comparison with the HD3000’s power consumption under load. Intel’s IGP power consumption advantage just doesn’t hold up once they start trying to make their IGP competitive performance-wise. This means that Intel is still sufficiently behind nV and AMD that 1) discrete cards- and the competitive market in GPUs- aren’t very seriously threatened yet and 2) Llano can likely use some of AMD’s GPU efficiency advantage to help offset their CPU power/thermal disadvantages, helping keep that market competitive as well.

    On another note, this comparison brings to mind once again the fact that consumers got a really bum deal (and that antitrust law failed to do its job) when Intel killed the 3rd-party chipset market.

    Nonetheless, chips like Caicos are not long for this world. Its advantage over Intel is way too slim to merit $55; the only reason to buy this is to minimize cost for a build with no IGP. I’d like to have higher hopes for Turks, but the “OEM-only” release doesn’t inspire lots of confidence.

      • Veerappan
      • 9 years ago

      Additional reason for low-end cards:

      Adding additional monitor outputs to an IGP system. Although that won’t really work with Sandy Bridge, which disables the IGP in the presence of a discrete card. It will work with most other IGP systems that are out there, though.

        • insulin_junkie72
        • 9 years ago

        [quote]Adding additional monitor outputs to an IGP system. Although that won’t really work with Sandy Bridge, which disables the IGP in the presence of a discrete card.[/quote]

        If the IGP gets disabled, that’s a board/BIOS issue, not the chipset - probably like many, but not all, boards with integrated graphics historically have freaked out if you try and stick a non-video card in their x16 PCI-E slot. I’ve run both discrete and on-chip GPU* at the same time on my ASRock H67 board. My ASRock even has a few BIOS settings that deal with that scenario.

        * In a multi-display setup, this also gets you around the QuickSync w/ a discrete video card issue, too.

      • OneArmedScissor
      • 9 years ago

      As is standard fare for Intel, they’re using simpler chips than everyone else and brute forcing clock speed to make up the difference. With power gating larger chips and switchable graphics becoming so prevalent, they’ll only get away with it for so long.

    • can-a-tuna
    • 9 years ago

    The HD6450 is clearly meant for a lower end of the graphics segment than the GT430. The average price of a GT430 on Newegg is $75, and some models even go to $85. Comparing the cheapest-priced GT430 to the “asking”-priced HD6450 is more or less biased. Come back with price-performance conclusions when there are 10 or more Radeon models available.

      • Cyril
      • 9 years ago

      If Radeon HD 6450 cards can be found for less than $55 when they hit e-tail listings, then I’ll be sure to update the review.

      The fact remains that the GT 430 has gotten cheaper since its launch: I see two variants priced at $60, one priced at $65, and three priced at $70 at Newegg right now. The “average” pricing is indeed closer to the GT 430’s old launch price of $79.99, but I think that would have little relevance to a consumer comparing the two products.

    • ShadowTiger
    • 9 years ago

    Thanks for this article… it turns out I was overestimating the performance of the Sandy Bridge chips.

    It will be interesting to see how the fusion chips compare though that probably would only tempt me in a laptop.

    • swampfox
    • 9 years ago

    Glad to see a different Civ V leader for once. Montezuma’s stare always makes me feel like I owe him something…

      • Meadows
      • 9 years ago

      I think it’s just that Scott digs American history while Cyril prefers flashing Europe a bit. Or maybe skin reveal and makeup versus men in togas.

      Who knows.

    • jensend
    • 9 years ago

    “The Intel IGP appears to do well here, but it cheated—by repeatedly making Left 4 Dead 2 crash when we set the “shader detail” setting to “high” or “very high.” We had to settle for the “medium” setting, which is less taxing.”

    What??? Look, that doesn’t excuse putting two entirely incommensurate numbers in the same graph. If you want to report the “very high” results, that’s fine- but don’t put the Intel result in the same graph; if you want to compare the Intel result to something, turn the option down for the other cards too and rerun the test.

    As it stands, this is no better than “the weather in Bamako appears moderate here, but that’s because we reported it in Celsius while the rest of the temperatures are Fahrenheit” or “The Intel IGP appears to do well here with its score of 45, but that’s because while the other bars report framerates, the Intel score reflects the number of cashews Cyril ate during the test.”

      • Bensam123
      • 9 years ago

      Wow, I didn’t even catch that because I skimmed the charts. Yes, definite no-no.

      Reliability and validity are cornerstones of any research. If it’s not reliable it can’t be valid. Extremely misleading and makes the graph as well as the other results it’s comparing it to completely pointless. That doesn’t even take into account reputation repercussions and everything else that goes along with it. We definitely need to get more people to vote this up so it gets noticed.

      Good catch Jensend.

      • jensend
      • 9 years ago

      Perhaps somebody on the TR staff can explain why the lapse? Or promise that as penance they’ll all be (re-?) reading Tufte’s “Visual Display of Quantitative Information” and revisiting the principles of accurate data representation?

      • Mystic-G
      • 9 years ago

      Clearly they should have created a separate graph to clarify what was going on. It is quite misleading.

    • DancinJack
    • 9 years ago

    I like that we have Cyril doing some GPU reviews now. It’s fun to see the different writing styles while still covering the same technology.

    I don’t know if you guys care, but you and Scott represent your power consumption charts differently. You have highest power at the top while Scott puts the juice sippers at the top. Doesn’t really matter to me, I just figure you might want to stick with one convention. Same with your noise level charts.

      • Bensam123
      • 9 years ago

      Yeah, the extra manpower is also welcome. There are rarely budget reviews on here, as well as other types of reviews. Cyril’s articles aren’t exactly breathtaking, but they’re definitely improving. I’m sure with more time he’ll put out some good ones.

      Agree with conformity among charts for fast referencing and not being misleading.

    • BoBzeBuilder
    • 9 years ago

    First. Great job Cyril, two GPU reviews in one week.

    Just bought a GTS 210 for an office machine. I didn’t bother to look for anything newer; what’s the point when it’s rendering spreadsheets.

      • ClickClick5
      • 9 years ago

      The Microcenter near my house in Denver still sells the Geforce 5200…..(bites tongue)….It will do spreadsheets…..

        • paulWTAMU
        • 9 years ago

        yeah, I’ve seen that and the 5500 in WalMart fairly recently…does it have *any* improvement over IGPs now?

          • Meadows
          • 9 years ago

          No, IGPs are markedly better, so it’s literally a waste of cash. Catches dust and complicates airflow, adds noise, has inferior decoding and standards support compared to a new IGP, draws additional power for inferior performance, and there are no official drivers released for the 5000 series anymore and it’s not officially supported at all in Windows 7 to begin with. (You can, however, still find old drivers for usage in XP. Pensioner videocard meets pensioner OS.)

            • bthylafh
            • 9 years ago

            One hopes they’d not stick something like that in a machine with newer IGP, considering that it’s an AGP card that wouldn’t even fit into the PCIe slot.

            Derp.

          • bthylafh
          • 9 years ago

          I can just see your average Walmart luser looking at the boxes and thinking “it says 5500 on it, that’s better than this other one that only says 4850, and look at all that RAM! What a bargain!”.

        • bthylafh
        • 9 years ago

        Yech. I don’t believe you can use that with Win7.

        • Lazier_Said
        • 9 years ago

        Microcenter here shows 5 new units of “Diablotek VARX-8P ATi Rage XL 8MB SDRAM PCI Video Card” and 8 more of “Diablotek VAR128P-32P ATi Rage Pro 32MB SDRAM PCI Video Card”.

        Makes a FX5200 seem pretty spectacular.

          • bthylafh
          • 9 years ago

          Wow. We used a few of those Rage XLs about seven-eight years ago for server video cards, mostly because my boss didn’t believe in on-motherboard anything.

          Do they have any Trident stuff, perchance? 😛

            • ClickClick5
            • 9 years ago

            HA! I have one of these puppies in the basement if anyone wants it. 1MB of ram, come on now! I’ll sell it for $5! :p

            This guy lasted through the Win95 era…..ah…..floppies, Pentium MMX, BSOD on a bi-daily basis. 16 bit color and 800×600 res. Talk about being a “gangsta”.

            http://bryankollar.com/shopping/images/trident%20tgui9680.jpg

            • yuhong
            • 9 years ago

            In fact, Rage XLs were often themselves integrated into server motherboards.

      • phaxmohdem
      • 9 years ago

      In my opinion these cards should be marketed (and perhaps benchmarked) with business users in mind rather than gamers. Sure, they CAN game under certain conditions, but where I work we purchase low-brow cards like this, or even 8400GS’s, for the sole purpose of enabling dual-monitor support on our Core2 HP machines that have integrated Intel craptasticness and only one video output.

      The problem is that most modern business PCs now have integrated graphics that natively support dual video outputs, which combined with the increasing horsepower of chips like Llano, and Sandy Bridge will render this market segment essentially useless. These may be the last of their kind.
