Battle of the Radeon HD 6950s

We aren’t graced with a cornucopia of tightly spaced game releases every fall. This one looks particularly bountiful, though. Rage is fresh out the door, and still to come are juggernauts like Battlefield 3, Call of Duty: Modern Warfare 3, Batman: Arkham City, and The Elder Scrolls V: Skyrim. As the days get shorter and the weather colder, some of us are preparing to shutter ourselves in front of our PCs for a whole season of gaming.

Some of us are also going to want to grab new graphics cards in order to fully enjoy this latest round of releases. Therein lies a problem. As plentiful as these fresh titles will be, they’ll still be no match for the sheer quantity of graphics cards in the market. What to choose?

Avid TR readers will know to check our existing portfolio of GPU reviews for the goods on the current crop of chips from AMD and Nvidia. Silicon is only part of the story, though. Even when they’re based on the same GPU, graphics cards can come with different coolers, various combinations of higher-than-stock core and memory clock speeds, and other niceties like bonus display outputs and free bundled games. They’re all priced differently, too, sometimes quite far apart from each other. Choosing a given GPU might not be so tough, but selecting the right card based on that chip, as it turns out, is a whole other story.

Enthusiasts seeking the sweet spot in graphics performance would do well to consider the Radeon HD 6950 1GB. Cards can be snagged for less than $250, and they deliver ample performance at resolutions up to 1080p, making them great choices for the next wave of games. If you’d like more details about what makes the 6950’s Cayman graphics processor tick, check out our original review of the 6900 series. Today, we’ll be focusing on the attributes of Radeon HD 6950 1GB cards from Gigabyte, MSI, and XFX. By the time we’re done, you should have a clear picture of not just which card is the fastest, but also where they stand in terms of noise levels, power consumption, overclocking potential, and bang for your buck.

Gigabyte’s GV-R695OC-1GD

Let’s start alphabetically with our Gigabyte offering. Pardon the awkward-to-read model number, but with card makers peddling multiple variants based on the same GPU, we need to identify the precise model we’re going to review. The GV-R695OC-1GD is perhaps the most visually striking of the cards we’ve assembled, but mercifully, it’s not because of gaudy decorations. The 11.4″ circuit board is dotted with not one, not two, but three PWM fans. These fans rest on a vapor-chamber cooler with three copper heatpipes poking out. It’s quite a sight to behold.

The ample cooler is appropriate, because this bad boy features the highest core clock rate of our three contenders: 870MHz, well above AMD’s 800MHz stock speed. The gigabyte of GDDR5 memory runs at the AMD-prescribed 1250MHz, however, for an effective transfer rate of 5.0 GT/s. Gigabyte has eschewed the reference port arrangement, trading the two mini-DisplayPort outputs for a single, full-sized DisplayPort output. There are still two DVI outputs and a gold-plated HDMI port next to that.

This card belongs to Gigabyte’s Ultra Durable VGA series, which means its circuit board packs twice as much copper as traditional designs, “1st tier” memory from either Samsung or Hynix, solid-state Japanese capacitors, ferrite-core chokes, and low-RDS MOSFETs. In simpler terms, Gigabyte claims these perks enable 5-10% lower GPU temperatures, 10-30% greater overclocking capabilities, and a 10-30% reduction in power switching loss compared to the reference design. We’ll see how the card compares to the competition in our power and noise testing shortly.

The bundled CD doesn’t include any overclocking software—just drivers and, for some reason, the Yahoo! Toolbar. However, Gigabyte has a utility called EasyBoost that can be downloaded from its website. EasyBoost lets you adjust clock and fan speeds in much the same way as AMD’s Overdrive control panel, and like Overdrive, it limits overclocked speeds to 900MHz for the GPU and 1325MHz for the memory. On the upside, you can use the tool to update the card’s VGA BIOS.

Gigabyte’s GV-R695OC-1GD sells for $239.99 before a $20 mail-in rebate at Newegg. Right now, it also ships with a free copy of Codemasters’ DiRT 3. The card is covered with a three-year warranty.

Read on for the skinny on our other two contenders.

MSI’s R6950 Twin Frozr III 1G/OC

Our second contender, the MSI R6950 Twin Frozr III 1G/OC, looks a bit less extravagant (and shorter, at 10.6″) than the Gigabyte card. Still, with dual cooling fans and no fewer than five heat pipes, the Twin Frozr III is by no means subdued.

MSI claims its cooling solution pumps 20% more airflow and generates 13.9 dB less noise than the reference AMD cooler, all the while maintaining lower temperatures. That’s not all. Tucked away at the edge of the PCB, almost hidden under the cooling shroud, lies a slide switch for toggling between two fan presets.

Out of the box, the card comes with the switch in the position pictured above—”Performance.” You’ll want to flick it to the right to select the “Silence” preset. We’ve tested both, so stay tuned for decibel numbers.

Just like its Gigabyte rival, MSI’s Twin Frozr III ventures beyond AMD’s reference specifications. MSI has pushed the core speed from 800 to 850MHz and the memory speed from 1250 to 1300MHz, yielding a 5.2 GT/s peak transfer rate between the GPU and the accompanying 1GB of GDDR5 memory. MSI also touts a special sauce that purportedly makes the card more stable, durable, and tolerant of high speeds than the reference design: a 6+2-phase power distribution system, Hi-c solid-state capacitors, ferrite-core chokes, and the ability to tweak GPU, memory, and PLL voltages independently (which requires the latest Afterburner beta from MSI’s website).
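The effective transfer rates quoted here and for the Gigabyte card follow directly from GDDR5’s quad-pumped signaling: the effective rate is simply four times the base memory clock, and peak bandwidth follows from the 6950’s 256-bit memory bus. A quick sanity check, using the clock speeds from this review:

```python
# GDDR5 transfers four bits per pin per clock, so the effective
# transfer rate is 4x the base memory clock.
def effective_rate_gtps(mem_clock_mhz):
    """Effective transfer rate in GT/s for GDDR5 at a given base clock."""
    return mem_clock_mhz * 4 / 1000

def bandwidth_gbps(mem_clock_mhz, bus_width_bits=256):
    """Peak memory bandwidth in GB/s (the HD 6950 has a 256-bit bus)."""
    return effective_rate_gtps(mem_clock_mhz) * bus_width_bits / 8

print(effective_rate_gtps(1250))  # 5.0 GT/s (reference and Gigabyte clocks)
print(effective_rate_gtps(1300))  # 5.2 GT/s (MSI and XFX clocks)
print(bandwidth_gbps(1300))       # 166.4 GB/s peak for the 1300MHz cards
```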

Unlike Gigabyte, MSI sticks with the default port arrangement—dual DVI, dual Mini DisplayPort, and one HDMI output. A DisplayPort adapter comes in the box, in case none of your displays have Mini inputs. Dig further into the box, and you’ll find a CD containing MSI’s Live Update 5 and Afterburner utilities. Live Update 5 is designed to grab BIOS and driver updates automatically. Afterburner, meanwhile, offers finer fan speed and overclocking controls, not to mention more thorough monitoring capabilities, than AMD’s Overdrive control panel.

As you can see in the screenshot, Afterburner lets users manually set a response curve that dictates how fan speed ramps with GPU temperature. That feature comes in especially handy for folks who want their cards to stay as quiet as possible at idle. Out of the box, Afterburner exposes the same amount of overclocking headroom as AMD’s Overdrive control panel—that is, the card is limited to 900MHz core and 1325MHz memory speeds. The latest Afterburner beta lets you go beyond that, but it requires editing a configuration file, and MSI tells us the functionality isn’t officially supported.

You’ll find the MSI R6950 Twin Frozr III 1G/OC at Newegg for $259.99 before a $25 mail-in rebate. The card comes with a three-year warranty.


XFX’s HD-695X-ZDDC

Finally, we have our third contender, which you might have seen duking it out with a GeForce GTX 560 Ti not long ago.

Like the MSI model, the XFX HD-695X-ZDDC has a pair of fans and a relatively short (10.1″) circuit board. The card’s dark cooling shroud makes it look a little sleeker and stealthier than the MSI design. XFX uses only three heat pipes to connect the heatsink and GPU, but those heat pipes are made of copper.

So far as we can see, XFX makes no claims about exceptionally durable or high-quality components. However, it boasts that XXX Edition cards like this one have been subjected to “rigorous testing methods, including 55°C burn chamber testing, stress testing, even shock & vibe testing.” The HD-695X-ZDDC has the lowest GPU clock speed of the three cards we’re testing—830MHz, up from the stock 800MHz. Its 1300MHz memory clock matches the MSI offering and holds a 50MHz advantage over both the reference design and the Gigabyte card.

XFX doesn’t offer custom overclocking software, but AMD’s Overdrive control panel takes care of the basics—and its interface isn’t ugly as sin, which is always nice. The overclocking limits are even harsher with this card, though: 840MHz for the core and 1325MHz for the memory.

Right now, Newegg sells the XFX HD-695X-ZDDC for $269.99 before a $30 mail-in rebate, with a free copy of DiRT 3. Unlike its counterparts, the card comes with lifetime warranty coverage. You’ll need to register on XFX’s website to qualify for lifetime coverage; otherwise, the warranty tops out at two years.

Our testing methods

Since the purpose of this article is to compare three very similar cards, we’ve limited our benchmark suite to four games and left out competing products from Nvidia. If you’d like to see how the Radeon HD 6950 compares to a wider range of competitors, we suggest studying our XFX 6950 vs. Zotac GTX 560 Ti comparison from last April and our initial review of the GeForce GTX 560 Ti from January, which includes numbers for a stock-clocked Radeon HD 6950 1GB.

We conducted testing using the Catalyst 11.10 Preview 2 driver from AMD. We left optional AMD optimizations for tessellation and texture filtering disabled. As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result. Our test system was configured as follows:

Processor: Intel Core i5-750
Motherboard: Asus P7P55D
North bridge: Intel P55 Express
South bridge: (none; the P55 is a single-chip design)
Memory size: 4GB (2 DIMMs)
Memory type: Kingston HyperX KHX2133C9AD3X2K2/4GX DDR3 SDRAM at 1333MHz
Memory timings: 9-9-9-24 1T
Chipset drivers: INF update, Rapid Storage Technology
Audio: Integrated Via VT1828S
Graphics: XFX HD-695X-ZDDC with Catalyst 11.10 Preview 2 drivers
          Gigabyte GV-R695OC-1GD with Catalyst 11.10 Preview 2 drivers
          MSI R6950 Twin Frozr III 1G/OC with Catalyst 11.10 Preview 2 drivers
Hard drives: Samsung SpinPoint F1 HD103UJ 1TB SATA, Western Digital Caviar Green 1TB
Power supply: PC Power & Cooling Silencer 760W
OS: Windows 7 Ultimate x64 Edition with Service Pack 1

Thanks to Asus, Intel, Corsair, Kingston, OCZ, Samsung, and Western Digital for helping to outfit our test rigs with some of the finest hardware available. Thanks to XFX, Gigabyte, and MSI, as well, for supplying the graphics cards for testing.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • Three of the four games we tested lacked scripted, repeatable benchmarks, so we used the Fraps utility to record frame rates while playing a 90-second sequence from each game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We tested each Fraps sequence five times per video card, in order to counteract any variability.

  • We measured total system power consumption at the wall socket using a P3 Kill A Watt digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Bulletstorm at a 1920×1200 resolution with 4X antialiasing.

  • We measured noise levels on our test system, sitting on an open test bench, using a TES-52 digital sound level meter. The meter was held approximately 12″ from the test system at a height even with the top of the video card. In order to lower our noise floor, we conducted our noise testing with the system powered by a Corsair HX750 power supply, not the PC Power & Cooling Silencer 760W.

    You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise levels were measured. Of course, real-world noise levels will vary greatly with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, the placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used GPU-Z to log GPU temperatures during our load testing.
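The median-of-several-runs approach described above is easy to express in code. Here’s a minimal sketch with made-up frame rates, not results from our testing:

```python
# Report the median of repeated benchmark runs to damp the
# run-to-run variability of manually played Fraps sequences.
from statistics import median

def report_fps(runs):
    """Median frame rate across repeated runs of one benchmark."""
    return median(runs)

# Hypothetical average FPS from five Fraps passes of the same sequence:
sample_runs = [58.1, 61.4, 59.7, 60.2, 59.9]
print(report_fps(sample_runs))  # 59.9
```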

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.


Bulletstorm

I’ve made no secret of my appreciation for Bulletstorm’s cathartic gameplay and gorgeous environments, so it seems like a fitting start to our round of benchmarking. This shooter was tested at 1920×1200 with 4X antialiasing and the detail settings cranked up to “high.”

Since this game lacks a built-in benchmarking mode, I played through the first 90 seconds of the “Hideout” echo five times per card, reporting the median of average and low frame rates obtained.

The Gigabyte card might have a slightly higher core clock speed, but the MSI wins this round, probably thanks to its quicker memory. Chances are you won’t be able to tell the difference between the three contestants, though, especially when using a 60Hz display.

Crysis 2

Crytek’s cross-platform shooter has garnered criticism for omitting features like DirectX 11 shader effects at launch and for being less PC-focused than the original Crysis and Far Cry. Nevertheless, this title can make a high-end gaming PC sweat—and it looks gorgeous.

Crysis 2 is another one of those games without a built-in benchmark, so I resorted to 90-second Fraps sessions, just as with Bulletstorm. I ran and gunned my way through the game’s version of Battery Park five times per card, sticking to the same path through the level to avoid drastic differences between samples. The game was set to run at a 1920×1200 resolution with the “Extreme” detail preset. (We eschewed the game’s DirectX 11 patch, since it seems to be poorly optimized, and our 6950 cards already got a little choppy with the “Ultra” preset.)

The Twin Frozr III pulls off the highest median frame rate again, albeit in a tie with the Gigabyte card, which has a higher minimum frame rate. Here, too, the differences between individual cards are negligible.

Deus Ex: Human Revolution

Although it wasn’t developed by the same studio as the original Deus Ex, Eidos Montreal’s Human Revolution really takes after its predecessor, featuring the same first-person, stealth-oriented gameplay and strong RPG elements. The game’s Crystal Dynamics engine serves up some rather nice graphics, too, with support for DirectX 11 tessellation and all kinds of visual bells and whistles. It doesn’t hurt that the game is a lot of fun.

We tested Deus Ex: Human Revolution by running through the beginning of the game’s first real mission at the Sarif Manufacturing plant. We used a mixture of stealthy sneaking and gunplay, since the game allows for both approaches. The game was run at 1920×1200 with high FXAA antialiasing, 16x anisotropic filtering, soft shadows, high screen-space ambient occlusion, high depth-of-field, and both post-processing and DirectX 11 tessellation enabled.

DE: HR seems to coax greater performance separation out of our trio of cards, with the MSI card beating the XFX by 3-4 FPS. Gigabyte finds itself in the middle of the pack.

TrackMania 2 Canyon

Released by little-known development studio Nadeo last month, TrackMania 2 Canyon blends gorgeous graphics with wild, arcade-style racing that involves loop-the-loops and bursts of speed that would put a maglev train to shame.

We tested TrackMania 2 Canyon using the built-in benchmark at 1920×1200 with FXAA, 16x anisotropic filtering, and the “very nice” detail preset.

This little racing game is more demanding than Bulletstorm and Deus Ex, believe it or not, with our cards producing lower frame rates here than in those other games. Gigabyte gets the gold medal in TrackMania 2, and the order of the cards on the podium suggests this game favors higher GPU clock speeds over faster memory.

Power consumption

Interesting. The MSI card outran the competition in three out of our four games, yet it’s the most power-efficient of the bunch.

Noise levels and GPU temperatures

Based on our temperature and noise results, it seems that the XFX card is biased toward lower noise levels rather than cooler temperatures. The other contenders seem to balance the two a little more gracefully—especially the MSI card with its fan switch set to quiet mode.

That said, differences of 1-2 dB are fairly inconsequential. When noise readings are this close, actually listening to the cards and noting the pitch of the noise they produce is more helpful. To my ears, our three offerings sound roughly the same at idle but quite different under load. The MSI card produces more of a whine when running a game, even with the quiet fan profile selected. The XFX produces a blend of electrical chirping and soft humming, while the Gigabyte cooler makes more of a whooshing sound with no real tone to it. Paradoxically, despite its slightly higher decibel output, the Gigabyte card seems the least bothersome under load.
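For perspective on how small those gaps are, sound pressure scales as 10^(ΔdB/20), so a 1-2 dB difference amounts to only a 12-26% change in sound pressure, well within the range where pitch and tonal character matter more than the raw reading. A quick illustration:

```python
# Convert a decibel difference into a sound pressure ratio.
def pressure_ratio(delta_db):
    """Ratio of sound pressures corresponding to a dB difference."""
    return 10 ** (delta_db / 20)

print(round(pressure_ratio(1.0), 2))  # 1.12 -> a 1 dB gap is ~12% more pressure
print(round(pressure_ratio(2.0), 2))  # 1.26 -> a 2 dB gap is ~26% more pressure
```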


Overclocking

All three of our contenders have higher clock speeds than AMD’s reference design, but that doesn’t mean they’re entirely lacking in overclocking headroom. Quite the opposite, in fact.

The MSI and Gigabyte cards both climbed happily up to the maximum speeds allowed by their respective overclocking utilities out of the box—900MHz for the core and 1325MHz for the memory—without instability or apparent visual artifacts. The XFX card also reached its peak without issue, although the Overdrive control panel would only let us push it to 840/1325MHz. Here’s how the cards performed in TrackMania 2 at these overclocked settings:

Not bad for a free performance jolt. Unsurprisingly, the Gigabyte and MSI cards are neck and neck when overclocked, while the XFX card drags its feet a little. As always with overclocking, though, your mileage may vary.
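For reference, here’s how much factory-to-maximum headroom each card had, computed from the clock speeds quoted above (these are the utility-imposed ceilings, not silicon limits):

```python
# Percentage headroom between factory core clocks and the maximum
# speeds the overclocking utilities allowed in this review.
def headroom_pct(factory_mhz, max_mhz):
    return (max_mhz / factory_mhz - 1) * 100

cards = {
    "Gigabyte GV-R695OC-1GD": (870, 900),
    "MSI R6950 Twin Frozr III": (850, 900),
    "XFX HD-695X-ZDDC": (830, 840),
}
for name, (factory, max_oc) in cards.items():
    print(f"{name}: +{headroom_pct(factory, max_oc):.1f}%")
```

The XFX card’s meager 1.2% ceiling comes from Overdrive’s 840MHz cap rather than anything inherent to the silicon.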


Conclusions

Before we render our final verdict, let’s look at how the cards compare in one of our famous scatter plots. We’ve plotted the cards based on their average performance in our benchmarks on the Y axis and their price on the X axis. We haven’t taken overclocked performance, temporary rebates, or game bundles into account.

Gigabyte GV-R695OC-1GD

October 2011

Gigabyte’s entry is the cheapest of the pack at only $239.99 before a $20 mail-in rebate—and it comes with a coupon for DiRT 3. Despite its low price, this card performed almost as well as the MSI model overall out of the box, as its position in the scatter plot illustrates. Overclocking and cooling performance were identical, too, and the Gigabyte card’s cooler was easier to tune out on our open test bench. (Our decibel meter did pick up a slightly higher reading from it, but we’re talking about such a small delta that the subjective pitch of the noise was more important.) With its only real downsides being a rather long circuit board and slightly higher idle power consumption than the competition, the Gigabyte GV-R695OC-1GD is deserving of our Editor’s Choice award.

MSI’s entry produced the highest overall frame rates, consumed the least power, shipped with the best overclocking and fan control software in the box, and overclocked effortlessly. Our scatter plot suggests the Twin Frozr III is second-best in terms of performance per dollar at $259.99, before a $25 mail-in rebate. That makes it worthy of our TR Recommended award.

Finally, the XFX model is the most expensive at $269.99, although a rebate knocks it down by $30 right now. This is the slowest of the three cards. While it has the quietest cooler according to our decibel meter, it produces an electrical chirping sound that makes it harder to tune out, and its GPU runs quite a bit hotter than the other solutions. XFX’s lifetime warranty nevertheless beats anything the competition has to offer.

As we said earlier, the performance differences between these three Radeon HD 6950 1GB variants amount to very little—and may be difficult or impossible to notice. That leaves us thinking most folks will be better off going with Gigabyte’s GV-R695OC-1GD, despite its long circuit board. The mail-in rebate and free game only sweeten the pot, as do the overclocking results we witnessed and the quiet cooler. Just keep in mind that overclocking success can’t be guaranteed, and that rebates and game bundles are often ephemeral.

If you don’t mind paying a little more and don’t care for DiRT 3, then you may want to consider the MSI card, which is the fastest out of the box and the most tweakable. Folks with a golden ear may want to give the MSI a long look (or listen), too, since its Afterburner utility has impressive fan control options. We still think the Gigabyte card sounded quieter, though.

Comments closed
    • kamikaziechameleon
    • 11 years ago

    We need a new roundup of GPU’s and BF3 benchmarks.

    • pogsnet1
    • 11 years ago

    Thumbs up MSI, the fastest in the bunch yet the lowest power consumption, even though all is negligible.

    • dpaus
    • 11 years ago

    Close; was working my ass off to make wife happy. When she ditched me, I had no one left to make happy but myself. Results were entirely agreeable 🙂

    • dpaus
    • 11 years ago

    I rode right past the TR BBQ last summer, probably less than 1 mile from it, but unfortunately, 4 days too early (was on my way to Sturgis).

    • dpaus
    • 11 years ago

    I can get 27 – 28 mpg on the highway, cruising at 60 mph. I consider that fantastic for this car.

    City mileage? My doctor says I shouldn’t talk about that…. But at the same time, when the weather is nice and the T-roofs are off, I don’t care how much gas I burn cruising around town, I look [i<]cool[/i<] 🙂

    • JustAnEngineer
    • 11 years ago

    Price-wise, the Radeon HD6950 costs the same as a GeForce GTX560Ti.

    Performance-wise, a pair of Radeon HD6950s in Crossfire-X edges out a pair of GeForce GTX570s in SLI, but the GeForce 570 cards cost 50% more.
    For single-card performance, the Radeon HD6950 falls somewhere between the two GeForce cards.

    • indeego
    • 11 years ago

    Last page mistake:
    “despite is long circuit board”

    • BobbinThreadbare
    • 11 years ago

    I think you go with price. The idea is to see what the best card you can get for x dollars is.

    • LoneWolf15
    • 11 years ago

    Only the at-release cards are identical. Vendors generally use AMD’s reference design for the first ones. Reference cards usually are overbuilt, because the design is meant for stability.

    It used to be that Rev.2 cards were the chance to make a card even better. Now they can go either way. Some vendors change the voltage regulation, for better –or for cheaper. Either way, I haven’t seen anyone have a lot better luck overclocking a non-reference design over reference –though non-reference fans can be quieter.

    I usually go reference design.

    • LoneWolf15
    • 11 years ago

    Does she game? If not, why give her the 6950?

    I just rebuilt my wife’s system. She’s online regularly, but the most she ever plays is Zuma Blitz, so I got a deal on a Geforce GT 430 and put that in.

    She probably doesn’t even need that much, but when bought for $35 shipped slightly used on someone’s FS/FT forums, with a 2 x 2GB DDR3 kit thrown in for free, I couldn’t beat that with a big stick. My wife gets great stuff from me, but my leftovers are usually massive overkill.

    Of course, it makes a good addition to my small folding farm… 😉

    • LoneWolf15
    • 11 years ago

    6950 cards no longer support shader unlocks; that was reference-design only.

    • willyolio
    • 11 years ago

    rumors say they are attempting a release before christmas, but supply will be very limited even if they manage it. probably late jan by the time you can actually buy one, earliest.

    • kamikaziechameleon
    • 11 years ago

    I was noticing more bugs that any frame rate drops. I assume as they remove bugs we will also see further optimizations on both drivers and in the game engine post launch so I expect to get continually better frame rates as time goes on without changing hardware.

    That being said I do want some new hardware but I’m not expecting the return on performance improvement by making the jump just yet. My 1090T really like BF3.

    • shaq_mobile
    • 11 years ago

    yeah my 470 and i3 530 @ 3.6ghz seems to be fine. though it does get a little chunky in caspian.

    • shaq_mobile
    • 11 years ago

    LOL. It’s strange how often I see people who really do appear to hate their jobs after making years of sacrifice to get there. My job is pretty relaxing and I have a wonderful home life. Very low stress. I suppose being single has it’s advantages. 🙂

    • shaq_mobile
    • 11 years ago

    LS6 are so sweet. I wish I had one but all the local junkyards want 1k+ for a wrecked one and don’t guarantee them to not be smashed on top of charging a pulling fee. That and it took me 3 months to build my 383 vortec and drop it in. Who knows how long it would take for an LS6. If I had a shop or even just a garage I’d do it. I had to do this last engine drop in my parents driveway during the summer. 🙂

    326 ponies at the wheel is fantastic. What mileage you see?

    • kamikaziechameleon
    • 11 years ago

    Why are Nvidias products not priced accordingly??? I understand the 580 and 590 are kinda peerless cards but the 570 and 560 always seem to be about 50 dollars to 100 dollars more than they should be.

    I would buy a Nvidia card for physX but the premium is to much for me to absorb on a feature and only 1 in 10 games feature anything to do with it, not to mention nvidia themselves have been making goofy moves with regards to turning off secondary physx cards running next to AMD GPUs I’m kinda annoyed with their behavior.

    The 6850 2 Gb cards really appeal to me since I run multiple displays and aspire to get a 2560×1600 display. At any rate the remaining 2 months of life on these cards is a turn off. I’ll probably get the 7950 or 7890 next gen and upgrade my cpu and mobo ram etc. early next year.

    • kamikaziechameleon
    • 11 years ago

    These are all good products but lets be honest they are in the twilight of their life cycles. Baring any major fab or supply problems they should be replaced in 2 months time. If its any consolation I plan on holding out. My 460 plays BF3 wonderfully, even the 64 player maps.

    I was gonna rush a rebuild my system but now I’m gonna wait for an ample hardware gen leap before doing so since I can keep gaming plenty fine.

    • LaChupacabra
    • 11 years ago

    I recently had an XFX 6850 go bad (video corruption, was an overheating issue). The RMA was a little painful. There is a ticket system on their website that took 3 days, every time, to respond. From beginning to end it took a week and a half to receive an RMA. After they received it it took 5 days to fix and ship back out. If your XFX card does go bad expect to be down for about 3 weeks.

    • dashbarron
    • 11 years ago

    I think they’re trying to beat NVIDIA to the punch on theirs sometime in Q1.

    • dashbarron
    • 11 years ago

    This review just reiterates an old story. Buy whatever version/maker of a card that gives you say a connector or offers better warranty. I gravitate my purchasing decisions towards companies with better warranties.

    Good review though.

    • can-a-tuna
    • 11 years ago

    What a useful bit of information you just shared.

    • NeelyCam
    • 11 years ago

    I somehow get a feeling that you were going through the traditional route of daily grind in the office just to pay for the bills for the life you hated. Then, you did a complete 180, ditched your wife and career, reinvented yourself, and now you’re living your life to the fullest like you always dreamed of… In the ballpark?


    • Meadows
    • 11 years ago

    Well well, it’s true.

    • SHOES
    • 11 years ago

    Nice dpaus I just picked up a 96 TA… not quite an LS6 but formidable. If you get an aston you had better bring it to the bbq!

    • bcronce
    • 11 years ago

    We have a “system”. She gets little things through out the year, and I get to upgrade my comp. She gets my “left overs”, but they’re still nice parts.

    We both spend quite a bit of time online.

    • kitsura
    • 11 years ago

    I recently bought the MSI twinfozr III and I can say with 100% confidence that a shader unlock is not possible.

    • PrincipalSkinner
    • 11 years ago

    Usually I read people are hiding upgrades from their wives. But not you sir. You rub it on her face! +1 for that 😀

    • PrincipalSkinner
    • 11 years ago

    Life is hard.

    • Cyril
    • 11 years ago

    MSI has let us know that the latest beta of its Afterburner tool exposes voltage controls that were greyed out in the 2.1 release. The beta also lets users overclock beyond 900MHz, provided they first edit a configuration file, although that feature isn’t supported officially. We’ve updated the review accordingly. Our conclusion remains unchanged, however.

    • dpaus
    • 11 years ago

    Seconded. But ‘equivalent’ in…. what? Price? Performance? Freebies? Artwork?

    • dpaus
    • 11 years ago

    Was that ‘q’ in your username supposed to be a ‘g’??

    Not that I’m in any position to talk: 2002 Trans Am Firehawk, 17th-last F-body off the production line, got the LS6 block,dyno’d at 326 RWHP with an A4 and 3.23s. Just over 216,000 trouble-free kms, and tentatively scheduled for replacement in the spring. I’m thinking ‘Aston Martin’ 🙂

    • krazyredboy
    • 11 years ago

    I’m impressed…you managed to crop the picture, just right, on the Deus Ex image, where it states “**** Sarif.” For a moment, I thought you had sneaked it in…though, I’m certain you would never, intentionally, do so.

    • Forge
    • 11 years ago

    No mention of shader unlocks? Since that can be done and undone easily, I would have liked to see that in the OCing results.

    • Xenolith
    • 11 years ago

    Mid-range and mobile parts in December. Suspect the high-end version will be a couple months after that.

    • luisnhamue
    • 11 years ago

    I really feal bad, I couldn’t buy a twin fan HD6950, I have a Sapphire 2GB version, but not the “Dirt” one

    • bcronce
    • 11 years ago

    I paid $280 for my Reference 6950 2GB, but I got it back when the unlock still worked.

    I have my 6950 flashed to 1536 units, 840mhz/1325 @ 1.085v/1.5v. Since these are the flashed settings, I went a bit relaxed on the timings

    I can manually OC it 890/1380 @ 1.08v/1.5v and be stable. I did 30min of Furmark, 30min of Heaven, and 1 hour of GPU protein folding. The lowered voltages REALLY helps with the temp. 6970 stock runs 880/1375 1.15v/1.6v

    Once the next gen(7900?) comes out, I’m buying one of those and putting my current into my wife’s comp.

    All-in-all, I am very happy with this card.

    • bcronce
    • 11 years ago

    Somewhere between Fall and Spring, depending on if they do a paper launch or not.

    Die shrunk versions of the current gen will come out first, which will use substantially less power.

    • ssidbroadcast
    • 11 years ago

    How far off is the next-gen of ATI cards?

    • UberGerbil
    • 11 years ago

    A 1984 trans am? Woot, Michael Knight.

    • UberGerbil
    • 11 years ago

    Love these round-ups. Would be interested in something around the $150 price point (or “under $200,” anyway), and also an equivalent comparison of offerings from nVidia.

    • shaq_mobile
    • 11 years ago

    I guess I’ve always dismissed the cards from different distributors as being completely identical and the only difference being dustbuster size.

    It’s a shame that I spent all my money on getting a $500 1984 trans am in running and (somewhat) socially acceptable condition. I’d love to get a pair of those gigabyte 6950’s together an oced 8150 all on watercooling. I’d ditch one of those wimpy heater cores they use and drop in one of my spare radiators. If it can cool 600 farty american foot pounds it *might* suffice to cool a bulldozer. Who knows, maybe I could play bf3 at full detail with no noticeable frame drops. Lol.
