AMD’s Radeon HD 6990 graphics card


Dual-GPU graphics cards have always been kind of strange. Technically, one of them is probably the fastest video card on the planet at any given time. For over a year now, for instance, the Radeon HD 5970 has held that title. Yet dual-GPU cards don’t garner loads of attention, for a host of reasons: they’re subject to the same performance and compatibility pitfalls as any multi-GPU configuration, they tend to have rather extreme power and cooling needs, and they’re usually really freaking expensive, to name a few.

Nevertheless, these odd creations are here to stay. Much of the credit for that fact goes to AMD. The company has been carefully refining its dual-GPU products over the past few years, ever since it decided to stop producing as-big-as-you-can-make-them GPUs. Instead, recent flagship Radeons have been based on pairs of mid-sized chips.

The Radeon HD 5970 was odd in that it wasn’t the absolute pinnacle of extremeness one would expect from this class of product. The card’s sheer length and price tag were both plenty extreme, but its 725MHz default clock speed gave it performance closer to a pair of Radeon HD 5850s than to a pair of higher-end 5870s. The limiting factor there was power draw. AMD had to tune the card conservatively to ensure that it didn’t exceed 300W—its rated power draw, and the most the PCIe slot plus its 6- and 8-pin auxiliary power connectors could supply—even in absolute peak cases. To skirt this limitation somewhat, AMD practically encouraged 5970 owners to venture into 400W territory by overclocking their cards, even going so far as to screen the chips to ensure they would reach clock speeds similar to the Radeon HD 5870’s. It was innovation, of a sort, born of necessity.

Into Antilles

Now the 5970’s successor has arrived in the form of a product code-named Antilles: the Radeon HD 6990, an all-new card based on a pair of the “Cayman” GPUs that power the Radeon HD 6970. Cayman is something of an incremental improvement over the Cypress chip that powered the Radeon HD 5970, so one might expect the 6990 to be an incremental step up from the 5970, as well. That’s not quite the case, for a couple of reasons.

First, AMD has endowed the 6990 with a pair of 8-pin auxiliary power connectors and raised the card’s max power rating to 375W. That gives the card quite a bit more headroom. Second, and more critically, AMD built a power-capping feature known as PowerTune into Cayman; it allows the GPU to monitor its own power draw and ramp back clock speeds as needed to stay within its prescribed power envelope. Although PowerTune doesn’t often limit performance dramatically in typical gaming workloads, we’ve found that it will kick in when synthetic tests push the GPU past its normal bounds. That ability to contain worst-case scenarios has freed AMD to push for higher default clock speeds without fear of creating problems.
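
AMD hasn’t published PowerTune’s exact control algorithm, but the basic idea—estimate power draw from the GPU’s activity, compare it against a board-level cap, and trim the clock when the estimate exceeds it—can be illustrated with a toy loop like the one below. This is a minimal sketch with an invented power model and invented numbers, not AMD’s implementation.

```python
# Toy illustration of a PowerTune-style power cap. The power model and numbers
# here are invented for illustration; this is not AMD's actual algorithm.

POWER_CAP_W = 375.0      # the 6990's default board power limit
BASE_CLOCK_MHZ = 830.0   # the 6990's default engine clock

def estimated_power(clock_mhz, activity):
    """Hypothetical power estimate; 'activity' is workload intensity from 0 to 1."""
    return 150.0 + 300.0 * activity * (clock_mhz / BASE_CLOCK_MHZ)

def powertune_step(clock_mhz, activity):
    """Throttle just enough to bring the estimate back under the cap."""
    power = estimated_power(clock_mhz, activity)
    if power <= POWER_CAP_W:
        return min(BASE_CLOCK_MHZ, clock_mhz * 1.02)  # creep back toward the base clock
    return clock_mhz * (POWER_CAP_W / power)          # scale the clock down proportionally

clock = BASE_CLOCK_MHZ
for step, activity in enumerate([0.5, 0.9, 1.0, 1.0, 0.6]):
    clock = powertune_step(clock, activity)
    print(f"step {step}: activity {activity:.1f} -> engine clock {clock:.0f} MHz")
```

The important property, shared with the real feature, is that typical workloads never touch the cap, while pathological ones get their clocks trimmed instead of tripping the board’s power delivery.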

As a result of these and other changes, AMD has set the Radeon HD 6990’s clock speed at 830MHz while leaving all of Cayman’s execution units enabled. Each GPU on the card also has 2GB of GDDR5 memory clocked at 1250MHz, for an effective transfer rate of 5 GT/s. Those numbers put the 6990’s theoretical peak performance right in between what one would expect from a couple of Radeon HD 6950s and a couple of Radeon HD 6970s—not too shabby, to say the least.

AMD apparently wasn’t satisfied with that achievement, though. As you may know, all Radeon HD 6900-series cards have a dual-position switch on the top of the card near the CrossFire connector, ostensibly to enable one to switch to a recovery firmware in the event of a failed video BIOS flash attempt. On the 6990, however, moving that switch from its default position (2) to the other one (1) enables access to a hopped-up BIOS. AMD calls it the “Antilles Unlocking Switch for Uber Mode” or—yes, this is happening—AUSUM. Several things happen when your 6990 card goes into the umlaut-impaired uber mode. The base GPU clock rises to 880MHz, same as the 6970, and the core GPU voltage rises from 1.12V to 1.175V. Also, the board’s PowerTune limit is raised to 450W. You’re essentially overclocking your card when you switch it into uber mode; AMD doesn’t guarantee proper operation for everyone in every system. However, going AUSUM worked just fine with our 6990 sample on our Intel X58-based test system, much like the 5970 did for us at its “suggested” overclocked speed.
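
That 450W ceiling squares reasonably well with a standard back-of-the-envelope rule for dynamic power, which scales roughly with frequency times voltage squared. The quick estimate below is our own rough arithmetic under that assumption, not an AMD figure:

```python
# Rough dynamic-power scaling estimate for uber mode, assuming P ~ f * V^2.
# This is a rule-of-thumb sketch, not an AMD specification.
base_power_w = 375.0    # default PowerTune board limit
f0, f1 = 830.0, 880.0   # engine clock in MHz: default vs. uber mode
v0, v1 = 1.12, 1.175    # core voltage in volts: default vs. uber mode

scaled = base_power_w * (f1 / f0) * (v1 / v0) ** 2
print(f"Estimated uber-mode board power: ~{scaled:.0f} W")  # prints ~438 W, under the 450W cap
```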

If that’s not enough AUSUM-ness for you, AMD has given 6990 users more than enough leeway to get into real trouble. The Overdrive controls in the AMD control panel will allow GPU overclocks as high as 1200MHz, with memory overclocking as high as 1500MHz (or 6 GT/s).

| | Peak pixel fill rate (Gpixels/s) | Peak bilinear integer texel filtering rate (Gtexels/s) | Peak bilinear FP16 texel filtering rate (Gtexels/s) | Peak shader arithmetic (GFLOPS) | Peak rasterization rate (Mtris/s) | Peak memory bandwidth (GB/s) |
| --- | --- | --- | --- | --- | --- | --- |
| GeForce GTX 560 Ti | 26.3 | 52.6 | 52.6 | 1263 | 1644 | 128 |
| GeForce GTX 570 | 29.3 | 43.9 | 43.9 | 1405 | 2928 | 152 |
| GeForce GTX 580 | 37.1 | 49.4 | 49.4 | 1581 | 3088 | 192 |
| Radeon HD 6850 | 24.8 | 37.2 | 18.6 | 1488 | 775 | 128 |
| Radeon HD 6870 | 28.8 | 50.4 | 25.2 | 2016 | 900 | 134 |
| Radeon HD 6950 | 25.6 | 70.4 | 35.2 | 2253 | 1600 | 160 |
| Radeon HD 6970 | 28.2 | 84.5 | 42.2 | 2703 | 1760 | 176 |
| Radeon HD 5970 | 46.4 | 116.0 | 58.0 | 4640 | 1450 | 256 |
| Radeon HD 6990 | 53.1 | 159.4 | 79.7 | 5100 | 3320 | 320 |
| Radeon HD 6990 AUSUM | 56.3 | 169.0 | 84.5 | 5407 | 3520 | 320 |

With or without the AUSUM switch enabled, the 6990’s specifications are downright staggering. On paper, at least, it’s far and away the fastest consumer graphics card ever. Of course, we’re just adding up the capacities of its two individual GPUs in the table above and assuming the best—perfect scaling—will happen. That’s not always how things work out in the real world, but the 6990 has more than enough extra oomph to overcome less-than-ideal outcomes.
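
For the curious, the 6990’s rows in that table fall straight out of Cayman’s published per-GPU unit counts—1,536 stream processors, 96 texture units, 32 ROPs, and a 256-bit memory interface—multiplied by the clock speeds and doubled for the second GPU. Here’s a quick sanity check of that arithmetic, assuming perfect scaling:

```python
# Reproduce the Radeon HD 6990's theoretical peaks from Cayman's per-GPU unit counts,
# assuming perfect scaling across both GPUs.
ALUS, TMUS, ROPS = 1536, 96, 32   # stream processors, texture units, ROPs per GPU
MEM_BUS_BITS = 256                # memory interface width per GPU
GPUS = 2

def peaks(gpu_clock_ghz, mem_gtps):
    return {
        "pixel fill (Gpixels/s)":     ROPS * gpu_clock_ghz * GPUS,
        "INT8 filtering (Gtexels/s)": TMUS * gpu_clock_ghz * GPUS,
        "FP16 filtering (Gtexels/s)": TMUS / 2 * gpu_clock_ghz * GPUS,    # FP16 filtering is half rate
        "shader math (GFLOPS)":       ALUS * 2 * gpu_clock_ghz * GPUS,    # multiply-add counts as two flops
        "rasterization (Mtris/s)":    2 * gpu_clock_ghz * 1000 * GPUS,    # two triangles per clock per GPU
        "memory bandwidth (GB/s)":    mem_gtps * MEM_BUS_BITS / 8 * GPUS, # 5 GT/s * 256 bits per GPU
    }

for label, clock_ghz in (("Radeon HD 6990", 0.830), ("Radeon HD 6990 AUSUM", 0.880)):
    print(label, {k: round(v, 1) for k, v in peaks(clock_ghz, 5.0).items()})
```

The output matches the table above to within rounding—a reminder that these figures are ceilings computed from unit counts and clock speeds, not measurements.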

The card

The Radeon HD 6970 (top) versus the 6990 (bottom)

The Radeon HD 5970 (top) versus the 6990 (bottom)

Yep, the Radeon HD 6990 is long—just a sliver shy of a full 12″, in fact, inviting all sorts of remarks that are surely beneath us. You will want to check the clearance in your case carefully before ordering up one of these puppies. Even the ridiculously lengthy 5970 is a tad shorter.

Source: AMD.

Beneath the 6990’s massive cooling shroud lies a brand-new board design that, interestingly, places a single blower in the center of the card, above the voltage regulators and the PCIe bridge chip that acts as an interconnect between the two GPUs and the rest of the system. Those VRMs, incidentally, are digital programmable units from Volterra that are unique to Antilles. AMD says they enable lower temperatures and lower power draw than conventional VRMs.

Source: AMD.

The blower is flanked by a pair of heatsinks with copper vapor chambers at their bases. AMD claims that, although this card fits into roughly the same form factor as the 5970, it moves 20% more air with this arrangement. In addition, the firm tells us the thermal interface material between the heatsinks and the GPUs is a special, phase-change variety that offers 8% better performance than the standard gray goo. Take note: if you disassemble your card, you’ll likely have to use regular thermal paste when reassembling it, sacrificing some of its cooling efficiency. We’ve avoided taking ours apart, so far, because we want our power, noise, and temperature readings to track with what you’d see from retail products.

An array of compact Mini-DisplayPort connectors allows the 6990 to sport a rich mix of display outputs while leaving room for a full slot cover of exhaust venting. The 6990, obviously, can drive up to five displays natively. Since it supports DisplayPort 1.2, it can even drive multiple displays simultaneously off of a single output with the help of a DisplayPort hub.

AMD clearly leads the industry on the display output front. The only drawback is the need for adapters to support “legacy” displays with HDMI or DVI inputs. Fortunately, every Radeon HD 6990 will ship with a trio of adapter dongles to convert those Mini-DP ports to serve other standards: one passive HDMI type, one passive single-link DVI type, and one active single-link DVI type. Out of the box, the 6990 should be able to drive a trio of single-link DVI monitors, then. The reason that third adapter is of the “active” variety is that the GPU has a limited number of timing sources for its display outputs. If you’d like to drive more than three “legacy” displays with a 6990, you’ll need additional active adapters. Similarly, driving a second or third dual-link DVI display, such as a 30″ panel, will require additional active, dual-link-capable dongles.

All of this to-do about multiple displays is, of course, only an issue because AMD has been pushing its Eyefinity multi-monitor gaming feature so enthusiastically in the past year and a half—and because the 6990 looks like the perfect device for driving large numbers of megapixels. Given the 6990’s five-way output array, AMD has pointed out how naturally this card would support an interesting display configuration: a five-display-wide wall of monitors in portrait orientation. That sounds like a whole lotta bezel area to me, but it’s certainly a bevy o’ pixels.

Before we move on to our test results, where you can see exactly how the 6990 performs, there are just a couple more details to which we should attend. Although the 6990 is being unveiled today, you likely won’t see it selling at online retailers until some time later this week or perhaps early next. When it does arrive, if you’d like to make one your very own, you need only hand over something close to its list price to your favorite online retailer. That price? $699.99.

Gulp.

That’s a lot, but given that the Radeon HD 6970 is selling for about 340 bucks a pop, this single card that has essentially two of ’em onboard isn’t priced at any great premium, believe it or not.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.

Our test systems were configured like so:

Processor: Core i7-980X
Motherboard: Gigabyte EX58-UD5
North bridge: X58 IOH
South bridge: ICH10R
Memory size: 12GB (6 DIMMs)
Memory type: Corsair Dominator CMD12GX3M6A1600C8 DDR3 SDRAM at 1600MHz
Memory timings: 8-8-8-24 2T
Chipset drivers: INF update 9.1.1.1025, Rapid Storage Technology 9.6.0.1014
Audio: Integrated ICH10R/ALC889A with Realtek R2.58 drivers
Graphics:
  • Radeon HD 5970 2GB with Catalyst 11.4 preview drivers
  • Dual Radeon HD 6950 2GB with Catalyst 11.4 preview drivers
  • Radeon HD 6970 2GB with Catalyst 11.4 preview drivers
  • Dual Radeon HD 6970 2GB with Catalyst 11.4 preview drivers
  • Radeon HD 6990 4GB with Catalyst 11.4 preview drivers
  • MSI GeForce GTX 560 Ti Twin Frozr II 1GB + Asus GeForce GTX 560 Ti DirectCU II TOP 1GB with ForceWare 267.26 beta drivers
  • Zotac GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 570 1280MB + GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 580 1536MB + Asus GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers
Hard drive: WD RE3 WD1002FBYS 1TB SATA
Power supply: PC Power & Cooling Silencer 750 Watt
OS: Windows 7 Ultimate x64 Edition, Service Pack 1

Thanks to Intel, Corsair, Western Digital, Gigabyte, and PC Power & Cooling for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • Many of our performance tests are scripted and repeatable, but for some of the games, including Battlefield: Bad Company 2 and Bulletstorm, we used the Fraps utility to record frame rates while playing a 60- or 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We raised our sample size, testing each Fraps sequence five times per video card, in order to counteract any variability. We’ve included second-by-second frame rate results from Fraps for those games, and in that case, you’re seeing the results from a single, representative pass through the test sequence. (A brief sketch of how those runs get boiled down to the averages we chart follows this list.)

  • We measured total system power consumption at the wall socket using a Yokogawa WT210 digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Battlefield: Bad Company 2 at a 2560×1600 resolution with 4X AA and 16X anisotropic filtering. We test power with BC2 because we think it’s a solidly representative peak gaming workload.

  • We measured noise levels on our test system, sitting on an open test bench, using an Extech 407738 digital sound level meter. The meter was mounted on a tripod approximately 10″ from the test system at a height even with the top of the video card.

    You can think of these noise level measurements much like our system power consumption tests, because the noise level of the entire system was measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used GPU-Z to log GPU temperatures during our load testing.
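
As promised above, here’s a minimal sketch of how a stack of per-run frame rate logs could be reduced to the single per-card averages in our bar graphs. The file names and CSV layout are hypothetical stand-ins, not Fraps’ actual output format or our actual tooling:

```python
# Illustration of boiling five Fraps runs down to one per-card average frame rate.
# File names and the CSV layout are hypothetical, not Fraps' real output format.
import csv
from statistics import mean

def average_fps(log_paths):
    """Average the per-second FPS samples within each run, then average across runs."""
    per_run_averages = []
    for path in log_paths:
        with open(path, newline="") as f:
            samples = [float(row["fps"]) for row in csv.DictReader(f)]
        per_run_averages.append(mean(samples))
    return mean(per_run_averages)

# Five runs per card, per the methods above (placeholder file names).
runs = [f"bc2_radeon_hd_6990_run{i}.csv" for i in range(1, 6)]
# print(f"Radeon HD 6990: {average_fps(runs):.1f} FPS")
```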

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Bulletstorm

This game is based on the aging Unreal Engine, but it’s stressful enough on a GPU to still make a decent candidate for testing. We turned up all of the game’s image quality settings to their peaks and enabled 8X antialiasing, and then we tested in 90-second gameplay chunks.

Our single-GPU configs all struggled with this game, as did our pair of GeForce GTX 560 Ti cards in SLI. Those 560s had the least memory of any cards we tested, at 1GB each. Multi-GPU schemes like SLI and CrossFireX have some memory overhead, and we expect that’s what troubled our 560s in this case.

The 6990, meanwhile, goes toe to toe with a thousand-dollar option from Nvidia: a pair of GeForce GTX 580s in SLI. The Nvidia alternative in the same price range as the 6990 would be a pair of GTX 570s, but those are a bit slower. Then again, a couple of 6950s in CrossFireX perform very similarly to the 6990, and flipping the AUSUM switch doesn’t get you much here, either.

F1 2010
F1 2010 steps in and replaces Codemasters’ previous effort, DiRT 2, as our racing game of choice. F1 2010 uses DirectX 11 to enhance image quality in a few select ways. A higher-quality FP16 render target improves the game’s high-dynamic-range lighting in DX11. A DX11 pixel shader is used to produce soft shadow edges, and a DX11 Compute Shader is used for higher-quality Gaussian blurs in HDR bloom, lens flares, and the like.

We used this game’s built-in benchmarking facility to script tests at multiple resolutions, always using the “Ultra” quality preset and 8X multisampled antialiasing.

The Radeons had a strong showing in the last game, but this is unexpected dominance from AMD. At the highest resolution where the GPU is the primary bottleneck, dual Radeon HD 6950s outrun a couple of GeForce GTX 580s. The 6990 is faster still, and the AUSUM switch nearly moves the 6990 into dual 6970 territory.

Civilization V
Civ V has a bunch of interesting built-in tests. Up first is its compute shader benchmark. This test measures the GPU’s ability to decompress textures used for the graphically detailed leader characters depicted in the game. The decompression routine is based on a DirectX 11 compute shader. The benchmark reports individual results for a long list of leaders; we’ve averaged those scores to give you the results you see below.

Obviously, the green team takes this one. Not every compute shader is the same, but this one runs better on Nvidia’s Fermi architecture than on Cayman. Regardless of the GPU type, though, one thing holds steady: the performance gains from adding a second GPU are real but modest. That’s why the 6990 is in an unusual position, near the bottom of the pack.

In addition to the compute shader test, Civ V has several other built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. As with the earlier compute shader test, we chose to average the results from the individual leaders.

The 6990 comes out looking pretty good here, but why is the 5970 so much faster? My guess is that this pixel-shader-intensive test is causing the Cayman GPUs to heat up and invoke their PowerTune limits. Without PowerTune, the 5970 is slower in most real gaming scenarios, but it’s quicker here.

Another benchmark in Civ V focuses, rightly, on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.

My, how the tables have turned! The GeForces take three of the top four spots. Why? I have another crackpot theory for you. There’s tremendous geometric complexity in this late-game scene, with a huge number of units in view at once. Nvidia’s Fermi architecture has some real advantages in geometry processing throughput, and I suspect they’re making themselves known here.

StarCraft II

Up next is a little game you may have heard of called StarCraft II. We tested SC2 by playing back 33 minutes of a recent two-player match using the game’s replay feature while capturing frame rates with Fraps. Thanks to the relatively long time window involved, we decided not to repeat this test multiple times. The frame rate averages in our bar graphs come from the entire span of time. In order to keep them readable, we’ve focused our frame-by-frame graphs on a shorter window, later in the game.

We tested at the settings shown above, with the notable exception that we also enabled 4X antialiasing via these cards’ respective driver control panels. SC2 doesn’t support AA natively, but we think this class of card can produce playable frame rates with AA enabled—and the game looks better that way.

The demo we used for testing here is newer than any we’ve used before, and Blizzard has made a number of changes to SC2 over time. As a result, this test turns out to be more taxing than some of our past attempts. The GeForces end up looking very good here, not just in terms of higher average frame rates but also in terms of avoiding the lowest valleys. The frame rate minimums for the Radeons, even the AUSUM 6990, are in the teens.

Battlefield: Bad Company 2
BC2 uses DirectX 11, but according to this interview, DX11 is mainly used to speed up soft shadow filtering. The DirectX 10 rendering path produces the same images.

We turned up nearly all of the image quality settings in the game. Our test sessions took place in the first 60 seconds of the “Heart of Darkness” level.

We’ve been see-sawing back and forth between clear wins for AMD and Nvidia, but this game looks like an even match. Two GTX 570s in SLI perform about the same as a 6990 or a pair of 6970s in CrossFireX. Notice, also, the excellent ~40 FPS minimums produced by a single 6970 or GTX 570. Even those single-GPU cards handle Bad Company 2 pretty darn well.

Metro 2033

We decided to test Metro 2033 at multiple image quality levels rather than multiple resolutions, because there’s quite a bit of opportunity to burden these graphics cards simply by using this game’s more complex shader effects. We used three different quality presets built into the game’s benchmark utility, with the performance-destroying advanced depth-of-field shader disabled and tessellation enabled in each case.

We’ve seen the same trend in this game for quite a while. As the image quality rises, the Radeons become more competitive. At Metro‘s “Medium” settings, two GTX 570s in SLI are easily faster than the 6990. By the time we reach the “Very high” settings, the opposite is true.

Aliens vs. Predator
AvP uses several DirectX 11 features to improve image quality and performance, including tessellation, advanced shadow sampling, and DX11-enhanced multisampled anti-aliasing. Naturally, we were pleased when the game’s developers put together an easily scriptable benchmark tool. This benchmark cycles through a range of scenes in the game, including one spot where a horde of tessellated aliens comes crawling down the floor, ceiling, and walls of a corridor.

For these tests, we turned up all of the image quality options to the max, along with 4X antialiasing and 16X anisotropic filtering.

Wow, not much drama there as the resolution changes. The Radeons again look relatively strong here, though, with the 6990 out ahead of dual GTX 570s.

Power consumption

AMD initially suggested to us that the 6990’s idle power draw should be somewhat lower than the 5970’s, and so it is. That’s not a huge difference, but it is something. Heck, the entire system based on the 6990 only draws 8W more at idle than the same system equipped with a single GeForce GTX 580.

Under load, the 6990 remains reasonable, drawing less power than a pair of GTX 560s in SLI, though it generally outperforms them. There is a price for being AUSUM, though, and apparently it’s about 50 watts. Who knew? Still, the AUSUM 6990 config draws substantially less power than our GeForce GTX 570 SLI system.

Noise levels and GPU temperatures

The 6990 is the loudest solution we tested, both at idle and, more dramatically, when running a game. That difference is especially perceptible when the card is pushing past 58 dB on our meter; you will notice it next to the other solutions. The 6990 emits a fairly loud hiss, although its pitch and tenor aren’t especially offensive compared to some of the worst solutions we’ve seen over the years.

Dual-card setups have an acoustic advantage, as our results illustrate. With four slots occupied and two full-length coolers, there’s simply more surface area available for heat dissipation. With that said, AMD has apparently tuned the 6990’s cooler fairly aggressively; it has some of the lowest GPU temperatures of the bunch, and you pay for those low temperatures with a little extra noise.

Conclusions

With a total of just seven games tested, we can ruthlessly boil down the Radeon HD 6990 and its competition to a simple price-performance scatter plot, like so:

We’ve taken the results from the highest resolution or most intensive setting of each game tested, averaged them, and combined them with the lowest prevailing price at Newegg for each of these configurations. Doing so gives us a nice distribution of price-performance mixes, with the best tending toward the upper left and the worst toward the bottom right.
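
The bookkeeping behind that plot is simple enough to show. The sketch below uses placeholder configurations, frame rates, and prices—not our measured results—just to illustrate the averaging and pairing:

```python
# Build price-performance points like those in the scatter plot above.
# All numbers here are placeholders for illustration, not measured results or real prices.
from statistics import mean

# Average FPS per config at each game's most demanding tested setting (placeholders).
fps_results = {
    "Config A": [40, 55, 62, 38, 70, 45, 50],   # seven games, seven results
    "Config B": [52, 68, 75, 47, 85, 58, 63],
}
street_prices = {"Config A": 350, "Config B": 700}  # lowest prevailing price, in dollars

points = {cfg: (street_prices[cfg], mean(scores)) for cfg, scores in fps_results.items()}
for cfg, (price, perf) in sorted(points.items(), key=lambda kv: kv[1][0]):
    print(f"{cfg}: ${price}, {perf:.1f} FPS on average")
```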

At present, in the suite of games we tested, AMD looks to have a performance advantage at several key price points. That may be a little jarring if your expectations were set several months ago, when we had something close to parity between red and green. We believe AMD has come by it honestly, delivering some impressive performance gains in recent driver releases. One of those changes, AMD tells us, is a revised resolve mechanism for multisampled antialiasing that improves frame rates generally when MSAA is in use—like in nearly all of our test scenarios—particularly in games that use deferred shading schemes. AMD’s driver developers have made some notable progress in CrossFireX multi-GPU performance scaling, too. Strong multi-GPU scaling has long been one of the hallmarks of Nvidia’s SLI, but AMD has closed that gap in recent months.

Of course, both of these changes benefit the Radeon HD 6990, which has no equal in a single-card package. This is the planet’s fastest single video card, supplanting the Radeon HD 5970 that came before it. The 6990 is even faster than two GeForce GTX 570 cards in SLI, which cost about the same amount, and the 6990 draws less power under load, even in AUSUM uber mode. Add in the 6990’s rich array of display outputs, and there’s no question Nvidia is severely outclassed at this lofty $700 price point. We just hope the 6990 isn’t quite as difficult to find over the next year as the Radeon HD 5970 was during much of its run. We do believe TSMC’s 40-nm supply problems are behind us, so we’re optimistic on that front.

Having said that, we can’t help but notice that AMD does offer a more attractive solution in terms of price, performance, and acoustics in the form of dual Radeon HD 6970 cards. You must really covet slot space—or have designs for a dual-6990, four-way CrossFireX rig—if you pick the 6990 over two 6970s. Not that there’s anything wrong with that.

We also can’t avoid noticing Nvidia still owns the title of the fastest dual-GPU solution on the market, in the form of two GeForce GTX 580s in SLI. And we have some clear indications that Nvidia may be cooking up an answer to the Radeon HD 6990 based on that same technology. The challenge Nvidia faces if it wants to dethrone the 6990 is, of course, power draw and the related cooling required. Given that two GTX 570s are slower than a single 6990 and draw more power, the GeForce team certainly has its work cut out for it. Besting the 6990 will have to involve some deep magic, or at least solid progress on multiple fronts.

Or, you know, a triple-slot cooler.

Comments closed
    • anotherengineer
    • 9 years ago

    I think if some company releases this card with 6 display ports and 2 dvdi-d and 1 hdmi ports that would be kinda neat.

    • anotherengineer
    • 9 years ago

    Wow over 120 posts and not one is mine so I guess I will post something.

    I will be excited when/if AMD/ATI comes out with a 6850 that doesn’t require external power!!!

    Now that would be cool!!

    edit – grammer

      • FuturePastNow
      • 9 years ago

      I don’t know who voted you down, but I’d love to see a passively-cooled 6850, and lowering the power consumption is a necessary step for that. But I don’t think they can lose the six-pin connector, as it’s a 127W TDP card now. For the sake of compatibility with PCIe 1.0, they’d have to get that down to 75W to make sure it could run from any x16 connector.

      If AMD were willing to rule out PCIe 1.0 compatibility entirely, they could delete the external power connector now, as a 2.0 x16 slot can provide 150W.

        • anotherengineer
        • 9 years ago

        So it is 150W from pcie 2.0, I wonder if it does run w/o the power connector plugged in on a 2.0 pcie mobo or will you get the “plug in your stuff” warning/error??

        Passively cooled would be nice also, I think the next best thing is the MSI cyclone, apparently it’s fairly quiet.

        [url<]http://www.techpowerup.com/reviews/MSI/HD_6850_Cyclone_Power_Edition/27.html[/url<]
        If only my wife wasn't off on sick leave 😐

          • FuturePastNow
          • 9 years ago

          It’ll complain about being unplugged, and probably throttle down to 2D clocks (if it can).

          I’ve heard- and this may not be correct- that cards will try to draw as much power as they can (within the spec) from the external power connector before loading the power pins in the slot.

    • CaptTomato
    • 9 years ago

    [url<]http://www.jbhifionline.com.au/plasma-lcd-tvs/plasma-tv/panasonic-viera-th-p58s20a-58-full-hd-plasma-tv/632420[/url<]
    $1690 for 58in plasma
    [url<]http://configure.ap.dell.com/dellstore/config.aspx?oc=t14u3011wau&c=au&l=en&s=dhs&cs=audhs1[/url<]
    $1900 for 30in LCD.

      • Airmantharp
      • 9 years ago

      [url=http://www.provantage.com/hewlett-packard-hp-vm617a8-aba~7HEWA178.htm<]$1168.69 for the HP ZR30w.[/url<]
      I can do this too, but you're missing two very important points-
      1. The plasma screen, or any HDTV, is at 1080p- which is 2MP, where the 30" LCD (and only 30" LCD monitors) is 4MP. That's a factor of two brotha.
      2. It's a TV. With plasmas, you get text that isn't sharp, and with LCDs, you get input lag, so pick your poison. There's a reason for the season here.
      As I've said in other comments, I'd love to see 30" 4MP TNs on the market, but they simply do not exist, which means that in order to get the 4MP resolution you have to pay the screen size tax, the resolution tax, and the IPS tax too.

        • CaptTomato
        • 9 years ago

        I’m talking about value for money…..30’s are overpriced devices.

          • Airmantharp
          • 9 years ago

          And yet missing the capabilities that put 30″ LCD monitors in their own category!

          They cannot be compared to an HDTV, sorry. You could power an HDTV with a much more pedestrian card, but things like a 30″ LCD or multi-monitor setup will put cards like the HD6990 to use.

            • CaptTomato
            • 9 years ago

            I don’t believe a little 30in is better than say a 42in 1080p LCD, but when I speak of big plasma’s, I’m simply pointing out how poor the 30in is in a value for money stakes.

            • Ryu Connor
            • 9 years ago

            We’re seriously comparing a 16:9 1080p screen with huge pixels to a 16:10 1600p screen with tiny pixels and failing to understand the cost difference?

            Comparing plasma which is cost prohibitive to scale into smaller sizes to LCD is also a bit of an apples to oranges comparison.

            • Farting Bob
            • 9 years ago

            30″ monitors are the highest end consumer screens, and nobody has yet made a 30″ with a TN screen, look at the highest end of any tech, its never worth the money unless you are using it for busines.

            If you want value for money, i have a 27″ TN screen that cost me £300 over a year ago, they are even cheaper now (£220-280). Quality isnt amazing, but it is plenty good for gamers and anybody using it in a non-pro use. Costs 1/3 of the price of the cheapest 30″, and its still god damn huge. Shame its got beefy bezels though or id seriously consider multi-monitor setup with them.

            • Airmantharp
            • 9 years ago

            The issue here isn’t size, unfortunately, it’s resolution. I probably have the same 27″ screen, and it’s sitting next to my 30″, right now. Thing is, it’s half the resolution of the 30″!

            I would expect a 30″ TN screen with a 2560×1600 resolution to cost somewhere in the neighborhood of US$500-600, compared to $1100-1500 for the IPS versions.

            • UberGerbil
            • 9 years ago

            The thing is, TN’s disadvantages get worse the larger the display gets. When you’re sitting with your eyes centered on a 24″ display, the top and bottom rows of pixels are only a bit off-axis. Move up to a 30″ display, though, and the angle gets a lot sharper… and the color-shifting that TN is prone to becomes much more pronounced. Now, clearly there’s a range of TN displays and some are much better in this regard than others, and presumably a hypothetical 30″ TN would use the better ones… but presumably that would cost a bit more, too. And it still might not be all that good.

            • Airmantharp
            • 9 years ago

            You’re absolutely right-

            I just feel that if say, Hanns-G put out a 2560×1600 30″ TN display in the US$500-$600 range, that it’d be pretty popular with a lot of different groups, like their 27.5″ 1920×1200 has been.

            We can all agree that TNs have gotten much better over time. When viewing angles aren’t as important and colors are close enough, I think low cost, size, and resolution would combine to make a pretty compelling package.

            • CaptTomato
            • 9 years ago

            The DELL27 is reasonable value for money, but the 30in isn’t and isn’t dropping in price relative to other LCD’s.

            • Airmantharp
            • 9 years ago

            Relative to what other LCD’s?

            I think that the 27″ 2560×1440 is a reasonable compromize- smaller screen, fewer pixels, same panel quality. Thing is, it only costs $100 less than I paid for my 30″ screen; and I’d rather have the HP (if they made a 27″ display with the same panel as the Dell like they do with their 30″) which has lower input lag.

            • CaptTomato
            • 9 years ago

            “””I think that the 27″ 2560×1440 is a reasonable compromize””

            Me too, and in dollar terms, these panels in OZ can be as low as $700.
            If I was going to buy a 30, I’d get the DELL, but only when pricing makes sense.

            • NeelyCam
            • 9 years ago

            I want that resolution in 24″. Why? Because that’s all I can fit on my desk.

          • UberGerbil
          • 9 years ago

          Tell me how much I have to pay to get that 58″ plasma to output a clear 2560×1600 display? Because without that number, there’s no way to compare the two.

          Alternatively, allow me to point out it’s quite possible to get 1080p monitors — with the same resolution as that plasma — for just a couple of hundred dollars. They may not be as large, but I’m not going to be sitting 24 inches away from that plasma either. Compared to a $200 monitor with the same resolution, that plasma looks “overpriced” and “poor value for money” to me.

          Or maybe, just maybe, they’re different tools for different jobs and comparing them in this kind of simplistic way is silly?

            • CaptTomato
            • 9 years ago

            There’s this big wank going on in the world regarding resolution, but when it comes to image quality, SIZE makes a huge difference……but FYI, I’m talking about value for money, in that I could buy a large plasma and have more fun watching it rather than gaming on a 30in LCD….which is small anyway.

            • Ryu Connor
            • 9 years ago

            Of course you make the typical mistake of assuming that what you consider value for your money is the absolute truth for all of humanity, life, and the world.

            To you, “when it comes to image quality, SIZE makes a huge difference”

            As many of us have detailed there are real technical reasons that would argue otherwise. If you prefer size over the numerous advantages that 30″ monitor provides versus a TV, that’s fine. Just let go of the idea that your value of the money is anything but a Colbert Truthiness.

            [url<]http://en.wikipedia.org/wiki/Truthiness[/url<]

          • LawrenceofArabia
          • 9 years ago

          I see that you don’t fully understand the qualities associated with panel type, pixel pitch, and color accuracy. That plasma has a pixel size of 0.6688 millimeters and a density of 37.98 ppi. Your standard 30′ 4MP display has a pixel size of a mere .2524 mms, and a density of 100.63. That’s over two and a half times the pixel density.

          Go run a 2MP 24′ display at 1280×720 and see how crappy it looks. The pixel density will still be higher than that 58′ plasma. Why on earth would you would want that plasma instead of a 30′ display is beyond me, and that’s without even taking into consideration other image quality benefits.

            • CaptTomato
            • 9 years ago

            Why does bluray look better on my big HDTV’s than on my small LCD and all the small LCD’s I see instore?
            I’m not necessarily talking about gaming on a 58 btw, I’m just saying, that I can’t bare to buy a small 30in LCD when i could get a 58in plasma, a panasonic no less.

            • Airmantharp
            • 9 years ago

            …because your small LCD and all of the small LCDs you see in store are lower-end TNs designed for desktop work, not movies.

            Your HDTV has at least a baseline calibration and some post-processing circuitry designed to make movies look good (though not necessarily true to developer intent).

            Any questions?

            • CaptTomato
            • 9 years ago

            I have a 25.5in 8bit and a 1080p 37in LCD, and TN panels aren’t in 32/37/40in screens.

            • GrimDanfango
            • 9 years ago

            In addition to other comments:

            Also, it looks better, because you’re viewing a 1080p source on a 1080p screen. Of course Bluray will look no better on a professional quality 2560×1600 panel – it’s simply displaying lower quality content than it was built for.

            You’ve also made the mistake of assuming the only reason professional 30″ monitors exist is to play bluray and games. Yes, one of these screens is quite obviously of no value to *you* over any consumer-grade plasma television, because you have no intention of using it for its intended purpose.

            Have you ever tried editing photos on a 58″ plasma TV? I guarantee you’ll have a more pleasant experience, not to mention a considerably more accurate one, on a 30″ monitor.

            Games can scale to the higher resolution, but even that won’t look a world better than on a 1080p screen currently, simply because games really don’t have the fidelity to make much use of the extra pixels.
            As games make more and more use of tessellation, super-high-res textures, and high quality shaders, they’ll begin to come into their own on a 30″.

            There’s the other thing of course – you tend to sit at a desk to view a monitor, you tend to sit on a sofa to view a TV, so depending on your situation, it’s highly likely that the 30″ would be filling a larger proportion of your field of view anyway.

            As I’ve mentioned, use the right tool for the job – if you want a TV, buy a TV… a professional graphics monitor makes a pretty poor TV.

            • CaptTomato
            • 9 years ago

            This is the 6990 thread, it should be obvious I’m referring to games.
            I want a 30in, I just don’t want to be ripped off because DELL are playing the niche product BS forever.
            On my 37in, GTR evolution looks better at 1440×900 than it does at 1920×1200 on my 26in 8 bit panel.

            • GrimDanfango
            • 9 years ago

            So stick with TVs! Yes, this is the 6990 thread, so why exactly have you started ranting on about the poor value of a professional grade graphics monitor for playing games on? You seem to have mistakenly decided that its sole use and target market is gamers and movie-watchers.

            Just because you’ve got no use for a product, doesn’t mean it has no use. You’re a consumer, there are consumer televisions that do all that you require, and do it well. You’re not a graphics professional, so don’t bother with a pro graphics monitor.

            But then don’t rant on for pages about how pointless that product is. Just because something is a niche product, doesn’t mean it’s not good value, it just means it’s not of use to the majority of people, including you. You would only be ripping yourself off if you bought it, because you’d be wasting a large sum of money on a product you won’t use for its intended purpose.

            Like I say, if a TV is sufficient for your needs, buy a goddam TV!

            • CaptTomato
            • 9 years ago

            Seems it’s okay for you to express your devotion to dell, but not okay for me to question it’s value.
            Gee, I wonder what might happen if more people questioned it’s pricing….u never know, they might get reasonable, but not with trusty hero worshipers like u onboard.

            • moose17145
            • 9 years ago

            I would hardly say he was worshiping dell. From what I was reading he was merely pointing out the advantages of a 30″ monitor over a (by comparison) low res plasma, and pointing out how its pointless to compare the two, and how you cannot simply say they have a poor value for what you get because you can get a bigger TV for less, when they clearly have different target markets in mind.

            Sitting on a couch many feet away from the TV… I am sure the TV does look better… but sit as close to that TV as you would that 30″ monitor and I can promise that image will look like total crap. Also that 30″ monitor can be made so that your blu-rays look gorgeous on them. On a professional grade display can be calibrated to make the image look just about any way you want. I have seen it done and typically these tools come with the monitor when you purchase it. But this also requires more effort than simply plugging it in and hooking up the BD player and hitting play like you do on your TV, which typically comes precalibrated for movies, TV, and yes, games (XBox, PS3, etc). But saying that the image looks worse on a 30″ professional grade monitor is more or less saying you don’t know how to properly use such a display since it likely was not calibrated for said style of use (since they more often than not do require you to calibrate them for what you are using them for), or you have simply become so used to seeing what things look like on a TV that seeing them in their true form for whatever reason looks less appealing to you.

            Also you can NEVER compare how images look on monitors that are setup in best buy and other big box stores. I used to work for best buy in their computer dept and helped setup those displays. The way we set them up was hooking them all into one big splitter that had a dozen ports on it (which right there pretty much ruined the image quality), and we also never took the time to calibrate those displays and set them up properly. We just plugged em in and turned em on. In addition, consider this. We hooked up the tiny 15″ displays all the way up to the big 30″ bad boys into the same splitter. That means that the 30″ monitor was being fed the same low res image that was the 15″ displays max res. So most of the monitors on display were not running in their native resolution.. which again… makes the image look less appealing when the monitors are forced to scale up a lower res signal than what is their native resolution. When I bought my 24″ monitor from best buy (bought it there cause it was the same price in store as online at other places, and they let me test it for dead pixels before i took it home… and i could also get it the same day cause at the time i was feeling greedy an wanted it right now after over a month of researching displays), the image they had being displayed on it looked like crap. I took it home, set it up properly, and the display looked great (despite being a TN panel). It was seriously almost like it wasn’t the same monitor as what they had setup on their shelf when comparing image quality.

            With TV’s on the other hand they can get away more easily by hooking them all up into the same splitter without as much loss of image quality. Most TVs anymore will accept a higher resolution input than what they can actually display, but will scale down the resolution to what the TV CAN display (something that computer monitors are typically not very good at if they can even do it at all). AND also unlike computer monitors where each step up in size typically denotes the ability to also display a higher peak res… TVs all have the same res… whether you are looking at a 35″ or a 60″… they both display a 1080p resolution. As a result they can send a 1080p image to all the TVs hooked up to that splitter.. and almost every TV hooked up to it will be receiving a signal with a res that is the same as its native res. The smaller 720p TVs that are receiving that 1080p input will typically scale down that image to work on their smaller res displays. But being able to send a signal out that matches the native resolutions on basically every display hooked up to its source makes a huge difference in how nice the image looks.

            But in all over simplifying the argument to “teh big TV is a better value cause it costs less and is bigger” is just flat wrong. Again.. right tool for the right job. You wouldn’t say that an impact wrench is overpriced when all you need is a ratchet. No kidding… they are meant for two totally different things. Just cause they CAN often be used for the same thing does not mean they should be. Also… I do not understand how you think people simple bitching about the prices of 30″ monitor will make them cheaper. Sure they might be able to make a cheap 30″ panel with a 2560×1600 res that looks like crap… but quite frankly most people looking at displays that size are either going to just buy a TV… or are professionals who can make use of the extremely high resolution and precision that an actual computer monitor will provide them. Its the same argument as to why a Xeon costs more than its equally fast i7 counterpart. Or why professional grade videocards can costs thousands of dollars more than their 500 dollar gaming counterpart despite the fact they might fail at playing games at acceptable frame rates. You are paying for more than just the hardware. You are paying for quality assurance, a better stock warranty, along with many other things.

            • CaptTomato
            • 9 years ago

            How can a tiny 30in display cost more than a monster plasma with inbuilt HD tuner and speakers?

            • GrimDanfango
            • 9 years ago

            How can you seriously believe “bigger means better”?

            Here’s how. I work in visual effects – guess how many “good value” plasma TVs we have in our office? Zero. We’ve got hundreds of $2000+ 24″ LCD screen however. Why do you suppose a company in the visual effects industry would waste half a million on “overpriced” small screens when they could spend half that and get bigger screens?

            I’ll tell you why, because the movies you go and watch at the cinema would look a hell of a lot worse if we created them on cheap plasma screens, because you simply can’t trust the colours you see. It takes a screen that has been built to very accurate standards, uses higher bit-depth gamma processing, and provides extensive manual calibration controls.

            That’s how a 24″ display can cost more than a monster plasma. Makes the 30″ look pretty good value I’d say.

            • CaptTomato
            • 9 years ago

            From a gamer/home consumer perspective, the dell30 is a blatant rip off.

            • GrimDanfango
            • 9 years ago

            Haven’t you been listening? It’s not for gamers/home consumers. It’s a professional piece of equipment.
            Would you buy a $1000000 articulated robot arm? No, because you don’t need to fabricate precise parts on a production line. Does that make a $1000000 robot arm a rip-off?

            If you don’t have a use for a professional piece of equipment, don’t buy it!

            • CaptTomato
            • 9 years ago

            anything with a niche tax is a rip off champ…..you can’t be that simple minded can u?

            • GrimDanfango
            • 9 years ago

            You can’t be so simple minded that you think all products are produced in batches of 100,000s with wide tolerances can you?
            It’s the most simple rule of manufacturing, the more you make, and the looser your tolerances for quality control, the cheaper you can knock ’em out.
            I’ve seen monitors for medical applications cost tens of thousands – very small production runs and they have to bin the majority of samples because of defects/dead pixels/etc, because the application calls for them to be demonstrably near-perfect.
            To you, that would be a rip-off… to the person who avoids a false diagnosis on a life-threatening condition, it’s worth every penny. Well guess what? They’re not expecting you to buy one!

            If you’re so determined to prove to me that Dell are ripping everyone off, why not go and research the production costs, the overheads and margins, and demonstrate to me that Dell are selling these at a significantly higher margin than other cheap screens. I’ve given you several detailed examples of why they’re *not* a rip-off… hows about you actually argue with some facts if you’re intent on arguing? Spouting endless price-fixing conspiracy drivel is not likely to convince me.

            • CaptTomato
            • 9 years ago

            How is it they can sell the 30 on special{a rare event} for $1200 vs it’s typical price of $1900?
            They’re obviously making money on the 1200 sale, so there’s a $700 niche tax being paid by you all, LOL.
            As for the cost because of pixels, why isn’t the 27in ultra expensive?….it’s often on sale for $700-$750, yet it has a tight assed pixel pitch as well.

            So you often have the case where the 27 is $700 vs $1900 for the 30.

            • Airmantharp
            • 9 years ago

            How is it that you’re still surprised that the relative prices of luxury items are different in your country than the rest of the world?

            Give it a rest, we know why you’re whining already, we can be fairly sure that you have intelligence or maturity issues (either you can’t see or refuse to accept the obvious truth), and we don’t want to hear anymore of it.

    • michael_d
    • 9 years ago

    Your test system shows that you used a 750watt PSU and 6970 Crossfire reached 608watt under load. I wonder if you ran into any problems during testing?

      • Damage
      • 9 years ago

      Nope. Actually took that same PSU over 750W before.

      [url<]https://techreport.com/articles.x/15293/10[/url<] Never had a problem with it.

      • MrJP
      • 9 years ago

      That PSU is 85% efficient at higher loads, so even the 580 SLI at 682W measured at the wall is “only” 580W supplied to the system. Plenty to spare!

      • michael_d
      • 9 years ago

      The official minimum PSU requirement is 750w. You have to have more head room to run it normally.

        • Airmantharp
        • 9 years ago

        Minimum requirement of what? SLi? Crossfire? The PSU itself?

        He stated above that the system under load reached only 608 watts in this case, and has run well above that in the past. What’s your beef with it?

    • gerbilspy
    • 9 years ago

    nice article, thanks

    • clone
    • 9 years ago

    I’m more impressed than I am disappointed.

    the idle and load power draw are amazing considering what the card is and the performance it offers.

    the price is high as usual and I’m a little disappointed in the peak perf in some apps but that may be driver and could change in the months ahead.

    in the end would I buy one of these cards….. no primarily because it seems like a daunting task to cool them outside of going with water and while I use water cooling I was hoping to get away from it so while I’m impressed with the card no I won’t buy it.

    good job ATI on the power consumption though, very impressive but they need to work on the cooler and if you want to hold the crown will need some more perf……. again that depends if Nvidia can maintain it’s 580 performance with it’s dual gpu while also keeping heat and power consumption tolerable which would also be no mean feat.

    • Krogoth
    • 9 years ago

    Another “meh” extreme-edition epenis product.

    It is kinda funny that all of the current generation high-end, dual-GPU solutions can effortlessly handle 4Megapixel gaming with AA/AF thrown in. This was only a pipe-dream several years ago.

    PC gamers are so darn spoiled by modern hardware. I remember back when 640×480, 16 bit colors at 60FPS was considered to be high-end……….

      • Airmantharp
      • 9 years ago

      Not sure why you’re being rated down for this, as I agree wholly. To contribute, I remember when 17″ screens running at 1024×768 were the bomb, except you had to have SLi (the first one that stood for Scan Line interleave!) in order to run Quake 3 on with twin Voodoo 2 cards.

      Now I have one of those four megapixel screens with a GTX570, and I’m looking at future titles thinking ‘I need TWO of these, and a 4.5GHz Sandy Bridge setup to feed them…’

        • Krogoth
        • 9 years ago

        I suspect that my original post is getting downvoted, because I am not wetting my pants over 6990.

        It is just a repeat of HD 5890 which is going to be eclipsed by Nvidia’s own answer; 2x GF110 on a stick = (GTX 595?). I based this on the fact that 590 SLI still outclasses 6990.

          • swaaye
          • 9 years ago

          Or it’s being voted down because you’re being negative, as usual. 😉 There are quite a few critical posts about the card already.

            • Krogoth
            • 9 years ago

            Negative?

            Nay, I am being realistic. 😉

            I guess that I am getting older or something. The days of getting geekgasms over pieces of silicion are over for me.

            • swaaye
            • 9 years ago

            Especially when they cost $700, there aren’t any games worth playing on them, they consume about 350W to make semi-pretty pixels, and they sound like an airliner?

      • Kaleid
      • 9 years ago

      I agree. Bring on 28nm single GPUs. If money wasn’t a problem then microstuttering plus all the noise certainly would be.

        • Krogoth
        • 9 years ago

        IMHO, driver profiles are the biggest issue.

    • PRIME1
    • 9 years ago

    I would like to see AMD man up and give one of these away on TR.

      • sweatshopking
      • 9 years ago

      lol
      right after nvidia gives away one of their dual gpu cards. you’re just mad because nvidia STILL (for the 3rd generation) doesn’t have the fastest graphics card on the market.

    • glynor
    • 9 years ago

    Great article, Scott. Well done.

    And the improvements to the site made it [i<]much[/i<] easier to read last night on my phone. Thanks.

    • indeego
    • 9 years ago

    I’ll keep my Office at 40 dB please, kthx.

    • thermistor
    • 9 years ago

    Can’t believe a $160 6850 doesn’t even make the freaking charts now…and I just bought it around the holidays.

    I played all the way thru BF:BC2 on 3×1 Eyefinity, but I could not play at maximum resolution 1680×1050 3×1. Gonna need another card in xfire to get maximum resolutions, god forbid if I move up to 1080P displays…

    The pricier stuff looks quite well suited for multi-display gaming at higher resolutions. If I was committed to Eyefinity at crazy resolutions, this card looks like a relatively good value, all things considered, maybe except for the noise.

      • tejas84
      • 9 years ago

      It’s not really a speed demon is it? The 6850 is pretty slow if you are being honest.

        • thermistor
        • 9 years ago

        At 1/4 the price of the 6990 I’ll take the bang/buck trophy home if ya don’t mind…

      • MrJP
      • 9 years ago

      Add another 6850, and you’ll be beating the 580 and only about 30% down on the 6950CF (see the original Cayman review) which is roughly equivalent to a 6990, so it really wasn’t a bad choice at all. You might still struggle with lots of AA at 5MP with only a 1GB framebuffer however.

      • Damage
      • 9 years ago

      We’d love to include the 6850 and other cards in a review like this one. Two of them could be a legitimately decent solution, even at the settings we used, for most games. However, you have to understand that we only received the driver for this card at mid-day this past Thursday.

      Yes, that’s this past Thursday, mid-day.

      We couldn’t start testing until we received that driver, since we wanted to test all of the Radeons with the same one.

      Furthermore, AMD didn’t have F1 2010 CrossFireX scaling working properly until they issued a new CAP (their third attempt) on Saturday evening. We had to re-test all CrossFireX cards after that.

      To test 11 different configs like we did in 7 different games, plus to do power/noise/temperature testing, turn it all into graphs, take pictures, do layout, and write a coherent article about it in the span from Thursday to Monday night was a non-trivial undertaking.

      We are committed to quality reviews with extensive comparisons, so we have repeatedly lobbied both AMD and Nvidia for more time to conduct graphics card reviews. In spite of that, they both appear to have settled on a standard time window of which this review was very typical:

      -Product with completed driver gets into reviewer’s hands on Thursday.
      -Product launches Monday evening.

      -Total working days in window: 3 x 8 hrs.
      -Real days worked by reviewer: 5 x 14 hrs.

      Understand that our advance notice before we begin an undertaking like this one, where we drop everything and work non-stop on a review, is typically about 48 hours. (Living from deadline to deadline like this isn’t easy after, say, 11 years or so of doing it.)

      Never is a review like this one as detailed, thorough, or even fun as we would like, and we have lobbied hard for changes in the way things work. However, we are one voice with odd ideas about quality and detail in a vast stream of media sources.

      So yeah, we didn’t slight the 6850 intentionally, exactly. FYI.

        • derFunkenstein
        • 9 years ago

        Not to mention you’re talking about an unplayable mess at that point, given the settings, resolution, AA, etc. that you’re testing at just to establish some sort of differentiation between the high-end configs. I think it’s probably not worth testing sub-$250 single cards or sub-$400 dual-card configs in this scenario.

        • thermistor
        • 9 years ago

        Damage: I read practically all your reviews, and understand the level of effort that goes into a multi-card review in terms of trials per application multiplied by the number of applications. I apologize if you took my sullenness at having bought a fairly stout card that is nowhere near the top of the list now as negative criticism of TR’s great work.

          • Damage
          • 9 years ago

          Oh, no problem. I was just frustrated at being unable to include it and the related reasons why, which certainly aren’t your fault!

      • Krogoth
      • 9 years ago

      The chart was supposed to represent the values at the high end with 4-megapixel gaming + AA/AF.

      Current mid-range cards would struggle under the same conditions.

    • michael_d
    • 9 years ago

    I just checked power consumption, noise and temperature levels. 6970 CrossFire is absolutely atrocious! I thought that 5870 CrossFire was bad enough. Even though they release new GPUs on smaller process nodes, power consumption, noise, and temperature keep edging upwards.

    I think they should just give up on dual-GPU solutions and produce single enthusiast-class GPUs, but then again it would be more difficult to sell them. It is a lot easier to manufacture a moderate GPU, combine two of them, and present that as a high-end product.

      • Airmantharp
      • 9 years ago

      You’re actually talking about the difference between AMD and Nvidia’s strategies here- AMD makes smaller GPUs and then uses two of them for their high end product, Nvidia makes larger GPUs and uses a fully enabled one for their high end product.

      I guess the only problem with this is that AMD’s smaller GPUs (which are also denser, if you look at transistors/mm^2) are competing very well with Nvidia’s larger GPUs. Hence, a 6970 is either as fast as or faster than a GTX570 with a smaller die, fewer transistors and lower power consumption.

      Nvidia also seems to have been focusing on GPU compute in their products. The same GPUs that go into our Geforces also go into Quadros and Teslas, which have more functions enabled. It’s a questionable strategy, making one larger product for all markets instead of multiple products to target each market individually, but it seems to be working for them right now.

    • ap70
    • 9 years ago

    I have a single 560 Ti running at 64fps in BC2; why buy a second one to go to 74fps?
    The same with this card: the improvements are not worth the money.
    You can’t see the difference of 2fps or 20fps when you are already up at 60fps.
    If you are an enthusiast, you must know when to stop.
    I learned that after buying the i7-980X expecting something faster than an i5… which never happened.

      • Airmantharp
      • 9 years ago

      Actually, when looking at these charts while running a GTX570 myself, I was impressed with the performance of a single GTX560 Ti, but I was also disappointed in the performance of two of them in SLI. The minimum frame rates, which you’ll actually feel, fall much further below the averages on that setup than on pretty much anything else. I’m glad I went with the GTX570 now :).

    • swaaye
    • 9 years ago

    Check out the noise level on that baby. It must be unreal. Too much heat in too little volume.

    BTW, when I put an aftermarket cooler on my 6950, I noticed the phase change compound. I’m pretty sure I remember AMD saying way back that they prefer phase changing thermal compounds for some reason.

    • tviceman
    • 9 years ago

    For all six people that are in the market for this card, enjoy!

    • zimpdagreene
    • 9 years ago

    Well, as said before, there is really limited use for this. Until game developers really start programming for the PC, you won’t need this or a CrossFire config, really. No game out right now pushes the edge. Crysis 2 (multiplayer) graphics are OK but not the blowout most people expected. When they start pushing the real skin look, open physics, destructible everything, and large 10-20 mile maps, then yeah, that’s when you’ll need this and maybe two. So as for now, what do we have?
    Nothing but the waiting game!

    • Freon
    • 9 years ago

    Too bad they didn’t call it the AUSUM Selective User Adjustment Control Extension.

    Pretty nice for a single card. I wasn’t expecting the extra habanero.

    Plenty happy now with a 6950 with the 6970 BIOS hack, though. Until I blow $1,000+ on a 30″ display I think I’ll be fine.

      • Airmantharp
      • 9 years ago

      This is my problem :).

    • BoBzeBuilder
    • 9 years ago

    None of those games really stress the 6990, do they? Maybe Metro 2033.

    • derFunkenstein
    • 9 years ago

    It’s a neat product, I guess. I dunno…these things only exist “in theory” as far as I’m concerned. I don’t know anyone who’s going to drop $700 on one of these, and I don’t know anybody with a monitor worthy of one in the first place. Sure is fast, but at 1920×1080, so is my card. :p

    • bcronce
    • 9 years ago

    And in ~8 months, a $150 card will outperform it and consume a heck of a lot less power.

      • flip-mode
      • 9 years ago

      Well, the Radeon 5970 launched 16 months ago and your prediction is still nowhere even remotely close to coming true. There’s still no single GPU that matches its performance at all, much less at a $150 price point.

      • GrimDanfango
      • 9 years ago

      weeeell… maybe more like ~3-4 years. It’s been around 5 years since the 8800GTX – not even a dual-chip card, and only now does a $150 card deliver comparable/slightly faster performance.

        • shank15217
        • 9 years ago

        What? Please get off the 8800GTX boat; it was beaten like two generations ago, and most cards at $150 could beat it two generations back.

          • swaaye
          • 9 years ago

          I think you had to do the $200 level two generations ago to beat it. 4850 and 9800GTX.

          • GrimDanfango
          • 9 years ago

          Jesus, why does everything have to be taken as fanboy ravings? I’ve never even owned an 8800GTX, and I’ve been a fairly staunch Radeon supporter for the last 5+ years!

          It was just a simple, familiar example of a top-end supercard holding its own for a hell of a lot longer than 8 months. The 4850 came in about two years later, offered roughly similar performance to the 8800GTX, and came out around the $200 mark; it was an exceptionally competitive and efficient card from ATI at the time. So my point stands: even two years on, with highly efficient and well-priced new cards, a $150 card wouldn’t have kept up with a top-end card from two generations previous. The following generation, sure… so maybe not 5 years as I said, but 3 years is still a lot more than 8 months.

        • bcronce
        • 9 years ago

        GPUs this fall are supposed to be 3 times faster. If you assume the 6990 is only about 1.8 times faster than the 6950, and the 6950 is about 50% faster than a card at about half the price (aka $150), then the math works out, based on a ton of assumptions… hehe… such that in ~8 months, a $150 card will be about the same speed as a 6990 or faster.
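        To spell that out as a quick back-of-the-envelope calculation (a sketch only; every number below is one of the assumptions above, not a measurement):

```python
# Back-of-the-envelope check of the assumptions above; none of these figures are measured.
perf_150_now = 1.0                    # normalize today's ~$150 card
perf_6950 = 1.5 * perf_150_now        # assumption: "the 6950 is about 50% faster"
perf_6990 = 1.8 * perf_6950           # assumption: "the 6990 is about 1.8x a 6950" -> ~2.7x a $150 card
perf_150_next = 3.0 * perf_150_now    # assumption: "GPUs this fall are supposed to be 3 times faster"

print(perf_6990, perf_150_next)       # ~2.7 vs 3.0 -> the future $150 card edges ahead, if the premises hold
```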

          • sweatshopking
          • 9 years ago

          I have no idea where you get your math from. there is NO way the 7k series will be 3 times faster. or the 6xx series. you’re on glue.

    • GrimDanfango
    • 9 years ago

    Three tits? AUSUM!

    Edit: Nuts, only people in the UK will get that. Darn staggered release dates.

    • sweatshopking
    • 9 years ago

    well gentlemen, I’m here so relax.

    First off, WTF is with that card size? OMG, 12 inches, where can I fit that? (That’s what she said!) No, but seriously. This card, at $700, is an extremely expensive solution for a problem that nobody really has. Sure, it’s fast. But a 4890 will play every game on the planet at max settings, with the exception of Metro 2033. Until some new non-console games come out, there is no reason to purchase this card. I’m all for speed, and brightsideofnews.com claims this card has more computational power than every PC in the world in 1998. If true, that’s incredible. However, I can run everything on high @ 1080p, and like my wife says about the “business”, this card is not worth the effort required. For most people. Some guys on three 30-inch monitors, giv’er. You’ve already got more money than you need.

      • Airmantharp
      • 9 years ago

      I don’t doubt your personal experience, but really? The HD4890 is two generations old, and there are [b<][i<]many games[/i<][/b<] that cannot be played on it at max settings on a common 1080p-sized monitor, let alone a single 30" panel or a multi-monitor setup.

      I can say that I have a hard time trying to justify using this card over a traditional and equally priced CrossFire/SLI setup, and I see where you’re going there: this card uses less power and takes up one less slot, but with the stock cooler it’s far too loud, though it otherwise appears to work very well.

      I would venture that any experienced builder putting one of these to use would be putting it in a case designed to handle the length and breathing needs of the card, with a motherboard that has its slots arranged to facilitate airflow. I can see someone using one of these with a water-cooling setup, and I can see someone using two in either a high-airflow case that’s sound-dampened or with an appropriate water-cooling setup as well. This card is not for your average user with some spare cash.

        • sweatshopking
        • 9 years ago

        Which game can’t be played on a 4890? Just Cause 2, BC2, and Crysis all run very well @ 1080p. Battlefield 3 will likely require an upgrade, and the DX11 might be worth it, but otherwise, I can pretty much run anything.

          • Airmantharp
          • 9 years ago

          Again, this is [b<][i<]your[/i<][/b<] experience. You might want to check out reviews of newer cards that include an HD4890 as well :).

            • sweatshopking
            • 9 years ago

            I have. I’m not aware of any games that won’t run well on that chip, besides Metro. If you’re doing Eyefinity, then sure, it’s not capable. But for 2MP, a 4890 is plenty fast.

            Not trying to be a smart ass, just not sure what your point is. Do you have a game that won’t run on it? If so, let me know! 🙂

            • Airmantharp
            • 9 years ago

            No issue ‘running on it,’ but running well, yeah, that’s a challenge.

            • GrimDanfango
            • 9 years ago

            Crysis Warhead runs just about pleasantly at max settings on my 6970… I had a 4890 before that, and I guarantee it wouldn’t have managed terribly well. GTA4 could run on around-about medium settings (although admittedly, I *still* can’t run it maxed out, and I don’t think anyone ever will, I think their engine is largely crippled on PC)

            Agreed, a lot of games ran perfectly well on near-max settings at reasonable resolutions, but there were a fair few exceptions, and now I’ve switched to a 2560×1440 monitor, it really would have been scraping by. The 6970 is fair overkill in a lot of cases, but enough future-proofing to keep me going for another couple years.

            I tried Crossfire once, and was so reviled by the horrendous microstutter effect that I never touched it again. So I honestly can’t see the appeal of these multi-GPU cards, not until they find a way to properly synchronise the frame output. (onboard, natively-supported Lucid Hydra would help)

            • sweatshopking
            • 9 years ago

            You’re correct in that Warhead doesn’t do amazingly. I average about 30fps on high with no AA, but that’s plenty fast for most people. You’re also correct that a 1440p monitor needs more. I wouldn’t want a 4890 at that res; it would suck.

            You’re also correct that GTA4 is a POS. I have it, and it just runs like crap; it’s worse than Crysis ever was. People want to know why there’s no Red Dead on PC: it’s because the engine barely runs. It’s just not optimized. The graphics aren’t amazing, and a quad-core has more than enough CPU power, but still, it runs like crap. It’s bad programming there.

            I agree that CrossFire has its issues. I’ve run CrossFire and SLI on numerous setups each, and IMO the 6k series does the best job in a multi-GPU environment.

      • HisDivineShadow
      • 9 years ago

      You say there is no reason to buy the card. I’m not even impressed by the card’s cost, heat production (or strategy of removing said heat), or sound profile, but I can think of several great reasons for it to exist.

      1) The gamer with a single PCIe x16 slot who needs, craves, must have CrossFire/SLI, but lacks the slots to do it. Usually for 2560×1600 with AA and the trimmings. Boom. Here it is. A 4890 can’t do that.

      2) Eyefinity. It’s not my thing, it’s probably not your thing, but it’s definitely better than nVidia’s version that requires SLI to use it. Here you get a device that fits on a single PCIe slot for those gamers running with only one video PCIe slot (mini, ITX, micro) that need it, plus the POWER to push a multidisplay resolution with some room to spare.

      3) Those who want quad-CF to push super-high resolution (2560×1600) Eyefinity.

      4) For people in Alaska who are cold and want their space heaters to do more than just heat the room up. Now they have a combo device! They can buy this and get awesome graphics for their computer, plus get a free space heater AND white noise generator with every purchase.*

      *One heater and noise generator per purchase. One per individual per family per pet per PC. Some exclusions apply. Offer available for a limited time.

      • novv
      • 9 years ago

      A 4890 will not play every game on the planet at max settings, because that implies anti-aliasing. If 30fps is enough for you in an FPS, that’s your problem; I really want a minimum of 60fps. I currently have an HD 4870 OC 1GB, and I can say there is a huge difference between it and an HD 5850, but the HD 6990 is a completely different story. Please, before you post something like that, try to play Test Drive Unlimited 2 @1920×1200 with 4x AA and max settings, or Mafia 2, Crysis Warhead, or Batman: AA at the same settings. And when I say play, I don’t mean just benchmarking (play the game for a couple of hours and see if it’s really smooth).

        • sweatshopking
        • 9 years ago

        I already said no AA; I don’t consider it important at 2MP. At lower res, sure; at higher, I don’t see a difference. As for the 1200, I did say 1080p. As for “play the game”: I don’t benchmark, I play games, and I don’t have any issues with any games. My 4890 is factory OC’d, but it’s minor. You like 60fps, super. I don’t have a problem with 30-60. Sure, a 6990 is faster, duh, but I’m not convinced I need to upgrade at this point.

    • TaBoVilla
    • 9 years ago

    great review!

    How long has it been now since we last saw a dual-GPU card from Nvidia’s side? What can they possibly cook up on one PCB that could top this?

      • Airmantharp
      • 9 years ago

      The last one was the GTX295, which was based on the 55nm GT200b, and configured similarly to a GTX275.

      The answer would be either a pair of the GF110 GPUs used in the GTX570/580, or a pair of the GF114 GPUs used in the GTX560 Ti, and would really depend on whether Nvidia has been planning to follow AMD’s decision to disregard the PCIe specification for maximum power draw, as putting a pair of GF110 GPUs on a single card would most likely exceed 400 watts.

    • Bensam123
    • 9 years ago

    Yay, no annoying ditzy highschool girl!

    • Incubus
    • 9 years ago

    Cheaper than 2x GTX580/GTX570 in SLI for pretty much the same performance, and you get more room in your case.
    On a side note to Scott, or if you prefer, Mr. Wasson: SC2 doesn’t support AA, so applying it in CCC will only hurt performance without any substantial or noticeable quality gains in the game.

      • Airmantharp
      • 9 years ago

      You should probably check your knowledge on SC2 and AA; I trust that these guys know more than you may think.

      And while you get the same performance, you will also get significantly worse acoustics; this is a big deal to me and many others who would be moving from a quiet single-GPU solution to this card.

      The opinion I’ve formed on this card after reading more reviews is that it is simply far too loud. It’s also very long, which means that the idea of using it as a solution for situations where only one PCIe x16 slot is available doesn’t hold up very well: if you’re using this card, you’ve planned your entire system [b<][i<]around[/i<][/b<] it. It’s going to need lots of airflow and an open slot or two next to it, there’s no two ways about that, so shoving it into a cheap case with an mATX board simply isn’t going to fly.

        • Incubus
        • 9 years ago

        I use headphones, and quality ones too, ’cuz I can.
        About the length, well, I think most mid-to-high tower cases will do fine. Unless you’re using a $20 case; then you’re right.

          • Airmantharp
          • 9 years ago

          Headphones work, but one of my considerations is that I also use speakers; I don’t want to hear a dustbuster under the desk, so unless this card could be fed enough air to keep it quiet and any other unpleasant noise isolated away, I’m really not interested in this over a dual-card solution.

            • Incubus
            • 9 years ago

            Only crazy people would go with a dual-card config over a powerful single one.
            But if you’re really into spending your kids’ college money on GPUs, then wait till the GTX590 comes out so prices drop some more, or even till the next-gen GPUs drop by Xmas.

        • CaptTomato
        • 9 years ago

        A HAF or any case with HDD enclosures facing the side of the case would fit this.

      • Firestarter
      • 9 years ago

      Do you really think they’d say they tested it with AA and not notice that it wasn’t working? You must think TR’s editors are stupid or blind as bats.

    • DrDillyBar
    • 9 years ago

    Without an HD6870 I’m lost.
    Seems to fit into the “more thrifty than me” bucket though.

      • JustAnEngineer
      • 9 years ago

      Agreed. It might have been nice to have a couple of mid-range cards tossed in there for reference.

    • paulWTAMU
    • 9 years ago

    60 decibels? That’s insanely loud. I usually don’t care, but if the noise comparison chart I looked up is right, that’s about the noise you’d get standing 100 feet off a busy freeway.

    And of course 700 USD for a damn GPU 😡 I love the numbers it puts out but damn.

    oh noses -2! How shall I live!

      • Airmantharp
      • 9 years ago

      After having lived with insanely loud CPU coolers and GPU solutions, as well as loud PSUs throughout my youth, I have to say that when I build a computer, each component is chosen particularly for noise characteristics along with performance characteristics. I’ll pay more for silence, or settle for less performance; I have a [url=https://techreport.com/forums/viewtopic.php?f=33&t=75683<]wish list of sorts[/url<] that I created as a kind of guide for myself, where I explain my choices for each part and provide links from established review houses to support them.

        • paulWTAMU
        • 9 years ago

        As soon as my budget permits, that’s what I’ll do. It becomes more important if I start using my PC as a home theater too, which I want to do at some point. Right now I can’t get truly silent operation at the performance level I want on the budget I have, so I don’t (although I try to avoid really loud stuff). Soon as I can, though…

    • MaxTheLimit
    • 9 years ago

    Why can’t one of THESE be given out in the screenshot contest?

      • UberGerbil
      • 9 years ago

      Because no screenshot on earth could win it.

        • bhtooefr
        • 9 years ago

        I bet I could make one… and it would be on a display that needs a card like this to push enough pixels to it.

        (Actually, depending on adapters, and how Eyefinity works, it may be possible to drive SIX of those displays off of this card. Each display needs three DVI links – one dual, one single – for full 48 Hz operation. DisplayPort 1.2 offers the equivalent of four DVI links on a single port. So, the card has 18 links.)
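        To spell out that link-counting arithmetic (a rough sketch; the per-display link requirement and the 6990’s output mix are carried over from the text above as assumptions, not checked against spec sheets):

```python
# Rough link-count check for driving six such displays off one card.
# All figures are assumptions carried over from the text above.
links_per_display = 2 + 1     # one dual-link DVI (2 links) + one single-link DVI (1 link) per display at 48 Hz
displays = 6
links_needed = displays * links_per_display                    # 18

minidp12_ports, links_per_dp12 = 4, 4    # four mini-DisplayPort 1.2 outputs, each ~ four DVI links of bandwidth
dvi_ports, links_per_dvi = 1, 2          # plus one dual-link DVI output
links_available = minidp12_ports * links_per_dp12 + dvi_ports * links_per_dvi   # 18

print(links_needed, links_available)     # 18 18 -- works out on paper, adapters and Eyefinity limits permitting
```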

    • Airmantharp
    • 9 years ago

    As an [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16814130595&cm_re=gtx570-_-14-130-595-_-Product<]Evga GTX570 SuperClocked[/url<] owner looking forward to upgrading to SLI, this review was very enlightening, to say the least. For reference, I'm using an HP ZR30w 30" panel at 2560x1600, and my card is running at its stock speeds of 797MHz core/1594MHz shader/3900MHz memory, versus a stock GTX570 which runs at 732MHz core/1464MHz shader/3800MHz memory.

    First, I have to commend AMD for putting this card together, and especially for having the guts to exceed the PCIe spec by 75 watts out of the box, and by 150 watts (+50%!) in the special AUSUM mode. This bodes well for enthusiasts, and sets a good precedent of pushing GPU performance beyond the shortsighted PCIe spec into an area where power supply and enclosure limits become the outer bounds of performance.

    Next, I began to realize by the summary page that I might have wanted to wait for AMD's HD6900 single-GPU series to come out before purchasing my GTX570. While I don't regret that purchase at all, I'm seeing that going with an HD6950 2GB would have saved me money while providing equivalent performance, greater flexibility with the extra RAM, and a quieter, lower-power solution that hopefully would also run cooler. Or, for the same price, I could have gone with the HD6970, had the extra RAM, and gotten extra performance for my money, while only adding a little bit of noise and no more power usage under load. Oh, the possibilities :).

    On the bright side of things, I have to say that my particular GTX570 is absurdly quiet until pushed heavily for a good length of time, and it's a SuperClocked version that is clocked roughly 9% faster than a stock GTX570, which would put it right at the same performance level as the HD6970 if linear scaling is even loosely applicable.

    Further, on the note of extra memory: I have done some monitoring of memory usage on my card during games, and I haven't seen it hit 900MB of my card's 1280MB while at 2560x1600 on my 30" panel. I'm mostly not using AA due to the panel's resolution, but every other setting that I can raise, I have, and I'll be keeping a close eye on this in the future!

    • mentaldrano
    • 9 years ago

    That must have been more fun to write than the usual review, and as always fun to read! Not that I’ll be dropping that much cash on a video card, but it is good to see where the high end is now, because that’s where we’ll all be in a few years.

    • bdwilcox
    • 9 years ago

    Yeah, but will it run Crysi…nevermind.

      • dragosmp
      • 9 years ago

      anything runs Crysis 2 😛

        • Meadows
        • 9 years ago

        I guess that’s why Scott’s 580 GTX ran it at a pedestrian 35 fps using maximum quality. Oh, wait.
        (Granted, that level of quality is still somewhat higher than he used to [i<]actually murder[/i<] 8800 Ultras with, back in the day.)

    • Palek
    • 9 years ago

    This card can go hang with the i7-990X in the back of the bar and smoke expensive Cuban cigars. Both products are representations of silly excess. A fun read, nevertheless.

    The 6970 looks like the product to buy in the high-end. That is, unless you have a multi-monitor setup or a 2560×1600 screen.

    EDIT: typo on the first page:
    [quote<]MD calls it the "Antilles Unlocking Switch for Uber Mode" or—yes, this is [u<]happenening[/u<]—AUSUM.[/quote<]

      • Airmantharp
      • 9 years ago

      Well, as a new user of a 2560×1600 30″ screen, I have to say that this review has really interested me. The idea that GPU solutions are available that can run the newest games with the highest settings available (except one in Metro 2033, apparently) is very, very cool.

      I personally own an Evga GTX570 SC, which is the same price and performance as an HD6970, and found the GTX570 SLi benchmarks to be enlightening for sure, but I’ll make my own comment for that.

      • dragosmp
      • 9 years ago

      Agreed, it’s like a review of the latest Porsche 911 GT2 with 500+ HP – a fun ride, but you’ll probably get the more reasonable Carrera instead.

      Good morning read

        • cynan
        • 9 years ago

        Why settle for the GT2 when you can get a Turbo S for only like a few couple thousand more? Sheesh!

      • mesyn191
      • 9 years ago

      Not a typo, its the AUSUM field effect taking hold.

      • tejas84
      • 9 years ago

      @Palek

      Excess rocks. Charlie Sheen is proof of that! I have a 2560×1600 monitor and I need a 6990 to play at some kind of decent settings.

      Jealous much??

        • paulWTAMU
        • 9 years ago

        I’m very jealous, yes. Doesn’t change the fact it’s a very niche product and that most people aren’t well served by buying it (unless you’re running a 2560×1600 monitor).

        • Palek
        • 9 years ago

        Not really. I prefer to spend my money wisely! 🙂

        2560×1600 is great if you do engineering, publishing or design work, but I personally do not really care about resolution for gaming purposes above a certain level. 1680×1050 and above are fine for me (emphasis on me).

          • Airmantharp
          • 9 years ago

          I’ve been drooling over 30″ 2560×1600 screens [i<][b<]for gaming[/b<][/i<] for years, and finally got ahold of one this year. One of the big differences I see is in things like view distance, which matters considerably for online gaming. When things get far away, they get small, but I can see them in more detail than anyone with a lower resolution screen :). Otherwise, the addition of an IPS panel is just awesome for everything. I'm really enjoying the color fidelity and viewing angles, and intend to get into digital photography, processing, and manipulation in the near future as well.

            • Palek
            • 9 years ago

            I have an IPS panel myself, and that was most certainly a worthwhile investment for all kinds of applications. I would consider a 2560×1600 panel to provide diminishing returns for the extra cost – both the panel and the beefier video card required to drive it.

            I think a large majority of PC gamers pick their computer parts from the best price/performance bracket and only a very small minority stays on the bleeding edge. The return is just not worth the investment. I personally consider it unwise to chase after the latest and greatest, but I mean no disrespect to anyone who has the money to spend on top-end parts.

            [EDIT]This is all in the context of gaming PCs. Professional hardware is a different matter.[/EDIT]

            • CaptTomato
            • 9 years ago

            The problem with the 30s is the cost disparity, i.e., they’re more expensive than a Panasonic 58-inch plasma in Australia, but the 27-inch model is reasonably priced.
            Having a good monitor is one of the best things you can do for your PC as it enhances everything, and IMO, way too many people have beefy systems driving ridiculous 22-24in TN panels, FFS…

            • Airmantharp
            • 9 years ago

            The cost disparity is deserved though- if you consider that a 24″ IPS panel will cost around $500, paying $1000 for a 30″ IPS panel that has almost twice as many pixels doesn’t really seem out of order.

            But like I said (below?), if a 30″ TN panel with a resolution of 2560×1600 existed, I’d probably have gone that way. I was most interested in resolution, and paying significantly more for pretty colors and better viewing angles just isn’t worth it, though I do like them!

            • CaptTomato
            • 9 years ago

            How can a 58in Panasonic NEO Plasma be cheaper than a 30in LCD?

            • Airmantharp
            • 9 years ago

            I agree; there are definitely diminishing returns for the cost, and I would absolutely not recommend this screen for someone looking just to game. To be honest, if such a resolution were available with a decent TN panel (decent for TN, JAE), I would probably have gone that direction, but such things don’t exist.

            This is also my first IPS (or non-TN!) panel, which is also something I’ve been looking forward to. Previously, I used a Hanns-G 28″ 1920×1200 screen, which is now doing web browsing and messenger/system monitoring duty with its giant pixels, a Samsung 20″ TN at 1600×1200, and numerous 19″ Trinitron types from various manufacturers before that.

          • CaptTomato
          • 9 years ago

          I feel sorry for you……get a DELL27 and you’ll never go back.

            • Palek
            • 9 years ago

            I have a 23″ 1920×1080 Mitsubishi IPS. I have absolutely no desire to get anything bigger for my main PC.

            If I really want to play on a big screen I just move over to the living room and fire up the HTPC connected to our 1080p LCD TV.

            • Airmantharp
            • 9 years ago

            To give you an idea:

            My 28″ TN from Hanns-G has giant, bright, sharp pixels that are very easy to read; I use it next to my 30″ IPS from HP as a browsing/monitoring screen. It’s also very, very fast and has no input lag, which means that it excels in games where an HDTV would not; expect up to 100ms of input lag from many HDTVs.

            The 30″ screen also offers things that HDTVs do not, in particular, a unique 2560×1600 resolution. More pixels means more stuff or more detail at once, and I’d venture that the IPS panel it uses is of higher quality (calibrated and un-calibrated) than the panels used in HDTVs.

            • Palek
            • 9 years ago

            I really do not need convincing, but thanks for the effort you put into your replies.

            I’m perfectly happy with my setup.

    • MadManOriginal
    • 9 years ago

    AUSUM review.

      • Palek
      • 9 years ago

      CUDA been worse, I guess.

        • bdwilcox
        • 9 years ago

        Your puns make me want to Stream.

          • Meadows
          • 9 years ago

          Yours make me want to cry. That’s not a pun.

            • sweatshopking
            • 9 years ago

            I think he’s referring to ATI stream. I thought it was good 🙂

            • Meadows
            • 9 years ago

            I know that. But I thought it wasn’t.

            • sweatshopking
            • 9 years ago

            I know you did. I just wanted to tease you. I love teasing hot babes.

            • bdwilcox
            • 9 years ago

            Meadows, you just have no Vision. My reply was the Fusion of a pun and a snarky remark.

        • Kraft75
        • 9 years ago

        not too hard to NVISION…

          • ssidbroadcast
          • 9 years ago

          Looks like your pun didn’t come under Meadow’s Crossfire.

            • Damage
            • 9 years ago

            Ooh, very SLI.

            • Meadows
            • 9 years ago

            That one was unexpected.

            • flip-mode
            • 9 years ago

            I love when TR staff join in the geeky-dorky!

            • Kraft75
            • 9 years ago

            Thanks to solid and DirectXecution, my comments GLIDE right through…. XD

      • beanman101283
      • 9 years ago

      [url<]http://img684.imageshack.us/i/cardfull.jpg/[/url<]

      • swaaye
      • 9 years ago

      Lemme round that up to terdy for ya.

    • DancinJack
    • 9 years ago

    [quote<]There is a price for being AUSUM, though, and apparently it's about 50 watts.[/quote<] So many good things about this sentence.

      • DancinJack
      • 9 years ago

      +11? I’m glad you guys think it’s as amusing as I do.

    • cynan
    • 9 years ago

    Great review.

    If this thing really does cost anywhere near $700, then it looks as though a pair of HD 6950s is the way to go value-wise, as you can get a pair of 2GB models for under $550 and a pair of 1GB models for around $450. However, I suppose the 2GB of memory would be worthwhile for use in CrossFire (even though, with single cards, there does not seem to be much performance difference between the 1GB and 2GB models).

    Anyone really going to pay close to $700 for this thing?

      • LawrenceofArabia
      • 9 years ago

      It makes a decent amount of sense if you’re watercooling, since it’ll still be cheaper than buying two separate blocks for two 6970s; not that you’re being super value-conscious with a $2500 watercooled rig. More so, I think it’s simply to service the segment of the consumer base that just goes and buys the most expensive card on the market every 2 years and doesn’t care so much about the particulars. Also, having “the most powerful single-card solution in the world” makes good marketing material; it’s not like they’ll be making huge quantities of these anyway.

        • Airmantharp
        • 9 years ago

        You’re right; this is a halo product, and will be purchased largely by people who don’t really understand it, and by people crazy enough to pry it to pieces in order to add it (and possibly another!) to their watercooling loop!

        It really is like Chevy’s Corvette ZR1, Mercedes AMGs, Chrysler’s SRTs, Caddy’s V’s, BMW’s Ms, and so on, for car references.

          • dpaus
          • 9 years ago

          I disagree – I think you’re overlooking the corporate market for these cards. From a single slot, I can drive five HP ZR30w 30″ LCD displays. That’s of huge use in financial trading centres, air traffic control centres, 9-1-1 and security centres, etc., etc.

            • paulWTAMU
            • 9 years ago

            …no. At least not in 911 or security centers. You get cheap cards with multiple outputs.

      • Airmantharp
      • 9 years ago

      I will make my own comment on this, but as a GTX570 owner looking at SLi, this whole review has been [b<][i<]very[/i<][/b<] interesting!

      • DrDillyBar
      • 9 years ago

      no.

      • tejas84
      • 9 years ago

      I’m willing to pay £450 which is about $726 but that is as far as I go. Otherwise it just becomes silly.

    • michael_d
    • 9 years ago

    Nothing surprising, as I expected it to be slightly slower than a CrossFire setup. Hopefully the upcoming 7xxx-series single-GPU card will beat the 6990.

      • DancinJack
      • 9 years ago

      I can’t imagine that happening. On the overall performance-per-dollar graph, the fastest single-GPU Radeon is at roughly 40 fps; the 6990 is about 65. The 5970 is almost 10fps “faster” than the 6970, too. I guess it could be close, but I doubt you could get 6990 performance out of a single GPU in the generation that’s due next.

        • Airmantharp
        • 9 years ago

        Won’t happen without a die shrink, but that [b<][i<]is rumored.[/i<][/b<] One of the reasons we're seeing such a small increase in performance is the lack of a die shrink, although we can think of the second-generation 40nm parts (GTX500's, HD6000's) as being 'shrunk' compared to the first-generation 40nm parts (GTX400's, some HD4000's and all HD5000+), given the large increase in efficiency and yields, which has resulted in parts that are more complex or have more modular sections enabled while consuming less power at load than the previous generation. Still, the next full shrink at TSMC, which is 28nm, should provide a bigger jump than we've seen in the last year or so.

          • DancinJack
          • 9 years ago

          I do agree that a die shrink would help, as it’s necessary to include new features in the GPU, but I don’t think just a die shrink is all that needs to happen. I assume this is what you meant too. A die shrink+new features at 28nm and we could see something like that.

          As an example, something like G92>G92b. Only 65>55nm but there wasn’t much performance gained there.

    • michael_d
    • 9 years ago

    Double post, please delete.
