Nvidia’s GeForce GTX 590 graphics card

March has certainly been a month of extremes around here. We kicked it off with a look at the Core i7-990X, a world-beating six-core CPU, and then moved on to the absolutely epic Radeon HD 6990. After that, we investigated a pair of breathtakingly fast SSDs. Now, we’re back on the graphics beat with a premium offering from Nvidia, the GeForce GTX 590. Like the Radeon HD 6990, the GeForce GTX 590 is a dual-GPU video card planted firmly at the top of the lineup.

This is Nvidia’s first attempt at a dually product in quite some time, at least in the frenetically paced graphics market. The last one, the GeForce GTX 295, debuted over two years ago. As we noted in our 6990 review, cramming two high-end GPUs onto a dual-slot expansion card isn’t easy; power and thermal limitations often define these products, more so than most. That’s probably one reason we didn’t see a dual-GPU entrant in the GeForce GTX 400 series. The first-gen chips based on the Fermi GPU architecture were famously late and thermally constrained, making them iffy candidates for the SLI-on-a-stick treatment.

The GF110 GPU in today’s high-end GeForce cards is still a rather enormous chip, but it’s a little easier to tame—and is a formidable rival to the Cayman GPU in the Radeon HD 6900 series. Naturally, then, Nvidia has cooked up an answer to the Radeon HD 6990, one that reveals a decidedly different approach to the extreme dually graphics card.

Sizing up Gemini

Code-named “Gemini” during its development, the GTX 590 has a pair of GF110 chips onboard, and those GPUs haven’t had any of their onboard hardware disabled. Unit counts therefore mirror those for a pair of GeForce GTX 580 cards in SLI. Yet in order to keep the GTX 590 within a manageable power limit, Nvidia has dialed back the clock speeds to levels well below the GeForce GTX 570’s. The GTX 590’s core clock is just 607MHz, and the GDDR5 memory ticks along at 854MHz—or about 3.4 GT/s. So, although these are fully-enabled GF110 GPUs, the GTX 590’s projected rates for key graphics capabilities look very much like a pair of GeForce GTX 570s, not two full-on GTX 580s.

Here’s a quick look at the numbers.

                       Peak pixel   Peak bilinear    Peak bilinear    Peak shader   Peak            Peak
                       fill rate    integer texel    FP16 texel       arithmetic    rasterization   memory
                       (Gpixels/s)  filtering rate   filtering rate   (GFLOPS)      rate            bandwidth
                                    (Gtexels/s)      (Gtexels/s)                    (Mtris/s)       (GB/s)

GeForce GTX 560 Ti     26.3         52.6             52.6             1263          1644            128
GeForce GTX 570        29.3         43.9             43.9             1405          2928            152
GeForce GTX 580        37.1         49.4             49.4             1581          3088            192
GeForce GTX 590        58.3         77.7             77.7             2488          4856            328
Radeon HD 6850         24.8         37.2             18.6             1488          775             128
Radeon HD 6870         28.8         50.4             25.2             2016          900             134
Radeon HD 6950         25.6         70.4             35.2             2253          1600            160
Radeon HD 6970         28.2         84.5             42.2             2703          1760            176
Radeon HD 5970         46.4         116.0            58.0             4640          1450            256
Radeon HD 6990         53.1         159.4            79.7             5100          3320            320
Radeon HD 6990 AUSUM   56.3         169.0            84.5             5407          3520            320

We’re assuming perfect scaling from one GPU to two in the figures above, which isn’t always how things work out in practice. However, these are simply theoretical peaks, and even the most efficient GPUs don’t always maintain these rates in real applications.
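If you'd like to check our math, those figures fall out of simple arithmetic on clock speeds and per-GPU unit counts. Here's a quick sketch for the GTX 590's row, assuming GF110's published complement of 48 ROPs, 64 texture units, 512 ALUs (two FLOPS each, running at roughly twice the core clock), four rasterizers, and a 384-bit memory interface per GPU:

```python
# Back-of-the-envelope peaks for the GTX 590's table row, assuming
# per-GPU GF110 unit counts: 48 ROPs, 64 texture units, 512 ALUs
# (2 FLOPS each), 4 rasterizers, and a 384-bit GDDR5 interface.
core_ghz = 0.607                 # GTX 590 core clock
shader_ghz = 1.215               # ALUs run at roughly 2x the core clock
mem_gtps = 0.854 * 4             # 854MHz GDDR5, quad-pumped: ~3.4 GT/s
gpus = 2                         # assuming perfect two-GPU scaling

pixel_fill = 48 * core_ghz * gpus             # ~58.3 Gpixels/s
texel_rate = 64 * core_ghz * gpus             # ~77.7 Gtexels/s
shader_gflops = 512 * 2 * shader_ghz * gpus   # ~2488 GFLOPS
raster_mtris = 4 * core_ghz * 1000 * gpus     # ~4856 Mtris/s
bandwidth_gbps = mem_gtps * 384 / 8 * gpus    # ~328 GB/s

print(pixel_fill, texel_rate, shader_gflops, raster_mtris, bandwidth_gbps)
```

The same recipe reproduces the other rows from each card's clocks and unit counts.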

On paper, at least, the GTX 590 just beats out the Radeon HD 6990 in ROP throughput and memory bandwidth, two keys to fast operation at high resolutions with edge antialiasing, but it’s slightly slower in other areas. We wouldn’t sound any alarms about the GTX 590’s vastly slower theoretical shader arithmetic rates. Nvidia’s shader architecture tends to be more efficient, delivering performance comparable to AMD’s in many cases, if not superior. Meanwhile, the GTX 590 absolutely crushes the Radeon HD 6990 in peak triangle rasterization rate, which is but one indication of the GF110’s quite real end-to-end superiority in geometry processing and DirectX 11 tessellation throughput. The question there is whether or not Nvidia’s geometry processing advantage will matter in real games, and it’s a vexing one.

All in all, the GTX 590 looks to be endowed with outrageously high specifications. Yet those specs look very much like those of the primary competition, the Radeon HD 6990. This is gonna be a close one, folks.

The card

Dude. Glow.

Like its competition, the GTX 590 presents dual 8-pin aux power inputs to the user, threatening to require a PSU upgrade. The card’s max power rating, or TDP, is 365W, just 10W below the peak power deliverable through the combination of a motherboard’s PCIe x16 slot and a couple of those 8-pin auxiliary inputs. Not coincidentally, that’s also 10W below the Radeon HD 6990’s TDP.
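The arithmetic is worth spelling out. Per the PCIe spec, an x16 slot can supply up to 75W, and each 8-pin auxiliary connector is good for another 150W:

```python
# Power deliverable to the card under the PCIe spec.
slot_w = 75                    # PCIe x16 slot
aux_w = 2 * 150                # two 8-pin auxiliary connectors
budget_w = slot_w + aux_w      # 375W total
print(budget_w - 365)          # GTX 590's 365W TDP leaves 10W of headroom
print(budget_w - 375)          # the 6990's 375W TDP spends the whole budget
```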

The GTX 590’s expansion slot covers are pierced by three dual-link DVI ports, a mini-DisplayPort connector, and as much thermal venting as the card’s designers could muster. That mini-DP output supports DisplayPort 1.1a, so it’s less capable in several ways than the DisplayPort 1.2 outputs on newer Radeons. Then again, those Radeons can drive only one dual-link DVI display natively; connecting more will require expensive adapters.

When it comes to truly extreme display configurations, Nvidia and AMD have taken different paths. The GTX 590’s dual-link outputs will allow it to power a trio of four-megapixel monitors at once—or three smaller (~2 MP) monitors at 120Hz for wrap-around stereoscopic gaming via Nvidia’s 3D Vision scheme. That DisplayPort output enables the 590 to drive four displays simultaneously, but only for productivity; multi-monitor Surround Gaming is limited to a maximum of three displays. Meanwhile, AMD isn’t nearly as far down the path of cultivating support for stereoscopic 3D, but its Eyefinity multi-monitor gaming scheme will happily support six displays at once. The 6990 can do it, too, thanks to five onboard outputs and the possibility of connecting more monitors via a DisplayPort 1.2 hub. True to this mission, the 6990 also comes with more video memory than the GTX 590—2GB per GPU and 4GB total, versus 1.5GB per and 3GB total on the 590. It’s up to you to choose why you get a headache: from wearing flickery glasses, or from trying to track objects across display bezel boundaries.

GeForce GTX 580 (top) versus GTX 590 (middle) and Radeon HD 6990 (bottom)

If you’re looking for an indication of the differences in philosophy between Nvidia and AMD for cards of this ilk, look no further than the picture above. The GTX 590 is shown sandwiched between Nvidia’s best single-GPU card, the GeForce GTX 580, and the massive Radeon HD 6990. The GTX 580 is a very healthy 10.5″, the 590 is a considerable 11″, and the 6990 is just a smidgen shy of a full 12″. Although the GTX 590’s space requirements are definitely above the average, the 6990 will be problematic in all but the deepest PC enclosures. AMD has aimed for peak extremeness. Nvidia has tailored its solution to be a bit more of a good citizen in this way, among others.


Another way the GTX 590 aspires to be easier to get along with? Acoustics. Superficially, this card doesn’t look too terribly different from its rival, with a centrally located fan flanked by dual heatsinks whose copper bases house vapor chambers. However, Nvidia says the GTX 590 isn’t much louder than the GeForce GTX 580—and is thus substantially quieter than the howls-like-a-banshee 6990. We’ll put that claim to the test, of course.

Otherwise, the 590 has all of the sophisticated bits you might expect from a dual-GPU solution of this sort, including a 10-phase power supply with digital VRMs and Nvidia’s familiar NF200 PCI Express switch chip, which routes 16 PCIe 2.0 lanes to each GPU and another 16 lanes to the PCIe x16 slot.

And, yes, there is an SLI connector onboard, raising the prospect of quad SLI configurations based on dual GTX 590s. The card will do it, but Nvidia wants users to be careful about the selection of components to wrap around such a config. It recommends a motherboard with an additional expansion slot space between the PCIe x16 slots, so there’s adequate room between the cards for the interior one’s fan to take in air. The firm is certifying motherboards that meet its qualifications for quad SLI, along with cases and PSUs. Right now, cases are the biggest bugaboo. Only three are certified—Thermaltake’s Element V, SilverStone’s Raven RV02, and CoolerMaster’s HAF X—although more are purportedly coming soon. You could probably build a very nice quad SLI setup with some other popular full-tower cases and the right sort of cooling. Our sense is that Nvidia is emphasizing certification simply because it wants to ensure a good user experience and adequate cooling.

As you might expect, the GTX 590 will be priced at $699.99, exactly opposite the 6990. Cards should be available at online retailers starting today. Interestingly, you’ll only find GTX 590 cards from Asus and EVGA available for sale in North America. In other parts of the world, the 590 will be exclusive to other Nvidia partners. My understanding is the cards have been divvied up in this manner because they’re relatively low-volume products. It may have been deemed impractical to have six or more brands offering them simultaneously in one market. How low volume? When we asked, the firm told us it would be shipping “thousands of cards” worldwide. That’s noteworthy because it’s not tens of thousands—just thousands. That said, Nvidia expects a “steady supply available in the market.” Perhaps the $700 price tag will ensure demand doesn’t exceed supply over time.

One more thing

As you may know, the Radeon HD 6990 comes with an alternative firmware, accessible via a small DIP switch, that enables a configuration dubbed “uber mode” by AMD. The switch that turns on “uber mode” is the “Antilles Unlocking Switch for Uber Mode,” or AUSUM, for short. Because this config exceeds the PCIe power spec and isn’t guaranteed to work properly in all systems, it’s essentially overclocking, though it’s tacitly approved by the GPU maker.

We tested the 6990 with the AUSUM switch enabled, and that raised an issue of fairness. Nvidia hasn’t given the GTX 590 any comparable mechanism, but the card can be overclocked in software. We figured, by all rights, we should test an overclocked configuration for the GTX 590, as well. One has to be careful here, though, because the GF110 chips will definitely reach much higher clock speeds when given enough voltage—we reached 772MHz at 1025 mV, similar to the GTX 580—but you’ll also definitely bump up against the GTX 590’s power limiting circuitry if you push too hard. The result, as we learned, is that performance drops with the supposedly overclocked config.

We eventually decided on a more mildly overclocked config in which the GPU core was raised to 690MHz, the GPU core voltage was increased from 938 mV to 963 mV, and the memory clock was tweaked up to 900MHz (or 3.6 GT/s). This setup was easily achieved with MSI’s Afterburner software, proved quite stable, and, as you’ll see in the following pages, performed consistently better than stock. The only thing left to do then was give these settings a name, since they lacked one. Folks, say hello to Wasson’s Intrepid Clock Konfig, Extreme Dually—or WICKED. We’ve put WICKED and AUSUM head to head to see which is better.
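For reference, here's what the WICKED settings amount to over stock, assuming (generously) that performance scales linearly with clock speed. It's a rough upper bound, not a measurement:

```python
# WICKED vs. stock GTX 590, expressed as fractional increases.
core_gain = 690 / 607 - 1      # ~13.7% higher core clock
mem_gain = 900 / 854 - 1       # ~5.4% more memory bandwidth
vcore_gain = 963 / 938 - 1     # ~2.7% more core voltage
print(f"core +{core_gain:.1%}, memory +{mem_gain:.1%}, vcore +{vcore_gain:.1%}")
```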

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.

Our test systems were configured like so:

Processor         Core i7-980X
Motherboard       Gigabyte EX58-UD5
North bridge      X58 IOH
South bridge      ICH10R
Memory size       12GB (6 DIMMs)
Memory type       Corsair Dominator CMD12GX3M6A1600C8 DDR3 SDRAM at 1600MHz
Memory timings    8-8-8-24 2T
Chipset drivers   INF update 9.1.1.1025, Rapid Storage Technology 9.6.0.1014
Audio             Integrated ICH10R/ALC889A with Realtek R2.58 drivers
Graphics          Dual Radeon HD 6870 1GB with Catalyst 11.4 preview drivers
                  Radeon HD 5970 2GB with Catalyst 11.4 preview drivers
                  Dual Radeon HD 6950 2GB with Catalyst 11.4 preview drivers
                  Radeon HD 6970 2GB with Catalyst 11.4 preview drivers
                  Dual Radeon HD 6970 2GB with Catalyst 11.4 preview drivers
                  Radeon HD 6990 4GB with Catalyst 11.4 preview drivers
                  MSI GeForce GTX 560 Ti Twin Frozr II 1GB + Asus GeForce GTX 560 Ti DirectCU II TOP 1GB with ForceWare 267.26 beta drivers
                  Zotac GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers
                  Zotac GeForce GTX 570 1280MB + GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers
                  Zotac GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers
                  Zotac GeForce GTX 580 1536MB + Asus GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers
                  GeForce GTX 590 3GB with ForceWare 267.71 beta drivers
Hard drive        WD RE3 WD1002FBYS 1TB SATA
Power supply      PC Power & Cooling Silencer 750W
OS                Windows 7 Ultimate x64 Edition with Service Pack 1

Thanks to Intel, Corsair, Western Digital, Gigabyte, and PC Power & Cooling for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

  • Bulletstorm
  • F1 2010
  • Civilization V
  • StarCraft II
  • Battlefield: Bad Company 2
  • Metro 2033
  • Aliens vs. Predator

Some further notes on our methods:

  • Many of our performance tests are scripted and repeatable, but for some of the games, including Battlefield: Bad Company 2 and Bulletstorm, we used the Fraps utility to record frame rates while playing a 60- or 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We raised our sample size, testing each Fraps sequence five times per video card, in order to counteract any variability. We’ve included second-by-second frame rate results from Fraps for those games, and in that case, you’re seeing the results from a single, representative pass through the test sequence. (A sketch of how we aggregate these runs appears after this list.)
  • We measured total system power consumption at the wall socket using a Yokogawa WT210 digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Battlefield: Bad Company 2 at a 2560×1600 resolution with 4X AA and 16X anisotropic filtering. We test power with BC2 because we think it’s a solidly representative peak gaming workload.

  • We measured noise levels on our test system, sitting on an open test bench, using an Extech 407738 digital sound level meter. The meter was mounted on a tripod approximately 10″ from the test system at a height even with the top of the video card.

    You can think of these noise level measurements much like our system power consumption tests, because the noise level of the entire system was measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used GPU-Z to log GPU temperatures during our load testing.
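
One more note on aggregation: the sketch below shows roughly how repeated benchmark passes become the numbers in our graphs. We take the median across runs for the bar charts and pick one representative pass for the frame-by-frame plots. The Fraps log parsing is omitted, and the sample numbers are placeholders rather than measurements.

```python
from statistics import mean, median

def summarize(runs):
    """runs: one list of per-second FPS samples per benchmark pass."""
    averages = [mean(r) for r in runs]
    overall = median(averages)   # median across three (or five) passes
    # Use the pass whose average lands nearest the median as the
    # representative trace for the frame-by-frame line graph.
    trace = min(runs, key=lambda r: abs(mean(r) - overall))
    return overall, trace

# Placeholder data: three passes of per-second frame rates.
fps, trace = summarize([[58, 61, 60], [55, 59, 57], [60, 62, 61]])
print(round(fps, 1), trace)
```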

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Bulletstorm

This game is stressful enough on a GPU to make it a decent candidate for testing cards of this type. We turned up all of the game’s image quality settings to their peaks and enabled 8X antialiasing, and then we tested in 90-second gameplay chunks.

The Radeons turn in a relatively strong showing here, with the 6990 essentially matching Nvidia’s fastest dual-GPU solution, a couple of GTX 580s in SLI. At its stock clocks, the GTX 590 performs almost exactly like our GTX 570 SLI setup.

F1 2010
F1 2010 steps in and replaces CodeMasters’ previous effort, DiRT 2, as our racing game of choice. F1 2010 uses DirectX 11 to enhance image quality in a few select ways. A higher-quality FP16 render target improves the game’s high-dynamic-range lighting in DX11. A DX11 pixel shader is used to produce soft shadow edges, and a DX11 Compute Shader is used for higher-quality Gaussian blurs in HDR bloom, lens flares, and the like.

We used this game’s built-in benchmarking facility to script tests at multiple resolutions, always using the “Ultra” quality preset and 8X multisampled antialiasing.

Here’s another very strong showing for the red team. Even the Radeon HD 6950 CrossFireX setup outperforms two GTX 580s in SLI. Hang tight—this surely won’t last.

Civilization V
Civ V has a bunch of interesting built-in tests. Up first is a compute shader benchmark, which measures the GPU’s ability to decompress textures used for the graphically detailed leader characters depicted in the game. The decompression routine is based on a DirectX 11 compute shader. The benchmark reports individual results for a long list of leaders; we’ve averaged those scores to give you the results you see below.
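
Mechanically, that averaging is as simple as it sounds. Here's a minimal sketch, with placeholder leader names and scores rather than real results:

```python
# Collapsing per-leader scores into one average. Names and scores
# here are placeholders, not measured results.
scores = {"Washington": 108.2, "Ramesses": 112.6, "Oda": 104.9, "Wu": 110.3}
print(sum(scores.values()) / len(scores))
```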

Remember how I said we shouldn’t sound any alarms about the much higher theoretical shader throughput of the Radeons? Here’s an example of why that’s so. Even though the GTX 590 has relatively low clock speeds, and although the performance of multi-GPU setups doesn’t scale well in this test, the 590 comes out ahead of the fastest dual-Radeon config.

In addition to the compute shader test, Civ V has several other built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. As with the earlier compute shader test, we chose to average the results from the individual leaders.

You can worry a bit here if you’d like, though. This pixel-shader-intensive benchmark runs notably faster on the Radeons.

Another benchmark in Civ V focuses, rightly, on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.

The GTX 590 proves to be faster than the 6990 in this test, although both cards offer more-than-adequate frame rates at these settings.

StarCraft II

Up next is a little game you may have heard of called StarCraft II. We tested SC2 by playing back 33 minutes of a recent two-player match using the game’s replay feature while capturing frame rates with Fraps. Thanks to the relatively long time window involved, we decided not to repeat this test multiple times. The frame rate averages in our bar graphs come from the entire span of time. In order to keep them readable, we’ve focused our frame-by-frame graphs on a shorter window, later in the game.

We tested at the settings shown above, with the notable exception that we also enabled 4X antialiasing via these cards’ respective driver control panels. SC2 doesn’t support AA natively, but we think this class of card can produce playable frame rates with AA enabled—and the game looks better that way.

The GeForces win the day in StarCraft II, and the GTX 590 performs particularly well, with our WICKED config nearly matching dual GTX 580s.

Battlefield: Bad Company 2
BC2 uses DirectX 11, but according to this interview, DX11 is mainly used to speed up soft shadow filtering. The DirectX 10 rendering path produces the same images.

We turned up nearly all of the image quality settings in the game. Our test sessions took place in the first 60 seconds of the “Heart of Darkness” level.

This one is more or less a dead heat, although WICKED outduels AUSUM in more pronounced fashion.

Metro 2033

We decided to test Metro 2033 at multiple image quality levels rather than multiple resolutions, because there’s quite a bit of opportunity to burden these graphics cards simply using this game’s more complex shader effects. We used three different quality presets built into the game’s benchmark utility, with the performance-destroying advanced depth-of-field shader disabled and tessellation enabled in each case.

The GeForce cards have an edge at the lower image quality presets, where frame rates are into the hundreds. Once we turn up all of the shader effects and detail settings, though, the standings even out somewhat. The result: the 6990 noses past the GTX 590, and AUSUM overcomes WICKED.

Aliens vs. Predator
AvP uses several DirectX 11 features to improve image quality and performance, including tessellation, advanced shadow sampling, and DX11-enhanced multisampled anti-aliasing. Naturally, we were pleased when the game’s developers put together an easily scriptable benchmark tool. This benchmark cycles through a range of scenes in the game, including one spot where a horde of tessellated aliens comes crawling along the floor, ceiling, and walls of a corridor.

For these tests, we turned up all of the image quality options to the max, along with 4X antialiasing and 16X anisotropic filtering.

The 6990 takes our final game test, while the GTX 590 falls in place right behind dual GTX 570s in SLI.

Power consumption

Although the GTX 590’s TDP rating is 10W lower than the 6990’s, the new GeForce draws substantially more power here than the Radeon. This isn’t an absolute max power situation—we’re running a game, not a synthetic stress test or the like—so the results aren’t necessarily surprising. The 6990 does look to be more power-efficient than the GTX 590, though, both at idle and when running Bad Company 2. In fact, the AUSUM 6990’s power use is comparable to the stock GTX 590’s. The WICKED config demonstrates why, perhaps, Nvidia hasn’t pushed any harder on GPU frequencies. While the clock headroom is there, the power headroom is not.

Noise levels and GPU temperatures

Here’s the eye-popping result of the day. Although the GTX 590 draws more power than the Radeon HD 6990 under load, it still registers as roughly 10 decibels quieter than the Radeon on our sound level meter. Subjectively, the difference is huge. The 6990 fills the room with a rough hissing sound, while the GTX 590 isn’t much louder than an average high-end video card. Even when it’s overclocked, the WICKED 590 is quieter than the stock 6990 by a fair amount.
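For context on what roughly 10 decibels means: the scale is logarithmic, so a 10 dB gap corresponds to ten times the sound power, and it's commonly taken as about a doubling of perceived loudness. A quick check:

```python
# A ~10 dB gap on the meter, expressed as a sound power ratio.
delta_db = 10
power_ratio = 10 ** (delta_db / 10)
print(power_ratio)   # 10.0: about ten times the sound power
```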

One contributor to the difference is revealed in the GPU temperature results. The 6990’s fan control profile is relatively aggressive about keeping GPU temperatures low, perhaps out of necessity. The GTX 590 lands in the middle of this pack, at least until it goes WICKED.

Conclusions

Most folks seem to enjoy these value scatter plots, so let me drop one on you.

We’ve taken the results from the highest resolution or most intensive setting of each game tested, averaged them, and combined them with the lowest prevailing price at Newegg for each of these configurations. Doing so gives us a nice distribution of price-performance mixes, with the best tending toward the upper left and the worst toward the bottom right.
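For those who want to reproduce the plot's mechanics, a minimal sketch follows. The prices and frame rates in it are illustrative placeholders, not our measured results:

```python
# Price-performance scatter: average FPS at the most intensive
# settings vs. lowest prevailing price. Placeholder data only.
import matplotlib.pyplot as plt

configs = {
    # name: (price in dollars, average FPS across the test suite)
    "GeForce GTX 590": (700, 60.0),
    "Radeon HD 6990": (700, 64.0),
    "GeForce GTX 580": (500, 42.0),
}

for name, (price, fps) in configs.items():
    plt.scatter(price, fps)
    plt.annotate(name, (price, fps))

plt.xlabel("Price ($)")
plt.ylabel("Average FPS")
plt.show()
```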

Keep in mind that we’ve only tested seven games, and that these standings could change quickly if we altered the mix. Still, based on our tests, the Radeon HD 6990 has an appreciable performance lead over the GTX 590 at the same price. Yes, that lead largely evaporates with our WICKED overclocked config, but going WICKED involves some pretty extreme power draw, even compared to AMD’s (admittedly more conservative) AUSUM option.

The GTX 590 is still breathtakingly fast—much quicker than a single GeForce GTX 580 and nearly as quick as a pair of GeForce GTX 570s or Radeon HD 6950s—but its true distinction, in our view, is its wondrously soft-spoken cooling solution. The GTX 590’s cooler is vastly quieter than the boisterous blower on the Radeon HD 6990. Combine that acoustic reality with the GTX 590’s shorter 11″ board length and understated appearance, and a sense of its personality begins to take shape. This card is more buttoned-down than the 6990. There’s no AUSUM switch, no bright red accents showing through the case window, and no obvious aural proclamation that Lots of Work is Being Done Here.

Frankly, I like that personality. If I were spending $700 on a dual-GPU graphics card for my ideal PC, I’d probably choose the GTX 590, even if it did mean sacrificing the absolute best performance.

But choosing the GTX 590 does mean making that sacrifice, and I’m not sure how that plays in the world of uber-extreme PC hardware components, where speed and specs have always been paramount. Prospective buyers of these rather exclusive video cards have an intriguing choice to make. In my view, the image quality and feature sets between the two GPU brands are roughly equal right now. The prices are the same, and the Radeon HD 6990 has more of nearly everything: frames per second, onboard memory, video outputs, as well as noise and board length. The 6990 has the most. Could it be that it’s still not the best?

Comments closed
    • kamikaziechameleon
    • 9 years ago

    This plot points out that the 6970 CF config offers the best performance per dollar in that price range.

      • Airmantharp
      • 9 years ago

      Unless you look at the HD6950 2GB CF setup, which produces the same FPS/$ as 6870 CF with higher performance, and better FPS/$ than the HD6970 CF solution. When you consider that most HD6950s are shader unlockable and overclockable, there’s very little reason to spend more!

    • Crayon Shin Chan
    • 9 years ago

    I’ll wait for Kepler and then laugh at all you pansies getting excited over this card.

    • shark195
    • 9 years ago

    GTX 590 VRS AMD RADEON 6990 BATTLE FIELD

    POWER CONSUMPTION NVIDIA WINS(0)
    ACOUSTICS ( NOISE LEVELS) NVIDIA WINS(+1)
    EFFICIENCY (PERFORMANCE PER WATTS) AMD WINS(+1)
    PERFORMANCE (PERFORMANCE PER VALUE($$$) AMD WINS(+1)
    GPU TEMPERATURES NVIDIA WINS(0)
    DESIGN NVIDIA WINS(+1)
    OTHER FEATURES (EYEFINITY, 3D VISION SURROUND,

    So looking at the comparisons above it is obvious, AMD wins but
    NVIDIA still has the FASTEST SINGLE GPU SOLUTION AND STILL THE FASTEST MULTIPLE GPU (2GPUs) SOLUTION IN TERMS OF SCALABLE LINK INTERFACE (SLI)

    • Kevlar
    • 9 years ago

    [url<]http://www.youtube.com/watch?v=sRo-1VFMcbc&feature=youtube_gdata_player[/url<] Sign me up!

      • Airmantharp
      • 9 years ago

      It’s a good point (I uprated you for the link), and I understand that you’re posting it in humor, but I’m reminded to take such pieces of information with a grain of salt.

      It’s surely nice to know that someone else has found out that these cards have a real breaking point though, and there’s never going to be a shortage of people willing to spend this kind of money just to find out the hard way!

      I don’t think that anyone who runs one of these cards at stock or is reasonable about their overclocking will have issues like the guys in your link.

    • Airmantharp
    • 9 years ago

    I’m a little late in adding this, but I wanted to say in my own comment,

    THANK YOU DAMAGE & CO. Your efforts are wanted and appreciated.

    Now that we have all of the current generation parts out, it’s nice to be able to look at the whole top end field to get an idea of how everything we’ve been looking forward to for the last two years or so has panned out.

    Things like AMD pushing Crossfire scaling to new heights, and passing Nvidia’s performance benchmark in the process, are exciting and were completely unexpected.

    I also appreciate the use of 2560×1600 as the main resolution of the review and comparison. I own a GTX570 SC, and after a quick run through of the admittedly buggy Dragon Age II while watching my GPU memory usage on GPU-Z, I was starting to get worried about my card running out of memory in newer games!

    I am and have been considering SLi, and I’m thinking that if I’m close to running out of memory now with just one card, running upcoming games, in particular Battlefield 3, with two cards might be worse.

    I’ll say that your review here (and the HD6990 review that preceded it) actually hasn’t really helped answer my fears, in that the GTX570 SLi showing has been spot on for its horsepower in single and dual configuration so far.

    Still, I’m highly considering trading or selling my EVGA GTX570 SC and replacing with an HD6950 2GB so that when I do get around to upgrading my board (and CPU and memory) to support two GPUs, I can grab a second and not be worried about scaling in newer games or running out of video memory.

    Thanks again! Keep up the excellent work!

    • grantmeaname
    • 9 years ago

    Why does the 6970 have a 1fps minimum in SC2? I’m not seeing it in the line graph.

    Also: holy balls, 365W TDP. My GPU uses 13W.

    Edit: For comparison to the i7-980X/590 at 541 stock W/616 overclocked W, the Core 2 X6800 with three 8800 Ultras drew 601 W three years ago.

      • Airmantharp
      • 9 years ago

      I’m actually not sure what you’re worried about here…

      For SC2, or any game that reaches a 1FPS minimum, it shouldn’t be much of a surprise or cause for alarm. Many game benchmarks span periods where the game is loading data and these points should show very low framerates, sometimes dropping to zero. This shouldn’t be a concern though- I’m not worried how fast the GPU runs the game during the load screen, and you shouldn’t be either!

        • grantmeaname
        • 9 years ago

        I never said I was worried about anything. I’m suggesting it’s a numerical artifact and shouldn’t be included in the bar graph, since it prevents us from seeing the real minimum framerate.

    • CaptTomato
    • 9 years ago

    You want the truth, you can’t handle the truth.
    These big assed GPU’s suck, they suck on price, suck on juice, suck on noise, suck on looks.
    Who’d buy either of these things?…..in Australia, the slower and uglier 590 costs more than the faster and hotter 6990, so I guess we’re being slugged a “cooling tax”

    I’m giving this whole generation of dodgy CPU’s and big assed nuclear powerplant GPU’s the ass…..I’ll wait till this time next yr for some fair dinkum gear.
    I can buy a 50in plasma for the cost of the 590…how the hell does that make sense.

    • Krogoth
    • 9 years ago

    590 is a minor disappointment. I was half-expecting to leapfrog the 6990. 590 only proves to faster in games that favor Fermi architecture over Cypress/Cayman architecture. Otherwise, it is just as fast as 6990 or a tidbit slower.

    You can’t go wrong with either choice. These behemoths can effortlessly handle 4Megapxiel with AA/AF thrown in. $699 may seem steep at glance, but decent 2560×1600 monitors aren’t cheap either. The 2megapixel crowd are better served with the more affordable; GTX 460, GTX 550, HD 6850, GTX 560, HD 6870, HD 6950 and GTX 570.

      • Mikael33
      • 9 years ago

        nVidia could have been faster had they not compromised its performance to fit thermal and acoustic needs, whereas AMD decided to build the fastest thing they possibly could, with somewhat OK thermals and noise. The cooler on the 6990 leaves much to be desired, though; clearly nVidia has them beat in that dept for both single and dual GPU cards.

        • clone
        • 9 years ago

        Mikael33, GTX 590s have real and significant heat issues with that quiet-running cooler you mention, and they are failing in spectacular fashion when overclocked on launch day….. from this you assume there is plenty of headroom left in the cards for overclocking?

        p.s. Nvidia is in the process of locking down the cards’ overclocking potential (on launch day, no less) not because they want to impose quiet-running cards on users but because overclocking kills GTX 590s.

      • Meadows
      • 9 years ago

      [quote<]590 only proves to faster[/quote<] Is it that time again? When people invent verbs? [quote<]These behemoths can effortlessly handle [b<]4Megapxiel[/b<] (note the spelling - Ed.) with AA/AF thrown in.[/quote<] I don't get it, Krogoth. Every single videocard review lately has received a copypasta "these [insert here] can *effortlessly* handle X megapixel gaming" line. Do you, per chance, have a script or macro that comments for you? Has the industry really gotten that predictable? Yawn, we need your help more than ever.

    • Voldenuit
    • 9 years ago

    If you pull the GTX 590 to match the price/performance curves of the 6990 and 6970, it should be sitting at the $600 mark.

    So, nv, you need to cut the price of this beast by $100.

    • flip-mode
    • 9 years ago

    Does anyone else here miss the days when games needed fast cards?

      • sweatshopking
      • 9 years ago

      no. i’m poor. those days made me sad :(

        • flip-mode
        • 9 years ago

        I’m poor too, but GPU hardware is more exciting when there are games that can give it a challenge. There’s a paucity of games out there that can tax a 6850 / 460.

    • Buzzard44
    • 9 years ago

    Hotter, slower, more power draw vs. Louder, bigger.

    Bring on the noise, please!

    • kamikaziechameleon
    • 9 years ago

    I just can’t believe that the two dual GPU solutions are 700 dollars. I bought my 9800 GX2 for 250 bucks 6 months after release. This is just insane.

      • tejas84
      • 9 years ago

      Nvidia’s business practices and pricing have finally put me off them big time.

        • sweatshopking
        • 9 years ago

        finally? I like you tejas, but i don’t ever remember you being unbiased in the GPU department. That being said, in this case, nvidia dropped the price a TON. 2 580’s is 1000, this is 700.

          • Meadows
          • 9 years ago

          Two 580’s are also faster, albeit require more space and power.

          • Farting Bob
          • 9 years ago

          Going by performance this is 2 570’s on a stick. Marginally faster, slightly more expensive.
          Either way, spending $700 on GPU(s) is stupid for all but the richest gamers with multi-monitor setups that insist on only ever playing on the highest settings at 100fps.
          What is important for the other 99.5% of potential buyers is the sub $250 market.

    • tejas84
    • 9 years ago

    Everyone will be shocked I am saying this but here goes…

    What a POS garbage card the GTX 590 is. Can you believe that this shit is selling for MORE money than the AMD 6990 which is faster?

    Can anyone explain that to me? Who the hell buys a dually for how quiet it is??? The 6990 wins in the tests that count. TDP and Performance.

    Nvidia and their price gouging for an inferior product can go suck a lemon.

      • ClickClick5
      • 9 years ago

      AND YET! People will still flock to the card. It is hotter, slower and draws more power. So the conversation goes:

      “I’m waiting to see who is faster. Nvidia or AMD. Then I’ll buy.” <<<(Nvidia in mind the whole time)
      “Wow, look at that, AMD won here. I bought the Nvidia anyway.” <<<(Fanboyism)

      • flip-mode
      • 9 years ago

      [quote<]Who the hell buys a dually for how quiet it is???[/quote<] Good point. Initially I favored the 590's mix of characteristics, but you make a point that's hard to argue with. As for pricing - these cards may as well be $1500 as far as I care. Price / performance doesn't really matter much here, but I guess flat out performance does. If you're buying one of these cards then you probably have a pair of $150 headphones to go with them and you don't give a shot about noise.

    • StuG
    • 9 years ago

    So the HD6990 wins. Interesting conclusion after all the GTX590 hype.

    • ssidbroadcast
    • 9 years ago

    Hey, that self-illuminating logo on the side of the card is pretty neato.

      • StuG
      • 9 years ago

      That’s where the high power draw comes from!

        • ssidbroadcast
        • 9 years ago

        I wonder if other card vendors will follow suit, followed by an escalation effect. Maybe we’ll start seeing “news-ticker” style banners across the side of the card now?

          • MrJP
          • 9 years ago

          Corsair were already there with RAM modules many years ago. Just wish I could remember the name or find a link.

            • JustAnEngineer
            • 9 years ago

            Crucial Ballistix Tracer, IIRC.

            P.S.: Here’s a link:
            [url<]http://www.crucial.com/store/listmodule/DDR3/~Ballistix%20Tracer%20240-pin%20DIMM%20%28with%20LEDs%29~/list.html[/url<]

            • MrJP
            • 9 years ago

            Here’s a link to the Corsair XMS Xpert I was thinking of, from all the way back in 2005. They also had simpler LEDs showing memory activity back in 2003.

            [url<]http://www.trustedreviews.com/cpu-memory/review/2005/04/13/Corsair-XMS-Xpert-CMXP512-3200XL-memory/p1[/url<] Looks like they're at it again: [url<]http://www.thinq.co.uk/2010/6/1/corsair-unveils-new-led-memory-monitor/[/url<] Does anyone still have case windows?

            • sweatshopking
            • 9 years ago

            my case has no sides!! I found it, and the panels were missing. it’s one of the old off white ones. it does the job, but it’s not very pretty. so to answer your question, YES! I would be able to see these!!

      • JustAnEngineer
      • 9 years ago

      My Creative X-Fi Titanium has one of those, too. Since I outgrew cases with windows years ago, it seems like a silly waste to me.

      • Bensam123
      • 9 years ago

      Creative has a cool little glowing logo (or used to) on the sides of their cards. I honestly think it’s a sweet little detail for showing off your hardware, I have no idea why more companies haven’t jumped on that yet. It’s just a clear little logo with a light behind it. What is brand visibility?

    • Ryu Connor
    • 9 years ago

    Ended up getting the EVGA 590.

    Amusingly, despite the smaller size of the 590, it is still going to result in a case swap. My current SFF case holds a 10.5″ GTX 295. The max video card size of this particular SFF case is 10.6″. Doh!

    So I got a new SFF case from Thermaltake that can handle up to a 13″ graphics card. I’d been pondering replacing this current case; the purchase just gave me the needed excuse.

    On the whole – as compared to my current 295 – I’ll get more performance, essentially identical heat output, better idle power consumption, and better noise characteristics at idle and load. What it cost me is more power consumed at load.

    Acceptable trade offs for me. Thanks for the review, Scott!

      • sweatshopking
      • 9 years ago

      IT ALSO COST YOU 700$ + YOU CRAZY MAN. DON’T FORGET ABOUT THAT!!

        • Ryu Connor
        • 9 years ago

        How much do you think my GTX 295 or even my 4870×2 cost me? :)

          • sweatshopking
          • 9 years ago

          LOL IRL

          • PainIs4ThaWeak1
          • 9 years ago

          No more than $550 new.

            • Ryu Connor
            • 9 years ago

            As a bit of a flight of fancy about how prices have changed over the past two years, I went and dug up my old invoices on NewEgg.

            According to my NewEgg invoices, I paid $560 for the 295; my 4870×2 rang up closer to $430.

            Looking at the generations of single-card SLI that have come since, I found that the only 5970 still in production is $900. The MSRP for the now-dead 5970 was $600, but it never sold for that price.

            The 6990s are all selling above $700, trending closer to $730. The 570 is trending at $350 each, and the 590 matches up against two 570s in SLI very well.

            If you were looking for indications that the prices of video card products are going up, they’re definitely here. Both ATI and NVIDIA are setting new highs with their prices.

            The MSRP on the 5970 was $170 more than the 4870×2 and the 6990 is running roughly $300 more than the classic 4870×2 at current market prices. The 590 is $170 (EVGA) or $140 (Asus) more than the 295 at current prices.

            Moot point really. According to TR’s latest poll only about 5% of us who were at least active enough to click a button on a poll own 2560×1600/2560×1440 monitors. I’d gather that same % point is likely the number of us who own SLI/Crossfire hardware to push those monitors.

            The reactions in this thread aren’t that shocking. The 6990 review comments were equally filled with people incredulous at the price, and they even managed to spawn an offshoot topic about the price of 30″ monitors.

            • BuddhistFish
            • 9 years ago

            As Ryu said, some of us are using monitors that run at resolutions that require this kind of horsepower. I’m running dual GTX 460s now, and I’ll be plunking down the cash for this 590 in a month or two. Nice as they are, my 460s can’t push most modern games at what I would consider decent visual settings at the native res (2560×1440) of my Dell U2711. Is this a card for the masses? No. Is it a niche card? Yes, hence the $700 price tag.

            • phez
            • 9 years ago

            You should think about getting out of your niche.

            • Ryu Connor
            • 9 years ago

            Statements like that make me wonder what happened to the hardware enthusiast community. Seems like people’s sense of value and brand loyalty has left “cool hardware” swinging from a gallows.

            I’m excited to get some shiny new hardware. Seems I shouldn’t expect any camaraderie with regard to it though!

            • sweatshopking
            • 9 years ago

            I think it’s cool stuff, bro, just that it’s WAY out of my budget.

            • oldDummy
            • 9 years ago

            +1 Fellow hardware nut

            • clone
            • 9 years ago

            I remember when I used to replace CPUs and video cards every 3 months, and motherboard and RAM every 6, to get more speed….. so much coin wasted. I stopped in 2005.

            have since moved on to other interests and paid off 4 cars (2 were cheap used)

            you spent $700 on a video card and while that’s great I can’t help but think that $700 could get me a full stainless 3″ turbo back exhaust, a high flow cat and used 680cc injectors for a Talon TSI.

            just laughing at how much my focus has changed.

            • d0g_p00p
            • 9 years ago

            I am with you man. Whenever I read the comments after a review here I really wonder if I am on a computer “enthusiast” site. I comment about this all the time. A piece of hardware gets reviewed and the comments are filled with cheap asses and people complaining. If a piece of hardware is out of someone’s price range or does not fit their bill, it’s either too expensive or worthless. The video card and power supply reviews are the worst offenders for these whiners. What is the common response?

            People with three-year-old systems saying how they can play everything maxed out, so this new video card is useless. People moaning about power usage, as if owners have more money than brains. Look, if you cannot stomach spending an extra $20 a year to run a power supply rated over 300W or a current video card, then maybe this is not the site for you.

            Maybe the TR crew should refocus on what they review to please the cheapskates on this site: no power supply review over 400W, no video card costing more than $130, no CPU over $75, etc. It just kills me what people (the same ones over and over again) bitch about. Myself, yes, I will never buy this card for a number of reasons, but I like reading about it and seeing the benchmarks. Same with every piece of gear on here that does not fit my needs or my budget. However, I am not complaining about the cost, power consumption, or how no one needs a video card faster than a GTX 260.

            I mean, why even bother to read the article, much less comment, if all you are going to do is complain about how expensive it is? I could keep ranting, but I’ll end this; I have said enough.

            • Airmantharp
            • 9 years ago

            Well said, I appreciate your comment.

            When I read an article like this one, especially one done this well (thanks Damage!), I put it into the perspective of someone that might put this hardware to real use.

            It’s no different (as I commented in the HD6990 review) than reading articles about supercars! We know that most (if not all, statistically) of us will never use the technology on display here, but that doesn’t mean we don’t also enjoy reading about it and considering its real uses.

            Like Ryu Connor said, with respect to the GTX 590 and HD 6990, since they’re both fast and have enough outputs, it comes down to noise under load. Power isn’t an issue either, since we know we can get 1000W and higher PSUs that are also near silent at the loads a system using one of these cards would need.

            Since the GTX590 is much quieter in both measured and subjective tests I can see why he chose it, and I support his decision fully.

            Further, the price of these cards is quite pertinent, but please, show me how they’re out of line enough to complain about! Compared to a pair of single cards to do the same work, I feel that they’re absolutely in line with what they should be. Really only the HD6990 stands out of the crowd of dual-GPU solutions in that it is both considerably louder, and capable of measurably higher performance when pushed.

            The real differentiation, I think, will be this: people who want maximum performance regardless of cost will tend to grab the HD 6990 (or two) and bolt an effective waterblock to it, adding it to an in-place (or planned) water-cooling loop. They’re paying more both for the card and the system, due to the water-cooling necessary to use the card in silence. That said, I’d say that they are definitely getting what they’re paying for, and I support their solution, especially if their goal includes a sizable Eyefinity setup.

            People who want the compromise, that being awesome performance and reasonable acoustics, given that power draw and heat output aren’t an issue at this level, will go for the GTX590 like Ryu above. For the performance he’s getting in a single card, meaning that he can use a smaller case and mainboard while still having room for things like internal soundcards and network cards, or TV tuners, or whatever, he made an excellent choice.

            So, quit bitching. If this product doesn’t pertain to you, then this comment section doesn’t pertain to you!

            • CaptTomato
            • 9 years ago

            The thing is, though, all of us can afford to buy the x2, but due to the inflated price (we feel ripped off) and temp issues, we generally don’t.

            • Airmantharp
            • 9 years ago

            You should think about getting another job so that you can afford the toys to join the club!

            Heck, I’m looking around the house right now to find things worth selling so I can make a dent in my wishlist :).

            • Bauxite
            • 9 years ago

            My 5870 from 2009 :) drives a U3011 just fine, so I’ll disagree that a single monitor truly demands above a $300-class card now. (6950, with possible “free” 6970 upgrade)

            Multimon, yes.

            Also, SLI/CrossFire are still significantly more buggy and behind the driver-fix curve; that has yet to go away.

      • oldDummy
      • 9 years ago

      I have a GTX580 which cost …whatever.
      In addition, I have also donated $50 to Japan.
      Should that have been reversed?
      Maybe, but that’s up to me.

      Stop the hating.

        • Airmantharp
        • 9 years ago

        I know right?

        I have a GTX570, and yet that pales in comparison to what I spent on the 30″ HP it’s pushing to. Besides, I don’t really believe in giving money for disasters (or to politicians, or to the homeless on the street, or whatever), instead preferring to physically help where I can when I can. If I was aware of a way to go help the people in Japan right now while still being able to keep my job, my home and my car waiting for me when I returned and also not go into debt to get there, work, and come back, I’d jump on it!

        Instead, I endeavor to find ways I can help my own community. But if I want to spend some of my money on my own entertainment, I’m going to, and I’m not going to feel bad about it (well, my checkbook might…)

      • swaaye
      • 9 years ago

      Watch out. You aren’t allowed to go to excess with your computer hardware around here. ;)

      • Krogoth
      • 9 years ago

      Why are you getting dual-GPU cards for an SFF chassis?

      Said cards are meant to operate in a full tower chassis. I am surprised that your previous 295 hasn’t run into issues. The 590 will most certainly have thermal issues at load. The card barely cools itself in a full tower chassis at load and stock speed. It is one of the caveats of SFF chassis, besides the lack of expansion slots.

      You don’t have much choice but to move into a larger chassis if you want to avoid nasty issues.

        • Airmantharp
        • 9 years ago

        I’m not sure what the full reasoning behind your comment is, but I believe that it’s flawed in that you’re looking at things a little backward.

        The card will run well and without issue if it is provided enough power and airflow, both of which are possible in many SFFs. You have to be careful of course, and I agree with you there, but by no means must this card be put in a full tower!

        The idea of using one of these cards in an SFF is actually quite appealing- you can use an ITX case (with appropriate power) and board to put the most functionality that has been possible up to this point into such a small system!

        Why can’t you appreciate that goal? I know it’s something that I’ve wanted to do, but every time I start looking at SFFs I realize that I don’t ever, and don’t ever want to, tote a desktop system around.

          • Krogoth
          • 9 years ago

          You grossly underestimate the thermal output of the 590 and the heat dissipation of an SFF chassis.

          Nvidia’s engineers never set SFF chassis as the baseline for the 590; it is probable they set the typical full tower ATX chassis as the baseline. Given the size requirements of the card, and that overclockers are burning up their 590s inside full tower chassis, it isn’t too difficult to imagine what would happen to a stock unit in a more confined space.

            • Meadows
            • 9 years ago

            Useless comment. Let the man decide for himself.

            • Ryu Connor
            • 9 years ago

            FWIW:

            [url=http://www.thermaltakeusa.com/Product.aspx?S=1200&ID=1428#Tab1<]This[/url<] is the old case holding the [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16814130453<]295[/url<]. [url=http://www.thermaltakeusa.com/Product.aspx?S=1122&ID=1989#Tab1<]This[/url<] is the new case holding the [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16814130630<]590[/url<]. The difference in temperature between the 590 and 295 is [url=http://www.anandtech.com/show/4239/nvidias-geforce-gtx-590-duking-it-out-for-the-single-card-king/16<]not going[/url<] to be an issue. They are practically identical from my point of view. The 295 has been on an x58 mATX with an i7-940 the last two years. It works peachy. No need to Monday morning quarterback me on this. I have the situation well in hand.

            • Farting Bob
            • 9 years ago

            The volume of the case is completely unrelated to how power hungry the system you put in it is. It’s about airflow. If you have enough air going in and enough hot air going out, you can run a 1000W system in a shoebox all day; it’s just that you will need very high airflow fans, and it will sound like a jet engine.

            • oldDummy
            • 9 years ago

            Hey what a concept, reality.
            And with a directed airflow it doesn’t have to be THAT loud.

    • kilkennycat
    • 9 years ago

    Really looking forward to a dual GTX560Ti using the GTX590 cooling system.

    Probably default-clock at 850MHz with decent overclock potential. I would expect an initial price < $500, dropping to ~ $400. Likely to be the most popular dual-GPU card ever.

    eVGA and “friends” are you reading? Do not skimp on the cooling system to save a very few buxx when you finally release the dual-GTX560Ti.

    • oldDummy
    • 9 years ago

    Anyone who thinks I’m an Nvidia fanboy should know up front that I was a 3Dfx fanboy…bigtime.
    THAT being said:
    Nvidia has wormed its way into my boxes for quite a few years now. One word describing this company? Relentless….though others come to mind also. ;)

    • ap70
    • 9 years ago

    These reviews always put me down.
    I bought a GTX 560 Ti two months ago, and it runs BC2 at 56fps at 70°C.
    In this review they don’t even show the 560 Ti as a single card, only in SLI getting 60-65fps.
    It makes me feel bad about my sweet two-month-old video card.
    NVIDIA should sit down and develop an extremely good card each year instead of popping out 5fps extra every month.
    Give us something reeeeaaaaallly amazing instead of putting good cards like the 560 Ti in the back of the bus.
    Can we really see a difference between a 560 Ti @ 58fps and a 590 @ 70fps?

      • Mithrandir8
      • 9 years ago

      They were running those cards at 2560×1600 resolution with 4X AA. That’s quite a noticeable difference over the settings you’re probably playing at.

    • xeridea
    • 9 years ago

    What I see is confirmation of suspected lies and twisted marketing from Nvidia. Looking at the numbers, the TDP is obviously much higher than Nvidia claims, and higher than the 6990’s, though the stated TDP is 10W less. On their website they say that the GTX 590 is the fastest card, though third-party benchmarks say otherwise, on average.

    It’s a wonder how it is cooler while having less venting and more power draw; perhaps they have put more time into cooling, since their cards tend to run hot. Which brings up another point: I wonder what the 6990’s noise level would be like if they didn’t care about near-boiling temperatures like Nvidia.

    • sheytan
    • 9 years ago

    The 6990 is faster and more efficient than the 590.
    Still, the author “chose” the 590… Strange, when many other sites have the 6990 as the “winner.”
    The author Scott Wasson is obviously an Nvidia fanboy.

      • Myrmecophagavir
      • 9 years ago

      Maybe he cares about the noise level? I know I do. It’s really standing in the way of a 6990 purchase for me, even though I’d prefer the higher performance and lower power consumption. I wonder if there are any 6990 models in development with quieter coolers.

      • Damage
      • 9 years ago

      Like I said, I’d personally pick the 590 for my own use because it’s much quieter (the difference is quite substantial when you use the things side by side in person) and because it’s shorter (and wouldn’t constrain my case choices so much). That’s my preference, and those are my reasons for it. I didn’t say the GTX 590 was definitively superior to the 6990, though. I left that up to the reader, so you are free to follow your own tastes. If you think having such nuance in a review makes me an Nvidia fanboy, well, I can’t really help you.

        • gbcrush
        • 9 years ago

        +1 in my book. Some folks just don’t understand the difference between Imply and Infer.

        Of course, I too, like to think I understood Damage perfectly. Assumptions abound! :)

        • can-a-tuna
        • 9 years ago

        But in idle mode it’s the 6990 that is a bit quieter. I use my PC 90% of the time on non-gaming tasks, so I appreciate low idle noise much more. If you game with the stereo at full blast, does it really matter if the fan is a bit loud? Besides, I never go for reference models because of their higher fan noise. I’m sure one can adjust the fan speed, and non-reference Radeons should be available soon.

          • Damage
          • 9 years ago

          The difference between the two cards’ sound levels at idle is small numerically, relatively speaking. Perceptually, to me, the difference is pretty much zero. That’s just not the case under load, at all, by any stretch. That’s why I’ve emphasized noise under load in my analysis–because the differences are much more consequential there, practically speaking.

          Look, you’re welcome to prefer the 6990 over the GTX 590 for its higher performance, lower measured power draw, built-in overclocking, better display output array, more memory, or flashier looks. That’s your call.

          But trying to argue the acoustics are a wash between the two is a fool's errand.

        • clone
        • 9 years ago

        I accept/concede your point, Damage, but in the end I disagree: while the AMD card's noise is bad, it can be fixed quite easily, whereas none of the GTX 590's shortcomings can.

      • can-a-tuna
      • 9 years ago

      He is similar to Tom's “Tom”. I've been bitchin' about that, but in the end it doesn't help, so why even bother. They have chosen their path. AMD must be substantially better to break Nvidia's force field. Being just a little better is not enough.

    • adam1378
    • 9 years ago

    SLI on a stick, I like that. What would you call Crossfire?

      • KeillRandor
      • 9 years ago

      Let's see – (Breaks out the dictionary… Looks at words beginning cr… (To get the best match?))

      Choices, choices…

      Crossfire in a cradle…
      Crossfire in a crate…
      Crossfire on a crayon… (Hey, it’s a (coloured, pointy) stick, right?)
      Crossfire on a creeper…
      Crossfire in a crib…
      Crossfire in (on?) a crocodile…
      Crossfire in a crypt…

      Whadd’ya think?

        • Meadows
        • 9 years ago

        You want a word that suggests simplicity, not alliteration.

          • KeillRandor
          • 9 years ago

          I was aiming for a little humour, lol :p

      • Damage
      • 9 years ago

      CrossFire on the cob, of course

      • willyolio
      • 9 years ago

      You forgot “card”?

    • Myrmecophagavir
    • 9 years ago

    “Interestingly, you’ll only find GTX 590 cards from Asus and EVGA available for sale in North America. In other parts of the world, the 590 will be exclusive to other Nvidia partners.”

    I read this as though the Asus and EVGA models would only be available in the US. But the Asus one is at all the UK e-tail sites today, and their advertising for it was showing at the places I visited this week too.

    Did you perhaps mean that in the US you’ll only find those two brands available, but there’s no such restriction elsewhere?

      • Damage
      • 9 years ago

      No. There are only certain partners selling the card in each region, as I understand it. The globe is carved up into enough regional markets that a board vendor may have a number of them assigned to it, but there won’t be a full slate of brands offering cards in any one place.

        • Myrmecophagavir
        • 9 years ago

        OK. Well, the Asus model is definitely available in the UK anyway 🙂

    • clone
    • 9 years ago

    The real tragedy of the GTX 590 isn't the heat or noise but its power draw being so high. Apparently the overclocked cards have already started failing, and it's only been 24 hours. Add that the card comes with less RAM than the HD 6990 by default, and that its performance is merely comparable, and the list of reasons this card is a disappointment keeps growing… not a failure, but definitely weaksauce.

    Nvidia would have been better off sticking with the 580 and claiming they don't need two GPUs on one card to compete; by releasing the 590 they highlight Fermi's shortcomings.

    Go AMD and install a more efficient cooler, or wait for an OEM to offer something better for minimal cost. I can find no reason to choose the GTX 590 over the HD 6990: the GTX 590's quieter running counts for little at this price point and is badly offset by the lack of RAM, the excessive power consumption, and the bad PR being generated across the web over premature card failures when overclocked.

    • MrJP
    • 9 years ago

    The 590 certainly looks like a better all-round package than the 6990, but with two caveats:

    1. How close to the noise/thermal performance of the 590 could the 6990 get with a less aggressive fan profile?
    2. If your motherboard can accommodate it, 6950 Crossfire is a much better choice than either, especially if the 6950s can be flashed to 6970 specification.

    • HisDivineShadow
    • 9 years ago

    I think the 6990 is for the overclocker who will replace the cooling solution. I think AMD might as well have sold that card without a cooler for a discount or with a watercooler-ready heatsink. The noise generated by it is simply intolerable to my ears.

    I think the 590 is for the overclocker who will use the included solution. nVidia couldn't make it faster, so they just made it better as sold, for my purposes. Not that *I*'m the consumer they're targeting with EITHER card. I imagine the user of a card like this has a shoebox computer with only one PCIe slot: a small case with an intake near the top and two exhausts at the front and back. The smaller size and lower noise level both suit such a situation better than the 6990 in its default config.

    If I had $700-750 for this, though, and a regular-sized desktop, I'd be skipping both the 6990 and the 590. I'd just go 580 SLI for domination or 6970 CF for the price-to-performance ratio. Plus, they're easier to cool.

    The worst part of it is that the 6900 series IS the better bet for the long term. Its performance is scaling the best in CF, it's got the 2GB of video memory that makes it better for the future games that will need ever more memory, and its pricing is rather good. Just with the 6990 specifically, that noise level is unacceptable in our post-FX-blower world. We've evolved beyond the Black Label Deltas of yesteryear. Or at least I have.

    • MrJP
    • 9 years ago

    As usual a good review, but why the change in policy on overclocking?

    In previous articles we've seen overclocked cards (usually Nvidia) tested against stock-speed cards (usually AMD), on the basis that the overclocked cards are freely available in a box with a warranty at that speed. I'm perfectly happy with that reasoning, and the follow-on logic that you don't then artificially overclock the stock-speed cards all the way through the benchmarks to compensate. I really don't understand why other people have complained about this previously.

    Now you’re comparing a card that has a built-in overclocking feature (it even has its own switch!) that is supported under warranty by at least one of the card vendors (XFX have stated that AUSUM mode is covered under warranty) with one that doesn’t, and then artificially overclocking the card with no in-built overclocking feature to compensate.

    I laughed at the WICKED designation, but I still think that this is not entirely apples-vs-apples testing and not in line with your previous policy. Why was this not just in a separate page at the end with overclocking results in a few selected benchmarks as has been done previously? Given the differences seen in the power usage plots, why not overclock the 6990 to the same power level to see how that performs? (apart from not being able to stand the noise, of course…)

    Sorry if this comes over as complaining (I still think it’s a good review), but I’m just curious about what motivated the change in approach.

      • Airmantharp
      • 9 years ago

      I think they've just responded to AMD's change in posture (the switch AMD added to its card) in order to keep the review fair. Cards overclocked at retail are not the same, as they are their own retail product at that point. Scott is just trying to make sure the GTX 590 is given some consideration for overclocking headroom in view of AMD's extra feature.

      • Damage
      • 9 years ago

      I think you are mixing up real overclocking, where the user acts to raise the clock speeds and runs the product at a higher-than-stock, non-guaranteed configuration, and “overclocking,” where the marketing department at a video card company labels a fully validated, stock-clock config that’s higher than the GPU vendor’s baseline as “overclocked.”

      I see a bright and shining dividing line between the two, even with AMD’s AUSUM switch present.

      AMD's uber mode requires user intervention to engage, is not guaranteed to work, and clearly violates the PCIe power spec with its higher PowerTune limits. It is, for all intents, real overclocking, not just the marketing variety. Using the AUSUM switch may not invalidate your video card warranty, but then, software-based overclocking may not, either, depending on the vendor.

      Clearly, some folks have trouble seeing the difference between true overclocking and marketing-driven “overclocking.” And yes, the lines do seem to get blurry when a GPU maker offers a wink and a nod about it. Still, our definition of these things, and our practice, has been consistent. We’ll try to keep explaining that from time to time.
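      For reference, here's a minimal sketch of the in-spec power budget that comment alludes to, using the standard PCIe connector ratings (75W from the x16 slot, 75W per 6-pin, 150W per 8-pin); both of these duallies draw from the slot plus two 8-pin connectors:

      ```python
      SLOT_W = 75        # PCIe x16 slot
      SIX_PIN_W = 75     # 6-pin auxiliary connector
      EIGHT_PIN_W = 150  # 8-pin auxiliary connector

      def in_spec_ceiling(six_pins=0, eight_pins=0):
          """In-spec power ceiling for a card with the given aux connectors."""
          return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

      # Both the HD 6990 and the GTX 590 use dual 8-pin connectors.
      print(in_spec_ceiling(eight_pins=2))  # 375 W; AUSUM's raised PowerTune cap goes past this
      ```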

        • novv
        • 9 years ago

        I totally agree with MrJP about overclocking the GTX 590. You cannot guarantee that this OC, presented as WICKED in the review, is totally stable for every card sold by nVidia and in every system configuration. You say in the review that it's not fair to compare the HD 6990 in AUSUM mode against a non-overclocked GTX 590. By that logic, it's also "not fair" that AMD built all that stuff into its card while nVidia didn't put a dual BIOS and an enthusiast configuration mode on theirs, "not fair" that the drivers shipped with the cards don't function correctly with OCP/OVP, and "not fair" that the price isn't $699 (anywhere I looked). In the end, those who choose the GTX 590 and OC it should be very careful (http://www.youtube.com/watch?v=sRo-1VFMcbc&feature=player_embedded)!!!

        • MrJP
        • 9 years ago

        No, I think I have a pretty clear idea of the difference between manufacturer-endorsed, warrantied "overclocking" and user-activated, non-warrantied overclocking. The bright and shining line, as you put it. The mix-up is that I thought the 6990 AUSUM was on the manufacturer-endorsed side of the line, and the 590 WICKED was not. Clearly your view is different, and you might well be right, since AUSUM (I can't believe I keep having to type that) is certainly in a bit of a grey area. In fairness, now I look back, I see that you made this view clear in the 6990 review. (Incidentally, aren't both the 6990 and 590 outside the PCIe spec anyway, even before any overclocking?)

        But even then, you have moved to presenting what you class as user-overclocked results in the main meat of the review; you just did it one review earlier than I'd thought. As long as they're always clearly presented as such, perhaps that's fine, but it does represent a shift away from the position you'd previously defended strongly (and convincingly) of only putting results for genuinely available products in the main body of the review, and keeping overclocking in a separate section.

        Maybe this is just a one-off (two-off?) given the unusual nature of these particular cards, but I do think this might ignite a lot of the fanboy-vs-fanboy arguments and groundless accusations of bias that the previous position sought to avoid. Fundamentally it’s difficult to have a completely fair comparison when you’re determining the clock speed of one of the products yourself (the YMMV problem). Maybe you just missed the more entertaining discussions in the comments?

          • Damage
          • 9 years ago

          Obviously, the inclusion of the (truly) overclocked AUSUM config in the 6990 review was prompted by AMD’s quasi-endorsement of this configuration, which went as far as including switchable firmware. Once we’d tested it, doing so with the GTX 590 only seemed fair, however clumsy our attempt may have been.

          The 6990 review was not the first time we did such a thing, though. There wasn’t a switch, but AMD recommended a similar overclocked mode for the Radeon HD 5970 at its launch, and we tested that, too.

          You’re right that we’re inviting fanboy rage by trying to be fair to whatever side they’re not on. Hazard of the job, but also nothing new.

            • sweatshopking
            • 9 years ago

            I must say, Scott, whilst I normally don't get involved in reviewing your reviews, this does seem to be a different approach than you've had in the past. The 6990 and the 590 are two totally different approaches. This is without a doubt a break from your '460 FTW' policy. I don't care, as I'd like to see how it overclocks, but it does come across as a change in policy.

            Did Nvidia recommend an overclock on this product, as ATI did with the 5970?

            • MrJP
            • 9 years ago

            Fair enough. It’s more work for you and more information for us, so it would seem churlish to debate any further. I think you might secretly enjoy the fanboy rage anyway.

            Just one final suggestion: perhaps the 6990 “A” and 590 “W” (I’m not going the whole hog anymore) could be coloured pale red and pale green with grey text on the price/performance scatter plot since those performance levels aren’t guaranteed at those prices. Otherwise you’ll need to add points for 6950s flashed to 6970 speeds, etc.

    • slaimus
    • 9 years ago

    One thing the 590 has on the 6990 is a much smaller PCI-E bridge chip, which makes the length more manageable. Otherwise, the 6990 looks like a better engineered product with digital PWM and switchable BIOS.

    • shank15217
    • 9 years ago

    LOL, this is just getting ridiculous; now we just need that Lucid Hydra chip to put the AUSUM and the WICKED in one box…

      • anotherengineer
      • 9 years ago

      I believe it would be called.

      WICKED AUSUM

        😉

        • Meadows
        • 9 years ago

        I believe your legal name is Captain Obvious.

          • sweatshopking
          • 9 years ago

          And my name is SEX MACHINE!

    • Spotpuff
    • 9 years ago

    Is anyone else confused that AMD offers HDMI and DP (or 6x DP on their higher end cards) while Nvidia is offering… triple DVI + DP?

      • Airmantharp
      • 9 years ago

      AMD one-upped them on architecture development: when AMD was doing this in secret, Nvidia was building the GF100 we're using today. We'll need more than a refresh before Nvidia closes this gap.

    • gbcrush
    • 9 years ago

    WICKET!

    ..er WICKED!

    ROR!

    Thanks for the review, Damage. It answered a couple of curiosities I had about a pair of cards I'll never buy. But I came for all the TR goodness, not to spend my money. 🙂

    • shark195
    • 9 years ago

    I think each card is faster in its own niche; in acoustics, for example, the 590 wins. But power is the main issue to tackle as days go by: your bill will certainly go up, while you don't pay anything for the noise. So in that sense AMD STILL has the fastest single graphics card on the planet.
    AMD is still the winner whichever way you look at it, though

      • Veerappan
      • 9 years ago

      My sanity has a price, and therefore high noise levels aren’t acceptable.

        • Airmantharp
        • 9 years ago

        I'm with you, brother. Nvidia has made a point of building quieter cards for the power drawn, at every level, for the past few generations, and that's something I admire.

          • sweatshopking
          • 9 years ago

            But at the cost of heat? Are there any Nvidia partners that still offer double lifetime warranties? XFX is Radeon-only. EVGA is lifetime/10-year. I don't want a furnace. My BFG 8800 GT ran at 90 DEGREES CELSIUS. It was insane!

            • Airmantharp
            • 9 years ago

            You do realize that heat = power draw; what temperature the card runs at means very, very little. Oh no, 90C! 100C! But my card runs at 70C, it must be better!

            Or not.

            These cards are specified to run at specific temperature ranges, and their coolers are designed to keep them there. As long as they operate in range, they're good; to whine about how 'hot' a card runs is moronic.

            My point is that Nvidia cards have had both higher TDPs and lower noise profiles across the board. Heck, even the GTX 480, as loud as it was, would have been much louder if AMD had built the cooler for it (meaning, if that cooler had the same TDP-to-noise ratio that AMD's coolers do, given the GTX 480's much higher TDP).

            I know Nvidia cards use more power for the performance they give, and this has been debated for at least a year, since the final fate of GF100 was known. They made the bet that they could make one massive chip to serve both the top enthusiast markets and the GPGPU markets their Tesla products are destined for, and a broken 40nm process bit them in the tail; even the capped stock performance of this GTX 590 is a result of that.
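            To make the heat-vs-temperature point concrete, here's a toy steady-state model (my own illustration, with made-up numbers): the heat a card dumps into your case equals its power draw, while the core temperature it "runs at" mostly reflects the cooler's thermal resistance.

            ```python
            def core_temp(power_w, theta_c_per_w, ambient_c=25.0):
                """Steady-state core temp: ambient plus power times thermal resistance."""
                return ambient_c + power_w * theta_c_per_w

            # Hypothetical cards: the hotter-running one dumps *less* heat overall.
            card_a = core_temp(power_w=120, theta_c_per_w=0.54)  # weak cooler
            card_b = core_temp(power_w=250, theta_c_per_w=0.18)  # beefy cooler

            print(f"Card A: 120 W into the case, core at {card_a:.0f} C")  # ~90 C
            print(f"Card B: 250 W into the case, core at {card_b:.0f} C")  # ~70 C
            ```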

            • sweatshopking
            • 9 years ago

            I'm aware of that; I just don't like having a heater. I'd rather my chip ran cooler. I've had a few fail, and I figure heat might have had something to do with it.

            • Airmantharp
            • 9 years ago

            If it failed due to heat, either it was defective (or poorly built), or you failed to keep the dust out of it and it couldn’t save itself if it wanted to.

            The temperature measured at the core has nothing to do with it, and nothing to do with how hot the card as a whole really is. It can only really be compared across cards of the same type, not different cards from the same vendor, and certainly not cards from different vendors.

            • sweatshopking
            • 9 years ago

            I understand that; however, there is such a thing as simply being too hot. The card was cleaned, but the fan was insufficient. BFG ended up doing a BIOS fix to turn fan speeds up, but it was too late.

      • tejas84
      • 9 years ago

      AMD is the winner by a long chalk but all these review sites somehow say that Nvidia won.

      Reality check people!

    • 5150
    • 9 years ago

    We need some new articles on stuff that isn't practical, like a RAID card roundup showing their performance with different HDDs and SSDs. That oughta keep you busy.

      • Veerappan
      • 9 years ago

      I would actually like to see a round-up of low-end hardware RAID cards that use SATA drives.

      I picked up an Areca 1200 last year for my desktop (dual 640GB WD in RAID 1). I was sick of the chipset RAID corrupting partitions when switching between my Linux and Windows installs. The Areca card has drivers for both OSes and has been trouble-free since installation.

    • Arclight
    • 9 years ago

    "The 6990 has the most. Could it be that it's still not the best?"

    Stock noise and PCB size are kinda pointless concerns at this price point. Even people who buy lesser cards like the GTX 580 still install water blocks, so why would they go about it differently with these current-sucking, ice-melting vampires? I'd see a lot more reason to install custom coolers on these cards than on cheaper single-GPU cards. If I had the money, I'd go for performance. The GTX 590's 52 dB under load ain't exactly silent either, mind you.

    • MaxTheLimit
    • 9 years ago

    The load power consumption surprised me. I thought Nvidia would have worked a bit more power-consumption magic.
    I wouldn't buy either card: wouldn't buy the 6990 for the noise, and wouldn't buy the 590 for the power draw.
    Then again, I wouldn't spend 700 bucks or more on a video card.

    • phez
    • 9 years ago

    560 SLI results are pretty good.

      • Airmantharp
      • 9 years ago

      They're actually horrific. The 560 SLI has shown again in a review that its minimum framerates dip far too low, out of proportion with other cards in its range. I love the card's performance-vs-price ratio in single configuration and can't recommend anything else, but for SLI it's GTX 570 minimum for me.

    • SARS
    • 9 years ago

    Interesting card… Really like the cooling design and dB values.

    Unfortunately for NVIDIA, ATI continues to maintain the performance crown. For that matter, ATI has held the performance crown since the release of the 5870 in September 2009.

    Could we really be looking at 2+ years of ATI performance leadership before 28nm products ship?

    After the 2900XT fiasco, I didn’t think I’d ever see that… Strange.

      • sweatshopking
      • 9 years ago

      wonder what the 7k series will bring…

      • LawrenceofArabia
      • 9 years ago

      I think it's likely less important for Nvidia to try to grab the performance crown with a crazy card than to put out a more acceptable consumer solution. Nvidia still maintains a following of SLI users (I imagine due to better scaling than AMD, until very recently) who spend big bucks every year or two on the latest SLI config. But the key there is that these users aren't usually the crazy watercoolers with 5GHz overclocks on their CPUs, which the 6990 seems to be targeted at. Nvidia's high-end SLI clientele are more typical consumers, with more typical concerns about noise and compatibility. It simply made more sense to put out a more sane card.

        • novv
        • 9 years ago

        I don't think "typical consumers" buy $700+ graphics cards!!!

        • sweatshopking
        • 9 years ago

        I've had both SLI and Crossfire, though I haven't tried either past the 8K/4K series. I'm not sure that at those points it would be worth it. MGPU setups are still less than 2%

          • anotherengineer
          • 9 years ago

          I thought you were poor and didn't really have a video card until your bro gave you that single 4850?!?!

          Did you Crossfire that 4850 with your old Rage Pro?!?!

            • sweatshopking
            • 9 years ago

            No, I had an 8800 GT SLI rig that I had to sell when I got laid off from Nabors, before I moved to NS. Then, once I got my new rig, I had a 4850, and my bro had a 4850 X2 and a 4850.

            I had the Rage Pro between those systems. My new system cost me absolutely nothing, excepting the screen: I got a 22-inch 1080p monitor for $100 with free shipping from memoryexpress.com

    • anotherengineer
    • 9 years ago

    Beastly.

    But still happy with my HD 4850.

      • xeridea
      • 9 years ago

      My 4850 serves me well also, along with my dual core. Waiting to see what Bulldozer brings though.

      • glacius555
      • 9 years ago

      Sold mine two weeks ago, for approx. 60 bucks.

      • Mikael33
      • 9 years ago

      Same here; I'm looking to buy a 6870 or 6850 to replace it in a bit. I finally upgraded my S939 box to a 3.7GHz (OCed) dual-core (was hoping it would unlock) Phenom II-based system, so my 4850 is holding me way back; it sticks out like a sore thumb since I have quite decent hardware other than the 4850.
