Zotac’s GeForce GTX 580 AMP²! graphics card

We are a ruthlessly pragmatic lot here at TR, with our value scatter plots and an ever-present emphasis on the value equation in any piece of gear we review. Every once in a while, though, we’ll run across a product that strikes a different, resonant chord in the PC enthusiast’s quantized heart. Rationally, we may know it’s not necessarily the practical consumer’s value leader, but it stirs a part of us that runs deeper and must sometimes be restrained by those pesky filters.

The subject of our attention today, Zotac’s GeForce GTX 580 AMP²!, is such a creature—at least for me. I’m not just talking about the fact that I might have to be restrained if I ever come into proximity with the dude who chose this thing’s name, either. No, despite the name, the product itself reminds me of why I got into this crazy hobby of screwing together my own computers.

The concept is simple enough. The AMP²! is a GeForce GTX 580 card that’s been customized for additional awesomeness in several obvious ways.

First and foremost, there’s that enormous triple-slot cooler. We liked it when it rode atop Zotac’s amped-up version of the GeForce GTX 480, except that it was a GeForce GTX 480. As a product, the GeForce GTX 580 is cooler in multiple senses of the term.

This GPU cooler, wider than the federal budget deficit and sporting more pipes than a Detroit crack house, is apparently made by Zalman, Zotac’s end-of-the-alphabet compatriots. It looks to be very similar to the VF3000F, except for the lack of minty-green gaudiness. You can purchase the VF3000F separately for 75 bucks, but Zotac has had the good sense to make a custom, jet-black version of it standard equipment on the AMP²! This thing’s array of five heat pipes, twin fans, and beaucoup aluminum fins will occupy two adjacent slots in your PC, for a total of three eaten up, making the possibility of SLI configurations contingent on your motherboard’s spacing of PCIe x16 plugs. Fortunately, the thing doesn’t weigh much, so the extra leverage shouldn’t translate into excessive torque on an expansion slot.

The conspicuous question about this cooler, of course, is its effectiveness. I don’t want to give away too much, but you can read subheads, right?

The second custom touch on the AMP²! is a common one: higher clock speeds. The plain-vanilla baseline core and memory clocks for the GeForce GTX 580 are 772MHz and 1000MHz, respectively. (That makes for 4000 MT/s memory, since it’s GDDR5.) By contrast, the GPU on this marvel of modern technology runs at 815MHz, and the memory ticks along at 1026MHz—increases in GPU and RAM speeds of 5.6% and 2.6%, respectively.
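For those keeping score at home, the arithmetic behind those percentages is simple enough to sketch, along with the resulting memory bandwidth (assuming the GTX 580’s standard 384-bit memory interface, which the AMP²! retains):

```python
# Back-of-the-envelope math for the AMP²!'s clock bumps.
stock_gpu_mhz, amp_gpu_mhz = 772, 815
stock_mem_mhz, amp_mem_mhz = 1000, 1026

gpu_gain_pct = (amp_gpu_mhz / stock_gpu_mhz - 1) * 100   # ~5.6%
mem_gain_pct = (amp_mem_mhz / stock_mem_mhz - 1) * 100   # ~2.6%

# GDDR5 moves four bits per pin per clock, so the base clock
# quadruples into the effective transfer rate.
transfer_mts = amp_mem_mhz * 4                           # 4104 MT/s

# Bandwidth = transfer rate x bus width; 384 bits = 48 bytes per transfer.
bandwidth_gbs = transfer_mts * 1e6 * (384 // 8) / 1e9    # ~197 GB/s

print(f"GPU +{gpu_gain_pct:.1f}%, memory +{mem_gain_pct:.1f}%")
print(f"{transfer_mts} MT/s, ~{bandwidth_gbs:.0f} GB/s")
```

The same math puts a stock GTX 580 at roughly 192 GB/s, so the memory bump is modest indeed.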

I know, I know, you’re on the verge of losing bladder control, so stark are the changes.

Zotac’s third modification is more notable, because it involves a true change to the hardware. Camped out beneath that massive cooler, hiding under some low-profile black heatsinks, are GDDR5 memory chips with a total capacity of 3GB, double what you’d find aboard most GTX 580s. We’ve seen video cards with more RAM, monsters like the Radeon HD 6990 or the GeForce GTX 590, but those products split their DRAM chips between two GPUs, giving them a functional memory capacity that’s half the advertised total. The AMP²!, however, dedicates all 3GB to a single, massive GPU, giving that one chip more room to operate than anything else on the market, as far as we know.

Does that, you know, help? Tough to say. We’ve not run into a case where a standard GeForce GTX 580 appeared to be running low on memory, even when tested in SLI (which has some extra memory overhead) on a four-megapixel display. Going beyond 1.5GB may simply be excessive. However, game requirements are rising over time, and two GTX 580s in SLI can drive multiple displays, potentially even a triple array of four-megapixel panels. Bumping up against memory limitations in such a configuration seems quite possible, though we’ve not tested it ourselves for lack of, say, GlobalFoundries-sized funding.

The trio of enhancements Zotac has brought to the GTX 580 makes the AMP²! substantially more appealing than your run-of-the-mill card. To further sweeten the pot, Zotac has included a coupon for a free digital download of Assassin’s Creed: Brotherhood, which is about as good as it gets for a bundled game. Not only does it have solid ratings on MetaCritic, but it’s also easily and frequently abbreviated as Assbro. So you can tell your friends you’re “Just hanging out, playing some Assbro.” Assbro currently sells for around 50 bucks, but I’d pay more just to be able to say that.

Zotac has ticked the other requisite boxes, too, including a “limited lifetime” warranty (which unfortunately requires registration within 30 days or turns into a pumpkin) and the obligatory DVD stuffed with potentially useful trialware. The total package is slated to sell for about $560 at online retailers, and Zotac tells us it should be in stock at Newegg any day now.

But how does it perform, I hear you asking? This is a hardware review, so we need multiple pages of bar graphs to demonstrate the 2.2% to 5% increase over the stock GTX 580, right? Relax, we have you covered, right after we tell you a brief, cautionary tale. Our first sample of the GTX 580 AMP²! had a really bad habit: it crashed, a lot. We gathered up all of the info we could about the issue and sent it off to Zotac, expecting to hear back that we had a single, defective unit. What we found out, instead, is that the entire first wave of AMP²! press samples had a power delivery problem. Zotac quickly shipped us a replacement unit with new PWM programming, and the new card hasn’t crashed on us once. Zotac tells us only press samples were affected, but if you somehow got your hands on one of these cards early and your system is doing the BSOD dance more often than usual, I’d recommend placing a support call ASAP.

We’d expect any card you can buy from today forward to be as solid as Bob Seger, just like our second review unit.

Our testing methods

As ever, we did our best to deliver clean benchmark numbers. Tests were run at least three times, and we’ve reported the median result.

Our test systems were configured like so:

Processor: Core i7-980X
Motherboard: Gigabyte EX58-UD5
North bridge: X58 IOH
South bridge: ICH10R
Memory size: 12GB (6 DIMMs)
Memory type: Corsair Dominator CMD12GX3M6A1600C8 DDR3 SDRAM at 1600MHz
Memory timings: 8-8-8-24 2T
Chipset drivers: INF update 9.1.1.1025, Rapid Storage Technology 9.6.0.1014
Audio: Integrated ICH10R/ALC889A with Realtek R2.58 drivers
Graphics:
  • Dual Radeon HD 6870 1GB with Catalyst 11.4 preview drivers
  • Radeon HD 5970 2GB with Catalyst 11.4 preview drivers
  • Dual Radeon HD 6950 2GB with Catalyst 11.4 preview drivers
  • Radeon HD 6970 2GB with Catalyst 11.4 preview drivers
  • Dual Radeon HD 6970 2GB with Catalyst 11.4 preview drivers
  • Radeon HD 6990 4GB with Catalyst 11.4 preview drivers
  • MSI GeForce GTX 560 Ti Twin Frozr II 1GB + Asus GeForce GTX 560 Ti DirectCU II TOP 1GB with ForceWare 267.26 beta drivers
  • Zotac GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 570 1280MB + GeForce GTX 570 1280MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 580 1536MB + Asus GeForce GTX 580 1536MB with ForceWare 267.24 beta drivers
  • Zotac GeForce GTX 580 AMP²! 3072MB with ForceWare 270.51 beta drivers
  • GeForce GTX 590 3GB with ForceWare 267.71 beta drivers
Hard drive: WD RE3 WD1002FBYS 1TB SATA
Power supply: PC Power & Cooling Silencer 750W
OS: Windows 7 Ultimate x64 Edition, Service Pack 1

Thanks to Intel, Corsair, Western Digital, Gigabyte, and PC Power & Cooling for helping to outfit our test rigs with some of the finest hardware available. AMD, Nvidia, and the makers of the various products supplied the graphics cards for testing, as well.

Unless otherwise specified, image quality settings for the graphics cards were left at the control panel defaults. Vertical refresh sync (vsync) was disabled for all tests.

We used the following test applications:

Some further notes on our methods:

  • Many of our performance tests are scripted and repeatable, but for some of the games, including Battlefield: Bad Company 2 and Bulletstorm, we used the Fraps utility to record frame rates while playing a 60- or 90-second sequence from the game. Although capturing frame rates while playing isn’t precisely repeatable, we tried to make each run as similar as possible to all of the others. We raised our sample size, testing each Fraps sequence five times per video card, in order to counteract any variability. We’ve included second-by-second frame rate results from Fraps for those games, and in that case, you’re seeing the results from a single, representative pass through the test sequence.
  • We measured total system power consumption at the wall socket using a Yokogawa WT210 digital power meter. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. The cards were plugged into a motherboard on an open test bench.

    The idle measurements were taken at the Windows desktop with the Aero theme enabled. The cards were tested under load running Battlefield: Bad Company 2 at a 2560×1600 resolution with 4X AA and 16X anisotropic filtering. We test power with BC2 because we think it’s a solidly representative peak gaming workload.

  • We measured noise levels on our test system, sitting on an open test bench, using an Extech 407738 digital sound level meter. The meter was mounted on a tripod approximately 10″ from the test system at a height even with the top of the video card.

    You can think of these noise level measurements much like our system power consumption tests, because the entire system’s noise levels were measured. Of course, noise levels will vary greatly in the real world along with the acoustic properties of the PC enclosure used, whether the enclosure provides adequate cooling to avoid a card’s highest fan speeds, placement of the enclosure in the room, and a whole range of other variables. These results should give a reasonably good picture of comparative fan noise, though.

  • We used GPU-Z to log GPU temperatures during our load testing.

The tests and methods we employ are generally publicly available and reproducible. If you have questions about our methods, hit our forums to talk with us about them.

Bulletstorm

This game is stressful enough on a GPU to still make a decent candidate for testing cards of this type. We turned up all of the game’s image quality settings to their peaks and enabled 8X antialiasing, and then we tested in 90-second gameplay chunks.

Notice that, although the GTX 580 AMP²! is an impressively expensive video card, we’ve put it up against some even pricier competition. That’s because we’re borrowing the results from our GeForce GTX 590 review, and those include lots of SLI and CrossFire multi-GPU setups. In this company, the AMP²! is about middle of the pack, price-wise.

Sadly, though, it’s not even the fastest single-GPU solution we tested in Bulletstorm. The Radeon HD 6970 chats up Trishka even better than the GTX 580, and the AMP²! averages a fraction of a frame per second less than the stock 580 here, likely due to my own shaky hand guiding the proceedings.

F1 2010
F1 2010 steps in and replaces Codemasters’ previous effort, DiRT 2, as our racing game of choice. F1 2010 uses DirectX 11 to enhance image quality in a few select ways. A higher-quality FP16 render target improves the game’s high-dynamic-range lighting in DX11. A DX11 pixel shader is used to produce soft shadow edges, and a DX11 compute shader is used for higher-quality Gaussian blurs in HDR bloom, lens flares, and the like.

We used this game’s built-in benchmarking facility to script tests at multiple resolutions, always using the “Ultra” quality preset and 8X multisampled antialiasing.

You wonder about running out of memory? Have a look at the dual-1GB configurations in that final chart above, the Radeon HD 6870 CrossFireX, the Radeon HD 5970, and the GeForce GTX 560 Ti SLI. Their minimum frame rates at 2560×1600 are markedly lower than everything else. The AMP²!, meanwhile, just nips the 6970 to take the lead among single-GPU configs.

Civilization V
Civ V has a bunch of interesting built-in tests. Up first is a compute shader benchmark, which measures the GPU’s ability to decompress textures used for the graphically detailed leader characters depicted in the game. The decompression routine is based on a DirectX 11 compute shader. The benchmark reports individual results for a long list of leaders; we’ve averaged those scores to give you the results you see below.

Here’s an example where multi-GPU solutions don’t scale particularly well. As a result, the AMP²! comes out near the top of the heap, with its Fermi architecture showing off its GPU-computing mojo and unified 768KB L2 cache.

In addition to the compute shader test, Civ V has several other built-in benchmarks, including two we think are useful for testing video cards. One of them concentrates on the world leaders presented in the game, which is interesting because the game’s developers have spent quite a bit of effort on generating very high quality images in those scenes, complete with some rather convincing material shaders to accent the hair, clothes, and skin of the characters. This benchmark isn’t necessarily representative of Civ V‘s core gameplay, but it does measure performance in one of the most graphically striking parts of the game. As with the earlier compute shader test, we chose to average the results from the individual leaders.

The sheer FLOPS throughput of AMD’s Cayman chip gives the 6970 the advantage here, although AMD’s power-limiting scheme has clamped the Radeon HD 6900-series cards’ performance somewhat. The 5970 has no such limitation.

Another benchmark in Civ V focuses, rightly, on the most taxing part of the core gameplay, when you’re deep into a map and have hundreds of units and structures populating the space. This is when an underpowered GPU can slow down and cause the game to run poorly. This test outputs a generic score that can be a little hard to interpret, so we’ve converted the results into frames per second to make them more readable.

Our theory is that this actual, in-game test involves lots and lots of units with very fine geometry, which may be why the GeForces perform especially well. Nvidia’s Fermi architecture packs gobs of geometry processing horsepower. Whatever the reason, though, the AMP²! easily outruns the 6970.

Battlefield: Bad Company 2
BC2 uses DirectX 11, but according to this interview, DX11 is mainly used to speed up soft shadow filtering. The DirectX 10 rendering path produces the same images.

We turned up nearly all of the image quality settings in the game. Our test sessions took place in the first 60 seconds of the “Heart of Darkness” level.

As these results from different games pile up, the GTX 580 is slowly establishing itself as the fastest single-GPU solution available. The AMP²! is consistently a sliver ahead of the stock GTX 580, too.

With that said, cheaper multi-GPU options like dual GTX 560 Ti cards or dual Radeon HD 6870s clearly outperform the AMP²!

Metro 2033

We decided to test Metro 2033 at multiple image quality levels rather than multiple resolutions, because there’s quite a bit of opportunity to burden these graphics cards simply using this game’s more complex shader effects. We used three different quality presets built into the game’s benchmark utility, with the performance-destroying advanced depth-of-field shader disabled and tessellation enabled in each case.

Something something 6970, something something 560 Ti SLI. Yeah, same story, folks.

Aliens vs. Predator
AvP uses several DirectX 11 features to improve image quality and performance, including tessellation, advanced shadow sampling, and DX11-enhanced multisampled anti-aliasing. Naturally, we were pleased when the game’s developers put together an easily scriptable benchmark tool. This benchmark cycles through a range of scenes in the game, including one spot where a horde of tessellated aliens comes crawling down the floor, ceiling, and walls of a corridor.

For these tests, we turned up all of the image quality options to the max, along with 4X antialiasing and 16X anisotropic filtering.

These results provide a little more reinforcement for our gathering consensus.

Power consumption

Although the GTX 580 is based on a really frickin’ big GPU, its power consumption is generally pretty reasonable, especially at idle. The AMP²! doesn’t add much power draw versus the stock 580, either, despite its higher clock speeds. One clear weakness is apparent: with a Radeon HD 6970 installed, our test system requires over 60W less than it does with the AMP²!.

Noise levels and GPU temperatures

Yeah, we saved ’em for the end, but those last two bar charts are pretty much the clincher for the GTX 580 AMP²! This card is the quietest solution we tested while running a game, measurably below the noise levels produced by the stock GTX 580 and the Radeon HD 6970. At the same time, Zotac’s custom cooler keeps the AMP²!’s GPU running nearly 20°C below the rest of the pack. I’m smitten with this outcome, but even more so with the subjective experience behind it: the AMP²! lets you play games effortlessly, fluidly, at just a whisper.

Conclusions

You really shouldn’t buy Zotac’s GTX 580 AMP²! Yes, the Radeon HD 6970 is a little bit slower and a little bit noisier, but it’s not that far behind on either front—and the 6970 costs more than $200 less. Plus, the 6970 draws less power and sends its hot air exclusively out of the expansion slots, reducing its likely contribution to overall system noise. If you can tolerate more noise and power draw, pairing a couple of GeForce GTX 560 Ti or Radeon HD 6870 cards will yield even higher performance than the AMP²! at well under half a grand. Both of those options are better values, all things considered. Your rational mind should understand this.

But there’s also the undeniable fact that the AMP²! is an incredibly sweet piece of gear, a near-silent killer whose virtues are easy for the discerning enthusiast to appreciate. You may, perhaps, even be looking for a reason to justify one to yourself. Let me help you with that. At $560, the AMP²! costs only 60 bucks more than the going rate for a vanilla GeForce GTX 580. For the extra cash, you get amazing value: a better-looking and pre-installed version of an outstanding $75 cooler, twice the GDDR5 memory, higher clock frequencies, and a $50 copy of Assbro. If you can spare the slot space and are set on getting a GTX 580, the AMP²! is the one to buy.

I know. I had you at Assbro.

There’s also the fact that the GTX 580 has substantially higher geometry processing and tessellation throughput than the 6970, which may matter more for future games than it does for present ones. Combine that with the headroom provided by its 3GB of memory, and the AMP²! may be one of the most thoroughly future-proof video cards we’ve seen in ages. So it’ll still be perfectly serviceable when you decide to upgrade needlessly in two years.

Comments closed
    • liambussell
    • 8 years ago

    I run two of these cards in SLI. I don’t do much benchmarking, but I can tell you that in this test, I think your temps are even a little high. Maybe its my case (Coolermaster Haf-X) But they stay very cool even under high load. I was going to watercool, and I likely still will, but it is no longer the urgent issue I thought it would be. These coolers are that good (for air!)

    I am really happy with them. THe only complaint I have (and its not really a complaint) is that one of the fans on the card made a noise when I first installed it. When I looked, one blade of the cooler fan was clipping the housing. I loosened the screw one turn, and it was fine. However, if I move my machine, it may start up again, required a tap with my finger.

    I got mine in April and I haven’t had a BSOD since installing windows, so maybe they weren’t kidding about it only being the press ones (I hope so!)

    Also, my computer is very nearly silent, only a little noise from the case fans. If I sit my laptop next to my desktop, I’m horrified how loud the laptop is compared to the desktop.

    Excellent card, in SLI the temps are a real benefit. 9.5 out of 10

    • Stargazer
    • 8 years ago

    [quote<]Plus, the 6970 draws less power and sends its hot air exclusively out of the expansion slots, reducing its likely contribution to overall system noise.[/quote<]

    This would seem to be an excellent opportunity to see how a good non-exhausting fan solution compares to an exhausting fan solution. It is obvious that exhausting hot air is good, but how much better is it?

    The Amp gets very good results in your noise/temperature tests, but it (and any other non-exhausting gfx cards) gets an unfair advantage in your open-bench tests, since the exhaust is rendered pointless. It would thus be *very* interesting to see how it compares to other cooling solutions when used inside a case. If it still produces better noise/GPU temperature performance, then this is a benefit that can be weighed against the downside of higher in-case temperature. However, if it loses its noise/GPU temperature advantage under such circumstances, then it's essentially a piece of crap solution that only looks good under non-realistic review conditions, and should be avoided.

    So, I think it would be great if you could perform a limited sub-set of your tests inside a case (obviously it wouldn't be exactly representative of *all* cases, but at least it'd give some idea of how things change). What I'd like to see is a comparison of the 580 Amp with a standard 580 reference card (including more cards would be interesting, but the time requirements would also increase), looking at noise and GPU/CPU/(MB?) temperatures for idle and load. Such a comparison should give a pretty good idea of how good it is to have an exhausting fan solution, and would be *very* good to know when evaluating a card with a non-exhausting fan solution.

    • Bensam123
    • 8 years ago

    “There’s also the fact that the GTX 580 has substantially higher geometry processing and tesselation throughput than the 6970, which may matter more for future games than it does for present ones.”

    Sadly I think we’ll see two or three completely new GPU generations before tessellation becomes common in video games. Judging that two of the three will have released a new console by then.

    • PRIME1
    • 8 years ago

    Where do I click to register to win one of these?

    • no51
    • 8 years ago

    I think this card proves once and for all, that girth is more important than length.

    • flip-mode
    • 8 years ago

    This looks like “the” GTX 580 to get.

    Am I the only one that has begun to miss the days of mid-high end cards with single slot coolers? I probably am. Remember the 7900 GT? The X800 XL? The single slot 7800 GTX? The 8800 GT? The 9700 Pro? The HD 4850?

    We can’t even get single slot on a mid range card these days, like the GTX 450 or the HD 6850. I bet the HD 6850 could make due with a single slot cooler.

      • DancinJack
      • 8 years ago

      It’s not just you. Our appetite for cool and quiet components is rather insatiable today. Not to mention the 9800pro had ~75W TDP. A little less than the high end of today.

        • Firestarter
        • 8 years ago

        In these days of 300 watt TDPs, I’m surprised that there aren’t more 3 slot coolers that exhaust out the back.

      • SomeOtherGeek
      • 8 years ago

      Nope, you are not alone… Miss the 8800GT days too.

      • paulWTAMU
      • 8 years ago

      Definitely not alone! i’m there with you at least. These monster cards can be a PITA to fit in your case.

      • Krogoth
      • 8 years ago

      Physics is a harsh mistress.

      The amount of power that modern GPUs, GDDR chips and the power circuitry draw at load is insane. That’s the price of wanting more and more performance at such a rapid pace.

      Hell, I remember when GPUs got away with a tiny, lousy heatsink. You will only find such a setup on a bargain basement, bottom of line card.

        • PRIME1
        • 8 years ago

        I remember when CPUs had only heatsinks, TVs were black and white, mammoths roamed the earth, etc.

          • dpaus
          • 8 years ago

          …and you were young and open-minded.

          (had to!)

      • Lazier_Said
      • 8 years ago

      I remember the 9700 Pro. It had a tiny heatsink with a fan the size of an oreo that had to spin 5K to keep things cool. It whined obnoxiously, of course. Nobody had ever heard of fan profiles in 2002 so the whine never stopped. Making it tolerable cost me the second slot anyway – and voided my warranty in the process.

      I don’t miss that at all.

      • MadManOriginal
      • 8 years ago

      No, sorry, I don’t terribly miss the smallish whiney fans (not loud but irritating even when the card is idling). I had a few flavors of 8800GT, 4850s and a 9600GT and the stock coolers were never that great in one way or another. Mostly it was the irritating quality of the noise of their fans. Given the level of integration on motherboards nowadays I don’t care about using up two slots for a video card either. Heck, I’ve been on mATX for the last 3 years and don’t feel dual slot cards are any kind of problem. Slot layout *is* key though.

      • Bensam123
      • 8 years ago

      Sort of. I did till I realized how much cooler it made my system blowing hot air out the back of it rather then regurgitating it inside my case. That’s why third party coolers don’t make much sense anymore in terms of GPUs unless it’s liquid cooling.

      • BobbinThreadbare
      • 8 years ago

      Until the minute I realize that even low end cards these days are way more powerful than those high end cards.

      Sure they could make single slot high end cards, but they would just be handicapping themselves. So instead they make monsters and make slightly slower single slot cards for the [s<]grandpas[/s<] gamers such as yourself.

        • flip-mode
        • 8 years ago

        But the [s<]5770[/s<] 6770 and the 6850? The 450/550? Surely those could get along just fine with a single slot solution. Still it is very true and certainly has value that some of the dual slot coolers push the air out of the case. But those dual slot coolers that do not push the air out of the case are pretty unfortunate things, even though they often do a better job of cooling the GPU itself.

          • sweatshopking
          • 8 years ago

          there are some passively cooled single slot 5770’s.

      • Coran Fixx
      • 8 years ago

      I thought my GTX 260 was a fat pig when I put it in, now it looks like the Ally McBeal of video cards

    • indeego
    • 8 years ago

    Don’t call me an ass, dude.

      • Firestarter
      • 8 years ago

      He’s not your bro, mang.

        • ssidbroadcast
        • 8 years ago

        He’s not your mang, guy.

          • Firestarter
          • 8 years ago

          I’m not your guy, friend.

            • dpaus
            • 8 years ago

            I’m not your friend, buddy.

            • I.S.T.
            • 8 years ago

            I’m not your buddy, guy.

    • MrJP
    • 8 years ago

    This would look very bad on the usual price/performance scatter plot! But you’re right that it’s hard to put a value on a really good cooler. It’s a shame you don’t seem to get supplied with more non-standard cards from a range of manufacturers, particularly on the AMD side. It would be nice to see how a 6970 Direct CU II compares with this, for example.

    I’m glad you decided to cut the 590 WICKED from the old benchmarks, though.

    • darryl
    • 8 years ago

    As usual, the decision about which team to go with (Green Team or Red Team) hinges on what game(s) you’ll play. Is there some way to identify games or GPUs by the architecture they favor? That seems so simple; almost too simple. So it probably ain’t.
    darryl

    • CaptTomato
    • 8 years ago

    How much is the ASUS 6970 with their big assed coolers, and why no aftermarket spec-ed AMD’s from MSI etc?

      • Lans
      • 8 years ago

      My thoughts too given those have been out for many months now. Even if they did not warrant a review of their own, why aren’t they included for comparison of power and noise at least?

        • Damage
        • 8 years ago

        This is a review of single product, not a round-up of cards with aftermarket coolers. We can’t always dedicate resources to doing every possible thing we might like, so we have to make choices, set a project scope, and stick with it. In this case, we had room for a quick review of a single card, so that is what we did. We understand everyone always wishes to see more, but that isn’t always feasible.

          • sweatshopking
          • 8 years ago

          damnit scott. review everything ever.

            • paulWTAMU
            • 8 years ago

            That begs for a “your mom” comment doesn’t it?

            I wish they could review every CPU/GPU motherboard combo ever. They don’t have anything like a budget, and I’m sure they’ve really got infinite time so that isn’t a concern either 😉

          • Firestarter
          • 8 years ago

          I think you guys made a good choice here.

        • CaptTomato
        • 8 years ago

        exactly, as they represent better value.
        The AMD’s also have 2vram, so it makes the 580 seem expensive.
        The ASUS bigassed edition 6970 2gig is $430 vs $650 for a Gainward 580 3gig in Australia.

    • vvas
    • 8 years ago

    Great review, but can we agree to drop exclamation marks from product names for the purposes of writing prose? It constantly sounded like you were super-excited about the card in question (though I don’t know, perhaps you were).

    • r00t61
    • 8 years ago

    When will we see a quadruple-slot cooler? At the rate these thermal and power numbers are creeping up, I shudder…

      • Coran Fixx
      • 8 years ago

      Yeah it will have a 2nd non electrical tab to plug into your sli slot to keep the heat sink from shredding the motherboard.

    • spigzone
    • 8 years ago

    … or buy a couple of low noise, low temp dual fan 6950’s for the same price and get 50% better performance.

    • Meadows
    • 8 years ago

    The first full page of this review is the best thing I’ve read in a long while.

    • I.S.T.
    • 8 years ago

    I’m rather surprised the additional RAM does not help at 8XMSAA and 2560×1600… Maybe it’d help with 3D Surround/Eyefinity resolutions?

      • mczak
      • 8 years ago

      Possible, but it might be too slow alone for that. In this triple-monitor review (http://www.techspot.com/review/390-triple-monitor-gaming-performance/page3.html) you can see an obvious limitation due to framebuffer size (most notably on that page), but this was at highest resolution (7680×1600…) and GTX 590 – that is a GTX 580 might not have enough muscle at these settings anyway, not to mention you couldn’t drive 3 displays with one card to achieve that resolution in the first place. Of course with SLI it would be a different story.

        • internetsandman
        • 8 years ago

        This is what I wonder, if two of these in SLI, with the combination of the increased clock speeds in comparison to the 590, the increased VRAM, and of course the massive increase in potential overclocking, if that would make a significant difference at such resolutions

        • I.S.T.
        • 8 years ago

        Thanks for the link. Really does demonstrate the VRAM limitation at highest settings.

    • Jigar
    • 8 years ago

    This review is dying to have a OC benchmarks, i am sure this cooler will help GTX 580 reach new level of performance.

      • SomeOtherGeek
      • 8 years ago

      My thoughts exactly. I was wondering why they didn’t bump the speeds up when had such low levels of temps. Maybe it was drawing too much power?

      Anyway, great review. Lol’ed a lot!

        • mczak
        • 8 years ago

        Well, the low temps are probably the reason it doesn’t draw more power than the stock GTX580 despite being clocked a bit higher. And in contrast to the GTX590, GTX580 handle OC reasonably well.
        The GTX 580 though run quite close to the 300W TDP limit of its connectors, and while it’s no problem to exceed that, possibly Zotac didn’t want to break that spec at stock. I’d agree though this card is really meant for overclocking, and it’s slightly disappointing to see only such a mild overclock at stock.

      • flip-mode
      • 8 years ago

      Does Nvidia even let you OC the 580? Didn’t they limit the overclocability to the point that its, er, pointless?

        • Meadows
        • 8 years ago

        Wasn’t that the dual-GPU GTX 590?

          • flip-mode
          • 8 years ago

          Oooh, yeah, I think it was.

            • sweatshopking
            • 8 years ago

            It was.

    • MadManOriginal
    • 8 years ago

    [quote<]Assbro currently sells for around 50 bucks, but I'd pay more just to be able to say that.[/quote<] Please, no need to give publishers a reason to make $60 a standard release price for PC games.

    • ssidbroadcast
    • 8 years ago

    [quote<]This GPU cooler, wider than the federal budget deficit and sporting more pipes than a Detroit crack house, is[/quote<] lol! Never change, Techreport!

      • aces170
      • 8 years ago

      Yep, love the refreshing change in the write-ups 🙂
