GeForce GTX 980 cards from Gigabyte and Zotac reviewed


The Gigabyte G1 Gaming takes on the Zotac AMP! Omega
— 4:36 PM on October 10, 2014

The GeForce GTX 980 is the new king of the hill among single-GPU graphics cards, and with nifty features like DSR, it looks like an awfully tempting potential purchase.

If you're feeling this particular temptation, there's probably one question on your mind: which one should I get?

The first GTX 980 cards to hit the market were based on Nvidia's reference design, with that familiar aluminum cooling shroud and blower. Demand for these cards is high, and supplies are tight. Now, however, a number of custom-designed GTX 980 cards are becoming available. Not only are they potentially more abundant, but they also promise various upgrades over the reference cards. Are they worthy of your attention? We've spent some time with a couple of slick offerings from Gigabyte and Zotac in order to find out.

Zotac's GeForce GTX 980 AMP! Omega


Pictured above is the GeForce GTX 980 AMP! Omega from the folks at Zotac. This hulking creation looks like some sort of heavy mechanized military unit. Here's how it compares to the GTX 980 reference card:

| | GPU base clock (MHz) | GPU boost clock (MHz) | GDDR5 transfer rate | Aux power ports | Length | Intro price |
| --- | --- | --- | --- | --- | --- | --- |
| Reference GeForce GTX 980 | 1126 | 1216 | 7 GT/s | 2 x 6-pin | 10.5" | $549 |
| Zotac GTX 980 AMP! Omega | 1203 | 1304 | 7 GT/s | 2 x 8-pin | 10.75" | $579 |

The Omega is bigger and beefier than the vanilla GTX 980 reference design in almost every way. Its GPU clocks are higher, it takes in more juice via dual eight-pin aux power inputs, and its price is pumped up by 30 bucks, too. About the only thing that's the same is its 4GB of GDDR5 memory, which is clocked at 7 GT/s, just like stock.
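For a sense of what that 7 GT/s figure means in practice, here's a quick back-of-the-envelope bandwidth calculation. The 256-bit memory interface width is the GTX 980's standard spec rather than anything Zotac quotes, so treat this as a rough sketch.

```python
# Rough GDDR5 bandwidth estimate for the GTX 980 (reference and Omega alike).
# Assumes the GTX 980's standard 256-bit memory interface; the 7 GT/s transfer
# rate comes from the table above.
transfer_rate_gtps = 7          # giga-transfers per second per pin
bus_width_bits = 256            # GTX 980 memory interface width

bandwidth_gbs = transfer_rate_gtps * bus_width_bits / 8
print(f"Peak memory bandwidth: {bandwidth_gbs:.0f} GB/s")   # ~224 GB/s
```

Since both cards run their memory at the same speed, any performance differences between them should come down to GPU clocks and cooling, not bandwidth.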


The most notable way that the Omega differs from the reference card, though, has gotta be its massive cooler. Zotac has a happy tradition of choosing exotic coolers for its aftermarket board designs, and this one fits the mold—or breaks it, I suppose, if the mold is conventionally sized. This thing will occupy three slots in the host PC and is 10.75" long. Beyond that, it sticks up past the top of the PCIe slot cover by about 1.25", enough that it could present clearance issues in older or smaller cases.

The oversized cooling shroud covers a pair of densely populated banks of heatsink fins fed by quad heatpipes. The twin cooling fans are positioned directly above those banks. That's an awful lot of metal and gas to situate atop a GPU with a 165W power envelope (although I doubt the Omega really honors that limitation).
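Those dual eight-pin connectors give the Omega far more power delivery headroom than the rated TDP requires. Here's the quick math, based on the standard PCI Express power limits (75W from the slot, 150W per eight-pin connector) rather than on any figure Zotac publishes.

```python
# Nominal power available to the Omega versus the GPU's rated power envelope.
# Connector limits are the standard PCIe spec values, not Zotac's numbers.
slot_watts = 75                  # PCIe x16 slot
eight_pin_watts = 150            # per 8-pin aux power connector
rated_tdp_watts = 165            # Nvidia's GTX 980 power target

available = slot_watts + 2 * eight_pin_watts
print(f"Available: {available} W, rated: {rated_tdp_watts} W, "
      f"headroom: {available - rated_tdp_watts} W")   # 375 W vs. 165 W
```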

Despite the obvious excess, the Omega retains something of a stately look, at least around front. There aren't any illuminated logos or other such bling. The only LEDs present are basic power indicators on the back of the card.



Also around back is one of the Omega's most intriguing features: a USB port labeled "OC+". Zotac includes a cable to plug into this port and into an internal nine-pin USB header on the host PC's motherboard. Via this connection, the OC+ feature monitors some key variables, including the 12V line from the PCIe slot, the 12V line from the PCIe power connectors, GPU current draw, and memory voltage. Beyond monitoring, OC+ also allows control over the card's memory voltage.

Although Nvidia already has built-in hooks for monitoring and tweaking various aspects of the GPU's operation, OC+ makes an end-run around all of it. This monitoring capability is external to the GPU and relies on a separate chip and shunt resistors. Based on the device IDs shown in Device Manager, Zotac has apparently incorporated a Texas Instruments MSP430 USB microcontroller onto the board to drive OC+.
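If you're curious whether the OC+ controller is visible to the operating system, a quick USB enumeration will show it. The sketch below uses pyusb and filters on Texas Instruments' USB vendor ID (0x0451); the product ID and the OC+ wire protocol are undocumented, so this only confirms the device is present, nothing more.

```python
# List USB devices with Texas Instruments' vendor ID (0x0451), which is where
# an MSP430-based controller like the OC+ chip would be expected to show up.
# This is only a presence check; the OC+ protocol itself is undocumented.
import usb.core   # pyusb

for dev in usb.core.find(find_all=True, idVendor=0x0451):
    print(f"Found TI device: VID=0x{dev.idVendor:04x} PID=0x{dev.idProduct:04x}")
```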


Eager to try out the OC+ monitoring capability, I connected the USB cable to my motherboard's header, installed Zotac's FireStorm tweaking utility from the included DVD, and was confronted with the interface you see above.

At this point, my feeble brain became confused. Pressing the "advance" button in the interface brought up the series of sliders you see above, but all of those options are available with pretty much any Maxwell or Kepler tweaking utility. The only monitoring I could find consisted of those two small graphs on the top left showing the GPU core and memory clocks. Most of the other buttons like "setting" and "info" proved fruitless. The "Quick Boost" icon was self-explanatory—likely a modest pre-baked overclocking profile—and I figured "Gamer" was probably a slightly more aggressive version of the same. OC+ was nowhere to be found.

Worse, neither Zotac's website nor the included documentation offered any explanation of what OC+ actually does (beyond the words "OC Plus real-time performance intelligence . . . takes your graphics experience to the next level") or how to access it. Hrm.

After consulting with Zotac's friendly PR types, I was encouraged to press the "Gamer" button. Lo and behold, clicking "Gamer" brought up a new window called "S.S.P Chip Setting." There's no mention of OC+ anywhere, but the right info is present.


Once you find the right spot, OC+ does indeed tell you things you can't know via Nvidia's usual GPU monitoring hooks.
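For contrast, here's roughly what Nvidia's standard monitoring hooks do expose, via the NVML library's Python bindings (pynvml). GPU clocks, board power draw, and temperature are all there (power readout support varies by board); the per-rail 12V readings and memory voltage that OC+ reports are not. A minimal sketch:

```python
# What Nvidia's own monitoring interface (NVML) exposes on a card like this:
# clocks, total board power, and temperature, but no per-rail voltages or currents.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

core_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
mem_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)
power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # reported in milliwatts
temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)

print(f"core {core_mhz} MHz, mem {mem_mhz} MHz, {power_w:.1f} W, {temp_c} C")
pynvml.nvmlShutdown()
```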

Oddly enough, though, the Omega doesn't expose much control over those variables. The GPU Vcore setting appears to let the user raise the card's peak GPU voltage by 0.02V, to 1.21V, but it's fussy. The FireStorm app doesn't always keep up with the GPU's dynamic behavior under load, so you're not always adjusting from the voltage the card is actually running at that moment. Causing a system crash with this slider is way too easy.

The memory voltage slider has two settings, "no change" and a +20 mV offset. That's it.

My understanding is that you may have to pony up for Zotac's GTX 980 AMP! Extreme edition, priced at $609, in order to get working voltage control.

The OC+ limitations chafe a bit, but the worst of it came when I tried to tweak the Omega using the regular controls in the "settings" window, like one would with any recent GeForce card. You can adjust the sliders to your heart's content, but near as I can tell, none of them do anything at all. The Omega's GPU clocks and memory speeds simply don't change when you press "Apply."

For the purposes of this review, I was able to overclock the Omega somewhat using a much older version of FireStorm that I grabbed from Zotac's website. (The new version hasn't yet been posted online.) This older utility has a simpler and frankly more logical interface, and it works reasonably well. That said, nothing I did in software allowed me to raise the Omega's GPU voltage. That variable is evidently locked on this card—a curious choice by Zotac since even the reference design cards aren't voltage-locked.