Gigabyte’s GeForce GTX 1080 Xtreme Gaming graphics card reviewed

As we discovered in our review, Nvidia’s Founders Edition GeForce GTX 1080 graphics card sets a new high-water mark for graphics performance and smoothness. That card is relatively noisy, however, and it doesn’t shut off its fan at idle for silent operation. Those characteristics left us wondering how non-reference designs would handle the GP104 GPU. Gigabyte has at least three takes on the GTX 1080 in its catalog already, and today, we’re looking at the company’s biggest, baddest air-cooled card so far: the GeForce GTX 1080 Xtreme Gaming.

When I first saw this thing at Computex, its two-slot-and-change cooler made me raise an eyebrow. It’s one of the biggest graphics coolers I’ve laid eyes on. That huge shroud isn’t just for show, though. Gigabyte crams a dense fin array in there with six heat pipes winding through it, and it uses three larger-than-average 100-mm fans to move air over all that metal. As someone whose journey as a PC hardware reviewer started in case testing, I’m a fan of this decision. Larger fans can spin slower and make less noise than their smaller counterparts, all while moving similar amounts of air.

Big fans like those Gigabyte employs here could end up making for a cooler that’s lengthier than average, but the company cleverly staggers the height of the middle fan so that the two outer fans can overlap it. This “stack fan” design means the GTX 1080 Xtreme Gaming is actually a few millimeters shorter than the GTX 980 Ti G1 Gaming card, and Gigabyte claims it also makes for better airflow over the heatsink. Like any good custom-cooled graphics card these days, the Gigabyte card stops its fans at idle for total silence.

The considerable real estate on the side of this card’s cooler lets Gigabyte put an enormous, RGB LED-illuminated “Xtreme Gaming” logo there. If you have a windowed case, nobody will be left with any doubt about your preferred brand of graphics card—or how “Xtreme” you are. Surely this is good for a couple extra FPS. In all seriousness, I do enjoy being able to coordinate the LED colors of my graphics card with the rest of my system.

Four light pipes on the X-brace that holds the cooler’s second fan in place light up in RGB LED glory, too, although these will probably be difficult to see in most cases. Folks with a Thermaltake Core wall-mounted enclosure will be pleased, though.

Gigabyte reinforces this giant card with a sturdy aluminum backplate emblazoned with the Xtreme Gaming shield logo you see on the side of the card. The orange stripes and white logo are painted on, though, so builders concerned about color coordination will need to be OK with those shades in their systems.

Field-stripping the Xtreme Gaming card reveals a glorious 12+2 power phase design, fed by a pair of eight-pin PCIe power connectors. Each power plug has an LED above it that’ll light up in solid white if a PCIe power connector isn’t plugged in. They’ll also blink if the card detects a problem with the quality of the power source it’s hooked up to. In regular operation, these LEDs stay off.

To carry heat away from the GP104 GPU, Gigabyte uses a large slab of copper that makes full contact with the graphics chip and its surrounding GDDR5X memory. This plate doesn’t extend to the power-delivery circuitry, but a group of thermal pads ensures those components are still transferring heat to the aluminum fins of the heatsink. Instead of the straight, uniformly tall fins of many graphics-card heatsinks, the Xtreme Gaming cooler uses a zig-zag fin shape that’s claimed to increase the surface area and heat-transfer potential of the heatsink.

                                               GPU base  GPU boost  Memory  Memory
                                               clock     clock      clock   size
                                               (MHz)     (MHz)      (MHz)   (MB)
GTX 1080 Founders Edition                      1607      1733       2500    8192
Gigabyte GTX 1080 Xtreme Gaming (Gaming Mode)  1759      1898       2553    8192
Gigabyte GTX 1080 Xtreme Gaming (OC Mode)      1784      1936       2600    8192

Gigabyte uses its custom board and cooler to push the GTX 1080 Xtreme Gaming card’s base and boost clocks about 9% over the Founders Edition card in its “Gaming Mode” clock profile. The card also has a more aggressive “OC Mode” that adds roughly another 1-2% of clock speed on top of the baked-in Gaming Mode boost. Gigabyte doesn’t ship cards with a “review BIOS,” as we’ve seen with some MSI and Asus cards, so the performance of our sample should be representative of what one would get with a retail card.
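As a quick sanity check, here’s the arithmetic behind those uplift figures, in a minimal Python sketch using the clock speeds from the table above:

    # Sanity check of the factory clock uplifts (figures from the table above).
    fe_base, fe_boost = 1607, 1733   # GTX 1080 Founders Edition (MHz)
    gm_base, gm_boost = 1759, 1898   # Xtreme Gaming, Gaming Mode (MHz)
    oc_base, oc_boost = 1784, 1936   # Xtreme Gaming, OC Mode (MHz)

    def uplift(new, old):
        """Percent increase of new over old."""
        return (new / old - 1) * 100

    print(f"Gaming Mode base uplift:  {uplift(gm_base, fe_base):.1f}%")         # ~9.5%
    print(f"Gaming Mode boost uplift: {uplift(gm_boost, fe_boost):.1f}%")       # ~9.5%
    print(f"OC Mode boost over Gaming Mode: {uplift(oc_boost, gm_boost):.1f}%") # ~2.0%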

The Xtreme Gaming card Gigabyte sent us is the “Premium Pack” edition, meaning it comes with a few extra goodies. Along with its three DisplayPorts and a single HDMI out in its rear cluster, the Xtreme Gaming has a pair of HDMI outs on its front edge that connect to an included 5.25″ breakout box. When that box is plugged in, two of the card’s rear DisplayPorts are disabled.

This box puts two extra USB 3.0 ports and two HDMI outputs on the front of the PC, making it easy to connect VR headsets without fiddling about in the dark hindquarters of a system. If you have an Oculus Rift, these extra ports are all that’s needed to plug and play using the headset. HTC Vive owners will still have to rely on that system’s separate breakout box, though, since the Vive has to draw power from a wall outlet to function. Still, this add-on is genuinely useful—it conserves precious VR headset cable length. That’s especially important with the room-scale Vive. 

The other Premium Pack extras are more niche in their usefulness: Gigabyte includes an Xtreme Gaming-branded HB SLI bridge, a case badge, and a rear bracket for the card’s extra HDMI ports, should builders want to route them out the back instead. Still, HB SLI bridges aren’t cheap, so between the bridge and the VR breakout box, the Premium Pack’s add-ins are genuinely nice to have.

The GTX 1080 Xtreme Gaming and its Premium Pack add-ins sell for $699.99 on Newegg right now, the same price as a GTX 1080 Founders Edition card—presuming you can find either of those cards in stock, of course. At least on paper, the Xtreme Gaming card’s beefed-up PCB and considerably larger cooler make it seem more appealing than the Founders Edition card. Let’s fire up some games and see what it can do.

 

Our testing methods

As always, we did our best to deliver clean benchmarking results. Our test system was configured as follows:

Processor        Intel Core i7-6700K
Motherboard      ASRock Z170 Extreme7+
Chipset          Intel Z170
Memory size      16GB (2 DIMMs)
Memory type      G.Skill Trident Z DDR4-3000
Memory timings   16-18-18-38
Chipset drivers  Intel Management Engine 11.0.0.1155,
                 Intel Rapid Storage Technology V 14.5.0.1081
Audio            Integrated Z170/Realtek ALC1150
                 with Realtek 6.0.1.7525 drivers
Hard drive       OCZ Vector 180 480GB
Power supply     Seasonic Platinum SS660-XP2
OS               Windows 10 Pro

 

                                 Driver          GPU base  GPU boost  Memory  Memory
                                 revision        clock     clock      clock   size
                                                 (MHz)     (MHz)      (MHz)   (MB)
Gigabyte GTX 980 Ti G1 Gaming    GeForce 368.81  1152      1241       1753    6144
Gigabyte GTX 1080 Xtreme Gaming  GeForce 368.81  1759      1898       2553    8192

Like many graphics cards on the market today, the Xtreme Gaming GTX 1080 comes with multiple clock profiles that one can enable in its companion software. Gigabyte ships the card in “Gaming Mode,” which lets it run at 1759 MHz base and 1898 MHz boost speeds. Since most people will be using this card without its companion software, we left the card in Gaming Mode for our testing.

You’ll note that we’re relying on a Z170 motherboard and a Core i7-6700K CPU to power the GTX 1080 in this review, rather than the X99 and Core i7-5960X combo we use in our main graphics-testing rig. That’s because that monster rig is across the country from me. The Core i7-6700K is more than a match for the GTX 1080, though, going by our review of that chip. If anything, it might be a better gaming CPU than the Core i7-5960X in some cases. Either way, we shouldn’t be bottlenecking the GTX 1080 with our testing system.

Because of the scarcity of GTX 1080s right now, we don’t have another custom card in the lab to compare the GTX 1080 Xtreme Gaming against yet. We do have an ample supply of hot-clocked GeForce GTX 980 Tis to choose from, though, so we can at least give you an idea of the generational advance a hot-rodded GTX 1080 provides over similarly hopped-up GTX 980 Tis. With time, we should be able to get more custom GTX 1080s in the lab and provide a broader picture of the approaches Nvidia’s board partners are taking to hot-rod the card. For now, though, we’re relying on Gigabyte’s GeForce GTX 980 Ti G1 Gaming. This card is among the fastest 980 Tis around, so it should be a worthy opponent for the GTX 1080 Xtreme Gaming.

 

Out-of-the-box performance

If you’re looking for an in-depth take on how the GTX 1080 performs in our advanced “Inside the Second” frame-time benchmarks, you should go read our GTX 1080 Founders Edition review. We won’t be repeating that fine-grained testing here. Instead, we’ll be relying on the scripted benchmarks from Middle-earth: Shadow of Mordor and Rise of the Tomb Raider to gauge average frame rates at a variety of resolutions. These simple FPS-based tests give us a good idea of just how much more performance the GTX 1080 Xtreme Gaming delivers versus the GTX 980 Ti G1 Gaming card.

Shadow of Mordor

To put Shadow of Mordor through its paces, we used the game’s Ultra preset. Click the buttons below the FPS graphs to see how the GTX 1080 performs versus the GTX 980 Ti G1 Gaming.


In this older (yet still demanding) title, the GTX 1080 Xtreme Gaming delivers about 23% more frames per second than the GTX 980 Ti at 1920×1080. That figure should please folks with 144-Hz, 1080p gaming monitors. Click up the resolution to 2560×1440, and the Xtreme Gaming card still pushes over 100 FPS—great for high-refresh 27″ displays like Asus’ PG279Q and Acer’s Predator XB271HU. Even at 4K, this card pushes over 60 FPS at ultra quality settings. No matter what type of gameplay you enjoy in this title, the GTX 1080 Xtreme Gaming seems ready for it.

Rise of the Tomb Raider

Rise of the Tomb Raider is a recent, demanding title from our current graphics testing suite. We ran the game on both cards with the following settings:


Much like it did with Shadow of Mordor, the GTX 1080 performs about 23% better than the GTX 980 Ti G1 Gaming at both of the resolutions we tested. For an idea of the value proposition the Xtreme Gaming card provides, the GTX 980 Ti G1 Gaming card is $539.99 on Newegg right now, about 23% less expensive than the GTX 1080 Xtreme Gaming in its Premium Pack guise (put another way, the GTX 1080 costs about 30% more). That’s not a 1:1 increase in performance for every extra dollar spent, but it’s pretty close.
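For the curious, here’s that price-to-performance math spelled out, in a minimal sketch using the Newegg prices and the roughly 23% performance delta from our benchmarks:

    # Price vs. performance, using the Newegg prices and benchmark delta above.
    price_980ti = 539.99   # Gigabyte GTX 980 Ti G1 Gaming
    price_1080  = 699.99   # GTX 1080 Xtreme Gaming Premium Pack
    perf_gain   = 0.23     # ~23% higher FPS in our scripted benchmarks

    discount = 1 - price_980ti / price_1080   # how much cheaper the 980 Ti is
    premium  = price_1080 / price_980ti - 1   # how much more the 1080 costs

    print(f"980 Ti is {discount:.0%} cheaper")         # ~23%
    print(f"1080 costs {premium:.0%} more")            # ~30%
    print(f"...and delivers {perf_gain:.0%} more FPS")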

Power consumption

We already have a decent idea of the effects a GTX 1080 has on system power consumption from our Founders Edition review, but I sadly don’t have that card at hand for direct comparison. We can compare the card to the last-gen GTX 980 Ti, however, and that’s just what we’re going to do here.


At idle, our test system draws 51 watts with the GTX 1080 Xtreme Gaming installed. That’s pretty close to the 52W we measured with the GTX 1080 FE in our main X99 test rig, so hot-clocking the card doesn’t seem to increase one’s power bill with the graphics card at rest.

Under load, our system draws 280W, compared to the 265W or so we measured with the GTX 1080 FE. We think that’s a fair tradeoff to make given the Xtreme Gaming card’s higher clock speeds. Let’s see whether that extra power translates to more heat or noise production.

Noise and temperatures

To get a real-world idea of this card’s cooling performance and noise levels, I moved the GTX 1080 Xtreme Gaming off our test bench into my personal PC, which is housed in Cooler Master’s excellent MasterCase Maker 5 enclosure.


Since both the GTX 1080 Xtreme Gaming and the GTX 980 Ti G1 Gaming cards turn off their fans at idle, we’re really recording the noise floor of the host system rather than any performance characteristic of the graphics cards in question. My personal PC produces about 32 dBA at idle. Crank up Heaven for 10 minutes, though, and the GTX 1080 separates itself nicely from Gigabyte’s G1 Gaming cooler. The Xtreme Gaming card only increases system noise to 38 dBA under load, while the G1 Gaming card has to spin its fans faster to cool the bigger, hotter GM200 GPU. That behavior leads to a 42-dBA system noise level.

From a subjective standpoint, neither card is loud, exactly. That said, the G1 Gaming cooler has a distinct whine to its fan noise that’ll be noticeable to other people in a room, while the 100-mm fans on the Xtreme Gaming card produce only the slightest whisper. Several times during my tests, I found myself double-checking whether this GTX 1080’s fans were actually spinning. In fact, the 140-mm fans on my Corsair H105 CPU cooler are more noticeable. The Xtreme Gaming card does produce noticeable coil whine on my test bench, but that high-pitched screech became next to inaudible once the card was in a case.

Overall, Gigabyte’s Xtreme Gaming cooler is one of the best I’ve ever laid ears on for a graphics card. To see whether the company traded thermal performance for quiet operation, I ran Unigine Heaven on each card for 10 minutes inside the same case.

Even though its default fan profile is tuned for quiet running, the GTX 1080 Xtreme Gaming’s load temperatures don’t suffer for it. At stock clocks, the GPU’s temperature didn’t exceed 66° C. While GM200 and GP104 make for an apples-to-oranges comparison, the G1 Gaming card reached 73° C under the same load while producing more noise. It just goes to show that process shrinks are a wonderful thing.

Now that we’ve seen how the GTX 1080 Xtreme Gaming performs out of the box, let’s dive into Gigabyte’s included software utility to see how the card handles manual overclocking—and the effects of that tweaking on performance.

 

The Xtreme Gaming Engine

Gigabyte’s included software utility for monitoring and tuning the GTX 1080 Xtreme Gaming is called the Xtreme Gaming Engine. While the app does have a rather silly skin on it to match the Xtreme Gaming card’s wild looks, it at least doesn’t sacrifice usability to the flashy facade.

Upon launching the app, users are greeted with a picture of the card being monitored. To nobody’s surprise, that’s a GTX 1080 Xtreme Gaming in the case of this review. Gigabyte’s interface displays multiple frames to account for multi-GPU systems, although the six open spaces here seem a little optimistic given Nvidia’s recent move to support dual-GPU SLI exclusively.

After the user chooses a card, the XGE displays all of its vitals at a glance. It also provides big orange sliders for modifying core and memory clock offsets, GPU voltage, power limits, and temperature limits. Users can create a seemingly arbitrary number of tuning profiles using the drop-down in the upper left.

The somewhat-confusingly-named Advanced OC section actually controls this GTX 1080’s baked-in clock profiles, too. We get the default “Gaming Mode,” a slightly more aggressive “OC Mode,” and a slightly down-clocked “Eco Mode.”

If the pre-baked profiles aren’t to one’s taste, the app also lets users apply a linear offset to the card’s voltage-and-frequency scaling curve or use the GPU Boost 3.0 features in Pascal to tweak clock speeds at a variety of voltage levels. We won’t be doing that kind of tweaking today, but more ambitious overclockers can sleep soundly knowing that the option is there.

The “Fan” section of the app provides control over three baked-in fan profiles, as well as options for setting fixed fan speeds or a custom fan curve.

Finally, the “LED” section lets owners change the lighting color of the Xtreme Gaming logo on the side of the card and the four light pipes above the middle fan. These LEDs behave as one zone, so picking a color is an all-or-nothing affair. Along with the usual solid color mode, Gigabyte includes “breathing,” “flashing,” and “dual flashing” styles, plus a “variable brightness” mode that can (in theory) change the brightness of the card’s LEDs in response to inputs like CPU or GPU temperatures.

 

Overclocking

Thanks to Nvidia’s GPU Boost technology, overclocking a GTX 1080 isn’t quite as simple as moving some sliders and calling it done. As we saw with our out-of-the-box performance testing, the boost clock on Nvidia cards isn’t an absolute maximum—it represents a typical speed under load, determined by factors like temperature limits, power limits, and voltage. The GPU can clock up past that boost number under the right conditions.

The last time we tested custom-cooled graphics cards, we relied on a FurMark variant in MSI’s Kombustor app called FurryPQTorus. FurMark is a great stability tester, but the extreme thermal load it produces isn’t representative of actual games. To investigate how the GTX 1080 Xtreme Gaming performs, I’m relying on Unigine’s tried-and-true Heaven benchmark. Heaven is still an exceptionally demanding application, but the load it produces is closer to what one might see from an actual game than FurMark’s is.

As a point of reference for the overclock to come, here are the clock speeds and voltage numbers the GTX 1080 Xtreme Gaming exhibited in “Gaming Mode” and “OC Mode” after running Heaven for 10 minutes:

                                  GPU base  GPU boost  Memory  Heaven GPU  Heaven GPU
                                  clock     clock      speed   voltage     clock
                                  (MHz)     (MHz)      (MT/s)  (V)         (MHz)
Gigabyte GTX 980 Ti G1 Gaming     1152      1241       7010    1.193       1342
Gigabyte GTX 1080 Xtreme Gaming   1759      1898       10206   1.050       1987
GTX 1080 Xtreme Gaming (OC Mode)  1784      1936       10400   1.050       2012

As you can see from the “Heaven GPU clock” results above, the GTX 1080’s specified speeds have practically no link to the real-world clock speeds it can deliver. Whatever parameters GPU Boost 3.0 is considering, it’s apparently comfortable pushing the Xtreme Gaming card far above its rated clocks. I also kicked the card into its “OC Mode” to see how much Gigabyte’s hottest profile would eke out of the card, and I observed a 25-MHz clock speed boost. Not much, but not nothing.
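For a sense of scale, here’s how far above its rated numbers the card floated in our testing, in a quick sketch using the Heaven figures from the table above:

    # How far GPU Boost pushed the card past its rated boost clock in Heaven
    # (observed figures from the table above).
    rated_boost   = 1898   # Gaming Mode rated boost clock (MHz)
    heaven_gaming = 1987   # observed clock after 10 minutes, Gaming Mode (MHz)
    heaven_oc     = 2012   # observed clock after 10 minutes, OC Mode (MHz)

    over = heaven_gaming - rated_boost
    print(f"Over rated boost: +{over} MHz ({over / rated_boost:.1%})")  # +89 MHz, ~4.7%
    print(f"OC Mode gain: +{heaven_oc - heaven_gaming} MHz")            # +25 MHz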

To overclock the Xtreme Gaming, I maxed out the voltage, power, and thermal limit sliders in the Xtreme Gaming Engine software before edging up the clock speed offset. This isn’t a particularly refined approach, but it didn’t seem to harm the card, either, so I rolled with it. I continued pushing clocks upward until I began to see artifacting or crashes in Heaven, and I double-checked my work by running Doom for a while. In my experience, Doom exhibits serious artifacting while running on a graphics card pushed past its limits, so it’s a good real-world reference for the success or failure of an overclock.
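In rough pseudocode, that loop looks something like the sketch below. To be clear, stable_at() and its pretend silicon limit are hypothetical stand-ins for moving sliders in the Xtreme Gaming Engine and eyeballing Heaven and Doom for artifacts; there’s no real tuning API at work here.

    # A toy sketch of the trial-and-error overclocking loop described above.
    # stable_at() is a hypothetical stand-in for a 10-minute Heaven run
    # judged by eye; it is not a real tuning API.
    def stable_at(offset_mhz, silicon_limit_mhz=60):
        """Pretend Heaven run: stable until the chip's (unknown) limit."""
        return offset_mhz <= silicon_limit_mhz

    def find_max_offset(step_mhz=25, ceiling_mhz=300):
        best = 0
        for offset in range(step_mhz, ceiling_mhz, step_mhz):
            if not stable_at(offset):   # artifacts or a crash: went too far
                break
            best = offset               # last offset that survived Heaven
        return best

    print(f"Max stable core offset: +{find_max_offset()} MHz; now confirm in Doom")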

                                      GPU clock  Base/boost   Memory  Heaven GPU  Heaven GPU  Heaven GPU
                                      offset     with offset  speed   voltage     clock       temp.
                                      (MHz)      (MHz)        (MT/s)  (V)         (MHz)       (°C)
Gigabyte GTX 1080 Xtreme Gaming (OC)  +50        1759/1948    10868   1.094       2063        68

The table above records the maximum clock speed offset I was able to use while keeping the card stable. The “base/boost with offset” figure describes the boost speed the Xtreme Gaming Engine software displays after clock tweaking, not an actual number teased out through performance testing. For that, we need to look at the voltage and “delivered” clocks, for lack of a better term, that the card exhibited after 10 minutes in Heaven with this overclock. Under an actual load, the GPU settled into an observed speed of 2063 MHz. At that speed, GPU-Z indicated that the card was voltage-limited.

Considering that firing up OC Mode in the Xtreme Gaming Engine resulted in a 2012-MHz stable speed, our particular piece of GP104 silicon only had about 2.5% more clock speed left in it, and about 4.5% more memory speed, on top of the factory boost that Gigabyte dialed in. It’s worth remembering that every chip and card is unique, however. Some of these GTX 1080s will fare better than our test unit, and others may fare worse. Even taking the silicon lottery into account, Gigabyte’s easy-to-use tuning software and beefy custom PCB pose no obstacle to taking a particular piece of GP104 silicon to its limits, and that’s what we’re really interested in when we overclock a graphics card.
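The headroom math, for reference (a quick sketch using the observed Heaven clocks from the tables above):

    # Remaining manual headroom over Gigabyte's factory OC Mode profile.
    oc_clock, manual_clock = 2012, 2063     # observed Heaven clocks (MHz)
    oc_mem,   manual_mem   = 10400, 10868   # memory speeds (MT/s)

    print(f"Extra core headroom:   {manual_clock / oc_clock - 1:.1%}")  # ~2.5%
    print(f"Extra memory headroom: {manual_mem / oc_mem - 1:.1%}")      # ~4.5%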



To see just what those clock speed boosts mean for performance, I re-ran our Shadow of Mordor and Rise of the Tomb Raider benches. Depending on the resolution used, overclocking the card resulted in anywhere from about 2-4% better performance in RoTR and about 3.5-6% better performance in Shadow of Mordor compared to its results in Gaming Mode. Whether that’s worth the trouble of manual overclocking is in the eye of the beholder, but if you’re looking for every last bit of performance, it might be worthwhile to move some sliders around.

I also tested our system’s power draw while the GTX 1080 was overclocked and found that it pulled about 30W more from the wall with the sliders maxed, or 310W in total. We’d expect nothing less from an overclocked card, but its effect on system power draw is still less than that of a custom GTX 980 Ti running at its default clock profile.

 

Conclusions

Gigabyte’s GTX 1080 Xtreme Gaming is the first custom card of its ilk that we’ve tested. That’s a weighty mantle to assume, but Gigabyte’s effort impressed us in almost every regard. The Premium Pack version of the card that we received comes with useful add-ins that make the card more friendly for PCs that will power VR headsets, as well as an HB SLI bridge that could cost as much as $40 on its own. Those are compelling add-ins for a card that costs the same as the GTX 1080 Founders Edition.

Nothing in this world is perfect, and the Xtreme Gaming card exhibits some coil whine under load. That said, the coil whine should be mostly dampened by any good case, like the Cooler Master MasterCase Maker 5 I used for noise testing. The card’s two-and-a-half-slot cooler could make it a tight fit in smaller cases, and running a pair of these babies in SLI could be a cooling and clearance challenge for the brave few who will try it. A motherboard with widely spaced primary PCIe slots (plus an appropriate HB SLI bridge) seems like a must for that purpose.

By almost every other measure, however, this card is sublime. Despite the hot-clocked GP104 GPU underneath its hefty cooler, the Xtreme Gaming remains remarkably quiet under load. It’s probably the quietest air-cooled graphics card I’ve ever used. The Xtreme Gaming cooler also keeps the GP104 GPU quite chilly—in my test system, the card didn’t exceed 66° C under load despite its serene noise character.

This GTX 1080 unsurprisingly offers a considerable boost in performance over Gigabyte’s GTX 980 Ti G1 Gaming, and its observed boost clock speeds are quite a bit higher than Nvidia’s reference specs for the GTX 1080, too. During our overclocking attempts, the Xtreme Gaming’s power-delivery subsystem and cooler didn’t pose any obstacles in taking the GP104 silicon on board to its outer limits. We may not have hit the jackpot in the silicon lottery with this particular GP104 chip, but the Xtreme Gaming card and its accompanying software seem to offer all the knobs one might want for probing the limits of Pascal.

Gigabyte GeForce GTX 1080 Xtreme Gaming

July 2016

The styling of Gigabyte’s top-end cooler will probably be the most divisive thing about this product. Honestly, if I were spending $700 on a graphics card, I would want it to make a statement like this one does. The Xtreme Gaming’s extensive RGB LED lighting, large logos, and bold color scheme will announce your card of choice loudly and clearly through a case window, and they’ll do so whether the card is installed in a traditional case or in showier designs like Thermaltake’s Core P3 or Core P5.

It’s early in the life cycle of Pascal, but the GTX 1080 Xtreme Gaming’s superb performance, excellent noise character, high-value add-ins, and aggressive style make it an easy Editor’s Choice. If you’re going to drop $700 on a GTX 1080, this card should be high on your list.

Comments closed
    • samhain1969
    • 3 years ago

    Why not compare this GPU to the 980 Ti Xtreme Gaming OC Edition (Windforce)??? The G1 is a good card, but the OC Edition was/is the better of the two in all ways.

      • Jeff Kampman
      • 3 years ago

      We never received a review sample of that card to work with, so I had to make do with what was on hand. I’m sure the GTX 980 Ti Xtreme Gaming is also excellent.

    • Sabresiberian
    • 3 years ago

    Very nice card, and it comes with some extras other cards don’t, but I fail to see the value in it. Why spend more than what something like the EVGA ACX 3.0 costs – $620? They all overclock to the same level.

    • jihadjoe
    • 3 years ago

    Dat yummy yummy 2GHz tho ლ(´ڡ`ლ)

    • Coran Fixx
    • 3 years ago

    Theoretical MSRP brought to you by OOS Hairclub for Men

    • derFunkenstein
    • 3 years ago

    The elimination of 5.25″ drive bays in so many cases actually hurts this card’s value proposition. If you’re into VR and want to be able to just plug/unplug a headset easily, and if your case lacks a 5.25″ bay (like the Define S or a bunch of other recent releases) you’re SOL.

      • Voldenuit
      • 3 years ago

      The smart thing to do would have been to make the breakout panel 3.5″ and include a 3.5″-to-5.25″ adapter.

        • derFunkenstein
        • 3 years ago

        It seems anymore that cases either have front panels or don’t. That might help in a few cases but not those panel-less WC-oriented cases.

      • juzz86
      • 3 years ago

      There’s a good opportunity for standardisation here, I reckon:

      Make the front I/O ports standard. Make it the size of a 3.5 inch bay, and let the case manufacturer put the hole wherever they like.

      The standard plugin (shipped with cases) is, e.g., 2x USB 3, 2x USB 2.0, 3.5mm headset/mic jacks, and Power and Reset.

      People like Gigabyte can then come along and offer alternatives. In this bundle for example, the Front I/O might be Type C, 2x USB 3, 2x HDMI and FP audio.

      The one shipped with the top-end X99 boards might be all USB 3/Type C/Thunderbolt.

      The one shipped with a Quadro or NVS might be four DPs.

      Hey, if you want to put a Fan Controller or a Floppy drive there, go ahead!

      The number of people using eSATA and FireWire on their front I/O continues to decline; however, these controllers shouldn’t have to go unused just because the case you like doesn’t have the ports – you can just hop on eBay and find a cheap FPI/O plate that hosts them. Alternatively, if your case ships with FireWire/eSATA and you’re not using them, pick up a cheap 8x USB 2 plate and go nuts.

      I reckon something like this could be a go-er with the decline of the traditional external bay.

      • Prestige Worldwide
      • 3 years ago

      Funny, the other day I had to take a drill to my case to remove the rivets holding the 5.25″ bays in my Fractal Arc Midi v2. I’m trying to accommodate a 360mm radiator on top, and I use my DVD drive so rarely that I’ll just buy an external one.

      But man, those rivets were annoying; I really wish they had just used screws.

    • JDZZL
    • 3 years ago

    For everyone that was complaining about TR not producing reviews within 20 seconds of every card’s release, this is why you come to this site. Not for rushed reviews, but for quality: temps, sound, build quality, and all those other things that actually matter, in conjunction with not just average frame rates but 99th-percentile frame times.

    If you’re an early adopter, we all know you’re going to buy this first available card and then show the universe your awesomeness.

    While we here at TR do spend a significant amount on our tech gear, I’d like to think we maximize our dollar thanks to TR…

    Excellent review Jeff, keep up the good work!!

    • JosiahBradley
    • 3 years ago

    Is nVidia still voltage locking their GPUs? Looks like Pascal has an insane amount of headroom here and a giant cooler like this should provide beefier overclocks. Still impressive for the price though.

      • jihadjoe
      • 3 years ago

      IIRC the policy hasn’t changed since Kepler: Nvidia will warranty the cards so long as they stay under the specified voltage. Partners are free to unlock cards if they want to, but if stuff fails they’ll have to handle repairs/replacements themselves.

    • Jigar
    • 3 years ago

    Double post, sorry.

    • Jigar
    • 3 years ago

    Thanks for the review guys, I actually liked the card. Never considered Gigabyte before, but I will be looking into it now.

      • Prestige Worldwide
      • 3 years ago

      My friend’s Gigabyte G1 Gaming GTX 970 overclocked to 1600 MHz with ease, whereas my Asus GTX 970 STRIX topped out at 1500 MHz. They make quality cards and great overclockers.

    • I.S.T.
    • 3 years ago

    Question: How come there isn’t a table or something comparing the clockspeeds on this card to the stock speeds of a GTX 1080?

      • Pancake
      • 3 years ago

      Yeah, this review is a bit meaningless. The stated aim is to test against performance of Founders Edition as in the original review. I was looking forward to noise/thermal/OC differences. What’s the point of these 980Ti benchmarks?

        • Jeff Kampman
        • 3 years ago

        You know we do have a GTX 1080 FE review, right? The numbers aren’t perfectly portable, but you can compare a few things about this card’s noise and performance to it: https://techreport.com/review/30281/nvidia-geforce-gtx-1080-graphics-card-reviewed

      • Jeff Kampman
      • 3 years ago

      Good point, I added one to the first page. Not sure how I forgot that.

        • I.S.T.
        • 3 years ago

        Yay, I helped!

    • ThatStupidCat
    • 3 years ago

    This is a beautiful card. Thanks Gigabyte for sending one to TR.

    • ultima_trev
    • 3 years ago

    With these aftermarket 1080s boosting to well over 2000 MHz I cannot see the new, reference only Titan Xp selling too well. Unless someone NEEDS the extra 4GB of VRAM the value proposition is non-existent.

      • jihadjoe
      • 3 years ago

      Someone who gets the new Titan for gaming is probably on the hardcore side of PCMR and has the will and the wherewithal to put a waterblock on it.

        • Chrispy_
        • 3 years ago

        That’s an awfully high-dollar warranty to void though, even for the hardcore.

      • Flapdrol
      • 3 years ago

      The value proposition has always been terrible with Titan cards.

      • Prestige Worldwide
      • 3 years ago

      Well, it does have an extra 1,000+ CUDA cores, so there’s that. Even if it doesn’t clock as high, it will be way ahead of the 1080.

    • Farting Bob
    • 3 years ago

    Considering the cost of all the extra bits and the crazy overengineered heatsink, the fact that this is the same price as the Founders Edition is pretty impressive. Of course there are cheaper custom cards, but for those who want the very best card right now, it’s a good offering.

      • realneil
      • 3 years ago

      I agree with this. I see it as a low cost when you look at the features and the HB SLI bridge included.
      I have the 980 Ti version of this card here, and its performance is stellar.

      I want to buy this one right now, but I’m trying to wait for the 1080Ti variant to land.

      Good review too.

    • spiritwalker2222
    • 3 years ago

    I would have preferred to see the card compared to the Founders Edition that you already tested. People will debate between the FE and this variant of the 1080. But who will be choosing between a 980 Ti and a 1080?

      • Jeff Kampman
      • 3 years ago

      Well, here’s my short take: it’ll be much quieter, perform similarly (probably a bit faster without overclocking in the picture, though the silicon lottery makes it hard to say) and have more thermal headroom than the FE card. Logistics prevented a more detailed comparison, and I apologize for that.

      • DPete27
      • 3 years ago

      I was going to say that also. Of course, readers can always pull up the GTX 1080 FE review (https://techreport.com/review/30281/nvidia-geforce-gtx-1080-graphics-card-reviewed) for a side-by-side comparison anyway, but I’m feeling rather lazy. There’s also the complication that the power and noise measurements for the two cards come from different systems, locations, sound meters, noise floors, etc.

    • chuckula
    • 3 years ago

    “You’ll note that we’re relying on a Z170 motherboard and a Core i7-6700K CPU to power the GTX 1080 in this review, rather than the X99 and Core i7-5960X combo we use in our main graphics-testing rig. That’s because that monster rig is across the country from me. The Core i7-6700K is more than a match for the GTX 1080, though, going by our review of that chip. If anything, it might be a better gaming CPU than the Core i7-5960X in some cases. Either way, we shouldn’t be bottlenecking the GTX 1080 with our testing system.”

    Ah, the nightmare that is logistics. However, if you could get some frame-time numbers from this card [even downclocked to stock levels] on the 6700K rig for comparison to the FE version of the GTX 1080 in the X99 rig, that could be an interesting article too.

      • Voldenuit
      • 3 years ago

      “However, if you could get some frame-time numbers from this card [even downclocked to stock levels] on the 6700K rig for comparison to the FE version of the GTX 1080 in the X99 rig, that could be an interesting article too.”

      The FCAT testing hardware purportedly costs several thousand dollars. I’m guessing it’s also across the country with the main graphics test rig.

      Otherwise, great job Jeff, and the extra resolution benchmarks are much appreciated!

        • Jeff Kampman
        • 3 years ago

        We don’t rely on FCAT to glean our frame time info, so there’s no geographic obstacle to collecting the data. However, gathering frame-time info is still extremely labor-intensive and we tend to reserve it for our first reviews of various GPUs.

        Once we get partner cards, we like to spend that time on getting average frame-rate data for extra resolutions where it’s possible, as we have here. This way, we’re covering all our bases.

          • xeridea
          • 3 years ago

          Are you making all the graphs manually? It seems like it shouldn’t be time-consuming with a script: just give it frame-time data and it would spit out all the graphs. If I weren’t so busy, I would consider writing one just for fun. I can see it being tedious to do manually; perhaps someone can help?

          For general use, someone made a script a while ago that takes FRAPS frame data and outputs a nice graph. I think some other sites may use something similar because it has the 0.1% frame time. It doesn’t seem to have the one showing distribution of frames though, which is really useful.

          http://www.overclock.net/t/1337569/fraps-frametime-similar-to-techreports-methods-viewer-application-released

            • Ninjitsu
            • 3 years ago

            There’s also a small free program called FRAFS for interpreting FRAPS frame times, maybe it’s the same you’ve linked to.

    • derFunkenstein
    • 3 years ago

    You might get some complaints, but I like this simplified approach for a review of a variant of a card you’ve already tested in excruciating detail. The important things here are what you covered – overclocking, noise, and power consumption.

    (Yes, I’m a kiss-ass, but seriously, it’s true. Look it up.)

      • TwoEars
      • 3 years ago

      Yupp – if it isn’t a brand-new card with a new architecture, I usually skip straight to the overclocking page and look at 2560×1440. Then I go back to frame times. Then maybe I look at temps and noise. Perfectly fine to keep it short and sweet if you’ve already tested similar models.

      On another topic: these cards are getting seriously CPU-bound in many games these days; Fallout 4 is one of the worst. Rise of the Tomb Raider is one of the few with really good scaling so far, kudos to whoever made the engine and optimization for that.

      • Mr Bill
      • 3 years ago

      Agreed, the review hit the points that mattered. I’m impressed by how much other vendors rework some aspects of these cards.

    • pot
    • 3 years ago

    This is a beautiful card.

      • Chrispy_
      • 3 years ago

      I think it’s hideous and would look so much better if all the garish plastic were removed.

      Stuff I find aesthetically offensive seems to sell like hotcakes though, so I’m probably in the minority here.
