Asus’ ROG Strix GeForce GTX 1080 graphics card reviewed

If there’s one constant in life as a PC builder, it’s that Asus knows how to build a high-quality video card. The Strix GTX 980 Ti OC Edition took home a coveted TR Editor’s Choice Award in our GTX 980 Ti roundup, so I was excited to see what Asus had in store for its next generation of graphics cards when the ROG Strix GTX 1080 flew into the TR labs.

For the arrival of the GTX 1080, Asus opened a clean CAD file and rethought both the function and style of its high-end graphics card for a post-Pascal world. That clean-sheet thinking has resulted in a few changes for Asus’ new breed of avian predators. The Strix brand is now part of the swanky Republic of Gamers family. The most visible result of that change is a move from a brash red-and-black design language to a subtle gray-and-black palette. The Strix cooler’s shroud is now accented by angular outcroppings that spread across the underside of the card’s cooler like some kind of bionic wings.

Those neutral colors serve as a perfect backdrop for the RGB LEDs embedded within the Strix’s cooler shroud. Six fissures in the shell of the cooler let those LEDs shine through. Those full-spectrum LEDs also illuminate ROG logos on the side of the shroud and on the card’s backplate. While a lot of thought clearly went into these LEDs, it’s a shame they’ll be hidden in 99% of regular PC builds. Show-offs will want to use a case like Thermaltake’s Core P3 or Core P5 to use the Strix’s LEDs to maximum effect.

These lights don’t improve the Strix card’s performance, but I’m still pleased that I can coordinate the card’s lighting with the other RGB LEDs in my system and on my peripherals. Even in a typical ATX case where the LEDs around the card’s fans won’t be visible, the ROG Strix card still offers a sharp-looking touch of color on its backplate with a laser-cut ROG logo.

The most distinctive feature of the ROG Strix GTX 1080 hides under the front edge of the card’s PCB. Asus includes a pair of four-pin fan headers that allow system fans connected to the Strix to respond to the graphics card’s temperature fluctuations. Hallelujah.

You see, I’m a huge stickler about firmware fan control, and being able to tie system fan speeds to a variety of component temperatures is one of my favorite things about the better motherboards out there. Even Asus’ motherboard firmware doesn’t allow builders to set the graphics card temperature as a control variable for system fan speeds yet, though. For systems where the graphics card is likely to be the biggest heat generator (like small-form-factor builds), I’ve had to rely on third-party fan controllers like NZXT’s Grid+ v2 for that kind of control. With the ROG Strix GTX 1080, that all changes.  

The fan headers on the Strix card take some DNA from Asus’ best-in-class fan-control mojo. They can control both four-pin (PWM) and three-pin (DC) fans, and the headers automatically sense the type of fan that’s connected for plug-and-play simplicity. Fans connected this way are governed by the same fan curve as the graphics card’s own fans, though, so they can’t be configured independently. They’ll also shut off at idle, which might be an issue in powerful systems with only a couple of fans. Be careful.
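The control scheme Asus describes boils down to a single temperature-to-duty mapping shared by the GPU fans and any system fans on those headers. Here’s a toy sketch of such a curve with a zero-RPM idle zone — the breakpoints are invented for illustration, not Asus’ actual firmware values:

```python
# Toy model of a GPU-governed fan curve with a zero-RPM idle zone.
# Breakpoints are illustrative guesses, not Asus' firmware values.
CURVE = [(55, 0), (65, 35), (75, 55), (85, 100)]  # (temp °C, duty %)

def fan_duty(temp_c: float) -> float:
    """Return PWM duty (%) for a GPU temperature, interpolating the curve."""
    if temp_c <= CURVE[0][0]:
        return 0.0                      # fans stop entirely at idle temps
    if temp_c >= CURVE[-1][0]:
        return float(CURVE[-1][1])
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return d0 + frac * (d1 - d0)

# Any system fan on the card's headers gets the same duty as the GPU fans.
```

The upside of this arrangement is that case airflow scales with the component that actually produces the heat; the downside, as noted above, is that every fan on those headers rides the same curve.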

Though Asus doesn’t include a breakout box for VR hardware like some Gigabyte and EVGA cards do, it still offers a degree of VR-friendliness in its port cluster. Asus trades one of the GTX 1080’s three standard DisplayPorts for another HDMI 2.0 out. This two-and-two split between HDMI ports and DisplayPort outs means that Rift and Vive owners can plug in their headset alongside another HDMI monitor or TV without an adapter. For folks who want to show off their VR exploits in real time, that extra HDMI out might be handy.

To keep the GP104 GPU chilly, Asus uses a new version of the DirectCU cooler we’ve seen on many of its cards in the past. This cooler uses five copper heatpipes—only three of which touch the GPU itself—to carry heat away from the chip. From this angle, we can also see the six- and eight-pin PCIe power connectors that feed the card’s eight-plus-two power-phase setup.

                                     GPU base      GPU boost     Memory        Memory
                                     core clock    clock         clock         size
                                     (MHz)         (MHz)         (MHz)         (MB)
GTX 1080 Founders Edition            1607          1733          2500          8192
Asus ROG Strix GTX 1080              1670          1809          2500          8192
Asus ROG Strix GTX 1080 OC Edition   1759          1898          2500          8192

The ROG Strix GTX 1080 comes in two flavors: the “ROG Strix-GTX1080-A8G-Gaming” and the “ROG Strix-GTX1080-O8G-Gaming.” The main difference between the cards is one of clock speeds. The A8G card runs at 1670MHz base and 1809MHz boost, while the O8G card pushes further with a 1759MHz base clock and an 1898MHz boost speed. Both cards are clocked significantly faster than the 1607MHz base and 1733MHz boost speeds of Nvidia’s GTX 1080 Founders Edition.

Confusingly, Newegg sells the A8G card under its full Asus model name, while the O8G is sold as an “OC Edition.” Buyers should be careful to make sure they’re getting the Strix card they want. The A8G card sells for $709.99, and the OC Edition sells for $719.99. That’s not much more to pay for a significant increase in factory clock speeds. Both cards are priced far in excess of Nvidia’s seemingly wishful $599.99 suggested price for custom GTX 1080s, though. Let’s see what those dollars buy us when the bits hit the DirectX queues.

 

Our testing methods

As always, we did our best to deliver clean benchmarking results. Our test system was configured as follows:

Processor          Intel Core i7-6700K
Motherboard        ASRock Z170 Extreme7+
Chipset            Intel Z170
Memory size        16GB (2 DIMMs)
Memory type        G.Skill Trident Z DDR4-3000
Memory timings     16-18-18-38
Chipset drivers    Intel Management Engine 11.0.0.1155,
                   Intel Rapid Storage Technology V 14.5.0.1081
Audio              Integrated Z170/Realtek ALC1150
                   with Realtek 6.0.1.7525 drivers
Hard drive         OCZ Vector 180 480GB
Power supply       Seasonic Platinum SS660-XP2
OS                 Windows 10 Pro

 

                                             Driver            GPU base     GPU boost    Memory     Memory
                                             revision          core clock   clock        clock      size
                                                               (MHz)        (MHz)        (MHz)      (MB)
GeForce GTX 1080 Founders Edition            GeForce 372.54    1607         1733         2500       8192
Gigabyte GTX 1080 Xtreme Gaming              GeForce 372.54    1759         1898         2553       8192
Asus ROG Strix GeForce GTX 1080 OC Edition   GeForce 372.54    1759         1898         2503       8192

Like many graphics cards on the market today, the ROG Strix GeForce GTX 1080 comes with multiple clock profiles that one can enable in its GPU Tweak II companion software. Asus ships the card in “Gaming Mode,” which lets it run at 1759 MHz base and 1898 MHz boost speeds. Since most people will be using this card without its companion software, we left the card in Gaming Mode for our testing.

We’re pitting the ROG Strix GTX 1080 against Nvidia’s GeForce GTX 1080 Founders Edition and the Gigabyte GeForce GTX 1080 Xtreme Gaming card that we recently tested. All of the cards were run on an open bench for performance testing using the same Core i7-6700K-powered test system that’s served as the underpinnings of our recent graphics card reviews. We conducted noise and thermal testing inside a Cooler Master MasterCase Maker 5 ATX mid-tower to provide a sense of real-world performance on those measures. For each benchmark we run, we perform three test runs and take the median of the results.

Our testing methods are generally publicly available and reproducible. If you have any questions about our methods or results, be sure to leave a comment on this article or join us in the TR forums.

 

Out-of-the-box performance

If you’re looking for an in-depth take on how the GTX 1080 performs in our advanced “Inside the Second” frame-time benchmarks, you should go read our GTX 1080 Founders Edition review. We won’t be repeating that fine-grained testing here. Instead, we’ll be relying on the scripted benchmarks from Middle-earth: Shadow of Mordor and Rise of the Tomb Raider to gauge average frame rates at a variety of resolutions. These simple FPS-based tests should be good enough to separate the various GTX 1080s we have on hand.

Middle-earth: Shadow of Mordor

To put Middle-earth: Shadow of Mordor through its paces, we used the game’s internal benchmark and the Ultra quality preset. Click the buttons below the FPS graphs to see how the three GTX 1080 cards perform at three common resolutions.


Interesting. Despite their similar core clock speeds, the Gigabyte GTX 1080 Xtreme Gaming ekes out a tiny but consistent lead over the Strix. Perhaps the Xtreme Gaming’s higher-clocked memory is giving it the edge over the Asus card here. Both custom GTX 1080s open a wide performance advantage over the GTX 1080 Founders Edition card.

Rise of the Tomb Raider
Rise of the Tomb Raider is a recent, demanding title from our current graphics testing suite. We ran this game using the following settings:


Man. Far be it from us to call a winner here. Both custom cards are within about a frame per second of one another at all of the resolutions we tested. Both the Asus and the Gigabyte cards offer a nice performance boost over the Founders Edition card, to be sure, but whatever characteristic gave the Gigabyte card the edge in Shadow of Mordor isn’t being exercised here.

Noise levels

Our noise level measurements don’t make this race any less close. Going by pure dBA numbers alone, the Strix card is ever-so-slightly quieter than the already excellent cooler on the GTX 1080 Xtreme Gaming. To be perfectly honest, I doubt most people’s ears are sensitive enough to hear the difference in absolute noise levels between these cards while they’re operating. The Strix and the Xtreme Gaming cards both offer improvements over the GTX 1080 Founders Edition in this regard, as well.

dBA numbers never tell the whole story about noise, though. Despite its low dBA figure, the Strix’s trio of fans produces a higher-pitched whoosh than the GTX 1080 Xtreme Gaming’s 100-mm spinners do. Unfortunately, this sound isn’t entirely smooth: it has a slight buffeting roughness that scuffs an otherwise broad-spectrum noise character.

At their worst, the Gigabyte fans produce the slightest hint of a baritone note, but they usually don’t sound like much of anything in operation. That means the Xtreme Gaming card’s sometimes-prominent coil whine is often the only sound that’ll clue you in to the fact that it’s operating at speed. On the other hand, the Strix card produces barely any coil whine, so which card is “better” on a subjective basis will depend on whether you’re bothered more by the sound of air moving or by the occasional sound of electronics switching at high speed.

Now that I’ve heard it with my own ears, I think we may have been a bit harsh on the GTX 1080 Founders Edition cooler in our original review of that card. The fan on the Founders Edition is one of the best-sounding blower coolers I’ve ever heard, but it still has a certain whiny, hissy quality about it that doesn’t fade into the background as easily as the sounds of the Asus or Gigabyte cards do. Once again, the card isn’t exactly loud, but it’ll always make itself known in operation.

Power consumption

At idle, our test system is impressively frugal on power with all of the GTX 1080s we have on hand. The overclocked custom cards unsurprisingly consume slightly more power than the GTX 1080 Founders Edition, but that’s to be expected.

Under load, the GTX 1080 FE consumes significantly less power than both the Asus and Gigabyte cards. The factory clock speed boost on both of the custom cards seems to come at a significant cost for power consumption. The Strix holds about a 10W advantage over the Xtreme Gaming card while running Unigine Heaven, but the two are so closely matched in every other regard that we’d call this test a wash.

GPU temperatures and observed clock speeds

In our experience with Pascal chips, the GPU Boost 3.0 algorithm tends to boost clocks to a peak speed before settling into an equilibrium once the card has had a chance to heat up. To take that behavior into account, we ran the Unigine Heaven benchmark for 10 minutes before observing clock speeds and taking temperature measurements.
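Capturing that settling behavior amounts to logging clocks and temperatures at an interval — for instance, with nvidia-smi’s standard `--query-gpu` CSV output — and summarizing the trace afterward. A sketch of the summary step, with made-up sample readings standing in for a real log:

```python
# Sketch: summarize clock/temperature samples logged during a warm-up run,
# e.g. captured with:
#   nvidia-smi --query-gpu=clocks.sm,temperature.gpu \
#              --format=csv,noheader,nounits -l 5
# The sample lines below are illustrative, not our measured data.
samples = """\
1898, 62
1911, 71
1885, 78
1873, 80
1873, 81
"""

def summarize(csv_text: str):
    """Return (min clock, max clock, settled clock, peak temp) from CSV rows."""
    rows = [tuple(int(x) for x in line.split(","))
            for line in csv_text.strip().splitlines()]
    clocks = [c for c, _ in rows]
    temps = [t for _, t in rows]
    # The last few samples approximate the equilibrium clock after warm-up.
    settled = sum(clocks[-2:]) // 2
    return min(clocks), max(clocks), settled, max(temps)

lo, hi, settled, peak_temp = summarize(samples)
```

The interesting number for comparing cards is the settled clock, not the early peak, since the peak only lasts until the heatsink saturates.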

Unsurprisingly, both custom coolers handily outperform the GTX 1080 Founders Edition’s reference heatsink. Asus’ custom card delivers a 14° C drop over the reference design, while the Gigabyte card’s massive heatsink lets it run another 3° C cooler yet. If you’re concerned about thermal headroom for overclocking, the performance potential of either of these coolers should offer plenty of wiggle room for a chip that might be thermally limited.

We also observed the sustained boost clocks that each card reached under our Unigine Heaven load. After 10 minutes, the Strix settled into a boost speed of 1972MHz, while the Xtreme Gaming card seemed content to run at 1987MHz. Both of those speeds are far in excess of the 1898-MHz boost speed that Asus and Gigabyte list in the specs for these cards, so that figure is at best a very loose guideline of what to expect.

The Founders Edition card, on the other hand, has to modulate its clocks in apparent service to its thermal limit. Recall that Nvidia’s boost speeds are a range, not a ceiling. Despite Nvidia’s specified 1733-MHz boost speed for the GTX 1080, the card doesn’t just boost up to that speed and stop. Instead, we observed that the Founders Edition’s clocks tended to oscillate around that speed in order to keep temperatures in check.

After being loaded with Heaven for a while and reaching its 82° C maximum, the Founders Edition card occasionally boosted as high as 1772MHz, but it also dipped as low as 1696MHz. The handy GPU-Z utility can expose why a card is hitting a performance ceiling, and unsurprisingly, it showed that the Founders Edition card was hitting a thermal limit instead of a power cap.
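That oscillation is classic closed-loop behavior: the card sheds a boost bin whenever the die reaches its thermal limit and boosts again once it cools. Here’s a toy model of the feedback loop that reproduces the back-and-forth we saw — the constants are invented for illustration and are not Nvidia’s actual GPU Boost tuning:

```python
# Toy model of thermal throttling: the clock steps down when the GPU is at
# or above its thermal limit and steps back up when below it, producing an
# oscillation around an equilibrium clock. Constants are illustrative only.
LIMIT_C = 82

def step(clock_mhz: float, temp_c: float):
    if temp_c >= LIMIT_C:
        clock_mhz -= 13          # shed a boost bin to cool off
    else:
        clock_mhz += 13          # opportunistically boost again
    # Crude thermal response: higher clocks push temperature up, and the
    # cooler pulls temperature back toward the limit's neighborhood.
    temp_c += (clock_mhz - 1733) * 0.002 - (temp_c - LIMIT_C) * 0.1
    return clock_mhz, temp_c

clock, temp = 1733.0, 70.0
trace = []
for _ in range(200):
    clock, temp = step(clock, temp)
    trace.append(clock)
```

Running the loop, the clock overshoots while the heatsink is still cold, then settles into a bounded oscillation once the die sits at its limit — the same pattern the Founders Edition card exhibits under Heaven.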

In contrast, neither custom card needed to vary its boost speed while under load. After building up some heat, both the Strix and the Xtreme Gaming cards maintained their boost speeds without a hitch. While we don’t think most people will notice, that performance consistency could translate into smoother gameplay over time, and it’s another reason to consider custom-cooled GTX 1080s instead of the Founders Edition card.

 

The GPU Tweak II software

As any respectable graphics card maker does these days, Asus provides its own first-party management utility for the Strix GTX 1080: GPU Tweak II. Let’s take a brief tour of the software and see what kind of control it offers over the Strix.

When it’s first launched, GPU Tweak II presents users with the “Simple Mode” interface, along with a handy “GPU Tweak II Monitor” app that graphs most every performance parameter one might care about when turning the knobs and dials of a graphics card. I imagine most people will use the Simple Mode interface to switch between the Strix’s Gaming Mode and OC Mode clock profiles and to monitor the card’s temperatures.

A “Gaming Booster” button in this view promises to speed up one’s system by turning off visually intensive Windows features like Aero, disabling Windows processes and services, and running a “System Memory Defragmentation” routine. Gaming Booster doesn’t say which Windows processes and services it’s messing with before it runs, so I would never actually run this utility. Like many other “system optimizers” out there, this one seems to have too much potential to harm and too little potential to help. If you really care about rogue processes running on your system, the Startup tab in Windows 10’s Task Manager offers a much finer-grained look at those services and their impacts on performance.

The “Professional Mode” UI that’s invoked using the button in the lower right corner of the GPU Tweak UI offers many more options for manual tuning, including core clock speeds, core voltage, memory clocks, fan speeds, the GPU power and temperature targets, and a frame rate cap. These controls all work as you’d expect if you’ve used MSI’s Afterburner or a similar overclocking utility.

GPU Tweak also includes a subset of the extremely handy GPU-Z utility baked right into the app. If you want to know all of the basic specs of your graphics card, the Info tab will tell you all of that information and more.

The Tools tab serves as a launcher for Asus’ own Aura RGB LED control utility and for Xsplit’s Gamecaster utility, if you have it installed. Each shortcut will prompt you to install the associated app if you haven’t already. If you already have Xsplit or Aura installed, clicking on the shortcut will launch the app as you’d expect.

The Aura utility

Unlike Gigabyte’s Xtreme Gaming utility, which condenses both tweaking and lighting controls into one app, the Aura app that controls RGB LEDs on Asus hardware is an independent utility that needs to be downloaded and installed separately.

Once it’s running, the Aura app offers most of the knobs one might need to control RGB LEDs to the fullest, including a temperature-sensitive mode that varies the colors of the Strix’s RGB LEDs according to its load temperature and a “music” mode that puts on a psychedelic light show in time with the tunes of your choice. The color wheel that the Aura UI presents for picking a given color isn’t particularly well-aligned with what shows up on the card, however, so some trial-and-error might be needed to get the shade you want to actually appear on the Strix.

Some colors—like pure white—result in the expected white on the ROG logo, but a violet purple elsewhere on the card. Asus says this is a limitation of the RGB LEDs themselves, and other reviewers got similar results when they played with the Aura software. I also think Asus could make life easier for the truly RGB LED-obsessed by offering hexadecimal entry or separate red, green, and blue fields with values from zero to 255, as some other RGB LED control utilities do, for ultra-fine tuning of the colors that show up on the card.
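For reference, the conversion such entry fields would perform is trivial. A minimal sketch (the function names are my own, not part of Aura):

```python
def hex_to_rgb(value: str) -> tuple:
    """Convert a hex color like '#8A2BE2' to an (R, G, B) tuple of 0-255 values."""
    value = value.lstrip("#")
    if len(value) != 6:
        raise ValueError("expected a six-digit hex color")
    return tuple(int(value[i:i + 2], 16) for i in (0, 2, 4))

def rgb_to_hex(r: int, g: int, b: int) -> str:
    """Convert 0-255 channel values back to a '#RRGGBB' string."""
    return "#{:02X}{:02X}{:02X}".format(r, g, b)
```

With fields like these, dialing in an exact shade by number beats eyeballing a misaligned color wheel.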

Overall, Asus’ software package for the Strix is about as good as any I’ve used for graphics-card tuning, and users will find most everything they need to extract all the performance and bling they might want from their Strix cards.

 

Overclocking and the silicon lottery

In a normal graphics card review, this is where I’d share the results of our efforts to wring out every last Hertz of clock speed from our Strix. Sadly, that won’t be the case today. We lost the silicon lottery with the particular GP104 GPU that’s soldered to this GTX 1080.

As we’ve already seen, Nvidia’s GPU Boost technology can make the marked clock speeds on recent GeForces an understatement of their actual capabilities, and that’s especially true of Pascal-powered cards. To establish the actual clock-speed baseline for these cards, we’ve taken to running Unigine Heaven for 10 minutes to let cards heat up before making any records of observed clock speeds.

Since the Strix card comes with multiple clock speed profiles baked into its firmware, we figured we’d start our testing by observing the clock speeds the card reached in its default Gaming Mode and the slightly-boosted OC Mode before moving to manual tweaking. Though Gaming Mode was perfectly stable in our tests, switching over to the baked-in OC Mode immediately led to hangs and artifacting, suggesting our particular GP104 chip didn’t even have the headroom to handle that modest overclock.

This instability was surprising to see. I’ve never used a graphics card that couldn’t maintain stability with its factory clock profiles, so I asked Asus about this behavior. The company told me that while its Gaming Mode profiles are guaranteed to be stable, OC Mode stability may vary from card to card. Asus suggests that if OC Mode doesn’t work, owners should just flip back to Gaming Mode.

To be honest, that response surprised me—no such disclaimer is in evidence on any Asus product or retail page that I can see. What’s worse, Newegg (and other retailers) present that OC Mode clock speed as the one the card will hit, not as a provisional bonus. Furthermore, Asus has sent out cards to reviewers in the past that were locked into OC Mode by default. If OC Mode isn’t guaranteed for stability on Asus cards, that decision would mean the cards in reviewers’ hands might not reflect the performance buyers will get off the shelf—even if it is just a tiny difference. We’d urge Asus to clarify its position on just what OC Mode means for its graphics cards so that buyers don’t end up disappointed if their cards can’t hit those clocks.

After our initial discovery, I tried to nurse this particular piece of GP104 silicon to higher clocks with power, temperature, and voltage limit increases, but our chip just wasn’t having it. Even single-digit clock speed increases over the Gaming Mode defaults introduced instability, and that behavior persisted even with artificially high fan speeds that ruined the Strix’s noise character and drove temperatures far below the typical load levels we’ve seen on GTX 1080s so far. Even a careful application of premium thermal paste didn’t help. At that point, we wrapped up our overclocking efforts.

To be clear, we don’t think these results are a black mark on the Strix GTX 1080’s design or quality. Chip-to-chip variance is a fact of life with all semiconductor products, but this is the first time we’ve run into such a dud. It’s possible that with different luck, one might see better overclocking results with a Strix, and both TechPowerUp and HardOCP had better luck with their Strixes. We just can’t test that headroom with our particular sample, so caveat emptor.

 

Conclusions

Given our past experience with Asus’ top-end graphics cards, we had high expectations for the ROG Strix GTX 1080 when it landed in our labs. For the most part, the card lives up to the high standards Asus has set with its past ROG products, but we do have a couple nits to pick.

For one, the ROG logo on the side of our GTX 1080 actually fell off from the heat dissipated by the card in its passive mode. Asus assures us that this ungluing is an anomaly of an early-production sample. The card’s RGB LEDs also don’t produce a pure white—set that color in the Aura software, and you’ll get a pale purple instead. Asus tells us that an imperfect white is inherent to RGB LEDs. The Gigabyte Xtreme Gaming card we recently reviewed can’t do a clean, true white with its RGB LEDs, either, so we’ll let this one slide.

We’re less forgiving of the “overclocking” experience we had with the Strix. We know that manual overclocking is always a luck-of-the-draw exercise, but baked-in clock speed profiles have always seemed to offer an implicit guarantee of extra performance to us, regardless of manufacturer. That sense is furthered by the fact that both Asus and online retailers make no disclaimers about whether users will be able to safely activate this (or any other card’s) OC Mode. We were a little surprised to learn that Asus doesn’t actually guarantee the stability of the Strix’s OC Mode, and it suggests that users should simply use Gaming Mode if the more aggressive clock speed profile demonstrates instability.

If OC Mode isn’t actually guaranteed to be stable, we think Asus could avoid some headaches by coming out and saying as much. A handful of negative Newegg reviews and Asus forum posts suggest we’re not the only folks who have had trouble with the OC Mode on Strix cards, and even a small subset of owners disappointed with the performance of their $720 video cards seems like a problem a company would want to avoid in the highly competitive market for GTX 1080s.

Even if one draws a short straw in the silicon lottery, the Strix’s factory clock speed boost in Gaming Mode is still more than enough to let the Strix soar over the GTX 1080 Founders Edition card in our performance tests. The triple-fan Strix cooler let this custom card consistently hold clock speeds above and beyond Asus’ specified boost speed for the vast majority of our testing, and the low noise levels and polite noise character of the Strix cooler make this card an easy upgrade pick over the Founders Edition GTX 1080, as well.

On balance, the ROG Strix is a solid take on the GTX 1080 with some innovative features builders won’t find anywhere else, especially its GPU-controlled system fan headers. Those fan connectors are great to have on a graphics card, and Asus deserves high praise for the convenience they offer.

The problem for the Strix is that Gigabyte’s superb GTX 1080 Xtreme Gaming card is just as fast, it runs cooler than Asus’ effort, and its heatsink sounds better at speed. The Gigabyte card also includes some nice accessories in the box, and it actually sells for less money than the Strix when it’s in stock. Even better, Gigabyte tells us that it tests the stability of its OC Mode clock profiles at the factory, and it’ll accept an RMA if one of its cards demonstrates instability in any of its clock speed profiles. That’s the kind of support we’d expect from a $700 video card, and we think that unless you’re willing to gamble with the silicon lottery and a Strix card, that added assurance is worth the wait if the Xtreme Gaming card is out of stock.

Comments closed
    • bfar
    • 3 years ago

    As always, many thanks for the review and conclusion.

    I look at the performance increase, and while it isn’t earth shattering, I’d take it. Then I look at the noise and temperature reductions and I’m liking those too. Then I see the premium the likes of Asus are asking for and I just lol, because while the improvements are welcome, the stock experience just isn’t that far removed. The product is good, but the price isn’t. Add to that the stock issues with these cards in Europe, and we’re really onto a dud. Someone should call these guys out.

    • Krogoth
    • 3 years ago

    Protip: Overclocking has always been “Your results may vary”. You sometimes get golden eggs or flat-out duds that barely work at “stock” speed and voltages.

    • Dr_Gigolo
    • 3 years ago

    Reading this review I am just all the happier that I sprung for an EVGA ACX 3.0 1080 card for around 630USD at launch day here in Europe.

    Lucky for me the ACX 3.0 wasn’t available after about 3 weeks of waiting, so I got an SC version instead. It runs overclocked at about 2GHz with a low fan speed (30-40%) and a GPU temp of around 80 degrees C. I can’t really hear my gaming PC running 1440p with my Asus PQ297 playing Witcher 3 and whatnot.

    All these high end cards are just pure crap and marketing. Maybe the hybrid-type cards fare a little better, but as long as you are a victim of the silicon lottery, there is no guarantee that you will get a good OC out of your 800-dollar GTX 1080 compared to a ~600-dollar one.

    • torquer
    • 3 years ago

    I snatched one of these up during one of the 30 second windows of availability right after launch. Overall pretty happy with it and loved the fan connectors on the front. Brilliant.

    I’m not a huge fan of blinkenlights but it was cool to be able to sync with the lights on my X99A II. In the end though I sold it and picked up an eVGA 1080 hybrid. My temps on the Asus card were around 74°C under load and the max sustained boost clock I saw was about 1958MHz. With the eVGA Hybrid I’m getting load temps maxing at 50°C and sustained boost clocks of 2114MHz. This is all in The Division, my current game of choice, running at 3440×1440.

    • Skid
    • 3 years ago

    ASUS is off the list. For the money they’re charging, I expected *way* better.

    • Gastec
    • 3 years ago

    Thumbs up for the nice review, short and to the point. Point being, like Dexter said: “I won’t pay, I won’t pay ya, no way” What do they think, that I’m crazy or something? They can shove it where the sun doesn’t shine.

    • End User
    • 3 years ago

    I still can’t fathom your 1080 FE results.

    My OC’ed 1080 FE goes no lower than 1949 MHz. Memory is at 11 Gbps. Temp holds steady at 72 ° C. I’m using a custom fan curve. Fan noise is not an issue as I’m using a Corsair case with sound deadening material.

    It may be down to the silicon lottery but I’ve read reviews of the stock RX 480 and the settings for that card have to be massaged before its true performance emerges.

      • Jeff Kampman
      • 3 years ago

      [quote<]My OC'ed 1080 FE[/quote<] Yeah, we're not comparing apples to apples anymore.

        • floodo1
        • 3 years ago

        He meant your overclocking results. My FE also runs over 2GHz under load, which is relatively common, so he’s surprised yours OCs so poorly.

          • Jeff Kampman
          • 3 years ago

          I…didn’t OC it. Maybe we’re referring to the GPU Boost numbers?

            • End User
            • 3 years ago

            Is that not a factory OC? The ASUS cards have much higher boost clocks than a stock 1080.

            • K-L-Waster
            • 3 years ago

            There’s a huge difference between enabling a factory OC and developing your own OC profile. Most users do not attempt to create their own.

            Completely different type of review.

            • End User
            • 3 years ago

            A completely different type of review done by completely different review sites and channels. One that has proven to be very useful to many video card enthusiasts.

            • derFunkenstein
            • 3 years ago

            Clearly you just need to feel good about your impulse control and the fact you way overpaid for something.

            • End User
            • 3 years ago

            The 1080 FE was a mere pittance of overpricedness. I’m about to drop $1,700 on an iPhone 7. The kicker is that I plan on ditching the 7 for the 8 next year. Let the burning of money continue!

        • End User
        • 3 years ago

        Weird response. Can you clarify? The cards you tested are OC’ed and the 1080 FE results you posted are wildly different than what I see in real world usage. You even attempted to further OC the cards in this review yet you did not in the 1080 FE review.

        From your original 1080 FE review:

        “While the GP104 chip itself might have plenty of overclocking potential on tap, we’d be wary of trying to push it too far with the Founders Edition cooler given our stock-clocked results. Plenty of Nvidia’s board partners are now selling GeForce GTX 1080s, and the custom coolers on those boards might unlock the thermal headroom one would want to really push the clocks skyward. For now, we’re reserving judgment on the GTX 1080’s overclocking prowess.”

        I OC’ed my 1080 FE to a stable 1949 MHz with temps 10 °C cooler than your stock 1080 FE (granted you may have conducted testing in a volcano). Obviously my 1080 FE will be louder than either of the ASUS cards but it is not even remotely an issue.

        From a consumers point of view your 1080 FE review is nowhere near what I see in the real world. It is fortunate I bought my 1080 FE before I read your review otherwise I would have missed out on a good thing.

        I’m guessing that you will continue to use your weird 1080 FE results in future articles and that bums me out. It’s kinda like that “never meet your heroes” thing.

        Here is an idea. Send me your 1080 FE and let me test it out. I promise to send it back. 🙂 Or, better yet, play around with the settings of that 1080 FE and see what happens.

          • Jeff Kampman
          • 3 years ago

          We don’t seem to be on the same planet here.

          We made no attempts to overclock any card in this review above and beyond what GPU Boost 3.0 automatically extracts from each product.

          Our Founders Edition card was operating within stock parameters, and so were the Gigabyte and Asus cards, even if those cards do have factory clock speed boosts. We made no modifications to power limits, temperature limits, or any other parameter.

          Other sites have noted that the Founders Edition card has an 82°C thermal ceiling under load in its stock configuration as well, so the fact that our testing shows that’s the case isn’t unusual.

          You freely admit that you’ve overclocked your card and modified its fan curve. While I’m not denying the card can deliver higher performance when overclocked, it means it’s no longer a valid point of comparison for the out-of-the-box experience any of these cards can deliver. Sorry.

            • End User
            • 3 years ago

            Out-of-the-box settings have been shown to be less than optimal with both the 1080 FE and the stock RX 480. Stock settings can lead to overheating and throttling, which is exactly what occurred in the TR review of the 1080 FE. If solutions are potentially there, why not investigate and share? Is that not the point of a review?

            • Jeff Kampman
            • 3 years ago

            Less-than-optimal by whose definition? Nvidia’s engineers tuned the Founders Edition card to deliver exactly the performance that it does. Far be it from me to say that their choices are “less-than-optimal” when that’s how most users will experience the product.

            We have always tested graphics cards twice in reviews like this: once in their out-of-the-box configuration and then a second time after manual tuning. Since the Asus card that’s the primary subject of this review didn’t have any overclocking headroom left, I didn’t carry out that testing on the other two cards when it wasn’t germane to the review any longer. The topic of this article isn’t “Overclocking the GeForce GTX 1080 Founders Edition.”

            • End User
            • 3 years ago

            Less-than-optimal as in Nvidia’s stock tune of the 1080 FE lends itself to throttling. AMD appears to have a less-than-optimal stock tune for its RX 480 as well. A few tweaks to the profile and bam, no more throttling at stock clocks (as discovered in other reviews).

            I used my OC’ed 1080 FE for comparison as it was a real-world example that I was familiar with, one that demonstrated vastly lower temps while supporting a substantially higher clock without resorting to a screaming fan.

      • DPete27
      • 3 years ago

      [quote<]RX480....settings for that card have to be massaged before its true performance emerges.[/quote<] [quote<]My OC'ed 1080 FE[/quote<] [quote<] I'm using a custom fan curve[/quote<] [quote<]case with sound deadening material[/quote<] You've said it all my friend. You've said it all.

        • End User
        • 3 years ago

        Real world usage my friend. Real world usage.

      • Krogoth
      • 3 years ago

      [sisko facepalm.jpeg]
      Not every piece of silicon can overclock to the moon, even if you throw a ton of volts and exotic cooling at it. It has always been part of the overclocking game.

      [/sisko facepalm.jpeg]

        • End User
        • 3 years ago

        You are completely missing the point (as per usual). Kampman missed the point as well. It has nothing to do with overclocking. Custom settings can make a big difference in the performance of a stock-clocked GPU.

          • Krogoth
          • 3 years ago

          Moving goal posts again?

          Dude, I know you want to think every piece of GP104 silicon can effortlessly overclock to the moon, but in the real world this may not be the case. It is called the overclocking lottery for a reason. I have dealt with duds before, and so have countless other overclockers. Just because you never personally ran into one doesn’t mean that they don’t exist.

          The performance differences at stock speeds are negligible at best. The “slightly higher numbers on the Gigabyte unit” can be attributed to statistical anomalies.

          • Meadows
          • 3 years ago

          I don’t know what these “custom settings” are but apparently Jeff tried every OC-related setting under the sun and the GPU still wouldn’t budge.

    • I.S.T.
    • 3 years ago

    So, why was the card faster on one test, but slower on another?

      • Jeff Kampman
      • 3 years ago

      I explain as much in the piece. The Gigabyte card I have here runs its memory faster than the Strix card does out of the box, and I believe that difference matters in Shadow of Mordor.

        • I.S.T.
        • 3 years ago

        I must have missed this, sorry.

    • GreatGooglyMoogly
    • 3 years ago

    The main reason this card was not a finalist for me was the decision to go with the nonstandard 2 DP/2 HDMI setup. I use a triple-monitor setup with G-SYNC, and this is a no-go. Going from DP to HDMI, rather than the other way around, is also a lot more compatible in my experience.

    Luckily, the EVGA 1080 FTW that I ended up getting had faster turbo clocks (1999.5 MHz) and was 100 bucks cheaper anyway.

    • HERETIC
    • 3 years ago

    Just been looking at some 480 and 1060 cards here in Aus.
    Strix cards carry a $100 premium over Asus’ basic cards, about 25%.
    Is that cooler worth the extra?
    I thought most manufacturers binned their chips and the dearest (Strix) got the better chips. Seems not.

      • tay
      • 3 years ago

      I don’t think chips are binned to the extent people think they are. I don’t think they’re binned at all to be honest.

        • Gastec
        • 3 years ago

        They used to “bin” the chips. Five or six years ago, when their brains were not overheated by greed, and when people didn’t buy every piece of… for any price.

    • cynan
    • 3 years ago

    The suspense is killing me! What was the OC profile clock speed that was tried that this card was unstable at?

      • ImSpartacus
      • 3 years ago

      Yeah, kinda weird how this article omitted certain things like that.

      • rahulahl
      • 3 years ago

      They kinda said it was unstable if they raised the clocks even by single digits from the gaming profile.

      • Gastec
      • 3 years ago

      I don’t think it matters, as that’s the limit of the chip in that Asus card. If we are to believe what Nvidia is said to have demonstrated during its DreamHack keynote (I’m rofling, what can I say :~| ), the GP104 will reach 2.1 GHz on air. That’s Antarctic air. That being the case, and of course Nvidia never lies about these things, we can only conclude that the GP104 chips in the Asus ROG Strix GeForce GTX 1080 come from the recycle bin. You get it? They’ve been binned.

      • SsP45
      • 3 years ago

        I have the same card. The profile is literally just called “OC”. On mine with that profile selected, I get boost speeds of around 1890 MHz. It varies by game, obviously, but that’s the normal speed. It won’t go above 1900 MHz.

        • Devils41
        • 3 years ago

        The Tweak software is horrible to overclock with. I tried to adjust my profile but ended up only being able to get 2088 MHz on the core. I ended up using EVGA’s OC utility for mine and for some reason was getting more stable overclocks. I can’t explain it, but it worked. Have you bumped your power target up to 120% and set the throttle on temps rather than the power target?

          • SsP45
          • 3 years ago

          I had massive instability problems just by having the EVGA software installed.

          Right now I’m having problems with performance in any intensive games (getting drops to mid-30s FPS in Battlefield 1 and Project CARS). I don’t want to OC until I figure out those issues.

    • ddarko
    • 3 years ago

    Thanks for the review. I do wish the software section were more robust. As it stands, it’s more a description of the software than an evaluation of how well it does or doesn’t work. I imagine testing methodology makes evaluating bundled software very challenging, but I feel like this is an area tech reviews are missing at the moment.

    A lot of my hardware has been and is from Asus: my current Z97 and former Z68 motherboards, my current GTX 980, and my prior Radeon 7970. With all of them, I have found the software to be the weak spot. It has invariably been very buggy, and the bugs persist revision after revision. I’ve tried GPU Tweak II, GPU Tweak, AI Suite II, and AI Suite, and all of them have been a source of tremendous frustration. It’s an experience others share in Asus’ own forums and in threads devoted to their products, but it never seems to get reflected in formal product reviews.

    • chuckula
    • 3 years ago

    The lack of OC headroom was unfortunate, but at least Asus sent you a card that ran in the same default mode as any retail card, so no BIOS shenanigans. They also didn’t cherry-pick the review sample (or they didn’t do a good job of it if they did).

      • trek205
      • 3 years ago

      As I just mentioned, it was the driver, not the card itself, that had issues with overclocking.

        • stefem
        • 3 years ago

        After your post, Jeff Kampman tried with the new driver and apparently ended up with the same results, so it’s probably just a normal, non-awesome chip (it’s factory overclocked, so I wouldn’t call it a bad chip).

    • Devils41
    • 3 years ago

    I was waiting on an EVGA FTW 1080 but ended up getting the Strix on a whim when it was in stock. I paid $670, but this was right at its release; I know price gouging has gotten worse as time has gone on. I must have managed to win the silicon lottery with mine, since it is stable at a 2114 MHz OC (Proof: [url<]http://www.3dmark.com/fs/9896326[/url<] ). I've had the card for almost two months now and have no complaints.

      • ImSpartacus
      • 3 years ago

      Thanks for sharing. That’s a valuable data point.

      It’s going to be pretty cool to see the next gen g#104 part once the process matures and clocks rise. Pascal is clocking out of the park.

    • trek205
    • 3 years ago

    The 372.54 driver was the cause of your overclocking issues. Those drivers screwed up overclocking for many people, and there’s even a Reddit thread about it. I now have my normal overclocking back after upgrading to the latest Nvidia drivers that came out a couple of days ago.

      • homerdog
      • 3 years ago

      Yeah, I had no prior knowledge of issues with 372.54, but from reading his results it was clear to me this was a driver issue. Single-digit increases cause horrible instability? That has literally never happened on any GPU ever. I’m genuinely surprised he didn’t realize this, or even mention the possibility that it could be a driver bug.

        • synthtel2
        • 3 years ago

        A friend’s 7770 has, as far as we can tell, absolutely no OC headroom. +5% was an immediate no-go, and at +3%, Heaven/Valley would usually not last more than an hour. Stock does check out fine, though.

      • Jeff Kampman
      • 3 years ago

      Thanks for bringing this to my attention, but the card is just as crashy with 372.70 as it was with 372.54. Bum chips do happen.

    • Meadows
    • 3 years ago

    I am amused by how hard you’re trying to sell the whole LED thing of late. This might be the third review where I read a copy-pasted thought along the lines of “but I like how I can coordinate the colour with my other LEDs”. Almost as if you were given a reviewer’s guide for selling this stuff.

    Not insinuating it of course, just suspicious.

    Correction: not all of those may have been reviews, I might have seen that line in one of the regular news postings too.

      • RAGEPRO
      • 3 years ago

      Or I mean. Maybe he’s being facetious?

        • Meadows
        • 3 years ago

        You might think that, but after you see it written with a straight face for the third time, you sort of lose faith in that explanation.

          • RAGEPRO
          • 3 years ago

          What are you expecting him to put, a [url=https://www.youtube.com/watch?v=O4maBrc_UII<]winky face?[/url<]

      • Waco
      • 3 years ago

      So you’re saying that if you cared about color matching, it wouldn’t be something you’d want mentioned in a review?

      I’m confused. If you don’t care, don’t care about it. If you do care, it was useful to you. The review wouldn’t be complete without mentioning that feature.

        • Meadows
        • 3 years ago

        No, no. It’s just this new thing I noticed very recently. So new that I suspected it might have been suggested.

          • DPete27
          • 3 years ago

          Or just the fact that every single flagship computer component lately has RGB LEDs: graphics cards, motherboards, keyboards, mice, etc. As far as graphics cards are concerned, that’s a pretty new feature (the Pascal/Polaris generation), so yeah, it’s worth mentioning, because it’s a feature that manufacturers are clearly putting time and effort into.

      • ImSpartacus
      • 3 years ago

      Yeah, I think this kind of audience will almost always prefer a classy, understated look. While I respect that reviews should have some subjective opinions in there, I don’t really care to hear about how I can color-code all of my gear in the same obnoxious lime-green light. I’ve just kinda grown out of that sort of thing.

      • Ifalna
      • 3 years ago

      Like it or not, it’s a feature of the product and that should be mentioned.

      Also, what’s wrong with liking RGB lighting? I like it too, in certain places.

        • DPete27
        • 3 years ago

        I don’t search out RGB LED components, and I certainly wouldn’t pay more JUST for that ability, but I certainly appreciate color-matched components. It makes a build look more professional.

        I’d prefer a multi-position switch on the card itself that cycles between major colors (red, orange, yellow, green, blue, pink, purple) rather than having to do that in software.

        I’m not sure I’d bother running some worthless LED color software in the background so my GPU lights match the rest of my build inside a case with no windows. But if my case had a window (or for a RGB keyboard/mouse), then yeah, I’d definitely run with my desired LED color. I HATE blue LEDs with a vengeance. And since that’s the default LED color in the electronics industry, if I can change that color via software rather than having to physically swap LEDs myself, then I appreciate the convenience.

          • Ifalna
          • 3 years ago

          Well if the software needs to run continuously, that’s shoddy design. The card should remember the setting w/o the software.

          Reminds me of my Thermaltake power supply. It can glow green, blue, and red. Problem is, after each shutdown you had to press a button on the PSU itself to change the color again.

          DafuQ?!

          Well the default blue LEDs are dead now after 5 years anyway, so um yah. That problem basically solved itself. 😀

            • Jeff Kampman
            • 3 years ago

            The color (and animation) modes are set-and-forget. You might need the Aura software to take advantage of the “music” and “temperature” modes, but the basic color settings stick without Aura installed.

          • rechicero
          • 3 years ago

          LEDs make anything look more professional? I’d say the lack of LEDs does that much better. Well, maybe not in LED TVs or monitors, OK.

      • chuckula
      • 3 years ago

      Tilting at windmills Meadows?

      • Jeff Kampman
      • 3 years ago

      Yeah, my Asus paymasters sure did tell me to highlight the RGB LEDs on a card that otherwise received a rather negative review. Sure thing, Alex Jones. Go re-apply your tinfoil.

        • torquer
        • 3 years ago

        Hey isn’t it our job to make fun of Meadows? Back to your typewriter! :p

        • jihadjoe
        • 3 years ago

        Everyone knows RGB LEDs are the most important feature on any PC component nowadays. More than makes up for the little detail of the card failing to hit OC-mode clocks.

        TR needs to properly measure and benchmark RGB LED performance because this is how buyers of the future are going to pick their parts.

      • llisandro
      • 3 years ago

      [quote<] Not insinuating it of course, just suspicious. [/quote<] Comments like this boggle my mind. If you're in the TR comments section, you're well aware of: 1) the fact that TR has had trouble getting pre-launch access to cards; 2) the relationship between a tech review site like TR and a manufacturer giving out samples of its product to help show how its tweaks, bells, and whistles differentiate its card from the reference design everyone's working from; and 3) if you're on this site, you're also most likely an expert in your respective field, and if someone blithely accused you of impropriety at your job, you'd be pretty miffed. I just don't understand how someone who takes time out of their busy day to frequent the comments section here isn't rooting for TR to succeed. :/ [url=https://www.flickr.com/photos/mlleghoul/14335661842/<]relevant link[/url<]

        • Meadows
        • 3 years ago

        What does any of this have to do with LED lighting? I had a suspicion and I already received a proper answer from Jeff.
