EVGA GeForce GTX 1080 Ti SC2 Hybrid is one cool customer

AMD's Radeon R9 295 X2 was the first card to demonstrate that hybrid coolers atop graphics cards can offer the best cooling performance this side of a custom liquid-cooling loop. Along those lines, EVGA just announced the GeForce GTX 1080 Ti SC2 Gaming Hybrid. This card combines a 120-mm closed-loop liquid cooler with a small axial fan to produce what EVGA says is its coolest and quietest GeForce GTX 1080 Ti yet.

EVGA says the card will stay cool under pressure despite speccing it for a base GPU clock of 1556 MHz and a nominal boost clock of 1670 MHz—a substantial increase from the reference speeds of 1480 MHz base and 1582 MHz boost. The company claims the card will keep its GP102 GPU below 50° C under load, an impressive figure indeed. We'd expect it to boost quite a bit higher than 1670 MHz, too, given the thermal headroom apparently available.

EVGA designed the cooler for this card with a unique plate that provides contact between the liquid cooler's all-copper block and the card's 11 GB of GDDR5X memory. The GTX 1080 Ti SC2 Hybrid is also equipped with EVGA's iCX thermal monitoring suite, which uses nine thermal sensors spread across the board to make sure components are staying frosty. 

EVGA is asking $810 for the GTX 1080 Ti SC2 Gaming Hybrid. If this is the GeForce GTX 1080 Ti you've been waiting for, you can sign up for availability notifications on the company's website. EVGA is also giving one away through a social media contest. This page has all the details.

Comments closed
    • torquer
    • 2 years ago

    I bought their hybrid kit to install on my FE card, works great.

    • emorgoch
    • 2 years ago

    I find it interesting that the clocks on this are equivalent to the SC2 model, but a couple notches behind the FTW3. Considering that it has a listed price $30 above the FTW3 ($780 vs. $810), I would have hoped that it would have had the same specs. Not that the listed clock speeds are worth a damn any more.

      • ImSpartacus
      • 2 years ago

      They did something similar with their equivalent 1080. They had multiple water cooled variants.

    • Kretschmer
    • 2 years ago

    This looks crazy awesome, but why the VGA port? How many 2017 1080Ti gamers are still reliant on a DVI display as part of their monitor mix? I’d rather see a hozjillion DP ports.

      • Voldenuit
      • 2 years ago

      Huh? I see 3 DisplayPorts, 1 HDMI and 1 DVI.

      That looks like a perfect port setup to me. I don’t even care if the DVI is a DVI-I or DVI-D.

        • invinciblegod
        • 2 years ago

        If you notice, the right side of the DVI port has a dash with no pins above and below it, so no VGA support!

      • Anovoca
      • 2 years ago

      Speaking from experience, most people, regardless of video card, are at the mercy of what ports their displays will accept. DP might be superior, but if you spent all your money on your 1080Ti and got a cheap Korean 1440 IPS display to pair with it, you might only be able to connect with DVI.

        • Kretschmer
        • 2 years ago

        Understood, I just don’t expect someone to have top-tier GPU tech and 5-year-old monitor tech.

          • continuum
          • 2 years ago

          More like 18 years old!

          • Thresher
          • 2 years ago

          I use a KVM. There are no good ones that work with DVI or HDMI and the ones that are available are very expensive.


            • psuedonymous
            • 2 years ago

            A lot of monitors have a ‘built in’ KVM. e.g. Dell U3415w: it has two upstream USB ports that can be assigned to either of its inputs, so when you switch inputs the devices attached to the monitor switch to that USB host.

        • JustAnEngineer
        • 2 years ago

        DisplayPort to DVI-D active adapters are a thing.

    • Anovoca
    • 2 years ago

    +3 to EVGA Design team in the last few years. Glad to see at least someone out there gets it.

      • Redocbew
      • 2 years ago

      Between their power supplies and their video cards, I’m afraid I’m becoming a bit of an EVGA fanboy.
