Asus fires a volley of GeForce RTX 2070 cards

Right now, most gerbils are probably talking about Nvidia's high-end GeForce RTX 2080 Ti and RTX 2080 graphics cards. Those pixel-pushers command pretty high prices, though, and we figure that more than a couple of you are waiting for the more sensibly priced RTX 2070. Asus knows that, and it's displaying its stable of GeForce RTX 2070 cards to the world.

Asus' RTX 2070s come in three main variations: ROG Strix, Dual, and Turbo. The ROG Strix cards have a three-fan, three-slot wrap-around cooler with a baseplate that Asus says is ten times flatter than "traditional heat spreaders." The company says the "wing-blade" fans atop the Strix cards use smaller fan hubs than previous generations' and are more resistant to dust ingress. The Strix cards also include onboard RGB LED lighting and two four-pin fan headers so that budding enthusiasts can have their system cooling easily respond to GPU temperatures.

The Dual GeForce RTX 2070s, as their name implies, use a dual-fan cooler. The whole apparatus still takes up three case slots, and Asus remarks that its fin surface area is 50% larger than that of the corresponding previous-gen cooler. Like the ROG Strix offerings, the Dual cards also have a metal backplate.

Those looking to add a substantial amount of RTX firepower to Mini-ITX and other compact builds also have something to look forward to. The Asus Turbo RTX 2070 uses a blower-style cooler with a generously sized 80-mm axial fan (up from 60 mm in previous-gen cards). The company says it's tweaked both the shroud and I/O plate designs so that both air intake and exhaust are improved.

As with the rest of the manufacturers' GeForce RTX announcements, there's no word on clock speeds for any of the cards. The ROG Strix GeForce RTX 2070 comes in regular, Advanced, and OC variations. Likewise, the Dual RTX 2070s come in standard trim, Advanced, and OC versions. As for the Asus Turbo GeForce RTX 2070, there's but a single take on it.

Asus says its GeForce RTX 2070s will be available in Q4 2018. There's no word on pricing yet.

Comments closed
    • Billstevens
    • 1 year ago

    Now if they would just cave and support freesync I would buy one……

    • Firestarter
    • 1 year ago

    How good are these “dual” coolers anyway? IMO the ROG Strix cards are often overpriced but these Dual cards seem pretty reasonable

    • jihadjoe
    • 1 year ago

    [url<]https://xkcd.com/606/[/url<]

    • Bensam123
    • 1 year ago

    Also looks like the same sleeve fans from the 1XXX series. Expect them to last about 1 1/2 years before the first fan failure, closer to 2 for an average user. Sooner if you have dust problems.

    Just going to edit in here: these are almost the exact same fans the 9XX series used, which I talked about back when they were releasing variants of those. It’s exactly the same.

    $500-$1200 cards with sleeve bearings.

      • K-L-Waster
      • 1 year ago

      Under what use case? Gaming a few hours a day, or mining 24/7?

      (Not trying to be snarky, just making sure we’re clear on the variables.)

        • Bensam123
        • 1 year ago

        It’s a sleeve case fan. Take how long that applies to you and then apply it to a GPU. However, these are better than Gigabyte’s fans. I’ve had sleeve case fans that last longer.

          • K-L-Waster
          • 1 year ago

          Yeah, uh, don’t recall ever having a case fan or GPU fan fail on me.

          But then, I don’t mine.

      • hansmuff
      • 1 year ago

      I don’t have dust problems but my Strix 1070 is going to be 2 years old soon. Hmm. Is this really a huge problem or just anecdotal evidence?

        • Bensam123
        • 1 year ago

        Considering I have access to a couple hundred data points and you have one (more if you consider each card has multiple fans)… Guess you actually have to test the cards over a long duty period in order to actually benchmark something.

      • Devils41
      • 1 year ago

      I’ve had my Asus Strix 1080 since it launched in 2016, so just over 2 years now, and I’ve had no issue whatsoever. Is this a known issue? Not questioning you, just curious.

        • Bensam123
        • 1 year ago

        Look into the mining community, and yes, but not as often as Gigabyte. I still have some Strix cards without a single fan failure; I also have plenty where one fan has failed over a two-year ownership. I’ve also had I believe 2 that have had three fans fail… However, I’ve replaced all my Gigabyte fans at least once already, and some cards are going on two times. I order replacement fans off of Alibaba rather than sending them in for RMA now. It’s so common there is a market for their fans. Asus fans are much more expensive by comparison if you’re purchasing them.

        None of my EVGA cards have had fan failures (I own a lot and they’re double ball bearing). Also only had two MSI cards with fan failures.

          • Freon
          • 1 year ago

          The fans on the Strix cards probably turn off for normal desktop duty. Same seems true of most Maxwell or newer non-blower fan models. That probably brings down the power-on duty cycle of the fan to a 20-25% level at most, not 100% like a mining rig.

            • Bensam123
            • 1 year ago

            Yes. And they don’t run at 100%. Depending on the manufacturer, they never run the fans at 100% (some do, most don’t); you end up at throttle temps before the fan hits 100%. Sometimes they miscalibrate their settings and the fan is essentially running at 100% when Afterburner will show 88%, and sometimes they artificially lower the max ceiling based on what they want you to run at to reduce the failure rate.

            But if you’re trying to make a case for me misusing fans by mining with them, that’s a tough sell, there are plenty of case fans that run at 100% their entire life and they’re fine, especially double ball bearing fans.

            If I wasn’t a miner I’m pretty sure you guys would have a completely different take on this. Sleeves have been known to be horrible for fans in general.

            Also haven’t ever seen a blower turn off, although I’m sure there are some that do.

      • Krogoth
      • 1 year ago

      Reference coolers have always been on the cheap side. It goes back as far as when HSFs started becoming a requirement for performance GPUs.

        • Bensam123
        • 1 year ago

        Not talking about reference design? I believe the Nvidia blower coolers have almost 100% reliability on their fans, however I don’t own any and that’s just what I heard (as no one talks about them failing).

    • DPete27
    • 1 year ago

    [quote<]The company sasy the "wing-blade" fans[/quote<]

    • DancinJack
    • 1 year ago

    I REALLY don’t get all the comments over the past few weeks about the 2X8X and 2X7X series being “average gamer” or “budget gamer” cards. Where have y’all been living the past decade?

      • DoomGuy64
      • 1 year ago

      Average gamer on this site, and most of them don’t care about value or respect budget gamers in the slightest.

      • NovusBogus
      • 1 year ago

      Seems reasonable, but I’ll think about other possible terms on my vacation next week. Alas, I must confess that I’m merely a budget traveler and will be slumming it; I need to upgrade my Gulfstream V to something better. Maybe someday I’ll be able to join the real aviation enthusiast ranks with an A380 or 777 but until then I gotta make some sacrifices.

        • jihadjoe
        • 1 year ago

        A Beluga with the interior converted into a flying mansion sounds like an awesome idea to pitch to a sheikh or prince somewhere.

    • ptsant
    • 1 year ago

    Expect 1080 prices and 1080 performance, I would guess.

      • NarwhaleAu
      • 1 year ago

      No, it will be between the 1080 and 1080 Ti and cost about $550 to $600 I’m betting.

        • ptsant
        • 1 year ago

        There is less “price premium” to be had in this segment. I would guess $500-550 (the 1080 starts at $450), but this of course would depend on the actual performance delta and the perceived value of the ray tracing stuff/DLSS.

      • travbrad
      • 1 year ago

      I hope so. I think I might have a heart attack if price/performance in the graphics market was to actually improve.

      • jihadjoe
      • 1 year ago

      This could actually be the ‘reasonable’ consumer card among Nvidia’s RTX Lineup.

      The shader count is about 80% of a 2080. Considering a 1080ti has 90% the performance of a 2080, the 2070 should be 90+% of a 1080ti’s performance while costing $150-$200 less.

      Edit: [url=https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Founders_Edition/33.html<]Techpowerup's Performance Summary for the 2080[/url<] makes a good starting point for this, as it lists the performance of the 1080 and 1080 Ti relative to the 2080. All we have to do is peg the 2070 at 80% of a 2080 (that's the approximate shader count, and the memory bus isn't crippled) and we can see how the chips fall:

      1080 = 69%
      *2070 = 80%
      1080 Ti = 92%
      2080 = 100%

      If we take the 1080 as a base 100%, then the 2070 should be around 115%.
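
      For anyone who wants to play with these numbers, here's the same back-of-the-envelope estimate as a tiny Python sketch. The TPU relative-performance figures are the ones quoted above; the 80% entry for the 2070 is purely an assumption based on its shader count, not a measured result.

      ```python
      # Rough relative-performance estimate for the RTX 2070, based on TechPowerUp's
      # 2080 performance summary quoted above. The 2070 entry is an assumption
      # (~80% of the 2080's shaders, uncut memory bus), not a benchmark result.
      relative_to_2080 = {
          "GTX 1080": 0.69,
          "GTX 1080 Ti": 0.92,
          "RTX 2080": 1.00,
          "RTX 2070 (est.)": 0.80,  # assumed from shader count
      }

      baseline = relative_to_2080["GTX 1080"]
      for card, perf in relative_to_2080.items():
          # Re-express everything with the GTX 1080 as the 100% baseline.
          print(f"{card}: {perf / baseline:.0%} of a GTX 1080")
      ```

      That pegs the hypothetical 2070 at roughly 115-116% of a GTX 1080, in line with the estimate above.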

      • Krogoth
      • 1 year ago

      2070 should be at least 15% faster than the 1080, and more so if you throw in HDR and content that properly harnesses async shading. Just look at current 2080 figures and multiply them by ~0.80 and that’s where the 2070 will end up residing.

      It’ll probably have better energy efficiency too but it’ll be by a small margin.

        • ptsant
        • 1 year ago

        I was thinking more like 5-10%, without talking about DLSS and ray stuff, but unlike you I didn’t bother to do a linear approximation based on shader count.

          • Krogoth
          • 1 year ago

          The 2070 silicon is literally ~78% of the 2080 silicon, and Nvidia didn’t cripple the memory bus on the 2070.

          On paper, assuming that clockspeeds are equal, the 2070 should be about 78% of a 2080. Seeing how the 2080 has a generous lead over the 1080, it isn’t too far-fetched to see the 2070 besting the 1080 by a smaller margin.

          Linear approximations are quite applicable when comparing the same architecture especially when they are superscalar in nature.

            • ptsant
            • 1 year ago

            The 2070’s boost clock is 95% of the 2080’s (1710 vs. 1800 MHz). So, 0.78*0.95 = 0.74 of the performance. Taking Shadow of the Tomb Raider as an example, the 55 fps of the 2080 would translate to 40.8 fps on the 2070, or 110.1% of the performance of the 1080, which scored 37 fps.

            In Project Cars, the 2070 would be slightly slower than the 1080, if I’m not mistaken.

            Further, using TR’s own average FPS estimates from the FPS/$ graph, the 2080 is at 59 vs. the 1080 at 41. 59*0.741 = 43.719, which gives 106.6% of the average perf.
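
            Here's the same projection as a small Python sketch, in case anyone wants to rerun it with different inputs. The 0.78 shader ratio, the 1710/1800 MHz boost clocks, and the fps figures above are the inputs; the linear scaling is, of course, just an assumption.

            ```python
            # Back-of-the-envelope RTX 2070 projection from the figures in this thread.
            # Assumes performance scales linearly with shader count and rated boost
            # clock, which real cards won't follow exactly.
            shader_ratio = 0.78
            clock_ratio = 1710 / 1800            # ~0.95
            scale = shader_ratio * clock_ratio   # ~0.74 of an RTX 2080

            # (RTX 2080 fps, GTX 1080 fps) pairs quoted above.
            samples = {
                "Shadow of the Tomb Raider": (55, 37),
                "TR average-FPS estimate": (59, 41),
            }

            for name, (fps_2080, fps_1080) in samples.items():
                fps_2070 = fps_2080 * scale
                print(f"{name}: projected 2070 at {fps_2070:.1f} fps, "
                      f"{fps_2070 / fps_1080:.1%} of a GTX 1080")
            ```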

            So, this comes quite close to my 5-10% estimate and I believe it also makes marketing sense to have a product that can be considered “slightly better” than the 1080 while not threatening the 2080.

            As I said before, I expect price to be +$50 or so compared to the 1080. $500 would be OK, $550 would be a bit expensive I think.

            • jihadjoe
            • 1 year ago

            Those specified boost clocks don’t really matter because GPU boost will almost certainly take both cards beyond those levels, often to the advantage of the card with fewer shaders because it has less heat to deal with.

            For example, the GTX1070 has a specified boost of 1683MHz vs the GTX1080’s 1733MHz, but TPU observed median boost clocks of [url=https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/28.html<]1822MHz[/url<] and [url=https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/31.html<]1797MHz[/url<] respectively.

            • ptsant
            • 1 year ago

            I understand, but these are unknowns for the moment.

    • gecko575
    • 1 year ago

    “sensibly priced RTX 2070”

    As a budget gamer, when is the GT2030 going to be released again? 😛

      • Pville_Piper
      • 1 year ago

      I thought the xx70 series was the budget gamer card…

      *cries in his beer*

      • Leader952
      • 1 year ago

      Probably never as the 20xx family will be RTX all the way.

      The RTX 2060 will be next in line, and I expect the line will end at the RTX 2050. There will be a few other Ti’s thrown in.

      As integrated graphics become more powerful, the low end of the discrete line disappears. This has already happened to the xx20’s and xx10’s. I expect that the xx30 and xx40 are next in line.

        • DavidC1
        • 1 year ago

        RTX all the way?

        I disagree. Rumors already exist that it’s GTX for the 2060 and below. RT and tensor cores take up a significant portion of the die. They’ll have to cut them on the low end.

        The performance on Ray Tracing will be so bad, it’ll be like iGPUs running Ultra settings on latest DX standards.

          • Leader952
          • 1 year ago

          You should realize that the lower models will be targeted at vastly reduced resolution so the hit will not be as you make it out to be.

            • DavidC1
            • 1 year ago

            The 2080 Ti can do 60 fps at 1080p with ray tracing enabled when it’s optimized, according to the BF V developer.

            So how low do you want to go? 720p? If you want to cut the die size in half you have to cut everything in half. Except in lower cards, you have to cut slightly more because there are fixed function units that can’t be cut.

            • stefem
            • 1 year ago

            Screen space reflections are rendered at half or a quarter resolution in practically all games, Battlefield V included, and I didn’t hear a single person lamenting that. Now people expect to raytrace in real time at 4K….

            Developers had access to RTX hardware just two weeks before Gamescom, BF V DX12 was quite slower than in DX11 even without raytracing and even on AMD hardware.

      • derFunkenstein
      • 1 year ago

      in fairness he said MORE sensibly priced.

      • albundy
      • 1 year ago

      probably never. nvidia’s probable intentions are to destroy the gaming community with ultra high pricing.

        • K-L-Waster
        • 1 year ago

        “Hey, we’re making billions of dollars on gaming cards. Let’s kill that market so we can remove our primary source of revenue!”

        Something like that?

    • Krogoth
    • 1 year ago

    RIP Vega 64 and 1080

      • Longsdivision
      • 1 year ago

      RIP people’s wallet as well

      • Billstevens
      • 1 year ago

      If the 1080 draws less power I would still consider it over a 2070. The new cards are amazingly fast but their power draw and price are stupid.

        • Krogoth
        • 1 year ago

        Their power draw is only higher if they are using RTX mode and tensor cores.

        If they aren’t using them, then they’re actually more energy efficient than Pascal, even if they drink a few more watts at load.

        • Srsly_Bro
        • 1 year ago

        The idle power draw of the 2080 Ti with two monitors is worse than AMD’s RX series. Meanwhile, my 1080 Ti uses 11 W according to TPU’s 2080 Ti review.

      • albundy
      • 1 year ago

      the 1080, yes. Vega 64 is a zombie. it has no idea where it is going.

    • chuckula
    • 1 year ago

    Hey Bruno, do a search-n-replace for 1070 with 2070 in a few spots of that article.

      • morphine
      • 1 year ago

      Derp, thanks. Been a rough week.
