In the lab: Gigabyte’s GeForce RTX 2070 Gaming OC 8G

Nvidia's GeForce RTX 2070 is here, and a wave of partner cards has broken with it. We sadly didn't get an RTX 2070 to review ahead of the card's launch, but Gigabyte has kindly shipped over one of its GeForce RTX 2070 Gaming OC 8G cards for us to play with. I've just pulled the Gaming OC 8G out of the box, and it looks like a fine take on Nvidia's most accessible Turing card so far.

Unlike its RTX 2080 sibling in the background, the RTX 2070 Gaming OC 8G fits well within a two-slot footprint. That doesn't mean Gigabyte has skimped on cooling, though. Several beefy copper heat pipes run through a full-length fin stack and make direct contact with the TU106 GPU underneath, and three counter-rotating fans provide plenty of airflow potential for moving heat away from all that metal. Gigabyte clocks this card at 1725 MHz in its "Gaming Mode" and 1740 MHz in its "OC Mode," although real-world clock speeds will likely be higher in use. For reference, Nvidia clocks its RTX 2070 Founders Edition at a 1710-MHz boost speed. An RGB LED-illuminated Gigabyte logo and a full-length backplate complete the package.

We're up to our eyeballs in hardware at the moment in the TR labs, but we'll be putting the RTX 2070 Gaming OC 8G to the test along with a wide range of other graphics cards as soon as we can. In the meantime, you can pick up one of these babies for $549.99 right now at Newegg if you're already convinced of the RTX 2070's performance chops.

Comments closed
    • Chrispy_
    • 1 year ago

    I really wish we had some real-world RTX applications to bench these with 🙁

    All we have so far is:

    1)
    An interview with DICE saying that they’ve had to significantly lower the amount and quality of raytracing in order to maintain performance (GPU model not disclosed)

    2)
    Remedy saying that their Northlight demo runs 60% slower using RTX effects on a 2080 Ti!! What does that mean for the RTX 2070, with only 55% of the raytracing performance? I wasn’t even that impressed by the Northlight raytracing; it’s visibly noisy, and you can see that Remedy are already using every trick in the book to keep the framerate acceptable (heavy filtering, smeary trails on lighting effects behind moving objects because they’re using tensor cores to do guesswork and interpolation instead of actually rendering occluded geometry).

    The fact that Nvidia is selling these cards on a completely unproven RTX track record is fishy.

      • Zeratul
      • 1 year ago

      Fishy how? There has to be a first for any new technology or pipeline. There was a “first” card to support pixel shaders too.

      Does that mean the kinks are worked out, that it’ll catch on, or that these cards will perform at the level they need to with this technology? No, not really, but no one programs a game for tech that doesn’t exist yet, and it has to start somewhere.

        • Chrispy_
        • 1 year ago

        Fishy in that Nvidia showcased the product deceptively.

        They (Nvidia) are refusing to talk about RTX feature performance, and all we’re hearing from developers is that they’re having to dial back their RTX features to run acceptably on a single RTX 2080 Ti at the low resolution of 1080p, with Remedy going as far as to quote explicit frame-time costs for its RTX features, and they’re high: 9.2 ms per frame. That means the game would need to be running at roughly 133 fps without RTX to hit 1080p60 with RTX effects turned on.
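
        To sanity-check that frame-time arithmetic, here’s a quick sketch (the 9.2 ms cost and the 60 fps target come from the comment above; the rest is simple math):

        ```python
        # Frame-time budget for 60 fps, in milliseconds
        target_ms = 1000 / 60       # ~16.7 ms
        rtx_cost_ms = 9.2           # Remedy's quoted per-frame cost of the RTX effects

        # Time left for all other rendering once the RTX work is paid for
        remaining_ms = target_ms - rtx_cost_ms    # ~7.5 ms

        # Framerate the game would need to hit *without* RTX to leave that headroom
        required_fps = 1000 / remaining_ms
        print(f"{required_fps:.0f} fps needed without RTX for 1080p60")   # ~134 fps
        ```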

        It’s obvious that the “realtime” RTX demos Nvidia showcased for the launch event were actually running on two or more RTX 2080 Ti cards, because the people who made those demos are themselves saying that a single RTX 2080 Ti can’t achieve that performance. The RTX 2070 isn’t exactly cheap, but it has barely a quarter of the raytracing power of the dual RTX 2080 Ti SLI setup used to sell the new cards.
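
        For what it’s worth, the “barely a quarter” figure follows directly from the ~55% estimate earlier in the thread; both numbers are the commenter’s rough guesses, not official specs:

        ```python
        # Commenter's estimate: the RTX 2070 has ~55% of one 2080 Ti's raytracing performance
        single_2080ti_share = 0.55

        # Relative to two 2080 Tis in SLI, that works out to "barely a quarter"
        dual_2080ti_share = single_2080ti_share / 2
        print(f"{dual_2080ti_share:.0%}")   # ~28%
        ```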

        • Freon
        • 1 year ago

        Nvidia complains about Moore’s law, but then adds a shit ton of transistors for RT that are likely not useful to consumers at all. Oddly, standard raster still seems to improve in efficiency (FPS/flop, FPS/GB/s), but not in performance per cost, because you’re paying for all the dead-weight transistors.

    • gerryg
    • 1 year ago

    Sweet! I’ll just open my wallet up and … er… hmm, I don’t seem to have that much $$$ on hand. Guess it’s time to sell furniture.

    • Captain Ned
    • 1 year ago

    Please remember that not all of us look solely to 4K/120+ FPS for definitions of “goodness”, n’est-ce pas?

      • Jeff Kampman
      • 1 year ago

      I’ll be focusing on 2560×1440 results for this article, so rest easy.

        • derFunkenstein
        • 1 year ago

        Good choice, glad to hear it. 🙂

      • DPete27
      • 1 year ago

      Ay, but it seems the RTX 2070 will do >70 fps at 1440p on Ultra settings in most games. Since hardly anybody tests at anything besides Ultra, I’d rather have 1440p and 4K numbers at HIGH settings.
      I swear sometimes that review sites are pushed to test at Ultra so that readers keep thinking they need the latest-gen, most expensive hardware just to play Battlefield at 1080p.

        • Voldenuit
        • 1 year ago

        Yeah, benchmarks at multiple resolutions would be nice, although FCAT/OCAT testing is a lot more involved and intensive than running a game and grabbing the Fraps score.

          • derFunkenstein
          • 1 year ago

          Because of the time sink, I have a feeling reviewers run max settings to make sure they’re pushing the hardware. How much time is wasted if you run a game on high, it turns out to be >100 fps, and now you have to do ultra too?

          That doesn’t change the fact that reviews are very frequently at odds with how people actually use the cards.

            • DPete27
            • 1 year ago

            It doesn’t take much time with a FRAPS overlay to see if a given setting is too low and producing needlessly high frame rates. I’d guess most reviewers are doing that already to dial in their desired test settings (if they’re not just defaulting to Ultra).

            • derFunkenstein
            • 1 year ago

            But do you do that on each card? Do you use the same settings? What if you get halfway through it and one card is obviously tanking? You’re the expert, so by all means, do tell. 😉

            • synthtel2
            • 1 year ago

            You test with the fastest card first to make sure you’re staying away from the CPU-bound region, and if it isn’t an issue there it won’t be with any of the others either. Worst case, this has to be done once for each vendor, and it doesn’t take long. If a card or two fall off the bottom of the graph, it isn’t a big deal; the config may not be playable, but unless one runs out of VRAM it’ll still be representative of relative performance.

            • DPete27
            • 1 year ago

            Well in this case, there are at least a dozen week-old RTX 2070 reviews to start from (https://techreport.com/news/34189/chocolate-cupcake-day-shortbread), so (spoiler alert) we know it sits between the GTX 1080 and GTX 1080 Ti. I’m suggesting these things (testing at more typical video settings) because it’s about the easiest thing that can be done at this point to add something meaningful to the data set. Benchmarking the same games at the same resolutions and video settings that everyone else has already published isn’t worth anyone’s time (unless you uncover something unique).

            I’d like to see TR produce more "unique findings" (because I know they’re capable of it), drill into them, and make them stand out. Especially when it’s a "late" publication. Sure, you’re not going to have "Inside the Second" home runs every day, but you’ve gotta have something unique to say, or else it’s not worth saying.

            But yeah, what synthtel2 said. It’s not that hard to ballpark the performance of a card the world has never seen just based on its technical specs; Jeff has done it multiple times with good accuracy. Throw the card into its respective performance category (which varies based on how many GPUs you include in your review(s)), and yeah, you test settings on the lower-bound GPU and the upper-bound GPU to dial in your settings for a given game. The rest will fall into place. Also, having a card or two fall below the "playable minimum" (generally agreed to be 40 fps) in a review isn’t uncommon.

            • Srsly_Bro
            • 1 year ago

            We’ll get to see benchmarks at 4K. Jeff hasn’t looked at a Steam hardware survey to know that few people game at that resolution, even on here.

            Jeff only cares about the 1-percenters. I wish he’d show some love to the 99%.

            • DPete27
            • 1 year ago

            Well… According to Steam, the 99% plays DOTA 2, LoL, Warframe, WoT, CS:GO, TF2, etc., which aren’t exactly demanding games (by design).

            • Mr Bill
            • 1 year ago

            FRAPACHINO is what you get…
            "if a given setting is too low and producing needlessly high frame rates"

        • synthtel2
        • 1 year ago

        Agreed, I’d rather see 4K high than 1440p ultra all else equal. The performance gap between different resolutions is pretty easy to guess, unlike the difference between settings. High is far closer to how I’ll be playing things, and turning up the resolution can still pull framerates well away from the CPU-bound region.

          • Srsly_Bro
          • 1 year ago

          Reviews should be useful and informative to readers, not focused on a resolution few people have.

          I have a 1080 Ti FTW3. I game at 1440p with a 144 Hz G-Sync monitor. 4K resolution isn’t relevant to my needs or to the needs of most people. It’s also unlikely you game at 4K, even with a card of at least 1080 Ti caliber.

          I don’t bother reading TR reviews anymore because they’re several weeks late and use a resolution that doesn’t apply to me. I want to know approximately how well a card does at the resolution I have, not information about a resolution that matters to few.

          Reviews should be made with an audience in mind, not a small group of 4k users.

          Can the 4K panel not be reduced to 2560×1440? And I know TR generally spends the minimum amount of time on reviews, so maybe just use the resolution that matters to your audience the most.

          I swear nobody cares anymore here.

            • synthtel2
            • 1 year ago

            I too use a 1440p144 monitor and yet I would still find results at 4K high more useful and informative than results at 1440p ultra.

            Just take either the 4K framerate times 1.8 or the CPU-bound framerate, whichever is lower, and you’ve got a pretty good estimate of your 1440p high framerate. If you try to review high-end cards at 1440p high, in contrast, all you learn is that you’re CPU-bound and that’s not news.
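
            As a minimal sketch of that rule of thumb (the 1.8× factor and the min-with-CPU-bound logic are the commenter’s heuristic; the example numbers are made up):

            ```python
            def estimate_1440p_fps(fps_4k, cpu_bound_fps, scale=1.8):
                """Rough 1440p-high estimate: scale the 4K framerate, capped at the CPU-bound framerate."""
                return min(fps_4k * scale, cpu_bound_fps)

            # Hypothetical numbers, purely for illustration
            print(estimate_1440p_fps(fps_4k=55, cpu_bound_fps=140))  # 99.0 -> still GPU-bound
            print(estimate_1440p_fps(fps_4k=90, cpu_bound_fps=140))  # 140  -> hits the CPU ceiling
            ```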

          • Anovoca
          • 1 year ago

          This is exactly why they SHOULD test at ultra. There is no way for us to know the quality gap between ultra and high from one game to another. Resolution performance gaps, however, are pretty consistent from game to game. For quality test results you want as much consistency as possible, and testing at ultra sets the baseline at “the best this game can offer,” which is more tangible than the other settings.

            • synthtel2
            • 1 year ago

            Since there’s no way to know the settings gap, results from high will fail to be very useful to those who game on ultra, and vice versa. Therefore, the settings that most people will actually be using should be the target.

            High settings are just plain better for most gamers. “The best this game can offer” is convenient to always go for, but it on average demands a significantly beefier GPU for merely similar results (in both image quality and framerate), and I’d guess that most of the people using it are those who are already willing to spend on an overkill graphics card in exchange for not having to worry about any of it.

            High and ultra aren’t even the only settings. I use medium a lot, and that’s so far from ultra that I barely have reason to read most GPU reviews rather than just looking at a spec sheet. I’m sure very few here run medium as a habit, but we make GPU recommendations to others who might.

            I’m not sure exactly what you’re referring to with “consistency” here, but if one of these is inconsistent, it’s ultra. Medium-high or something like it usually has to work on consoles, which means most games are sharing a limit in that region. Ultra may be barely different from that at all, or it may be 3x as heavy like Unigine Superposition.

        • Krogoth
        • 1 year ago

        Ultra settings on most modern titles are really just the “High” settings with optimization tricks disabled (rendering stuff that you can’t see anyway), texture compression disabled (that’s why video memory consumption goes up significantly), and LOD bias disabled. That’s why there’s almost no difference in image quality between high and ultra settings.

    • thedosbox
    • 1 year ago

    Someone photoshop bongo cat on that top picture please.
