Take a sneak peek at our GeForce GTX 1070 Ti results

As your inbox or Twitter feed will doubtless inform you, the review embargo for Nvidia's GeForce GTX 1070 Ti lifted this morning. If you're just finding out by reading this post, it's because I'm still getting our article where I want it, and I think you'll find the wait to be worth it.

In the meantime, here's the summation of our results for your perusal:

As our scatters suggest, the race between Nvidia and AMD in the high-end graphics card market has tightened considerably since we last checked in. I'll hopefully have a fuller explanation of just how we got to these conclusions soon. Thanks for your patience.

Comments closed
    • DPete27
    • 2 years ago

    Get ready for more comments from people complaining about:
    1) Please test the game I’ll be playing for the next month at the resolution I use cuz I don’t care how cards compare to each other, only what the EXACT framerate I can expect in ____ game if I buy a GPU.
    2) Please adjust GPU prices for current market pricing and keep them up to date in perpetuity.

      • Voldenuit
      • 2 years ago

      Re: point #1, one thing which concerns me is that very few hardware sites test multiplayer games, because it’s practically impossible to control for variables like player count, player behavior, server load, version control, internet connection, etc.

      However, if you look at the most-played games on PC, PUBG, Overwatch, Destiny 2, Battlefield 1, and GTA:O would probably all be in the top 10, at the least.

      How do you test and report on something you can’t test? To paraphrase Cayde, “I don’t have time to explain what I don’t have time to understand.”

      • mudcore
      • 2 years ago

      1) I think people just wish sites like TR would review graphics cards using games real humans are actually playing. Obviously this is hard to do since real humans are not playing dated single-player games, but alas I fail to see how that invalidates the desires of readers. Heck, I’d argue hardware review sites should do MUCH more subjective testing for these readers, but eh… that’s just another one of my “I think even the best hardware websites don’t serve most readers well” thoughts, and a much bigger discussion.

      2) Is this really that much to ask? Do none of the price-tracking websites offer an API a site could use to create dynamic graphs? I say it definitely can be done, and it is being done. That’s why lots of sites lose traffic to places like GPUBoss *barf*, and this goes back to number 1 IMO.

        • Voldenuit
        • 2 years ago

        Perhaps I would have phrased it more diplomatically, but mudcore has a point.

        Taking a look at TR’s suite of games in their most recent published review (Vega), we see the staples of DOOM, Hitman, ROTTR, Witcher 3, DEMD and GTA:V that everybody tests.

        If we look at the number of current/historical peak players for those games on steamcharts, they work out to:

        DOOM (2,199 / 31,623)
        Hitman (741 / 8,138)
        ROTTR (2,973 / 44,541)
        Witcher 3 (21,664 / 92,268)
        Deus Ex: Mankind Divided (1,281 / 52,051)
        GTA V (72k / 360k)

        Meanwhile, let’s take a look at some (currently) popular games on Steam:
        PUBG (2.31M / 2.39M)
        Warframe (77k / 121k)
        ARK (45.9k / 100.7k)

        I've left off things like MOBAs, CS:GO, and TF2 from the comparison since they are not GPU-limited by any stretch.

        Outside of Witcher 3 and GTA V (and I can practically guarantee 99% of current GTA V players are on GTA:O), no one really plays the games that are being tested in the customary test suites that TR and other review sites use. So we have a giant hole in the data that may be of relevance to a large portion of the playerbase.

        But it also begs the question: how the #$$% do you benchmark an online game properly?

    • Canuckistani
    • 2 years ago

    It looks like pricing for Vega in Canada is still ridiculous.

    At both Amazon.ca and NCIX, Vega 56 cards are priced above the GTX 1080 and sometimes even above the 1080 Ti. The Nvidia cards are generally priced about where I’d expect given the current exchange rate.

    If the 1070 Ti is priced properly, it might fill a market hole here.

    • ptsant
    • 2 years ago

    Funny how everything lines up on a straight line in perf/$. This means that the two companies are close in value for the money, but, most interestingly, it also means that you don’t pay much of a premium for the high-end parts. Good stuff.

    • MrJP
    • 2 years ago

    Any news from anyone on Vega 56 with custom coolers? It’s looking more attractive on the price/performance curve, but the lack of quieter cooling options is a mark against it relative to the competition.

    • davidbowser
    • 2 years ago

    Competition is good for us all.

    • Pville_Piper
    • 2 years ago

    Tracks about where I thought it would…

    Considering that it is the same price as the average GTX 1070 in real-world pricing, it is very much worth buying… if I were buying.

    Since I’m going to upgrade around March I won’t be at this time. Who knows, maybe the mythical Volta will be out by then… Not getting my hopes up too much though. And maybe I’ll even be able to afford it!

    • Demetri
    • 2 years ago

    Vega 56 is looking pretty good in this chart. Finally available for MSRP too:

    [url<]https://www.newegg.com/Product/Product.aspx?Item=N82E16814202302[/url<]

      • PrincipalSkinner
      • 2 years ago

      At last.

      • mudcore
      • 2 years ago

      Curious how long this stays. Crypto mining on the Vega 56 took a pretty considerable profit leap in the last few days. The rise may not (probably won’t) last long but for now I’d think there will be a rush on them for a bit again.

    • kuraegomon
    • 2 years ago

    <Compares with Vega launch review graph> Oooooh. Well then. So, that suggests some impressive driver-improvement work by AMD to raise the performance profile in the tested games?

    (And, of course, manic buying by the crypto crowd to raise the price profile :-D)

    • chuckula
    • 2 years ago

    “As our scatters suggest, the race between Nvidia and AMD in the high-end graphics card market has tightened considerably since we last checked in.”

    Actually, I note that since this scatter graph includes more realistic prices for Nvidia cards than were present in the original review, it puts the price/performance ratio of the Nvidia cards in a more positive light.

    Additionally, I checked Amazon, and while it’s possible to get an RX Vega 56 at $400 and an RX Vega 64 at $500, those aren’t average prices but temporary sale prices on one available card model.

    Here’s the original scatter plot, where the Vega 64 was given a substantially lower price than the GTX 1080:
    https://techreport.com/r.x/2017_08_13_AMD_s_Radeon_RX_Vega_64_and_RX_Vega_56_graphics_cards_reviewed/99thValue.png

    [Edit: Using rough numerical estimates, here are the 99th-percentile FPS/$ numbers.

    Original RX Vega review:
    Vega 64: 55 FPS / $500 = 0.11 FPS/$
    GTX 1080: 62 FPS / $555 = 0.11 FPS/$

    New article:
    Vega 64: 67 FPS / $500 = 0.134 FPS/$
    GTX 1080: 68 FPS / $500 = 0.136 FPS/$]
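
    A quick sketch in Python, using only the rough FPS and MSRP figures quoted above (eyeballed estimates, not official numbers), that reproduces those FPS-per-dollar values:

    # Rough 99th-percentile FPS and list prices, taken from the estimates above.
    cards = {
        "Vega 64 (original review)": (55, 500),
        "GTX 1080 (original review)": (62, 555),
        "Vega 64 (new scatter)": (67, 500),
        "GTX 1080 (new scatter)": (68, 500),
    }

    for name, (fps, price) in cards.items():
        print(f"{name}: {fps / price:.3f} FPS/$")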

      • Jeff Kampman
      • 2 years ago

      I used retail pricing as much as I could in the RX Vega review. People complained (much like you are), so I’m using MSRP across the board in order to produce a common frame of reference and to remove the effects of real-world pricing fluctuations from the picture.

        • DragonDaddyBear
        • 2 years ago

        I know it’s more work, but could you do both? If nothing else, the actual selling price plot could serve as a historical reference.

          • Jeff Kampman
          • 2 years ago

          If somebody wants to write a fully dynamic graphing deal that pulls in all of this data in real time so that it’s never out of date, I’m all ears. Otherwise it’s building on shifting sands.

            • K-L-Waster
            • 2 years ago

            The Quantum Review Price Uncertainty Principle — no matter which price you quote, someone will claim it’s wrong and unfair.

            • CampinCarl
            • 2 years ago

            I think one of my coworkers wrote some Python to pull data from the fantasy TSP stuff; I’ll see if I can get it from him and work on modifying it to pull via something like PCPartPicker? Not sure if they have an API.

            Additionally, you might be able to talk to PCPartPicker or someone else (Newegg?) and work out a sponsorship deal and maybe they could provide the code/data for you?

            Just some thoughts.

            @Bruno Not sure if the web page would support some Python?
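
            In the meantime, here’s a minimal sketch of what such a chart generator might look like, assuming the prices get pulled into a simple CSV first (the file name and column names are placeholders, not a real feed):

            # Sketch: build a price/performance scatter from a CSV of current prices.
            # Assumes a hypothetical file "gpu_prices.csv" with columns:
            # card, price_usd, fps_99th
            import csv
            import matplotlib.pyplot as plt

            cards, prices, fps = [], [], []
            with open("gpu_prices.csv", newline="") as f:
                for row in csv.DictReader(f):
                    cards.append(row["card"])
                    prices.append(float(row["price_usd"]))
                    fps.append(float(row["fps_99th"]))

            fig, ax = plt.subplots()
            ax.scatter(prices, fps)
            for name, x, y in zip(cards, prices, fps):
                ax.annotate(name, (x, y), textcoords="offset points", xytext=(5, 5))
            ax.set_xlabel("Current price (USD)")
            ax.set_ylabel("99th-percentile FPS")
            fig.savefig("price_performance_scatter.png", dpi=150)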

          • DPete27
          • 2 years ago

          Ooh, could you make scatter plots auto-update each week from a price sniffer?

            • Voldenuit
            • 2 years ago

            Maybe use third-party data? Doesn’t PCPartPicker track prices for many components? Even if they don’t expose their price database to third parties, it should be possible to create a script that visually analyses their price graphs and finds the x,y coordinates of colored regions. Whether it is legally or ethically defensible to do so without their explicit say-so is not my domain.
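
            For what it’s worth, a rough sketch of that idea (the file name, series color, pixel bounds, and axis ranges are all made up for illustration, and the legal/ethical caveat above still applies):

            # Sketch: recover approximate data points from a chart image by finding
            # pixels that match a known series color. All constants are hypothetical.
            import numpy as np
            from PIL import Image

            img = np.asarray(Image.open("price_chart.png").convert("RGB"))
            target = np.array([237, 28, 36])   # assumed RGB color of the price series
            mask = np.abs(img.astype(int) - target).sum(axis=2) < 30

            ys, xs = np.nonzero(mask)          # pixel coordinates of matching pixels

            # Map pixel coordinates to data coordinates using the plot area's corners
            # (the pixel bounds and data ranges below are assumptions about the chart).
            x0_px, x1_px, y0_px, y1_px = 50, 950, 550, 50
            x_min, x_max, y_min, y_max = 0, 365, 200, 800   # days on x, dollars on y

            days = x_min + (xs - x0_px) / (x1_px - x0_px) * (x_max - x_min)
            dollars = y_min + (ys - y0_px) / (y1_px - y0_px) * (y_max - y_min)
            print(list(zip(days.round(0), dollars.round(0)))[:10])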

        • Kretschmer
        • 2 years ago

        Could you label and caveat the X Axis as MSRP, then?

      • kuraegomon
      • 2 years ago

      I think that most people have a good sense of what the real-world pricing situation is.

      The _really_ interesting part – that I’m going to call you out for skimming over – is how much the Vega cards’ position on the performance axis has shifted. I.e. the RX Vega 64 went from significantly lagging the GTX 1080 to sniffing right at its heels – and the Vega 56 went from par with the 1070 to opening a small-but-significant can of whoopass on it.

      On one _very_ summarized scatter plot, anyway. I definitely want to see the detailed results – mission accomplished, Mr. Kampman.

        • chuckula
        • 2 years ago

        The Vega 64 has all of the hardware (and more) of the GTX 1080 _Ti_, so call me impressed when AMD’s “miracle” drivers get the Vega 64 clearly outperforming its older competitor with similar silicon size and actually lower compute power.

        There’s nothing miraculous about AMD getting a comparatively small boost in performance from finally getting the drivers in shape a few months after launch. And it really is a comparatively small performance boost in the final analysis; the boost in games that the i9-7900X got in TR’s latest review compared to its at-launch performance was probably just as good, and not one single person claimed that Intel had performed some “miracle” with the 7900X.*

        * I’m dead serious and correct in that statement, BTW. TR’s 1950X review put the 99th percentile of the 7900X at about 75 FPS, behind the 1950X:
        https://techreport.com/r.x/2017_08_17_AMD_s_Ryzen_Threadripper_1920X_and_Ryzen_Threadripper_1950X_CPUs_revi/99th-value.png

        Then fast forward to the newer reviews and the 7900X is up over 90 FPS:
        https://techreport.com/r.x/2017_09_27_Intel_s_Core_i9_7980XE_and_Core_i9_7960X_CPUs_reviewed/value-99thfps.png

        So that’s an improvement of more than 15 FPS at the 99th percentile on a CPU that doesn’t even have drivers in the same sense as a GPU. What are we gushing over here?

        The original Vega 64 is at about 55 FPS:
        https://techreport.com/r.x/2017_08_13_AMD_s_Radeon_RX_Vega_64_and_RX_Vega_56_graphics_cards_reviewed/99thValue.png

        And now this article is putting it maybe 10 or 11 FPS higher.

          • Jeff Kampman
          • 2 years ago

          You’re comparing 2560×1440 and 1920×1080 numbers there.

            • chuckula
            • 2 years ago

            That’s not my underlying point and of course I’m not making some ridiculous statement that a CPU is somehow better than a GPU.

            My point is that _nobody_ was jumping up and down congratulating Intel for working so hard to improve the performance of the 7900X in games after it launched. _Nobody._

            Here we see some very expected, nothing-particularly-miraculous driver optimizations being done for Vega, and it’s treated like a huge deal.

            Once again, wake me up when the Vega 64, with arguably superior hardware to the GTX 1080 Ti, starts blowing the 1080 Ti away in every dimension.

            • Jeff Kampman
            • 2 years ago

            You misunderstand. I benched the Threadrippers and friends at 2560×1440 and the Core i9s etc. at 1920×1080. There is no “improvement,” it’s down to resolution.

            • chuckula
            • 2 years ago

            OK, in that case I’ll concede the point about the 7900X. I was confused since the 1950X showed only a tiny variation in performance between the two graphs so I figured the tests were being run at the same resolution.

            • Mr Bill
            • 2 years ago

            Perhaps these percentile-vs-value graphs should not mix resolutions and should be labeled with the resolution represented. Or the tested resolution could be represented by color.

            • Jeff Kampman
            • 2 years ago

            Or they should only be taken within the context of the review that they’re produced for?

            • Beahmont
            • 2 years ago

            But that prevents assessment over time and significantly reduces the value of the performance metrics by making the review valid for only a short window. Things like driver updates and program optimizations can easily change performance and value over time.

            • Waco
            • 2 years ago

            Which is why you generally can’t compare between two time-separated reviews anyway.

            • Voldenuit
            • 2 years ago

            How about three separate graphs, at 1080p, 1440p and 2160p?

            If I’m looking to power a 4K display, I don’t really care if, hypothetically, a 1060 might give me the highest fps/$, if it’s only doing, say, 20 fps (please be kind, I completely made up this number).

            Similarly, if I only need to game at 1080p, then it’s useful to know which cards are pointless overkill.

            • tsk
            • 2 years ago

            Intel is also good buddy *pats on the back*

          • kuraegomon
          • 2 years ago

          Re: those AMD driver improvements: the graph appears to indicate a 20% improvement _across the board_. This isn’t something to take for granted. Do they represent a full extraction of the Vega architecture’s potential? Who knows?

          Obviously, all of its raw compute hasn’t translated into anything resembling 1080 Ti performance. I never claimed that it had. “Miracle” is your word choice, not mine, i.e. your usual bullcrap habit of misquoting people. I’ll ask you not to try it on me, because you will get called out _every_ time. “Gushing” is definitely a mischaracterization of my tone. More Chuckula-playbook-page-1 stuff. Again, save it.

          What I called you out on is this: the clear message of the graph is that TR’s testing shows the performance comparison between Vega and Pascal has changed significantly since Vega’s launch. That’s the central point of Jeff’s post. For your first comment to make ZERO acknowledgement of that shift, even in your usual inimitable damn-with-faint-praise style, is disingenuous at best.

          Damnit man, for every time you impress me with a properly-thought-out and well-informed comment, you manage at least one (and sadly, probably more) comment that’s clearly beneath you. You’re too smart for the intellectual dishonesty and pointless combativeness that you so often display here. Why not do better?

            • Voldenuit
            • 2 years ago

            Do both graphs use the same suite of games? What resolution(s) were used to compile the meta-score?

            If this is a straight +20%, then, really, “wow”.

            • kuraegomon
            • 2 years ago

            I’d expect that Jeff used the same suite of games. If the comparison between the two graphs (Vega launch vs. sneak peek) wasn’t strictly apples-to-apples, I think he’d have mentioned something to that effect.

            • chuckula
            • 2 years ago

            “Re: those AMD driver improvements: the graph appears to indicate a 20% improvement across the board.”

            Yeah, and they also represent about a 10% improvement for a 16-month-old Nvidia GPU architecture over the equivalent time period. What’s more “miraculous”: a 20% improvement over disappointing initial results, or a 10% improvement in an already refined product? [Trick question: the answer is NONE OF THE ABOVE.]

            It just goes to show that it’s easy to produce ooh-ah relative performance improvements when you can control both sides of the scale.

            Once again, I’ll call it an “ooh-ah miracle” when the Vega 64 starts behaving like an older Nvidia product with weaker hardware and lower power consumption… and of course by that I mean the GTX 1080 _Ti_.

            • Voldenuit
            • 2 years ago

            Dunno, they both sound pretty impressive to me, but then I’m easily impressed.

            Also, the improvement may not be purely from drivers; it could also come from game devs refining their games over time, etc.

      • maxxcool
      • 2 years ago

      I’d LOVE to see the scatter plot pull the top 5 vendors from Newegg and Amazon and average their prices for use in the scatter chart.

      No freaking way am I scraping the barrel for a “Sapphire” or “Zotac” card on a premium purchase just to save money.

        • derFunkenstein
        • 2 years ago

        Sapphire is generally regarded as one of the more reliable AMD board partners. Besides, if you’re doing “top 5” vendors for AMD you basically have to include them:

        ASUS
        Gigabyte
        MSI
        Sapphire
        HIS
        XFX
        PowerColor

        Out of those 7 I’d dump HIS and PowerColor so what you’re left with is the first four and XFX.

          • maxxcool
          • 2 years ago

          oh godz .. power color … totally forgot they existed.

      • derFunkenstein
      • 2 years ago

      PowerColor and Sapphire on Newegg, and it doesn’t say anything about sale prices:
      https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20600473877%20601302833

      As I noted elsewhere, I’d skip PowerColor. Also a shame that AMD’s stock blower cooler… err… blows.
