Nvidia releases its first official benchmarks for the RTX 2080

Nvidia touted some impressive theoretical performance numbers for its GeForce RTX cards during CEO Jensen Huang's keynote this Monday, but the company didn't generally talk in the universal language of gamers, flawed as it may be: average FPS. We've already tried to estimate how the Turing-powered RTX 2080 and RTX 2080 Ti would perform using a number of peak performance measures for rasterization, but even those estimates are a bit abstract.

To help the average person understand how the RTX 2080 performs, Nvidia is releasing some average-FPS metrics and performance indices of its own for the games of today and tomorrow. First, here are some average-FPS numbers for the RTX 2080 at 4K with HDR enabled. Nvidia doesn't provide any of the other settings or testing conditions it used to arrive at these numbers, but here they are nonetheless.

  • Final Fantasy XV: 60 FPS average
  • Hitman: 73 FPS average
  • Call of Duty WWII: 93 FPS average
  • Mass Effect Andromeda: 67 FPS average
  • Star Wars Battlefront II: 65 FPS average
  • Resident Evil 7: 66 FPS average
  • F1 2017: 72 FPS average
  • Destiny 2: 66 FPS average
  • Battlefield 1: 84 FPS average
  • Far Cry 5: 71 FPS average

Whether the RTX 2080 can deliver a 99th-percentile frame time of 16.7 ms (more or less the equivalent of saying it offers a consistent 60 FPS in all of those titles at 4K with HDR on) remains to be evaluated by independent reviewers.
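
For reference, frame time and frame rate are simply reciprocals of one another. Here's a minimal sketch (in Python, with made-up frame-time samples) of how that 16.7-ms threshold maps onto 60 FPS:

    # 1000 ms / 60 frames ~= 16.7 ms per frame; the 99th-percentile frame time is
    # the threshold that 99% of rendered frames come in at or under.
    def frame_time_ms(fps):
        return 1000.0 / fps

    def percentile_99(frame_times_ms):
        ordered = sorted(frame_times_ms)
        return ordered[int(0.99 * (len(ordered) - 1))]

    samples = [14.2, 15.1, 15.5, 16.0, 16.4, 17.9] * 100  # made-up frame times (ms)
    print(frame_time_ms(60))                              # ~16.7 ms budget per frame
    print(percentile_99(samples) <= frame_time_ms(60))    # False: the 17.9-ms frames miss it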

Nvidia also provides a measure of relative performance for the RTX 2080 versus the GTX 1080 in some current and future titles under two conditions: with its Deep Learning Super Sampling (DLSS) technology off, and with DLSS on. Even without DLSS enabled, the RTX 2080 is about half again as fast as the GTX 1080 in every title Nvidia tested at 4K and with HDR on.

Flip on DLSS in the titles that support it, though, and those games can apparently deliver more than twice as many frames per second, on average, as the GTX 1080. Again, we don't know what settings Nvidia used to arrive at its baseline numbers for either card beyond the 3840×2160 resolution and the fact that HDR was on, but that level of performance is certainly impressive in the titles where DLSS is or will be available.

Whether this promise of greatly improved performance in today's games will be enough to get gamers to pony up $699 or more for the RTX 2080 remains to be seen, but it at least demonstrates that the RTX 2080 has the potential to deliver performance in the ballpark of the GeForce GTX 1080 Ti at 4K with HDR enabled in today's games. That bodes well for the RTX 2080 Ti, which stands poised to deliver even more pixel-pushing power. We'll hopefully be able to subject the RTX 2080 and RTX 2080 Ti to our advanced metrics ahead of or around the cards' September 20 ship date.

Comments closed
    • DeadOfKnight
    • 1 year ago

    Obviously this 4K HDR test is a biased comparison, playing off the weakness of the 1080 more than the strength of the 2080, but overall I think it’s still evidence that we will see at least double-digit improvements for existing games. I think to really get that ~50% improvement across the board, though, you’d have to be going from a 1080 to a 2080 Ti for current, non-DLSS games.

      • VincentHanna
      • 1 year ago

      I mean biased is a loaded word.

      Any other comparison that didn’t take advantage of a full 1/3 of the 2080 because it’s a highly specialized engine designed to do light-ray rendering is also going to be biased… which is the point that Huang was trying to make when he said "you can’t compare these chips to Pascal."

      Is it a weakness that the 1080 doesn’t have however many RTX/Tensor units which specialize in doing 4k HDR? I guess.
      Is there any reason, other than sour apples, why it’s not a valid comparison?

        • DeadOfKnight
        • 1 year ago

        Memory bandwidth. The 1080 was never a great card for 4k, and HDR undermines the benefits of compression in Pascal.

          • VincentHanna
          • 1 year ago

          Same response as before.

          GDDR6, like the tensor cores and RT cores, is a new addition to the card, and a proper test will show that. There is no test that will be fair to both of these cards, given those improvements.

          That does not make this an invalid test, however.

      • Freon
      • 1 year ago

      I can’t imagine the 20xx increasing framerate as much by simply running NO AA at all as it does with DLSS. The two baselines must be with some uber-tier AA (4x SSAA?), to which they’ll push DLSS as equivalent.

      I don’t think there’s anything special otherwise. The two “baseline” numbers show about what we might expect for performance increase in normal GPU limited scenarios based on specs (i.e. by raw FLOPS give or take).

      To which, I have to say, gee, something like 2x MSAA + TXAA or 4x MSAA looks really good on a 10xx card with fairly low performance impact vs. no AA at all. No one runs costly AA anymore with all the good low cost options. If DLSS is nothing but a 4x SSAA killer I’m not sure anyone should care.

    • HisDivineOrder
    • 1 year ago

    I keep seeing the argument about 20 Series being Nvidia moving the segments up one level. So the Titan becomes the 2080 Ti, the 1080 Ti becomes the 2080 and the 1080 becomes the 2070…

    I don’t get why people don’t realize the problem with that. The 1080 was launched at high end pricing only to be replaced by the 1080 Ti at a slightly higher price. What’s going to come in now, take the 2080’s price point, and replace it with a cut down version of the Titan/2080 Ti Big Turing chip?

    Nothing. So it’s obvious this is a big change and not good for consumers. It’s a shame. You shouldn’t be paying $1k-1200 to get only a high-end video card. AMD, stop screwing around and spin off ATI so they can resume competing, not just in years when you aren’t distracted with CPUs…

    Omg, I think I’m getting old.

      • DeadOfKnight
      • 1 year ago

      I don’t think you realize how much more expensive it is to make a chip that much bigger. There’s a reason GP100 never became a card for gamers. There’s a reason they kept the market limited for Volta. These are big chips. The 2080 Ti is about the same class of GPU, but much cheaper. They aren’t raising the price of each class of GPU; they’re just now marketing a big chip to gamers.

      Edit: They did raise the price of each class of GPU with the 600 series, so if you want a point of comparison, look there first.

      • VincentHanna
      • 1 year ago

      So, small correction: the 1080 -> 2080. Everything down the line is comparable, except the 2080 Ti, which arguably compares more directly to the Titan (which is not planned for this generation)… but it’s not a platform-wide shift.

      The reasons why you might compare the Titan and the 2080 TI are as follows:
      The Titan has a LARGE 700+ mm² chip, and so does the 2080TI
      The Titan had a $1200 price tag, and so does the 2080TI
      There is no Titan planned for the 2080 series…
      Nvidia says so (this is a rumor, but if it were true, they’d probably know)

      So… that being said, I’m not really sure what you are complaining about. It seems to me, correct me if I’m wrong, that you are upset that you won’t be able to buy the 2080TI that would have come out in 8-12 months made up of cut-down/rebinned “big turing chips”

      And if that is what you are upset about, do you have a single shred of evidence to support that statement?

    • Mumrik
    • 1 year ago

    The 2080 launches at $799

    The 1080 launched at $549

    The 1080 Ti launched at $699

    Why are they comparing 2080 to 1080 again?

      • DoomGuy64
      • 1 year ago

      Because it’s a 104 series GPU with the x080 numbering. The price has nothing to do with it. Nvidia has been consistently raising prices on their cards ever since Kepler, and this is the end result. Overall, the consumers are the ones at fault for paying these increased prices, and you will find that even though some people are waking up, the majority of Nvidia fans still have no problem with the pricing.

      It’s a cult. Apple had this problem up until Steve Jobs passed, and Nvidia has taken their place. I have the opinion that if Jen-Hsun left the company, the cult would dissipate much like Apple. It’s not about the products, it’s about the cult, and fitting into a group.

        • HisDivineOrder
        • 1 year ago

        Nvidia has released superior products on a more consistent basis. Unfortunately, AMD is dropping the ball pretty hard and leaving us at the mercy of Nvidia.

        So we have the choice of:

        1) Buying AMD products in spite of their deficiencies
        2) Not buying Nvidia or AMD GPUs.

        It makes little sense to buy an inferior product just because they’re the only guy that isn’t Nvidia. It makes more sense for us to demand that AMD stop being inconsistent and stop leaving us at Nvidia’s mercy for years at a time.

          • Krogoth
          • 1 year ago

          Actually, Nvidia has only provided superior stuff on a consistent basis recently, from Maxwell onward.

          Prior to that point, it was in the same mixed bag as ATI/AMD RTG.

      • Voldenuit
      • 1 year ago

      [quote<]The 2080 launches at $799
      The 1080 launched at $549[/quote<]

      Don't forget the 1080 FE launched at $699, a full $150 over the suggested price of non-reference versions, and was the only version readily available at launch. $549 cards weren't available until 3 months after launch, and 9 months after launch, nvidia dropped the official MSRP to $499. Then the crypto bubble hit.

      Similarly, the $799 launch price for the 2080 is for the FE. We may see cheaper 3rd party cards once supply clears up.

        • Krogoth
        • 1 year ago

        I wouldn’t count on it, though. Nvidia shareholders got hooked on the margins that etailers got from the mining craze. They want to continue that trend, get a bigger piece of the pie, and make it the new pricing paradigm.

          • DoomGuy64
          • 1 year ago

          The real question is whether gamers will tolerate it, now that they’ve successfully lobbied against and boycotted garbage like Battlefront 2. Maybe growing a spine only applies to games, or maybe Nvidia won’t have the unchallenged support that they think they have. Time will tell.

    • freebird
    • 1 year ago

    How many others on here are concerned that this is more of a memory bandwidth comparison benchmark, instead of an actual GPU comparison test?

    I say that because game testing reviews and comparisons are rarely done at 4K with HDR on.

    there are a few:
    HDR Benchmarks perf impact on AMD (-2%) vs Nvidia (-10%) – 12 Games at 4K HDR

    [url<]https://www.guru3d.com/news-story/hdr-benchmarks-perf-impact-on-amd-(-2)-vs-nvidia-(-10)-12-games-at-4k-hdr.html[/url<]

    The GTX 2000 series is probably good news if you do play at 4K with HDR on, but the performance difference is probably game-specific and down to memory bandwidth at this resolution. Nvidia may also have fixed this issue with the GTX 2000 series. Quote from the link above: "The performance loss is likely due to the way NVIDIA converts RGB to reduced chroma YCbCr 4:2:2 in HDR"

    Edit: Didn't take long for the "serious" haters on here to jump in with the down votes... ;D

      • psuedonymous
      • 1 year ago

      No, people are downvoting you because your own source says that HDR should have no effect on performance. Engines are ‘HDR’ internally, after all; the only ‘SDR’ bit is the final downsampling for output (which is why they can do all that fancy colour grading to produce a consistent brown-on-brown aesthetic).

        • freebird
        • 1 year ago

        but the RESULTS show HDR DOES have an effect on GTX 1080 performance. Didn’t you look at the results??? If Nvidia "fixed" the GTX 2080 so there isn’t a performance hit with HDR enabled, then the performance difference between the GTX 2080 and GTX 1080 may be less if you don’t game at 4K with HDR ON. THAT is MY POINT.

        Which all the downvoters seem to ignore.

        I’d love to game at 4K with HDR on, but I also want 100+ Hz refresh rates (and probably 30″+, but I’d drop that for now), and currently those monitors are few & far between and more expensive than the GTX 2080.

        [url<]https://www.newegg.com/Product/Product.aspx?Item=N82E16824236885[/url<]

        And if you can afford those monitors, then you won't mind paying for the GTX 2080 Ti. So tell me, does Nvidia's benchmark tell us more about playing performance at 4K with HDR ON, or about the performance increase most buyers can expect when buying a GTX 2000 vs GTX 1000 series card?

          • DoomGuy64
          • 1 year ago

          It’s with HDR on. Best case scenario with maxed out settings that only exist in the latest games that support those features. That’s why Nvidia is not being upfront on the settings used, because the performance difference without those settings will be much smaller, and this card will only appeal to a niche market that wants those features. Like Raytracing, etc.

          • psuedonymous
          • 1 year ago

          Once again, your own link states that this slowdown only occurs when 4:2:2 subsampling is active. This will only be active when HBR3 bandwidth is exceeded, which means outputting at UHD, AND with 10+bpc, AND when above 90Hz.

            • freebird
            • 1 year ago

            ONCE AGAIN, are you saying I’m wrong and the GTX 1080 doesn’t have a reduced frame rate in a number of games when running @ 4K with HDR ON versus HDR OFF???

            That was the gist of the article I linked to…

            and my point being that the GTX 2080 has more memory bandwidth (448GB/s) and likely doesn’t exhibit the same slowdown as the GTX 1080 (320GB/s).
            Which points out that the performance difference between the GTX 2080 vs 1080 in the SELECTIVE test at 4K with HDR ON was meant to show the GTX 2080 performance in the best light (not ray traced ;) ) possible.

            So yes, thank you for helping make my point.

            • psuedonymous
            • 1 year ago

            Yes, you are wrong. As the article states, the performance impact occurs only in one specific situation (i.e. not all HDR output will be affected), and is due to the output format only. It has no effect on the memory bandwidth required to render, as regardless of whether you output as HDR or SDR the internal precision used for lighting calculations will remain the same.

            • freebird
            • 1 year ago

            Once again you are arguing a point I’m not making… make all the strawman arguments you want, but I never said all HDR output would be affected or even that all games would exhibit a decrease, because you can see in this and other articles that some games are affected more than others and some not at all.

            The point is that the GTX 1080 suffers a performance penalty with HDR ON at 4K (to differing degrees across games), and I doubt the GTX 2080 exhibits the same… you can keep blabbing about the article all you want, but those are the facts. I think it is due to the bandwidth difference between the GTX 1080 (320GB/s) and GTX 2080 (448GB/s).

            Are you arguing that it doesn’t?

            Are all the sites that tested the GTX 1080 at 4K with and without HDR WRONG then?… ok.

            Thanks for that update…

            I have more faith in this analysis than yours or Nvidia’s PR Team.

            [url<]https://www.hardocp.com/news/2018/08/24/adoredtv_dissects_marketing_to_make_nvidia_turing_rtx_performance_predictions/[/url<]

            • psuedonymous
            • 1 year ago

            [quote<]The point is that the GTX 1080 suffers a performance penalty with HDR ON at 4K [/quote<]

            And once again, you are completely skipping over the crucial part: [i<]when using 4:2:2 chroma subsampling[/i<]. There is no reason to use 4:2:2 at all unless you are bandwidth limited, and the only situation where you are bandwidth limited at UHD over HBR3 (DP 1.3 or 1.4) is when you are operating with more than 8 bits per channel at over 90Hz at the same time. You could be outputting SDR at 1080p, and if you forced 4:2:2 chroma subsampling you'd see the same bug.

            And quite frankly, anyone waving around "X is memory bandwidth limited" without an Nsight or GPUTrace log pointing to "here's where the GPU is spinning waiting on vRAM" is not going to garner much belief. As we've seen with every card using HBM, gaming workloads are not memory bandwidth limited even at extreme resolutions.
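
            A rough sketch of that link-bandwidth arithmetic (Python; blanking intervals, which add roughly another 10-20%, are ignored, and the exact refresh cutoff depends on the timing standard):

                # Usable HBR3 data rate on DP 1.3/1.4: 4 lanes x 8.1 Gbps, less 8b/10b coding overhead.
                HBR3_USABLE_GBPS = 4 * 8.1 * 8 / 10   # ~25.9 Gbps

                def active_pixel_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
                    # Full RGB output, no chroma subsampling, blanking intervals ignored.
                    return width * height * refresh_hz * bits_per_channel * channels / 1e9

                print(active_pixel_gbps(3840, 2160, 60, 10), HBR3_USABLE_GBPS)   # ~14.9 vs ~25.9: fits easily
                print(active_pixel_gbps(3840, 2160, 98, 10), HBR3_USABLE_GBPS)   # ~24.4 vs ~25.9: over budget once
                                                                                 # blanking is added, hence 4:2:2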

            • Krogoth
            • 1 year ago

            It probably has more to do with Pascal’s compression struggling with 10-bit color content, which ends up eating more precious bandwidth. Volta/Turing address this problem.

            • psuedonymous
            • 1 year ago

            Yet again: whether display output is 8bit, 10bit or more (or less), the engine will still be operating internally at the exact same bit depth.

            • Krogoth
            • 1 year ago

            The problem is that 10-bit content is harder for Pascal to compress than 8-bit content, and as a result there’s a hit on memory bandwidth when you run in HDR mode. The bandwidth hit is much more noticeable at higher resolutions with AA on top.

            This isn’t a new revelation either. It has been known for a while.

            Nvidia is just trying to show the 2080 in the best possible light with current content.

            • psuedonymous
            • 1 year ago

            Again, you’re mixing up output bit-depth with texture bit-depth. Regardless of what bit-depth the final output is, the textures will be stored and operated on at the same bit-depth regardless. 8-bit output, 10-bit output, the GPU gives not the tiniest shit and will use the exact same texture bit-depth and buffer bit-depth.
            It’s not until the final composited buffer is made available that sampling down to the output bit-depth occurs. And compared to the bandwidth used during rendering, the bandwidth needed for a single additional buffer per frame is absolutely hilariously miniscule, even at 8K or above.

            • Krogoth
            • 1 year ago

            They are both the same thing. HDR mode = 10-bit texture and color output. There’s a fair amount of circumstantial evidence that Maxwell/Pascal’s memory compression struggles with it, but it is addressed in Volta/Turing.

          • VincentHanna
          • 1 year ago

          [quote<]then the performance difference between the GTX 2080 vs GTX 1080 may be less, if you don't game at 4K with HDR ON. THAT is MY POINT.[/quote<]

          Yeah. If you test at 1080p 60hz, medium settings there's likely to be ZERO difference between them. Woo hoo. Might as well save some cash and buy a used gtx 780.

      • VincentHanna
      • 1 year ago

      Even if your premise were solid, e.g. that the 1080 series had bandwidth problems which aren’t present in the 2080 series, and they get zero benefit from their new cores and/or arch…

      So what? 50% is still 50% imo.

      That being said, I doubt your premise.

    • deruberhanyok
    • 1 year ago

    Okay. Even assuming these numbers are accurate for 4K performance (which is a pretty big leap of faith for a vendor-supplied benchmark result), the sheer number of people I’ve seen on forums and even in the comments here saying "this is worth the upgrade from a 1080ti" is mind-boggling.

    How is it that six months ago everyone was complaining that video cards are too expensive because of mining, but now people are seriously willing to spend $800+ on something that might be 50% faster than their current GPU in certain, very specific, limited use-case scenarios?

    I mean, we haven’t seen any real test results, NVIDIA was all over the place spouting “RTX ops” and glossing over the part where the 2080ti is only pushing about 20% more tflops than a 1080ti when you discount the tensor and RT cores.

    It’s a step up, sure, but how is it suddenly worth it to spend $800, $900, even $1200 on a GPU when those prices were causing flailing of limb and gnashing of teeth just a few months ago?

      • PixelArmy
      • 1 year ago

      You answered your own question. “It’s a step up” while 6 months ago it was inflated prices for the same old thing.

        • deruberhanyok
        • 1 year ago

        I guess. But… until we see actual benchmark results, right now the numbers are all showing it to be a “20% ish” step up. Is that really worth $1200 to people?

          • Waco
          • 1 year ago

          Not too many.

          • sonneillon
          • 1 year ago

          It would have been stupid of me to buy a 1080 or 1080ti a few months ago due to the artificial markup, considering it’s a 2-year-old piece of hardware and there had been hints that Nvidia was going to release a new generation of video cards.

          I have a GTX 970 paired with an i7-2600K CPU. I’m probably going to be replacing my entire rig in early 2019, not because of money concerns but because replacing it now with a 1080 or a current-gen Intel just makes no sense. The next-gen Intel should have the hardware fixes for the multiple recent security vulnerabilities (and in theory the hardware fixes will have less of a performance impact than the software fixes), and why get a 1080 when in a few months I should be able to get a 2080 or 2080ti?

            • deruberhanyok
            • 1 year ago

            I was ranting more about people saying it was a worthwhile upgrade from a 1080/1080ti – see original post.

            If you’re buying fresh, or upgrading from a much older system, sure, get whatever is best that you can afford.

            My confusion comes from people with existing 1080 / 1080ti cards, who are so hungry for something new they seem to be willing to pay out the nose for something with no reliable performance data available.

            I mean, who preorders a video card?

      • _ppi
      • 1 year ago

      I was considering buying 1080Ti a year ago, but GPU prices made it impossible to buy it. Now, if 2080 is about on par with 1080Ti and just marginally more expensive, why should I not consider it?

        • deruberhanyok
        • 1 year ago

        I can’t fault your logic, but that’s a pretty big “if” to make based on a single nvidia-released benchmark.

    • Srsly_Bro
    • 1 year ago

    I also found that Dell released an update to my S2716DG (G-Sync) in the form of the S2719DG (FreeSync). I wonder why G-Sync wasn’t included again.

    • wierdo
    • 1 year ago

    Saw this today and thought I’d share:
    [url<]https://www.youtube.com/watch?v=CT2o_FpNM4g[/url<]

    Looks like it's going to depend a lot on dev optimizations; otherwise it's something closer to a 20 percent performance gain vs last gen - 2080 vs 1080 (non-Ti) - in "current" titles, more or less.

    • Laykun
    • 1 year ago

    I’m now even more intrigued to see what the TR review is going to look like. If this card can double my 1080Ti in performance with same or imperceptibly different image quality then it hits my metric for “upgrade”.

      • DancinJack
      • 1 year ago

      It will not double your 1080 Ti. No need to wait.

      • K-L-Waster
      • 1 year ago

      Looks like it’ll beat the 1080 TI, but doubling it is highly unlikely.

    • RdVi
    • 1 year ago

    Am I wrong in assuming that possibly these results are with regular SSAA turned on? If the DLSS results are faster, then wouldn’t that have to be the case?

      • RdVi
      • 1 year ago

      OK, never mind. I think the DLSS results might actually be rendering at a lower res and then upsampling to 4K, which seems to be what DLSS actually is. This explains the performance improvement.

      • Action.de.Parsnip
      • 1 year ago

      I’m pretty sure the running demos seen were with AI AA vs supersampling. We won’t get any actual data for quite a while.

    • DPete27
    • 1 year ago

    2080 is about 50% faster than the 1080
    1080Ti is about 33% faster than the 1080
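
    A quick sketch of the arithmetic those two figures imply (the ratios come from Nvidia's chart, not measured data):

        rtx_2080_over_gtx_1080 = 1.50     # "about 50% faster than the 1080"
        gtx_1080ti_over_gtx_1080 = 1.33   # "about 33% faster than the 1080"
        print(rtx_2080_over_gtx_1080 / gtx_1080ti_over_gtx_1080)   # ~1.13, i.e. ~13% ahead of a 1080 Ti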

      • chuckula
      • 1 year ago

      Ngreedia intentionally gimped the 1080Ti before it even launched to make Turing look better: CONFIRMED!

        • DeadOfKnight
        • 1 year ago

        Apparently you are the only one who can make sarcastic comments like this.

          • K-L-Waster
          • 1 year ago

          I think his license runs out in 2022.

      • jihadjoe
      • 1 year ago

      2080 is 13% faster than 1080Ti?

        • DPete27
        • 1 year ago

        So it would seem.

        • Mr Bill
        • 1 year ago

        Copy that calculation.

    • Srsly_Bro
    • 1 year ago

    the RTX 2080 is about half again as fast as the GTX 1080 in every title Nvidia tested at 4K and with HDR on.

    Half again as fast?

    Can someone explain to me what that means?

      • DancinJack
      • 1 year ago

      + ~50 percent over the 1080. So if a 1080 gets, say, 100 FPS in a game, a 2080 might get somewhere around 150 FPS in the same game/settings.

        • Srsly_Bro
        • 1 year ago

        I understood it through looking at the graphs, but what an interesting way to say it. I’ve never seen anything worded that way. Thanks, bro.

          • Waco
          • 1 year ago

          Pretty common parlance IMO.

            • WhatMeWorry
            • 1 year ago

            Maybe an Americanism?

            • Pville_Piper
            • 1 year ago

            yup…

            • Shobai
            • 1 year ago

            Aussie here, and it’s quite a common phrase IME.

            • f0d
            • 1 year ago

            aussie here too and i have never heard it said like that before

            • Srsly_Bro
            • 1 year ago

            No one in my peer group, generally under the age of 30, has ever said that in college or on the job. Perhaps it’s generation-centric, which would explain my lack of familiarity.

            • Shobai
            • 1 year ago

            And you’ve never interacted with anyone outside your peer group, at a guess? Orphan, raised by orphans, etc?

            Perhaps you should sit down to read this next bit, but it sounds like you’re sheltered – you need to get out more. Like, srsly, bro.

            • Waco
            • 1 year ago

            I’m 33. :shrug:

            Anyway – now you know, now you can use it. :)

            • sweatshopking
            • 1 year ago

            Perhaps you just need some education, son. Hit me up on skype laterz

      • Shobai
      • 1 year ago

      Break it down, James Brown.

      You’ll know "as fast as", and "half again" is an additional half.

        • Srsly_Bro
        • 1 year ago

        I don’t hang out with people over the age of 40 or even “once again as much”, so I can honestly, and srsly say, I’ve never heard or read that idiom.

        These idioms lead to confusion and the general public and dorks on here have a hard enough time with more common idioms and the logic dictating their meaning.

        It’s more difficult trying to imagine what the person was trying to say than just using a number. How can you half again if you haven’t halved once before? It’s really stupid.

        Edit:
        I think the James Brown reference answered my question.

          • f0d
          • 1 year ago

          im over 40 and i have never heard it before either

          • Shobai
          • 1 year ago

          [url=https://m.youtube.com/watch?v=95Y5kRQBKo0<]Yes, I'm almost certain it did.[/url<]

          • Waco
          • 1 year ago

          It makes perfect sense if you spend a few seconds thinking about it. Half again as much. Half, again (on top of) the other number. 150%. This is even more clear if you [i<]just look at the graph[/i<]. Boom, confusion turned into learning experience.

          I get that you're trying to make everyone sound old (or something) but you're only reinforcing common stereotypes of millennials (and technically I am one, so take that how you will). Doubling down on "how stupid" it is just makes you look petty and willfully ignorant.

            • Srsly_Bro
            • 1 year ago

            I worked it out from the graph. I said as much several times. Thanks for all the replies, bro. This was fun.

      • chuckula
      • 1 year ago

      Let’s learn how to read and do math!

      First word: Half. That means 50%.

      Next word: Again. Ok, take the previous word and use it again, that means: Half Half.

      Now mathematically, what happens when you take 50% and multiply it by 50%? You get 25% of course!

      So let’s translate this sentence:

      [quote<]the RTX 2080 is about half again as fast as the GTX 1080 in every title Nvidia tested at 4K and with HDR on.[/quote<]

      -->

      [quote<]the RTX 2080 is about 25% as fast as the GTX 1080 in every title Nvidia tested at 4K and with HDR on.[/quote<]

      READING IS FUNDAMENTAL!

        • GTVic
        • 1 year ago

        “again as” is usually interpreted as adding to the original so 100% + 50% = 150%

          • chuckula
          • 1 year ago

          [url=https://www.youtube.com/watch?v=_HvGven4qJ0<]You need a math lesson![/url<]

            • GTVic
            • 1 year ago

            Interpretation and math are separate: the interpretation is addition, and I added, so the math is correct. Since the phrase is documented, my interpretation is also correct. Proof is here: [url<]https://idioms.thefreedictionary.com/half+again+as[/url<]

        • Waco
        • 1 year ago

        Your intentional misreading of this makes me laugh…but stop. :P

        • Srsly_Bro
        • 1 year ago

        I ignored the logic and tried to understand what he meant through the graphs. I would like it if people refrained from using idioms in general. Many seem to be made up by the Common Man.

        If he meant 50% or half again, 25%, what’s wrong with just using numbers?

        Had I used logic I’d be asking about the mysterious unmentioned half. Half again? What about half prior to again? Which old person thought that saying up? Let’s figure that out.

        Who decided “again” refers to the compared value and half again is 50%? Or is it 25%? I’d wager the person who made it up wasn’t into math.

        This is really stupid.

          • K-L-Waster
          • 1 year ago

          Claiming any phrase you yourself are not familiar with is stupid… is itself… ahh, I’ll let you fill in the blank.

          • Waco
          • 1 year ago

          I’d wager you’re taking a joke far too seriously to try and prove a point. :)

          • 1sh
          • 1 year ago

          I agree that is a very awkward use of the English language. I don’t think it is common at all…

            • GTVic
            • 1 year ago

            [url<]https://idioms.thefreedictionary.com/half+again+as[/url<]

            • Voldenuit
            • 1 year ago

            If only I could give you one and half again upvotes.

            • mat9v
            • 1 year ago

            Well, I learned this phrase while studying for my English FC certificate, so it must not be so obscure.

          • cygnus1
          • 1 year ago

          [quote<] what's wrong with just using numbers? [/quote<]

          Because numbers aren't words and Jeff is an intelligent tech journalist/writer, so he writes words... Words and phrases made of words can represent numbers and entire math operations in much simpler ways than writing out the numbers. You're just being willfully ignorant and I'm surprised you haven't broken out the phrase 'fake news' in this thread.

          • willyolioleo
          • 1 year ago

          “This is really stupid.”

          “This” being yourself, right?

          Your basic literacy level is your problem, not the author’s. Blaming someone else for your own lack of understanding a completely normal English idiom is the height of stupid.

      • NTMBK
      • 1 year ago

      I don’t understand the confusion, it’s a perfectly cromulent saying.

      • DoomGuy64
      • 1 year ago

      From what I’ve heard, this gain is due to Nvidia finally supporting a mode similar to asynchronous compute, and deep learning AA methods, and the game engine needs to use it for the full performance boost.

      Considering that Nvidia has previously strong-armed devs into not supporting async, the titles that see maximum gains should be fairly limited at launch.

      This means early 2080 reviews will likely not show 50% gains, and people who take those reviews at face value will not realize the card has further potential. People will only see the difference once games start taking advantage of the new capabilities. That said, considering how Nvidia has such a "good" relationship with the tech press, it is very possible that Nvidia pressures reviewers to pick benchmark titles that support these features at launch, so the card may not actually bomb in reviews anyway. Either way, the gains are real, but games need to support them.

    • NoOne ButMe
    • 1 year ago

    These numbers look totally believable.

    Just uh. Delivering 1080ti performance at 1080ti pricing in a new generation is well.

    Not so impressive.

      • DancinJack
      • 1 year ago

      Not that they’re super useful yet, but there are some cool features included with Turing. RT, Tensor cores, DLSS, NVLink, GDDR6. You’re just paying a premium for features that won’t make a huge difference yet.

      edit: wrong acronym

    • Waco
    • 1 year ago

    I don’t think anyone makes a truck big enough to hold all the salt I need to look at that slide.

      • Krogoth
      • 1 year ago

      Never mind the salt from the fanboys trolling each other over this chart.

      • jensend
      • 1 year ago

      I dunno, I think [url=https://www.youtube.com/watch?v=638zMKTb4WI<]496 tons[/url<] is probably enough salt for me

        • chuckula
        • 1 year ago

        He who controls the Salt controls the universe!

          • freebird
          • 1 year ago

          I think you meant Lice, not to be confused with Spice. ;D

            • Shobai
            • 1 year ago

            [quote<]He who controls the Salt controls the Lice[/quote<] I usually use it for leeches or slugs, does it even work on lice?

            • freebird
            • 1 year ago

            No, but Old Spice does…

            [url<]https://www.youtube.com/watch?v=H3J16WV06Vo[/url<]

            • Voldenuit
            • 1 year ago

            Haven’t you heard of Salt and Pepper hair?

    • dragontamer5788
    • 1 year ago

    Hmm, that makes sense for 4K / HDR, which would be severely bandwidth-constrained. With GDDR6, the 2080 should have a leg up on the previous generation.

    So from a gaming perspective, this card is just so overkill for me that I won’t ever consider it. My current monitor is still 2560×1080 (ultra wide). The RTX features are actually far more interesting, since they’d give me higher-fidelity on my (effectively) 1080p monitor.

    Serious esport gamers who game at 144 Hz will have to wait and see. The 1080p or 1440p performance numbers matter more.

    The 2080 seems comparable to the 1080 Ti, but the 1080 Ti is way cheaper right now. Honestly, the 1080 Ti fire sales look like a way better deal.

    Poor Vega.

      • Krogoth
      • 1 year ago

      Vega is only viable in the high-end gaming market because of Freesync. If Nvidia were willing to enable the VESA VRR spec on its desktop/workstation GPUs, they could literally hammer the last nail into AMD RTG’s coffin in the discrete GPU market.

        • techguy
        • 1 year ago

        Nah. The same die-hards that insist on buying Radeons today to match up with their Freesync monitors would do the same tomorrow. There will always be brand loyalists on both sides.

          • dragontamer5788
          • 1 year ago

          Hey man, if the price was right, I’d buy Vega.

          Vega 56 is looking pretty close to reasonable. If it dips below $400, I’d consider it a good deal. (Assuming the 1070 and 1070 Ti remain ~$400+ or so). Vega 56 is very good at Blender Cycles, and the RTX cards are just straight up out of my budget (so no raytracing for me this generation… not until the prices drop a bit more).

          There are no bad cards. There are just bad prices.

            • techguy
            • 1 year ago

            Fair enough. Personally, I wouldn’t buy a Vega at any price because they offer minimal upgrade from my 1070 (which was a downgrade from my Titan X (Pascal)), even though I have a Freesync display. Value is always relative.

            • dragontamer5788
            • 1 year ago

            I’m running older hardware and an older screen (a 290x at the moment, Freesync monitor). My primary purpose would seriously be running Blender Cycles, maybe toying with ROCm or CUDA. (100% custom code. Tensorflow doesn’t help me, and I’m not doing matrix multiplications).

            Blender Cycles may be updated to use those raytracing cores though. At which point, a 2070 would be the most obvious purchase, but a bit more expensive than I’m hoping for. Otherwise, I only do 3d rendering for fun / hobby purposes. Nothing professional. So Arnold / Corona / V-Ray are all too expensive for me to seriously consider.

            SIMD from GPGPU compute is super-interesting to me from a programming perspective. I want to run more experiments on SIMD devices. AVX is good on CPUs of course, but SIMD-languages like ROCm / HCC, CUDA, and OpenCL are more interesting to me.

            ————

            It’s kind of a weird niche. My main decision tree is Vega 56 (best $$/performance for Blender Cycles right now) or the RTX 2070 (if Cycles updates to use those raytracing features), depending on Blender’s development cycle and how prices end up going.

            • Liron
            • 1 year ago

            Take a look at Octane. Their upcoming version 4 will have AI lights and denoising that might benefit from the tensor cores, and it will have a free version for hobbyists.

            • dragontamer5788
            • 1 year ago

            Hmm, I have a Threadripper 1950X, though, which performs (in Cycles anyway) roughly as well as a 1080 Ti.

            For animations, I can simply load two Blender instances simultaneously (one CPU instance will use 32-CPU threads, and the 2nd instance will be a higher-priority but use GPUs instead and can be locked to one NUMA node). Or that’s the plan anyway, since Blender doesn’t do hybrid CPU + GPU rendering.

            But in the case of animations, I need so much rendering time that it makes sense to split it up between instances of Blender.

            So a GPU-only renderer like Octane (or Redshift) is a no-go, unless it's somehow faster than Threadripper + whatever GPU I end up buying.

            • Liron
            • 1 year ago

            Wow! It never even occurred to me to have one instance of Blender rendering on the CPU and one on the GPU, even though it seems so brilliant in hindsight.
            Mind. Blown!

            (although 2.8 [and 2.79, if you activate it] does have hybrid CPU+GPU rendering and it’s pretty great)

            • dragontamer5788
            • 1 year ago

            To have multiple instances running, you need to do the following:

            1. Create still images, such as .png files
            2. “Overwrite = false”. This means that Blender will NOT overwrite frame 1, or frame 2, etc. etc.
            3. “Placeholder = true”. This means that Blender will create a file named 1.png as soon as possible.

            The combination of #2 and #3 allows you to "load balance" between multiple Blender instances. I've run 4 at a time for kicks; it works pretty well for animations.

            [quote<] (although 2.8 [and 2.79, if you activate it] does have hybrid CPU+GPU rendering and it's pretty great)[/quote<]

            It seems best to run multiple instances of Blender, because a chunk of the animation work can be single-thread bound: setting up the BVH or other data structures. Furthermore, at the end of any render, there are always a few pixels that take a bit longer than the other pixels. So you want to "overutilize" your CPU slightly to get the fastest rendering speed.

            In the case of Threadripper (16 cores / 32 threads), I've gotten good results (slightly faster animation renders) with 4x instances of Blender, each with 16 threads assigned, and tied to a particular NUMA node. It's like single digits worth of speedup, but it helps.
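
            If anyone wants to script that setup, here is a minimal sketch using Blender's Python API (bpy); the output path and the GPU/CPU split are just illustrative:

                # Sketch of steps 1-3 above via bpy. Run it in each instance, e.g.:
                #   blender -b scene.blend -P multi_instance_setup.py -a
                import bpy

                render = bpy.context.scene.render
                render.filepath = "//frames/"               # 1. one still image per frame
                render.image_settings.file_format = 'PNG'
                render.use_overwrite = False                # 2. never redo a finished frame
                render.use_placeholder = True               # 3. claim a frame by creating its file
                                                            #    as soon as rendering starts
                # Point one instance at the GPU and another at the CPU:
                bpy.context.scene.cycles.device = 'GPU'     # or 'CPU' in the second instance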

            • Chrispy_
            • 1 year ago

            My issue with Vega is power consumption. You are basically trading efficiency for freesync support, compared to a 1070Ti, for example.

            I do undervolt my Vega as far as possible, and it's competitive with a bone-stock 1070Ti, but the stupid thing is that you can [i<]seriously[/i<] undervolt those too, so it's not even a fair contest comparing an undervolted Vega to Nvidia.

            You get freesync and fan noise, or quiet, efficient screen-tearing and animation stutter. The third option has priced itself out of the mainstream market and is a pitifully-limited selection of G-Sync monitors :\

            • dragontamer5788
            • 1 year ago

            For the task I'm concerned with, the competitor to the Vega 56 is not the 1070 Ti… but the GTX 1080 instead. In Blender Cycles tests, [url=http://download.blender.org/institute/benchmark/latest_snapshot.html<]the Vega 56 is comparable to the 1080[/url<]. (The 1080 wins in BMW... but the Vega 56 wins in Classroom, Fishy Cat, Koro, and Barcelona. Sometimes by a [b<]lot[/b<].)

            I've got a few games I'd like to play, but honestly, the R9 290x I have is more than sufficient for my 2560 x 1080p Freesync monitor. There's this one spot in Sonic Generations which might benefit but... I've been playing mostly low-fidelity games (Chess / Puyo Puyo Tetris / Factorio) recently. The PC games I'm interested in for the future are all relatively low fidelity too: Company of Heroes, Trails in the Sky, Civilization 6. (Yeah, I'm falling behind on my gaming schedule... the backlog is real.) Honestly, I wouldn't be "high-power" gaming very much on my setup. The most realistic task I'd be doing is, again, Blender Cycles.

            ------------

            The wildcard is whether or not the Blender community will support development of RT Core code. I'd be willing to spring for the GTX 2070 (quite a bit more expensive) if it means that Blender will have hardware-assisted raytracing cores at its disposal. The technology is possible, but the Blender community seems relatively mum about supporting this newest feature.

            • HisDivineOrder
            • 1 year ago

            Except the FX 5800.

          • DoomGuy64
          • 1 year ago

          That’s not how Freesync works. It’s NOT locked in to AMD, being a VESA standard. If Nvidia supported the standard, there really would be no reason to buy Vega.

          Also, as long as Nvidia doesn’t physically support the newer standard, the ability to “toggle a switch” doesn’t exist.

          Gsync is literally vsync off with a special DRM signal telling the monitor to use the RAM buffer on the FPGA module to manage adaptive sync. It’s not doing any adaptive work on the card, and that’s how Nvidia was able to shoehorn Gsync after the fact into existing products (Kepler) that weren’t released with the capability and never did support true adaptive sync. If anything, you could possibly hack a Gsync monitor to work with AMD and even old Win9x video cards once the DRM is cracked.

            • K-L-Waster
            • 1 year ago

            [quote<]If Nvidia supported the standard, there really would be no reason to buy Vega.[/quote<]

            Don't worry, the AMD Defense Force would find / invent one.

            • DoomGuy64
            • 1 year ago

            Sure dude, whatever you say.

            Probably the only legitimate reason would be that Nvidia is now selling their 104 mid-range parts (previously cards like the Kepler 680) for $999, while AMD is selling Vega for half the price. That’s not an argument you droids can ever hand-wave over. You can’t justify that with any amount of obscene, spittle-projecting Nvidia fanboyism.

            • jihadjoe
            • 1 year ago

            With 4352 shaders and 352-bit bus I’m pretty sure the 2080Ti is NOT a 104 part.

            • Krogoth
            • 1 year ago

            Actually, G-Sync 1.0 is adaptive sync, but it uses proprietary hardware to achieve it. It was developed before VESA’s VRR spec was finalized, a.k.a. Displayport 1.2.

            That’s why Kepler-era silicon cannot operate in anything beyond Displayport 1.1 mode. For example, it simply cannot handle 2560×1440@144hz. (I experienced this first-hand with my previous aging 660Ti and MG278Q combo.)

            Nvidia does use VESA’s VRR spec, but only on their mobile GPUs. They can just enable it on their post-Kepler desktop GPUs with a simple flag. The only crowd that would be SOL is the Kepler-era crowd operating first-generation G-Sync monitors (before the Displayport 1.2 spec was finalized).

            • DoomGuy64
            • 1 year ago

            Nothing here to disagree with. Only that you aren’t being descriptive enough.

            1. The adaptive hardware is completely implemented in the monitor itself via the FPGA. Nvidia was originally cooperating with VESA, but pulled out to implement a walled garden approach. The finalized VESA spec required hardware implementation on the video card itself, like AMD, but Nvidia bypassed that limitation by implementing VRR on the monitor.
            2. That’s just a side effect of outdated technology.
            3. Mobile only. None of the desktop cards support it. If they did, someone might have modded the driver to work with Freesync. The only way to hack Nvidia hardware onto freesync would be a mxm mobile to desktop converter. However, nvidia fanboys are such sycophants that nobody has ever attempted it. The last attempt of hacking Nvidia’s drivers was the Physx mod, and that pretty much died off as the developers couldn’t keep up with Nvidia’s updates, and the community didn’t support it long term. It would be a colossal undertaking to hack Nvidia’s drivers to use freesync, while it would be much easier to just hack their monitors.

            • freebird
            • 1 year ago

            Some people have a hard time dealing with facts on this site.

            • DoomGuy64
            • 1 year ago

            Like you. That’s why you haven’t brought up a single fact in your statement. If you actually did your research, you’d find that you can’t find anything to back up your religious beliefs.

            Oh, and Nvidia’s mobile chips using freesync isn’t an argument because the desktop cards still require the Gsync FPGA in the monitor. The FPGA is doing all the work, and it wouldn’t be in the monitor if it wasn’t necessary for the desktop cards.

            It’s quite possible for Nvidia to support real VRR on the desktop with new videocards and monitors that use DRM instead of FPGA, not that I endorse them keeping it locked, but that will never happen when Nvidia is making so much money this way. They’re currently selling two devices instead of one like AMD, and that won’t change as long as consumers let it go on. Regardless, as long as that FPGA chip exists, Nvidia is NOT doing VRR on their video card, and it is not possible to use Freesync on videocards missing the hardware support for it.

            • Krogoth
            • 1 year ago

            Wrong, any Nvidia GPU that supports Displayport 1.2 or newer can technically use VESA’s VRR spec. The problem is that it is locked behind drivers. The FPGA module is only for G-Sync monitors. The funny part is that it does create some interesting handshaking issues over Displayport with some non-G-Sync monitors.

            You can hook up a "Freesync" monitor to a laptop equipped with a modern Nvidia GPU that has "G-Sync for mobile GPUs" enabled. The monitor will think it is operating in Freesync mode.

            FYI, Freesync 1.0 = AMD’s implementation of VESA’s VRR spec.

            VESA’s VRR works entirely on the Displayport 1.2 interface. It doesn’t need any fancy chip or hardware to do it. That’s the beauty of it. G-Sync 1.0 is an artifact that is only around because Nvidia wants a "walled garden" ecosystem for some bizarre reason. They could easily make far more revenue (by stealing mindshare from "Freesync" monitor users) if they just adopted VESA’s VRR spec on their desktop/workstation GPUs (Freesync is the only reason AMD RTG is even viable in the discrete market).

            The only losing party would be early adopters of G-Sync 1.0 (Pre-Displayport 1.2 era monitors).

            • DancinJack
            • 1 year ago

            I’m being super picky, but it’s 1.2a, not just 1.2. Agree on all other counts. :)

            • DoomGuy64
            • 1 year ago

            Exactly. Which means that Nvidia still doesn’t support Freesync. Anyone looking through the fine details like this will know that Nvidia has used every trick in the book to avoid supporting it.

            1.2a is optional, and Nvidia has OPTED OUT. Therefore the version argument can’t be used until Nvidia supports a version that makes VRR mandatory. No 1.2 Nvidia cards will ever support freesync. This is also why AMD’s GCN 1.0 can’t do freesync. It doesn’t support 1.2a, and Nvidia would also have to release new hardware just like AMD to support VESA VRR.

            edit: Just had an epiphany. All it would take to enable freesync on Nvidia cards is a “Freesync Module”. What do I mean by that? Something like mCable that supports freesync by buffering all unsynchronized frames and converting them into a freesync compatible display format. The product would sit completely outside of drivers, as all the processing would be done in the display cable itself via a small video processing chip. Anyone with the capabilities to produce this device is free to use my idea, or pay me royalties if they want.

            • freebird
            • 1 year ago

            Actually, I was supporting you, but now I might change teams…. ;P

            Oops, now I’m talking with myself instead of DoomGuy (Vega)[super<]64[/super<]Edition.

      • freebird
      • 1 year ago

      Yeah, I’d much rather see a performance comparison at 4K without HDR, since monitors with TRUE HDR are very expensive and unlikely to go much over 60Hz at 4K at the moment (unless you buy the $1000+ monitors).

      I’d love a 5K+ UltraWide (21:9) monitor with 120-144Hz, TRUE HDR-10, and G-Sync or Freesync 2, but those are as hard to find as a unicorn right now; unless you can afford one of the BFG monitors that are supposed to appear by the end of this year.

    • Krogoth
    • 1 year ago

    It looks like predictions on the real-world performance are pretty close based on the released hardware specs of the silicon.

      • techguy
      • 1 year ago

      I wouldn’t agree with that. The RTX 2080 runs at similar clockspeeds to the 1080 Ti yet has fewer CUDA cores. By all accounts it should be slower, yet NV shows it running 30-60% faster before factoring in DLSS, which can boost it to over twice as fast.

      There’s more going on with Turing performance than just raw FLOP numbers.

        • Jeff Kampman
        • 1 year ago

        Nvidia is comparing the RTX 2080 to the GTX 1080, though.

        • Krogoth
        • 1 year ago

        Actually, the 2080 boosts higher than the 1080 Ti. That "30-60% factor" is a comparison to the regular 1080, not the 1080 Ti.

        Turing isn’t going to have drastic changes in its rasterization pipeline over its predecessors. It is pretty much a graphics-oriented Volta.

        That’s why Nvidia’s marketing is really focusing heavily on the “Ray-Tracing” acceleration and trying to distance the attention away from Turing’s performance on current content.

        Just like the Geforce 3 back in the day: the Geforce 3 wasn’t really that much faster than the Geforce 2 family for content at the time. The Geforce 3’s main draw was its pixel/vertex shading.

          • techguy
          • 1 year ago

          Right, and the 1080 Ti is on average 30-35% faster than a 1080 so you do the math…

          2080: 30-60% faster than 1080 means it will be up to 30% faster than 1080 Ti ;)

        • cynan
        • 1 year ago

        The graph compares the 2080 with a 1080, not a 1080 Ti. A 1080 Ti is 30-40% faster than a 1080, and depending on how Nvidia is dressing up its sneak-peek benchmark (by omission of specific test settings, etc.), this could account for most of the difference.

          • techguy
          • 1 year ago

          Also there’s this from Guru3d:

          “We’ve seen some game demos of the same game running on a 1080 Ti and the other on the 2080 Ti, the performance was often doubled with close to the same image quality.”

          [url<]https://www.guru3d.com/news-story/nvidia-releases-performance-metrics-and-some-info-on-dlss.html[/url<]

          We done with this FUD now? Turing is faster than Pascal, sometimes by wide margins.

            • derFunkenstein
            • 1 year ago

            [quote<]close to the same image quality[/quote<] wait, what? was it the same or not?

            • Eggrenade
            • 1 year ago

            They’re using the tensor cores to fill in some of the pixels to get 4k resolution. This purportedly (and believably, based on my limited understanding of deep learning) does wonders for jaggies, but loses some detail that you probably can’t see at 4k anyway.

            I want to see detailed side-by-side pictures of this trickery before I buy, and I’m sure Jeff will oblige.

            • cynan
            • 1 year ago

            I don’t think anyone is arguing that Turing isn’t at least moderately faster than pascal on a core per core basis. The question is by what margin and which, if any, new rendering short cuts are being used to accomplish it, and what the impact is, if any, on image quality.

            • techguy
            • 1 year ago

            I’m as eager as anyone else to learn the answers to those questions. I also don’t find it to be likely that the new, vastly more expensive parts are going to be slower than their predecessors though, so I preclude that possibility from the list of possible outcomes.

            • Waco
            • 1 year ago

            “We ran 8XFSAA on one card, and a lightweight DLSS on the other…and it was faster! WOO!”

            I’d rather see a comparison of TAA or TXAA to DLSS, since they’re closer in terms of image quality and performance impact.

            • techguy
            • 1 year ago

            They did run TAA vs DLSS in some demos:

            From Anandtech:

            “Otherwise, NVIDIA presented a non-interactive Epic Infiltrator 4K demo that was later displayed on the floor, comparing Temporal Anti Aliasing (TAA) to DLSS, where the latter provided on-average near-identical-or-better image quality but at a lower performance cost. In this case, directly improving framerates.”

            [url<]https://www.anandtech.com/show/13266/nvidia-teases-geforce-rtx-2080-performance-numbers-announces-ansel-rt[/url<]

            • Waco
            • 1 year ago

            Thanks for the clarification – I only had time to skim. :)

    • Redocbew
    • 1 year ago

    This is becoming more and more like the analysis of movie trailers before anyone has seen the movie.

    Someone needs to create an Honest Trailers version of pre-release benchmarks.

    • DancinJack
    • 1 year ago

    Pssssssst…

    wait for reviews

    edit: stupid BBCode

      • K-L-Waster
      • 1 year ago

      You… you mean the vendor supplied numbers… might not be trustworthy?

      MY WORLD IS SHATTERING!!!11!1!

    • mad_one
    • 1 year ago

    Looks like DLSS is using checkerboard rendering or something similar? That would explain why the DLSS samples look quite blurry. Probably a fair tradeoff at 4K, not so much at lower resolutions. If it is checkerboard, the perf gain is a little disappointing, though.

      • techguy
      • 1 year ago

      What?

      No. DLSS doesn’t use traditional rendering techniques. It makes use of the tensor cores, which operate independently of the CUDA cores. Hence why Turing cards are faster with DLSS than any traditional AA methods as they allow all those traditional units to be freed up for other purposes.

        • mad_one
        • 1 year ago

        I haven't heard anything else either, but the DLSS pictures are suspiciously blurry and now Nvidia advertises a ~30% perf gain from using DLSS. TAA costs nowhere near that much perf AFAIK, so something's up here.

          • techguy
          • 1 year ago

          Upon what do you base the observation that DLSS looks blurry?

            • mad_one
            • 1 year ago

            In Nvidia's gamescom presentation they showed DLSS and TAA side by side on a scene with a skyscraper in the background. They magnified a gun in the foreground, where DLSS did indeed look better, but the whole background and especially the fine structures on the skyscraper were far sharper in the TAA picture.

            It does make a lot of sense in my head, but Nvidia hasn't released much concrete information on DLSS that I can find. They did show a lot of research about turning grainy, half-finished raytracing pictures into good-looking results, and combining checkerboard rendering plus a temporal component with AI seems like a good application of this for rasterisation.

            But honestly, the perf numbers, the blurry pictures, and Nvidia saying it uses deep learning are all I have to go on, and I'm curious what they are really doing.

            • cynan
            • 1 year ago

            Well, if it's like other deep learning applications, then it might very well be akin to checkerboard rendering, only instead of the pixels to be rendered being determined a priori, they are selected to some degree by a predictive AI algorithm (perhaps based on previous frames, etc).

            • caconym
            • 1 year ago

            It's taking a lower-res image and reconstructing a higher one using an AI algorithm that's been trained using sets of both. It also might be pulling in things from the frame buffer that a user doesn't see, like a normal pass or a depth pass, in order to figure out where edges are and how best to reconstruct them. Denoising the raytraced passes will work similarly.
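
            To make that concrete, here's a toy sketch of what that kind of reconstruction looks like in code (my own guess in PyTorch, not anything Nvidia has published; the layer sizes and the use of a depth pass are purely illustrative): a tiny convolutional net takes a low-res colour frame plus a depth buffer and emits a 2x-upscaled frame via pixel shuffle. A real network trained on pairs of low-res frames and very high-quality reference renders would be far deeper, but the plumbing is the same general idea.

            [code<]
            # Toy learned-upscaler sketch -- illustrative only, not Nvidia's actual DLSS network.
            import torch
            import torch.nn as nn

            class TinyUpscaler(nn.Module):
                """2x super-resolution: low-res RGB frame + depth pass in, upscaled RGB out."""
                def __init__(self):
                    super().__init__()
                    self.features = nn.Sequential(
                        nn.Conv2d(4, 32, kernel_size=3, padding=1),      # 3 colour channels + 1 depth channel
                        nn.ReLU(),
                        nn.Conv2d(32, 3 * 4, kernel_size=3, padding=1),  # 3 RGB channels x (2x2) upscale factor
                    )
                    self.shuffle = nn.PixelShuffle(2)  # rearranges those channels into a 2x-larger image

                def forward(self, rgb, depth):
                    x = torch.cat([rgb, depth], dim=1)  # stack colour and depth as input planes
                    return self.shuffle(self.features(x))

            # 1920x1080 input -> 3840x2160 output
            net = TinyUpscaler()
            rgb = torch.rand(1, 3, 1080, 1920)
            depth = torch.rand(1, 1, 1080, 1920)
            print(net(rgb, depth).shape)  # torch.Size([1, 3, 2160, 3840])
            [/code<]

            The heavy lifting there is just convolution math, which is presumably the part that gets offloaded to the tensor cores in the real thing.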

            • dragontamer5788
            • 1 year ago

            [quote<]AI algorithm that's been trained using sets of both[/quote<] Funny story about that. [url=https://dmitryulyanov.github.io/deep_image_prior<]Convolutional Neural Networks don't actually need any training data to improve the resolution of images.[/url<] That research suggests the structure of the network itself acts as the prior: fit a randomly initialised, untrained net to a single degraded image and its architecture rebuilds the clean content on its own, with no dataset of examples involved.
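
            For anyone curious, the gist of that trick is short enough to sketch (a bare-bones toy in PyTorch, nothing to do with DLSS itself, and the layer sizes are arbitrary): you optimise a randomly initialised conv net to reproduce one degraded image, and because the architecture tends to fit clean, natural structure before it fits noise, stopping early gives you a cleaned-up result.

            [code<]
            # Bare-bones deep-image-prior-style loop: no training data, just one degraded image.
            import torch
            import torch.nn as nn

            noisy = torch.rand(1, 3, 64, 64)   # stand-in for a single noisy / low-detail image
            z = torch.rand(1, 16, 64, 64)      # fixed random code fed to the network every step

            net = nn.Sequential(
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
            )
            opt = torch.optim.Adam(net.parameters(), lr=1e-3)

            for step in range(500):            # stopping early is the whole trick:
                opt.zero_grad()                # the net fits clean structure before it fits noise
                loss = ((net(z) - noisy) ** 2).mean()
                loss.backward()
                opt.step()

            restored = net(z).detach()         # the "restored" image, same size as the input
            [/code<]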

            • Redocbew
            • 1 year ago

            That's badass. I love it when there are people on a forum smarter than me.

            • caconym
            • 1 year ago

            That's really interesting! Thanks for the link.

    • tsk
    • 1 year ago

    I pre-ordered a 2080; I'm pleased with 1080 Ti levels of performance.

      • Chaserx
      • 1 year ago

      $839.00? For an '80 series? Pass.

        • Voldenuit
        • 1 year ago

        The TU104 in the RTX 2080 has a die that is significantly larger than the 314 mm^2 in the 1080's GP104, so you might even consider it a bargain.

        The TU102 (RTX 2080 Ti) has a 754 mm^2 die, which is just shy of the 815 mm^2 of Volta ($9k card).

          • Srsly_Bro
          • 1 year ago

          Lol nice post. I can't get over all the dorks who keep chasing die area without recognizing the performance increase or the process node.

        • Srsly_Bro
        • 1 year ago

        I paid that much for my 1080 Ti FTW3. The 2080 is about the same performance with newer technologies added.

        Edited for 2080

          • DancinJack
          • 1 year ago

          Think you meant 2080 in the second sentence here, Bro.

            • Srsly_Bro
            • 1 year ago

            I made the same correction in a news post and then went and made the mistake myself. It's like writing the wrong year every January.

            Thanks, bro.

      • Freon
      • 1 year ago

      There have been some great sales on the 1080 Ti lately...

        • freebird
        • 1 year ago

        Yeah, Amazon has the Zotac 1080 Ti AMP Edition for $526.50, Srsly_Bro...

    • techguy
    • 1 year ago

    If these numbers hold up in independent reviews, suddenly Geforce RTX looks a lot more attractive. I still want more VRAM though.

      • leor
      • 1 year ago

      How many times have numbers like this held up from either AMD or nVidia?

        • techguy
        • 1 year ago

        Depends on what color your glasses are.

          • leor
          • 1 year ago

          Is best bang for the buck a color?

            • Spunjji
            • 1 year ago

            Sure it is, but it's rarely green!

            weeyyyyyyyyyy~

        • K-L-Waster
        • 1 year ago

        We'll have to see, but I certainly wouldn't recommend buying anyone's GPU without seeing independent test numbers first.

        • psuedonymous
        • 1 year ago

        For Pascal, [url=https://images.anandtech.com/doci/10326/PascalCraft.png<]Nvidia claimed a 1.7x improvement in non-VR for the 980 to the 1080[/url<]. HardOCP's generation testing found [url=https://www.hardocp.com/article/2018/07/25/nvidia_gpu_generational_performance_part_1/17<]the 1080 consistently in the 60%-70% improvement range over the 980 from launch[/url<].

          • techguy
          • 1 year ago

          You mean a publicly traded company isn't outright fabricating a public representation of their products? Gee, I wonder why that is...

          Some people around here think the recently released performance figures (RTX 2080 vs. GTX 1080) are not to be trusted "because marketing". I'm all for skepticism, but this is not an attempt at obfuscation on Nvidia's part. They're not "hiding" the performance of RTX parts because those figures are lackluster; they're holding back details to give partners a chance to move as many of those old Pascal cards as they can before the new parts hit. This isn't rocket science, and it's not a conspiracy theory.

            • Redocbew
            • 1 year ago

            There are better ways of informing business partners of plans for the future than to broadcast ambiguous results to the entire world, and then let everyone figure out on their own what it all means.

            • techguy
            • 1 year ago

            According to who? You? Those results aren't ambiguous to me. Turing is going to do just fine, and anyone who's lucky enough to own one in the early days will be smiling from ear to ear. I'll still wait for official results, but everything I've seen so far looks good from a performance standpoint.

            • Redocbew
            • 1 year ago

            [quote<]Those results aren't ambiguous to me.[/quote<] Yes, that much is clear. It should be ambiguous even if it really isn't, and if it isn't, that just makes you an even better example of confirmation bias, so I guess that's something. I wish I could find a way to make this point without being snarky, but snark comes easily, and if I did have some flash of brilliance I'm not sure I'd want to use it on this.

            • techguy
            • 1 year ago

            You can characterize my take however you like. Care to wager on how closely NV's numbers will match the average of independent reviews?

            • Redocbew
            • 1 year ago

            Seriously, bro?

            • techguy
            • 1 year ago

            I'm willing to put my money where my mouth is. Are you? I made numerous offers leading up to the launch of Vega, but no one ever took me up on them. I wonder why. Couldn't have anything to do with AMD's track record of over-promise/under-deliver, could it? PM me if you actually believe the things you say and we'll work out the details.

            • Redocbew
            • 1 year ago

            Well, that's quite amusing, I must say.

            I believe we may have to consider fractional Krogoths for a case like this. I'm not sure if I should be impressed, or not. Mostly not.

      • NoOne ButMe
      • 1 year ago

      So buy* a 1080 Ti and get the same performance and more VRAM, at a lower price.

        • techguy
        • 1 year ago

        More VRAM, sure. More performance? Not a chance. Sure, there might be a corner case you can concoct to make it so, but on average 2080 will be faster than 1080 Ti.

          • Krogoth
          • 1 year ago

          I really don't think so, especially when you factor in overclocking. The 2080 is most likely near its overclocking ceiling at its max boost speed, while the 1080 Ti has plenty of headroom if you're brave enough.

          This is the whole 980Ti versus 1080 debate all over again.

            • Liron
            • 1 year ago

            Jensen said that the 2080 overclocks like crazy.

            And if you can't trust someone in a leather jacket, who can you trust?

            • Krogoth
            • 1 year ago

            It will not scale up as much, though, simply because it has fewer ROPs and shading units at its disposal. I suspect that GDDR6 is actually overkill for the 2080 outside of general compute/"ray-tracing".

            Nvidia GPUs have been very efficient with memory bandwidth ever since Maxwell, which is why GDDR5+ did almost nothing for GP104 and GP106.

    • chuckula
    • 1 year ago

    Begun, the vendor-fabricated powerpoint wars have.

      • Neutronbeam
      • 1 year ago

      I believe in these numbers... I also believe in the Easter Bunny, Santa Claus, Bigfoot, a balanced U.S. national budget, fusion as a practical energy source, and Half-Life 3 coming out in my lifetime.

        • chuckula
        • 1 year ago

        At least Nvidia invented one feature (DLSS) that they are claiming actually helps performance. We'll see if that's actually true in practice, but it's better than ray tracing, where we know it will hurt performance.

          • Demetri
          • 1 year ago

          It would be nice to know what type of AA they're running on the 1080 for the comparison.

            • jihadjoe
            • 1 year ago

            I assume the 1080 would be running MSAA since they did say 'traditional' anti-aliasing techniques.

        • K-L-Waster
        • 1 year ago

        You had me until that ridiculous prediction of HL3....

      • Krogoth
      • 1 year ago

      Fanboys are enriching the online tech threads and community with mountains of salt.

        • Redocbew
        • 1 year ago

        Does that mean nothing will grow there once they're done?

          • Krogoth
          • 1 year ago

          You need to get Farm Simulator 2019 RTX Edition for that. 😉
