Nvidia teases potential GeForce RTX 2080 launch at Gamescom

Hot on the heels of CEO Jensen Huang's Quadro RTX announcement last night, Nvidia released a teaser video for its upcoming festivities at the Gamescom convention in Cologne, Germany. That video contains a number of winks and nods to the fact that next-generation GeForce cards are coming, as spotted by PCWorld's Brad Chacos on Twitter.

Along with gauzy shots of what looks like a finned, open-style graphics cooler and a stylized backplate that looks like it could have come off any recent Founders Edition card, the video depicts gamers building and gearing up with Nvidia hardware.

The first tip-offs that new hardware might be revealed at Gamescom come from shots of our GeForce gamers chatting. One of the chatters' handles is “RoyTeX,” whose capitalization almost certainly alludes to the RTX branding introduced last night on the Quadro RTX series of graphics cards.

RoyTeX also chats with fellow gamer “Not_11,” perhaps a hint that the next generation of cards won't carry the 11-series numbering many had expected.

Another group of chatters includes “Mac-20” and “Eight Tee.” If Nvidia was laying it on any thicker here, it could star in a Gamers Nexus thermal-paste-application-testing article.

Even the soundtrack to this teaser is suggestive: it's a version of Sam and Dave's “Hold On, I'm Comin'”, which would seem as clear a sign as anything that something is indeed going to be revealed or launched at the show.

The final tip-off comes at the end of the video, where the numbers 2-0-8-0 scroll up in order to form the date of Nvidia's announcement: August 20. We'll be watching for more details as they arise.

Comments closed
    • RtFusion
    • 1 year ago

    Videocardz just dropped a couple of bombs ahead of Gamescom:

    [url<]https://videocardz.com/77369/nvidia-geforce-rtx-2080-ti-features-4352-cuda-cores[/url<]
    [url<]https://videocardz.com/newz/exclusive-msi-geforce-rtx-2080-ti-gaming-x-trio-pictured[/url<]
    [url<]https://videocardz.com/newz/nvidia-tu104-gpu-and-geforce-rtx-2080-pcb-exposed[/url<]
    [url<]https://videocardz.com/newz/palit-geforce-rtx-2080-ti-and-rtx-2080-gamingpro-series-unveiled[/url<]
    [url<]https://videocardz.com/newz/msi-geforce-rtx-2080-ti-and-rtx-2080-duke-pictured[/url<]

    Surprised that the Ti version was found out so quickly. Aren't they usually delayed six or more months after the GTX *80 comes out? Anyway, this is exciting stuff.

    EDIT: And judging by those photos, I think Nvidia is also bringing NVLink to the masses. Look how large those top connectors are. I've only seen that shape of connector on the Quadro GV100, Titan V, and the recently announced Quadro RTX series. It would be very interesting if they also bring the memory pooling feature over. If they do, I wouldn't be surprised if they bring the bandwidth down a few notches from the Quadro RTX series' 100 GB/s.

    • ronch
    • 1 year ago

    If I’m gonna upgrade from my 5+ year old HD7770 (which is just fine for King’s Quest 3, thank you) I’d want a card that’s as efficient or more efficient than the competition and isn’t marked up 2x or 3x the MSRP. And maybe 3x faster too so I’ll get smoother gameplay in Space Quest 5. It’s more demanding than KQ3, you know.

    • Wilko
    • 1 year ago

    Aside from an ATI Rage 128, my video card upgrade path has been MX 440, FX 5200, 8800 GTS, GTS 250, an open box 580 GTX, and finally a 1080 ti. Are they going to go back to three digits when they hit the 9000 series again?

    • Kretschmer
    • 1 year ago

    My 1080Ti welcomes the competition! It had a good run…

    • ronch
    • 1 year ago

    In reference to Mortal Kombat, this thing should be called GTX 2080 FATALITY Edition for obvious reasons. (Not Fatal1ty. Oh gee let that rest already.)

    Anyway, from 1080 to 2080? Is this a 10-generation jump?

      • auxy
      • 1 year ago

      First number increments by one as usual. (‘ω’)

        • ronch
        • 1 year ago

        980 -> 1080 -> 1180.

        Right?

          • auxy
          • 1 year ago

          No? (´Д⊂ヽ
          980 -> 1080 -> 2080. “1180” is incrementing the second number.

            • ronch
            • 1 year ago

            The last two digits have always indicated the performance of a part within a given generation, while the first digit (back when the names were still three digits) has indicated the generation, ever since the 4xx series (IIRC). And since 10 comes after 9, another digit was added to make 10xx. Obviously, 11 comes after 10, not 20.

            • Glix
            • 1 year ago

            Unless it resets on a 1 to 9 range. So 9 -> 1 (0) -> 2 (0).

            😀

            • ronch
            • 1 year ago

            Wat

            • Klimax
            • 1 year ago

            They’re incrementing the first digit, not the number. (Partially following Intel)

      • jihadjoe
      • 1 year ago

      Yeah this makes me irrationally angry.

      The major version number in GTX 1080 was 10, not 1. Going by old DOS-like notation the cards would have been 10.60, 10.80 and so on. The next logical step up should be 11.

    • leor
    • 1 year ago

    Why 2080 and not 1180?

      • tipoo
      • 1 year ago

      Naming scales better I guess?

      1480 vs 5080 etc

      • Leader952
      • 1 year ago

      Because the changes are major not minor.

      • Krogoth
      • 1 year ago

      It is because the number eleven is bad at marketing and branding.

      • smilingcrow
      • 1 year ago

      Because Jensen Huang doesn’t go to 11 even though he wears a black leather jacket.
      He’s trying too hard to be all rock ‘n’ roll but he’s really The Fonz of GPUs.
      When he shows a ray tracing tech demo of himself jumping a shark on a surfboard that’s the sign to sell your Nvidia stock.

        • auxy
        • 1 year ago

        If Huang shows a graphics demo of himself jumping a shark on a surfboard at next year’s SIGGRAPH, complete with leather jacket, I will put every single dime of my savings into Nvidia stock. (;´・ω・)

      • LocalCitizen
      • 1 year ago

      better performance

      [url<]https://twitter.com/VideoCardz/status/1027903827649933312[/url<] :p

      • jihadjoe
      • 1 year ago

      Radeon 580
      [======]

      Rx Vega 64
      [=]

      Nvidia GTX 1080
      [===========]

      Nvidia RTX 2080
      [=====================]

        • K-L-Waster
        • 1 year ago

        So why not call it the RTX 2080808080808080?

          • Klimax
          • 1 year ago

          Intel would have a word with Nvidia.

          😉

            • K-L-Waster
            • 1 year ago

            Hmm, so maybe the 208020802080208020802080208020802080208020802080 then?

    • USAFTW
    • 1 year ago

    They didn’t go “Poor Volta”? I am disappoint.

    • Waco
    • 1 year ago

    The little marketing stunt they’re pulling on Facebook isn’t much more subtle. They uploaded “all green” logos, banners, etc over the last few days that have hidden text.

    I modded one in photoshop just to see if it was anything interesting but since it’s a marketing stunt I just gave up after the first one. I figure someone with some free time will do it anyway.

      • chuckula
      • 1 year ago

      You didn’t use a CUDA accelerated edge detection algorithm to get the text did you?

        • Waco
        • 1 year ago

        Photoshop. 😛 “Solid” colors always make me think of steganography.

    • tipoo
    • 1 year ago

    The ray tracing hardware took up a pretty big chunk of die area on the Quadro. Curious whether the ratio stays the same on the GeForce, and if it shrinks, whether it’ll end up like the first gen of tessellation units, where it wasn’t quite enough to do all devs wanted and the generation after it is where things really got going.

    [url<]https://techreport.com/r.x/2018_08_13_Nvidia_announces_raytracing_focused_Quadro_RTX_GPU_family/turing1.png[/url<]

      • DancinJack
      • 1 year ago

      It’ll take at least another generation for devs to move code over anyway. I won’t be expecting a lot of RT in games for a while.

    • Krogoth
    • 1 year ago

    *Stupid double post* 😛

    • Krogoth
    • 1 year ago

    Geforce 3 launch redux

    • DancinJack
    • 1 year ago

    I think the video is hilarious. My friends and I are older now, so we don’t say “legooooooo” but a lot of those convos are similar to ours.

    :You gettin on?
    :yeah, shower/eat/putting the baby down, and i’ll be on. gimme 20

    :gonna play OW – get in discord

    • jihadjoe
    • 1 year ago

    This is obviously filled with RadeonTechXnology!

    • tsk
    • 1 year ago

    Inb4 RIP AyyMD

      • techguy
      • 1 year ago

      In all seriousness, the competitive situation in desktop graphics was already quite bad, and it just got worse. I don’t know what that’s going to do to prices, but I doubt it’s good.

        • NoOne ButMe
        • 1 year ago

        It *might* not be quite so bad if the ratio of CUDA cores to Tensor cores and raytracing stuff is the same.

        Most stuff still won’t use that, or will only use it to enhance the game, and otherwise, well.

        The only die we know of so far is 60% larger with 23% more FLOPs.

        Of course even a small gain for Nvidia would completely overwhelm the strained Vega cards when it comes to competing with the 1070/1080.

        And anything using raytracing will leap ahead like mad, of course.

        But I have a feeling we will see all the tensor cores, and most-if-not-all the Raytracing stuff dumped. Which means it’s a lot better for Nvidia/worse for AMD.

          • Beahmont
          • 1 year ago

          That sounds like a good bet at first glance, but Nvidia is able to make its Ray Tracing work only because of the tensor cores running a custom algorithm, as opposed to using an AI to help with the Ray Tracing.

          I personally think it’s highly likely that while the number of Tensor Cores will be reduced, there will still be some on the cards if they actually have fixed-function Ray Tracing hardware.

          • techguy
          • 1 year ago

          I don’t think Nvidia is going to rebrand the high-end Geforce lineup as “RTX” instead of GTX and dump all the ray-tracing acceleration units. Makes no sense to do that. Cut them down compared to their Quadro RTX brethren? Maybe. We’ll see, hopefully before too much longer because I have been waiting for what feels like forever.

        • Spunjji
        • 1 year ago

        Yeah, I’m not happy. Given what Nvidia has done with pricing the past couple of generations and the new super-l337 naming scheme, I can imagine they’ll price these as “optional upgrades” rather than successors.

    • JosiahBradley
    • 1 year ago

    Why so cringy nVidia?

      • psuedonymous
      • 1 year ago

      “Poor Volta”

      • RtFusion
      • 1 year ago

      The leather jacket demands it.

        • DoomGuy64
        • 1 year ago

        It’s the new turtleneck.

      • auxy
      • 1 year ago

      What is cringey about it? It seems pretty realistic to me.

        • JosiahBradley
        • 1 year ago

        I guess I’m finally getting old if that’s how people communicate these days. T_T

          • auxy
          • 1 year ago

          [b<]YEET![/b<] (/・ω・)/

            • JosiahBradley
            • 1 year ago

            I had to google that…

            • Kretschmer
            • 1 year ago

            See? There are cringier things, like forced ASCII emojis.

          • K-L-Waster
          • 1 year ago

          Have you told anyone to get off your lawn today?
