Nvidia unveils ray-tracing-focused Quadro RTX GPU family

At Nvidia's SIGGRAPH event this evening, CEO Jensen Huang introduced “the world's first ray-tracing GPU,” the Quadro RTX. This family of graphics processors offers up to “10 gigarays per second,” 16 TFLOPS of single-precision performance, and “500 trillion tensor ops per second,” all courtesy of the Turing microarchitecture.

The first Turing GPU is a 754 mm² monster that comprises a traditional shader array, Nvidia tensor cores, and a new processing resource called the “RT core.”

Huang said the chip can address as much as 48 GB of frame buffer, and two Quadro RTX cards with this chip can combine resources to create a 96-GB pool of coherent memory using NVLink. The Quadro RTX product Huang was showing appears to have a 384-bit interface to GDDR6 memory running at 14 Gbps for a total of 672 GB/s of memory bandwidth.
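
For anyone who wants to sanity-check those numbers, here is a minimal back-of-the-envelope sketch in Python. It simply assumes the figures quoted above (384-bit bus, 14-Gbps GDDR6, 48 GB of frame buffer per card, two cards over NVLink); the variable names are ours, purely for illustration.

# Back-of-the-envelope check of the memory figures quoted above.
# Assumed inputs: 384-bit GDDR6 interface at 14 Gbps per pin,
# 48 GB of frame buffer per card, two cards pooled over NVLink.

bus_width_bits = 384        # memory interface width
pin_rate_gbps = 14          # GDDR6 per-pin data rate

bandwidth_gbit_s = bus_width_bits * pin_rate_gbps    # 5376 Gbit/s across the bus
bandwidth_gbyte_s = bandwidth_gbit_s / 8              # 672 GB/s

frame_buffer_gb = 48
nvlink_pool_gb = 2 * frame_buffer_gb                  # 96 GB of coherent memory

print(f"{bandwidth_gbyte_s:.0f} GB/s, {nvlink_pool_gb} GB NVLink pool")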

Developing…

Comments closed
    • anotherengineer
    • 1 year ago

    ahhh the signature black leather jacket.

    They really need to make a Black Leather Ed. card, with real leather coating the cooler. I bet it would sell out at insane prices just cause.

    • USAFTW
    • 1 year ago

    Interesting that they got both the TU100 and 104 variants ready to go simultaneously. I wonder if this means they’ll also release the TU104 and TU100 (2080 and 2080 Ti) GeForces at the same time?

    • tipoo
    • 1 year ago

    Winding down, no geforce today then

      • NTMBK
      • 1 year ago

      It’s SIGGRAPH, what did you expect? Gamescom is next week, we’ll hopefully see Geforce there.

    • RtFusion
    • 1 year ago

    And RTG (AMD) is once again a plastic bag whirling in the wind…

    It will be a very long time before RTG goes toe-to-toe with nVidia again the way the CPU side of AMD is going toe-to-toe with Intel now.

    Really makes me wonder what the hell happened at RTG for them to fall behind like this. I would really want them to compete because I don’t want nVidia becoming the only high-end solution forever.

    AdoredTV posted a leak on the GeForce side of things for the RTX series not too long ago, and it does look bright for nVidia but very bleak for AMD in this market. His comments near the end of the video (worth watching) paint a really depressing outlook for competition in this market.

    [url]https://www.youtube.com/watch?v=ms7HQ7rckpA[/url]

      • Krogoth
      • 1 year ago

      AMD RTG is going back to ATI’s former roots.

      They are going after consumer-tier and OEM graphics with iGPUs and semi-integrated solutions. It makes perfect fiscal sense from their standpoint. AMD RTG’s discrete GPUs haven’t made massive revenue since Evergreen. The failure of Tahiti to recapture the high-end GPU market pretty much sealed it. Hawaii never had a chance against the Maxwell family. Vega was taping out by the time Pascal was hitting retail channels.

        • RtFusion
        • 1 year ago

        You are right on all counts, and that is just depressing from my perspective. I am planning a build next year when the next-gen Zen 2 architecture comes out, and I was hoping to hop onto the AMD bandwagon again (the last AMD GPU I owned was the HD 7970 GHz Edition, a beastly GPU back then). But their future roadmaps don’t have anything competing in the high-end market until 2020, which leaves me with nVidia to go to.

        I know RTG is re-focusing on making money (because AMD as a whole still needs it), but relinquishing a very lucrative market such as PC gaming graphics to just nVidia eliminates choice and forces game developers to construct all of their game engines around a single architecture. With this whole ray-tracing push this year, nVidia is all in, and there’s barely a peep from RTG.

        I just don’t like how this will look for graphics over the next several years, and I think prices are just going to go up as a result. Worse still, I have to pay in Canadian dollars…

          • christos_thski
          • 1 year ago

          Perhaps Intel’s upcoming foray into the discrete GPU market will bring some much-needed balance. I have, unfortunately, lost all hope for AMD’s graphics group competing in the near future, too.

          I think ray tracing has a long way to go before it becomes a serious option in the gaming market (barring adoption in the PS6 or Xbox Three generation, which would spell literal disaster for the Radeon group).

          And I actually think even Intel has better hopes of competing in a GPU-accelerated ray-tracing market. Wasn’t Larrabee supposed to be far ahead of everything at ray tracing (as opposed to the mediocre rasterization results that ultimately killed the project)?

          PS: By the way, I consider the line of thought that more money is made in the OEM market, so lagging far behind in performance is not important, to be somewhat weak. Wasn’t AMD’s CPU division supposed to be doing exactly that until they started pursuing performance with Zen? And we all remember how well that had been going before Zen.

    • Krogoth
    • 1 year ago

    I’m willing to bet that RTX is going to be the new “GTX” brand moniker and Nvidia is going to try to make real-time “ray-tracing” a gaming reality.

    If this is the case, then the consumer-tier versions are going to be a repeat of the GeForce 3 launch.

      • JustAnEngineer
      • 1 year ago

      Remember when they brought out the GeForce 4 MX, which was actually two generations [b]older[/b]? (The GeForce 4 MX was a rebadged GeForce 2 MX, which was much less capable than the GeForce 2 GTS.)

        • Krogoth
        • 1 year ago

        I suspect they might attempt a repeat of similar shenanigans by making future “GTX” line-ups have no hardware support for Nvidia’s “Ray-Tracing” acceleration.

        RTX = Ray-tracing acceleration support

        GTX = No Ray-tracing acceleration support

        But Nvidia’s marketing could easily throw a curveball in there, and you’ll end up needing a decoder ring to make sense of it.

      • tipoo
      • 1 year ago

      I’ve been thinking something like that: if the ray-tracing part of the GeForce versions is cut down (as this is a pretty big die), it might also be like the early tessellation units that weren’t quite powerful enough to really use.

      • DoomGuy64
      • 1 year ago

      They are doing exactly that from what I’ve heard, and sponsoring ray-tracing via GameWorks, which will literally obsolete every card that is not “R”-TX branded. Those multi-grand Titan cards are now paperweights against 104-series mid-range parts.

      Of course, this new branding and planned-obsolescence push will no doubt alienate the existing base, especially by killing a long-running and well-loved naming scheme, confusing users between GeForce and Radeon, and angering Titan owners who cannot acceptably run the new GameWorks effects.

      Am I surprised by the new depths Nvidia will sink to in order to scam people out of money? No. That peaked with GameWorks, Kepler, and G-Sync, and this is merely more of the same. The only real answer is to just disable GameWorks and ignore the manufactured hype.

        • NTMBK
        • 1 year ago

        They’re investing money to add [i]extra[/i] features. If you can't run the new features, just disable them and move on with your life. Anyone who gets angry because they can't run shiny reflections in a computer game needs to get out more. Would you prefer NVidia to sandbag their graphics software engineering, and keep everything stuck at a level where it could run on a GTS 450?

          • K-L-Waster
          • 1 year ago

          [quote]Would you prefer NVidia to sandbag their graphics software engineering...[/quote]

          If you've been following his posting history, I think his preference would be for NVidia to sandbag the entire company.

            • DoomGuy64
            • 1 year ago

            No, just play fair and stick to standards instead of walled gardens.

            • K-L-Waster
            • 1 year ago

            Which means what, never add anything new? By definition, anything new isn’t part of a standard.

            Your idea of “play fair” seems to be “don’t release anything AMD can’t beat.”

    • chuckula
    • 1 year ago

    I will give Nvidia credit for having a much more convincing panicked response to Threadripper 2 than Intel did.

    • just brew it!
    • 1 year ago

    Practical real-time ray tracing is like practical nuclear fusion: It has been coming “soon” for a really long time. I’ll believe it when I see it being used in real products, running real software (not just benchmarks), producing visuals which are quantifiably better than those produced by mainstream GPU tech, at reasonable frame rates.

    • chuckula
    • 1 year ago

    Nvidia needs to have a company museum with two exhibits.

    One of their GPUs.

    And another of the leather jackets that Jensen Huang has worn on stage.

      • tipoo
      • 1 year ago

      So one leather jacket?

        • chuckula
        • 1 year ago

        He’s got one on that’s definitely different than what he’s worn in the past.
        Notice the lack of piping on the ray-traced arms!

          • tipoo
          • 1 year ago

          Hmm. I assumed he was Jobsing it.

    • chuckula
    • 1 year ago

    Big Turing is 754 mm². The only bigger chip that Nvidia makes is Volta. That ain’t gonna be cheap!

      • anotherengineer
      • 1 year ago

      Nvidia is becoming like Apple. They can command a premium for anything and people will pay it.

    • tipoo
    • 1 year ago

    “And this GPU…Reflects light perfectly into the camera…Because it has the force”

    That was exactly the type of awkward, leather-clad humor I was tuning in for.
