Radeon Pro Duo spearheads AMD’s push for VR dominance

GDC—At its Capsaicin event this evening, AMD took the wraps off a wide range of software and hardware projects that put the spotlight on virtual reality. The company boasts that its products underpin 83% of VR-capable entertainment systems around the world, a figure driven in large part by the company's presence in the console space. AMD is also exploring VR opportunities beyond gaming in the health care, education, media, training and simulation, and entertainment industries.

First and foremost, the company is releasing a new graphics card called the Radeon Pro Duo. This is the dual-Fiji card (previously known as Gemini or the Fury X2) that CEO Lisa Su first showed off in June of last year. The card comes with 8GB of HBM VRAM, or 4GB per GPU. Like the Fury X before it, this card relies on liquid cooling to manage its prodigious heat output. According to AnandTech's Ryan Smith, the card will come with ISV-validated drivers for professional applications, much like AMD's FirePro cards.

The Pro Duo is the first card in what AMD is calling its "Radeon VR Ready Creator" program—products meant to serve double duty as powerful VR development and playback platforms alike. The company says the card will deliver 16 TFLOPS of compute performance, and it'll be available early in the second quarter of this year with a $1500 price tag.
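
As a quick sanity check of our own, that figure is consistent with two full Fiji GPUs doing single-precision math: 4096 stream processors per GPU, times two operations per clock, at roughly 1GHz works out to about 8.2 TFLOPS per GPU, or around 16.4 TFLOPS for the pair. That implies clock speeds at or just below the Fury X's 1050MHz.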

Polaris may be the first step in AMD's next-generation GPU architectures, but Radeon Technologies Group VP Raja Koduri also shared a tantalizing look at the roadmap beyond it. While the company admits this roadmap is subject to change, it does offer a first look at when we can expect various features (like HBM2) to arrive on future AMD graphics cards.

AMD is also partnering with Crytek to put Pro Duo-equipped PCs in universities around the world as part of Crytek's VR First project. The two companies plan to promote VR development for head-mounted displays like the Oculus Rift and HTC Vive using AMD's LiquidVR SDK, a set of development tools for virtual-reality applications. AMD says those tools let devs perform GPU-accelerated head tracking, harness multiple graphics cards to scale rendering performance, pass head-tracking data to the GPU with a minimum of latency, and ease the process of connecting VR headsets and displaying content on them.
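
To give a flavor of what "harnessing multiple graphics cards" means for a VR renderer, here is a conceptual sketch of the one-GPU-per-eye split that LiquidVR's multi-GPU support is meant to enable. The types and calls below are our own illustrative placeholders, not the SDK's actual interfaces:

    // Conceptual sketch only: Gpu and RenderEye are placeholder names,
    // not LiquidVR API. The point is the division of labor: each of the
    // card's two GPUs renders one eye's view in parallel.
    #include <cstdio>

    enum class Eye { Left, Right };

    struct Gpu {
        int index;
        void RenderEye(Eye eye) const {
            std::printf("GPU %d rendering %s eye\n", index,
                        eye == Eye::Left ? "left" : "right");
        }
    };

    int main() {
        // A dual-GPU card like the Pro Duo exposes two GPUs: assign the
        // left eye to one Fiji chip and the right eye to the other.
        Gpu gpus[2] = {{0}, {1}};
        gpus[0].RenderEye(Eye::Left);
        gpus[1].RenderEye(Eye::Right);
        return 0;
    }

In a real engine, each eye's command stream would be submitted to its own GPU and the two finished views composited in time for the headset's next refresh.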

Along with the VR Creator product tier, AMD is also introducing a "Radeon VR Ready Premium" badge that will identify Radeon graphics cards and Radeon-equipped PCs that should offer a high-quality VR experience. One of those systems is the HP Envy Phoenix desktop we looked at a couple weeks ago. Cards from the Radeon R9 290 series and up should be eligible for the VR Ready Premium badge.

Last, but certainly not least, AMD showed off a version of its Polaris 10 GPU running Hitman. From what we know so far, Polaris 10 is the larger of the two upcoming Polaris chips, while the smaller Polaris 11 is AMD's "console-class" graphics chip for thin-and-light notebooks. The company expects that graphics chips and cards using Polaris silicon will be able to deliver as much as two times the performance per watt of Nvidia's GeForce GTX 950. AMD demonstrated Polaris 11 in December of last year. In that demo, a Polaris 11 chip running Star Wars Battlefront at 1080p and 60 FPS drew about 90W at the wall, while a similarly-configured, GTX-950-equipped PC drew about 140W.
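
Some back-of-the-envelope math on those figures: 140W versus 90W at the wall is only about a 1.56x difference at the system level. If we assume the rest of each test rig drew somewhere around 50W (our assumption, not a figure AMD provided), the GPU-only gap works out to roughly (140 - 50) / (90 - 50), or about 2.25x, which is how a claim of up to two times the performance per watt can square with those wall readings.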

Comments closed
    • kamikaziechameleon
    • 4 years ago

    Dude, that case is awesome!!!

    • swaaye
    • 4 years ago

    Companies sure are betting on the helmets.

    • ronch
    • 4 years ago

    I bet this card is more effective in giving the player a sense of ‘being there’ in games set in hot environments… like the desert.

      • auxy
      • 4 years ago

      I gave you a +1 because I smirked. (‘ω’)

    • Mr Bill
    • 4 years ago

    Any chance of a full review vs the Titan X?

    • Dysthymia
    • 4 years ago

    The dual-GPU Fury X reveal rather than a juicy Polaris reveal was a letdown. Still, though: https://i.imgur.com/3W30DLJ.jpg

    • tbone8ty
    • 4 years ago

    Did anybody see Damage at the event?

      • DancinJack
      • 4 years ago

      Nope.

      • Jeff Kampman
      • 4 years ago

      Scott made a brief appearance on stage toward the end of the webcast when they did an all-hands thing.

    • southrncomfortjm
    • 4 years ago

    AMD really isn’t very good at dual-GPU cards. They always seem to have significant stutter, so they don’t improve the user experience in proportion to their cost. Yes, more FPS, but highly inconsistent frame times.

    This is a big “meh.” Let’s get some real hardware out that is really next gen, then we can talk.

      • chuckula
      • 4 years ago

      Both Nvidia & AMD have those problems.
      Multi-GPU cards are nice for compute though where things like “stutter” are irrelevant and the two GPUs operate pretty much independently from each other.

      While I’m not pretending to be a Vulkan expert, I have peeked at the API docs, and in some of the initial setup code you are basically responsible for querying the hardware for its physical devices and creating a VkDevice that is used as the target for graphics. Of course, multi-GPU setups can expose multiple physical devices, and you can theoretically throw graphics at both in a much more explicit way than is possible with older APIs. That’s really nice in theory, but it also means a crapton of work for developers and the potential for awesome or horrific results, since none of this is given to you “magically” but instead requires strict attention to detail.
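
      In rough terms, that initial setup looks something like this. A minimal sketch of my own, with no error handling, and it assumes queue family 0 supports graphics, which real code has to query:

      // Enumerate every physical GPU, then create a logical VkDevice for each.
      // A dual-GPU card shows up as two separate VkPhysicalDevice handles.
      #include <vulkan/vulkan.h>
      #include <cstdio>
      #include <vector>

      int main() {
          VkApplicationInfo app{};
          app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
          app.apiVersion = VK_API_VERSION_1_0;

          VkInstanceCreateInfo instanceInfo{};
          instanceInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
          instanceInfo.pApplicationInfo = &app;

          VkInstance instance;
          vkCreateInstance(&instanceInfo, nullptr, &instance);

          // Ask the driver how many GPUs exist, then fetch their handles.
          uint32_t count = 0;
          vkEnumeratePhysicalDevices(instance, &count, nullptr);
          std::vector<VkPhysicalDevice> gpus(count);
          vkEnumeratePhysicalDevices(instance, &count, gpus.data());

          for (VkPhysicalDevice gpu : gpus) {
              VkPhysicalDeviceProperties props;
              vkGetPhysicalDeviceProperties(gpu, &props);
              std::printf("Found GPU: %s\n", props.deviceName);

              // One logical device per GPU; the application, not the
              // driver, decides how rendering work is split between them.
              float priority = 1.0f;
              VkDeviceQueueCreateInfo queueInfo{};
              queueInfo.sType = VK_STRUCTURE_TYPE_DEVICE_QUEUE_CREATE_INFO;
              queueInfo.queueFamilyIndex = 0; // assumed graphics-capable
              queueInfo.queueCount = 1;
              queueInfo.pQueuePriorities = &priority;

              VkDeviceCreateInfo deviceInfo{};
              deviceInfo.sType = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
              deviceInfo.queueCreateInfoCount = 1;
              deviceInfo.pQueueCreateInfos = &queueInfo;

              VkDevice device;
              vkCreateDevice(gpu, &deviceInfo, nullptr, &device);
              vkDestroyDevice(device, nullptr);
          }

          vkDestroyInstance(instance, nullptr);
          return 0;
      }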

      • Cuhulin
      • 4 years ago

      The dual card would seem to do two things here: (1) it’s something to introduce now that VR is shipping, so that people buying Rifts or Vives can buy new video cards and have those be ATI; and (2) ATI has its async timewarp system for VR, and it’s probably easier to devote the second GPU to “need it now” tasks.

      Many of us have speculated that Damage is at AMD because of his work on stutter, among other matters, so I am sure that they will be conscious of it.

    • Krogoth
    • 4 years ago

    Nvidia has been using names of famous physicists as codenames, while AMD uses circumpolar stars in the Northern Hemisphere as codenames.

    Seems legit.

      • tipoo
      • 4 years ago

      So basically to both of them

      https://www.youtube.com/watch?v=IRsPheErBj8 ?

      • Meadows
      • 4 years ago

      They are both things people look up to.

    • djayjp
    • 4 years ago

    Anyone have any idea what case that’s based on? Looks Maingear-ish no?

    *edit: nvm I see the psu branding now. Though I wonder where/if one can buy just one of those cases….

    • Unknown-Error
    • 4 years ago

    So no HBM2 until 2017?

      • DPete27
      • 4 years ago

      It appears so. Even Nvidia won’t have HBM2 on anything but GP100 (Titan) this year. On the upside, it looks [so far] like the mid-to-high-end cards will carry GDDR5X at least.

    • vargis14
    • 4 years ago

    I will believe 2.5 times the power efficiency when I see it… wake me up when they arrive and fail to impress. I hope I am wrong, but…

    • christos_thski
    • 4 years ago

    Everyone’s waiting with baited breath for the FinFET GPUs. They’ve been so hyped that we’ve been made to expect nothing short of GTX 970 levels of performance at 950/960 price ranges. Instead we get fake Pascal mock-ups by Nvidia, pointless “it runs Hitman” demos by AMD (no FPS, no system specs, no nothing), car and compute talk by Nvidia, and last-gen $1500 cards by AMD…

    Both Nvidia and AMD have hyped FinFET GPUs to the moon and back. Even non-techies are postponing upgrades waiting for them. They’d better deliver, lest GPUs continue down the path of becoming as interesting as the CPU “race” snoozefest…

      • chuckula
      • 4 years ago

      Remember when Intel introduced finfets in 2012 and all people could do was scream about how much they sucked and how Intel’s process must be broken because all those stupid finfets seemed to do was make mobile parts more efficient?

      Yeah, well it looks like in 2016 that everybody else’s finfets are mysteriously “broken” just like Intel’s were way back in 2012.

      It must be Intel sabotaging everybody else, that’s the only explanation.

        • christos_thski
        • 4 years ago

        I abhor integrated GPUs, but at their respective rates of improvement, might we see iGPUs actually catch up with discrete GPUs in 5 years or so? It seems discrete graphics are going all “incremental” and power-efficient on us.

          • brucethemoose
          • 4 years ago

          Intel’s die shrinks aren’t happening at a breakneck pace anymore, either. I expect they’ll be stuck on 14nm for a long time.

            • NeelyCam
            • 4 years ago

            10nm Cannonlake coming next year:

             http://www.legitreviews.com/intel-mobile-platform-roadmap-shows-kaby-lake-cannonlake-still-coming_179859

          • dragontamer5788
          • 4 years ago

          iGPUs are already faster than cheaper dedicated GPUs.

          The problem is that a dedicated GPU can have GDDR5 and then utterly pwn the iGPU, which is sharing a memory bus with the CPU. The more expensive A10 APUs from AMD don’t deliver much more FPS in practice; it’s clear that they are memory-starved.
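
          (For rough scale: dual-channel DDR3-2133 tops out around 34GB/s, shared between the CPU and iGPU, while even a GTX 950’s 128-bit GDDR5 gives the GPU roughly 105GB/s all to itself.)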

            • Beelzebubba9
            • 4 years ago

            Also, discrete GPUs can consume many times the power of an iGPU, so on some level that will always cap iGPU performance.

      • bfar
      • 4 years ago

      Cards are being kept very close to the chest (literally). Everything is pointing towards an unusually competitive year.

      28nm to 16nm is an unusually large die shrink. The products will be more compelling than we’re accustomed to, so there’s a real killing to be made.

      • Shobai
      • 4 years ago

      [in case you’re interested, you’re probably after ‘bated’ rather than ‘baited’ unless you’re talking about having halitosis bad enough to kill people]

        • christos_thski
        • 4 years ago

        ahahaha, thank you, good sir. seems to be a common mistake but nice of you to point that out, it’s rather embarrassing. so much so that I’m pulling the almighty “english is not my native language” card for this 😉

          • Shobai
          • 4 years ago

          It’s all gravy, baby =) I make enough mistakes myself that I try to make sure I only point out someone else’s in good grace. Congrats for having a second language!

    • DrDominodog51
    • 4 years ago

    That atomic wing clearly wasn’t very spicy, if he was still speaking well after that.

    • chuckula
    • 4 years ago

    Quick question for the flash-enabled people who could actually watch the presentation: What exactly was the meaning of “Capsaicin” in all the marketing hype leading up to this thing? Was there some big reveal that made it all make sense?

      • Jeff Kampman
      • 4 years ago

      Raja Koduri likes extremely spicy food. That’s about all there is to it.

        • chuckula
        • 4 years ago

        This is kind of like a combination of finding out the “twist” ending of an M. Night Shamylalalaaman movie and when Ralphie finally gets his decoder ring and the secret message is “BE SURE TO DRINK YOUR OVALTINE.”

      • DPete27
      • 4 years ago

      AMD is HOT!!!

        • anotherengineer
        • 4 years ago

        RTG is HOT!!!

        ftfy 😉

          • jihadjoe
          • 4 years ago

          ATI is HOT!

          • tipoo
          • 4 years ago

          Which is a fair point; the continued RTG branding was also conspicuous. I kind of feel like Zen is the last shot before an RTG spinoff becomes the game plan.

      • ronch
      • 4 years ago

      I think using ‘capsaicin’ is pretty lame. I mean, yes, I know it’s what makes chili peppers hot, but… WTH. [Scoff]

        • Mr Bill
        • 3 years ago

        It’s the molecule that makes any species of chili pepper hot.

      • Mr Bill
      • 3 years ago

      Model# based on Scoville Heat Scale?

    • brucethemoose
    • 4 years ago

    Anyone else notice the conspicuous “HBM2” label on Vega… And not on Polaris?

      • ImSpartacus
      • 4 years ago

      That’s how I interpreted it. It’s too soon for HBM2 or GDDR5X.

      • chuckula
      • 4 years ago

      Aside from cost issues, the performance envelopes of these Polaris parts don’t require HBM2 levels of memory bandwidth.

        • brucethemoose
        • 4 years ago

        It scales down, right? A single stack on a tiny interposer would still help with power consumption and PCB size.
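
        (For reference, a single first-generation HBM stack moves 128GB/s over its 1024-bit link, and HBM2 is slated to double that to 256GB/s per stack, so even one stack is in the same ballpark as a midrange card’s entire 128-bit GDDR5 bus.)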

          • ImSpartacus
          • 4 years ago

          It helps, but not as much. High end single GPUs are pretty much capped at 250W for all intents and purposes.

          Lower end GPUs have more wiggle room to tolerate the extra power from GDDR5.

          • chuckula
          • 4 years ago

          A single stack would help power consumption, but not by as much as you’d think, given that the GPU was never going to be pushing lots of high-speed GDDR5 over a wide bus anyway.

          A “tiny” interposer (relatively speaking) would yield only a nominal reduction in PCB size, and it isn’t going to make a notebook-grade discrete GPU fit into some radically smaller notebook design than what you can do now.

          The problem is that a “tiny” interposer still carries the large fixed manufacturing costs associated with assembling these packages, even if the variable costs are a little lower than for a large HBM card.

          Near the end of the discussion in the latest TR video blog, Kanter mentioned that he had heard from reliable people he knows in the industry that AMD spends at least as much (if not more) on the HBM and packaging for Fury cards as on the very large GPU silicon itself. Don’t underestimate how expensive this stuff can be, and how quickly it becomes uneconomical when it needs to be paired with a notebook selling for $600 instead of just the GPU going for $600.

            • brucethemoose
            • 4 years ago

            That makes sense.

            I was thinking it would help from a competition standpoint, where HBM would let a small mobile card do more in the same TDP/form factor than the competition. But that doesn’t really matter if it doubles your production costs and kills your margins.

      • willyolio
      • 4 years ago

      has anyone else not read the memo that Polaris is a midrange/low power part? that’s all it has ever been, and yet rumour sites keep deliberately misinterpreting Koduri’s interview and say it’s a high end part.

      it isn’t. it was never meant to be. and there’s no reason to use HBM2 on a midrange part.

        • brucethemoose
        • 4 years ago

        I always thought “Polaris” was a whole range of GPUs, and that the first one just happened to be a small one. But that might not be the case.

      • Anomymous Gerbil
      • 4 years ago

      Ars explains it a bit better:

      http://arstechnica.com/gadgets/2016/03/amd-radeon-pro-duo-revealed/

      And Anand as well:

      http://www.anandtech.com/show/10145/amd-unveils-gpu-architecture-roadmap-after-polaris-comes-vega

    • brucethemoose
    • 4 years ago

    A dual-Fiji release now is sort of disturbing, as it suggests 14nm AMD GPUs that can actually match Fiji are still a long way away.

    EDIT: The best case scenario I see is a 7990/290X situation, where they were like 6 months apart. But that wasn’t even a die shrink.

    • Chrispy_
    • 4 years ago

    Mmmmm, Polaris is 2.5x perf/Watt over 28nm?

    Even if they’re talking about Grenada/Hawaii instead of Fiji, that’s a 980 Ti SLI beater in a single 250W card.

      • ImSpartacus
      • 4 years ago

      Based on the timing in the graph, folks are assuming that it’s Tonga.

        • DPete27
        • 4 years ago

        Likely this. But it’d be funnier if they were referring to Tahiti.

    • flip-mode
    • 4 years ago

    Is Krogoth as unimpressed as I am?

      • chuckula
      • 4 years ago

      Guess what AMD’s #1 selling point on chips that people will actually buy is: Performance per watt.

      Yeah, it’s just as snooze-inducing when AMD does it as when Intel does it.

        • ImSpartacus
        • 4 years ago

        Well shit, their selling point today is perf/$. What more do you want?

          • DPete27
          • 4 years ago

          They’ve clearly been losing market share to Nvidia because of perf/watt, so it only makes sense that AMD would focus on that.

            • ermo
            • 4 years ago

            ^ This.

            • ImSpartacus
            • 4 years ago

            Do you think it’s because people like perf/w?

            Idk, Nvidia already had a lead, and they’ve maintained the image that they have better drivers, among other things. Nvidia also came out with G-Sync first, and people seem to associate G-Sync with variable refresh.

            • DPete27
            • 4 years ago

            I associate G Sync with big business caring more about making a buck than the betterment of an industry.

            • ImSpartacus
            • 4 years ago

            I meant to say “laymen” when I said “people”.

            And I know this stuff gets murky very quick. I’ve just anecdotally noticed a hard bias towards Nvidia and Nvidia tech.

            • chuckula
            • 4 years ago

            AMD hasn’t lost market share over perf/watt as a metric that people actually care about directly.

            AMD has lost market share because their perf/watt disadvantage has made it difficult to get the necessary level of performance into a workable product.

            For example, the 290X cooling debacle comes to mind. Even though the 390X isn’t really any better on power, it has been better received because AMD at least got the cooling solution right, so the 390X is doing better in the market despite its higher power draw.

            • DPete27
            • 4 years ago

            TR gerbils have shown that Hawaii can be tuned back to achieve impressive performance/watt characteristics: https://techreport.com/forums/viewtopic.php?f=3&t=109936

            The problem is that’s not how the cards are shipped, marketed, and priced. I think there’s a general consensus (not just me) that current AMD GPUs are less… elegant in achieving their performance than Nvidia’s, especially since Maxwell.

            Power consumption may not be as critical on the desktop as in laptops, but I think most would agree that, given the same performance and price, lower power draw is definitely an advantage in the purchasing decision, and that many would even pay a slight premium for it.

            Don’t get me wrong, I don’t consider myself an Nvidia or AMD fanboy. As stated above, I’m tired of Nvidia’s antics with G-Sync, and quite frankly, if they don’t support FreeSync with Pascal, I’m hoping Polaris is a winner, because that’s where I’ll be headed, without a doubt.

            • DancinJack
            • 4 years ago

            I would probably agree with this. Less elegant is a nice way of putting it.

            • Freon
            • 4 years ago

            I’m not convinced that’s really significant. Not for desktop performance GPUs.

          • ronch
          • 4 years ago

          Not really. Where I live, a GTX 950 costs about the same as an R9 370, but the 370 sucks way more power. Same thing with the GTX 960 vs. the R9 380. At full load, the difference is ~100W! I’d gladly pay for Nvidia even if their stuff were priced higher (which isn’t the case). No wonder AMD is losing market share. People just want to get stuff that’s perceived to be better and more refined, and they’re willing to pay for it.

    • chuckula
    • 4 years ago

    That’s a nice PC from the RTG there.

    I noticed it has a big AMD logo outside and …

    (•_•)
    ( •_•)>⌐■-■
    (⌐■_■)

    Intel Inside.

      • brucethemoose
      • 4 years ago

      It helps that the RTG is technically a separate entity from the CPU division now.

      Still pretty funny/painful to see.

        • tipoo
        • 4 years ago

        Not yet, legally. But the rebranding certainly does seem to point to that being the plan.

      • ronch
      • 4 years ago

      For a microsecond there I thought this post was Auxy’s.

      I like Auxy’s emoticons better though. Japanese characters just scream ‘anime’.

        • auxy
        • 4 years ago

        ^^) _旦~~ enjoy~

    • ImSpartacus
    • 4 years ago

    I think this dropped a little too soon, lol.

    • DancinJack
    • 4 years ago

    So, what did the Polaris 10 sample score in that test?

    Edit: DOING A DEMO RIGHT NOW: http://m.ustream.tv/channel/gQETjZzLnf9

      • chuckula
      • 4 years ago

      Requires flash.
      Facepalm.

        • Voldenuit
        • 4 years ago

        Someone pls post the score for those of us who have gone Flash-less.

        We thank you for your sacrifice.

          • DancinJack
          • 4 years ago

          Unfortunately they didn’t do the SteamVR test. They ran Hitman in DX12. Was very meh.
