Report: AMD Navi details come to light courtesy of a loose-lipped Sapphire rep

If you're someone who reads this site regularly, I don't have to tell you that Nvidia is pretty much dominating the graphics card market right now. AMD's still holding out with refreshes of its Polaris and Vega designs, but they're looking awfully long in the tooth compared to Nvidia's Turing. AMD needs new Radeons, and fast. Fortunately, it looks like Navi is on its way pretty darn quick. A Sapphire representative speaking to the Chinese press blabbed a whole bunch of info about two new cards coming from the company. The details were, uh, detailed in a now-removed blog post over at Chinese-language site Zhihu; thanks to Videocardz and TechPowerUp for the info.

Loose lips sink ships, after all—wait, what else did you think we meant?

According to Sapphire, AMD will be releasing two tiers of Radeons based on Navi, as usual. Just as Polaris gave us the RX 470 and RX 480, Navi will come along as one variant targeted at the GeForce RTX 2060, and another model aiming for the GeForce RTX 2070. Sapphire's representative apparently claimed that one of the cards—most likely the faster of the two—will be "stronger than 2070," presumably referring to the GeForce RTX 2070.

That's fairly promising news, but the representative also revealed that Navi will not, in fact, have dedicated ray-tracing hardware. Instead, Sapphire promised said hardware for "next year's new architecture." That bit of information is particularly interesting given that Sony has already promised ray-tracing acceleration for the PlayStation 5. The PS5 is known to be using AMD hardware, and it isn't intended to launch until at least next year, so it looks like Navi could be yet another revised-GCN stopgap, as some have postulated right here in our comments. Further cementing that idea, Sapphire's representative said that Navi won't be "scaled up" to a larger design—this one model of chip is it.

Still, it's not as if GCN is incapable, particularly on 7nm. The Radeon VII holds its own—outside of ray-tracing-based workloads, at least—and it's certainly possible that what amounts to a hot-clocked Polaris chip hooked up to GDDR6 memory could be a fairly compelling product. It all comes down to price; as a wiser man than I once said, "There are no bad products, only bad prices." Pricing on the Navi cards was revealed at the press event, though, and it doesn't set our hearts alight: $399 for the slightly slower model, and $499 for the supposedly-RTX-2070-beating card.

Those prices are awfully close to Nvidia's suggested prices for its GeForce RTX cards. Right now, you can find RTX 2070 cards at Newegg for just $480. Navi will have to do quite well indeed in current titles to make up for the lack of DXR acceleration. Fortunately, we may not have long to wait to find out. According to Sapphire's representative, the long-swirling rumors that AMD will launch Navi at Computex were half-correct. If Sapphire speaks the truth, it seems Dr. Su will be announcing the cards at that show, with an actual launch coming later, in July.

Comments closed
    • rinshun
    • 5 months ago

    I was interested in a sub-$200 GPU. Leaks suggested there would be two. Is there any hope they will show up?

    • ronch
    • 5 months ago

    GCN’s performance is fine; it’s the energy efficiency that’s killing it. Yes, many people don’t care, but lower efficiency means you can’t clock as high or add more stream processors without hitting the power limit. And in recent years GCN isn’t just less efficient, it’s a lot less efficient.

      • tipoo
      • 5 months ago

      The SIMD change rumor should be a decent help with efficiency. Part of why Nvidia achieves higher shader utilization is that AMD pays a four-cycle penalty for any execution; the SIMD change would cut that down to two, better filling each shader’s idle time.
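
      For a rough sense of the arithmetic, here is a toy Python sketch that just assumes the cycle counts quoted in the comment above; it ignores memory latency and everything else that also keeps SIMDs idle, so treat it as an illustration rather than a model of the real hardware.

      def issue_utilization(resident_waves, cadence_cycles):
          # Fraction of issue slots a SIMD can fill if each resident wave
          # can only issue one instruction every `cadence_cycles` cycles.
          return min(1.0, resident_waves / cadence_cycles)

      for waves in (1, 2, 4):
          print(f"{waves} wave(s): 4-cycle -> {issue_utilization(waves, 4):.0%}, "
                f"2-cycle -> {issue_utilization(waves, 2):.0%}")
      # 1 wave(s): 4-cycle -> 25%, 2-cycle -> 50%
      # 2 wave(s): 4-cycle -> 50%, 2-cycle -> 100%
      # 4 wave(s): 4-cycle -> 100%, 2-cycle -> 100%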

    • Srsly_Bro
    • 5 months ago

    The rumor is that the prices are in Singapore dollars, not USD. Convert to USD and you get a better price and a different story to tell.
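
    As a quick back-of-envelope check in Python, assuming a mid-2019 exchange rate of roughly 0.72 USD per SGD (the rate is an assumption, not part of the leak):

    USD_PER_SGD = 0.72  # assumed mid-2019 rate, not from the leak

    for sgd_price in (399, 499):
        print(f"S${sgd_price} is roughly US${sgd_price * USD_PER_SGD:.0f}")
    # S$399 is roughly US$287
    # S$499 is roughly US$359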

      • sconesy
      • 5 months ago

      $289 and $360, eh? That would be a different story. Wish it was in HKD…

      • drfish
      • 5 months ago

      That’d be great, but my gut says the “rumor” is just wishful thinking…

        • Srsly_Bro
        • 5 months ago

        There’s a rumor out about a 16-core Zen 2 @4.2/3 on R15. I don’t want to say the score because of the hype train, but it can be found. If true, Intel will cancel a few more products for next month.

    • hubick
    • 5 months ago

    LG has a reasonably priced 85″ 8K HDMI 2.1 (required for 8K@60hz) TV out now, and I think it would be great for viewing “DSLR resolution” photos on. All we need is an HDMI 2.1 capable video card to drive it.

      • blastdoor
      • 5 months ago

      Is there any 8k content on anybody’s horizon?

        • moose17145
        • 5 months ago

        Hubick was specifically referring to using it for viewing pictures from an actual DSLR camera capable of going well above 4K, so I would assume his use case is more niche and goes beyond simple content consumption.

        Use cases for 8K, and even resolutions well above 8K, are out there, but they generally fall into the engineering, medical, and photo/video editing markets from what I have seen. I presume Hubick’s use case falls into the editing of very high-res still shots, based on his short original post.

    • chuckula
    • 5 months ago

    We won’t let AMD get away with this!

    We can liquor up our very own OEM representative to leak rumors about our GPU being canceled!

    Take that AMD!

      • ronch
      • 5 months ago

      Yeah, torture is unnecessary when alcohol works just as well if not better.

    • Wirko
    • 5 months ago

    This leak shall be dubbed Navi-gate.

      • drfish
      • 5 months ago

      Love it!

        • Wirko
        • 5 months ago

        Also, the caption would look better in Italian this time. [i<]Le labbra sciolte affondano le navi.[/i<] (“Loose lips sink ships.”)

      • Mr Bill
      • 5 months ago

      Because a failure of Navi-gation sank the tanker?

      Oh wait, no… it’s because of the leak.

    • jensend
    • 5 months ago

    [quote<]Navi will have to do quite well indeed in current titles to make up for the lack of DXR acceleration.[/quote<]I don't see any reason why. No midrange buyer should bother taking raytracing into account in their 2019 purchasing decisions. By the time games are actually relying on ray tracing as more than a gimmick, the midrange 2xxx RTX series will be too long in the tooth to perform well in raytraced games anyhow.

      • RAGEPRO
      • 5 months ago

      Well, let’s look at the probable choices:
      - Buy a Radeon with ±(x) performance for (y) money and (z) power.
      - Buy a GeForce with (x) performance for ≈(y) money and ≤(z) power.
      Aside from the power and/or cooling argument, the GeForce has hardware DXR acceleration and tensors, never mind all the other Nvidia-specific features. So yeah, I think I was justified in saying that “beating an RTX 2070 at $500” is not really exciting.

      If they can pull out a 20% average win over Nvidia at the same price point, then we’re talking. Alternatively if they make 12 or 16GB of GDDR6 standard, then there’s an argument.

      But even if it was (XYZ) vs. (XYZ), you still get a better product with Nvidia. You get DXR, you get PhysX, you get a reasonable driver control panel, and so on.

        • Redocbew
        • 5 months ago

        I agree with the 2070 at $500 thing. If it can’t do better than that, then bells and whistles aren’t going to save it.

        However, DXR is still very much included under “bells and whistles”. If it factors into a buying decision at all, then it should be a decision which will stay with you for the next 3 to 5 years. If it’s true that we’re reaching diminishing returns without some kind of change in how scenes are rendered, then I’m all for it, but it’s going to take a while for that to happen.

          • Laykun
          • 5 months ago

          $500 for a card that does traditional rendering at 1.0x speed and doesn’t have DXR.
          $500 for a card that does traditional rendering at 1.0x speed and does have DXR.

          Which do you pick?

          I’m dubious of AMD’s claim of “beating” the 2070; I bet it’s more likely going to trade blows, being better in some games and worse in others.

        • tay
        • 5 months ago

        I love the dig at the driver control panel. It is just so over the top for Radeon.

          • Chrispy_
          • 5 months ago

          I don’t know – there are many things Nvidia do better than AMD, but control panel is not one of them. The Nvidia control panel is an abomination from the Windows XP era. It’s dated, lacks features, and is clunky to use.

          The AMD one is a bit millennial, but at least it does a whole bunch of useful stuff.

        • ptsant
        • 5 months ago

        He was specifically quoting the DXR feature. Your answer is about perf, money and power. Which is exactly his point. This is what matters, not DXR.

          • RAGEPRO
          • 5 months ago

          My answer is also about DXR. Read the rest of it. 🙂

        • freebird
        • 5 months ago

        PhysX is open source now, so unless it’s an older game, all games going forward using PhysX should support the CPU and any GPU.

      • Chrispy_
      • 5 months ago

      Yep, I bought an RTX card [i<]in spite of[/i<] the RTX feature set, not [i<]because of[/i<] it. Admittedly, the only card this applied to in the RTX range was the 2060FE, but that was a simple case of more performance AND fewer Watts AND less noise AND at a lower price than the nearest competitor at the time of purchase (Vega56). It was a straight, unanimous upgrade in every way even when I completely ignored the entire RTX feature set.

      • ronch
      • 5 months ago

      Yeah it’s like buying an S3 ViRGE thinking you’ll be ready for the 3D revolution.

        • Mr Bill
        • 5 months ago

        LOL! That brings back memories.

          • Chrispy_
          • 5 months ago

          I think the only game I ever successfully ran on my S3 ViRGE without graphics corruption was POD.

    • Krogoth
    • 5 months ago

    Sounds like desktop Navi will end up being a cheaper-to-make Vega 56/64 minus the general-compute stuff. Not exactly surprising, since desktop Polaris was basically a cheaper-to-make Hawaii that ate less power when it came out.

      • Chrispy_
      • 5 months ago

      I hope you’re wrong; otherwise we’ll be getting a 2019 $399 Navi that performs exactly like a 2017 $399 Vega, ignoring the missing compute or the fact that you can pick up Vega for $249 at the moment.

      The more obvious reason that this leak is utter BS is that the Sapphire rep compared a $399 Navi to a $349 RTX 2060.

      There’s no way AMD can rock up to the party six months later than an RTX 2060, ask for $50 more than an RTX 2060, and not have the RTX, VRS, DLSS support of the RTX 2060. Chances are >0 that Navi will be power-hungry and inefficient compared to the RTX 2060, too. I really hope it’s not, but AMD’s track record for GPU power-efficiency this last decade has been godawful.

        • Krogoth
        • 5 months ago

        Vega 64/56 at their current price points barely have any profit margin (huge silicon + HBM2). Navi is going to be a direct replacement (GDDR6/5 + smaller, more efficient silicon).

        It is a repeat of what Polaris did. Polaris was a smaller, more efficient version of Hawaii with 90% of the performance and 40% less power at load, and it was much cheaper to make (256-bit memory bus, less power circuitry).

        RTX, VRS, and DLSS are completely irrelevant to the mid-range portions of the market. Nvidia’s mid-range and lower-end GPUs are too weak to properly handle those feature sets. The 2080 is barely able to use them; you need at least a 2080Ti to make them useful.

        As long as AMD RTG doesn’t overvolt/overclock the crap out of the silicon, the power efficiency between TU106/TU116 and Navi should be pretty close.

          • Chrispy_
          • 5 months ago

          I agree with you that AMD needs a cheaper-to-manufacture product at Vega 64/56 level but this $399 card can’t be it, because at that price it simply won’t sell.

          At the moment, Vega56/64 performance is hovering around $300, not $399. That’s the market dictating that price point, not AMD.

          At $280 you have the 1660Ti, which puts up a reasonable fight against the Vega 56, which is why AMD have slashed prices to $280 with various offers of MIR bringing the cost down to $250, three free games, etc.

          At $350 you have the 2060FE which matches Vega64 pretty comfortably in performance whilst being a much nicer card to live with in terms of power/heat/noise.

          So AMD need to sell Vega performance at somewhere around $300 to be competitive; they don’t have a choice in the matter.

    • Billstevens
    • 5 months ago

    Sounds about right. I guess it was too much to hope for that they would undercut Nvidia by $100….

    No undercut and comparable performance with +/- depending on game is more likely…..

    • enixenigma
    • 5 months ago

    I guess the Vega 64 lives on in my computer for at least another year, then.

      • cygnus1
      • 5 months ago

      Yeah, GTX 1070 here. Years old now. Nothing compelling enough for the money to upgrade it…

        • LoneWolf15
        • 5 months ago

        Other than a second 1070…I did that. One (bought new at introduction MSRP), one mint condition Founder’s Edition, both in SLI. Works nice.

          • cygnus1
          • 5 months ago

          I may end up doing that. Not averse to it at all, since I did exactly that with the GTX 760s many years ago now. But honestly the GTX 1070 powers my 2560×1440 144 Hz monitor just fine with the games I play. A higher priority would be upgrading from the i7-4790K. But even that doesn’t seem worth the money of buying a new motherboard/CPU/RAM. It would likely be $800+ for that combo, and it just wouldn’t be a huge jump in performance for anything I do.

          ¯\_(ツ)_/¯

      • Concupiscence
      • 5 months ago

      Yeah. I’ve got a Vega 56 in one box and a secondhand GTX Titan X in another. For 1440p60 they’re both great, and I can’t think of anything worth the cost of future-proofing when everything I own runs just fine.

    • Pancake
    • 5 months ago

    Navi: Hey! Listen!

    Gaming community: Yeah, but, nah.

      • derFunkenstein
      • 5 months ago

      I love the Ocarina of Time reference. And I hate that f***ing fairy.

      • Krogoth
      • 5 months ago

      Leather Jacket Man: Sorry, can’t hear you over my piles of sweet hard cash…….

        • ronch
        • 5 months ago

        Yeah he has so much cash there that the room is freezing, hence the jacket.

        Geddit??

    • Concupiscence
    • 5 months ago

    If [url=https://old.reddit.com/r/Amd/comments/braa94/navi_simd_changes_leak/eobzt8j/?st=jvy54jyb&sh=5b631ecb<]this leak[/url<] is any indication, Navi may have some performance surprises up its sleeve yet. And I realize it's déclassé to not care about hardware raytracing in 2019, but I still don't.

      • Goty
      • 5 months ago

      AMD had expressed reluctance towards making exactly this sort of change in the past, so it will be really surprising if this turns out to be true.

        • DoomGuy64
        • 5 months ago

        IMO, it’s a repeat of SM3.0. AMD didn’t care until they got it right; Nvidia pulled the exclusivity shtick at reduced performance and with no AA, and then it became completely passé after AMD not only beat them but got AA working. Which Nvidia learned from, and then hacked DX10 games to disable AA on Radeon cards, etc.

        Overall, AMD is not going to go full DXR until it moves away from proprietary rendering methods and can run at acceptable performance.

      • nanoflower
      • 5 months ago

      It’s hard to get excited about ray tracing hardware when you have to buy the most expensive card out today to get decent performance with all the eye candy turned on, and there still aren’t many games out that take advantage of the feature. In a year or two I think it will start to make sense to consider it an important feature in your purchasing decisions but not today.

        • Chrispy_
        • 5 months ago

        When you say “not many games”, the list is still just three games in total, right?

        One of which is a twitch shooter where framerate matters far more than eye-candy.

        The other two are not twitch shooters but how about you tell me how many times in [url=https://www.youtube.com/watch?v=urRSFTujsJY<]this three minute video[/url<] you can tell the difference between RTX on and RTX off? I counted differences in shadows and lighting a whopping four times for no more than a second or two in total - and I honestly couldn't say which one looked better. Apart from the 30-50% performance loss with RTX on, there wasn't any appreciable difference 🙁

          • ptsant
          • 5 months ago

          Thanks for the link. If I can’t find the differences in a side-by-side comparison, then I’m unlikely to care when actually playing the game. BFV was a better showcase, but the technology is still too immature to influence a purchase decision.

      • dragontamer5788
      • 5 months ago

      It’s all based off of a single sentence from a (now deleted) Twitter post.

      Definitely don’t get your hopes up. It’s an interesting leak for sure as a hypothetical, but it represents **major** changes to GCN. So major that it’s kind of unbelievable IMO (32-wide wavefronts instead of 64-wide, 2x SIMD units per CU, etc., etc. That’s a LOT of change in there…)

      • Krogoth
      • 5 months ago

      It makes sense if they want to make a cheaper “Vega” and compete against TU106 and TU116.

      • ronch
      • 5 months ago

      I’m not very particular about fancy graphics. Heck, I’m even fine with DX8-class graphics from the early 2000s.

        • K-L-Waster
        • 5 months ago

        Then why would you be in any way concerned about anybody’s soon-to-be-announced GPU?

        I mean, sure, NAVI will probably beat 500 FPS at Diablo II in 640×480, but still…

          • ronch
          • 5 months ago

          Regarding your question, I’m actually still running an HD7770 with no plans to upgrade. So yeah, I’m just interested in AMD catching up but I have no plans to buy.

    • DancinJack
    • 5 months ago

    soooooo just another day at AMD RTG building over-hyped GPUs that don’t live up to said hype. Nothing to see here, mates!
