AMD unwraps Ryzen SoC with 24 Vega CUs for Chinese-market console

Rumors of a “big” Ryzen APU with a massive Vega graphics processor on board have been circulating for months, fueled by appearances in public hardware benchmark databases and the like. Now, we may know why. AMD has taken the wraps off a semi-custom product it's produced for Chinese manufacturer Zhongshan Subor for use in a new gaming PC and console for the Chinese market.

Subor's semi-custom chip includes a four-core, eight-thread Ryzen CPU running at a 3-GHz clock speed. It's paired with 24 Radeon Vega compute units running at 1.3 GHz, all connected to 8 GB of GDDR5 memory. For reference, that's even more raw shader-processing power running at higher nominal clock speeds than the 20 Vega-ish CUs of the Intel Radeon RX Vega M GH chip in the Hades Canyon NUC, to say nothing of the maximum of 11 CUs on Raven Ridge APUs.
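
For a rough sense of those gaps, here's a back-of-the-envelope FP32 throughput comparison. This is a sketch that assumes the usual 64 stream processors per GCN compute unit and approximate clocks for the comparison parts; none of these are official figures.

```python
# Theoretical FP32 throughput for a GCN-style GPU: each compute unit (CU)
# carries 64 stream processors, and each SP retires one FMA (2 FLOPs) per clock.
def fp32_tflops(cus: int, clock_ghz: float) -> float:
    sp_per_cu, flops_per_sp = 64, 2
    return cus * sp_per_cu * flops_per_sp * clock_ghz / 1000.0

# Clocks for the comparison parts are approximations, not confirmed specs.
parts = {
    "Subor SoC (24 CUs @ 1.3 GHz)":          fp32_tflops(24, 1.30),
    "RX Vega M GH (20 CUs @ ~1.19 GHz)":     fp32_tflops(20, 1.19),  # CU count as cited above
    "Raven Ridge iGPU (11 CUs @ ~1.25 GHz)": fp32_tflops(11, 1.25),
}
for name, tflops in parts.items():
    print(f"{name}: {tflops:.1f} TFLOPS")  # ~4.0, ~3.0, ~1.8
```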

Subor's gaming PC. Source: AMD

AMD says Subor demonstrated this gaming PC at ChinaJoy, the country's largest gaming and digital entertainment expo. The system will apparently begin shipping in late August, followed by a similar console running a customized operating system that will launch before the end of the year. There's no word on global plans for this part, and its semi-custom nature means it likely won't be made available to other manufacturers, but its mere existence could suggest exciting things for the next generation of console hardware.

Comments closed
    • tipoo
    • 1 year ago

    Interesting release for sure. When the 8th-gen consoles came out, it was years before you could buy a standalone APU with the performance of the PS4’s GPU, and even recent ones lag in some ways. The way things are going now, with Ryzen+Vega/Navi, APUs could be pretty close to what the PS5/XB? are looking like they’re going to be, and another year or two plus a die shrink would add more CUs and cores.

      • NoOne ButMe
      • 1 year ago

      APUs with such a large CU count (and the bandwidth to feed it) have the issue that the market for them isn’t worth the investment unless an OEM like Apple wanted one in a computer.

      Which basically leaves iMacs as the only really viable place for such a part.

        • Anonymous Coward
        • 1 year ago

        But having made such a part, wouldn’t AMD make a profit from selling it more widely, even to a small market? Adopt the graphics card strategy, but in some flavor of ATX.

        Probably they were contracted such that this one customer has exclusive access.

          • NoOne ButMe
          • 1 year ago

          The issue is one of ecosystem: such a part would need to be soldered down onto a board with a fixed amount of RAM, or AMD would have to push through the development of GDDR5 DIMMs. Or go with an ultra-wide memory bus.

          When you go with soldered GDDR5, you get to the point where you lose the margin and overall cost advantages that an integrated solution can bring.

          If you go with an ultra-wide memory bus, probably 384-bit or 512-bit DDR4… which is problematic for motherboard design. Oh, and getting companies willing to invest in the platform.
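
          To put rough numbers on that trade-off, here's a quick peak-bandwidth sketch (the per-pin transfer rates are typical values I'm assuming, not specs from any announced product):

```python
# Peak memory bandwidth = per-pin transfer rate (Gbps) * bus width (bits) / 8.
def peak_bandwidth_gb_s(rate_gbps: float, bus_bits: int) -> float:
    return rate_gbps * bus_bits / 8

# Assumed typical transfer rates: 7 Gbps GDDR5, DDR4-3200.
options = {
    "256-bit GDDR5 @ 7 Gbps (soldered)": peak_bandwidth_gb_s(7.0, 256),
    "384-bit DDR4-3200":                 peak_bandwidth_gb_s(3.2, 384),
    "512-bit DDR4-3200":                 peak_bandwidth_gb_s(3.2, 512),
}
for name, bw in options.items():
    print(f"{name}: {bw:.0f} GB/s")  # ~224, ~154, ~205 GB/s
```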

            • Anonymous Coward
            • 1 year ago

            Treat it like a GPU on ATX: no ecosystem required beyond someone who makes the board and markets it via the usual channels. Also, hopefully, an OEM or two with AIO systems.

            I presume you are referring to the higher price of GDDR5. I admit that harms the value proposition somewhat, but the buyer would still be getting mid-range GPU performance, which would have come with similar memory in discrete form.

            No doubt people better informed than us have considered the numbers in detail.

    • jarder
    • 1 year ago

    AMD have updated their page to give some more details:

    https://community.amd.com/community/gaming/blog/2018/08/03/new-amd-semi-custom-soc-combines-the-power-of-amd-ryzen-cpu-and-amd-vega-gpu-for-gamers-in-china?sf194826771=1

    Now they specifically mention a "256-bit GDDR5 interface onto a single chip and 8GB of GDDR5 on the motherboard."

    There's also a pic of the SoC and motherboard up on Reddit now:

    https://www.reddit.com/r/Amd/comments/94h6dn/amd_soc_for_zhongshan_subor_pictured/?utm_content=comments&utm_medium=hot&utm_source=reddit&utm_name=static.chiphell.com

      • faramir
      • 1 year ago

      It’s a monolithic chip, not an MCM, judging by the picture. Interesting!

    • DavidC1
    • 1 year ago

    Vega M GH doesn’t have 20 CUs; it has 24, just like this one. It’s the GL that has 20.

    Originally we thought it had 28 CUs, because early leaks had 1792 SPs.

      • jarder
      • 1 year ago

      Yes, the early leaks indicated 28 CUs, so it’s very possible that this one has 28 CUs with 4 disabled for binning/yield.

    • DragonDaddyBear
    • 1 year ago

    Steambox part 2?

    • DragonDaddyBear
    • 1 year ago

    If only they made a NUC-like computer with this. Hey, a gerbil can dream.

    • derFunkenstein
    • 1 year ago

    Something like this would have been perfect in the PS4 Pro / Xbox One X, but those launched too early for this sort of thing, I think.

    • ronch
    • 1 year ago

    1. Subor. Um.

    2. What a hideous console design.

    • Krogoth
    • 1 year ago

    This is the beginning of the end for mainstream discrete GPUs. AMD RTG is testing the waters before committing more semi-integrated GPU/iGPU solutions to other platforms with the Vega and, soon, Navi architectures. Intel will certainly try to match up.

    I wouldn’t be too surprised if mid-range discrete GPU SKUs become scarce by the time 2025-2027 rolls around.

      • DragonDaddyBear
      • 1 year ago

      Are you saying you are impressed?

      • Kretschmer
      • 1 year ago

      I respectfully disagree. Resolutions will get higher, VR will become more mainstream, and GPUs will still need 150+ watts to keep up. I think discrete GPUs will become less prevalent but still exist as an option for big gamers. Cloud gaming is a bigger threat to discrete GPUs as a market.

        • Krogoth
        • 1 year ago

        Resolutions have reached diminishing returns and VR will always be a niche. The masses and mainstream gamers don’t care for either thing, and they make up the bulk of discrete sales and revenue through sheer volume.

        We have already seen demand destruction happen to discrete audio cards, digital cameras, and solid-state storage. Discrete GPUs are not immune to it.

        • maxxcool
        • 1 year ago

        Nah, eyeballs can only discern so much fidelity on a tiny screen inches from their face. We’ll hit diminishing returns on helmets and glasses very quickly. After that, die shrinks will take care of the rest to feed them.

        I agree with Big-K. Mid-range GPUs are the new mainstream gamer/compute discrete card. Low-end GPUs are on the express slide to full-on extinction at this point.

        In one or two more generations, iGPUs will then start eating into s^^tty mid-range cards with crappy 128-bit buses.

      • DavidC1
      • 1 year ago

      The end of dGPUs?

      Nah.

      At best, it’ll be like how SSDs are: they’re taking market share, but the great majority is still using HDDs, because HDDs are far cheaper per GB.

      And unless they somehow get rid of the need for on-package memory, these high-end iGPU types will end up as not much more than a smaller discrete solution, and remain expensive too.

      • brucethemoose
      • 1 year ago

      But why?

      There’s not much of a gaming performance advantage to putting the GPU and CPU on the same die, or on an interposer, at least not now. It’s more expensive than manufacturing two chips. AMD has pretty clearly shown that they can only afford to tape out a few designs (unless someone foots the bill, like here). And if they replace low-to-mid-end GPUs with this, they’re cutting out a whole segment of their market (customers on existing AMD CPUs, or any Intel CPUs).

      It is a pretty attractive gaming laptop design, just as Hades Canyon is, but that’s about it.

        • Krogoth
        • 1 year ago

        It is not about performance advantages. It is all about bottom-line costs and miniaturization. It is actually cheaper to integrate the GPU onto the same board or die than to keep the current setup of a discrete card (daughterboard) tied to a motherboard (CPU) that exists in PC systems today.

        Most gamers would love a small, relatively hassle-free chassis that can handle their gaming needs. Heck, the hardcore gaming crowd would love it for LAN parties.

        Almost everything you find on a modern motherboard used to be on separate discrete cards. That’s kinda why the ATX form factor’s allocation of seven physical card slots seems silly and redundant these days.

          • brucethemoose
          • 1 year ago

          If you embed everything, maybe

          But current consumers seem to like their sockets, and that presents a problem. A big GPU on the same die as a CPU needs a lot of memory bandwidth, which means either expensive on-package memory, or GDDRX embedded on the motherboard plus a socket with a ton of pins.

          I think AMD needs access to something cheaper than current interposers, like EMIB, before that really takes off. That should happen by 2025, but it’s still a while away AFAIK.

            • deruberhanyok
            • 1 year ago

            Eh. Look at everyone still holding on to i7-3770k chips. I don’t think the socket is a big deal.

            These days, by the time it’s worth the cost of upgrading you need a new motherboard and probably RAM anyhow. When was the last time you had a motherboard last you through more than one generation of processor?

            If AMD did a Ryzen/Vega APU with onboard HBM, that would be a pretty capable mainstream gaming setup (the Hades Canyon NUC is a good example; that GPU generally outpaces a 1050 Ti). If it were embedded and available in a standard form factor like mini-ITX, for people to toss into a system build and add RAM and storage, I expect it would be a very compelling option.

            I’m sure they could do it socketed as well, but I don’t know what the benefit would be.

            • renz496
            • 1 year ago

            That’s because Intel slowed down its performance uplift after releasing Sandy Bridge. Some people are still holding onto their Sandy/Ivy Bridge chips because of the minimal performance upgrades Intel has offered over the last few years. But it’s a different story for GPUs. I’m still using my 2500K myself, but over that time I’ve gone through a GTX 460, a GTX 660 (then SLI), a GTX 960, and a GTX 970.

            CPUs tend to last longer, but that’s not the case with GPUs. Even if the GPU part is quite capable today, it definitely won’t be five years later.

          • Laykun
          • 1 year ago

          Except none of those add-in cards could use 150W+ of power. All these solutions will be thermally limited, and you’re more likely to see coexistence rather than a disappearance of discrete GPUs. You can bet your bottom dollar Nvidia won’t allow the discrete GPU market to die, especially considering they’re its biggest beneficiary.

            • JustAnEngineer
            • 1 year ago

            Where is NVidia’s nForce chipset business these days?

            P.S.: How about NVidia’s SoundStorm sound card business? https://www.anandtech.com/show/945/6

            • Laykun
            • 1 year ago

            Nice what-about-ism. Historic decisions do not predict future decisions; Nvidia is a very different company now.

            • Krogoth
            • 1 year ago

            Discrete GPUs aren’t dying, though. They will become a niche catering to high-end users who are willing to deal with the cost and headaches of one.

            The masses will be content with the next generations of iGPUs and semi-integrated solutions, which will handle all of their needs.

            • Laykun
            • 1 year ago

            But 150W is mid-range, not just high-end. The 150W+ power bracket covers the majority of GPUs on Steam, and since this SoC is gaming-focused, there’s some general overlap here.

        • NoOne ButMe
        • 1 year ago

        Your die/package gets too small and thermal density gets too great.

      • maxxcool
      • 1 year ago

      Agree, but with a faster time frame.

      • meerkt
      • 1 year ago

      I think the opposite. Intel plans to enter the dGPU market in 2 years:
      https://jobs.intel.com/ShowJob/Id/1706059/Head-of-Technical-Marketing,-Discrete-Graphics/

      • Redocbew
      • 1 year ago

      The next TR giveaway should be to guess how many times the end of discrete GPUs has begun according to Krogoth.

        • Krogoth
        • 1 year ago

        It has been going on ever since SoC solutions became ubiquitous. iGPUs have already killed off the low-end basic 2D/3D discrete card. Now that segment is nothing but used older cards and overpriced professional-tier SKUs.

        It is just a matter of time before mid-range discrete SKUs begin to hear the same music. I expect it to happen within 5-7 years.

      • tipoo
      • 1 year ago

      I’m not inclined to agree. GPUs have been on a trend of higher and higher power consumption for years (with the occasional fall backwards on an efficient architecture, but that’s an anomaly to the trend), and simple physics will always prefer a dedicated card over an APU for thermal capacity. At least unless hardware has so completely outstripped software that there’s no need for them, but for now there is, and die shrinks are getting damned hard to come by.

        • Redocbew
        • 1 year ago

        What’s considered low or mid range graphics has changed also. If IGPs are going to be “good enough” for “most people”(whatever that means), then it’s not just the hardware that’s in question.

      • designerfx
      • 1 year ago

      At no time would this make sense, Krogoth. Tiny-die integration simply will not ever match the heat dissipation and size available to discrete GPUs, by its very definition.

        • Anonymous Coward
        • 1 year ago

        The die size will of course scale with the amount of hardware in the die. This example APU is not “tiny”.

    • dpaus
    • 1 year ago

    Is there any (reliable) indication of what the TDP of this chip is?

      • Jeff Kampman
      • 1 year ago

      Nothing from AMD itself.

      • cygnus1
      • 1 year ago

      I’m guessing it’s not low…

      • NoOne ButMe
      • 1 year ago

      Probably about 105-130 watts.

      CPU: probably 20W for the cores (the R5 1400’s CPU cores alone draw <25W)
      Memory controller, I/O, etc.: probably 30-40W (depends on GDDR5 clock)
      GPU: probably 55-70W (Vega at 1300 MHz is fairly efficient)

      70W if it’s on GloFo 14LPP, 55W if on TSMC 12FF
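
      Summing those component guesses (mine, not anything AMD has published):

```python
# Rough per-component power budget in watts as (low, high) -- all guesses,
# no official figures from AMD.
budget = {
    "CPU cores (4C/8T Zen @ 3 GHz)": (20, 20),
    "Memory controller, I/O, etc.":  (30, 40),
    "24-CU Vega GPU @ 1.3 GHz":      (55, 70),
}
low = sum(lo for lo, _ in budget.values())
high = sum(hi for _, hi in budget.values())
print(f"Estimated SoC power: {low}-{high} W")  # -> 105-130 W
```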

      • pogsnet1
      • 1 year ago

      Still 65W, since the CPU clock speed is reduced.

      • auxy
      • 1 year ago

      Most of an RX 570 (150W) + most of a Ryzen 3 1200 (65W) ≈ 180-190W is my guess.

        • thx1138r
        • 1 year ago

        Wouldn’t a Ryzen 5 2400GE (35W) + Radeon RX 560 (60-80W) ≈ 95-115W be a more accurate and appropriate guesstimate? I reckon it would be around the 95W mark.

          • auxy
          • 1 year ago

          The CPU comparison is neither here nor there, but the GPU comparison is definitely to the RX 570, not the RX 560. 24 CUs = 1536 SPs, presumably a 256-bit memory interface like the RX 570, and, given that, presumably 32 ROPs just like the RX 570. It’s a lot of presumption, but in any case the GPU is certainly 1.5-2x as big as the RX 560, depending on which RX 560 you are talking about.

          95W is a ridiculously low estimate; that is not even enough for the GPU alone, never mind the GDDR5 memory, which uses a lot of power. My lowball estimate is 150W, expected 180W, possibly even higher (but probably not; Zen is efficient).

            • BobbinThreadbare
            • 1 year ago

            This is Vega, not Polaris, though.

            Vega 56 is 210W TDP.

            210 / 56 × 24 = 90 watts

            (And this chip doesn’t appear to be trying to boost to 1.4 GHz, which should shave more power off.)
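
            Spelled out, that scaling looks like this (a crude sketch: it assumes power scales linearly with CU count at similar clocks and ignores fixed costs like memory and I/O, so the real number would land somewhat higher):

```python
# Crude TDP estimate: scale a known Vega part's board power by CU count.
# Assumes power scales linearly with CUs at similar clocks and ignores
# fixed overheads (memory controller, I/O), so treat the result as a floor.
VEGA56_TDP_W, VEGA56_CUS = 210, 56

def scaled_gpu_watts(cus: int) -> float:
    return VEGA56_TDP_W / VEGA56_CUS * cus

gpu_w = scaled_gpu_watts(24)  # -> 90 W for the GPU portion
cpu_w = 30                    # guessed allowance for 4 Zen cores @ 3 GHz
print(f"GPU ~{gpu_w:.0f} W + CPU ~{cpu_w} W = ~{gpu_w + cpu_w:.0f} W total")
```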

            • mczak
            • 1 year ago

            There’s IMHO no way it’s over 150W. Yes, the RX 570 is 150W typical board power, but that’s because AMD pushed it past the point where it operates efficiently. Vega at 1300 MHz, however, is still efficient, so a comparison with the RX 470 would be more appropriate; that card had a 120W TBP, and this should still come in below that (after all, it’s a bit less beefy, with 25% fewer shader units). So GPU + GDDR5 should be below 120W, and 4 Zen cores at 3 GHz should be below 30W (there’s no word on whether it’ll support turbo, but Zen is indeed very efficient at 3 GHz; even the 8-core Ryzen 2700 with its 65W TDP has a base clock of 3.2 GHz).

            That would give a 150W ceiling, and I’d assume it’s actually lower than that, because it’s not designed to run full CPU+GPU load at max clocks for long durations (so they can skimp on cooling). Hence my guess would be closer to 100W than 150W.

            • thx1138r
            • 1 year ago

            You missed my point completely. This chip is an APU, not a separate CPU and GPU. Thus a more appropriate comparison would be to another APU.

            The 2400GE has 4 Zen cores along with 11 CUs (704 SPs), all running in a 35W TDP. Now just size up the GPU part of the APU by adding on an RX 560 (16 CUs / 1024 SPs at 60-80W) and you’ll have your guesstimate.

            And who was talking about GDDR5 power? The OP asked about the TDP of the chip, not the overall power usage of the whole system.

            • NoOne ButMe
            • 1 year ago

            150W is high; 180 watts is insanely high.

    • NTMBK
    • 1 year ago

    Pretty cool. I hope they start selling these in the West, it would be perfect for an entry-level gaming PC.

      • blastdoor
      • 1 year ago

      Arguably “an entry-level gaming PC” is what the Xbox could have and should have always been.

        • Mr Bill
        • 1 year ago

        +3 for irony

    • blastdoor
    • 1 year ago

    I wonder what OS they are going to run and how they will get developer support….

    Maybe pirated Windows?

      • cygnus1
      • 1 year ago

        There's a picture of the supposed thing supposedly running what appears to be Win10 over at AT:

        https://images.anandtech.com/doci/13153/ChMkJlti4d2IRqSLAAHX5UtRDwEAAqU2QIWY08AAdf9883.jpg

        • jarder
        • 1 year ago

        There’s a better pic on Hexus:

        http://hexus.net/tech/news/cpu/120764-amd-announces-new-semi-custom-ryzen-vega-soc/

        • sreams
        • 1 year ago

        “There’s a picture of the supposed thing supposedly running what appears to be Win10 over at AT.”

        There’s a picture of the supposed thing supposedly running what is supposedly Win10 over at AT, supposedly.

        FTFY

          • Fursdon
          • 1 year ago

          “There’s a picture of the supposed thing supposedly running what is supposedly Win10 over at AT, supposedly.”

          There’s a picture of the purported thing purportedly running what is purportedly Win10 over at AT, purportedly .

          FTFTFY

      • ronch
      • 1 year ago

      I bet $5 it’s based on Linux or Unix.

        • brucethemoose
        • 1 year ago

        Linux doesn’t run PUBG.

        On the other hand, those desktop pics could easily be Linux with a shameless Windows skin. I know I’m stereotyping Chinese tech companies a little, but they would totally do that.

    • thx1138r
    • 1 year ago

    I would love to see some benchmarks of that. In particular, assuming it has a 256-bit GDDR5 memory interface, it would be nice to know how standard Windows programs run with that increased bandwidth but higher latency.

    Also, how about a mini-ITX version of that machine? It would have a lot of potential as a basic but respectable gaming PC.

      • NoOne ButMe
      • 1 year ago

      There are jailbroken PS4 Pro systems… it’s Linux, but it should fundamentally behave similarly to Windows with regard to its use of GDDR5?

        • thx1138r
        • 1 year ago

        The big difference between this system and the PS4 Pro is the choice of CPU cores. The Zen cores should be considerably faster than the old Jaguar cores, and I’d like to know how standard Windows/Linux apps run on this system.

      • auxy
      • 1 year ago

      I’ve run Linux on a PS4 with an SSD and everything runs like balls. The slow Jaguar CPUs are what’s holding it back I think tho. (‘ω’) This thing should be much faster.

    • jihadjoe
    • 1 year ago

    "For reference, that's way more raw shader-processing power running at higher nominal clock speeds than the maximum of 11 CUs on Raven Ridge APUs, to say nothing of the 20 Vega-ish CUs of the Intel Radeon RX Vega M GH chip in the Hades Canyon NUC."

    I feel like you transposed Kaby Lake-G and Raven Ridge here. As it is, the sentence basically reads "24 is more than 11, let alone 20." Since you seem to be going for an even more emphatic difference with "to say nothing of", shouldn't that fragment be used in conjunction with Raven Ridge, which the new SoC improves on the most?

      • Jeff Kampman
      • 1 year ago

      I originally wrote it the way you suggested, but it’s early and I was rolling the phrasing around in my mind while getting up to speed with a cup of coffee. I do agree that the 24-20-11 phrasing is most natural, and I’ve reverted it to that order.

    • JustAnEngineer
    • 1 year ago

    A console with for-real desktop CPU cores? While that would certainly be refreshing, do we really expect the big console suppliers to accept something that requires that much cooling?

      • Hattig
      • 1 year ago

      The CPU part is only running at 3 GHz (I don’t know if it has turbo as well); it’s likely 65W, if not 45W.

      The GPU aspects are harder to quantify. It’s probably around 100W, and then there’s the GDDR5 memory on the package as well.

      Still, it’s not hard to cool that amount of power. Sony and Microsoft do it in their consoles.

      • jihadjoe
      • 1 year ago

        Zen cores are a lot more efficient than the Jaguar cores that made it into the Xbone and PS4.

        • ermo
        • 1 year ago

        Not to mention wider, faster and more powerful. Something like this is pretty much the perfect Vulkan-powered SteamBox.

    • nico1982
    • 1 year ago

    That’s a legit PS4.5/Xbox 3Π … nice.
