AMD’s Ryzen 7 2700U and Ryzen 5 2500U APUs revealed

AMD’s CPU division has had a heck of a few months. Its Ryzen 5 and Ryzen 7 CPUs established new levels of bang for the multithreaded buck. The Ryzen Threadripper 1920X and Ryzen Threadripper 1950X put the company back in the high-end desktop game for the first time in many years. Benchmarks suggest that Epyc server CPUs present compelling values versus the Xeon Scalable Processor family, too.

Those processors are all impressive enough, but AMD’s most important Ryzen CPUs will likely be those with an integrated graphics processor on board. A chip with integrated graphics is critical to addressing the needs of the vast majority of PC users at home, in the office, and on the go. For most users, the integrated graphics processors in Intel’s CPUs offer more than enough pixel-pushing power, and those IGPs drive the vast majority of displays out there. An integrated graphics processor is also essential in the thin-and-light notebooks that make up the bulk of PC sales these days.

Try as AMD might have, its past APUs with Radeon graphics on board didn’t set buyers’ hearts on fire. Consequently, and much as it has with Xeons in the data center, Intel has long enjoyed a vast market-share advantage over AMD in the attractive mobile market thanks to its finely tuned process technology, architectural advantages, and highly scalable CPU designs.

The Raven Ridge APU die. Source: AMD

Today, though, AMD is putting the final puzzle piece in place for a fully competitive Ryzen CPU lineup with its Ryzen 5 2500U and Ryzen 7 2700U mobile APUs, both of which use the system-on-chip formerly code-named Raven Ridge. Although the official series name for these parts—Ryzen CPU with Radeon Vega Graphics—is a tad unwieldy, AMD’s latest attempt to fuse CPU and graphics processing power seems more elegant and compelling than it’s ever been.

See, APUs have always boasted far better integrated graphics performance than Intel’s CPUs, but that graphics power was paired with CPU cores that trailed Intel’s in the kinds of single-threaded workloads that make up the bulk of most regular folks’ computing tasks. Outside of a couple of niche cases like HTPCs or truly bargain-basement gaming builds, past APUs never really found a natural home in our recommendations for system builders, thanks to that unbalanced performance.

A block diagram of the Zen core

The Zen CPU architecture seems poised to change all of that. As the past few months have shown, AMD finally has a CPU core that’s competitive with Intel’s latest and greatest, even if it can’t quite match Skylake and its derivatives clock-for-clock and in every workload. The Haswell or Broadwell-class performance that Zen provides has proven sufficient for many PC users, and folks with even older chips have long been digging in their heels. We also know that Zen is quite power-efficient, since Ryzen CPUs have had no trouble keeping up with Intel’s latest on that front in our estimations.

The Vega 10 GPU

AMD’s Vega graphics architecture may also mark an important step forward for the Radeon graphics on APUs. Although the vaunted High-Bandwidth Cache Controller has been replaced with a custom memory controller for Raven Ridge APUs, other Vega architectural features could still make Radeon graphics a better fit for APUs than ever.  For example, the Draw Stream Binning Rasterizer, a form of hybrid tiled rendering, helps to reduce pressure on off-chip memory bandwidth by keeping the data associated with each tile (or bin) in on-chip cache and minimizing data movement between on-chip and remote memory. In a relatively bandwidth-rich implementation like that of the Radeon RX Vega discrete graphics card, the DSBR’s economizations might not be keenly felt, but I’d be willing to bet that they’ll be useful in the bandwidth-restricted confines typical of Ryzen APUs.
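To make the binning idea a bit more concrete, here’s a minimal software sketch of hybrid tiled rendering in the abstract. It isn’t a model of AMD’s hardware: the tile size and triangle representation are arbitrary assumptions, and real rasterizers bin primitives with far more sophistication. The point is simply that grouping work by tile lets each tile’s render-target data stay in on-chip storage and be written out to memory once.

```python
# Toy sketch of tile binning, not AMD's DSBR implementation. Tile size and the
# triangle representation (tuples of (x, y) vertices) are arbitrary assumptions.
TILE_W, TILE_H = 32, 32

def bin_triangles(triangles):
    """Group triangles by the screen tiles (bins) their bounding boxes touch."""
    bins = {}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        for ty in range(int(min(ys)) // TILE_H, int(max(ys)) // TILE_H + 1):
            for tx in range(int(min(xs)) // TILE_W, int(max(xs)) // TILE_W + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

def render_binned(triangles):
    # Process one bin at a time: the tile's pixels stay in fast on-chip storage
    # while every triangle touching the tile is shaded, then the finished tile
    # is written to off-chip memory exactly once. An immediate-mode rasterizer
    # would instead re-read and re-write framebuffer data triangle by triangle.
    for (tx, ty), tris in bin_triangles(triangles).items():
        tile = [[0] * TILE_W for _ in range(TILE_H)]  # stand-in for on-chip cache
        for tri in tris:
            pass  # shade this triangle's covered pixels within the tile here
        # a single write-back of `tile` to off-chip memory would happen here
```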

Outside of its core CPU and GPU IP, AMD’s design teams are working with a more consistent set of building blocks than they have been over the past few years, too. Although the company didn’t talk much about this fact in its presentation, it’s worth noting that the Zen and Vega architectures have been tuned for and produced on the same 14-nm LPP process from GlobalFoundries from the get-go. That common production process would seem to make integrating both parts on a single die much easier than it may have been in the past, when AMD apparently had to prioritize performance on some parts of the chip (e.g. Kaveri’s integrated GPU) at the expense of peak CPU frequencies.

That 14-nm process could also provide better performance scaling as TDPs for Ryzen APUs climb well past the 15W envelope that AMD is initially targeting, an important consideration for chips that will presumably drop into desktops built on the Socket AM4 platform. Carrizo APUs, for example, were designed around a 28-nm high-density logic library that was typically used for GPUs. That high-density library reduced die area and power consumption to fit those chips into their optimal 15W TDPs, but that choice resulted in less-than-ideal frequency scaling in more relaxed thermal envelopes. Recall that the only Carrizo chip offered for the desktop was the graphics-free Athlon X4 845 at 65W. That sub-optimal scaling doesn’t seem like it should be a problem with Raven Ridge.

AMD has also started using its Infinity Fabric interconnect in shipping products. The use of Infinity Fabric means the implementation inelegancies of past APUs should be a thing of the past. Recall that the Llano APU used several purpose-built interconnects to join its Radeon graphics with the Stars CPU cores of the time. Trinity and Kaveri APUs refined this approach with wider, more numerous pipes and coherent methods of accessing main memory, but those interconnects were still specifically built for use in APUs and couldn’t be reused in other AMD products. The Infinity Fabric, on the other hand, has already played a key role in the eight-core Zeppelin die of Ryzen 7, Ryzen 5, and Ryzen 3 fame, as well as the multi-chip modules for Threadripper and Epyc CPUs. It also features in the Vega 10 GPU that powers the Radeon RX Vega 56 and RX Vega 64 graphics cards.

All told, the use of the Infinity Fabric to tie together Zen CPU cores and a Vega graphics processor in an SoC fabricated on a common 14-nm process would seem to be the most complete realization so far of the “SoC-style approach” that CTO Mark Papermaster sketched out all the way back in 2012, and it may well be the best example of the revitalized AMD firing on all cylinders. Depending on how one looks at it, Ryzen APUs are the culmination of a years-long journey or the first step into a new era. Probably both. Let’s see how AMD plans to tackle Intel’s mobile-market dominance.

 

Getting Ryzen ready for Raven Ridge

The Ryzen family of APUs is smarter about revving up CPU clocks than the Precision Boost behavior of its desktop brethren, thanks to a new per-core boost mechanism called Precision Boost 2.

Much like Intel’s Turbo Boost 2.0, AMD describes this new approach as an opportunistic scheme that will extract the maximum possible boost speed from every core given thermal and workload characteristics, using the same 25-MHz granularity available from existing Ryzen chips.
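For a sense of what that granularity means in practice, here’s a hypothetical back-of-the-envelope model, not AMD’s actual boost algorithm: between the 2700U’s 2.2-GHz base and 3.8-GHz maximum boost clocks, a 25-MHz step size gives each core 64 intermediate operating points to choose from as headroom comes and goes.

```python
# Hypothetical model of fine-grained, opportunistic boosting; the headroom
# heuristic is invented for illustration and is not AMD's algorithm.
STEP_MHZ = 25

def reachable_clocks(base_mhz, max_boost_mhz):
    """Every per-core clock reachable at 25-MHz granularity."""
    return list(range(base_mhz, max_boost_mhz + 1, STEP_MHZ))

def pick_clock(base_mhz, max_boost_mhz, headroom):
    """Scale the boost range by remaining thermal/current headroom (0.0 to 1.0)."""
    steps = (max_boost_mhz - base_mhz) // STEP_MHZ
    return base_mhz + int(steps * headroom) * STEP_MHZ

print(len(reachable_clocks(2200, 3800)) - 1)  # 64 steps between base and max boost
print(pick_clock(2200, 3800, 0.5))            # 3000 MHz with half the headroom left
```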

As a result, the “two-core boost” and “all-core boost” figures that AMD has provided for past Ryzen CPUs won’t apply to Raven Ridge SoCs or future Ryzen CPUs. Instead, active CPU cores will gracefully throttle up or down as workloads and thermals allow.

Also thanks to the SenseMI suite of technologies in the Zen architecture, AMD’s Extended Frequency Range (XFR) technology will be coming to Ryzen APUs as Mobile XFR. Like their desktop counterparts, Raven Ridge parts will be able to boost above their specified maximum boost clocks if a given system’s cooling hardware, thermal conditions and workloads allow.

Don’t expect to see the same XFR headroom from every system, though—assuming you see it at all. AMD suggests the feature is always active, but it also notes that XFR will only be available in “premium notebooks with great cooling solutions,” and that a product will have to meet the company’s own performance criteria in order to ship with mobile XFR enabled. The end notes of AMD’s presentation suggest that one of those criteria will be that a system with mobile XFR enabled can handle 25W of waste heat from the CPU, so we wouldn’t expect to see it in the thinnest and lightest systems with Ryzen APUs inside.

To manage power from the host system, Ryzen APUs will take advantage of integrated low-dropout regulators (LDOs) on board the SoC itself. Voltage regulation on Ryzen desktop CPUs is performed by external circuitry on the motherboard (even though LDOs are present in desktop Zen dies), but the integrated LDOs take center stage in the move to mobile.

AMD claims that a shared voltage rail and regulator on the motherboard, working in tandem with the on-die LDOs, allow it to reduce current requirements from the external voltage regulator and to build smaller and lighter laptops. Because each CPU core has its own LDO, as does the IGP, those regulators can also act as fine-grained power gates to let unused elements of the chip power down when they’re not needed.

The use of the LDOs on board Raven Ridge also gives AMD full per-core frequency and voltage control for all CPU cores and the integrated GPU based on workload. If only some CPU cores are under heavy load, for example, the chip can direct power and raise clocks where needed while throttling back less-stressed functional units.

These tradeoffs can also happen across the SoC if the graphics processor is under heavy load and the CPU isn’t, according to the company. Since all this monitoring and adjustment is occurring on-die, AMD claims that adjustments happen more quickly and accurately than they otherwise might.
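As a loose illustration of what per-domain control makes possible, the sketch below assigns each core and the IGP its own voltage-frequency operating point based on load, rather than one shared point for the whole chip. It’s a schematic of the idea only: the operating points and the load-to-state mapping are invented for illustration, not AMD’s tables or governor.

```python
# Invented operating points for illustration; not AMD's tables or algorithm.
# Each (voltage_V, clock_MHz) pair is a per-domain DVFS state the LDOs enable.
OPERATING_POINTS = [(0.0, 0), (0.70, 1600), (0.85, 2500), (1.10, 3400), (1.20, 3800)]

def assign_states(loads):
    """Map each domain's load (0.0 to 1.0) to its own operating point.

    With one LDO per core (and one for the IGP), an idle domain can be gated
    down to the (0.0, 0) state while a busy one runs at its highest point."""
    states = {}
    for domain, load in loads.items():
        index = min(int(load * (len(OPERATING_POINTS) - 1) + 0.5),
                    len(OPERATING_POINTS) - 1)
        states[domain] = OPERATING_POINTS[index]
    return states

print(assign_states({"core0": 1.0, "core1": 0.2, "core2": 0.0, "core3": 0.0, "igp": 0.6}))
# core0 gets the top state, idle cores are gated off, and the IGP lands in between
```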

When the SoC is largely idle, AMD can employ a range of power-saving techniques. Each core can enter the C6 deep-sleep state when it’s idle, and if all cores enter the C6 state for a long enough period, a CPUOFF state reduces the power flowing to the L3 cache, as well. Similarly, the LDOs allow the graphics portion of the die to be power-gated when it’s not being used, and a deeper GFXOFF state reduces power to the GPU’s uncore. Finally, if both functional units enter deep-sleep states, a VDDOFF state will halt the external voltage regulator. Power-saving schemes like these have been present in past APUs, but this implementation seems to be the most integrated and refined that AMD has shipped so far.
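As a rough way of visualizing how those states nest, the sketch below encodes the progression AMD describes (per-core C6, then CPUOFF, GFXOFF, and finally VDDOFF) as a simple decision function. It only illustrates the ordering laid out above, not the actual power-management firmware or its residency thresholds.

```python
# Illustration of how the idle states described above nest; not a model of
# AMD's actual power-management firmware or its residency thresholds.
def deepest_idle_states(cores_in_c6, gpu_idle, gpu_uncore_idle):
    """cores_in_c6: per-core C6 flags; the two booleans describe the GPU side."""
    states = []
    if all(cores_in_c6):
        states.append("CPUOFF")      # every core in C6 long enough: cut L3 power
    if gpu_idle:
        states.append("GFX gated")   # LDOs power-gate the graphics portion
        if gpu_uncore_idle:
            states.append("GFXOFF")  # deeper state also drops the GPU uncore
    if "CPUOFF" in states and "GFXOFF" in states:
        states.append("VDDOFF")      # both halves asleep: halt the external VR
    return states

print(deepest_idle_states([True] * 4, True, True))
# ['CPUOFF', 'GFX gated', 'GFXOFF', 'VDDOFF']
```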

The Infinity Fabric on Raven Ridge is subdivided into two regions with distinct characteristics: components that need to become active for short periods during display refreshes, and components that can remain gated off throughout. AMD’s designers worked to separate the two and to minimize the former, so that as much of the fabric as possible can stay powered down to save more power.

If deep-sleep states are important for power saving, being able to quickly respond to a user’s demand for processing power is just as critical. AMD claims that the Raven Ridge SoC is generally able to wake gated-off components much faster than the Bristol Ridge FX-9800P could, thanks to state-saving and state-restoring optimizations in the Zen pipeline, retention registers in the Vega GPU that reduce the need for saving operations, and a bypass for the phase-locked loop clock that keeps that component from slowing the rest of the SoC during wakeup.

A comparatively easier-to-explain change is that the Raven Ridge SoC package itself is only 1.38 mm thick, down from 1.82 mm for similar chips from the Bristol Ridge generation of APUs. Every millimeter counts these days, and paring off even fractions of one will likely help AMD’s design partners put Raven Ridge APUs in chassis that one could shave with.

 

The Ryzen APU duo

Now that we’re sufficiently up to date with AMD’s foundations for Raven Ridge, let’s talk about the two implementations of the SoC that AMD is announcing today.

|               | Cores/threads | Base clock (GHz) | Boost clock (GHz) | GPU compute units | GPU peak clock | L3 cache | TDP | RAM support |
| ------------- | ------------- | ---------------- | ----------------- | ----------------- | -------------- | -------- | --- | ----------- |
| Ryzen 7 2700U | 4/8 | 2.2 | 3.8 | 10 | 1300 MHz | 4MB | 15W | Two channels DDR4-2400 |
| Ryzen 5 2500U | 4/8 | 2.0 | 3.6 | 8 | 1100 MHz | 4MB | 15W | Two channels DDR4-2400 |

The Ryzen 7 2700U will tap four Zen cores and eight threads running at 3.8 GHz boost and 2.2 GHz base speeds. The integrated GPU boasts 10 Radeon Vega compute units for a total of 640 stream processors. AMD calls this integrated GPU “Vega 10 Processor Graphics” in its footnotes, so expect that branding to show up in marketing materials and on spec stickers. AMD suggests a 1300 MHz boost clock for the integrated GPU, but it isn’t going into more detail about base or typical speeds today.

The Ryzen 5 2500U will offer the same four cores and eight threads as its bigger brother, but they’ll run at lower clock speeds: 3.6 GHz boost and 2 GHz base, to be exact. The Ryzen 5 chip also loses two Vega compute units in the bargain, so it could carry a Vega 8 Processor Graphics sticker. The Vega 8 integrated GPU has 512 stream processors, and AMD claims peak clock speeds of up to 1100 MHz for this IGP.
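Those stream-processor counts follow directly from the compute-unit counts, since GCN- and Vega-style compute units each contain 64 stream processors. Assuming the usual two FLOPs per stream processor per clock for fused multiply-adds, the quoted peak clocks also imply rough upper bounds on FP32 throughput, as the quick calculation below shows. These are theoretical ceilings derived from AMD’s peak clocks, not measured figures.

```python
# Back-of-the-envelope math from the figures above. The 64-SP-per-CU ratio and
# the 2-FLOPs-per-SP-per-clock factor are standard for Vega-style GPUs; the
# clocks are AMD's quoted peaks, so these are theoretical ceilings only.
def igp_stats(compute_units, peak_clock_mhz):
    stream_processors = compute_units * 64
    peak_gflops = stream_processors * 2 * peak_clock_mhz / 1000
    return stream_processors, peak_gflops

print(igp_stats(10, 1300))  # Ryzen 7 2700U: (640, 1664.0) -> ~1.66 TFLOPS peak
print(igp_stats(8, 1100))   # Ryzen 5 2500U: (512, 1126.4) -> ~1.13 TFLOPS peak
```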

To get their quad-core layouts, both CPUs implement one Zen core complex (or CCX) provisioned with 4MB of shared L3 cache. That’s the smallest L3 cache we’ve seen on a Zen CPU so far, down even from the 8MB in the Ryzen 5 1400. Each Zen CCX can have as much as 8MB of L3, so it’ll be interesting to learn whether this choice was made for power savings, market segmentation, or some other reason.  Each chip will retain 512KB of L2 cache per core, though, and they’ll be able to run with DDR4 memory at speeds up to 2400 MT/s.

Both of these chips will nominally run within 15W power envelopes, although system integrators will be able to squeeze them into TDPs as low as 12W or open up headroom to as much as 25W. Intel’s eighth-generation Core i7s can squeeze into machines dissipating as little as 10W, or as much as 25W, so AMD’s chips should offer designers about as much flexibility.

A sneak peek at performance

No CPU announcement is complete without performance numbers, and AMD offered us a few tidbits from its internal labs to whet our appetites. As usual, I’d caution against reading too much into these results. In part, that’s because the company chose an Acer Spin 5 notebook as its host for the Core i7-8550U that’s going up against the Ryzen 7 2700U in these tests. It’s a 13.3″ notebook that’s just 0.63″, or 16 mm, thick. My guess is that this laptop is quite thermally constrained given its compact body and small footprint.

I only bring this up because AMD included a set of test numbers for an Acer Swift 3 notebook with a Core i5-8250U in its notes that it didn’t graph. I’m never one to let good test data go to waste, so I went to the trouble of poring over those numbers, too. Swift 3 notebooks come in 14″ and 15.6″ bodies, so they could offer better cooling potential than the 13″ Spin 5. With the same four cores and eight threads as the i7-8550U, the potentially better-cooled i5-8250U doesn’t prove as easy a match for the AMD side of the court. Keep these numbers in mind as we examine AMD’s internal performance figures, and note that AMD didn’t offer any details of its reference notebook for either of its SoCs. We have no idea how slim or well-cooled that machine might be.

Cinebench is first up. The company’s internal testers pitted the Ryzen 7 2700U against Intel’s second-best Core i7s from the Kaby Lake and Kaby Lake Refresh generations: the Core i7-7500U and Core i7-8550U. The Ryzen 7 2700U basically matches the Core i7-7500U in the single-threaded portion of Cinebench, and it’s about 10% short of Intel’s latest-and-greatest. That performance bodes well for the kinds of lightly-threaded workloads most people tend to spend most of their time doing on the average PC.

Zen CPUs tend to do well in Cinebench’s multi-threaded test, as well, and Ryzen APUs are no exception. For heavier-duty use, the Ryzen 7 2700U puts up an impressive Cinebench all-thread score of 719, a whopping 44% faster core-for-core and thread-for-thread than the Kaby Lake Refresh part. This best-case showing for Raven Ridge suggests prodigious multithreaded performance potential for the 15W power envelope.
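For what it’s worth, AMD’s 44% claim lets us back out the implied multithreaded score for the Core i7-8550U in that particular Spin 5 test system. This is simple arithmetic on the company’s own numbers, not anything we’ve measured:

```python
# Arithmetic on AMD's own figures: a 719 all-thread score that is 44% ahead of
# the i7-8550U implies roughly the following Intel score in that test system.
ryzen_score = 719
implied_intel_score = ryzen_score / 1.44
print(round(implied_intel_score))  # ~499
```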

In the range of applications where AMD provided (but didn’t graph) data for the Core i5-8250U, however, it becomes clear why the Spin 5 and its Core i7-8550U might not be the most representative implementation of that chip. Whether in POV-Ray, PCMark 10, TrueCrypt, or PassMark 9, the Core i5-8250U in the Swift 3 beats out the Core i7-8550U in the Spin 5. In whatever reference design AMD is using for the Ryzen 7 2700U and Ryzen 5 2500U, both SoCs clearly have enough room to stretch their legs, and that might explain the rather wide gaps between the Ryzen 5 2500U, the Ryzen 7 2700U, and the Core i7-8550U in some tests. We’d be curious to see what would happen if one were to put the i7-8550U in the Swift 3. Still, the Ryzen APUs remain quite competitive with the Intel parts given these caveats.

Of course, many prospective Ryzen APU buyers will be more interested in these chips’ gaming chops than anything, and if 3DMark Time Spy is any indication, the Vega 10 processor graphics in the Ryzen 7 2700U could prove quite compelling if the price is right for notebooks built with this chip. The 2700U more than doubles the performance of Intel’s UHD Graphics 620 IGP in the Core i7-8550U, and it matches a result taken from the public 3DMark database for a machine with a Core i7-7500U and a GeForce GTX 950M inside. That said, a scroll through 3DMark’s databases also shows a variety of similar Core i7-7500U-and-GTX-950M systems with scores well over 1000. Don’t throw out your gaming notebook just yet.

AMD didn’t provide competitive numbers for Intel’s UHD Graphics 620 IGP in real-world games, but the Ryzen 7 2700U still puts up a good showing. Acceptable performance with reasonable graphics quality in immensely popular games like League of Legends and Dota 2 at 1920×1080 would probably seal the deal for many mainstream gamers, but potentially playable performance at 1920×1080 with Counter-Strike: Global Offensive might not hurt, either.

The seemingly magical results for Overwatch are tempered by the fact that the Ryzen 7 2700U achieved them with low settings at 1280×720 and a 79% render scale. Some might be able to tolerate those settings, but full-HD gaming this is not. Comparison data with Intel’s UHD Graphics 620 IGP would be nice to see, but if frame-time performance jibes with these FPS numbers, gaming on a high-end Raven Ridge machine seems likely to prove a better experience overall than with the common UHD Graphics 620.

Although AMD isn’t pitting Ryzen Mobile CPUs against Intel parts in the battery-life department, either, it does claim large increases in longevity from its Ryzen APUs compared to Bristol Ridge parts. The most drastic improvement comes in VP9 video playback, where the Vega GPU is likely able to accelerate decoding that the Radeon graphics on Bristol Ridge can’t. Likely thanks to that help, the Ryzen 7 2700U delivers twice the battery life of the FX-9800P. In H.264 playback and MobileMark 14, the gains are less drastic, but 15% to 26% better battery life is still nothing to sneeze at.

 

HP, Acer, and Lenovo hop on board

SoCs are no good without OEMs to build systems around them, and AMD is announcing three major design wins today from Acer, HP, and Lenovo. These machines will be Raven Ridge’s pioneers during the holiday season of this year. AMD expects a full ramp of partner designs built around Ryzen APUs in 2018.

The HP Envy X360 is a 15.6″ convertible with a 1920×1080 touch screen. HP will offer it with a Ryzen 5 2500U SoC, as much as 8GB of dual-channel DDR4-2400 RAM, and storage options up to a 512GB SSD or a 1TB mechanical hard drive. The Envy X360 is a hair over three-quarters of an inch thick (or 19.5 mm), and it weighs about 4.7 lb (2.15 kg).

Lenovo’s Ideapad 720S is a 13.3″ traditional notebook. It’ll come with a 1920×1080 IPS display, up to 512GB of NVMe storage, and options for Ryzen 5 2500U and Ryzen 7 2700U SoCs. In a choice that’s notable for all the wrong reasons, Lenovo will offer the machine with some amount of single-channel DDR4-2133 RAM. Given what’s likely to be a bandwidth-hungry SoC, the choice to stick with one channel of DDR4 seems unfortunate. At just over half an inch thick (13.6 mm) and 2.5 lb (1.14 kg), though, the Ideapad 720S definitely delivers on AMD’s thin-and-light promise.

Acer’s Swift 3 rounds out the wins AMD has secured for the moment. This is a 15.6″ traditional notebook with a 1920×1080 IPS screen. Acer will offer it with as much as 8GB of dual-channel DDR4-2400 RAM and SSDs as large as 256GB. At 0.7″ thick (18 mm) and four pounds (1.8 kg), the Swift 3 seems like the kind of straightforward machine that’s just right for the middle of the bell curve of PC users.

Conclusions – for now

If its internal performance numbers are to be believed, Ryzen APUs seem to have what AMD needs to take the fight to Intel in the largest and most attractive segment of consumer PCs today. The combination of Zen CPU cores and Vega graphics power in Raven Ridge seems to make for the kind of truly balanced APU we’d want to recommend as an alternative to Intel CPUs in thin-and-light notebooks. With competitive single-threaded performance, high multithreaded potential, and promising gaming performance, the Ryzen 7 2700U and Ryzen 5 2500U seem like a fine opening salvo as AMD re-enters the high-performance mobile market.

On top of these apparently solid SoCs, AMD can already count design wins from major manufacturers in its corner, and those systems mostly seem as though they’ll put the best foot forward with Ryzen APUs. We’ll need to see where pricing for these systems lands, of course, and buyers will still have to bite after years of Intel dominance in notebook PCs. Still, the final puzzle pieces seem to be falling into place for AMD to complete its x86 CPU renaissance.

Comments closed
    • abiprithtr
    • 2 years ago

    It’s been 3 weeks, and I still read the article title as ” AMD’s Ryzen 7 2700U and Ryzen 5 2500U APUs [i<]reviewed[/i<]"

    • tipoo
    • 2 years ago

    Curious how the graphics side will compare to the MX150, seems to be shipping in systems that have that (Swift 3)

    • ET3D
    • 2 years ago

    Looking at the gaming scores, they look in the ballpark of desktop Bristol Ridge. Not bad for 15W, but not quite close to discrete chips like the MX150.

    I do hope though that it’d be the death of the M445 and its ilk.

    • Chrispy_
    • 2 years ago

    [quote<]AMD's past APUs with Radeon graphics on board didn't set buyers' hearts on fire[/quote<] The quality of the Radeon graphics on AMD's past APUs was irrelevant because the CPU portion of the product was a good 8-9 years behind the competition in terms of performance. 8-9 [i<]years[/i<] behind in a market where products become obsolete in 5 is why they failed.

    • Glock24
    • 2 years ago

    The last AMD system I had was around 2012, a Llano build. Since then I’ve moved to a laptop as my primary computer, and I’ve tried hard all these years to get a decent AMD laptop, but all I could find were cheap, crappy ones.

    These mobile Ryzen APUs look tempting, but I’ll wait until they release > 15W chips. My current laptop has a 45W CPU and a ~55W GPU, so 15W won’t cut it. I do game from time to time, and a “big” APU with ~100W TDP would be great for portable gaming.

    Also I’m waiting for desktop APUs for a new Mini-ITX build.

    Edit: 100W APU being it mostly GPU.

      • ludi
      • 2 years ago

      Shouldn’t you be measuring performance by something other than power consumption? I mean, sure, you can make some ballpark estimates of how much horsepower is being shoved in there, but there are better ways of measuring it.

    • RdVi
    • 2 years ago

    Lenovo aren’t supporting AMD by using their chips and then pairing them with single channel low speed RAM… quite the opposite really. Poor form.

      • Anonymous Coward
      • 2 years ago

      However, see: [url<]https://techreport.com/news/32520/lenovo-thinkpad-a275-and-a475-take-amd-pro-apus-on-the-road[/url<]

    • BobbinThreadbare
    • 2 years ago

    Max clock speed on the 2500 is 200 mhz faster than the Ryzen 5 1400. Depending on the price that might be a very nice entry level gamer chip.

    • HERETIC
    • 2 years ago

    Is it me or am I missing something here?

    The 15 Watt(race to sleep) platform seems to be the wrong place to launch
    quad core Zen and Vega graphics…..

    I’d have thought a 35 Watt TDP would really allow Zen and Vega to shine,and
    they could slip in the “race to sleep” later……………………….

      • DeadOfKnight
      • 2 years ago

      That’s what you want to buy, but that doesn’t mean that’s where the money is. AMD wants to sell ultrathin laptops for Christmas.

        • HERETIC
        • 2 years ago

        Laptops in general are a “compromise” Size-Power etc.
        ultrathin(race to sleep) adds another layer of compromise.

        At 15 watts they’ll probably have a CPU almost as good as Intel
        and a GPU a little better.

        I just think AMD could have come out the gate screaming-
        LOOK AT ME-LOOK AT ME-LOOK AT ME.
        Which a 10 Watt CPU and 25 Watt GPU would give them……………

          • Spunjji
          • 2 years ago

          How could they do that if they were selling larger, heavier laptops than their competition? You’re also assuming that you’d get sufficient performance scaling from that additional TDP headroom which, in the case of the graphics, probably isn’t true due to bandwidth limitations. At 35W you’re in the same form-factor territory as Gen 8 Intel quad-core 15W + 25W dedicated nVidia graphics with its own VRAM. They’re not going to beat that combination.

            • HERETIC
            • 2 years ago

            Call me “optimistic” but I’m hoping a AMD APU in the <45 watt will send all those
            low end M-GPU’s off to silicon heaven,you know the ones that are often advertised
            by their Ram size not their capability………………………………..

            As to the rest-I see-trying to put improved graphics with a “race to sleep”processor
            a contradiction in itself………………………

    • DeadOfKnight
    • 2 years ago

    If I remember correctly, a GeForce 950M performed similarly to the GeForce GTX 750 Ti, which was a very respectable graphics card that had performance similar to the Xbone’s GPU without the need for any additional power. If the iGPU in this is comparable to that, then this should be a very good gaming chip indeed and will likely become the go-to chip for HTPCs.

      • Spunjji
      • 2 years ago

      If the benchmarks so far are to be believed, the iGPU here can roughly match (and sometimes slightly exceed) 950M performance.

      Having used a laptop with one of those in myself, I know that it’s perfectly capable of 1080p gaming with moderate details and 2560×1440 on some less demanding titles. It’ll be good stuff!

        • tipoo
        • 2 years ago

        Hmm, so this should also be beating the MX150 if this is true.

        [url<]https://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html[/url<]

    • DeadOfKnight
    • 2 years ago

    Come on AMD, that die shot clearly shows 11 Vega compute units. What gives?

      • cmrcmk
      • 2 years ago

      Hedging manufacturing bets? TDP constraints? It’s probably safe to assume that when they release 25+ W chips, you’ll see SKUs with all 11 active.

        • Gadoran
        • 2 years ago

        Formerly right now ryzen-5-2500u is a 25W cpu, we don’t need to wait.
        Clearly AMD put at this TDP value the SKU to do the bench suite reported by the article. It is not fair enough still an underdog like AMD has to do such things to survive.

        Obviously the comparison was a failure too because the rigth Intel configuration should be i5-8250U + MX150 in a reference design. But a deadly comparison it is not welcomed i believe, better utilize an old 28nm GPU under power constrains.

          • Spunjji
          • 2 years ago

          Very coherent, thanks for sharing. /s

      • NTMBK
      • 2 years ago

      Saving the best chips for Apple?

        • tipoo
        • 2 years ago

        I was just thinking of the rMBP 13 looking at this. Even if Intel could offer a bit better CPU performance, the quads in this would be a fair uplift from what we have now (7G dual cores), and the integrated graphics would be a lot better than the Iris 650.

        The chassis seems too small for the MX150, so this would be a nice option…Using an AMD CPU would be a first though.

      • dragontamer5788
      • 2 years ago

      Manufacturing defects may destroy a CU but otherwise the chip is perfectly usable.

      This just indicates that AMD expects that one-CU will be busted on the average, but the chip is still overall “sellable”. A similar technique happened with the PS3 (with 8 SPEs being made on every PS3, but only 7 SPEs accessible by the software. The assumption was that 1-SPE would be broken due to manufacturing defects)

      • freebird
      • 2 years ago

      Correct.

      [url<]http://www.guru3d.com/news-story/amd-raven-ridge-apu-spotted-4-zen-cpu-cores-and-704-shader-processors.html[/url<]

      11 (compute units) x 64 (shaders per CU) = 704 shaders. Since the 2700U only has 10 enabled (640 shaders) and the 2500U 8 enabled (512 shaders), perhaps the full 11 will only be enabled in higher-wattage parts, or they are having yield issues and can only release these models for now.

      • ET3D
      • 2 years ago

      I think purely yield. They want a lot of these. Like the RX 460 was 14 CU’s. Also, more units don’t always translate to significantly higher performance, due to using more power and therefore running at lower clocks. A lot of binning would be needed to make that worthwhile.

      AMD might introduce a higher end chip in the future.

    • DeadOfKnight
    • 2 years ago

    This looks like it’ll be a fine contender for Intel’s NUC.

      • smilingcrow
      • 2 years ago

      NUC being short for NUClear option.

    • DeadOfKnight
    • 2 years ago

    This is what the console makers should have waited for, and just launched the next-generation PS5/Xbox4 instead of these half-cycle refreshes that do next to nothing to bring more complexity to games and only (kinda) enable 4k. To be fair, this could still be in the cards, but probably a lot further out now that brand new consoles exist (and arguably didn’t need to).

      • Voldenuit
      • 2 years ago

      Wait 4 years (xbone came out Nov 2013) while your competitors get a head start and corner all of the eighth gen market? None of the players could afford to do that.

        • DeadOfKnight
        • 2 years ago

        Would it really take 4 years from now to launch products based on Zen/Vega architecture? I could see maybe 2. And if you haven’t checked, PS4 Pro sales are pretty lackluster and even if the Xbox One X manages to sell well, it’s been proven time and again that games sell consoles and Sony gets all the best exclusives year after year. They did this to try and sell PSVR, which was also too soon IMO. I just think it’d be better to have Zen/Vega in consoles by 2020 than “4k” systems today.

      • BobbinThreadbare
      • 2 years ago

      Why would console makers want integrated graphics?

        • ronch
        • 2 years ago

        The current consoles technically have integrated graphics because their CPUs and GPUs are in an APU setup.

          • tipoo
          • 2 years ago

          Though with GDDR5 or eSRAM to address the main scale up issue with integrated graphics. After that it was chip size and wattage.

          If Intel continued the Iris Pro models rather than the Plus with half the eDRAM, they’d probably be around the base consoles with Baby Cake or Coffee.

        • NTMBK
        • 2 years ago

        The same reason they’ve been using it since the XBox 360 shrink- to reduce costs and improve efficiency.

      • cmrcmk
      • 2 years ago

      I could only speculate on the GPU side, but for the CPU, this is a fundamentally different architecture. PS4 and Xbox1[i<]n[/i<] use Bobcat-derived cores. While the Zen cores are certainly higher performance both in IPC and pure Hz, the console manufacturers have to be careful about anything that could break backwards compatibility.

        • DeadOfKnight
        • 2 years ago

        It’s still x86-64 architecture, which was the whole point of last gen consoles going with AMD instead of PowerPC or Nvidia for console SoCs. There should be no compatibility issues for the foreseeable future. The reason the newest consoles still use Bobcat CPU cores is because they are going to run the same generation of games that have to be playable on XB1 and PS4.

        They have stated repeatedly, though of course only time will tell, that there will be no Xbox One X or PS4 Pro exclusive games. The newest consoles are simply a refresh to allow you to run games in VR/4K/HDR at decent frame rates. Not necessarily a bad idea, but it’s underwhelming when the tech to do so much more for the same price is around the corner.

          • Spunjji
          • 2 years ago

          The tech to do so much more for the same price is /always/ around the corner. That’s part of the console equation. What this has to do with a 15W APU is beyond me.

            • DeadOfKnight
            • 2 years ago

            No, no it’s not. Not from AMD it hasn’t been for a very long time. And this 15W APU has CPU cores that are better than those in the Xbox One X and the PS4 Pro. With a higher TDP we are talking about an even bigger increase.

          • tipoo
          • 2 years ago

          There’s no reason x86 is any better suited to instant BC than staying in PowerPC or MIPS would have been, the ISA is just the ISA. It’ll depend on how low level the APIs are. I think Microsoft will prove fairly transferable with their DX12 derivative XBO API, while GNM I think is fairly lower level (see the PS4 Pro not enabling the full graphics performance on unpatched games, boost mode disables half the GPU).

      • ET3D
      • 2 years ago

      GPU power is the limitation for 4K, not CPU power, so having a die which dedicates a lot of space to a powerful CPU isn’t a good way to go for moving to 4K and VR.

      Besides, I think that it will be a while before AMD offers custom Zen based APU’s.

        • DeadOfKnight
        • 2 years ago

        If you do a bit of research, you will find that a good deal of current generation console games that suffer from performance issues are mainly held back by the weak CPUs in these systems.

          • NTMBK
          • 2 years ago

          Yes, and any games for these refreshes need to also run on the original base consoles, and will still be held back by that CPU. A beefed up CPU would go to waste.

            • DeadOfKnight
            • 2 years ago

            I’m not arguing with you there. The new refreshes don’t need a new CPU; it’s just a shame they decided to do these refreshes to extend this generation of consoles when the tech from AMD to bring something substantial enough to warrant a new generation is coming out here in 2018.

          • ET3D
          • 2 years ago

          It may be true (though I think it isn’t), but if the goal isn’t to fix all performance issues but rather to move to 4K and VR, then GPU is key.

          Why I think it isn’t true is that games drop resolution left and right. You don’t even get 1080p in many cases. So sure, after dropping the res the CPU may be an issue, but the basic limitation is still the GPU. If the choice is between a faster CPU and a faster GPU, the faster GPU would improve things a lot more.

            • DeadOfKnight
            • 2 years ago

            It’s not all about FPS in current games. Developers limit what they can put into the game based on what the hardware is capable of. We have less complex games than we could have with a bigger CPU. There’s a reason a lot of PC ports of games, even if they come with issues of their own, often bring features that consoles don’t get.

            I’m not talking about post-processing and textures and stuff like that. I’m talking about how you can have bigger battlefields with more players for multiplayer games. You can have more NPCs and more systems at work for RPGs. RTS games developed for PCs could work with a mouse and keyboard, but most would not run well on current consoles.

            There is a reason a game like Star Citizen hasn’t even given consideration to current consoles, and it’s largely due to the weak CPUs. There’s no way they could run it, even at the lowest video settings. Assassin’s Creed Unity was too ambitious on current consoles, and it ran terribly because of the CPU. Physics and number of objects and players and everything would have to be dumbed down.

        • tipoo
        • 2 years ago

        For sticking with the same level of game complexity as the 8th-gen base consoles at 4K, sure, but the CPU is clearly the limit in a lot of places. The Jaguar cores are pretty weak by now, bested by modest dual-core Pentiums even if all 6.5 available cores are used well. See Destiny running at 60fps on Pentiums while at 30 for CPU reasons on console, per DF interview.

        • Waco
        • 2 years ago

        GPU power *can* be a limitation at 4K. CPU power is absolutely part of that equation – have you ever tried gaming at 4K with a weak CPU? It’s [i<]bad[/i<]. Really [i<]really[/i<] bad. Stuttery mess bad, even with a Titan X.

          • DeadOfKnight
          • 2 years ago

          That’s a good point, too. The bottom line to me is that Sony and Microsoft just wanted to capitalize on the new 4K and VR trends, but if they just waited a couple more years they could have launched the next generation with the performance to make it worthwhile for anyone to get on board.

          Now, I think they will wait a little longer since they have new boxes to sell. Which means more games that don’t push the envelope in any way and more PC ports that don’t take advantage of the hardware. It’s just a shame to see what’s technically possible but not going to happen anytime soon because $$.

            • Anonymous Coward
            • 2 years ago

            I prefer consoles with many slow CPUs to hypothetical consoles that can get by with minimal CPU parallelism. I think that helps establish good software architecture, which then comes back to help PCs.

            I do not intend to purchase a console, incidentally.

            • DeadOfKnight
            • 2 years ago

            That’s a good point as well. I remember thinking this very thing when this generation was newly announced. Judging by the Coffee Lake review, it has been working. Not only that, but we’ve been getting some decent ports from Japan on PC over the past couple years. As a PC gamer, these consoles are great.

            However, there does come a point where developers are squeezing every last drop of performance out of the hardware they can and have to scale back to where games become playable. This is already happening on the CPU side of things, therefore we get games that in general are not very CPU intensive.

            • Anonymous Coward
            • 2 years ago

            I hereby predict that the next consoles swap up to 8 cores of Zen, at modest clock speeds. Quote me on that some day when I’m famous.

            The low CPU demands have worked out pretty well with the under-delivery of CPU cores on the Intel side of things.

            Actually back to my first line, it seems likely that the general stagnation in CPU power is going to provide an opportunity for consoles to essentially catch up when they do their next refresh in a few years. I’m not seeing a lot of reasons to exceed 8 cores any time soon. Will be interesting to see what Intel can do to justify their prices in the coming years.

            • DeadOfKnight
            • 2 years ago

            I agree the consoles will catch up. Consoles have always been about being “good enough” for the best price possible. They are high in value in comparison to PCs for gaming when they first launch, and with progression slowing down, they are going to reach a point here soon when that same value extends to a lot longer than just when they first release. Of course you’ll still be able to spend $3000 for a much more powerful gaming PC with massive chips and power and heat and all that, but it will be more for bragging rights than anything when the console games look great and run at 4k/60fps.

            • Anonymous Coward
            • 2 years ago

            Interesting though that MS and Sony would like to shorten the console refresh cycle at the same time as it should be possible to do interesting things for a longer time.

            It is a bit of a shame that their “4k” refresh didn’t allow anything new on the CPU side. Seems to me that they could have a smooth progression of backwards compatible consoles, games would just specify at which version they would be able to function. Not even a tiny bit more CPU? 🙁

            (Also I am bothered how similar their hardware is, what cowards!)

            • ET3D
            • 2 years ago

            Yes, it’s a good way to get on the trend, bring new things to the market sooner, and stay within a budget.

            I’m sure that in the long run we’d get better CPU’s, I just don’t think it makes sense to hold off advancing consoles. I think that a half generation device with more GPU power is better than not doing a thing.

    • meerkt
    • 2 years ago

    [quote<]the Raven Ridge SoC package itself is only 1.38 mm thick, down from 1.82 mm[/quote<] The age of two decimal places phone thickness advertising is upon us. Rejoice!

    • juzz86
    • 2 years ago

    Bit cheeky to pinch the Sandy Bridge designators, but I’ll allow it!

    Interested to see how it real-world reviews.

    • MileageMayVary
    • 2 years ago

    I will buy two when the desktop versions come out.

    /taps foot.

      • Voldenuit
      • 2 years ago

      If you’re building a GPU-less desktop, wouldn’t a NUC-like SKU with a soldered-on Ryzen 2×00 APU on a tiny custom mobo make more sense than an mITX/uATX/ATX box?

      If you’re thinking mITX/uATS/ATX and adding a GPU, why bother with an APU? Might as well go Ryzen (moar cores) or Coffee (moar cores, moar IPC).

        • derFunkenstein
        • 2 years ago

        This might work for an in-home server in a way a NUC can’t, but otherwise I think you’re spot-on.

          • Redocbew
          • 2 years ago

          I’ve been using a NUC for an in-home server for a while now, but yeah. This would be better.

        • mikato
        • 2 years ago

        This is what I want. They better put out the NUCs with Ryzen APUs!

    • swaaye
    • 2 years ago

    I don’t trust 15W chips. Intel’s 15W chips throttle after a fairly short burst of exciting speed, especially if you try to play games where the CPU cores can bounce down to 800 MHz so the GPU can maintain some level of adequate speed.

    Looks like they want in on that premium notebook market though, which wants 15W APUs. Makes sense.

      • Anonymous Coward
      • 2 years ago

      Clocks approaching 4 GHz plus a decent GPU in 15W, it does sound like a bit of a scam. This needs some good testing…

      • Gadoran
      • 2 years ago

      In fact this AMD piece of silicon is 25W. So say the bench suite results they showed.
      After all, we know both Zen and Kaby very well by now.

      • tipoo
      • 2 years ago

      That’s why what Acer did was interesting – the chassis was built for 25W, so presumably these chips have a fair bit of performance potential if they’re cooled well.

    • UberGerbil
    • 2 years ago

    [quote<]Ryzen CPU with Radeon Vega Graphics[/quote<]Darn it. I was expecting a further extension of the Ryzen neologism: Zenga. (RyVeg? Ryga? ZenVe?)

      • dpaus
      • 2 years ago

      I’d buy a huge, powerful AMD chip named V’ger. It’s good at keeping creatures with names like ‘Krogoth’ in line.

        • spiketheaardvark
        • 2 years ago

        Yeah but how would you keep it from digitizing the earth?

      • derFunkenstein
      • 2 years ago

      I’ve played Zenga. I’m really bad about pulling out the middle piece and putting it at the top of the tower, which makes for fewer plays for everyone else.

    • AnotherReader
    • 2 years ago

    According to [url=http://www.anandtech.com/show/11964/ryzen-mobile-is-launched-amd-apus-for-laptops-with-vega-and-updated-zen<]AnandTech[/url<], Raven Ridge is almost as large as regular Ryzen: 210 mm[super<]2[/super<] vs 213 mm[super<]2[/super<].

      • derFunkenstein
      • 2 years ago

      Given how ambitious the graphics appear to be, I guess that’s not too bad. I was going to suggest they get out a tiny chainsaw and cut it in half again for low-end builds, but AMD itself killed that when it convinced Intel to up the core counts across the board.

      • UberGerbil
      • 2 years ago

      Yeah, and I expect that’s the answer to this article’s question [quote<]That's the smallest L3 cache we've seen on a Zen CPU so far, down even from the 8MB in the Ryzen 5 1400. Each Zen CCX can have as much as 8MB of L3, so it'll be interesting to learn whether this choice was made for power savings, market segmentation, or some other reason.[/quote<] (c) "Some other reason" -- floorplan and die area.

      • chuckula
      • 2 years ago

      That’s approximately 40% larger than a 6-core Coffee Lake part.

      It’s obviously that big because of the Vega graphics.

        • Anonymous Coward
        • 2 years ago

        Vega is supposed to be good for compute, I wonder if all those transistors will very often be asked to do anything other than graphics.

          • Krogoth
          • 2 years ago

          Vega is definitely a monster at general compute stuff but I doubt the iGPU will be used for general compute in a portable platform due to power/thermal concerns.

            • NTMBK
            • 2 years ago

            It might get some use in, say, a Macbook Pro running Adobe Premiere, or Photoshop.

            • Anonymous Coward
            • 2 years ago

            Hmm, I’m seeing a lot of reasons for Apple to luv them some AMD here.

            Edit: spelling

      • Alexko
      • 2 years ago

      Not surprising, since the 11 NCUs alone are about as large as the 4-core CCX.

    • cpucrust
    • 2 years ago

    Hopefully, the OEMs don’t saddle the displays with low-quality IPS or TN screens, as has typically been done to AMD-based mobile solutions in the past.

    I wonder if FreeSync would be included?

      • Mr Bill
      • 2 years ago

      Yes…
      [quote<]AMD made a secondary point alongside this data: the Ryzen Mobile Vega 10 and Vega 8 graphics will support FreeSync 2 panels, especially in that 30-60 Hz range. [/quote<] From the [url=https://www.anandtech.com/show/11964/ryzen-mobile-is-launched-amd-apus-for-laptops-with-vega-and-updated-zen/2<]Anandtech review[/url<]

      • Mr Bill
      • 2 years ago

      However, none of the three design wins mentioned FreeSync panels.

      • spiketheaardvark
      • 2 years ago

    No, these days they gimp systems with crap cooling and single-channel memory.

    FreeSync is cheap and lets them put a sticker on the thing. I’m surprised none of the launch systems have it.

        • Voldenuit
        • 2 years ago

        [quote<]FreeSync is cheap and lets them put a sticker on the thing. I'm surprised none of the launch systems have it.[/quote<] OEM thought process: FreeSync is for gamers. We can't sell anything to gamers unless we add a dozen RGB LEDs on it first!

          • cpucrust
          • 2 years ago

          Additionally, it must have a Killer NIC, flashy stickers plastered all over it, and must not forget: aliens and/or dragons.

            • UberGerbil
            • 2 years ago

            … or a bicep-hawk.

            • Redocbew
            • 2 years ago

            And flat heatsinks with even more RGBs. Lots and lots of RGBs.

            • Waco
            • 2 years ago

            I’m saddened that this is reality.

            • spiketheaardvark
            • 2 years ago

            I’m waiting to move my system to RGB until they have HDR RGB.

            • K-L-Waster
            • 2 years ago

            8K wide aspect ratio 14 bit HDR AMOLED RGB… on every surface.

            • LostCat
            • 2 years ago

            I like dragons. And would like Killer NICs if mine had worked right.

    • rpjkw11
    • 2 years ago

    More and more I find a need for a convertible, so I’m really intrigued with mention of an HP Envy X360 sporting a Ryzen CPU/APU! I want power, speed, and as long a battery life as possible (though the battery is the least important of the three) and I think/hope AMD will deliver.

    • ronch
    • 2 years ago

    The great thing about wanting to play old games like King’s Quest is even a boring APU like the A8-7600 will do really well and you feel good knowing your APU isn’t overkill for what you do (the 1800X is really nice but I just can’t imagine using it to play old games). It’s just fine. Given my use case these days I just might end up with Raven Ridge or a later iteration one day.

    • ET3D
    • 2 years ago

    PCWorld quotes 9W as the minimum TDP, not 12W.

      • Jeff Kampman
      • 2 years ago

      Maybe they should look at AMD’s specs, then? I’m double-checking.

      EDIT: Official materials definitely say 12W, so…

        • ET3D
        • 2 years ago

        I’ve seen that image with the 9W on several sites by now (TechSpot, Guru3D, …). It seems to be part of a slide set. I do see 12W on AMD’s site.

        Edit: Looks like Legit Reviews has the same slide with 12W in it. That makes it even more curious. Was the slide set sent with different values to different sites? Was this an error which was later fixed?

          • Redocbew
          • 2 years ago

          Remember their “advertorial” which had a 60 FPS line across its graph that magically adjusted itself once people noticed something was off? It wouldn’t be the first time they’ve goofed like that.

          • willmore
          • 2 years ago

          Maybe there was a third member of the family that got cut from the launch late in the game?

    • ronch
    • 2 years ago

    Me: “Hi. I wanna buy a Ryzen CPU with Radeon Vega Graphics.”

    Guy at the store: <Takes two boxes off the shelves, a Ryzen 7 1700X and a Vega 64>

      • Redocbew
      • 2 years ago

      That would require that they have a Vega64 on the shelf.

        • derFunkenstein
        • 2 years ago

        +1 for the apparently-sincere belief that a retail worker wouldn’t just try to convince you to go Intel for the higher sales price.

          • Voldenuit
          • 2 years ago

          Nah, the Vega is already +$100 inflated from MSRP.

            • JustAnEngineer
            • 2 years ago

            $20.
            [url<]https://techreport.com/forums/viewtopic.php?f=3&t=119967&start=210[/url<]

    • ET3D
    • 2 years ago

    Looks reasonable. I hope it has a lot of success and ends up in a Dell XPS 13 (or something similar) to upgrade my current one.

    • Unknown-Error
    • 2 years ago

    All that from a 15W APU? Is AMD messing with us? Eagerly waiting for TR review. Jeff, I hope you can get a hold of a notebook with Dual-channel memory to push that GPU hard.

      • ronch
      • 2 years ago

      I expect TDP to be around 35w or so, maybe 45w.

        • Gadoran
        • 2 years ago

        No. They adopt the same approach as Bristol Ridge: they lean on the Tskin of the laptop. The disadvantage is that the SKU must be connected with the plastic case, and this solution is hated by OEMs. The advantage is that under a benchmark suite the SKU looks better for a short period of time. Unfortunately, a medium/long session of gaming or real work means a drop in frequencies and performance in general. This was the real reason for Bristol Ridge’s failure.
        Intel OEMs sometimes utilize this solution for the Core M line.

          • Spunjji
          • 2 years ago

          Bristol Ridge failed because it was an intermediary gap-filler product based on awful CPU cores and outdated graphics.

    • Krogoth
    • 2 years ago

    Power efficiency of the platform and price are going to decide the success of Ryzen Mobile, not raw CPU and iGPU performance.

      • dpaus
      • 2 years ago

      Agreed, but as we saw with their previous-generation mobile chips, even a deeply discounted price won’t get you anywhere if the CPU and iGPU performance aren’t at least ‘in the ballpark’. With this architecture, AMD is comfortably standing right on the pitcher’s mound. And they have already demonstrated that they can offer highly-comparable products at a very attractive discount vis-a-vis Intel.

      Now, as others have pointed out, if they can just keep the OEMs from putting their products into cheap, crippled econoboxes.

    • rudimentary_lathe
    • 2 years ago

    Looks like a pretty competitive product. Let’s just hope they price it lower than Intel’s offerings like they did on the desktop side.

    I was hoping for a little more integrated GPU power, but realistically there’s only so much they can do with the power and cooling constraints of a mobile device.

    • DavidC1
    • 2 years ago

    The CPU and GPU seem to be great. It’s basically Skull Canyon for laptops. The power management still seems to be lacking, and a few generations behind.

    I reckon it’s because extra R&D budget pays off in things like power management, where a system-wide approach is needed to improve it, and that’s the advantage Intel has. Haswell was the era where Intel made titanic advances in power management, with everything before and after it a mere generational improvement.

    Traditionally, this has been the reason why AMD even in their glory days never did well in mobile.

      • swaaye
      • 2 years ago

      Yeah, I’m looking forward to testing done by TR to see how the power management performs with mixtures of load across GPU and CPU, and over time to see how the clock speeds ramp up and down. Especially in this 15W segment.

    • Voldenuit
    • 2 years ago

    [quote<]Power-saving schemes , but this implementation seems to be the most integrated and refined that AMD has shipped so far.[/quote<] Jeff, I think you accidentally the whole thing.

    • DancinJack
    • 2 years ago

    Uggghhhhhhh internal AMD numbers. Makes me want to puke. Just give people chips and let them test. If your product is good enough, people will buy it.

      • shank15217
      • 2 years ago

      Good idea, but where will you put these chips? Hope you have a commercial soldering machine, an EE degree, and a whole lot of documentation.

    • drfish
    • 2 years ago

    WRT to the Ideapad 720S, if you can’t throw your own RAM in there to go dual-channel/2400, it’s a complete and utter failure of a product. Which would be a real shame, because it’s so promising otherwise.

      • chuckula
      • 2 years ago

      Part of the issue with many mobile systems that use AMD processors is that they are targeting the “OMG CHEAP!” end of the market. People tend to forget that when you go “OMG CHEAP!” then [b<][i<]everything[/i<][/b<] ends up cheap. The issue is that dropping an AMD APU into an expensive notebook that was otherwise using an Intel chip doesn't turn a $1,200 notebook into a $600 notebook by magic.

        • drfish
        • 2 years ago

        I’m not forgetting, I’m just ticked off. 😉

      • Veerappan
      • 2 years ago

      No kidding. I’m in the market for a new 13-14″ laptop, and I have been hoping that Raven Ridge would earn my $$. This Ideapad 720s looked like the perfect candidate, until the single-channel ram was announced.

    • lycium
    • 2 years ago

    Acer Swift 3 looks awesome, great job AMD! Do want.

    • NTMBK
    • 2 years ago

    This is going to make for a great Macbook.

      • chuckula
      • 2 years ago

      [s<]ARM[/s<] [b<]AMD[/b<] in Macbooks....... CONFIRMED!

      • maxxcool
      • 2 years ago

      Nah, Apple is more interested in getting fully ARMED so as to reduce its dependence on third parties.

      ARM is so close to performance-per-watt parity with recompiled code that they would not want to spend the money and ink a long-term deal with AMD.

    • derFunkenstein
    • 2 years ago

    That kinda trick with the i7-8550U being in an ultraportable where the Ryzen APUs were not is the kind of thing that really just grinds my goats every time. I get it, it’s marketing, but it makes the marketing completely irrelevant and unreliable.

      • chuckula
      • 2 years ago

      I believe AMD’s internal smoke-and-mirror results just as much as I believe Intel and Nvidia when they do the same thing!

        • derFunkenstein
        • 2 years ago

        At least when Nvidia does something weird (like relative x faster numbers), they’re only comparing their own products. 😉

        [url<]https://techreport.com/news/32744/nvidia-makes-the-geforce-gtx-1070-ti-official[/url<]

      • psuedonymous
      • 2 years ago

      Yep, it seems like AMD is consistently unable to just do a straight like-for-like test in their marketing materials, and does something on the edge of dodginess to try and tilt things in their favour that just ends up souring their numbers (e.g. [url=https://techreport.com/review/28912/tiny-radeon-r9-nano-to-pack-a-wallop-at-650#perf<]posting GPU results with Anisotropic Filtering disabled[/url<]).

      • jensend
      • 2 years ago

      [quote<]where the Ryzen APUs were not[/quote<]And how exactly do you know this? There was no indication of what the chassis or thermal solution for the APU was. It's conceivable that they only graphed the 8550U ultrabook results and not the 8250U full-fat results because they thought the benchmarked Raven Ridge thermal solution was comparable to the ultrabook and not to the other. In particular, given that [url=https://www.notebookcheck.net/Acer-Swift-3-SF315-8250U-MX150-FHD-Laptop-Review.258594.0.html<]Notebookcheck's 8250U Swift 3[/url<] pulled 55W, it's conceivable that it uses a TDP-up 25W processor configuration and that AMD preferred an apples-to-apples 15W TDP comparison. Or it could be dumb marketing. But I think the pitchforks and torches are premature at this stage.

        • derFunkenstein
        • 2 years ago

        They copped to it by omission. If they were comparing like-for-like 13″ ultraportables, you darn well better believe they’d have been touting it.

          • Waco
          • 2 years ago

          Yeah. AMD marketing is AMD’s worst enemy. It ruins any sense of honesty they might have had and puts people on the defensive in terms of product performance.

        • Spunjji
        • 2 years ago

        That laptop also has an nVidia GPU in it rated at around 25W, which makes the 55W figure a little less astonishing after you account for display power and conversion losses. The review also notes that the CPU tends to throttle a little.

    • maxxcool
    • 2 years ago

    I expect the CPUs to take a 3-5% hit in some operations, and a little more in bandwidth-intensive operations (5-8%).

    That being said, a four-core design WITH AMD’s version of hyperthreading WITH AMD’s compute prowess “should” prove extremely worthwhile AS LONG AS OEMS DON’T GIMP IT WITH CRAPPY RAM AND SINGLE-CHANNEL SETUPS.

    I see fully integrated iGPU/CPU dominance, performance-wise, shifting to AMD for a year or so… and for an Intel guy, that’s a lot for me to say.

    Edit: added caps because Thors’day ..

      • enixenigma
      • 2 years ago

      [quote<]...AS LONG AS OEMS DON'T GIMP IT WITH CRAPPY RAM AND SINGLE CHANNEL SETUPS. [/quote<] Already happening with one of the initial designs. It's like OEMs can't help themselves...

        • maxxcool
        • 2 years ago

        sigh … /facepalm/

        • spiketheaardvark
        • 2 years ago

          They need some sort of rebate program for putting in dual-channel memory, or OEMs are going to kill Ryzen mobile with crap setups.

          • maxxcool
          • 2 years ago

          Right!?

          • IGTrading
          • 2 years ago

          Or buyers should be educated that any OEM offering dual-core CPUs in single-channel-only laptops (that don’t allow a second SO-DIMM to be added for dual-channel) is mocking its clients and basically stealing their money.

          The OEM can hope to save 0.5% by such a move, while the performance the buyer gets tumbles by anywhere between 5% and 40%.

          Basically, the OEMs that do this say: “We’re going to save 0.5% of our costs, and we don’t give a flying fick if our clients lose 40% of the performance.”
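
          To put rough numbers on that (purely theoretical peak bandwidth, nothing measured, and the little helper below is just a hypothetical sketch): a 64-bit DDR4 channel moves 8 bytes per transfer, so DDR4-2400 tops out around 19.2 GB/s single-channel versus 38.4 GB/s dual-channel, and an IGP sharing that bus with the CPU cores feels the difference directly.

          # Back-of-the-envelope peak DRAM bandwidth (theoretical ceilings, not benchmark data)
          def peak_bandwidth_gb_s(transfer_rate_mt_s, channels, bus_width_bits=64):
              # bytes/s = transfers/s * bytes per transfer per channel * number of channels
              return transfer_rate_mt_s * 1e6 * (bus_width_bits // 8) * channels / 1e9

          print(peak_bandwidth_gb_s(2400, 1))  # 19.2 GB/s, single-channel DDR4-2400
          print(peak_bandwidth_gb_s(2400, 2))  # 38.4 GB/s, dual-channel DDR4-2400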

        • Chrispy_
        • 2 years ago

        I swear that Intel must be paying OEMs to only offer AMD in sabotaged configurations.

        TN/eMMC/single-channel

          • IGTrading
          • 2 years ago

          Oh my, why would you say such a thing?! 🙂

          [url<]https://www.youtube.com/watch?v=osSMJRyxG0k[/url<]

            • egon
            • 2 years ago

            More of that, please.

          • torquer
          • 2 years ago

          They must also be paying them to make the same crap “sabotaged” configurations with Intel chips? Look at all the $300 15.4″ laptops out there. They are almost all BS SoCs. Makes for a cheap laptop for folks who don’t know any better (and honestly may not need more).

        • DPete27
        • 2 years ago

        Are we sure that it will be limited to single channel, or just that they only fill one slot from the factory, allowing the consumer to add a second?

          • derFunkenstein
          • 2 years ago

          It says “single channel” on the Lenovo image. Hard to interpret that as anything but.

          What’s with the other two? “Dual channel (up to 8GB)” seems kind of weak to me. Even my “old” Kaby Lake Inspiron 7000 13″ 2-in-1 supports 16GB in a pair of SO-DIMMs.

            • DPete27
            • 2 years ago

            Single channel = 1 stick of RAM. That doesn’t mean there isn’t a 2nd open slot inside.

            • derFunkenstein
            • 2 years ago

            Well you’re welcome to remain optimistic and I’d be happy to be wrong, but that would be seriously damaging to the marketing for the device if the first thing AMD does is say it only has single-channel RAM without also specifying that you can change that.

            • JustAnEngineer
            • 2 years ago

            My Zenbook has 10 GiB of memory: an 8 GiB SODIMM for one channel and 2 GiB soldered to the mainboard for the other channel.

            • derFunkenstein
            • 2 years ago

            Good for you. Like I said to DPete27, you’re welcome to be optimistic. It’s not in the spec.

            • JustAnEngineer
            • 2 years ago

            I was merely providing an example of how notebook manufacturers make weird design decisions, even for higher priced systems.

        • yuhong
        • 2 years ago

        The fun thing is that DDR5 will go with 80-bit instead of 72-bit ECC, allowing x16 chips to be used. With 8Gbit DDR4, you would have to use x16 chips or go with single channel with 8GB of RAM.

      • jihadjoe
      • 2 years ago

      To be fair, I think the OEMs do that so when the customer chooses to upgrade RAM there’s a free channel and he doesn’t have to throw out the old SODIMMs.

    • chuckula
    • 2 years ago

    Looks good but in the world of notebooks you really don’t benchmark a piece of silicon in isolation, you benchmark a full system. We’ll see how that works out as TR gets review units from different manufacturers.

      • Magic Hate Ball
      • 2 years ago

      I’m also curious how the non-ultrabook, higher-TDP chips will perform.

      35-65 watts when plugged into the wall for gaming with a bit thicker chassis for cooling?

    • srg86
    • 2 years ago

    I know this won’t be a popular opinion but meh.

    Would still need to be paired with an Intel PCH to make me interested. Intel iGPU does all I need and AMD are still on my avoid list.

      • RAGEPRO
      • 2 years ago

      Why is that, out of curiosity?

        • srg86
        • 2 years ago

        Because I’ve had a much smoother experience with Intel chipsets (and memory controllers), rather than quirky AMD ones. It turned me from an AMD guy into an Intel guy.

        In a nutshell, better build quality of Intel products and those that use them.

        I fully admit this is anecdotal.

          • RAGEPRO
          • 2 years ago

          Gave you an upvote for sincerity even though I think you might oughta give AMD another shot. 🙂 Don’t get me wrong, they’ve surely had issues in the past with half-baked chipsets and poorly-performing memory controllers, but Ryzen is a new generation. I’ve spent a fair amount of time fooling around with a friend’s Ryzen 7 1700X system and it has impressed me with how fast and solid it is, even without an overclock.

          Also anecdotal, of course, but I in general don’t approve of the “I had a bad experience, so I’m writing off this brand” mindset. I had [url=https://techreport.com/forums/viewtopic.php?f=37&t=119717<]a HORRIBLE experience[/url<] with a Samsung display, and I'd still consider giving them another shot (particularly given that the display, when it was working, was quite fantastic.)

            • cmrcmk
            • 2 years ago

            [quote<]Gave you an upvote for sincerity[/quote<] Agreed. It's so nice to read a dissenting opinion that is presented with objectivity and a lack of vitriol.

            • srg86
            • 2 years ago

            It would basically take another NetBurst to make me move back, or a genuine reason to use an AMD product because it fits a purpose that no Intel chip does; until then, Intel are my go-to brand.

            Now for graphics, I’d rather use AMD than nVidia because of their Linux drivers.

            I’m just not willing to put up with the hassle like I would have done in earlier years.

          • DPete27
          • 2 years ago

          Even though I don’t see the better Vega IGP as a major selling point, the fact that you get AMD graphics drivers instead of Intel is HUGE. As long as the CPU and power consumption side of things is comparable to Intel, the graphics drivers would be enough for me to go with AMD.

            • srg86
            • 2 years ago

            But Intel’s graphics drivers do everything I need; I have no need for anything better. They are also great in Linux.

            Again, meh

            Now if I need a discrete card, then yeah, I’ll go AMD.

            Maybe I should clarify, I don’t play games and the GPU other than desktop effects is simply a graphics adapter.

            Anyway, that’s my 2 cents, I don’t want to hijack this any further.

            • Spunjji
            • 2 years ago

            Just FYI you got an upvote from me on this, though I would stress that gaming isn’t the only problem where Intel drivers have issues. Connecting up to an HDTV can range from being irritating to impossible, depending on the settings available on the TV itself. Never had such issues with AMD drivers (Nvidia on the other hand… yeah, no.)

      • Magic Hate Ball
      • 2 years ago

      That would be pretty pointless considering how much functionality is on the SoC itself for Ryzen/Ryzen mobile.

        • srg86
        • 2 years ago

        Which is why, for me, it’s meh.

          • w76
          • 2 years ago

          Actually it should be a plus, the SoC I mean. At least in my case, all my past problems with “AMD” chipsets were actually crappy 3rd party parts and drivers on the motherboard, not actual AMD hardware.

            • srg86
            • 2 years ago

            For me it was actual AMD hardware. The third party parts actually worked better.

      • ET3D
      • 2 years ago

      What AMD-based laptops did you have trouble with?

        • srg86
        • 2 years ago

        Not laptops, desktop hardware.

          • ET3D
          • 2 years ago

          Doesn’t seem to me like a very good reason to avoid this product. I’m not sure when you had that problem or what particular issues you had with it, but I’m sure that this SoC is a lot different than the product you had problems with.

            • srg86
            • 2 years ago

            I have no reason to switch, I’m happy where I am, that’s my main reason. There’s no compelling push factor.

    • Waco
    • 2 years ago

    Want want want.

      • MOSFET
      • 2 years ago

      Helluvaninteresting APU for NUC-alikes.

        • Waco
        • 2 years ago

        And a cheap gaming laptop on the go!

      • ViuM
      • 2 years ago

      Already have it.

      Vega works wonders with graphics. GIMP 2.9.6 is a breeze.

      Applications that leverage multiple cores work wonders. Unfortunately, VMware Workstation cuts performance down by 1/5th, mainly due to its outdated graphics driver, so alternate virtualization is required.

      HP Envy x360: Ryzen 5 2500U with Vega, 8GB RAM, 1TB HDD

      Cinebench R15: single-core: 142 cb
      multi-core: 633 cb
      OpenGL 3.3: 52 fps

      It’s advisable to replace the HDD with an SSD for best results.

      Battery lasts around 8 hours, depending on performance mode and screen brightness.

      Screen isn’t as bright as it could be.

      4K performance is awesome, blows Intel away.

      If Lenovo/HP/Acer optimized it with 32GB of RAM, a 1TB U.2 drive, and a 4K display, Alienware laptops would not compete. All under $1K.

        • Waco
        • 2 years ago

        If this were available in the 11″ model (which I have with an Atom/Pentium in it), I would have one delivered immediately.
