Intel debuts embedded Skylake-R CPUs with Iris Pro graphics

Intel is set to launch its 65W Skylake R-series CPUs this quarter in the form of the Core i5-6585R, the Core i5-6685R, and the Core i7-6785R. These chips are all quad-core CPUs paired with Iris Pro graphics, meaning each gets three Skylake Gen9 GPU slices. Each slice offers 24 execution units (EUs), for a total of 72 EUs. That setup should make for a solid entry-level graphics solution. For enthusiasts, though, the highlight of these CPUs may be the 128MB of eDRAM that Iris Pro brings with it. Intel's ARK indicates that these CPUs include improved Turbo Boost speeds and an L3 cache that matches that of the Skylake K-series chips, an improvement over the Broadwell R-series.

Congatec's TS170, an embedded system that uses Intel's Xeon E3-1515M v5 CPU with a similar Iris Pro P580 IGP

While the R-series Skylake chips are officially classified as desktop CPUs, they won't be socketable parts like the Core i7-5775C. That's in keeping with Intel's previous guidance that it won't be releasing socketable Skylake CPUs with Iris graphics. Since these aren't socketable chips, they're closer to mobile or embedded processors that use a ball-grid-array (BGA) motherboard connection. But even mobile and embedded isn't the right fit for the Skylake R-series CPUs, as their 65W TDP is a pretty hefty hurdle for mobile or embedded cooling solutions to clear. That leaves me wondering what products these CPUs are destined for.

From the thousand-foot level, low-TDP CPUs are suited to mobile computing and high-TDP CPUs to desktop PCs. Peering through the clouds at Skylake, we know that thin laptops are typically built around Intel's 15W Core i chips or even more power-sipping Core M chips, while thicker gaming or workstation notebooks are usually designed around Intel's 45W CPUs. I've compiled this table of Intel ARK data on a wide range of CPUs with Iris graphics so that we can discuss what a 65W Skylake R-series system might look like.

                      Core i7-6650U   Core i7-6567U   Core i7-6770HQ   Core i5-6585R   Core i5-6685R   Core i7-6785R   Core i7-5775C
Cores                 2               2               4                4               4               4               4
Hyper-Threading       Yes             Yes             Yes              No              No              Yes             Yes
Base Clock (GHz)      2.2             3.3             2.6              2.8             3.2             3.3             3.3
Turbo Clock (GHz)     3.4             3.6             3.5              3.6             3.8             3.9             3.7
L3 Cache (MB)         4               4               6                6               6               8               6
GPU                   Iris 540        Iris 550        Iris Pro 580     Iris Pro 580    Iris Pro 580    Iris Pro 580    Iris Pro 6200
GPU EUs               48              48              72               72              72              72              48
GPU Base Clock (MHz)  300             300             350              350             350             350             300
GPU Max Clock (MHz)   1050            1100            950              1100            1150            1150            1150
eDRAM (MB)            64              64              128              128             128             128             128
TDP                   15W             28W             45W              65W             65W             65W             65W
Price                 $415            N/A             $434             $255            $288            $370            $377
Found in              Surface Pro 4   Vaio Z flip     Skull Canyon     N/A             N/A             N/A             Desktop PCs

Skylake R-series pricing: Anandtech

At the low end of the TDP scale, we find products such as the Core i7-6650U-powered Microsoft Surface Pro 4 and the Core i7-6560U-powered Dell XPS 13. Those 15W CPUs are a far cry from the 65W R-series. A real concern, as evidenced by posts on Reddit and Ultrabook Review, is that these devices are known to throttle due to insufficient cooling or aggressive thermal limits. That's because modern CPUs with integrated graphics have to balance a single thermal budget between the CPU cores and the GPU. Those aggressive thermal constraints can sometimes be overcome with software such as TechPowerUp's ThrottleStop, though. My hope is that manufacturers building around 65W CPUs will be aware of the chips' thermal demands and provide cooling sufficient to keep the R-series chips' performance from being hampered.

The 28W Core i7-6567U is available in the likes of the Vaio Z flip. This CPU could easily have been omitted were it not for the fact that it supports the same number of threads as, and is clocked competitively with, the i5-6585R and i5-6685R. The i7-6567U's 3.3 GHz base and 3.6 GHz Turbo clocks compare well with the i5-6685R's 3.2 GHz base and 3.8 GHz Turbo speeds. The caveats are that the i7-6567U has less L3 cache (4MB versus 6MB), a smaller 64MB eDRAM cache, and relies on Hyper-Threading for its four threads, compared to the four physical cores of the Core i5-6585R and Core i5-6685R.

Moving into the 45W TDP range, we have NUCs like Intel's Skull Canyon, powered by the Core i7-6770HQ. As the i7-6770HQ is the Iris variant of the i7-6700HQ, we can also find this level of CPU performance in laptops like Dell's XPS 15. Those machines seem capable of entry-level gaming on their own, but they can't replace a midrange or high-end discrete GPU. Because of the thermal demands of a 65W chip, it's unlikely we'll see a Skylake R-series chip in this product segment.

The most familiar 65W Iris CPU to our readers may be Intel's Core i7-5775C, which matches up well with the i7-6785R in Intel's ARK specifications. In our i7-6700K review, the i7-5775C was a surprise contender, delivering impressive gaming benchmarks across the board thanks to its 128MB of eDRAM. It's easy to imagine the i7-6785R delivering better frame-time performance than a Core i7-6700K, given its similar CPU architecture, clock speeds, and L3 cache, plus that eDRAM (not that you'd use the IGP for demanding games). To be clear, eDRAM isn't the be-all, end-all of performance in my eyes, but it makes a strong case for the Skylake R-series CPUs as good foundations for a gaming platform.

Where does this leave the Skylake R-series CPUs? There's always the possibility of a high-end mobile workstation powered by these chips. For example, an HP ZBook or Dell Precision mobile workstation would seem like a good home for one of these CPUs. Another possibility is a larger NUC-like system from Intel, Gigabyte, or Zotac that can handle the relatively high TDP of these chips. Gamers choosing one of these solutions could add a discrete GPU by taking advantage of an external GPU solution over Thunderbolt 3. Until such a system emerges, though, it's hard to judge what these CPUs might mean for the enthusiast.

Comments closed
    • Chrispy_
    • 3 years ago

    [quote<]Intel debuts embedded Skylake-R CPUs with Iris Pro graphics But where will they end up?[/quote<] In a NUC that [url=https://techreport.com/forums/viewtopic.php?f=22&t=117938<]nobody can argue a good use for[/url<], apparently 😛

    • tipoo
    • 3 years ago

    Any word on Skylake eDRAM bandwidth? Any change?

      • Raymond Page
      • 3 years ago

      I haven't seen anything. There are 64MB and 128MB eDRAM packages, which will have different performance. Last I recall, the 128MB version was 50GB/s. I believe the eDRAM is still a separate die on the same package, so it's possibly identical to Haswell's.

        • tipoo
        • 3 years ago

        Yeah, I think as some speculated when it came out with Haswell, they’re staying on an n-1 process for the eDRAM. Good way to make use of old fab capacity I suppose. And now they’re slapping eDRAM on more models with the 64MB Iris, so it makes sense.

        Haswell's was 50GB/s both ways IIRC, so by the regular RAM way of counting things, about 100GB/s? Or am I confused? Intel did claim it was equivalent to a regular GPU bus at around 100GB/s, though.

          • Andrew Lauritzen
          • 3 years ago

          Yes HSW is ~50GB/s each way so 100GB/s aggregate read/write.

          IIRC SKL’s EDRAM is slightly faster but in neither case is the EDRAM bandwidth typically a relevant bottleneck. Remember it’s just one level in a much smoother cache hierarchy vs. GPUs that have quite small on-chip caches and then everything else falls to GDDR.

    • tipoo
    • 3 years ago

    Pretty curious for tests of the 72EU part. The 40EU Skylake one is decent-ish for what it is, nearly doubling the EUs and all the other architecture tweaks should make for a pretty good part.

    Then again, the cost of it is pretty high too, compared to APUs. But they’re also doing what I always wanted with Iris parts with 64MB eDRAM for cheaper.

    • vargis14
    • 3 years ago

    If I had the money even with kaby lake coming I would upgrade my 2600k platform with this i7-6785R CPU….I’m sure a z170 can get some extra clock speed on top of it all….bet it would last another 5+ years with ease…and my secondary rig would live on for a few more…the 2600k that is:)

    • snowMAN
    • 3 years ago

    That would make for a great Mac Mini or even small iMac.

    • Chrispy_
    • 3 years ago

    I’d love to see an i5 6585R against an AMD A10 and even some low-end GPUs.

    Skylake quad-core at up to 3.6GHz with 72 EUs running on DDR4 bandwidth at only $250?

      • derFunkenstein
      • 3 years ago

      Yeah it’s kind of a shame there won’t be a socketed version available. I have a feeling it’ll run circles around an A10-7850K paired with an R5 240, and it’d probably keep pace with DDR3-variant R7 250s.

        • Andrew Lauritzen
        • 3 years ago

        Broadwell 48EUs already runs circles around all of the A10s and handily beats the R7 240…
        [url<]http://www.anandtech.com/show/9320/intel-broadwell-review-i7-5775c-i5-5675c/7[/url<] Pretty sure SKL 72EU won't have an issue. You guys need to stop massively underestimating these chips 😉

          • Raymond Page
          • 3 years ago

          Among iGPUs, I'd consider the Iris Pro king of the hill. With 128MB of eDRAM, it's got a ridiculously powerful advantage in bandwidth and latency compared to the other offerings.

          • tipoo
          • 3 years ago

          This is true, though it also *should*, for how much more it costs than them 😛

          [url<]http://ark.intel.com/products/88040/Intel-Core-i7-5775C-Processor-6M-Cache-up-to-3_70-GHz[/url<] vs [url<]http://www.newegg.ca/Product/Product.aspx?Item=N82E16819113280[/url<] The A10 not even being the IGP leader is a bit sad given that AMD bought out a GPU company and Intel traditionally does poorly here, but for over double the cost it damn well better be good.

            • Andrew Lauritzen
            • 3 years ago

            Sure, but remember that cost is set after performance is known, not the other way around. Thus it’s not really a terribly interesting observation that things are priced to fit into their relevant performance profiles… that’s by explicit design and market forces.

    • Klimax
    • 3 years ago

    Brix Pro and Brix Gaming. (Mini high-performance boxes)

    • Voldenuit
    • 3 years ago

    Is there anything to stop an OEM from selling a bare bones mitx or matx motherboard with one of these CPUs soldered on?

      • danazar
      • 3 years ago

      Well, standard coolers won't fit it (it's going to have a much lower vertical height off the board than a socketed CPU, which is what standard coolers are made for), which means having to sell a custom cooler for it. May as well build it as an integrated barebones instead: motherboard, heatsink/fan, case. So basically you'll get a NUC, or something from Shuttle or Zotac.

      • Bensam123
      • 3 years ago

      Why matx or mitx? What about just a normal motherboard with the processor already on board and mounted at the same height a normal 1151 would be at so you can use whatever cooler you want?

      Pretty sure this is what Intel wants now that I think about it. Make people think it’s their idea to ditch the socket.

      • Raymond Page
      • 3 years ago

      Technically, only reflowing the BGA solder.
      Practically, soldered CPUs on motherboards never took off as a product category, so the monetary incentive wasn't there. Nowadays, I see whole systems like Intel's Skull Canyon and the Gigabyte Brix as being the only chance of an OEM release as a desktop.

      • Hinton
      • 3 years ago

      My thought exactly. If it mimics Broadwell in gaming performance, just sell a standard ATX motherboard with the i7 soldered on.

      It's the top of the line, so it not being user-replaceable is irrelevant.

      Looking forward to seeing it benched against the 4GHz Skylake to confirm or debunk this.

    • ImSpartacus
    • 3 years ago

    Whoa, somebody did their homework. Good commentary backed up with several varied sources. Good stuff.

      • Raymond Page
      • 3 years ago

      I appreciate this, thank you.

      • DrCR
      • 3 years ago

      Indeed, properly good post, Raymond.

    • UnfriendlyFire
    • 3 years ago

    Socketed Skylake-R with dual cores, at a reasonable price and for both desktop and mobile, would be a scary day for Nvidia and AMD. Bye bye, low-to-midrange dedicated GPUs (e.g. GeForce 830M and below).

      • DrCR
      • 3 years ago

      Except non-geeks won’t know any better.

      • Deanjo
      • 3 years ago

      The same thing was said when AMD brought out their APUs yet low-midrange GPUs still flourish.

        • ImSpartacus
        • 3 years ago

        Yeah, it’s interesting how we still see crummy old <$100 gpus in oem machines.

        My estimate is that the kind of process traits ideal for a cpu and those ideal for a gpu are at odds (e.g. density vs clock speed) and it’s just fundamentally tough to get both on the same die without compromising.

        But then again, integrated graphics HAVE been artificially bandwidth starved by ddr3/4. That seems to be changing from several different angles (e.g. hbm, hmc, gddr5x, wider ddr4 implementations, etc).

        If the bandwidth problem gets fixed, then we could very well be looking at CPU+GPU combos that can honestly do 1080p gaming in the near future. I think that's the benchmark that needs to be satisfied before iGPUs can legitimize themselves.

      • BurntMyBacon
      • 3 years ago

      Or it will be business as usual. I predict that they will simply adjust their next generation product stack to start at dedicated GPUs that are one step up from the predicted IGP competition like they’ve done since IGPs first appeared.

      • mczak
      • 3 years ago

      FWIW even the GT3e Skylake (Iris 550 in 28W cpus such as the Core i7-6567U) pretty much already beats or at least draws even with a GeForce 930M/940M. Low end GPUs are imho sold due to their brand recognition mostly…
      (If you think about it this isn’t really surprising: Those low-end discrete chips are restricted to 64bit ddr3 memory, that is, only half the memory bandwidth of the IGP to begin with (assuming dual channel memory), _and_ on top of that the IGP has 64MB edram – not to mention the IGP is built on a much better 14nm FinFet process.)

        • UnfriendlyFire
        • 3 years ago

        Intel also prices its Iris Pro parts very high, which makes mid-range dedicated GPUs the "cheaper" option. It only gets ridiculous when the dedicated GPU is barely more powerful than Intel's standard IGP. There was a laptop that paired a quad-core i7 with a GT 610M that also had 2GB of DDR3 VRAM.

        I know that for the desktop side, a $400 desktop Iris Pro can be beaten out by a $100 CPU and a $300 GPU combo.

          • Andrew Lauritzen
          • 3 years ago

          It actually does invert entirely in some designs – i.e. they pair discrete GPUs that are *slower* than the basic GT2 integrated ones. It’s all marketing – people think “discrete is good, integrated is bad” hence why AMD tries to market their integrated stuff as “discrete class”. Unfortunately reality is only a very small part of selling this sort of thing.

    • Mr Bill
    • 3 years ago

    Office PC, competing against AMD A10-7970K (or even Zen?) builds.

    • cygnus1
    • 3 years ago

    [quote<] Gamers choosing one of these solutions could add a discrete GPU by taking advantage of an external GPU solution over Thunderbolt 3. Until such a system emerges, though, it's hard to judge what these CPUs might mean for the enthusiast. [/quote<] This is what I'm hoping for honestly. My next PC will be very small with a CPU like this and able to use an exGPU that I can share with a laptop with a similar, but lower TDP CPU/GPU.

      • Raymond Page
      • 3 years ago

      I've got a hunch that Apple is going to use an i7-6567U or i7-6770HQ in the new MacBook Pros so that they can dump the discrete GPU (dGPU) and save the space for extra battery or cooling.
      If that’s the case, I’m kinda in the same boat except I’ll forgo a gaming desktop and just use a gaming laptop.
      If we get some benchmarks of Skull Canyon with a Razer Core eGPU, we’ll know how potent the eDRAM is, because with an i7-6770HQ at 45W, I suspect it’ll beat the i7-6700K for gaming.

        • derFunkenstein
        • 3 years ago

        The entry level 15″ MBP already foregoes discrete graphics for Iris Pro. Pretty sure that’s what Jeff said he has.

        • tipoo
        • 3 years ago

        They already did, for the base model which I have at least. I imagine there would still be something to divide the base and high end models though, so I’m not sure they’ll remove dGPUs entirely.

        The base model, by the way, does not seem to fare any better on cooling. This thing hits 99°C under load.

      • Raymond Page
      • 3 years ago

      If you want a small PC with a CPU like this and the ability for an eGPU, I think Intel’s Skull Canyon NUC is what you want to purchase.

      For Thunderbolt 3 enabled laptops, check out this spreadsheet: [url<]https://docs.google.com/spreadsheets/d/12G1VTFWkTL5tb8nxUAtnDHwTLyya9I3Vw-OXXrIN4e4/edit#gid=0[/url<]

        • cygnus1
        • 3 years ago

        I was hoping for a quad core though, pretty much exactly like the Skylake-R’s here. But I’m sure someone will put out some benchmarks of the new NUCs with eGPU’s and if they’re fast enough, I’d consider one for sure.

          • Raymond Page
          • 3 years ago

          Skull Canyon's Core i7-6770HQ is a quad core with Hyper-Threading. It's just clocked lower than the R-series and has less TDP headroom for the CPU and GPU to turbo within for extended periods.
          Also, none of these are unlocked processors, so turbo headroom and your cooling solution dictate how performant they will be, not just cooling and added voltage.

            • cygnus1
            • 3 years ago

            That’s a good point. I had assumed all the NUCs were getting the lower TDP i7’s that were dual cores. Now I’m really looking forward to seeing what comes out as far as 3rd party cases/cooling for the Skull Canyon NUCs. Just need to see some eGPU benchmarks to make sure the experience is still pretty decent.

      • ImSpartacus
      • 3 years ago

      That's an interesting use case. I hadn't thought of that.

      That’s going to be an expensive route for you and the latency might be annoying, but it’s doable.

        • cygnus1
        • 3 years ago

        Yeah, I want to see reviews and benchmarks once the eGPU hardware becomes more common place. I’ll likely have the laptop before anything else. Which latency do you mean could be annoying?

    • DeadOfKnight
    • 3 years ago

    If the rumor mill is to be believed, Skylake-C is coming with Kaby Lake motherboards.

    • Bensam123
    • 3 years ago

    Should’ve just been desktop parts.

    • derFunkenstein
    • 3 years ago

    These chips are destined for iMacs and iMac-like AIOs. Even though Broadwell’s desktop CPUs were socketed, that seems to be about the only place you can find those, too.

      • danazar
      • 3 years ago

      I think this is it. An iMac isn’t designed to be user-serviceable anyway, so why not use a soldered-down CPU in it?

        • Raymond Page
        • 3 years ago

        Taking this to the logical conclusion…Mac is the future of PC gaming.

          • derFunkenstein
          • 3 years ago

          Not following you. How is integrated graphics on any platform the future of PC gaming?

            • Raymond Page
            • 3 years ago

            The eDRAM acting as a L4 cache cuts cache memory access latency by ~30% which yields some impressive performance gains on workloads exceeding the 8MB L3 cache and sized below the 64MB eDRAM on Iris and 128MB eDRAM on Iris Pro.
            If an iMac chooses one of these 65W parts, you’re looking at a killer CPU with much better memory latency for up to 128MB, which should work well for anything that isn’t massive texture transfers to a GPU.
            The onboard GPU is rubbish, but every Mac I’ve ever seen has Thunderbolt, and with the Razer Core enclosure, we should see Macs becoming gaming machines with Thunderbolt 3 interfaces to an eGPU and the Iris chipset will be killer due to the eDRAM.

            Editing to clarify that the Iris GPU is rubbish for gaming compared to a discrete GPU such as a GTX 980 Ti or the new GTX 1080. These are different classes of GPU, separated by orders of magnitude in power utilization and sheer dedicated silicon and transistor count.

            • derFunkenstein
            • 3 years ago

            Yeah, but man, if you’re into this for PC gaming reasons, just build a PC. The CPU isn’t (and hasn’t been for a while) the limiting factor in games, and that eGPU enclosure costs as much as a good GPU itself. Now you’re talking about a $2500 build (assuming that only 4K iMacs get these parts similar to how their lineup is now). For $2500 you can build a killer desktop and get a nice display with added bonuses like FreeSync/G-Sync.

            • mganai
            • 3 years ago

            Macs have much better resale value, which is a plus for anyone on an upgrade path.

            • derFunkenstein
            • 3 years ago

            So you pay more, but you can get more back when you sell. That makes zero sense.

            Macs have tons of uses. Trying to shoehorn a gaming rig into it just isn’t one of them.

            • mganai
            • 3 years ago

            You get what you pay for mostly. iMacs have some of the best monitors in the biz.

            PCs depreciate much quicker. It’s true they’re more flexible though.

            • Raymond Page
            • 3 years ago

            If price is a factor, a dedicated gaming desktop will be more bang for your buck.

            However, the eDRAM is a massive L4 cache, a minimum of 8 times the size of the L3. That's a lot of instructions and cached data to benefit from, pushing down CPU memory latency and enabling a slight FPS gain.

            • derFunkenstein
            • 3 years ago

            Yeah, REAL slight. Based on this [url=https://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed<]internet web page[/url<] I found, pretty much every game was GPU-limited, not CPU-limited. Even with a relatively fast (and, compared to the CPU, quite expensive) GTX 980 at low resolutions (1600x900 for Witcher 3!), the CPU is far outstripping the graphics card (or maybe the game engine - maybe it's not a coincidence that Witcher 3 topped out at around ~120fps, but we don't see the first 50% of the frame times, so there's no way to tell). There's no way an iMac with one of these eGPUs is going to outrun a desktop with a better graphics card in any game. If that's the "future of PC gaming", stop this train; I wanna get off.

            Now if you want to get off the gaming pipe dream, there are lots of great uses for these CPUs. Gaming is just the worst example. Skull Canyon NUCs are worse: $700 for a CPU/case/mobo/PSU is a good price, but now it's $500 more to even dream of adding an eGPU. Unless you're living in a dorm or going to lots of LAN parties, this line of thinking makes approximately no sense.

            • Raymond Page
            • 3 years ago

            eGPU over Thunderbolt 3 (which is PCIe x4 extended out of the chassis) will have a 3-5% FPS hit compared to direct PCIe based on various PCIe disablement benchmarks I’ve reviewed. In that regard, yes, an eGPU will always be worse than a direct PCIe attached graphics card.

            There is nothing preventing these CPUs from being housed on a motherboard with PCIe x16 slots. The reason I mention Skull Canyon is that it's the most powerful quad core with Thunderbolt 3, which could (it has to have eGPU firmware certification) support an identical GPU to one housed in a traditional gaming desktop.

            The speculation is that Gigabyte could create a gaming oriented Brix with a PCIe x16 slot and using an Iris enabled chip. This would provide the eDRAM which is a large L4 and the same dedicated graphics bus you get on your traditional gaming PC desktop. In this instance, with proper cooling, I fully expect the Iris eDRAM to provide an edge to the Gigabyte THERMO-NUC.

            The other place this makes sense is for gamers that are in a typical business environment needing a performant Ultrabook type platform. If they get a laptop with an Iris CPU and Thunderbolt 3, they’ll be able to use an eGPU enclosure for better graphics. That external graphics could then be hooked into something like a Skull Canyon for folks that have even more money to throw around, but don’t want to buy dedicated graphics cards for their laptop eGPU and their desktop GPU solutions.

            From a pricing standpoint, the Gigabyte THERMO-NUC, offering an x16 PCIe slot and an Iris Pro-enabled CPU, should, on the strength of its eDRAM, outperform anything your desktop PC could do, because it *is* a desktop PC; they're just throwing away a bunch of I/O that is frankly not necessary for most use cases.

            • derFunkenstein
            • 3 years ago

            OK so you don’t want an iMac or an iMac hardware or even a Skull Canyon form factor at all. What you’re *really* advocating for is someone to make a motherboard with one of these CPUs embedded in an mITX form factor or similar, and I’m all for that.

            You should have just said that, and then we’d have avoided all this confusion. :p

            And if I’m wrong, then my question is: why? Why do you want to pay more and get less? You want 3-5% boost in FPS from the CPU and then you’re happy to give up 3-5% on the video side, [b<]where games are actually limited[/b<]. But that's only if I'm misunderstanding you now.

            • Raymond Page
            • 3 years ago

            I was making two points about possible systems. One that there’s a possibility for an uber Iris enabled gaming machine with a discrete GPU, ala THERMO-NUC. The other is that eGPU is a value proposition for those that want gaming paired with Ultrabook style performance/mobility.

            iMac and Skull Canyon will benchmark similarly to Ultrabook gaming with eGPU. I do not want an iMac or Skull Canyon unless I have a pre-existing eGPU from a gaming laptop setup.

            There’s the definite performance hit for eGPU, but there’s two value propositions I want to outline that I don’t think have been clearly identified.
            1) System and graphics upgrades are decoupled with an eGPU at an initial $500 outlay for eGPU chassis, that is once a system (Macbook Pro 2016), graphics (GTX 980 Ti), and eGPU (Razer Core) are purchased the system and graphics can be upgraded independently from each other as long as the eGPU is supported, e.g. Thunderbolt 3 but Thunderbolt 4 breaks compatibility. So I can buy a new GTX 1080 Ti when it comes out while retaining my Macbook Pro 2016. Then in 2019 I can upgrade my Macbook Pro and keep my GTX 1080 Ti. Thus saving from buying a combo laptop/discrete GPU solution.

            2) Weight and power savings by decoupling the eGPU from the host (discrete GPUs only offer power savings by disabling in a laptop such as Optimus). Basically if eGPU is sold to manufacturers they may swap a discrete GPU for more cooling, battery, or smaller device. Personally I’d prefer more battery life in a laptop that’s a bit thicker.

            Now for the ultimate gaming machine, we ignore those value propositions of eGPU and consider only the eDRAM as providing that last boost to gaming FPS over a traditional desktop i7-6700K w/single GTX 980 Ti. The stepping stone is Skull Canyon; benchmarking it will tell us if Iris eDRAM + eGPU is truly going to provide shocking performance or just meh results. The next step is to identify a platform that can utilize a GPU properly and add eDRAM to outclass, let's say, a traditional desktop i7-6700K w/single GTX 980 Ti.

            This is where I see Gigabyte Brix gaming or pro being able to step in. Gigabyte Brix gaming with a miniaturized desktop GPU might be able to pull it off, if they ever figure out their thermal throttling issues due to insufficient cooling. And then a Gigabyte Brix pro with i7-6785R and eGPU GTX 980 Ti might be able to match or exceed the i7-6700K w/single GTX 980 Ti. The 5% eGPU hit might be overcome by eDRAM and it might not. I’m pretty sure that we’ll see they’re pretty close in performance, the CPU w/eDRAM will cache more instructions and thread swap better and the traditional desktop will have better bandwidth and latency to the GPU.

            • derFunkenstein
            • 3 years ago

            I kinda get where you’re going, but that “ultimate gaming machine” that you describe as having an eGPU gives up any advantage it gets from the L4 cache. You said yourself, you’re losing 5-6% by giving up an x16 slot. That’s the *absolute maximum* that the L4 is getting you, based on TR’s own numbers from Broadwell CPUs.

            So now instead of having a (slightly) handicapped CPU, you’re (slightly) handicapping what’s already your bottleneck at high resolutions. Forget the value proposition for a moment. How is that the “ultimate” anything?

            The ultimate is a no-compromises system that I don’t think will ever exist—an mATX or even mITX motherboard with the 65W variant and a full x16 slot that gives the system’s biggest gaming bottleneck plenty of space to breathe.

            These CPUs are probably amazing, and maybe—MAYBE—someone will make your dream a possibility. Until then, I’d rather give the biggest bottleneck the widest berth. That’s where the best performance will be.

            • Raymond Page
            • 3 years ago

            Sorry, the website ate my fuller post when it timed me out once and I had detailed the Gigabyte Brix differences and what’s necessary to be the ultimate R-series + dGPU solution. I wasn’t trying to suggest the Brix eGPU + R-series was ultimate.

            Gigabyte would need to take their Brix gaming (which has a miniaturized desktop GPU) and their Brix pro (which has an R-series CPU) and create a slightly larger case, call it the THERMO-NUC, implementing both features. Best of both worlds without eating the eGPU performance penalty, which we both agree is expensive and not as performant as a direct PCIe x16 card.

            I still think Iris helps for making a laptop a gaming beast. The Iris basically pushes the laptop CPU up a class where the graphics solutions for gaming have to be one of the following:
            1) eGPU
            2) [url=http://moonlight-stream.com/<]MoonLight[/url<] game streaming - open source gamestream client

            • derFunkenstein
            • 3 years ago

            I think we’re probably on the same page on a lot of this. I think the best hope for the “ultimate” is, weirdly enough, Shuttle. They’ve been making SFF barebones machines for years that take full-sized graphics cards. I could see paying $650 for a CPU/mobo/case/PSU that would also take DDR memory, plenty of storage, and a dGPU. That would be a steal, honestly, so it’d probably be more like $800, but still, that’s as no-compromise as it’d get.

            • Andrew Lauritzen
            • 3 years ago

            Guys, if you’re not interested in the iGPU at all (either for use by itself or multi-adapter), these chips are not for you, period. Huge amount of wasted silicon, cost and TDP if you don’t care about that part.

            Despite the trashing earlier, the iGPU is no slouch here, edging out quite a lot of discrete offerings. Obviously if you’re going for something high end (say a 2+TFLOP desktop chip) it’s not going to be able to compete, but it’s still better than many discrete GPUs that get sold today, let alone ones from a year or two ago.

            Agreed that the eGPU stuff makes almost no sense, but if people feel like buying Skull Canyon based on some weird use case involving eGPU, who am I to say no 😉

            • Raymond Page
            • 3 years ago

            Andrew, you’re not following what we’re stating.

            eDRAM on Iris enabled chips is highly desirable for various workloads, like compilation and file servers always love more cache. If Intel offered eDRAM on a non-Iris enabled chip, maybe a CPU with no iGPU at all, we’d discuss that instead, because we care about the eDRAM rather than the perks of the iGPU.

            The only reason I state the iGPU on Iris is rubbish is that it doesn’t match the performance of a discrete GPU such as a 940M or better. It’s a perfectly reasonable GPU for driving a 1080p laptop and 4K displays. It’s the gaming on the iGPU that falls short of what gamers will desire from a discrete GPU (assuming gamers want 1080p at 60 FPS with max settings).

            eGPU makes the most sense for laptops and enables a shared GPU between a mobile laptop user and a dedicated desktop that they can swap the eGPU between. Skull Canyon would qualify as a dedicated desktop.

            I’m also amending my posts about the Iris GPU being rubbish to clarify that I’m comparing it to a discrete GPU, and someone can dig through to find that I consider a 940M-class GPU on par with the Iris Pro.

            • Andrew Lauritzen
            • 3 years ago

            > eDRAM on Iris enabled chips is highly desirable for various workloads, like compilation and file servers always love more cache.

            I get your argument. I agree more cache is always nice, but for CPU-only workloads it simply does not justify the cost. You’re much better off going HSW-E or similar and getting more cores, L3$, and bandwidth. There are a few workloads that fall in the ~64-128MB working-set range – most notably WinRAR and a few scientific simulations – but not enough that it’s worth it for even enthusiast consumers. I don’t actually know of any benchmarks where one of the eDRAM parts exceeds the performance of a HSW/BDW-E despite the architectural improvements – they may exist, but they are going to be rare exceptions.

            Don’t get me wrong, I’d love for these things to be available. But for a given cost, if you’re not interested in the iGPU at all, get a HSW/BDW-E instead.

            > The only reason I state the iGPU on Iris is rubbish is that it doesn’t match performance with a discrete GPU such as a 940m or better

            SKL Iris Pro generally falls between the 940M and 950M (while using less system power). And I’ll note that there are many GPUs below those that get sold and used for “gaming”, despite our personal cutoffs (I’m a snob and pretty much only get top-of-the-line desktop stuff :).

            The point of the chip and the form factors it goes into (e.g. the NUC or laptops) is that it’s really the best performance you can fit into 47W. If power/form factor is not a constraint, desktop stuff with higher TDPs is always going to be a better option.

            > eGPU makes the most sense for laptops and enables a shared GPU between a mobile laptop user and a dedicated desktop that they can swap the eGPU between.

            I’ve been over this bit before… once you actually run the numbers neither the cost nor practicality of it makes any sense vs. just building a separate desktop. Even expensive laptop hardware compares poorly against cheap desktop hardware. As above, it’s optimized for the wrong thing if you’re going to be sitting at a desk.

            • Raymond Page
            • 3 years ago

            Thanks for detailing the Iris Pro falling between the 940M and 950M. I was having a tough time figuring that out from [url=http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html]notebookcheck.net's benchmarks[/url].

            • Andrew Lauritzen
            • 3 years ago

            To be clear, I’m just ball-parking things based on “on-paper” specs – I don’t have any more detailed benchmarks on hand. The 940m is a ~850GFLOP machine which is roughly in line with HSW/BDW Iris Pro (and they performed fairly similarly in practice as per notebookcheck), while SKL Iris Pro is >1TFLOP. Notebookcheck lists it (Iris Pro 580) between 940m and 950m as well, although they don’t have any benchmarks yet so I’m assuming that’s just their best guess as well 🙂

            Obviously that’s not the whole story but it’s a good rough start. To get any more specific you’d need to look at specific benchmarks and so on which I don’t believe are out yet since there aren’t many (any?) Iris Pro 580 products out there yet.
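            The on-paper ballpark above can be reproduced with simple arithmetic. The sketch below is a back-of-the-envelope estimate only; the clock speeds are assumed approximate boost values (not figures from this thread), and real-world performance depends on far more than peak FP32 throughput.

```python
# Theoretical FP32 throughput from published "on-paper" specs.
# Intel Gen9: each EU has two 4-wide SIMD FPUs; counting FMA as 2 ops,
# that's 2 * 4 * 2 = 16 FLOP per EU per clock.
# NVIDIA Maxwell: each CUDA core does one FMA = 2 FLOP per clock.

def intel_gen9_gflops(eus, clock_ghz):
    """Peak FP32 GFLOPS for an Intel Gen9 iGPU."""
    return eus * 16 * clock_ghz

def maxwell_gflops(cuda_cores, clock_ghz):
    """Peak FP32 GFLOPS for an NVIDIA Maxwell GPU."""
    return cuda_cores * 2 * clock_ghz

# Clocks below are assumed approximate boost values.
gpus = {
    "Iris Pro 580 (72 EU)": intel_gen9_gflops(72, 1.0),   # ~1.0 GHz assumed
    "GeForce 940M":         maxwell_gflops(384, 1.10),    # 384 cores @ ~1.1 GHz
    "GeForce 950M":         maxwell_gflops(640, 0.914),   # 640 cores @ ~914 MHz
}

for name, gflops in gpus.items():
    print(f"{name}: ~{gflops:.0f} GFLOPS")
```

            Under these assumptions the 72-EU part lands at roughly 1.15 TFLOPS, between the 940M (~845 GFLOPS) and 950M (~1170 GFLOPS), which matches the ranking described above.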

            I think the NUC comes out very soon though so that will be a good thing to test.

            • derFunkenstein
            • 3 years ago

            I’m sure the iGPU is a huge step up from the 40-EU Haswell Iris Pro and the 48-EU Broadwell variant, but I wouldn’t ever expect it to supplant my GTX 970, or even a GTX 960.

            On the other hand, I *am* very excited about Intel producing “good enough for gaming” graphics, and, based on what I can find, it seems like Skylake with Iris Pro is right there. Despite my drooling over Pascal this weekend, my gaming “needs” are pretty light, since I do most of my gaming on the couch (whether it’s console or HTPC, 1080p is about as high as I can handle right now). I think Skull Canyon could make for a really nice HTPC, although the price is higher than I’m willing to go considering it’d be a glorified Netflix and Diablo box. I can do those on an Xbone. It can do a lot more, and so it’s a great tool for other folks, though.

            • Andrew Lauritzen
            • 3 years ago

            > but I wouldn’t ever expect it to supplant my GTX 970, or even a GTX 960.

            That is basically physically impossible… that would mean the GTX 970 and 960 were terribly inefficient GPUs for using so much more power 😉

            It’s only meaningful to compare architectures at the same power levels. Which is exactly why I say if you don’t care about form factor/power, these chips are simply not for you.

            And again, it’s always a question of: price, form factor, performance – pick two 🙂

            • tipoo
            • 3 years ago

            My Haswell Iris Pro was already around 800 GFLOPS; I imagine the 72-EU Skylake part would be edging in on the Xbox One’s level of performance. Plus it’s strapped to a much better CPU.

            • NTMBK
            • 3 years ago

            So you expect gamers to

            a) Pay a premium for a top-end integrated GPU, and then pay extra for an external one anyway
            b) Pay a premium for an elegant, highly integrated design, then add a big ugly box spewing cables on the desk next to it
            c) Pay a premium for OS X, and then run Boot Camp on it

            Doesn’t sound like the future of gaming to me!

            • ImSpartacus
            • 3 years ago

            I think that’s a ways out, but yeah, Apple will likely make their own eGPU. The speculation I liked was that they’d fill the Mac Pro chassis with three GPUs and go about it that way.

            But I don’t see that as the future of PC gaming. It’s tough to compete with a big, bulky, hand-built machine.

            • BurntMyBacon
            • 3 years ago

            [quote]The onboard GPU is rubbish, but every Mac I've ever seen has Thunderbolt, and with the Razer Core enclosure, we should see Macs becoming gaming machines with Thunderbolt 3 interfaces to an eGPU and the Iris chipset will be killer due to the eDRAM.[/quote]
            I'm not buyin' what you're sellin'. Hardware compatibility, driver support, and game selection will all let you down in this scenario. I suppose you could buy the Mac and run Windows to get around some of these issues, but HP, Dell, MSI, et al. will sell you a similar system loaded with Windows to begin with.

            • Raymond Page
            • 3 years ago

            Apparently you missed this comment by me:
            “Macs are like the Microsoft Surfaces: sometimes great hardware gets a crappy OS.”

            Mac OS X is okay for playing GOG games. However, you’re absolutely right about the game selection. It just doesn’t have many ports compared to the full Windows catalog.

            Iris GPUs are rubbish for gaming, so a dedicated GPU or eGPU is required for a machine to be considered a gaming machine. As every Mac I’ve seen has Thunderbolt ports, I fully expect that to continue with the MacBook Pro refresh later this year. When those have Thunderbolt 3, the hardware will be capable of running Microsoft Windows (the hoops necessary to boot are left to hackers) and will thereby be a capable gaming laptop via eGPU support.

            Any laptop that supports an eGPU like the Razer Core should be able to compete with desktop PC graphics within the ~5% FPS range as long as the CPU doesn’t become the limiting factor (and it will on the 15W parts). To be clear, if you pick up a Dell XPS 13 with the Iris chipset and its eGPU certified Thunderbolt 3, you should be able to get a Razer Core, stick a GTX 980 Ti in it, and game at up to 95% of the performance of that card if it was placed in a dedicated gaming PC. The 15W CPU in the Dell XPS 13 *will* limit the performance of the GTX 980 Ti, but that’s why the 45W in Skull Canyon becomes interesting if it gets used by the new Macs or these new R-series in an iMac AIO.

            • Raymond Page
            • 3 years ago

            Adding this to clarify: the Iris GPU is rubbish for gaming compared to a discrete GPU such as a GTX 980 Ti or the new GTX 1080. These are different classes of GPU, separated by orders of magnitude in power utilization and sheer dedicated silicon and transistor count.

          • DrCR
          • 3 years ago

          Year of the Mac desktop™

            • Raymond Page
            • 3 years ago

            Macs are like the Microsoft Surfaces: sometimes great hardware gets a crappy OS.

          • BurntMyBacon
          • 3 years ago

          [quote]Taking this to the logical conclusion...Mac is the future of PC gaming.[/quote]
          I'm not picking up what you're putting down.

            • Raymond Page
            • 3 years ago

            Use Bootcamp to get Windows running.

            Now you have a Mac device that should have an Iris-enabled CPU with the eDRAM discussed in the article. Based on the i7-5775C’s performance, we should see the higher-TDP Iris chips, specifically the Core i7-6567U, Core i7-6770HQ, or Core i7-6785R, outclass higher-end CPUs, all while supporting eGPUs over Thunderbolt 3 to ensure good GPU support.

      • bwcbiz
      • 3 years ago

      Yeah, AIO and SFF are what I was thinking of. But 65W is a pretty high TDP for those. The heat can be managed, but most AIOs use a power brick, and a 120-150W power brick for the system is a pretty significant cost vs. 90-100W if you use a lower-power CPU.

      • tipoo
      • 3 years ago

      Unless this is true…

      [url]http://wccftech.com/amd-making-custom-x86-soc-apple-imacs-2017-2018/[/url]

    • chuckula
    • 3 years ago

    It’s unfortunate that these things won’t be available as socketed chips. Not even so much for the built-in graphics but for the L4 cache, which can show some nice gains in the right non-graphics workloads too.

    Then again, Kaby Lake is looking extra boring… maybe some socketed models will get some L4 cache love.

    • chuckula
    • 3 years ago

    Some additional die-size analysis here: [url]https://techreport.com/forums/viewtopic.php?f=2&t=117799[/url]
