Intel’s Kaby Lake CPUs revealed

Moore’s Law may not be dead, but it is changing. In March of this year, Intel revealed that it would be adding a third stage to its traditional tick-tock model of technical advancement. That new model, “process-architecture-optimize,” began its latest swing back in August 2014 with the Broadwell architecture, Intel’s first time out with its 14-nm tri-gate process. A little over a year ago, we got our first taste of the Skylake “tock,” which ditched the fully-integrated voltage regulator (FIVR) of Haswell and Broadwell chips and introduced Intel’s Gen9 integrated graphics to the world, among other refinements.

The Kaby Lake die, helpfully labeled. Source: Intel

Now it’s time for the “optimize” part of the 14-nm show. Today, Intel is raising the curtain on its Kaby Lake (or “seventh-generation Core”) CPUs. We got a clearer picture of just what this third step in the new Intel architecture life cycle means at IDF a couple weeks back, so let’s dive in.

Getting the most out of the 14-nm process

To improve Kaby Lake’s performance, Intel says it’s optimized its 14-nm tri-gate process so that it can lay down taller fins with a wider gate pitch. Intel calls this improved process “14-nm plus.” Those improvements result in faster-switching transistors that are good for what Intel describes as “four or five speed bins” of extra clock speed, or about 400 to 500 MHz higher Turbo frequencies at the top end of a CPU’s range.
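
A speed bin on these parts works out to 100 MHz, so the arithmetic behind Intel's claim is simple enough to check:

    # One Intel "speed bin" here equals 100 MHz, per the company's own
    # four-to-five-bins = 400-to-500-MHz equivalence.
    BIN_MHZ = 100
    for bins in (4, 5):
        print(f"{bins} speed bins -> +{bins * BIN_MHZ} MHz of Turbo headroom")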

Kaby Lake CPUs will also be able to hit those peak speeds faster than before, thanks to an improved version of the Speed Shift technology that first appeared in Skylake. Where a Skylake Core i7-6500U might have needed over 20 ms to hit its peak Turbo speeds, a Kaby Lake Core i7-7500U might hit its Turbo peak in under 20 ms. When Intel talks about the “responsiveness gains” of Kaby, it’s referring to these Speed Shift improvements.
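
Tinkerers who want to confirm whether a given machine exposes Speed Shift can ask the operating system. Here's a minimal sketch for Linux; the "hwp" flag is how the kernel reports hardware-controlled P-states, and the sysfs path assumes the intel_pstate driver on a reasonably recent kernel:

    # Check for Intel Speed Shift (hardware P-states, or HWP) support on Linux.
    def has_hwp_flag():
        """True if /proc/cpuinfo advertises the 'hwp' CPU flag."""
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return "hwp" in line.split()
        return False

    def pstate_status():
        """Report the intel_pstate driver status, if the sysfs node exists."""
        try:
            with open("/sys/devices/system/cpu/intel_pstate/status") as f:
                return f.read().strip()  # e.g. "active"
        except FileNotFoundError:
            return "intel_pstate not loaded (or kernel too old)"

    print("HWP (Speed Shift) flag:", has_hwp_flag())
    print("intel_pstate status:", pstate_status())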

More efficient cat-video handling

Along with the process improvements, the biggest change in Kaby Lake is to the integrated GPU’s media encoding block. 4K video streaming, wider color gamuts, and HDR content are all in the pipe, and Intel wants to make sure its chips are ready to handle those demands efficiently. The company notes that for a lot of younger people, the 15″ screen on their notebooks is the big screen they have ready access to, and it wants to offer mobile users the opportunity to watch 4K content without draining their batteries unduly.

In addition to the 1080p HEVC (also known as H.265) encode and decode capabilities of Skylake, Kaby gets 4K HEVC 10-bit encode and decode support, as well as hardware decode support for YouTube’s competing VP9 format. The Multi-Format Codec block in Kaby offers better support for wireless displays and an improved quality feature for the QuickSync Video Fixed-Function mode.
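
Under Linux, the VA-API driver will enumerate which of these fixed-function codecs a given IGP actually exposes. A rough sketch, assuming the vainfo utility from libva-utils and an Intel VA-API driver are installed:

    # List VA-API profiles to confirm HEVC Main10 or VP9 decode support.
    import subprocess

    def vaapi_profiles():
        """Return vainfo output lines that name a VAProfile."""
        try:
            out = subprocess.run(["vainfo"], capture_output=True,
                                 text=True).stdout
        except FileNotFoundError:
            return []  # vainfo (libva-utils) isn't installed
        # vainfo prints lines like "VAProfileHEVCMain10 : VAEntrypointVLD"
        return [line.strip() for line in out.splitlines() if "VAProfile" in line]

    for profile in vaapi_profiles():
        if "HEVC" in profile or "VP9" in profile:
            print(profile)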

In practice, these improved video functions let Kaby Lake CPUs play back 4K HEVC video much more efficiently than they could if they had to rely on CPU power alone, as Intel’s local-playback power figures demonstrate. The Kaby CPU only needs to consume 0.5W to do its thing, compared to the 10.2W of the Core i7-6500U.

For YouTube playback with VP9, the hardware decode capability in Kaby delivers similar improvements. The seventh-gen Core CPU only needs to consume 0.8W, compared to 5.8W with the CPU alone.
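
Some back-of-the-envelope math puts those power numbers in perspective; the VP9 figures (0.8W versus 5.8W) work out similarly. The 50 Wh battery below is a hypothetical ultrabook-ish figure of our own, not Intel's, and it ignores the display and the rest of the platform, which would dominate in practice:

    # Rough playback-endurance math from Intel's quoted package power figures.
    # The 50 Wh battery is a hypothetical capacity, not an Intel number.
    battery_wh = 50.0
    for label, watts in [("Kaby Lake fixed-function decode", 0.5),
                         ("Skylake Core i7-6500U CPU decode", 10.2)]:
        print(f"{label}: {watts} W -> ~{battery_wh / watts:.0f} hours")
    # Kaby Lake fixed-function decode: 0.5 W -> ~100 hours
    # Skylake Core i7-6500U CPU decode: 10.2 W -> ~5 hours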

What isn’t changing

We would normally perform a deep dive on what else has changed under the hood of these Kaby Lake CPUs here, but aside from the enhanced video encode and decode block and the benefits of the 14-nm “Plus” process, Intel was surprisingly frank about how little has changed between the sixth-gen and seventh-gen Core processors. Kaby Lake will use the same basic execution pipeline as Skylake before it, and the guts of its Gen9 integrated graphics processor are also basically identical to what arrived with Skylake. In conversations with Intel, I confirmed we’re not getting any kind of FreeSync or VESA Adaptive-Sync support with this generation of chips, either.

At least from a gamer’s perspective, it’s hard not to be disappointed by that fact. FreeSync monitors are becoming more and more widely available, and Intel is clearly thinking about the low-end gaming market to some degree. The company devoted a considerable portion of its demo time to showing off how well a thin-and-light Kaby Lake machine could play Overwatch, and I’d be willing to bet the gameplay experience would be even better with a variable-refresh display hooked up. Sadly, that’s not to be with Kaby. We’ll just have to wait another year to see whether Adaptive-Sync will land in Intel CPUs, it seems.

The first Kaby Lake products

Intel is revealing six Kaby Lake products today: three 4.5W “Y” CPUs and three 15W “U” CPUs. Instead of trying to explain the dizzying array of features inside each of these chips, I’m just going to post Intel’s full accounting of what’s inside. First off, let’s look at the 4.5W family of seventh-gen Core products. These are all dual-core parts with Hyper-Threading enabled.

As we might expect from Intel’s description of its process improvements, the highest-end 4.5W Core i7-7Y75—seriously, take the naming responsibilities for these things away from the engineers, folks—can Turbo up to 3.6 GHz, compared to 3.1 GHz for the Skylake Core m7-6Y75. That’s an impressive improvement, especially if Intel’s Adaptive Performance environmental-monitoring tech lets the chip hang out at those higher clock speeds longer.

You’ll also note that Intel is using the Core i5 and Core i7 designators on its higher-end 4.5W CPUs now, instead of the Core m5 and Core m7 branding. Intel says it’ll be reserving the Core m brand for low-end CPUs in the Kaby family, but it otherwise seems like we’ll be seeing Core i branding from top to bottom in Intel’s product range. My sense is that this move might make it harder to figure out which systems have which CPUs in the already arcane notebook market, but we’ll just have to pay close attention from here on out. 

The 15W seventh-generation Core parts all tell a similar story. The highest-end Core i7-7500U runs 200 MHz faster than the Core i7-6500U at its base speed and 400 MHz faster under Turbo-friendly workloads. The Core i5-7200U gets a 200-MHz base-clock boost and a 300-MHz Turbo bump over the Core i5-6200U, and the Core i3-7100U makes do with a paltry 100-MHz bump over the Core i3-6100U. This cheapest 15W Kaby part doesn’t do Turbo Boost.

All told, these are nice improvements across the board for these meat-and-potatoes CPUs in Intel’s lineup. Along with the improved media encode and decode features in Kaby, the bumped clock speeds should give Ultrabook-class mobile PCs a little more all-around muscle for the things that regular folks do with their PCs. Expect to see Kaby-equipped notebooks roll out beginning next month.

What’s next

The six Kaby parts we’re seeing today should cover a broad swath of mainstream PC needs, but there’s more in store. Intel says to expect mobile CPUs with its higher-end Iris graphics and vPro management support to arrive in January 2017. We’ll also need to wait until the New Year to see desktop parts benefiting from the move to Intel’s 14-nm “Plus” process. Intel remained mum about the chipsets and CPUs we can expect for the desktop at IDF, although we’ve seen hints of the 200-series chipsets as far back as Computex. It’ll be interesting to see how Intel’s 14-nm process improvements translate to the higher TDPs where enthusiast processors roam.

If we can peer into our crystal balls for a moment, today’s Kaby news might be most disappointing for Apple fans. The company’s highest-end MacBook Pros rely on dual- and quad-core CPUs with eDRAM and Iris graphics on board, and those products are nowhere to be found in this first round of Kaby releases. Some MacBook Pros already feature Broadwell parts, but the highest-end quad-core MBPs still rely on Haswell Crystal Well CPUs. Unless Apple is intending to take the MacBook Pro in a more mainstream direction, we wouldn’t expect to see updated versions of the company’s highest-end laptop until next year, at the earliest. If that’s the case, we’ll have an exciting January to look forward to.

Comments closed
    • itachi
    • 3 years ago

    No L4 cache eDRAM… such a shame!!

    • BIF
    • 3 years ago

    All these model naming conventions just confuse the hell out of me. Eventually I guess I’ll learn it. Or not.

    In the meantime, can anybody tell me how the “a” sound in “Kaby” is supposed to be pronounced?

    1. Does it rhyme with “baby”?
    2. Does it rhyme with “cobby” (like corn on the cobby)
    3. Or does it rhyme with “cabby” or “crabby”?

    This is a serious question; I’m genuinely curious and it gives me a headache anytime I can’t pronounce a word or name in my head whenever I read about it in print. Thanks.

    • End User
    • 3 years ago

    Desktop quad cores by December:

    [quote<]planned for December are the Kaby Lake models of the desktop quad-cores (S 4+2)[/quote<] [url<]http://www.notebookcheck.net/Intel-Detailed-Kaby-Lake-Road-Map-leaked.172282.0.html[/url<]

    • Usacomp2k3
    • 3 years ago

    I hate the naming for laptop CPUs. When I was searching for the quad-core for my work laptop, I had to keep the ARK open to know which was which. Why can’t they restrict the i7 name to quads?

      • NovusBogus
      • 3 years ago

      Simple rule of thumb when evaluating Intel products:

      Anything containing the letter U is dog poop.

      Anything containing the letter N is even stinkier dog poop, but at least it’s cheap.

      Anything containing the letter Y is arguably dog poop, but very power efficient and carries a bit of a price premium.

      Otherwise it’s probably legit.

        • tipoo
        • 3 years ago

        Since they made Core M the Core i Y series, I remember it by thinking, Y the Eff did they do that.

    • Gastec
    • 3 years ago

    I’m still on a Lynnfield i7-860 here, and I have this uneasy feeling that my next CPU, be it Intel or AMD, might be the last one I’ll have in this life. I hope it will last for at least 25 years. I mean, I don’t see anything spectacular on the horizon, except something in the likeness of dual, triple, and quadruple CPUs on a single mobo.

      • tipoo
      • 3 years ago

      Hopefully the move beyond silicon won’t take more than 25 years, with Moore’s general observation slowing way down and expected to hit a wall…

      • BIF
      • 3 years ago

      Until Intel comes out with “Vertically Stacked Right Angled Ceiling Attached Quantum Symmetrically Scaling Cloud Processing” modules.

      You will need one just to remember the acronym, VSRACAQSSCP.

    • evilpaul
    • 3 years ago

    So I have to wait to be disappointed that there’s no unlocked Kaby Lake SKUs with huge L4 caches (and hopefully better base/turbo clockspeeds) again?

    • mFvwv0zduc
    • 3 years ago

    OMG even less and less watts and no real performance improvements… Give me a break. I have all these watercooled system and fans and 1KW power brick for something bigger than that;) I will be waiting for AMD Zen and 8 coarz 😀

      • End User
      • 3 years ago

      It’s all about the 200-series chipset feature set and 4K encoding/decoding.

    • Bensam123
    • 3 years ago

    And the disappointments don’t end… Guess we’re waiting till Zen. ~_~

      • chuckula
      • 3 years ago

      I don’t think anybody is holding their breath for Zen in mobile despite what AMD says about how they would really really like for Zen to be in mobile parts one day.

      Right now the bar has been set so low that if an 8-core Zen can actually beat a 5-year-old 4-core Sandy Bridge consistently it’s a “miracle”, but you were the same one claiming that Sandy Bridge was obsolete because 8 coarz.

      Incidentally, where were you at the Skylake review admitting that all the crap you spewed about Bulldozer being “future proof” was wrong?

      Here’s a link to where that “non future proof” 2600K from 2011 is still beating the 8370 — a 2014 part that’s faster and even more power hungry than Bulldozer ever was BTW: [url<]https://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/14[/url<]

        • Bensam123
        • 3 years ago

        …I wasn’t talking about mobile nor was I trying to pander the narrative in such a way it’s AMD Vs Intel. Also not certain what past AMD has to do with Zen, nor remarks taken completely out of context, cherry picked, and paraphrased.

        But strawmans as usual. Nothing new here… just like with Kaby Lake.

          • chuckula
          • 3 years ago

          Look you sociopathic D-bag with your “strawman” whines to hide your own history of lying, here’s a reminder of how you insulted Haswell with a completely disingenuous and dishonest rant that only got upthumbed because your crew was circling the wagons in desperation: [url<]https://techreport.com/discussion/24879/intel-core-i7-4770k-and-4950hq-haswell-processors-reviewed?post=735330[/url<]

          Guess what sunshine? In 2017 AMD is flat out copying that chip* on which you heaped so much scorn in 2013. In literally every way AMD is basically waving the flag and saying that Haswell was so good that they hired Jim Keller to dump everything they've done for the last decade and just clone Intel to the best of their abilities.

          Riddle me this you little shill: You know how [i<]scared[/i<] of Zen Intel is? They're so [i<]scared[/i<] that according to your own little strawman Kaby Lake is not any better than the same chips Intel has been selling since 2011! That's how much AMD keeps Intel's design team up at night that Intel basically saw no need to do anything in response to Zen even though we've been hearing hype about the stupid chip since 2012. How does it feel to be that irrelevant?

          * OK, in fairness the un-core part of Zen is more like a 2009 Nehalem instead of a 2013 era Haswell.

    • Klimax
    • 3 years ago

    Interesting. Looks like the “optimization” is from the process perspective. One question resolved, a few remain…

    • dikowexeyu
    • 3 years ago

    Finally! I’m going to update my 4 cores to 16 cor… oh, no, 2 cores…

      • ronch
      • 3 years ago

      16-core Zen chips will be priced against 2-core Intels.

      And run at 1/8 the clock. Hehe.

    • Airmantharp
    • 3 years ago

    Just started school again, and since the fan is dying in my 2000-series i7 laptop I’m dragging the DTR around- it’s time for something thin and light with a good battery and excellent display/keyboard/touchpad, and these CPUs look perfect.

    Also nice that they’ll be able to handle HDR content properly, which means they’re more likely to come with decent displays!

    • egon
    • 3 years ago

    On the one hand, it’s depressing how incremental overall performance improvements have become.

    On the other, browsing through my old computer mags from the mid-90s, it’s a reminder that the rapid pace could be frustrating in its own way. Not only did performance PCs cost a lot more to buy, but having spent thousands on that coveted, top-of-the-line 486DX2/66, you’d have about 18 months until it was considered low-mid end, and the same money would buy a Pentium 90 system more than twice as fast.

      • hansmuff
      • 3 years ago

      Shit back then, you spent $300 on a Mitsumi CD reader, and all of a sudden games be like “needs double speed drive.” FFFFfffffffffffffffff
      The 486DX 50, not the DX2 50, was as expensive then as an 8-core Intel chip today. The memories of that…

      We do live in better times. Bottom end performance is pretty damn high, SSDs are a dream come true, memory is cheap. My 5 year old system received maybe $500 in graphics cards upgrades over those years, and I could’ve gotten away with way less but I’m picky with graphics. All the rest stayed fast enough.

        • bhtooefr
        • 3 years ago

        To be fair, the 486DX 50 was a higher performing chip than a 486DX2 50.

        (Twice the FSB.)

        Good luck running VLB graphics cards on it, though.

          • hansmuff
          • 3 years ago

          I did, at full speed because as you probably know, there were no brakes on that bus.
          It was completely out of spec, but one chip that could do it well at the time was the IMO legendary TSENG ET4000W32p. The performance of that card was through the roof, especially on that bus speed 🙂

    • NovusBogus
    • 3 years ago

    I want to know if they’re still planning on only supporting Windows Malware Edition and maybe Linux, and what ramifications this lack of support will actually have for the end user.

      • End User
      • 3 years ago

      [quote<]what ramifications this lack of support will actually have for the [b<]end user[/b<][/quote<]

      None whatsoever. 🙂

        • AutoAym
        • 3 years ago

        Well played, sir.
        Well played. 🙂

      • yuhong
      • 3 years ago

      Let’s just say that is mostly not up to Intel or AMD to decide anyway.

        • NovusBogus
        • 3 years ago

        They’re the ones who write drivers and instruction sets, and give motherboard OEMs their marching orders, so it’s absolutely up to them what happens next.

    • crystall
    • 3 years ago

    Meh

    • Andrew Lauritzen
    • 3 years ago

    > “i7-7Y75—seriously, take the naming responsibilities for these things away from the engineers, folks”

    Don’t you dare… trust me, it takes marketing departments to come up with this level of craziness 🙂 To engineers there’s an architecture, config and power target. I guarantee the “engineer” names would make more sense to the enthusiast type folks who are here.

    The argument is that the marketing names make more sense to “regular people”, which I have my doubts about but no firm data.

      • tipoo
      • 3 years ago

      I think they’re onto something, just yesterday my 75 year old neighbors asked me for ultrabook recommendations with i7-7Y75-HQ-GT-M-93 processors

      • chuckula
      • 3 years ago

      Of course that product number is bad.

      IT NEEDS MOAR SEVENS!

      I suggest:

      [b<]i7-77Y7757-(14/2)C[/b<]

      The C is there just because.

        • Gastec
        • 3 years ago

        Like momma always said, 7 is what 7 does. Life is like a box of 7.

    • cygnus1
    • 3 years ago

    Since so little is changing in Kaby Lake, I wonder if it will be socket compatible with Skylake and if Intel will release the 200 series chipset before next year…

      • End User
      • 3 years ago

      R u kidding me? It is all about the chipset.

        • cygnus1
        • 3 years ago

        Yeah, I want to build a new system before the end of the year. Since the updated CPU isn’t coming before then, I could work with an updated chipset and go ahead and build with an i3 or i5 Skylake if I can do a drop-in upgrade later to a Kaby Lake i7. Then I can recycle the Skylake chip into some other project system.

    • ronch
    • 3 years ago

    I suppose Intel has been paying attention to how Zen is shaping up.

      • Kretschmer
      • 3 years ago

      Savage! I like it.

    • WhatMeWorry
    • 3 years ago

    Intel should’ve called it Crappy Lake.

      • Kretschmer
      • 3 years ago

      Sorry, responded to the wrong post. Meant to add something to Ronch.

    • tipoo
    • 3 years ago

    Can someone explain to me what the hell it means for a processor to be “designed for the immersive internet”? I feel like that’s one of the buzziest phrases I’ve seen in a while.

      • Chrispy_
      • 3 years ago

      Well the new processor design leverages the synergies of the evolved, cloud-based internet.

      Purportedly the analytic and algorithmic AI behind the design of these new chips is hoped to be an olive branch to people hoping to envisage seamless continuum in everything from the Internet of Things all the way to quantum computing. Not wanting to be left behind by Mobilegeddon and Generation Z,

      Intel is making sure that this new design complies with all current and upcoming requirements for Net Neutrality and full data encryption, in order to avoid the buzzwordgate scandal of Ed Snowden.

        • tipoo
        • 3 years ago

        Have your upvote, made of 100% green material from The Cloud

        • ronch
        • 3 years ago

        Have an E… for Effort.

        • Redocbew
        • 3 years ago

        They’re free range and grass fed, also.

        [url<]https://www.penny-arcade.com/comic/2016/06/06/values-and-values[/url<]

          • Firestarter
          • 3 years ago

          now I want happy soap

            • NeelyCam
            • 3 years ago

            [url<]https://www.amazon.com/Nivea-Happy-Time-Bar-Soap/dp/B004KJ7N5M[/url<]

          • tipoo
          • 3 years ago

          Farm raised, grass fed. I want to eat it, not interview it!

          -Danny Bhoy

        • kvndoom
        • 3 years ago

        That may be the first time I have upvoted a post that made me nauseous.

        • puppetworx
        • 3 years ago

        So you can watch porn in 4K on it yes or no?

          • BIF
          • 3 years ago

          The free stuff probably won’t come in 4K.

          (I don’t know what that means)

      • chuckula
      • 3 years ago

      The interwebs are underwater now.

      • ronch
      • 3 years ago

      Not nearly as ‘designed for the internet’ as NetBurst is.

      Good grief who came up with ‘NetBurst’. Let’s kill him.

        • tipoo
        • 3 years ago

        I think the heat already killed him

        • the
        • 3 years ago

        He crawled into a series of tubes to make his escape.

        • yuhong
        • 3 years ago

        I think SSE was once sometimes called “Internet Streaming SIMD Extensions”. The fun thing is that modern web browsers and modern web applications are more CPU- and RAM-intensive than back in those days.

        • Anonymous Coward
        • 3 years ago

        Related note, I just claimed a dual “Netburst” Xeon workstation to turn into my home file server. The second-hand Prescott seems to have died on me. 🙁 Actually that was a funny machine, I could hear whether its processor was loaded or not. Hmmm.

      • derFunkenstein
      • 3 years ago

      You know how even online print media is dead because you can’t make money on GIF ads anymore? That’s why even most websites where all the content is text (edit: except this one!) have annoying auto-play video. That’s “immersive”. You’re immersed in ads, and now Kaby Lake processors can let you be immersed in ads longer since they can handle all the video.

      • blastdoor
      • 3 years ago

      Deep Netburst

      • Gastec
      • 3 years ago

      After what Chrispy_ wrote and two more beers the answer to your question can only ever be “Yes” 🙂

        • Chrispy_
        • 3 years ago

        I’d like to take credit but I just put “top buzzwords 2016” into Google and copypasted a lot.

        I re-read my own reply and noticed a paragraph break instead of a space. That was a copypaste error but if you want to believe the buzzword hype, it’s there for people who think outside the box and read between the lines.

        [i<]edit: Damn You Autocorrect! 7 edits required![/i<]

          • BIF
          • 3 years ago

          “Buzzword Bingo” was a thing.

          I won once with “metrics”.

      • Anonymous Coward
      • 3 years ago

      Actually, what it means is that the processor is designed to be as efficient as possible while doing nothing, and yet be very quick to render bloated web pages at any moment.

    • Takeshi7
    • 3 years ago

    [quote<]That new model, "tick-tock-optimize," began its latest swing back in August 2014 with the Broadwell architecture, Intel's first time out with its 14-nm tri-gate process.[/quote<]

    It's not "tick-tock-optimize". It's "process-architecture-optimize". They're completely getting rid of the clock-related onomatopoeias.

      • tipoo
      • 3 years ago

      Huge missed chance to use Tick-Tock-Toe

        • vargis14
        • 3 years ago

        love it:)

        • ronch
        • 3 years ago

        Should be Tick-Tock-BOOOOMMMM!!!!!

      • Jeff Kampman
      • 3 years ago

      You’re right, I should have checked my phrasing. Fixed.

    • WhatMeWorry
    • 3 years ago

    I’m afraid even the tick-tock-optimize will soon become the tock-optimize cycle.

    • Krogoth
    • 3 years ago

    “Moore’s Observation” has been invalid for over a decade now. It is now just a silly meme that refuses to die.

      • derFunkenstein
      • 3 years ago

      The same could be said for being “not impressed.” :p

      • JustAnEngineer
      • 3 years ago

      Moore’s Law was dead before Gordon Moore retired.

      • PixelArmy
      • 3 years ago

      Do you ever get tired of regurgitating crap? It can’t possibly taste good.

      [url<]https://techreport.com/news/29895/intel-turns-off-its-tick-tock-metronome?post=971734#971734[/url<]

        • Krogoth
        • 3 years ago

        That news blurb is all about moving goalposts. It is just Intel marketers that are desperately trying to keep the “Moore’s Law” meme alive.

        You can only put so much lipstick and embalming fluid on a corpse.

    • PrincipalSkinner
    • 3 years ago

    [quote<]If we can peer into our crystal balls[/quote<] This creates completely wrong images in my head.

      • tipoo
      • 3 years ago

      Doubles as a Tinder tagline

      • Generic
      • 3 years ago

      Simple enough to fix.

      If we peer into our [i<]collective[/i<] crystal balls-

      No, no that's not it.

      If we gaze intently at our [i<]quartz balls, firmly in hand[/i<]-

      Nope, that's worse.

      If we pay [i<]undivided[/i<] attention to our [i<]rock hard[/i<] spheroids of futures yet to [b<]burst forth[/b<]-

      Never mind.

    • torquer
    • 3 years ago

    Unless I’m mistaken, Skylake-U processors with Iris Pro GPUs already exist and are not present in any current gen MacBook Pro. Likely new MBPs would launch as a “late 2016” model with these CPUs and then be refreshed mid/late 2017 with Kaby Lake. Given all the rumors surrounding a redesigned 13″ MBP, I doubt Apple will wait another 6 months to a year for the next CPU refresh.

      • Bauxite
      • 3 years ago

      Skylake-H with Iris Pro (580) was launched early 2016, but currently only the NUC and some obscure Xeon encoder PCs showed up this summer.

      0, yes Zero, laptops anywhere have them. Seems really weird.

        • blastdoor
        • 3 years ago

        It occurs to me that if Apple made a decision to go with semi-custom Zen SOCs, that could explain the weird delays in updates to the Mac lineup and also the gaps in Intel’s product lineup where Mac-ish SKUs used to sit.

          • tipoo
          • 3 years ago

          It was rumored for their iMac, but unless AMD really pulled a huge coup I’d doubt it on their laptops, where they really care about battery life.

            • blastdoor
            • 3 years ago

            GloFo’s 14nm process is inferior to Intel’s, but if AMD could offer a SOC that is better tuned to Apple’s needs, then maybe the benefits of customization could outweigh the disadvantages of the process.

            I recognize that’s easier said than done, though.

          • TEAMSWITCHER
          • 3 years ago

          AMD never could deliver parts to Apple in the quantities needed for a launch. Maybe Tim Cook has decided that these “Meat-and-Potatoes” processors are good enough for MacBooks now. It wouldn’t surprise me – Tim Cook doesn’t seem to care about Mac Superiority as Steve Jobs did.

            • tipoo
            • 3 years ago

            MacBooks stayed on Core 2 Duo after the much better i series launched under Jobs, in favor of the 940M/320M chipsets; Apple was always willing to tread water on the processor front if something interesting came along. At least after the x86 switch, before which they constantly tried to differentiate with processors.

            • torquer
            • 3 years ago

            They care about battery life and graphics. They’ll never put anything in a MBP with less than Iris graphics, which is why refreshes took so long in the past including on the retina iMac side.

          • Kretschmer
          • 3 years ago

          AMD has failed to deliver power-competitive parts for the better part of a decade. That has little appeal to Apple.

    • sweatshopking
    • 3 years ago

    A company the size of Intel should be able to make better slides than those wtf yellow explosions.

      • maxxcool
      • 3 years ago

      Every time I see those I cringe ..

      • torquer
      • 3 years ago

      During their latest round of layoffs they nuked a lot of their marketing people. No joke – I have a friend who works there and out of 14 people who made similar types of materials for trade shows and product announcements, she’s the only one left on her team.

        • TheRazorsEdge
        • 3 years ago

        So you’re saying the ugly slides are her fault?

        I mean, if she’s the only one left…

      • ronch
      • 3 years ago

      Yeah they’re so bad my brain just ignored them.

      Oh and at least we know AMD marketing may still be good at something.

      • Meadows
      • 3 years ago

      What’s wrong with the yellow explosions? It’s 1995 all over again.

    • DPete27
    • 3 years ago

    The i5-7200U and the i3-7100U have the same pricing? Wow, I didn’t realize even the Skylake, Broadwell, and Haswell equivalents are this way. Seems silly to even build an i3 laptop then.

      • DreadCthulhu
      • 3 years ago

      You can bet that the large OEMs are not paying list price for these chips. And even if they were, they still might make i3 laptops for “market segmentation” purposes.

      • ImSpartacus
      • 3 years ago

      I’d bet some good money that the vendors buying these in bulk aren’t paying those rates. I figure it’s just a placeholder.

        • Gastec
        • 3 years ago

        Besides, they will take that “reference” price of any CPU and add 50%, just for the client 🙂

    • NTMBK
    • 3 years ago

    Those names make [i<]no[/i<] sense. They literally only just introduced the "m3, m5, m7" terminology, and now they're calling the new 4.5W part an i7, not an m7? But the low end part is still m3, not i3? WHY ARE YOU DOING THIS INTEL

      • tipoo
      • 3 years ago

      Celeron covering both Broadwell cores and Braswell cores drives me mad too. One of them is a trap!

      • the
      • 3 years ago

      Because Intel’s marketing department hates it when consumers actually know exactly what is inside of their machines.

      • DPete27
      • 3 years ago

      [url=https://www.youtube.com/watch?v=4F4qzPbcFiA<]"Intel i7 Inside"[/url<]

        • NTMBK
        • 3 years ago

        [url=https://www.youtube.com/watch?v=qpMvS1Q1sos<]I see your Ackbar and raise you a Weird Al[/url<]

        • Gastec
        • 3 years ago

        The plebs know what an i7 is. What the fu…is an m7?

          • synthtel2
          • 3 years ago

          Sounds like a really bad idea from BMW. 😛

      • stefem
      • 3 years ago

      “I wonder if drug tests are mandatory only for Intel engineers” cit.

    • anotherengineer
    • 3 years ago

    Did I read those tables right? DDR3 and not DDR4?!?!?!

      • the
      • 3 years ago

      It should support both memory technologies but the only reason to go with DDR3 right now is to take advantage of marginally lower prices.

      For some corner cutting OEMs, this is attractive.

      • Jeff Kampman
      • 3 years ago

      If I had to take a completely unsubstantiated guess, you’re seeing that because Intel isn’t updating the memory controllers in Kaby to do LPDDR4 in the 4.5W parts. The 15W chips do offer DDR4 support.

    • NTMBK
    • 3 years ago

    They’re optimizing for 4K video on a 15″ laptop screen… talk about diminishing returns. Can you even tell the difference at that point?

      • tipoo
      • 3 years ago

      A little bit. 1080p looks very very slightly soft on my rMBP. Not really worth the added file size or power draw going to 4K though. You could always use a Krabby Patty CPU to output to a larger TV though.

      • chuckula
      • 3 years ago

      MOAR PIXELZ!!!

      In fairness to Intel, the accelerated playback for high bit depth video is a solid feature in an otherwise lackluster upgrade. Given the right display and encoded video source (and that’s not always a given) even if you can’t view the individual pixels the improvements to color depth are real.

      You can also drive an external 4K display from an ultrabook or HTPC NUC box, and that’s not a bad thing either.

        • Airmantharp
        • 3 years ago

        That’s what I get out of this- ability to drive a 4k HDR screen from an ultrabook/SlowNUC.

      • the
      • 3 years ago

      I have a 15″ laptop with a UHD (3840 x 2160) display, and I can say yes, there is a difference in picture quality under normal use-cases. With movies in particular, it is more difficult to distinguish between UHD and HD resolutions, though I figure most of that is due to the relatively low bit rate of the 4K material I’ve sampled on this machine.

        • Thresher
        • 3 years ago

        The biggest problem is that there is so much legacy software out there that doesn’t scale correctly on high dpi screens. Heck, even some of the Windows 10 OS doesn’t use it well.

          • the
          • 3 years ago

          I fully agree. I have some software whose interface elements are absolutely tiny on the UHD display.

          It’s also hyper weird when moving windows between an external 72-to-96 dpi monitor and the laptop screen. It is hit or miss whether it scales or is simply a 1:1 pixel transfer on Windows 10.

      • Voldenuit
      • 3 years ago

      Hell yeah you can. I have a 3200×1600 screen on my 13.3″ Yoga 3 Pro, and 15″ 1080p looks blocky to me now.

      • funko
      • 3 years ago

      Fairly easily for photos, but it’s not only the resolution upgrade; it’s also the color and audio improvements that 4K videos and 4K displays tend to bring with them.

      • EndlessWaves
      • 3 years ago

      a 15″ laptop screen at 50cm is the same apparent size as an 80″ TV at 2.5m

      The difference is immediately noticeable at computer viewing distances.
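
      The arithmetic behind that comparison checks out, since apparent size scales with physical size over viewing distance:

        # Apparent (angular) size is roughly proportional to size / distance.
        laptop = 15 / 0.5   # 15" screen at 50 cm
        tv = 80 / 2.5       # 80" TV at 2.5 m
        print(laptop, tv)   # 30.0 vs 32.0 -> nearly the same apparent size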

      • rudimentary_lathe
      • 3 years ago

      I too think 4K at 15″ isn’t worth it, but hardware support does seem to result in a huge power delta.

      • Generic
      • 3 years ago

      They’re not optimizing for a 15″ laptop. They’re optimizing for 4K mobile.

      Moving from 10 watts to 1/2 a watt is hard to scoff at even if it’s a cherry picked example.

      Now you might argue that the screen itself is the battery killer, but that’s not for Intel to worry about.

    • derFunkenstein
    • 3 years ago

    [quote<]We'll also need to wait until the New Year to see desktop parts benefiting from the move to Intel's 14-nm "Plus" process, as well.[/quote<] Because only in 2017 does Intel get some sort of outside motivation to refresh its desktop lineup. Given the complete lack of changes to CPU architecture, that's fine. Can't move the goalposts too much between now and when Zen launches.

      • the
      • 3 years ago

        Well, from the sound of things, Intel isn’t bringing any CPU refinements other than improved clock speeds. [i<]If[/i<] AMD’s Zen can get within striking distance of Skylake/Kaby Lake’s IPC, then Intel can continue to outrun AMD by boosting clock speeds.

        • derFunkenstein
        • 3 years ago

        Oh, I’m sure they will, but they don’t really have any outside market forces pushing that right now. And they won’t have any outside pressure unless Zen is as competitive as AMD makes it out to be.

      • Klimax
      • 3 years ago

      Trouble is, we are already hitting limits imposed by general code. (A massive number of branches means that IPC is limited by the average number of instructions between branches.) And most of the remaining options are quite power-wasting.

      AMD is irrelevant. It doesn’t matter if AMD exists or not. If there were an option for a bigger increase in performance, Intel would use it. Monopoly or not, it’s irrelevant. And it should be very obvious why that is so.

      BTW: GPU market is different beast…

        • Pwnstar
        • 3 years ago

        ZEN could change that. Maybe.

        • Akyute
        • 3 years ago

        I think you are ignoring some basic economics.
        Intel has a monopoly in many/most CPU markets. Demand for computers, and thus CPUs, is rather inelastic.
        Thus, Intel has pricing power over these markets. Monopolies are expected to choose the price which maximizes their revenue minus the total cost. This results in the highest price per unit to the consumer.

        Evidence? We can perhaps follow the Intel Extreme CPUs, where Intel has had pricing power for a long time and where elasticity is low. Within 10 years, the price has risen from $999 (Core 2 QX6850) to $1700+ (i7-6950X). This certainly exceeds inflation, but I lack the semiconductor knowledge to judge factors such as increased fab cost.

        Nonetheless, the principle remains. AMD may be able to prove it can supply some markets, thus creating lower prices for equal demand. AMD is relevant to price and availability, though perhaps as you may have meant, does not significantly influence the development of envelope pushing performance CPU designs. I cannot speak to the limitations of low level compilers and uops.
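
        A quick sanity check on those Extreme Edition numbers, treating it as a roughly 10-year span and using the $1,700 figure:

          # Implied annual price growth from $999 (QX6850) to ~$1,700 (i7-6950X).
          start, end, years = 999, 1700, 10
          cagr = (end / start) ** (1 / years) - 1
          print(f"~{cagr:.1%} per year")  # ~5.5%/yr, well above the ~2% CPI average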

      • Anonymous Coward
      • 3 years ago

      On the server side, have you been paying attention to IBM? Power8, Power9… apparently they have actually taken over some areas that were once held by Xeon at, for example, Google.

    • tipoo
    • 3 years ago

    Intel’s die shots show the telltale signs of hand-tuned layouts. Ironically, fully computer-generated ones look more organic/mossy.

    • lycium
    • 3 years ago

    More like “Intel’s Kaby Lake GPUs revealed”, right?

    As an OpenCL programmer I’m actually quite bullish on those powerful Gen cores for parallel processing, but some part of me misses the days of x86 cores and clock speed scaling…

      • tipoo
      • 3 years ago

      The compute capacity in them is admirable. Here’s an Iris Pro beating a 750M. I wonder how gen 9 vs Pascal will shake out on mobile part compute.

      [url<]http://cdn.arstechnica.net/wp-content/uploads/2015/06/2015-Retina-MBP.010.png[/url<]

      I guess they went into "we're not letting this APU thing get ahead of us" mode and hence bolted decent GPUs onto every consumer CPU. I hope DX12/Vulkan allow the IGPs to do something useful even when you have a dedicated GPU. Say offload physics or something, since their compute potential is good for the die area.

      • Wirko
      • 3 years ago

      You must be right, the CPU cores look like some kind of … peripheral logic needed by the GPU part.

        • tipoo
        • 3 years ago

        I’m not saying it would be optimal, but I want a what-if machine to see what life would be like if every transistor for the GPU was still being used to further per-core performance. They’d probably hit diminishing returns, but I should think it would still land fairly higher than today.

          • bhtooefr
          • 3 years ago

          There’s always widening the cores, and doing speculative execution to keep the new pipes full.

          That’s a good way to bust out of your power budget, but with enough branches in flight, you could do some thoroughly astounding single-thread compute.

    • chuckula
    • 3 years ago

    [quote<]At least from a gamer's perspective, it's hard not to be disappointed by that fact. FreeSync monitors are becoming more and more widely available, and Intel is clearly thinking about the low-end gaming market to some degree.[/quote<]

    Yeah but think about it this way: most of these new FreeSync monitors seem to require a minimum of 48 FPS before FreeSync kicks in. Let's be real here, how often is Krabby Lake actually going to be able to go fast enough to meet the minimum FreeSync frame rate in the first place?

      • anotherengineer
      • 3 years ago

      That’s not to say VESA Adaptive-Sync couldn’t come all the way down to 24Hz or less in future revisions, though.

      • tipoo
      • 3 years ago

      The 72-EU + eDRAM Iris Pro has a chance… The Iris Pro 5200 from Haswell, with 40 EUs, could handle most AAA games at 1440×900 low or 720p/med-high. AT’s Skull Canyon NUC review shows its potential, but unbound by thermals in a desktop it could do even better.

      • the
      • 3 years ago

      Already pointed out but the GT3e/GT4e do have some impressive GPU power for their power consumption.

      Sadly, Intel previously let it slip that Kaby Lake was not going to support DP 1.3/1.4 this round. FreeSync support is likely coming when DP 1.3/1.4 gets added to Cannon Lake sometime next year.

        • tipoo
        • 3 years ago

        On the Larrabee blog, they said the Gen graphics guys wanted to do a dedicated GPU but Intel management isn’t letting them… let them free!

        [url<]https://tomforsyth1000.github.io/blog.wiki.html[/url<]

      • ImSpartacus
      • 3 years ago

      Krabby Lake? Hmmm, I think I like it.

        • tipoo
        • 3 years ago

        I think Krabby Patty every time I see Kabby Lake

        • lycium
        • 3 years ago

        Baby Cake

          • tipoo
          • 3 years ago

          [url<]https://www.youtube.com/watch?v=cm_iv_lUh6Y[/url<]

      • tsk
      • 3 years ago

      Many of these monitors use LFC (http://www.amd.com/Documents/freesync-lfc.pdf), so it will still be smooth under the FreeSync range (the GPU needs to do this, though). Most of the monitors have a minimum FreeSync range of 40Hz, but some go down to 30 too.
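
      In a nutshell, LFC just repeats frames until the effective refresh rate lands back inside the panel’s variable range. A rough sketch in Python, with a hypothetical 40-144Hz range:

        # Rough sketch of Low Framerate Compensation (LFC).
        # The 40-144 Hz range here is hypothetical, not from AMD's doc.
        V_MIN, V_MAX = 40, 144

        def lfc_refresh(fps):
            """Repeat each frame until the refresh lands in [V_MIN, V_MAX]."""
            m = 1
            while fps * m < V_MIN:
                m += 1
            return m, min(fps * m, V_MAX)

        for fps in (25, 35, 60):
            m, hz = lfc_refresh(fps)
            print(f"{fps} fps -> show each frame {m}x, panel runs at ~{hz} Hz")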

      • EndlessWaves
      • 3 years ago

      Most?

      I’d be surprised if even half were that high, and most of those are 144Hz models, so LFC still gives a lot of the benefits below 48fps.

      True, a lot of them are stuck at 35-40fps and even the best haven’t gone below 30fps yet but every little helps.

      • Airmantharp
      • 3 years ago

      An internal display would take such things into account- and let’s say they get it down to 24Hz (so 24-60Hz), that would still be incredibly useful for low-rent games (LoL, TF2) *and* for video streaming.

    • chuckula
    • 3 years ago

    [quote<]Moore's Law may not be dead, but it is changing.[/quote<] It's not dead yet! It's just resting after being shagged out following a prolonged squawk. It wants to go for a walk!

      • tipoo
      • 3 years ago

      It’s dead, Jim.

        • ronch
        • 3 years ago

        Yeah. Stick a fork in it.

          • the
          • 3 years ago

          Until quantum computing arrives where it can be both alive and dead simultaneously!

            • tipoo
            • 3 years ago

            [url<]http://www.smbc-comics.com/comics/20120218.gif[/url<]

            • ronch
            • 3 years ago

            But that would make things a teeny weeny bit more confusing.

            Should I call the taxidermist or not?

            • K-L-Waster
            • 3 years ago

            Call the taxidermist and cancel the appointment — assuming you have not yet made the appointment of course.

            • ronch
            • 3 years ago

             Ok, I get it. Moore’s Law is not dead, but neither is it alive. That makes it a Zombie.

            • Gastec
            • 3 years ago

            UNDEAD to the very quantum core! What Schrödinger always tried to explain to the masses of would-be zombies.

      • lycium
      • 3 years ago

      It’s cheating on Amdahl’s law with Murphy’s law.

      “We were on a break!”

      • Aquilino
      • 3 years ago

      It’s surely changing, but not that law:
      [url<]http://blog.dshr.org/2016/07/end-of-moores-law.html[/url<]

      • Pancake
      • 3 years ago

      E’s pining for the fjords.
