Intel releases six more 9th-generation CPUs, most without graphics

At its CES 2019 keynote, Intel moved very fast. To be fair, the company had a lot to show—or at least, a lot to talk about. One of the first announcements that Intel SVP Gregory Bryant sped through was that the company had just launched six new Core-series desktop CPUs. See for yourself:

New 9th-gen Intel Core CPUs

| CPU | Base clock speed | Peak single-core turbo | Cores / threads | TDP | Last-level cache | Memory support | Integrated graphics |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Core i3-9350KF | 4.00 GHz | 4.60 GHz | 4 / 4 | 91 W | 8 MB | DDR4-2400 | — |
| Core i5-9400F | 2.90 GHz | 4.10 GHz | 6 / 6 | 65 W | 9 MB | DDR4-2666 | — |
| Core i5-9400 | 2.90 GHz | 4.10 GHz | 6 / 6 | 65 W | 9 MB | DDR4-2666 | UHD 630 |
| Core i5-9600KF | 3.70 GHz | 4.60 GHz | 6 / 6 | 95 W | 9 MB | DDR4-2666 | — |
| Core i7-9700KF | 3.60 GHz | 4.90 GHz | 8 / 8 | 95 W | 12 MB | DDR4-2666 | — |
| Core i9-9900KF | 3.60 GHz | 5.00 GHz | 8 / 16 | 95 W | 16 MB | DDR4-2666 | — |

If you're wondering what the heck an "F" CPU is, don't feel bad—we didn't know either. As it turns out, the "F" designation indicates a processor with its integrated graphics disabled. The graphics-disabled versions of the Core i9-9900K, Core i7-9700K, and Core i5-9600K are nearly identical to their non-"F" variants aside from the missing graphics capability; the only other difference is missing support for Intel's Transactional Synchronization Extensions (TSX-NI).
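Since the "F" parts also drop TSX-NI, software that wants to use hardware transactional memory has to check for it at runtime rather than assume it from the model name. Below is a minimal sketch of one way to do that on Linux, by looking for the `rtm` flag (Restricted Transactional Memory, the core TSX feature) in `/proc/cpuinfo`. The `has_tsx` helper name is our own invention, not anything from Intel's tooling:

```python
def has_tsx(cpuinfo_path="/proc/cpuinfo"):
    """Return True if the CPU advertises TSX's RTM feature (Linux only).

    Scans the kernel-reported feature flags; returns False when the
    file is unreadable or the flag is absent (e.g. on "F" parts).
    """
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    # The flags line looks like: "flags : fpu vme ... rtm ..."
                    return "rtm" in line.split(":", 1)[1].split()
    except OSError:
        pass
    return False


if __name__ == "__main__":
    print("TSX (RTM) supported:", has_tsx())
```

On other platforms the same check would be done via the CPUID instruction (leaf 7, EBX bit 11 for RTM); the `/proc/cpuinfo` route is just the simplest portable-Python approximation on Linux.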

The actually new CPUs here are the Core i5-9400, Core i5-9400F, and Core i3-9350KF. The two Core i5s are the first multiplier-locked 9th-generation CPUs we've seen. As expected, they lose a fair bit of clock rate compared to the Core i5-9600K, but they also get to chop 30 watts off their TDP in exchange. Meanwhile, the Core i3-9350KF is a hot-clocked quad-core that seems to be aimed directly at enthusiasts and overclockers.

The decision to release a bunch of CPUs without integrated graphics is a curious one. It would be easy to point to the success of AMD's Ryzen processors—most of which lack integrated graphics—and suggest that Intel is simply playing "monkey see, monkey do," but that seems out of character for a company that traditionally has been a trend-setter, not a follower. Certainly there's little benefit to the end user in buying a CPU without graphics.

Intel says the new chips will be available starting this month, and that more processors—including additional H-series chips for laptops—will follow in Q2 of this year.

Comments closed
    • maroon1
    • 7 months ago

    So, core i3 is now going to have turbo boost ?!

    • WhatMeWorry
    • 7 months ago

    Seems a little ironical. Intel hires Raja Koduri and announces a big push into graphics and now this?

    • notfred
    • 7 months ago

    Interesting that hyperthreading is gone from everything apart from the very top Core i9-9900

    I wonder if this is in response to all the Spectre stuff where some of the attacks require SMT? Probably left on the enthusiast parts for “MOAR!!!”

      • Krogoth
      • 7 months ago

It is partly that and partly because of thermal budget. Coffee Lake R chips are toasty SOBs even with HT disabled when fully loaded. Intel also doesn’t want to completely cannibalize its existing Skylake-X LCC SKUs.

      • Anonymous Coward
      • 7 months ago

      With so many cores, who needs SMT?

    • Tristan
    • 7 months ago

    all new CPU’s without graphics will be just 10$ cheaper, because this is CPU with integrated GPU, not GPU with integrated CPU.

      • f0d
      • 7 months ago

      have a look at the die shot here
      [url<]https://www.techpowerup.com/reviews/Intel/Core_i5_9600K/3.html[/url<] the gpu does take a fair chunk of the space in an intel cpu, seems like at least a third of a 6 core is gpu - more than $10 worth imo

        • Anonymous Coward
        • 7 months ago

        The 8-core no-GPU die would be smaller than the 6-core with-GPU die.

          • Krogoth
          • 7 months ago

          There isn’t any 8-core GPU-less die. It is just disabled and fused a.k.a dark silicon.

            • Anonymous Coward
            • 7 months ago

            True, but it [i<]would be[/i<] smaller if it existed. Imagine paying less for an 8-core than for a 6-core with a useless GPU, obvious choice for (I argue) any competent purchaser whether or not a OEM serves as a middleman.

            • Usacomp2k3
            • 7 months ago

            You have a really strong opinion on this don’t you 🙂

            • Anonymous Coward
            • 7 months ago

            Might just be bored with Intel’s sameness. So bland.

    • SuperBowlLIII
    • 7 months ago
      • Shobai
      • 7 months ago

Is there an appropriate hammer for this type of thing?

        • Captain Ned
        • 7 months ago

        Mais oui!!

          • Krogoth
          • 7 months ago

          Duke Nuked has graced us with his mighty presence. 😉

      • willmore
      • 7 months ago

      spam spam spam spam spam spam spam spam eggs bacon and spam

        • JustAnEngineer
        • 7 months ago

        [url<]https://www.youtube.com/watch?v=anwy2MPT5RE[/url<] [quote<]Have you got anything without SPAM in it?[/quote<]

          • willmore
          • 7 months ago

          It’s not got much spam in it!

    • rutra80
    • 7 months ago

    They prepare a shift towards dedicated GPUs to be released in 2020.

      • Krogoth
      • 7 months ago

      You couldn’t be more wrong. iGPUs is the future for mainstream platforms. Intel is trying to cash on the growing use of GPGPUs in certain markets with their upcoming discrete SKUs. They are also testing out their next generation iGPU platform in light of AMD RTG’s Navi.

    • DPete27
    • 7 months ago

    Just cut the IGP by 50-75% so I have a failsafe to fall back on if I need to troubleshoot my system without a dGPU and use that die space for the CPU cores.

      • Anonymous Coward
      • 7 months ago

      Now that you mention it, why did Intel keep the full GPU on those 8-core dies even in the face of capacity constraints? Nobody with a chip like that uses the iGPU for anything other than debugging.

        • nico1982
        • 7 months ago

        I’m sure Intel knows its customer demographic better than anyone of us can speculate. I guess that it is the best solution to support the widest products range – from server to mobile – with the smaller number of different die configurations.

        • Wirko
        • 7 months ago

        Now that *you* mention it: why did Intel never sell CPUs with iGPU disabled before, save for a few rare exceptions like i5-3350P? I suppose they never run out of partially defective chips, and due to iGPU surface area, defects should often be located there.

        Or maybe they manage to sell many of those chips as socket-1151 Xeons.

          • JustAnEngineer
          • 7 months ago

          I believe that what we’re seeing today is still spreading fallout from Intel’s 10 nm process development failure. They have been forced to scavenge every single CPU die that they can sell from their venerable capacity-constrained 14+++++++++++ nm process.

          • Anonymous Coward
          • 7 months ago

          I wonder if Intel has a warehouse somewhere full of dead-GPU dies from the past couple years. 🙂

        • psuedonymous
        • 7 months ago

        Because Intel sells vast amounts of desktop CPUs to businesses (via OEMs), and a comparatively teeny tiny trickle to enthusiasts. And those businesses want iGPUs.

        Think of it this way: once you add a dGPU, [i<]even if the cost of a GPU-less CPU plus the GPU is the same as an equivalent CPU with integrated graphics[/i<] the integrated CPU is preferable for a prebuilt system (99.999999% of business deployments. Nobody wants the headache of self-builds and self-support in volume). The SKU cost of the PC will be lower due to not having to muck about sourcing, shipping, installing, and supporting an additional component.

          • Anonymous Coward
          • 7 months ago

          I understand the argument for the mainstream chips all sporting GPUs, but who [i<]really[/i<] has an 8-core Intel chip without a discrete GPU plugged in? That doesn't strike me as a realistic concern. OEMs could, and would, and [i<]should[/i<], bundle a GPU in that price class.

            • Grahambo910
            • 7 months ago

            I have one. At work. As does every other engineer in my building. [i<]Really[/i<] I asked IT for a Quadro, as I do a fair bit of FEA and CFD, and even offered to install it myself. Apparently they don't want the "headache" of having someone on "non-standard" hardware.

            • Anonymous Coward
            • 7 months ago

            Weird. Sounds like you believe a Quadro would have been a better choice. Does the value of the integrated GPU in the 8-core product depend on customers making poor choices?

    • Krogoth
    • 7 months ago

i3-9350KF will probably be an overclocking king assuming it is a Coffee Lake R with 1/2 of the CPU and iGPU disabled (dark silicon).

    • Waco
    • 7 months ago

    It seems the segmentation game is still going at Intel.

      • Krogoth
      • 7 months ago

      This move was done to fight off Ryzen+ SKUs and keep enough product flowing. I’m kinda surprised that Intel hasn’t done it sooner.

        • Waco
        • 7 months ago

        Intel wonders why [insert feature or instruction set here] isn’t being widely adopted, it’s because they chop them out randomly on parts.

          • chuckula
          • 7 months ago

          Considering AMD hard-segments its entire chip line in silicon so that you literally can’t get integrated graphics with anything more than a quad-core part*, I think the rage is a little misplaced here.

          * Funny, I seem to remember that Intel was pure-evil for holding back the entire computing industry for not adding moar coars to consumer parts. Here we are with 6 and 8 core regular parts with IGPs and it gets knocked for making its chips more like AMD where you don’t even get the choice.

            • Waco
            • 7 months ago

            No rage here, just annoyance at marketing games.

            • Anonymous Coward
            • 7 months ago

            There is a major difference between disabling something that already exists, and not including something in the first place. You even go so far as suggesting this is AMD doing product segmentation. Ridiculous.

            That said, probably Intel is so pressed to sell every last chip that fusing off the occasional defective GPU and selling the rest at the highest possible price makes sense.

    • Krogoth
    • 7 months ago

    This is Intel trying to get more yields out of their strained 14nm production line. Coffee Lake Rs aren’t exactly easy to make. They are selling off units that have defective iGPUs but CPU is still good.

      • bhtooefr
      • 7 months ago

      But I thought 14 nm yields were just fine, and 14 nm wasn’t a disaster for Intel at all.

        • chuckula
        • 7 months ago

        [quote<]14 nm wasn't a disaster for Intel at all.[/quote<] 14nm IS APOCALYPTIC!! In other news, AMD would like to write itself into the DC universe so Apokolips can visit them too.

        • Krogoth
        • 7 months ago

The 14nm production line is strained from sheer demand and from the mishaps of the 10nm process, which should have taken over the desktop/laptop CPU space by now.

Yields aren’t 100% either. This is another way for Intel to reclaim silicon that is mostly working (bad iGPU) and get product into the channel.

        • derFunkenstein
        • 7 months ago

        One would hope that by now, three and a half years in, 14nm was working fine.

        Unlike Krogoth, I’m less convinced about just fusing off the graphics, but it’s going to take delidding to be sure. It’d be an interesting way to free up fab space, and the R&D is probably not a big deal for Intel.

        • psuedonymous
        • 7 months ago

        14nm yields are just dandy, but demand is exceeding supply. Even a small trickle of dies that can have the GPU fused off will sell like hotcakes. Intel would normally just not bother with plenty of fully working dies to go around, but this is no longer the case due to supply going chiefly to high margin enterprise products.

    • Noinoi
    • 7 months ago

    Press F to pay respects to the excised integrated graphics.

    I’ve noticed that the i3 seem to have gained Turbo Boost, too. The 8th Generation desktop i3s didn’t have any form of turbo boost.

    Also do wonder if the i3-KF is a soldered chip or not; and if i3s are natively quad-cores, or severely damaged octa/hexacore dies. Same for the non-K i5s; we do know that the i5-9600K is apparently also based on the octa-core Coffee Lake Refresh die, just with two outright dead cores.

    With regard to the lack of graphics on these processors, I have a feeling that it’s so that they can have more supply for a given kind of processor if they don’t have to care about the iGPU working. If it ends up slightly cheaper for us enthusiasts that’ll be adding a video card anyway, all the better.

      • BurntMyBacon
      • 7 months ago

      FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF

    • chuckula
    • 7 months ago

Yawn, wake me up when they finally disable the integrated memory controller and make you use a separate chip. That’s what I call innovation.

      • BobbinThreadbare
      • 7 months ago

      bring back slots, sockets are so lame

        • sreams
        • 7 months ago

        Bring back the pre-slot sockets. And SIPPs.

          • Wirko
          • 7 months ago

          Bring back the night, the OP could use some more sleep.

      • Krogoth
      • 7 months ago

      Let’s call the new chip a “Northbridge”. Pure genius!

    • f0d
    • 7 months ago

    would have been much better if the K and F designations were swapped around
    the Core i9 9900FK deserves a place in my computer

      • cmrcmk
      • 7 months ago

      If only they also had a low power ‘u’ variant of that chip…

        • tritonus
        • 7 months ago

        Still waiting for the non-IGP, ULP, socketed-CrystalWell, unlocked variant.

          • BurntMyBacon
          • 7 months ago

          I think there is a contradiction there, but I gave you the upvote anyways 😉
