Rumor: Unlocked Core i3 CPU will be late to the Kaby Lake party

Remember that Intel Core i3-7350K that the rumor mill was talking about a while back? According to a post on Japanese tech news site Hermitage Akihabara, if that CPU exists, it won't be joining the rest of Intel's seventh-generation processors at launch time. The site says that shipments of the new chips have already arrived in shops, but that certain models, including the 7350K, are absent from the lineup.

The site seems quite confident that desktop Kaby Lake CPUs will go on sale next Friday, January 6. Earlier rumors pegged the Core i3-7350K at around $175, but Hermitage Akihabara lists the chip at ¥25,000 (about $214). However, the site also notes that the prices it lists are higher than expected due to fluctuations in the exchange rate of the Japanese yen.

We're enthusiastic about the prospect of an unlocked Core i3 processor here at TR. We've long felt that higher clocks are superior to more cores for most scenarios, and we keep recommending the Core i3-6100 in our System Guides as the budget CPU to have. However, the purportedly high price of the Core i3-7350K makes the rumored chip look a little silly when a Core i5-6600K can be had for just a few dollars more.

Comments closed
    • NovusBogus
    • 3 years ago

    A super fast unlocked dual w/HT would be a very interesting challenge to the “go quad for future-proofing” philosophy. Price is questionable but maybe it’s an overseas thing.

      • ImSpartacus
      • 3 years ago

      You mean the future proofing that we all were supposed to do a decade ago, right?

    • Tristan
    • 3 years ago

    Best CPU for average users, as most software still uses a single thread most of the time.

    • Kougar
    • 3 years ago

    Is it weird or annoying to anyone else to have each “generation” of Intel chips launching three different times across a 1-2 year range? Always keeping it in the news and in the general public’s periphery to the point that when it eventually launches on the HEDT platform it feels like an afterthought.

    • JosiahBradley
    • 3 years ago

    Why would someone throw a party for [s<] Skylake [/s<] Kaby Lake, is it its birthday already?

    • Chrispy_
    • 3 years ago

    With even high-end laptops (ultrabooks) keeping dual-core with HT alive and well, the 7350K is going to be much more relevant than the old Pentium Anniversary Edition.

    As cool as the Pentium AE was, the 2-thread limitation became a bit of a PITA not long after it was released. 4 threads is the standard now, whether that’s an Atom, an i3 or i5, or even an AMD APU – nobody is making 2C2T for anything other than the most budget netbooks and tablets.

    The only disheartening thing is the price, which is so close to the K-series i5 that it’s almost insulting. The i5 offers way more cache and a dedicated core for one thread is better than a shared SMT core whichever way you spin it.

      • EndlessWaves
      • 3 years ago

      [quote<]nobody is making 2C2T for anything other than the most budget netbooks and tablets.[/quote<] 2/5s of Intel's entire desktop lineup is 2C2T, or 1/3 if you count the HEDT chips too.

        • Chrispy_
        • 3 years ago

        I was talking about vendors rather than Intel themselves.

        Once you dip below i3 you don’t typically see the socketed Skylake Pentiums anywhere on sale at all. If vendors want lower TDP they move to 2C4T Core M or Pentium 4405Y. If they want cheaper than i3 they move to a Braswell or Cherry Trail option which is mostly 4C4T in the things you see on shelves.

        2C2T just doesn’t perform well enough for the multitasking, background-services operating systems people use these days.
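A quick way to check what a given box actually is, core-count-wise, is sketched below. This is an illustration rather than anything from the thread: the `core_topology` helper is a made-up name, the physical-core count is parsed from `/proc/cpuinfo` (so it's Linux-specific), and the code falls back to the logical count elsewhere.

```python
import os

def core_topology():
    """Return (physical_cores, logical_cpus).

    The logical count comes from os.cpu_count(); the physical count is
    parsed from /proc/cpuinfo on Linux and falls back to the logical
    count on other platforms.
    """
    logical = os.cpu_count() or 1
    physical = logical
    try:
        pairs = set()           # unique (physical id, core id) pairs
        phys_id = core_id = None
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("physical id"):
                    phys_id = line.split(":", 1)[1].strip()
                elif line.startswith("core id"):
                    core_id = line.split(":", 1)[1].strip()
                elif not line.strip():
                    if phys_id is not None and core_id is not None:
                        pairs.add((phys_id, core_id))
                    phys_id = core_id = None
        if phys_id is not None and core_id is not None:
            pairs.add((phys_id, core_id))  # file may not end in a blank line
        if pairs:
            physical = len(pairs)
    except OSError:
        pass
    return physical, logical

phys, logi = core_topology()
tag = "with SMT/Hyper-Threading" if logi > phys else "no SMT"
print(f"{phys}C{logi}T ({tag})")
```

On a chip like the rumored i3-7350K this would report 2C4T; on the old 2C2T Celerons being discussed, 2C2T.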

          • EndlessWaves
          • 3 years ago

          Excluding the stick PCs, the CPUs in desktops on the first page of listings for the popular UK site Ebuyer consist of:

          Celeron 1037U x4
          Celeron N3050 x2
          A6-7400k x1
          A4-5000 x1
          Pentium G4400 x2

          All but the A4-5000 are 2 cores without SMT, or single module.

          If we go one page further we get:

          N3050 x4
          N3700 x1
          A8-7600 x1
          i3-6100 x1
          A6-7400k x4
          1037U x1
          A6-6400B x1
          G4400 x2

          The 3700, 7600 and 6100 are HT Duals or Quads, the other twelve are all dual cores or single modules.

          I don’t know what the situation is where you are, but they’re still very widespread among the budget PCs offered for sale here.

            • Chrispy_
            • 3 years ago

            If you sort by price on a huge etailer like that, of course you’ll get the bottom of the barrel 2C2T or even 1M2T options. If you sort by relevancy or go to an actual store like PCWorld, Currys, Tesco, Sainsburys, Argos etc – they’re mostly as I describe.

            Yes, I still see the odd 2C2T celeron netbook but those tend to be old models on clearance.

    • DoomGuy64
    • 3 years ago

    Who buys dual cores in 2017? Who sells dual cores in 2017? What a rip, and overpriced. Ooo, it’s unlocked. Big whoop.

    How about an unlocked and full feature enabled i7 for $250? There’s a deal.

      • ImSpartacus
      • 3 years ago

      The fact that we’re still having this “debate” literally a decade after the [url=http://www.anandtech.com/show/2303<]65nm quad G0 Q6600s started competing with the 45nm dual E8400s[/url<] tells me that it's not obvious that lower clocked quads are superior. Tons of people claimed that a quad like the Q6600 would be much better "futureproofed" than a higher clock dual like the E8400. But after ten years, we're still hearing the same arguments.

        • Antimatter
        • 3 years ago

        Back then CPUs lacked turbo clocks. Today’s quads kinda get the best of both worlds.

          • ImSpartacus
          • 3 years ago

          And today’s duals have hyperthreading, so they also kinda get the best of bo-oh goddamn it, see? I’m doing this stupid argument again.

        • jihadjoe
        • 3 years ago

        And they were right. Overclocked to 3.3GHz, my Q6600 is still quite useful today.

          • ImSpartacus
          • 3 years ago

          3.3GHz was a respectable overclock, even for a G0 Q6600. It’s roughly equivalent to pushing an E8400 to 4GHz (or slightly beyond), which would still be pretty potent today.

          Most Q6600s would’ve been happy to get to 3GHz and they probably would’ve had to dial back a bit if they were still active today.

            • jihadjoe
            • 3 years ago

            I was pretty lucky playing the silicon lottery that generation. My first Q6600 was a B3 from when Intel first dropped the price to around $260, and it made it to 3GHz on stock volts. It would actually do 3.2 (games and normal stuff), but would eventually crash under thermal load when encoding, folding or under Prime95.

            Edit: It seems some Gerbils [url=https://techreport.com/news/15960/core-2-quad-q6600-to-come-down-in-price-retire<]did even better[/url<], reporting 3.6 and 3.7GHz from the B3 Q6600s on air.

            • ImSpartacus
            • 3 years ago

            Yeah, but the dick measuring back-and-forth and lack of details make me wonder how many volts were running through those chips (or if it was even legit to begin with).

            We all know that a 24/7 oc is going to be different from a temporary suicide run.

            And if that 24/7 oc needs to keep chugging for close to a decade, then you can’t go crazy.

            • Waco
            • 3 years ago

            I had mine at 3.6 for a few days, but I ended up blowing the VRMs off the board during a stress test. The best part? DFI still existed, and they replaced the board since it was supposed to shut down under thermal/current overload, not explode into a gout of flame out the side of my case.

            The board was the only thing to die, and that Q6600 still lives on at 3.2 GHz in a buddy’s build. 😛

        • slowriot
        • 3 years ago

        Uh nah dude, this is pretty much entirely that a dual core chip is still cheaper for Intel to produce than a quad core one.

      • Anonymous Coward
      • 3 years ago

      I want three cores, six threads.

        • Farting Bob
        • 3 years ago

        AMD did a 3 core CPU a few years back.

          • JustAnEngineer
          • 3 years ago

          My family gave away the [url=https://techreport.com/review/16382/amd-socket-am3-phenom-ii-processors<]Phenom II X3 720 Black Edition[/url<] this year. It was running Windows 10 satisfactorily with an [url=https://www.asus.com/Motherboards/M3A78EM/<]M3A78-EM[/url<], 8 GiB PC2-6400, GeForce GTX460 and five hard drives in a Three Hundred Illusion case.

        • DoomGuy64
        • 3 years ago

        This. If you’re going for acceptable mid-range, 3 cores/6 threads with 16GB of RAM is the 4GB GPU of 2017.

        Software is so bloated today that it is point blank unacceptable to have less than this. You need 16 GB of ram for a decent computer today, not because games need it, but because software bloat eats your memory and you need the extra ram to avoid constant disk swapping.

        It’s not that dual cores and 8GB can’t cut it today in performance, because they can. But only if you know how to disable and control all the worthless background services to compensate for it.

        Web browsers like Chrome are exactly why we need triple cores and 16 GB of ram. They waste ram, and unnecessarily load up processes for every single webpage opened. If FF stuck with their old methods for the foreseeable future, dual cores and 8 GB might cut it for low end, but even they are going multi-process, and have always had memory leaks, which is impossible to control outside of loading extensions like noscript and adblock. The average user therefore needs higher specs to compensate for today’s lazy programming and extreme bloat.

      • chuckula
      • 3 years ago

      Considering how AMD’s APUs perform: Who buys an AMD APU in 2017?

        • JustAnEngineer
        • 3 years ago

        It is surprising how many games run quite decently at 1080p on AMD’s APUs. If you compare the size and price of a mini-PC built with one of these vs. an Intel CPU + discrete GPU, the AMD APU can offer a compelling value. The latest twitch shooter may not run at full detail at 120+ fps, but games like Guild Wars 2 run well on an A10-7860K.

        There is a fairly small niche between no usable 3D gaming (Intel Core i3-6100 with integrated graphics) and full-on gaming (Intel i3-6100 or i5-6600K CPU plus discrete GPU), but even with your anti-AMD bias, you must admit that this is a price+performance niche that AMD has filled that Intel has ignored.

          • Anonymous Coward
          • 3 years ago

          I look forward to a Zen-based product filling that niche with a vengeance. Should be an ideal application, much better than top-of-the-range chest-beating contests.

          • DoomGuy64
          • 3 years ago

          The only issue I have with AMD’s APUs is that they clearly can make better products as seen in the consoles, yet choose not to make these performance tiers available to the public. AMD would clean house if they did, especially in the SFF market.

          edit: fixed double negative. whoops.

            • f0d
            • 3 years ago

            yeah they could have done so much better
            if they made a CPU for the desktop similar to the consoles’ but replaced the 8 Bobcat cores with the 4 current ones in the APUs, i would grab one for a mini portable computer instantly

            • Anonymous Coward
            • 3 years ago

            They must have certainly considered their options regarding the integration of larger GPUs with their various Bulldozer-type CPUs. I can only speculate what their reasoning was, but probably they concluded the volumes and margins were too small. The inefficient CPU cores also point to a wattage issue which reduces the appeal of the concept. Zen looks like it will help in that department.

            • ImSpartacus
            • 3 years ago

            Just like Intel did with the R series and C series?

            But oh wait, they didn’t and now it’s like pulling teeth to get them to put a bigger gpu in their offerings.

            I figure AMD might be a little more successful with raven ridge, but not that much more successful.

          • f0d
          • 3 years ago

          you can actually play a lot of games with Intel integrated graphics – Intel graphics isn’t as bad as you make it out to be

      • Krogoth
      • 3 years ago

      It is because of simple economics.

      Dual-core chips are cheaper to fab and consume less power than larger pieces of silicon, which makes them ideal for ultra-portable platforms.

        • Anonymous Coward
        • 3 years ago

        Cost savings are much smaller than 2/4 would suggest, considering packaging, the GPU and other standard components. I prefer to point to market segmentation.

        • ImSpartacus
        • 3 years ago

        So those extra cores take up that much space?

        Modern CPUs have reached SOC-levels of integration, so the cpu cores are becoming a smaller chunk of the overall silicon budget.

          • Krogoth
          • 3 years ago

          As far as yields are concerned, it does make a significant difference. Fabricating massive pieces of silicon has never been cheap to do.

            • DoomGuy64
            • 3 years ago

            Meanwhile, cellphones have been making 8 cores standard, with even the cheap knockoffs moving away from dual core. Keep this up, and people won’t even bother buying PCs, especially with the integrated graphics. Yet another point where phones have one-upped the PC.

            I don’t buy the excuses for dual core for even a second. The only plausible reason I can think of is massive profiteering and deliberate planned obsolescence. Selling garbage for the sake of causing a need to upgrade immediately after purchase, because the baseline performance is so terrible.

            One of the more obvious examples of this profiteering is SFF Mini-ITX boxes. If anything, SFF should be massively cheaper due to its size, but somehow costs MORE. The economic model of PC building is so broken that it appears to be reversed. Less costs more.

            I don’t think any of the current price gouging models are going to stay viable once Zen hits the market. There’s no way Intel is going to be able to keep selling dual cores at a premium at that point. The problem isn’t yields, it’s competition. Intel has been charging what it wants because there has been no viable alternative.

            • Krogoth
            • 3 years ago

            You are comparing two completely different CPU architectures and designs.

            Dual-core CPUs are bloody cheap in the Intel camp. The “premium” you see on some dual-cores actually comes from HT support (and it isn’t really that much of a premium). AMD equivalents are bloody cheap too. They take up the same space that budget-minded single-core CPUs used to occupy when dual-core x86 CPUs were considered high-end.

            • DoomGuy64
            • 3 years ago

            Doesn’t address my point about SFF, and the rest of your point is spinning Moore’s law. Everybody knows the PC ecosystem is rife with price gouging, especially in the pre-built sector with companies like Dell. We’ll see how long dual cores stay viable on the market once Zen hits. IMO, the only thing really saving dual core atm is HT and AMD’s complete lack of a competitive alternative.

            Quad core with HT will probably become the baseline once Zen hits. IF it performs up to the hype. Dual cores were great in 2005 with the X2, but in 2017 it’s nothing but profiteering from lack of competition, and market stagnation.

            • Krogoth
            • 3 years ago

            Ryzen is unlikely to change the landscape of the low-end desktop market unless AMD sells “quad-core” units at bargain-basement prices. It is more likely that they sell “rejects” binned as dual- and triple-core units, as AMD has done in the past with the Phenom I and II series.

            Dual-core chips aren’t going anywhere either. The portable markets favor them due to constraints on power consumption and thermal envelope.

            • Anonymous Coward
            • 3 years ago

            It will be interesting if AMD decides to not bother with dual cores at all. They can hit whatever thermal targets they want if they are sufficiently aggressive with turning down clock speeds. Would be a nice marketing point vs Intel. Intel would be in a tricky spot if they wanted to talk about higher clock speeds, because there isn’t much room there between i3 and i7. AMD’s position would be easier to explain in one bullet point.

    • Bauxite
    • 3 years ago

    Kaby Lake desktop is on track to be as underwhelming/non-event as Broadwell.

    Maybe worse because those socketed ‘C’ model cpus with the cache were at least very interesting even though they were neutered a bit much on TDP etc.

      • ImSpartacus
      • 3 years ago

      I think the better analogy is Haswell Refresh.

      Broadwell at least had new chips. Kaby Lake feels very close to just being a Skylake Refresh (at least on the desktop).

      Just like Haswell Refresh, the most interesting thing is an unlocked dual core option.

      • Chrispy_
      • 3 years ago

      Remember, Intel have moved from Tick/Tock to Tick/Tock/[s<]Optimise[/s<]'Meh'

      Skylake was the Tock which updated the architecture, so Kaby Lake is the first of three totally unexciting [b<]rebranding exercises[/b<] that is barely more than a label swap on the Skylake parts we've had since August 2015.

      Coffee Lake comes after that (same architecture, same process node) so that's the [b<]re-rebrand[/b<].

      Finally a new Tick comes with Cannonlake, when Skylake gets copy-pasted onto the 10nm manufacturing process. It might run hotter or cooler, faster or slower, but it's still just Skylake put through the shrink-ray, so it's a [b<]mini re-re-rebrand[/b<] where you get to beta-test Intel's new 10nm process for them.

      What you're waiting for (a new microarchitecture with performance different to Skylake) is Icelake. It's about 2 years away on the current roadmap, 2H 2018 (maybe closer to 18 months if Intel have to react to Zen being not-a-disappointment). Expect that to be about as underwhelming as Haswell > Skylake was!

        • Michaelzehr
        • 3 years ago

        So I’m going to have to keep straight kaby/coffee lake in my head? I already sometimes confuse maxwell/haswell… I’m going to have images of instant maxwell coffee at some point. Any chance we can go back to version numbers or something with an intuitive order to it? (At least google’s sweets are alphabetical.)

        Siiiigh… I’m getting old.

        [For anyone into the cognitive analysis aspect of things, my brain tends to organize things by how they sound. It’s useful for making bad puns, but I’ve had a difficult time keeping all the architecture names straight the last few years.]

          • Chrispy_
          • 3 years ago

          Personally I love codenames. Intel’s numbering system has actually been really consistent since Sandy Bridge but calling something a “Haswell i7 quad” is easier to process for me than a “4700-series”.

          2000 Sandy Bridge
          3000 Ivy Bridge
          4000 Haswell
          5000 Broadwell
          6000 Skylake
          7000 Kaby Lake
          8000 Coffee Lake
          9000 Cannonlake <Insert “Over 9000” joke here>

          …and then Intel will need a new naming scheme for Ice Lake and Tiger Lake in 2019, otherwise we’ll have the i7-eleventy-seven-hundred-K and the word eleventy isn’t likely to be one that you can use in a boardroom at Intel.
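The generation→codename list above is regular enough to put in a small lookup table. As a sketch (the `codename` helper and the table are inventions for illustration, following the commenter's list, including the 8000/9000 entries, which were still speculation at the time):

```python
import re

# First digit of the 4-digit SKU → architecture codename,
# per the generations listed above.
CODENAMES = {
    2: "Sandy Bridge",
    3: "Ivy Bridge",
    4: "Haswell",
    5: "Broadwell",
    6: "Skylake",
    7: "Kaby Lake",
    8: "Coffee Lake",
    9: "Cannonlake",
}

def codename(model: str) -> str:
    """Map a model string like 'i3-7350K' to its architecture codename."""
    m = re.search(r"-(\d)\d{3}", model)
    if not m:
        raise ValueError(f"unrecognized model number: {model!r}")
    return CODENAMES.get(int(m.group(1)), "unknown")

print(codename("i3-7350K"))  # → Kaby Lake
```

Which also shows why the scheme runs out of road after 9000: a fifth digit breaks the pattern entirely.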

    • 1sh
    • 3 years ago

    What if Intel released a 5.0ghz core i3…

      • Prion
      • 3 years ago

      It would probably be THE processor to have for MAME or any (non-HLE) console/PC emulators.

        • evilpaul
        • 3 years ago

        Can’t a largish potato run MAME?

          • derFunkenstein
          • 3 years ago

          “Run” yes, but those “newer” arcade boards like Sega NAOMI2 and whatever it is that uses the PS2 architecture need some beef.

            • the
            • 3 years ago

            NAOMI2 is actually based off of the same architecture as the Dreamcast, just with higher clocks and more memory. The Aurora board is the successor to this lineage.

            Sega’s other big arcade board of that time frame was Triforce, which was based upon the Nintendo GameCube.

            Sega never touched the PS2 architecture, for good reason: the PS2 design is a bit insane from a programming standpoint. Sony announced, but I don’t think ever shipped, a version of the Emotion Engine with 32 MB of eDRAM for arcade boards.

            • derFunkenstein
            • 3 years ago

            I didn’t mean that Sega released a PS2-based board, just that those designs exist.

            edit: not System 573, that’s PS1 based. There’s a list here: [url<]http://www.system16.com/base.php#3[/url<]

            • christos_thski
            • 3 years ago

            Ummm, I downvoted you by mistake, sorry about that. Have a habit of highlighting what I’m reading and the lousy mouse skipped while I was clickety-clicking. 🙁

        • faramir
        • 3 years ago

        I run MAME perfectly fine on a $20 Cortex A7 board with 1 GB of RAM (OrangePI PC + RetroOrangePie).

        Double Dragon … 🙂

    • derFunkenstein
    • 3 years ago

    I agree with you, Zak. Even the rumored $175 price is a little silly. It seems to be Intel’s thing, though. A Core i3-6320 will set you back $160 at Newegg, so $15 more for unlocking isn’t a huge problem. The bigger problem is that the whole line is kind of pricy, mostly due to the lack of competition. $215 is flat-out silly, though, like you said.

    Zen-based Socket AM4 APUs—as opposed to the warmed-over Chorizo currently out in the wild—cannot get here soon enough.

      • clocks
      • 3 years ago

      “Zen-based Socket AM4 APUs—as opposed to the warmed-over Chorizo currently out in the wild—cannot get here soon enough.”

      Correct me if I am wrong, but my understanding is the Ryzen cpus released in Q1 have no IGP, thus making a video card mandatory?

        • MOSFET
        • 3 years ago

        That may be true, but derFunk’s statement is still correct.

        • derFunkenstein
        • 3 years ago

        You’re correct but the plan eventually is to have Zen-based APUs.
