Specs leak out for desktop Kabini processors

Up until CES, we had no clue that AMD would offer Kabini as a socketed desktop processor. But then we came across a desktop motherboard with a new, 721-pin socket primed for an unannounced line of Kabini chips, and AMD’s plan became (somewhat) clear.

This week, we’ve learned even more. A DigiTimes story posted this morning says that AMD will release several desktop Kabini APUs in February. The chips will be priced between $30 and $60, and AMD reportedly expects to ship 1.5 million of them this year. So claim DigiTimes’ sources at motherboard makers, at least.

Another story posted on Sunday by VR-Zone Chinese reveals purported specs for the desktop Kabini family. VR-Zone says there will be five chips, all with 25W power envelopes and integrated Radeon HD 8000-series graphics:

Processor  Cores  Clock speed  L2 cache  Radeon IGP  TDP
A6-5350    4      2.05GHz      2MB       HD 8400     25W
A4-5150    4      1.60GHz      2MB       HD 8400     25W
A4-3850    4      1.30GHz      2MB       HD 8280     25W
E1-2650    2      1.45GHz      1MB       HD 8240     25W

The DigiTimes story mentions the same five model numbers. However, it says the A4-5150 will be clocked at 1.05GHz, not 1.60GHz. It’s hard to tell which source is correct at this stage, especially since Kabini can be given different base and peak clock speeds with Turbo Core.

Another discrepancy between the two sources: while DigiTimes’ sources expect a February launch, VR-Zone believes the desktop Kabini lineup won’t be out until March.

Discrepancies aside, I think we’re getting close enough to the truth for comfort here. Desktop Kabini is coming, and it’s coming soon—possibly in five different flavors, the fastest of which will be clocked a smidgen higher than the speediest mobile model.

Comments closed
    • StashTheVampede
    • 6 years ago

    Whoa! Quad core home server, on the cheap, here I come!

    • ptsant
    • 6 years ago

    At the price, this will sell like hot cakes in price-sensitive markets (i.e., not the USA or EU). This is also the right response to the tablet craze for many casual users: cheap, good enough, and in convenient form factors with typical PC expandability.

    • DPete27
    • 6 years ago

    AMD's answer to Bay Trail Atom on the desktop. It'd be nice to see this comparison (and to other desktop CPUs). So far, all we know is that the A4-5400 trades blows with the i3-3317U mobile processor.

    • ronch
    • 6 years ago

    How come all these chips are rated for 25W? Either the top model is very efficient or the bottom model is freakin' inefficient.

      • Shobai
      • 6 years ago

      This is the way AMD has, for as long as I can remember, rated their parts. Someone correct me if I’m off the mark, the way to read that is to say “this part will not exceed yy W”, whereas Intel’s rating is more like “this part will typically peak at zz W”.

      • OneArmedScissor
      • 6 years ago

      Desktop parts have the same cooling. Mobile varies, depending on what it’s meant to be used for.

      As a cost cutting measure, it’s likely that AMD sets a very high default voltage for this entire line, rather than carefully binning them.

      Intel does the same thing with some Pentiums and Celerons, which may have ultra low voltage-like clock speeds, but full speed TDPs.

      It doesn’t mean they’re bad chips, but they have to hit every price point, for every type of device, and something has to give.

      • HisDivineOrder
      • 6 years ago

      Imagine one chip hits 25W most of the time while some hit it every once in a while.

      Yet each of them needs to say 25W just in case.

      • Anonymous Coward
      • 6 years ago

      I think you have overlooked a third possibility.

    • ronch
    • 6 years ago

    Welcome to the world of ‘fast enough’ computing.

      • HisDivineOrder
      • 6 years ago

      That’s what tablets and smartphones have been arguing for since the beginning. OEMs have taken too long to get high-resolution displays and SSDs into every PC sold, leaving room for tablets and smartphones to seem “quick enough.”

      Also, Microsoft dropped the ball: it failed to make Windows always-on, with no reboots/restarts (not even for updates to components included with Windows), and failed to make the OS scale all programs well on high-resolution displays at varying sizes and aspect ratios, as it should have.

      Put the two together, along with a Microsoft tax that makes Android devices cheaper than devices using any form of Windows, and a similar tax that Intel was charging until its management change.

      Bam. You have why “fast enough” seemed fast enough to most consumers. Now it’s too late to change the perception.

        • NeelyCam
        • 6 years ago

        Yeah – it’s like saying “Honda Civic is fast enough”

          • JustAnEngineer
          • 6 years ago

          The 2015 Civic Type R, with its 280+ hp engine, is probably more than fast enough for most of us.
          [url<]http://www.youtube.com/watch?v=aha-DQB3xDY[/url<]

            • NeelyCam
            • 6 years ago

            Either

            1) It’s gonna get cancelled because of yield issues,
            2) It overheats and the cooler is louder than the engine, or
            3) It gets downclocked to 140hp to stay within a 28mpg budget

          • peartart
          • 6 years ago

          When you’re stuck in traffic…

        • Klimax
        • 6 years ago

        Just a note: there is no such thing as “no restarts for certain updates,” ever. The kernel is a sufficiently complex thing that you want to touch it as little as possible while it’s running, no matter what OS you have.

        BTW: hotpatching has been present in Windows for a long time, but it was rarely used, because there was always another kernel update coming. (Seen it directly quoted; not sure of the current status.)

        And another thing: it seems the number of patches requiring a restart decreased in Windows 8.
        (Pretty sure that even smartphones/tablets need to be restarted for updates…)

          • madmilk
          • 6 years ago

          Check out [url=https://www.ksplice.com/<]ksplice[/url<] for Linux.

    • marraco
    • 6 years ago

    4 cores is so past-decade…

    If they removed the integrated graphics, they would free up a lot of wasted power, which could be used for the main cores.

      • cal_guy
      • 6 years ago

      If you’re not using the iGPU, then it gets power-gated and effectively becomes a heatsink for the CPU.

    • Metonymy
    • 6 years ago

    This may be off-topic, but… I was looking at the Bay Trail-D MSI offering (mb with chip for $60 we hear) and thinking that I’d like to put it in a CM Elite 130 case to do some very low-end data serving. And that same thought could apply to a Kabini and a matching inexpensive board.

    But in either case, I find myself wondering (though I don’t know what the board for Kabini would be) what to do for a PSU. The MSI board for the BT-D has traditional ATX connectors, but where do you get an appropriate PSU? It clearly needs very little power, but the really low-power PSUs are usually junk.

    Am I just not seeing them, or am I missing something entirely?

    Thanks

      • ibnarabi
      • 6 years ago

      these guys 🙂
      [url<]http://www.mini-box.com/s.nl/sc.8/category.13/.f[/url<]

      • DPete27
      • 6 years ago

      This is the area where I wish [url=http://www.newegg.com/Product/Productcompare.aspx?Submit=ENE&N=100007657%20600014014%20600014004&IsNodeId=1&bop=And&CompareItemList=58%7C17%2D151%2D090%5E17%2D151%2D090%2DTS%2C17%2D151%2D114%5E17%2D151%2D114%2DTS%2C17%2D151%2D115%5E17%2D151%2D115%2DTS&percm=17%2D151%2D090%3A%24%24%24%24%24%24%24%3B17%2D151%2D114%3A%24%24%24%24%24%24%24%3B17%2D151%2D115%3A%24%24%24%24%24%24%24<]TFX PSUs[/url<] would take hold. Unfortunately, very few case choices to date.

    • StuG
    • 6 years ago

    If priced correctly, that A6-5350 would make a pretty cool HTPC part!

      • Vhalidictes
      • 6 years ago

      I have a BGA A6-5200 mini-ITX board, and Kabini is fine for day to day use. I was running a network file copy and Skype while getting ~35 FPS in medium-detail WOW. And that was with second-hand slow DDR3-1333 memory.

    • anotherengineer
    • 6 years ago

    Now, when will a Kabini A6-5350 notebook be available??

      • willmore
      • 6 years ago

      For well threaded workloads, I have to think that would work better than my current laptop which has a Pentium B940 (2GHz SNB dual core no HT). The graphics would certainly be better.

    • kamikaziechameleon
    • 6 years ago

    I’m confused by how ARM processors are rapidly encroaching on desktop space. How my phone can have a quad-core processor while a similarly priced laptop has a dual-core is pretty hilarious. It lends itself to the layman interpretation of “My phone is about twice as powerful as my computer!”

    I know it’s not that simple, but seriously, desktop and laptop performance gains are nothing compared to the leaps and bounds that cell phones are experiencing.

      • chuckula
      • 6 years ago

      It’s easy to go up when you start out at the bottom. The “leaps and bounds” of improvement in smartphones have already started to slow down, and a whole bunch of the new “quad core” Mediatek type chips are really slower than last year’s models, but try to make up for it by being cheap cheap cheap.

      • OneArmedScissor
      • 6 years ago

      The big gains are in GPUs and co-processors, which PCs also see, but graphics cards get all the love on PC enthusiast sites.

      Laptops moved on to “big” dual-cores with faster cache and 4 threads. Intel has HT enabled on most mobile dual-cores, and AMD is dual “module.” Both sync the last level cache to the core clock.

      Phones are moving in that direction, like Atom, Apple’s A6, and Nvidia’s Denver.

      There was a short lived period where laptops had Core 2 and Athlon quad-cores. Today’s small tablets take that approach, but the bigger ones already use laptop parts.

    • cmrcmk
    • 6 years ago

    Does anyone else have to stop and think every time an article pops up mentioning Kaveri or Kabini to remember which is which?

      • Voldenuit
      • 6 years ago

      [quote<]Does anyone else have to stop and think every time an article pops up mentioning Kaveri or Kabini to remember which is which?[/quote<] Cheat sheet: "Slow" and "Slower".

      • willmore
      • 6 years ago

      Kabini rhymes with teeny while Kaveri rhymes with very.

      • Meadows
      • 6 years ago

      Just call them Bini and Veri.

      • UberGerbil
      • 6 years ago

      I confess I’ve mostly stopped caring.

    • Hattig
    • 6 years ago

    I’m guessing these will perform as well as you can expect a $30 to $60 up-to-25W quad-core CPU to perform.

    It’ll be interesting to see how these compare to low-end Kaveris in terms of performance.

      • swaaye
      • 6 years ago

      Look at Bay Trail as well. The performance per watt comparison is not pretty for Kabini.

        • Narishma
        • 6 years ago

        Performance per watt is irrelevant for desktop systems at these TDPs. Performance per $ is more important here.

          • UnfriendlyFire
          • 6 years ago

          Not if you’re running a medium-to-large business. With 200+ computers in one building, those electrical bills add up.
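
          A quick back-of-the-envelope sketch of that point (all figures are made-up assumptions for illustration: a 10W average draw difference per box, 8 hours a day, $0.12/kWh):

          ```python
          # Back-of-envelope: annual electricity cost delta for a fleet of desktops.
          # Every figure below is a hypothetical assumption, not a measured value.
          machines = 200          # PCs in the building
          watt_delta = 10         # average extra draw per machine, in watts
          hours_per_day = 8
          days_per_year = 250     # working days
          rate_per_kwh = 0.12     # USD per kWh

          kwh_per_year = machines * watt_delta * hours_per_day * days_per_year / 1000
          annual_cost = kwh_per_year * rate_per_kwh
          print(f"{kwh_per_year:.0f} kWh/year, about ${annual_cost:.2f}/year")
          ```

          Even with these modest assumptions, that works out to a few hundred dollars a year: real money, but small next to the purchase price of 200 machines.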

          • Andrew Lauritzen
          • 6 years ago

          Well, if you can get to passive cooling that’s still a big win in a lot of applications. At 25W that seems less likely vs. Baytrail in small form factors, but I guess it all depends what you want.

          • swaaye
          • 6 years ago

          That’s true, and I’m sure AMD will sell these for very cheap; they’ll fit in great in the budget lineup.

          • mikato
          • 6 years ago

          What about set top boxes, Smart TVs, etc? They’ll be moving to more powerful chips eventually as the more powerful chips run cool enough. Maybe I’m jumping too far ahead.

    • chuckula
    • 6 years ago

    For all the sturm & drang about Kaveri & Mantle, I think people are missing the point.

    If Mantle actually succeeds at living up to the hype, it will make truly cheap systems — like Kabini desktops, not $190 Kaveri APU desktops — able to play games using low-to-midrange AMD GPUs that wallop any APU and are quite affordable, since the Kabini parts are actually low-cost.

      • Deanjo
      • 6 years ago

      [quote<]If Mantle actually succeeds at living up to the hype,[/quote<] I have more faith in the US being able to pay off their national debt by year end.

        • maxxcool
        • 6 years ago

        OpenCL already exists. While we will see ‘some’ HSA test cases and demo apps… OpenCL will crush HSA, as it is deeply embedded in Android, Windows, mac-linux, hp-linux, normal-linux, and even Java..

          • Deanjo
          • 6 years ago

          I’m not sure what that has to do with Mantle. But yes, as far as adoption of APIs goes, Apple’s OpenCL API already crushes AMD’s Mantle, due to it being a true open API.

          • Antimatter
          • 6 years ago

          You don’t know what HSA is. OpenCL is completely compatible with HSA as are other APIs.

          [url<]http://developer.amd.com/community/blog/2012/07/10/hsa-a-boon-for-opencl-and-heterogeneous-compute-in-general/[/url<]

            • maxxcool
            • 6 years ago

            Yup… but it requires more coding and debugging. Why add to your workload and costs when OpenCL is already good enough?

            • madmilk
            • 6 years ago

            HSA will make OpenCL applications easier to write, not harder.

        • anubis44
        • 6 years ago

        @Chuckula
        “I have more faith in the US being able to pay off their national debt by year end.”

        Then you are truly one optimistic MoFo 🙂

          • chuckula
          • 6 years ago

          That was Deanjo… I’m nowhere near as optimistic as he is!

            • Deanjo
            • 6 years ago

            What can I say, I give all outlooks the benefit of the doubt. ;D

      • UnfriendlyFire
      • 6 years ago

      The question, same for HSA, is how well will Mantle be adopted by the developers? If it’s just BF4 and a handful of other games… Then it’s sorta meh.

        • Goofus Maximus
        • 6 years ago

        And not just “how well,” but also “how effectively” it is adapted, since it’s almost at the level of writing one’s own graphics driver for one’s game, which is quite difficult when you add in all the various levels and flavors of graphics cards/APUs/CPUs that the game developer will have to target for their games.

        Truthfully, if I were a game developer, I’d only use Mantle for the lower end graphics systems, just to cut down on my development/testing workload while getting the biggest bang for my buck.

      • Price0331
      • 6 years ago

      I dunno, anything that causes more market fragmentation when it comes to PC game performance makes me a sad panda. I wish OpenGL were more prevalent. Instead, what will happen is developers trying to support multiple APIs, leading to more expensive development, which will cause cutbacks on other things. And today’s games are buggy enough as it is. Or am I missing something somewhere?

      • Andrew Lauritzen
      • 6 years ago

      Well… despite the hype games do more on the CPU than just rendering. Best case Mantle will free up one “big core” thread (maybe two, but less common), but the rest is pure game dev territory.

        • dpaus
        • 6 years ago

        Listen up, punk! (sorry, had to say that…)

        You might have had a point two years ago. But with both of the big consoles now having eight-core CPUs, all future games will inherently be highly multi-threaded. That >should< make the effect of Mantle’s better efficiency much more noticeable.

          • Andrew Lauritzen
          • 6 years ago

          No, it’s actually the other way around for these CPUs. The more multi-threaded general “game code” becomes, the closer the rest of the engine is to being stalled regardless of the serial graphics API thread. The big win case for Mantle is when you’re CPU-bound and there’s basically one core doing all the DX driver stuff while the rest sit mostly idle. If you’re already making good use of many cores, the benefit is bounded by removing that one API thread (at best). Obviously good regardless, but it’s not going to make little cores chug through the rest of the game code any faster.

            • Theolendras
            • 6 years ago

            Aren’t you missing the parallel use of the CPU that Mantle is also designed to address? We’ll get to see the potential soon enough; the Thief and BF4 patches should come out before long.

            • Klimax
            • 6 years ago

            Don’t think so. DirectX + drivers are already quite parallel, and when hitting particular milestones you have to serialize; at that point it doesn’t matter whether you use DirectX or the hot new made-up thing.

            Furthermore, there’s still an absence of evidence that a problem exists, let alone that Mantle is the solution to any alleged problem.

            Either way, Mantle will not magically make synchronization disappear… (And I don’t really think most programmers are good enough to write great multithreaded code, better than the specialists.)

            • Theolendras
            • 6 years ago

            Not according to this :

            [url<]http://hothardware.com/News/How-AMDs-Mantle-Will-Redefine-Gaming-Doesnt-Require-AMD-Hardware/[/url<] There seems to be a desire to get more CPU cores talking to the GPU; it seems only one core was calling the drivers under DirectX.

            • Klimax
            • 6 years ago

            It takes those claims directly from AMD without question, and they’re lacking evidence.

            So, no. Fail article. (ETA: evidence still MIA)

            • Theolendras
            • 6 years ago

            Isn’t that what we do most of the time as geeks: speculate on upcoming technology? Anyway, I can’t find the interview anymore, but Johan Andersson expressed the same kind of concern. It might be money talking there, so yeah, I’m taking it with a grain of salt, but to me this seems the greatest benefit, as it’s nonsense to need a CPU ten times more powerful just to run a console port comfortably. And I for one think there are very possibly parts of the stack that weren’t written with multicore in mind; up to now that wasn’t much of a problem, but it frustrates developers now that the next generation of consoles is in the game.

            • Klimax
            • 6 years ago

            Well, Carmack is quite doubtful of Mantle…

            Games are not that easy to multithread: despite some parallel tasks, a lot of the time there are massive inter-dependencies which serialize things significantly, and even in graphics itself it is hard to drive many threads, because the inherent parallelism is only at the shader level.

            • Theolendras
            • 6 years ago

            Don’t get me wrong: I’m skeptical of Mantle until it gets support on other major platforms, leveraging it as an advantage Microsoft isn’t willing to take, and/or until it opens itself to other players willing to support it, like Intel, Qualcomm, Nvidia, Imagination Technologies, or ARM itself. But even if fragmentation is an issue, I’d like it to get there, if only to have more options without relying on Microsoft for any gaming build, or at least to get better efficiency down the road. I don’t see it getting traction if only AMD is behind it, on a lone platform.

            I think there is probably great efficiency potential to be extracted, judging if only from the comments of Oxide engine developers and BF4’s improved performance. I wish it would succeed, but I doubt it at the same time.

            • Andrew Lauritzen
            • 6 years ago

            I was responding to the claim that increasing parallelism from game engines is going to make the overhead reduction effect of Mantle *more* pronounced. That is incorrect – it’s the other way around.

            Obviously multithreading in Mantle is going to allow slower-but-wider (more cores) CPUs to be somewhat more competitive with big cores in cases where they were otherwise bottlenecked on graphics submission. Again though, the gain is necessarily bounded by the amount of time being spent on rendering submission today (even if it were fully removed and Mantle were “free”), which is ~1-2 big-core threads in the worst case.

            But that takes me back to the original point – most games do more than just rendering these days and there’s no magic bullet for that work. No one is saying that Mantle won’t help things, but rendering is just one piece of what games do on the CPU these days.
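
            This bounded-gain argument is essentially Amdahl’s law applied to the API-submission thread. A minimal sketch (the 25% submission share below is a made-up illustrative figure, not a measurement):

            ```python
            # Amdahl-style bound: even if Mantle made rendering submission free,
            # the total frame-time improvement is capped by the submission share.
            # The fraction used below is an illustrative assumption.
            def max_speedup(submission_fraction: float) -> float:
                """Upper bound on speedup if the submission work is fully removed."""
                return 1.0 / (1.0 - submission_fraction)

            # If ~25% of CPU frame time is graphics submission (roughly one
            # thread out of four), removing it entirely yields at most ~1.33x.
            print(max_speedup(0.25))
            ```

            The bound only grows large when submission dominates the frame, which is exactly the “one core doing all the DX driver stuff” case described above.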

            • Theolendras
            • 6 years ago

            Fair enough, still it’s cool with me if the developpers have more flexibility to choose to invest some more in graphical fidelity vs physics/AI or whatever they decide to do with the CPU cores. I recognize that it might not be a problem if the stack is serial by itself, unless it becomes a relatively serious bottleneck.
            Which might very well be the case, DirectX stack is no slouch, but being around for so long without much competition, some weaknesses are getting expose overtime, as GPU performance evolve much faster than single thread performance, some serialization in the stack might now be a problem, for what was perfectly fine for years.

            But even if you’re able to multithread relatively well and do something else with the CPU, a more ganular multithread as a whole can be more efficient some time (up to some point) as you can cut latency on some work here and there and technically improve the worse response time, which is quite popular metric on TR nowdaway. Now to mention that it would scale better as more thread are available on the hardware market.

      • nanoflower
      • 6 years ago

      But only for games that support Mantle. Today that number is zero. By year’s end it might be as many as 10. Not exactly earth-shaking numbers. Though you are correct in suggesting that if it works well and isn’t too difficult to add to an existing project, more developers will look into it. I just don’t see this being an instant game changer for AMD. Even if it does all that they say, it’s going to take time for games to support Mantle and for large numbers of people to take it into account when purchasing new equipment.

      • Theolendras
      • 6 years ago

      Any of these that also solve memory bandwidth at an acceptable price (no Iris Pro-style markup) would be welcome.

      • ET3D
      • 6 years ago

      Probably not. Mantle will reduce CPU use, but that doesn’t mean Kabini will be fast enough. The CPU isn’t taxed just by graphics calls, and games written for the consoles will assume eight cores. Faster, fewer cores may be good enough, but fewer cores which aren’t faster will not be, unless developers make an effort to scale down to that.

      Short of it: Mantle isn’t a guarantee of great performance on low-end AMD APUs, even though it certainly should make things better than they are.

      • Pwnstar
      • 6 years ago

      Why did you translate “und” from the German but not Sturm or Drang?

      [quote<]For all the sturm & drang about Kaveri & Mantle[/quote<]
