Faster Zambezi chips might show up early next year

AMD’s latest ballpark estimate puts the launch of Zambezi processors no later than September. That said, the initial batch of chips might be outpaced by faster successors not long thereafter. X-bit labs reports that AMD is cooking up a four-CPU Zambezi refresh for the first quarter of 2012.

That refresh will reportedly include FX-8170, FX-8120, FX-6120, and FX-4120 processors. Looking at the model numbers alone, it’s easy to figure out that these chips will likely be quicker than those in the launch lineup. X-bit labs says the first four Zambezi processors will be branded FX-8150, FX-8100, FX-6100, and FX-4100.

Specifications for some of the CPUs leaked out earlier this month. Word is that the FX-8150 will have eight cores, a 3.6GHz base clock speed, a 4.2GHz turbo speed, and a 125W thermal envelope. The leading digit in FX-series model numbers apparently stands for the core count, while the last two digits represent relative performance. For example, the FX-4100 is said to have four cores, a 3.6GHz base speed, and a 3.8GHz peak turbo speed.
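If X-bit labs has the convention right, a model number decodes mechanically. Here is a minimal sketch of that reading in Python; the function name and the "generation" label for the unexplained middle digit are our own illustrative guesses, not anything AMD has confirmed:

    import re

    def decode_fx_model(model: str):
        # Rumored convention: leading digit = core count,
        # last two digits = relative performance. The middle digit is
        # unexplained; we label it "generation" purely as a guess.
        match = re.fullmatch(r"FX-(\d)(\d)(\d{2})", model)
        if match is None:
            raise ValueError(f"unrecognized model string: {model!r}")
        cores, generation, perf = (int(g) for g in match.groups())
        return cores, generation, perf

    # FX-8150 -> (8, 1, 50); the rumored refresh bumps the tail: FX-8170 -> (8, 1, 70)
    for name in ("FX-8150", "FX-8170", "FX-4100", "FX-4120"):
        print(name, decode_fx_model(name))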

In any case, assuming X-bit labs’ information is correct, AMD should have some new blood ready to face Intel’s 22-nm Ivy Bridge processors in early 2012. Whether we’ll see a competitive fight is another story, though.

Comments closed
    • gamoniac
    • 8 years ago

    I see nothing wrong with coming up with new, better products. Intel does it with SB and SB-E. Would we rather they keep it quiet until the day of release, just because some of us will face the dilemma of upgrading now or waiting four more months?

    • ronch
    • 8 years ago

    I’ve been waiting for Bulldozer and so hoping it will let AMD make a big comeback, but these early indications somewhat point to the possibility that Bulldozer’s launch won’t be that much different from Barcelona: Late and not performing as expected. If AMD can’t keep up with Intel, who can? AMD can’t be relegated to the low-end/mid-range forever, being at the mercy of Intel and forced to lower prices at Intel’s whim just to sell stuff. CPU R&D is expensive, and if they keep losing money, as they seem to often do, AMD won’t stay around forever. The only way for them to pile up the cash is to come up with a product that Intel can’t just bully around and force down the ladder.

      • dpaus
      • 8 years ago

      Bobcat and Llano are two such products, and they should result in some nice (if not spectacular) cash flows for AMD. If the early performance indicators hold true, Bulldozer – and especially Trinity – should contribute to that in a substantial way.

        • ronch
        • 8 years ago

        Yeah, but the high end is where the money is. Furthermore, a high-end processor like FX will have no trouble trickling down the performance and price ladders in the future, whereas chips like Bobcat and Llano are practically stuck in their current market segments.

          • dpaus
          • 8 years ago

          [quote<]high end is where the money is[/quote<]

          If you mean 'high-end desktop', sorry, I disagree. Intel's 'Extreme Edition' CPUs are a fraction of a portion of a part of a percent of their revenue numbers. Server chips, on the other hand, are good money - if you can get it. AMD has been effectively shut out of real competition in the server market for a couple of years now, but if the Bulldozer architecture delivers on its promises, they will have the [i<]platform[/i<] to be a serious competitor. I don't think a 4-module/8-core chip will do it, but if they can engineer an 8-module/16-core chip in any kind of half-decent TDP, they'll have a killer server CPU on their hands.

    • Johnny5
    • 8 years ago

    If the last two digits are performance, and they already have 00 models for 8, 6, and 4-core chips, what happens when they want to release a model with a lower clock than one of those? Are they really certain they don’t want anything lower than a 3.6GHz base speed?

    • LiquidSpace
    • 8 years ago

    The title says it all, another trolling for shintel.
    Well, if the next-gen Bulldozer is going to be 22nm, or at least a more refined 32nm chip, then it’ll be more or less faster than the “YET to be seen how fast, compared to Sandy Bridge” Bulldozer chips.

      • Game_boy
      • 8 years ago

      I’d be surprised, given GF’s most recent schedule and normal lead time from processes being ‘ready’ to retail parts, if 20nm chips appeared before 2014.

      I don’t think GF are doing 22nm.

        • NeelyCam
        • 8 years ago

        20nm… 22nm… they are all the same: marketing punchlines. “Ours is smaller!” What matters is transistor performance and gate/wire pitches.

          • dpaus
          • 8 years ago

          [quote<]"Ours is smaller!" [/quote<] That's what she said you'd said...

          • Game_boy
          • 8 years ago

          Yes, and Intel is a generation ahead on that at all times, with HKMG, finFETs, and node size.

          That said, they are not capable of making Bobcat on 32nm at that die size because their density at low clocks is poor.

            • willmore
            • 8 years ago

            Really, I would have thought moving from a bulk to an SOI process would have been the biggest hurdle.

            • NeelyCam
            • 8 years ago

            Who is moving from bulk to SOI?

            • willmore
            • 8 years ago

            Game_boy said:

            [b<]That said, they are not capable of making Bobcat on 32nm at that die size because their density at low clocks is poor.[/b<]

            TSMC has no 32nm, I don't think Intel will be fabbing chips for AMD, so that leaves GloFlo's 32nm SOI, right? Since Bobcat is currently on a 40nm bulk process at TSMC, that would require it to be moved from a bulk to an SOI process, no?

    • maxxcool
    • 8 years ago

    In other news the sun might rise tomorrow…

    • ronch
    • 8 years ago

    I notice that whenever there’s a negative post about the product which an article is about, it gets thumbed down a lot. To those fanboys thumbing down the posts, stop pretending that you’re not fanboys. I know it hurts your egos knowing your favorite brand is being bashed, but hey, don’t cry foul about it.

      • ronch
      • 8 years ago

      Oh, I bet the AMD fanboys will thumb me down.

        • Farting Bob
        • 8 years ago

        I’ll thumb you down using my Intel-powered system because you are being a whiny little attention-seeking b****.

        Just so you know, it’s got nothing to do with fanboys; it’s because people don’t like you.

          • ronch
          • 8 years ago

          And posting one’s opinion about a company/product makes people hate him? If that isn’t fanboyism, I don’t know what is.

          What about people like you making personal attacks (read: not about a product or company) like saying ‘you are being a whiny little attention seeking b****.’? Speak for yourself, pal. Looks like you belong to the club.

      • khands
      • 8 years ago

      We need to disband the thumbs up/down thing entirely, it’s petty, childish, and completely useless other than making people feel butt hurt.

        • killadark
        • 8 years ago

        oops i gave u a thumbs down

        • Waco
        • 8 years ago

        Making people feel butt hurt is half the fun. It lets them know that they either posted something stupid or posted something controversial.

        Neither are bad.

        • ronch
        • 8 years ago

        At least there are people like khands with some sense. Kudos, brother. Unfortunately, TR doesn’t have a Fanboy Filter yet.

      • helboy
      • 8 years ago

      At least in my case,
      I have been…
      I am…
      I will be…
      a fan of AMD till I die, and loyal too. :-)
      Know why? When I was a kid in school, yearning for a PC, without enough pocket money or a dad with a six-figure salary, while my friends were all getting their own PCs, I saved enough, with dad contributing the rest, and purchased my very first PC. It was an AMD. Friends asked: AMD? What is that?
      It was a K6-2 500MHz. I was afraid the K6-2 was going to fry out, blow up, melt down, or whatever the naysayers predicted. But 7 years later that same PC was running along just fine, with enough gaming under its hat that even a P4 couldn’t match at the time. :) Yeah, I definitely sound like a “fanboy”, ain’t I? AND I AM PROUD OF IT. After that I built three more PCs for myself (>100 for friends and family) over the years, and all were exclusively AMD machines. And I NEVER once regretted my decision.
      Yes AMD!!!!!

        • Chrispy_
        • 8 years ago

        Backing the underdog is always more satisfying than backing Giant Evil Monopolycorp. Competition is good, m’kay?

        Having extensively used both brands of processor, I know what you mean, though: I have been exclusively Intel for the last 5+ years now, and even though I get more for my money, I feel a little dirty because I’m following my ego and wallet but not my heart.

    • MadManOriginal
    • 8 years ago

    So AMD went with 4 digits in their CPU names this round…copying Intel again I guess.

      • xeridea
      • 8 years ago

      No, it totally makes sense. They are a completely different generation than the Phenoms/Athlon IIs, so they get a higher number. It makes it super easy to judge performance from the model: the first number is cores, the second I speculate may increment with major iterations, and the third is relative within a core count. You can’t call it copying; it’s more like something that just makes sense. It seems to make more sense than Intel’s method (IMO); Intel numbering has always been confusing to me, even more so when talking about sockets. BTW, AMD had four-digit models with the Athlon 64, X2, and the first quads.

      • bcronce
      • 8 years ago

      I see GoodYear went with round tires this year, copying Firestone I guess.

        • khands
        • 8 years ago

        Names are the one thing, marketing-wise, that they could (and should) try to differentiate themselves on. Neither group is doing well lately.

        Edit: Clarification

        • killadark
        • 8 years ago

        i went through the pain of logging in just to give a thumbs up

      • NeelyCam
      • 8 years ago

      I would’ve had -70 with a comment like that

        • Palek
        • 8 years ago

        Braggart!
        😉

    • Vasilyfav
    • 8 years ago

    How about the slow ones show up first?

    And the first link in the news goes to “Lenovo Announces ThinkPad Tablet for Businesses” and nothing related to Zambezi. Wat?

      • Waco
      • 8 years ago

      Noticed that too. You can find it at X-Bit pretty easily though.

    • sircharles32
    • 8 years ago

    My fingers are crossed, as I’m in need of a system refresh, and have been holding off.

    In my opinion, AMD needs to keep good control of power consumption (i.e., don’t go down the 140W CPU route again) while keeping performance competitive. If the CPUs prove competitive, then they will be priced accordingly.

      • dpaus
      • 8 years ago

      I certainly want to see BD chips that are ‘TDP-competitive’, but I personally would have no problem with them also releasing their own version of an ‘Extreme Edition’ chip that produced stunning performance but required 140W to do so. As long as a complete range of highly power-efficient CPUs is also available, nobody would be forced to buy the 140W chip, but it would be available (presumably at very high profit margins) for those who want it and don’t give a cr@p about an extra 15W of power when running full-out.
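      A rough back-of-the-envelope on that extra 15W, with assumed numbers (8 hours/day at full load, $0.10/kWh; both figures are our illustrative assumptions, not from the comment):

          extra_watts = 15
          hours_per_day = 8          # assumed heavy full-load usage
          usd_per_kwh = 0.10         # assumed electricity price
          kwh_per_year = extra_watts * hours_per_day * 365 / 1000
          print(f"{kwh_per_year:.0f} kWh/yr, ${kwh_per_year * usd_per_kwh:.2f}/yr")
          # -> 44 kWh/yr, $4.38/yr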

        • OneArmedScissor
        • 8 years ago

        Clock speed is the only option they have, but you can overclock any of them to your heart’s content and push the TDP far beyond that.

        The reason AMD is no longer able to offer a separate, “extreme edition,” desktop CPU is because their true high end CPUs for servers have diverged from the direction of desktops. They have many cores, heaps of cache, and four memory channels, but low clock speeds, and require a totally different socket. It’s not possible to take those, configure them differently, and offer them as a faster desktop CPU, though they would certainly have to be more expensive, and they would look very dumb for it.

        Intel stuck to the fewer cores and high clock speeds model for a while longer, allowing them to reuse the exact same configurations, at the same prices, for both servers and desktops. However, that is changing with Sandy Bridge E, where the true highest end version of the chip will not be available for desktops, as it simply does not make sense.

        Welcome to the beginning of the end of truly “high end” desktops. By the next stage of integration, PC CPUs will be a system on a chip, likely with the option of varying amounts of integrated RAM, eventually replacing the need for separate RAM and expansion cards altogether. Server CPUs will become swathes of dedicated integer “cores” with special “GPUs” to replace the floating point units, tied together with extraordinarily complex system buses, and quite possibly a memory channel for every single connected DIMM.

        They’re not going to be remotely similar, and extremely fine tuned for specific applications.

          • xeridea
          • 8 years ago

          Your predictions for the future are way off, and fit right in with all the other nuts who say we will be using our phones for primary computing. Integrated anything is never better than dedicated. There will ALWAYS be expansion cards. It doesn’t make sense to have integrated RAM in PC systems; it locks you in, doesn’t allow upgrading size/speed or replacing, and creates a single point of failure. So if one RAM chip goes bad, you’re going to replace your entire system for $500? Desktops are all about expandability. It makes sense to integrate graphics into the CPU if it’s not done crappily like Intel does, but this is just part of the market, and even those users can use CrossFire and such to augment performance.

            • Joe Miller
            • 8 years ago

            [quote<]Your predictions for the future are way off[/quote<]

            His predictions are interesting. Whether they are off or not is a matter for the future.

            [quote<]So if one RAM chip goes bad, you’re going to replace your entire system for $500?[/quote<]

            This is exactly what happens now with mobile phones.

            • OneArmedScissor
            • 8 years ago

            So in, say, 2015, you’re going to suddenly need 40GB of RAM for surfing the internets instead of 2GB? And your monitor, which has not increased in resolution in who knows how many years, is going to suddenly jump to 10,000p, requiring a ludicrously powerful graphics card, despite the fact that integrated GPUs will be more powerful than discrete graphics cards are today? Okely dokely!

            These requirements don’t change anymore, and as time goes on, none of them will. Like everything else electronic, once every feature is packed in, manufacturing improvements are focused on lowering costs and putting one in every single person’s hands.

            What do you think phones are going to do? Just stop improving? Phones will easily be more capable than peoples’ primary computers are today, and the image will be displayable wirelessly on any screen around you. Say goodbye to everything you ever plug into a TV or monitor. Convergence isn’t exactly a wildly new concept, so what’s with the aversion to progress?

            It’s not going to cost $500 to replace what will effectively be a phone with a dirt cheap LCD screen attached to it. In the future, why would it cost more than it already does today? If your microwave, washer, toaster, or whatever other household appliance dies, you don’t go swap a part out in it, do you? You just buy a new one, because it’s cheap and not worth the fuss.

            How would a RAM “chip” go bad if the RAM is the CPU? Do cores die one by one in your CPU over time? No. So why would any other parts? They’re more reliable than sticking random things from countless manufacturers all over a motherboard. Why bother with swappable parts if [b<]the swappable parts are the parts that break and increase costs to begin with[/b<]?

            But that's not why Windows 8 supports ARM SoCs. And that's not why Intel has been working on several SoCs for years now, with bigger changes than any of their other CPUs have seen in years. Nothing will change. Technology never evolves. Hooray for "enthusiasts."

            The people who are nuts are the ones who act like they keep up with changes in technology and then ignore what the entire point of it is.

            • xeridea
            • 8 years ago

            Get back with me in 10 years when I still have discrete RAM, video card, and storage. Combining parts is good for some market segments, but not for everyone, and it doesn’t mean that discrete parts will suddenly cease to exist. Phones cost more than a whole computer before subsidies and are intended to last ~2 years. Computers can last 10 years or so. The device is super compact, but it costs a lot to do so, the hardware is comparatively snail-speed, and it is not possible to upgrade/replace any part of it.

          • Waco
          • 8 years ago

          AMD uses the same cores for their server chips as they do for desktop chips…they just package them differently. Why would you think they are totally separate dies? A Magny-Cours is just two Thubans with some fancy cache-coherency enabled to improve memory and cache performance with multiple sockets.

          The next iteration of server chips is literally going to be two Zambezi chips glued together a la Magny Cours. The only difference between the desktop and server versions will be the binning and the packaging.

            • dpaus
            • 8 years ago

            They could actually realize some thermal advantages with that arrangement, which could be very important if they also integrate a more powerful GPU into the package.

          • Krogoth
          • 8 years ago

          Not to nitpick, but CPUs already have RAM integrated onto the die. It is called “cache”. I believe you are referring to on-package DRAM modules, which makes little sense, since SRAM (what CPU cache is built with) is so much faster than DRAM. DRAM’s key advantage over SRAM is that it is much easier to create chips with huge capacities.

          The cost/benefits of integrating DRAM onto the CPU don’t exist. It is cheaper to offload DRAM onto separate modules like it is now.

          Expansion cards will be around for enterprise/workstation users that need dedicated HBA controllers, GPGPUs, and NICs. There are several niches that still want them: audiophiles, A/V types, and PC gamers.

      • slaimus
      • 8 years ago

      AMD was following Intel in that department, though, for their highest-end processors at the time. The QX9775 was a 150W TDP chip, and the QX9770 was 136W.

        • willmore
        • 8 years ago

        Don’t forget the failed Tejas!

          • Corrado
          • 8 years ago

          Yup, higher TDP than Prescott, which was already too high itself. Yay BTX!

            • Krogoth
            • 8 years ago

            We are in the early stages of moving to another form factor. The big push for SoCs is coming from both camps. ATX is showing its age in a number of ways. BTX could have been it, but its overemphasis on CPU cooling to compensate for Netburst’s inefficiencies is what killed it.

            Intel/AMD might be able to make a more thermally efficient version of mATX that is smaller, because SoCs will remove the need for having more than three expansion slots in mainstream systems.

    • Tristan
    • 8 years ago

    AMD is releasing rumors and promises, not products.

      • dpaus
      • 8 years ago

      [quote<][b<][i<]X-bit labs[/i<][/b<] reports that....[/quote<]

      1st paragraph, 3rd sentence. And between Bobcat and Llano, AMD's been doing pretty well at delivering on its promises lately. If it takes an extra few weeks - or even a few months - to get Bulldozer right (aka 'highly competitive right out of the gate'), I'd rather that than have them rush out a flawed product (think 'TLB bug' or 'Sandy Bridge chipset').

      • Game_boy
      • 8 years ago

      Bobcat and Llano do their respective jobs well. That’s two segments (ultraportables and mainstream laptops) that AMD never had a presence in before.

      BD is late, but you have to judge it at the time of release against the products that are out at release. AMD may have delayed BD 3 months, but it doesn’t matter since IB was pushed back 2 and SB-E pushed back 6 or more.

        • Vasilyfav
        • 8 years ago

        [quote<] AMD may have delayed BD 3 months, but it doesn't matter since IB was pushed back 2 and SB-E pushed back 6 or more.[/quote<]

        Well, actually it does, since they haven't beaten either the Nehalem or the Sandy Bridge architecture in x86 performance. Zambezi won't be competitive with Ivy Bridge on either power consumption or performance.

          • dpaus
          • 8 years ago

          [quote<]Zambezi won't be competitive with Ivy Bridge[/quote<]

          Link to published-and-documented benchmarks, please?

            • Vasilyfav
            • 8 years ago

            Link to a psychiatrist classifying you as a non-AMD fanboy who’s in touch with reality and realistic expectations, please.

            • dpaus
            • 8 years ago

            Would you like me to ask her if she has an opening in her schedule for you?

            • Waco
            • 8 years ago

            Troll elsewhere, please.

            • Vasilyfav
            • 8 years ago

            I was stating a prediction based on the past 7 years of CPU performance. An educated guess, if you will.

            If you call that trolling, then you’re either less than smart or you have no idea what trolling means, which means you probably shouldn’t use the word.

            • maxxcool
            • 8 years ago

            All I care about is the ability to buy a Black Edition processor for $100 less… and overclock the living shit out of it and get 95% of the performance of a $1,000 Intel CPU.

            Sure, Intel has higher IPC.
            Sure, Intel has better process tech.

            But for a couple hundred bucks less, I can achieve 95% of the same performance… ahem, AND GET TO UPGRADE TO A NEW ARCHITECTURE WITHOUT HAVING TO REBUY A NEW MOTHERBOARD AND (possibly) RAM.

            FUCK YOU VERY MUCH, INTEL… you might be better at a few benchmarks, but you’re not even CLOSE to being worth the total cost of ownership in my rack chassis or my high-end workstations.

            Class dismissed.

          • Corrado
          • 8 years ago

          How do you know? Remember the Prescott P4s? The ones that were a 90nm die shrink and enhancement of the 130nm Northwoods? The ones that ran HOTTER and couldn’t clock as high as the Northwoods? Now, I’m not saying this is going to happen with Ivy Bridge, but automatically assuming that die shrink = moarfastarbettar has been debunked by Intel in the past.

            • Vasilyfav
            • 8 years ago

            When was the last time an AMD CPU 1 tech process behind was faster than an Intel offering, aside from Prescotts?

            People who downvote me need to stop living in 2004, that’s all I’m going to say. I’ll refer them to this post when I’m proven correct by official benchmarks.

            • Corrado
            • 8 years ago

            “Aside from”, why do we need to excuse anything just to make your assertion true? We can do that for everything! “Intel beats AMD every time! Except for the times they don’t!”

            • Vasilyfav
            • 8 years ago

            Like I said, keep living in the era of single cores and windows XP. I’ll link you to my posts in October, so that you can downvote me for truth again like the fanboy you are.

            • SPOOFE
            • 8 years ago

            [quote<]Remember the Prescott P4s? The ones that were a 90nm die shrink and enhancement of the 130nm Northwoods? The ones that ran HOTTER and couldn't clock as high as the Northwoods?[/quote<]

            I don't remember those. I do remember the Prescotts that ran hotter (or, as you say in your language, HOTTER, though I don't know how to pronounce the all-caps), but also ran faster, with more cache and new architectural reworkings. You're not comparing die shrinks, you're comparing slightly different architectures too.

            • Corrado
            • 8 years ago

            Isn’t Ivy Bridge supposed to be slightly reworked? Different transistors, different graphics, and PCI Express 3.0 support. Also, I’m not sure how long you’ve been on the internet, but when you use CAPS it’s to EMPHASIZE something (or, in the case of an entire sentence, it’s YELLING).

            And no, it wasn’t faster, clock for clock, than a Northwood. Check out the reviews of the era. Here’s a quote from TechReport’s own Prescott review:

            [quote<]Our benchmark results for the new Prescott-based Pentium 4 'E' processors are the very definition of mixed. In some cases, Prescott looks very good, but in others, it's slower than current Pentium 4 chips at the same clock speed. The larger caches and architectural tweaks have helped immensely in offsetting Prescott's super-long 31-stage pipeline, but they haven't entirely made up the gap. [b<]On balance, Prescotts are slower than Northwoods.[/b<] I expect Prescott P4s will look relatively stronger over time as SSE3 instructions are adopted and, especially, as clock speeds ramp up. [/quote<]

            [url<]https://techreport.com/articles.x/6213/16[/url<]

            I bolded the important part for you.

          • OneArmedScissor
          • 8 years ago

          “Zambezi won’t be competitive with Ivy Bridge, either on power consumption or performance.”

          I didn’t realize server farms were made out of stacks of laptops.

            • Game_boy
            • 8 years ago

            Power consumption is /more/ important for servers than performance because of power delivery and cooling constraints.

            • OneArmedScissor
            • 8 years ago

            Yeah, but the trouble is that you need something that will fit into a server board socket to use it for one. :p

            Ivy Bridge, as it will be known a year from now, is really only beneficial for laptops and has nothing to do with Zambezi. While it will be pitted directly against a Bulldozer CPU, that will be Trinity, and we know even less about that than we do about Zambezi.

          • anotherengineer
          • 8 years ago

          “since they haven’t beaten either nehalem or sandy bridge architecture in x86 performance”

          Be careful the way you word that, you wouldn’t want to look like a fibber.

          [url<]https://techreport.com/articles.x/16796/5[/url<]

          As you can see, the AMD 955 edges out the i7 920 in a few benchmarks, and I do believe the i7 is Nehalem-based? So in reality the top AMD chips do compete with the base-model Nehalem chips.
