AMD quotes $300 price for 8-core Zambezi chip

Officially, AMD hasn’t announced exact prices or specifications for its upcoming FX-series processors (code-named Zambezi). However, the company has let slip one nugget of pricing information as part of its ongoing FX giveaway. The giveaway allows folks to sign up for a chance to win an "AMD FX 8-Core Processor Black Edition," and buried in the rules and regulations page is this:

You’re reading that right. AMD is quoting an "approximate retail value" of $300 for the eight-core FX chip. I believe that’s the first official indication that Zambezi might go head-to-head with the current flagship of Intel’s Sandy Bridge line, the Core i7-2600K, which currently retails for $315.

This news shouldn’t come as a huge shock to folks who’ve kept their ear to the ground, of course. Earlier this year, purportedly leaked AMD documents suggested that top-of-the-line Zambezi desktops would compete with Sandy Bridge-based Core i5 and i7 machines, and Llano systems would fight it out with the Core i3 series. (As we all know, the bit about Llano turned out to be true.)

Personally, though, I’m more interested to find out where Zambezi pricing will start… and whether AMD will manage to serve up an attractive alternative to the excellent Core i5-2500K. (Thanks to X-bit labs for the heads up.)

Comments closed
    • link626
    • 8 years ago

    AMD will need an 8-core to compete with the 2500K.

    No way AMD will have a quad-core fast enough to compete.

    A 6-core might cut it. We’ll see.

    • michael_d
    • 8 years ago

    This price suggests that Bulldozer will not be able to compete with Intel’s upcoming E-series of CPUs, which is kind of disappointing. Looks like AMD has given up on high-end CPUs and concentrated on APUs, discrete graphics, and mainstream CPUs.

      • UberGerbil
      • 8 years ago

      I wouldn’t say they’ve given up. It may be this is them trying as hard as they can.

        • CaptTomato
        • 8 years ago

        even more disturbing….

      • CaptTomato
      • 8 years ago

      disturbing isn’t it..

      • Voldenuit
      • 8 years ago

      I doubt BD will compete with intel on a pure per core basis in x86.

      However, they are potentially on to something with Fusion. There are many potential applications for which x86 is not an efficient solution – massively parallel loads and/or FP come to mind. Even intel has seen the light and used dedicated fixed-function core logic for QuickSync.

      Right now, AMD’s APU (in Llano) can’t keep up with the sheer speed of QuickSync for encoding (though it produces slightly higher quality images). But if they play their cards right with Trinity (and expand OpenCL adoption among developers), GPU/APU computing could make AVX/SSE4 as obsolete as MMX/SSE made the x87 FPU.

      So in short, intel is still winning in x86, but there is no guarantee that x86 performance alone will be the most important criterion in determining computing power in the future.

    • d0g_p00p
    • 8 years ago

    You guys kill me with this whole TDP crud. If you cannot afford the whole $20 a year extra that a couple of watts *might* cost you to run your system, why does any CPU over $100 even matter to you?
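For anyone who wants to sanity-check that ballpark, here is a rough sketch; the wattage gap, daily usage, and electricity rate are all illustrative assumptions, not figures from the comment:

```python
# Back-of-the-envelope: yearly electricity cost of extra CPU power draw.
# All inputs are assumed/illustrative, not measured figures.
def yearly_cost(extra_watts, hours_per_day=8.0, rate_per_kwh=0.12):
    """Return the added cost per year of drawing `extra_watts` more power."""
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * rate_per_kwh

# A 30 W TDP gap, 8 hours/day, $0.12/kWh:
print(round(yearly_cost(30), 2))   # -> 10.51
```

At 24/7 load the same 30 W gap comes to roughly $31.50 a year, which is about the scale the comment is arguing over.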

      • etymxris
      • 8 years ago

      Many of us pay for that heat twice when we have to cool our rooms. Removing more heat also generates more noise, which is a big factor for many people. It also reduces laptop battery life (though I doubt this chip will end up in laptops).

      • NeelyCam
      • 8 years ago

      Power kills trees and pollutes the Earth.

        • BobbinThreadbare
        • 8 years ago

        Depends where you live.

          • NeelyCam
          • 8 years ago

          True. In the USA, it kills trees and pollutes the Earth.

            • BobbinThreadbare
            • 8 years ago

            Still depends where you live. Plenty of hydro power in the US.

            Also, the US forests are growing so we’re not killing any trees locally.

            • NeelyCam
            • 8 years ago

            Using facts to disarm a perfectly good troll is unkind.

        • maxxcool
        • 8 years ago

        Better stop driving your car then… or taking the bus… or carpooling. Regardless of “emissions”, a car engine, be it diesel or gas, produces more waste heat than a couple hundred CPUs.

          • NeelyCam
          • 8 years ago

          No, I was just making a statement. It doesn’t mean that I’ll give up my car, my air conditioner, my sauna, etc.

          • Anonymous Coward
          • 8 years ago

          My threshold for caring is the amount of heat released by a large aircraft at takeoff.

      • Krogoth
      • 8 years ago

      Reducing TDP only makes sense if you are trying to reduce noise.

      Otherwise, it is a moot point.

        • jensend
        • 8 years ago

        Well, having lower-TDP systems means that coolers can be not only quieter but also smaller, and things can be put into smaller cases without as much of a temperature and/or noise penalty. And if you trade away some cooling efficiency, accepting higher temperatures in exchange for quieter and smaller coolers, it’s worth noting that temperatures also have some impact on system lifespan.

        But it’s not like most people are counting on Bulldozer for HTPC/nettop type use, and people around here are going to upgrade their systems too frequently to be affected at all by the life expectancy changes given by running somewhat hotter than ideal. And it’s certainly not like these differences in processor TDP are going to change your electric bill appreciably or impact climate change in any real way. Turn your incandescents off a little more and you’ll totally drown out the differences here. So for most home desktops TDP is not so huge a deal. (Of course, for large installations esp. servers, space and power constraints are extremely important.)

      • UberGerbil
      • 8 years ago

      Try to imagine there’s a world beyond individual enthusiast fanbois having arguments on teh intraweb.

      Zambezi is primarily a server/workstation processor. Some of the people here may spec and purchase hundreds, even thousands of such systems. Sure, it may not matter to an individual gamer/enthusiast sitting in front of one system in his parents’ basement. But for businesses looking at buying a lot of systems, the costs add up (and multiply by cooling requirements).

      • MrDigi
      • 8 years ago

      It’s not about the cost of electricity; it’s about the CPU having thermal headroom for OC, turbo headroom, stability, ease/cost of cooling, mobile applications, …

    • chuckula
    • 8 years ago

    Time for a complete change of topic… So I figured I’d throw my name into the contest that AMD is sponsoring (for the inflatable doll of course!) Anyway, the advertising website with all the stupid comics & whatnot includes wallpaper downloads…. and they don’t even let you have the option of a 1920×1200 resolution wallpaper to advertise their own products!!?!?!?!

    WTF??? The company that constantly harps about how awesome its graphics are can’t even get an ad agency that is smart enough to make wallpapers at an insanely common resolution that a whole lot of its gamers use every day? Oh, and this isn’t even a 1920×1080 vs. 1920×1200 rant (although 16:9 should die); they ALSO don’t have 1920×1080 wallpapers.

      • ronch
      • 8 years ago

      You’ll just have to crop and re-scale the images.

      • NeelyCam
      • 8 years ago

      1920×1080 will inherit the earth. 1920×1200 proponents are just wasting their time fighting the inevitable.

        • chuckula
        • 8 years ago

        By the time that happens I’ll be at 2560×1600. 16:10 FTW.

          • NeelyCam
          • 8 years ago

          You’ll be wherever the entertainment industry wants you to be.

            • derFunkenstein
            • 8 years ago

            +1 because it’s true, not because I like it.

          • derFunkenstein
          • 8 years ago

          You mean 2560×1440.

        • BobbinThreadbare
        • 8 years ago

        [quote<]they ALSO don't have 1920x1080 wallpapers.[/quote<]

    • mcforce0208
    • 8 years ago

    AMD are going to pawn sandy bridge (E). End of story.

      • LoneWolf15
      • 8 years ago

      Like Rick, The Old Man, or Chumlee?

      • chuckula
      • 8 years ago

      AMD are going to Pawn Sandy Bridge E? Did they win one in a contest and need the money?

        • derFunkenstein
        • 8 years ago

        No, they’re just going to move forward and one space to the right to take the Sandy Bridge (E) chip. Check.

      • NeelyCam
      • 8 years ago

      Let’s hope so. But I give it a 1:4 chance max

    • NeelyCam
    • 8 years ago

    When people are trying to guess the BD prices from some random giveaway rules/regulations, things have gone too far.

    I think it’s time for AMD to just release the damn thing, or at least officially release some information like clocks, prices etc. Folks have waited long enough.

      • ronch
      • 8 years ago

      I’ve grown tired of waiting for BD to come out. I just think, at least we can turn to Ivy Bridge if BD turns out to be a real disappointment.

        • NeelyCam
        • 8 years ago

        Always an option… after waiting a bit more.

        The problem is that when BD finally comes out, IB release is too close, so everyone will feel compelled to wait for that to see if it’s better. Then IB gets delayed by a month or two, and suddenly folks realize they’ve been waiting a year longer to upgrade.

        Then Trinity is right around the corner…

          • ronch
          • 8 years ago

          Yeah. Then there’s Haswell. Then after that, who knows what AMD is cooking up.

    • krazyredboy
    • 8 years ago

    I think I’m just gonna wait for Cyrix to make a comeback.

    Otherwise, I’m happy with my i7 as it is, and even if I wanted to spend $300 on a new PC component, it would definitely be for an SSD right now… or even for the next year or so. I think that’s the only tangible gain in performance for me at this point.

      • ronch
      • 8 years ago

      Sometimes I wish Cyrix would make a comeback, but I guess all their engineers already went their separate ways.

    • kamikaziechameleon
    • 8 years ago

    My current mobo (Asus 890FX) crashes when the turbo mode feature is enabled. I’m not sure if I want to sell this mobo on Craigslist along with the CPU. I was originally planning to sell my X6 and get an X8, but with my system also having audio issues I would rather just sell this and go Sandy Bridge at this point. AMD clearly doesn’t have their act together; they’ve really fumbled this launch and dragged it out over months. I’m probably gonna just chill for the Sandy Bridge replacement and upgrade then.

      • DrkSide
      • 8 years ago

      So, based on two problems (that probably are not CPU-related) that YOU are having, you conclude that AMD is no good. Any platform has problems for some users.

        • swaaye
        • 8 years ago

        We shall see how bug free BD is soon enough. My bet would be on rev2 being quite an improvement, judging by history. Hopefully BD goes better than Phenom 1 did.

          • just brew it!
          • 8 years ago

          Hey, Phenom 1.5 was relatively bug free… 😀

      • ronch
      • 8 years ago

      Maybe it’s the motherboard? Perhaps it doesn’t deliver enough juice to the CPU. BTW, is there a Phenom II X8? Or are you referring to upcoming BD parts?

    • bittermann
    • 8 years ago

    Just release the damn thing….

    • CaptTomato
    • 8 years ago

    For most people an Intel chip will be faster, certainly overall, so I’m not the least bit excited about 6-8 slow cores.
    The 2500K is $70 AUD cheaper than my E8400 dual-core… that’s what I call pawnage.
    AMD are fine with GPUs, but they may as well hand the CPU business over to whoever puts their hand up.

    I’m going to let demanding games force my next PC upgrade, but I’m already dreaming about “Intel inside”.

      • dpaus
      • 8 years ago

      To paraphrase The Dark Lord:

      [url=http://bucultureshock.com/wp-content/uploads/2011/06/i-find-your-lack-of-faith-disturbing.jpg<]I find your lack of Visionβ„’ disturbing[/url<]

      • cegras
      • 8 years ago

      I don’t see why you’re preemptively judging something on the number of cores. What will you do if the chip, considered as a black box, outperforms the similarly priced intel offering?

        • dpaus
        • 8 years ago

        Go stand in [url=http://subversatile.net/pics/denile.jpg<]a river[/url<]...

        • CaptTomato
        • 8 years ago

        I always buy whichever part is the best value for money, what else would a rational person do?….but AMD are not going to topple Intel anytime soon.

          • LoneWolf15
          • 8 years ago

          No-one said they’d topple Intel. However, what if it turns out to be “the best value for your money”?

          AMD HAS had competitive offerings compared to Intel during the past decade, after all.

            • Anonymous Coward
            • 8 years ago

            Still, the chances of an upset are negligible. Intel does not appear to be screwing around.

          • just brew it!
          • 8 years ago

          They don’t need to topple Intel; they just need to keep ’em honest and live to fight another day!

            • CaptTomato
            • 8 years ago

            It’s not happening, AMD can’t make competitive CPU’s unless Intel cock it up…

      • Farting Bob
      • 8 years ago

      Funny you should say that: their CPU division made more money than their GPU division in the last earnings announcement, I do believe.

        • CaptTomato
        • 8 years ago

        [quote<]Meanwhile, the $184 Core i5-2400 offers substantially higher overall performance than the X4 980 or the X6 1075T—for the same money, and with a lower 95W power envelope. If you look beyond the overall summary, you’ll find that the Core i5-2400 generally outperforms the X4 980 in workloads that are both lightly and heavily threaded[/quote<]

        The market isn’t rational……..

          • OneArmedScissor
          • 8 years ago

          No, you’re not rational, because you have repeatedly stated you think “the market” and “most people” consist of the absolutely minuscule DIY desktop demographic.

          2% of Intel’s shipped CPUs are i7 branded. It was 1% for the longest time and only went up to 2% because there are now lots of laptop branded i7 CPUs, including dual-cores and not just quad-cores.

          About a year ago, 70% of [b<]boxed[/b<] Intel CPUs, meaning the ones people bought to put in their own computers, were Pentium branded, which are some of the lowest end CPUs on the face of the planet, even lower end than many of AMD's cheaper CPUs.

          Get with the program. This isn't 2001.

            • CaptTomato
            • 8 years ago

            [quote<]were Pentium branded, which are some of the lowest end CPUs on the face of the planet, even lower end than many of AMD’s cheaper CPUs[/quote<]

            I told you the market wasn’t rational.
            I’m a strict PC gamer, I own no consoles and most likely never will as they’re gimped platforms for dummies, so I only speak from my POV of gaming rigs, and from that POV, I suspect Intel will be da bomb!!!

            • maroon1
            • 8 years ago

            I think you are underestimating the performance of the new SB-based pentiums

            Even pentium G620 (which is the slowest SB-based pentium) is faster than Phenom II X2 565 in most cases
            [url<]http://www.xbitlabs.com/articles/cpu/display/pentium-g850-g840-g620_8.html[/url<]

            • CaptTomato
            • 8 years ago

            CPU fanboy’s don’t want to hear that….

            • NeelyCam
            • 8 years ago

            Wanna edit that…?

            • CaptTomato
            • 8 years ago

            To what….?

            • Kurotetsu
            • 8 years ago

            *watches as the point goes flying several hundred miles over CaptTomato’s head*

            • NeelyCam
            • 8 years ago

            Clean entertainment for the whole family.

      • poulpy
      • 8 years ago

      [quote<]For most people an Intel chip will be faster[/quote<]

      Ahem no, IMO for [u<]most people[/u<] it won't make any difference, because this kind of performance is overkill.

      90% of the market would be perfectly fine with a good dual-core from either Intel or AMD, as long as it:
      - comes at a good total machine cost
      - doesn't explode the electricity bill at the end of the month.

      Anything else is for geeks or professionals who won't shy away from investing $300 in a CPU alone and who will find ways to get these cores busy (well, at least the professionals should).

      • maxxcool
      • 8 years ago

      If it’s an equal-price basis, I have to agree… if it’s $100 cheaper, then that’s a different story, but from the looks of things they are going toe to toe, which is a BAD idea.

      • Ryhadar
      • 8 years ago

      [quote<]I'm going to let demanding games force my next PC upgrade, but [b<]I'm already dreaming about "Intel inside".[/b<][/quote<] Ew.

      • SomeOtherGeek
      • 8 years ago

      You, sir, are jumping the gun. dpaus is funny today!

      • NeelyCam
      • 8 years ago

      Incomprehensible -17. AMD fanbois are just out of control

        • CaptTomato
        • 8 years ago

        It’s too big, we can’t win, we’ll just have to go intel inside and utilize all that power!!

      • Krogoth
      • 8 years ago

      Most people don’t care, except for the price tag.

      The vast majority of the market doesn’t need six-core chips to do email, Facebook, and YouTube, let alone gaming and minor video encoding.

      Intel and AMD have a very hard battle ahead of them: the battle of making $300-1000 CPUs relevant. There’s a reason why ARM poses a considerable threat. They are proving you don’t need high-end chips to do day-to-day mainstream stuff, along with the baggage that goes with it (huge heatsinks, large fans, large chassis, etc.)

        • CaptTomato
        • 8 years ago

        I’m a gamer, so I care about the state of the hardware, and I’m sure there are plenty of others who also care.
        That we might be a minority is irrelevant to me… and yes, I have a tower case, a huge CPU heatsink, and an ICE cooler on my GPU, all in the name of POWER, something many gamers want so they can experience decent image quality.

        My PC is 3 yrs old but can handle Metro 2033 at 1080p medium (still looked great). If we tally up my expenditure over 3 yrs (just the guts of the PC), I doubt I’ve spent more than $1500, and I could arguably keep my PC for another 6-12 months. Divided up, that’s less than $10 a week for a first-class, sensible gaming rig, yet for some strange reason people choose pathetic consoles as their primary gaming platform, and then compound it with a slow laptop.

    • bcronce
    • 8 years ago

    Let me know when a 12 thread version comes out.

    But good for AMD for finally having something to compete with.

      • OneArmedScissor
      • 8 years ago

      If you actually need that kind of processing power, why are you even looking at desktops?

        • albundy
        • 8 years ago

        cus he likes clubbing and stomping when someone crosses his bridge, like any other troll. that and because many people have no clue what they are talking about. 😉

          • Farting Bob
          • 8 years ago

          I think your firewall has blocked all sarcasm; I suggest you change the settings.

        • bcronce
        • 8 years ago

        Several games in the next 1-2 years will scale to 12/16 cores. My next CPU upgrade, I want to last 3-4 years.

        I like to rotate which part of my computer I upgrade. I bought an i7-920 3 years ago. I just upgraded to a 6950.

        My next upgrade will be an SSD for sure, but I would love to replace this 3 year old CPU. At the same time, I don’t want to purchase a CPU that will be out-dated really fast. My i7 stayed in the top 10 for a good 2 years. I want my next CPU to do the same.

        Intel is releasing 14nm in 2013, so whatever my next CPU is, it will have to be quite strong to stay competitive. A 16-thread 22nm chip should be able to hold its own for quite a while (about 2 years).

        I also really want to get an AMD, as I want to support them more. But my next mobo+CPU/memory upgrade will be in the $800 range. So if AMD can’t have a $600 CPU that can compete… I’ll either have to wait or go Intel… /shudder

          • smilingcrow
          • 8 years ago

          Keep on supping the kool-aid bro.

            • bcronce
            • 8 years ago

            ???

            Your reply seems to have nothing to do with my post. I posted 1 fact and the rest was opinion.

            All I said is I would like to upgrade my CPU, but I need a large enough upgrade to make it worthwhile.

          • just brew it!
          • 8 years ago

          Depends on what you mean by “last 3-4 years”. A bleeding edge CPU is only bleeding edge until the next bleeding edge CPU is released, and that last doubling of price gets you very little in actual performance benefits. You’d probably have better bang-for-the-buck over the next 3-4 years if you bought a couple of high-mid range CPUs over that period instead of one bleeding edge one.

          Price/performance of $600 CPUs is *terrible*.
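That bang-for-the-buck argument can be made concrete with a toy model; every number here is an illustrative assumption (mainstream performance roughly doubling every two years, a flagship starting ~15% ahead of the mid-range part of its day), not benchmark data:

```python
# Toy model: one $600 flagship kept 4 years vs. two $300 mid-range CPUs
# swapped at the 2-year mark. Performance figures are illustrative only.
flagship     = [1.15, 1.15]  # relative speed in years 0-2 and 2-4
two_midrange = [1.00, 2.00]  # mid-range now, next-gen mid-range later

def average(xs):
    return sum(xs) / len(xs)

# Same $600 total spend, higher average performance over the 4 years:
print(average(two_midrange) > average(flagship))   # -> True
```

Under these assumptions the two cheaper CPUs average 1.5x baseline over the period versus 1.15x for the flagship, which is the point being made about that last doubling of price.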

      • SomeOtherGeek
      • 8 years ago

      Look! There it is!

      • ronch
      • 8 years ago

      [quote<]But good for AMD for finally having something to compete with.[/quote<] Um, dude, BD isn't out yet.

      • just brew it!
      • 8 years ago

      AMD has had those for a while already. They’ll cost you quite a bit more than $300 though:

      [url<]http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100008494%2050001028%20600048544&IsNodeId=1&name=12-Core[/url<] 😀

    • ronch
    • 8 years ago

    How can they price something that isn’t even released yet?

      • jensend
      • 8 years ago

      Because they’re the only company which makes the chips, so they can set the price they sell them to resellers at as well as the MSRP. It’s not like a commodity e.g. wheat, where there’s one market price which no individual seller has control over. Any other stupid questions?

        • ronch
        • 8 years ago

        Reread my earlier post carefully, fanboy. I said, ‘How can they price something that ISN’T EVEN RELEASED YET?’ I [u<]did not[/u<] ask why they’re the ones pricing their chips. The answer to that is obvious even for someone like you. Before you call me or someone else stupid, look at yourself first.

      • just brew it!
      • 8 years ago

      What are the error bars for a price that is stated to be “approximate”?

    • Arclight
    • 8 years ago

    I still consider their octo-core CPUs to be “hyperthreaded à la AMD” quad-cores. In that sense, a real octo-core has yet to come to the desktop.

      • derFunkenstein
      • 8 years ago

      There’s more hardware duplication than in Intel’s SMT here, enough so that I’d call it 8 cores. The only shared hardware is one vector unit per pair of cores, and even that can do 2x 128-bit SSE4 at a time, enough to keep pretty much everything well-fed.

        • ronch
        • 8 years ago

        I wonder what Intel thinks about how AMD is counting cores with BD. Still, I would think that if it takes 8 BD cores to deliver the same performance as Intel’s 4 cores, AMD’s case may further invite debate. But, if it delivers twice the performance (or if 4 BD ‘cores’ can match 4 Intel cores) then I suppose the case is closed. I suppose that’s why AMD wants to reach high clocks with BD and perhaps it’s the reason why BD has been delayed. We’ll know how this story turns out in a few months (hopefully!).

          • derFunkenstein
          • 8 years ago

          I don’t think Intel gives a damn.

      • ish718
      • 8 years ago

      I wouldn’t call AMD’s cores hyperthreading; they’re a bit too dedicated for that. If you have 8 intensive threads, Bulldozer’s 8 cores will crush Intel’s 4 cores + 4 HT threads.

        • NeelyCam
        • 8 years ago

        Careful there… Intel’s IPC is way ahead of AMD’s at the moment, and nobody has reliable benchmark data on BD to determine how well it has caught up.

        Also, BD “cores” aren’t real cores, and it’s a bit misleading to call them such. True cores will beat BD cores. My guess is that a Sandy Bridge quad core (without HT) will mop the floor with a BD quad-“core”… but we’ll see when BD comes out in… was it September? Or is it pushed toward the end of the year now?

          • ish718
          • 8 years ago

          How could you say they’re not real cores? They’re just stripped down cores optimized for multithreading. They still possess the most fundamental dedicated units of regular cores.

            • ronch
            • 8 years ago

            This topic/debate can go on and on even after BD is just a faint memory. It’s a bit like comparing Nvidia’s stream processors with AMD’s. The point has become moot and people just look at the performance. If AMD says they’re 8 real cores, perform just as well as a 4-core Intel and sip the same amount of power for the same price, I don’t think people would care much. Thing is, it’s true that at this point BD’s performance is all speculation, despite what DonanimHaber or some other site says. It’s funny how some people will swear to the heavens that BD will beat the hell out of SB/IB or vice versa. Kind of like arguing whether the world will end in 2012 or not.

            • CaptTomato
            • 8 years ago

            It’s important because people are claiming AMD have a dog in this fight because of EXTRA cores….but do they…?

          • BobbinThreadbare
          • 8 years ago

          [quote<]True cores will beat BD cores.[/quote<] What do you mean by a "true core?" Like a Phenom II core? Because I think a BD core will beat a Phenom II, which has "true cores."

            • NeelyCam
            • 8 years ago

            Excellent point. An SB core has a chance to beat two BD “cores”… it most certainly beats one. Even better: two SB cores easily beat one BD “module” which is supposed to have two “cores”

            A Phenom II core might have more trouble with it, but two Phenom cores might be able to challenge a BD “module”

            • MrDigi
            • 8 years ago

            AMD early on stated the module approach results in 70% the performance of independent cores.

            • just brew it!
            • 8 years ago

            Remains to be seen.

      • Krogoth
      • 8 years ago

      What a gross oversimplification.

      We are at the threshold of a divergence point. AMD’s and Intel’s marketing teams are going to start defining what a “core” is. They are going to contest what the hardware requirements for one are, and throw around more marketing jargon to confuse buyers even more. It is going to be a big mess, like the MEGAHURTZ and performance-rating systems back in the P4/Athlon XP days.

        • just brew it!
        • 8 years ago

        Yup… architectures have diverged enough that relative performance is necessarily going to be very workload dependent. If you have a specific workload that you care about getting maximum performance on, LOOK AT THE DAMN BENCHMARKS.

        This has been true since… well… practically forever.

      • Anonymous Coward
      • 8 years ago

      Interestingly, I only consider Atom to be 1/2 of a core.

        • just brew it!
        • 8 years ago

        Sounds about right.

    • burntham77
    • 8 years ago

    As much as I love the idea of 6- and 8-core CPUs, I have yet to find a use for them. My quad-core still gets the job done no matter how much I throw at it. Still, I do like that “turbo” feature on the 6-core chips, and would consider one of those once the TDP gets down to double digits.

      • derFunkenstein
      • 8 years ago

      I think 6-core Zambezi will be close to double-digits. The 8-core chips are supposedly 125W, so turning off 1/4 of the chip has to save SOMETHING, right?

        • OneArmedScissor
        • 8 years ago

        That’s nowhere near 1/4 the chip. Cores are tiny. Most of the chip is cache, and there are all sorts of other things in there, like the memory controller, northbridge, power management circuits, and gourd knows what else.

        TDP is dictated by clock speed. Despite the fact that some CPUs have so many more cores than others at the same manufacturing nodes today, because of turbo boost, it still comes down to clock speed.

        Only the fastest 8-core, which exceeds 4 GHz and likely required higher voltage to do that, is 125W. The rest are supposed to be available as 95W parts, including even the quad-core, as they basically run the same speeds, but slightly lower when more cores are active. Hopefully that makes sense.
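A rough way to see why only the top-clocked part would break the 95W envelope: dynamic power scales roughly with frequency times voltage squared. The numbers below are illustrative guesses, not actual Zambezi clocks or voltages:

```python
# Dynamic CPU power ~ C * V^2 * f, so scale a baseline figure by
# frequency and voltage ratios. Inputs are illustrative, not chip specs.
def scaled_power(base_watts, freq_ratio, volt_ratio):
    """Scale a baseline power figure by clock and voltage multipliers."""
    return base_watts * freq_ratio * volt_ratio ** 2

# A ~10% clock bump that needs ~10% more voltage costs ~33% more power,
# enough to push a 95 W part into 125 W territory:
print(round(scaled_power(95, 1.10, 1.10), 1))   # -> 126.4
```

This is why small frequency differences between otherwise identical dies can land them in different TDP classes.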

          • jensend
          • 8 years ago

          It’s true that a lot of the chip is cache, but a lot of that is per-module L2, and the L2 associated with the disabled module would also be disabled. On top of that, faster caches are power-hungrier per MB than the slower shared L3, and a lot of the other shared components you mention are also on the comparatively-less-power-hungry side. So while the difference of course wouldn’t be a full 25% of the chip’s power, it might be closer (when all cores of both chips are in use and both are at the same clocks) than you’d think.

          Of course, you’re right about turbo boost etc giving CPU manufacturers the ability to target a wide range of TDP, which they generally use to give the fewer-core parts close to the same TDP as the higher-core variants.

          • derFunkenstein
          • 8 years ago

          well, a quarter of the cores and associated cache. You get the idea.

          Also, it should still lower power consumption either way. :p

      • BobbinThreadbare
      • 8 years ago

      You must not do any video encoding.

    • kamikaziechameleon
    • 8 years ago

    So basically, when this line of CPUs actually launches, they’ll be between 6 and 12 months behind Intel again. Impressive.

      • NeelyCam
      • 8 years ago

      It [i<]is[/i<] impressive, actually. They've been some 18-24 months behind for a while now. BD coming out is a good thing for everyone (except Intel). We can make fun of AMD delaying and delaying and hyping and having yield issues and whatnot, but we really don't want AMD to fail and disappear. I like Intel CPUs, and I really want their prices to keep going down.

      • just brew it!
      • 8 years ago

      As long as they’re priced accordingly, and AMD manages to stay in business for at least one more product generation, it is still a net win for consumers. It may not be a net win for AMD’s shareholders, but hey I don’t play the stock market.

    • Tristan
    • 8 years ago

    Bulldozer is worse than SB in all respects: it needs more silicon, cores, frequency, and power to match Sandy Bridge, which is not even certain. Junk, and so much of it :)

      • Elsoze
      • 8 years ago

      well as you said…
      [quote<]which is not even certain[/quote<] so pls do everyone a favor and wait until it is certain before attempting to be "certain" 😛

      • basket687
      • 8 years ago

      Well, we don’t know Bulldozer’s performance yet, but I will be really disappointed if its multithreaded performance isn’t higher than the 2600k.

      • novv
      • 8 years ago

      Did you manage to test a Bulldozer CPU from AMD? Please share some results with us, not just speculation! But I’m sure that you already have an SB system and you’re afraid that the new platform from AMD will be better. Anyway, to be on topic, I just wanna say that now it’s clear that the FX Bulldozer will be available for sale August-September (the winners will receive the prizes starting 9 September!)

        • Ryhadar
        • 8 years ago

        [quote<]Anyway to be on topic I just wanna say that now it's clear that FX Bulldozer will be available august-september for sale (the winners will receive the prizes starting 9 september!)[/quote<] If [i<]only[/i<] I could take away the same conclusion from what AMD's lawyers wrote. All it says to me is that they're going to pick the winners, but not necessarily give out BDs on that date. That said, I keep seeing a September release, so maybe it's not far off. I just wish I hadn't seen the rumor about an October release... but what's one more month, right? *nervous chuckle* Bah, whatever. I'm still good on my C2D @ 3GHz. I probably wasn't even going to buy a new mobo and processor until next year anyway.

      • smilingcrow
      • 8 years ago

      The TDP is a concern if performance only matches that of the i7-2600K: 125W for the CPU alone versus 95W for CPU+GPU. That would make the i7 roughly 30% more efficient, which might not be an issue for the mainstream, but for overclocking it could bode ill, especially on air cooling.
      There are too many other unknown parameters to guess how overclocking will fare, but even if it only matches the i7-2600K at stock speeds, that’s still a very welcome big jump for AMD.
      I suppose its natural competition, if you ignore price, is SB-E.

      • maxxcool
      • 8 years ago

      such smelly bait

      • ronch
      • 8 years ago

      Be careful what you say against AMD. The Fanbois will thumb you down.

        • cegras
        • 8 years ago

        trolololo

          • ronch
          • 8 years ago

          Real funny. You know, you should think about being a comedian.

      • VILLAIN_xx
      • 8 years ago

      Hmm……I saw this posting before i even clicked on this link today. I am a wizard.

      [url<]http://images.sugarscape.com/userfiles/image/NOVEMBER2010/Lauren/251110danielradcliffemain.jpg[/url<]

    • blorbic5
    • 8 years ago

    What’s a Ruby doll and why is it worth $25?

      • Ryhadar
      • 8 years ago

      It’s been rumored* on other forums that it’s a doll of the “blow-up” variety.

      *[spoiler<]I'm kidding[/spoiler<]

        • kvndoom
        • 8 years ago

        Unlike the CPU, it would be obsolescence-proof! :D

      • burntham77
      • 8 years ago

      [url<]http://img.ncix.com/gif/43249.jpg[/url<] Best I could find. Although I bet if you googled her with safe search off, you'll get more interesting results. I'm at work though so... no go.

      • ronch
      • 8 years ago

      Maybe it’s life-size and you can substitute it for a real girlfriend.

    • CaptTomato
    • 8 years ago

    1/5000000000000 people will be able to use 8 cores.

      • basket687
      • 8 years ago

      Well, any program that can take advantage of Hyper-Threading on a quad core will be able to benefit from 8 Bulldozer cores.

        • CaptTomato
        • 8 years ago

        8 inefficient cores??

      • DancingWind
      • 8 years ago

      Really… that’s 1 in 5 billion? Wow, so if I fire up a multithreaded encoder, no one else will be able to use it? Talk about epeen power :D

        • Creech
        • 8 years ago

        Actually that’s 1 in 5 trillion. So basically CaptTomato is saying chances are that no one on Earth will be able to use 8 cores.

        Lol, is Zambezi really that fast? :P

      • Goty
      • 8 years ago

      I feel special now, thank you.

      • designerfx
      • 8 years ago

      0 people “use” 8 cores. Plenty of programs and computers, on the other hand, do.

      • Arag0n
      • 8 years ago

      Last time I was doing some image processing and pattern matching, I had the feeling that the more cores, the better… maybe it was just me… there’s a market for everything, man. I’ll grant you that Llano can be good in the low end, where good-enough performance is the target, but there are tasks at the top that call for something as fast as possible, with flexibility if possible.

      • maxxcool
      • 8 years ago

      WOW I did not know i was so RARE!

      • dpaus
      • 8 years ago

      Or, um, every one of our clients, who are running our Java application.

      • albundy
      • 8 years ago

      1/5000000000000 people still use Windows 98SE? Wow! That’s a lot of poor people needing upgrades. Anything I fire up these days on my Win7 machine uses all four of my cores.

        • CaptTomato
        • 8 years ago

        So now you’ll spread that out over 8 slow cores….K, I get it{not!!!}

      • pdjblum
      • 8 years ago

      How the f*** do I know if I am the one who is able to use the eight cores? Some authority has to tell us who will be the one, so the rest of us know not to use all eight cores.

      • ermo
      • 8 years ago

      /usr/src/linux # make -j9
      or perhaps
      /home/ermo/OpenWrt/backfire $ make -j9

      … you get the idea. If a schmuck like me can do it, so can a whole lot of other programmers, engineers, 3D artists etc. 1 in 5 trillion? Try 1/1000.
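
      A minimal sketch of the same idea without hard-coding the job count (this assumes GNU make plus coreutils’ `nproc`, neither of which is guaranteed to be present on every system):

```shell
# Ask coreutils how many CPUs are usable, then size the parallel build to match.
JOBS="$(nproc)"
echo "would run: make -j${JOBS}"
# make -j"${JOBS}"    # uncomment inside a real source tree with a Makefile
```

      On an eight-core Zambezi box this would expand to `make -j8`.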

        • just brew it!
        • 8 years ago

        Obviously he meant 1 in 5 trillion grandmas who use their computers only to read their e-mail, buy crap on eBay, and surf Facebook for pictures of their grandkids… :D

    • wingless
    • 8 years ago

    No complaints here. Bring it on AMD!

    PS: Are they required to quote a value for tax purposes?

      • derFunkenstein
      • 8 years ago

      Yes, because you’ll get a 1099 and have to claim it on your income tax, at least in the US.

        • just brew it!
        • 8 years ago

        It is still below the reporting threshold ($600), so AFAIK AMD is not obligated to send you a 1099 or report it to the Feds.
