Poll: What do you think of Bulldozer?

In case you haven’t noticed, Bulldozer is out. Our review of the first desktop implementations of AMD’s all-new microarchitecture isn’t terribly positive. Finding a silver lining among the mountain of performance, power efficiency, overclocking, and value data we collected sure is tough.

We’re curious to see what you think of the chip, so we’ve made it the subject of our latest poll. You can let us know what you think by voting below or in the middle column on the front page.

Our last poll asked how game developers should deliver higher-quality art assets for PC releases. A 64% majority would prefer to see PC games released after their console counterparts if it means things like higher-resolution textures will be included. 23% don’t want to wait; they’d rather see PC games updated with more graphical goodness after coming out alongside console flavors. The remaining 13% stubbornly insist that gameplay is everything, and that they don’t care about art quality.

Comments closed
    • mutarasector
    • 8 years ago

    Hate to say it after being an AMD guy for as long as I have, but BD is seriously disappointing and is darn near the last straw AFAIC. I can only hope that Trinity turns out to be something worthy (and on time), but I'm ready to throw in the towel on them, as AMD seems to stand for "Anemic Marketing & Design" these days. I suspect my Llano mITX board will be the last AMD-based device for me unless Trinity nails it on time and out of the box on release. This isn't just because of BD, either; it's also the less-than-stellar performance of USB 3.0 in the Hudson FCH.

    Neely, I think your persistence has paid off, and possibly made a convert out of me.

      • sschaem
      • 8 years ago

      We have more clarity, and it does look like Trinity will be a decent offering.

      The 1.4GHz BD B2 test at 0.8V seems to show amazingly low voltage for that clock and performance.

    • Kaleid
    • 8 years ago

    Bulldozers are big, thirsty, and slow. Time for AMD to develop a cheetah instead.

    • ronch
    • 8 years ago

    I'm OK with FX. The dust has settled and I know what it is: a multi-threaded monster that's practically a Phenom II X6, but one that surges ahead to Sandy Bridge levels in multi-threaded apps. If you buy it and know exactly what you're getting, then you should be happy with it.

    But to target FX specifically at gamers, well, I think they’re being a bit underhanded about it. From Anandtech’s review (Page 8 – Gaming Performance):

    [quote<]AMD clearly states in its reviewer's guide that CPU bound gaming performance isn't going to be a strong point of the FX architecture, likely due to its poor single threaded performance. However it is useful to look at both CPU and GPU bound scenarios to paint an accurate picture of how well a CPU handles game workloads, as well as what sort of performance you can expect in present day titles.[/quote<]

    And they're screaming that it's for gamers (it surely invites such an assumption, being branded FX and all)? Civ 5 does well, until you run the no-render benchmark. That's excusable, because it's just a benchmark and you don't play with no-render on, but in most other games Anandtech ran, Bulldozer practically trails the pack. I guess that's why they're selling the FX alongside their graphics cards. It's just like the Phenom and Phenom II days: you sell the platform, not just the CPU, if your CPU can't stand on its own. And from the AMD blog:

    [quote<]If you are running lightly threaded apps most of the time, then there are plenty of other solutions out there.[/quote<]

    He might as well have said, "If you don't like BD, f*** off."

    [quote<]But if you're like me and use your desktop for high resolution gaming and want to tackle time intensive tasks with newer multi-threaded applications, the AMD FX processor won't let you down.[/quote<]

    I'm sure the FX will not let you down with multi-threaded apps. But SB not only gives you great multi-threaded performance, it ALSO gives you great single-threaded performance, and does so using less energy and at a lower price. (I'm sure many people would be OK adding $60 to the total cost of a computer to get an i7 and better all-around performance as well.) How's that? Honestly, it's marketing bull****. I really don't know about this guy, but I'm definitely not gonna buy a computer just to zip files or transcode videos. And even if that's the primary purpose of my computer, I want it to be able to excel at other things as well. As the saying goes: not that you would, but you could.

    Again, I think FX is fine for what it truly is. [u<]It is what it is.[/u<] I even plan to buy one when AMD improves performance and lowers TDP levels. But these AMD marketers keep putting on a straight face about it, acting as though there's absolutely nothing wrong with the design. In a sense, they're right. But we don't live in a perfect world with perfect multi-threaded apps. I know they're marketers and they're just doing their jobs, but doing so with such arrogance and stubbornness, denying that there's anything wrong with BD at all in our imperfect world, is really pushing marketing BS to new heights. Well, I guess that's progress for you.

    • kamikaziechameleon
    • 8 years ago

    A family member needed a new computer build for the drafter at his work. He proposed an AMD APU; I fell out of my chair.

    Realistically, AMD doesn't have a current CPU architecture for lower-end workstations and high-end consumers. That is a huge gap in their product line. That's not to say they don't have "worthwhile" offerings, as many of the posters on here propose.

    I really like their APU offerings; they aren't the best, but they are a fairly asymmetrical offering in terms of strengths and performance. I had hoped Bulldozer would continue that trend, but it sadly launches as a completely pointless offering, with last gen's X6 processors stealing the show among budget multi-threaded processors. You can get a 1090T for only $160 vs. $230 for an 8150. WOW!

      • FuturePastNow
      • 8 years ago

      I’m actually surprised AMD hasn’t offered a version of their Llano APUs for low-end workstations, branded as having “Opteron” cores and “FirePro” graphics. Just a straight re-branding and a business graphics driver. It would require literally no work on anybody’s part but hit all the right marketing points to sell low-end business machines.

        • sschaem
        • 8 years ago

        Because they can't manufacture them?

        I'm surprised AMD didn't jump all over Flash 11. It's the ideal, if not the best, APU killer app they could dream of…

        AMD marketing, asleep at the wheel again.

          • travbrad
          • 8 years ago

          [quote<]AMD marketing, asleep at the wheel again.[/quote<]

          They aren't asleep. They are driving a different car.

    • DeadOfKnight
    • 8 years ago

    Core i5-2500K FTW

    • CMOl
    • 8 years ago

    I am very disappointed with the Bulldozer CPU. This was supposed to be the comeback kid. They could have die-shrunk the X6 and added more L3 cache, or offered a desktop version with a quad-channel memory controller; after all, there's a server Bulldozer CPU that has one. I saw on a hardware review site that Windows 8 will align threads more intelligently, putting the right thread on the right module; the World of Warcraft: Cataclysm frame rate increased 11.8 percent. Maybe AMD could have done something like that for Windows 7. If only they had a Windows 7 driver/app that aligns threads, and a quad-channel desktop version of the Bulldozer CPU, things could have been different. When I build a new system next year, it will be Intel.
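
    To make the scheduling idea above concrete, here is a minimal Python sketch of what a hand-rolled, module-aware workaround on Windows 7 might look like. It assumes the common Zambezi layout where logical CPUs (0,1), (2,3), (4,5), and (6,7) each share a module; the core list and the psutil usage are illustrative assumptions, not AMD's actual hotfix.

    [code<]
    # Hypothetical sketch: approximate module-aware scheduling by pinning a
    # lightly threaded process to one logical CPU per Bulldozer module, so its
    # threads don't contend for a module's shared front end and FPU.
    # Assumes cores (0,1), (2,3), (4,5), (6,7) pair up into modules -- verify
    # against your own machine's topology before relying on this.
    import os
    import psutil  # third-party package: pip install psutil

    ONE_CORE_PER_MODULE = [0, 2, 4, 6]  # first logical CPU of each module

    def spread_across_modules(pid):
        """Restrict the given process to one core per module."""
        psutil.Process(pid).cpu_affinity(ONE_CORE_PER_MODULE)

    if __name__ == "__main__":
        spread_across_modules(os.getpid())
        print(psutil.Process().cpu_affinity())  # -> [0, 2, 4, 6]
    [/code<]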

    • ronch
    • 8 years ago

    Forget all the benchmarks you’re seeing. If there’s anyone who could tell you whether FX is the chip for you or not, it’s AMD.

    [url<]http://blogs.amd.com/play/2011/10/13/our-take-on-amd-fx/[/url<]

      • RtFusion
      • 8 years ago

      I'd rather look at the benchmarks from well-known sites like TR than listen to someone from marketing. I let the numbers speak about the product, not some marketing guy.

        • sschaem
        • 8 years ago

        AMD speaks as if this B2 stepping is exactly what they had hoped for, no problems whatsoever.

        Reality is, B2 is a failure. Well, it's not an architecture failure, but a cache coherency bug plus overvolting is ruining this launch.

      • chuckula
      • 8 years ago

      I think I detect a subtle hint of sarcasm in your post there, ronch….

        • ronch
        • 8 years ago

        I WAS being sarcastic.

      • sschaem
      • 8 years ago

      AMD is still NOT coming clean on this B2 stepping…. this is going to come back to bite them in the a**.

      Nobody likes a cover-up. I almost feel like buying an FX-8120 just to expose the truth about this B2 stepping…

    • ronch
    • 8 years ago

    Did AMD hype BD so much that our expectations became too high, only to be burst when the reviews came in? Tomshardware.com seems to think so.

    AMD marketing really sucks. That John Fruehe doesn’t look like a marketing guy at all!

      • Crayon Shin Chan
      • 8 years ago

      JF-AMD plain lied. He claimed somewhere in the TR forums that IPC would actually go up.

      • sschaem
      • 8 years ago

      No question about it, AMD over-hyped Bulldozer, especially for the desktop market….

      But I personally think that for servers the architecture is actually sound, and the potential will be unlocked when:

      a) AMD fixes BD cache writes
      b) AMD fixes BD memory writes
      c) AMD addresses the clock of the NB/L3 cache

      After that, what's needed for the desktop parts:

      d) GlobalFoundries / AMD tweak the 32nm process & BD die so that less voltage (1.4V is insane) is required at peak.

      Not so much for power efficiency during load (and idle power is A-OK already), but this would allow 6GHz turbo on air.

      We know the BD architecture supports 8GHz clocking; hopefully AMD will address the overvolting issue.

        • ermo
        • 8 years ago

        When I first saw the architectural slides with [u<][b<]4[/b<][/u<] pipelines per (semi-)core, my first thought was "this thing is going to ROCK".

        Then I looked more closely at the fine print (thanks to you and chuckula) and realized that I'd misunderstood the part where two of the pipelines are AGUs (so 2 ALUs + 2 AGUs per core), and that if Barcelona had been painted the same way, it'd look like a [u<][b<]6[/b<][/u<]-pipeline design (3 ALUs + 3 AGUs per core).

        That, coupled with the sub-par realization/process issues for Zambezi and the cache-aliasing OS scheduling issues, made this a so-so release. And of course, AMD hyped the damn thing beyond all recognition.

        Still hoping that Trinity will be better balanced for desktop scenarios, but not really counting on it. I'm going to replace my PhII X4 @ 3.8GHz at some point, and if 'at some point' was now, I'd get a 2600K, no question.

          • chuckula
          • 8 years ago

          [quote<]Still hoping that Trinity will be better balanced for desktop scenarios, but not really counting on it. I'm going to replace my PhII X4 @ 3.8GHz at some point, and if 'at some point' was now, I'd get a 2600K, no question.[/quote<]

          It's Chuckula here... I actually am much more hopeful for Trinity at this point (with the assumption being that GloFo will be able to make them with decent yields). The reason is that Trinity is *really* designed for the mobile world. Lots of the "server only" bits of Bulldozer will not go into Trinity at all, hopefully making it small enough and low-power enough to be a decent performer.

          Trinity will most certainly not beat Ivy Bridge at CPU-bound tasks, but it should allow AMD to leverage Trinity's much superior GPU while staying within a halfway decent thermal envelope. In desktop systems, Trinity will play to the same market where Llano has some strengths: HTPCs or simple desktops where you do *not* want a discrete GPU.

          In Trinity, AMD is actually listening to Sun Tzu: attack the enemy (Intel) where he is weak (the GPU). AMD should be able to compete in these areas by playing to their strengths instead of launching suicide charges at Intel's strong points.

    • Fighterpilot
    • 8 years ago

    No doubt the engineers at AMD have been reading the reviews of their new baby.
    Must be tough to see it called a monkey/turd/pig…etc.
    *waves*
    Now you know why we GPU fans hated the change from ATi to AMD for the graphics dept.

    • Coran Fixx
    • 8 years ago

    The cheese took so long to ship that now it's moldy.

    • ew
    • 8 years ago

    Before I make a final judgment on Bulldozer, I want to see some server-oriented benchmarks. This architecture seems to focus on concurrent non-floating-point tasks, and it might do really well for database and web hosting kinds of things. If it does turn out to work well for those, then AMD needs to double down on their efforts to get general floating-point tasks onto the GPU.

    • HisDivineOrder
    • 8 years ago

    I consider it in comparison to other, similar events. That is, is it more GeForce FX 5800/ATI 2600, or P4 Willamette/Prescott/AMD Phenom?

    I think it'll be closer to the FX 5800 end of the spectrum. That is, I expect AMD to "quickly" (for a CPU architecture, anyway) update this to have better IPC next year. It'll probably still not beat Sandy Bridge (and especially not Ivy Bridge) at the high-ish end of the mainstream, non-SB-E parts, but it'll be a lot closer to the speeds and IPC they were aiming for.

    I think Bulldozer is what was meant to be released last year, and this is why they'll have Piledriver cores ready so soon afterward for Trinity's release. Again, it won't beat Intel's line in pure CPU-to-CPU performance, but it'll make the differences a little less severe.

    So I think it's disappointing that they were so far off target, but I always saw Bulldozer as a hint of what was to come anyway, since I knew Piledriver was literally a few months away regardless. If Bulldozer had come out when it was intended to, especially on its original schedule, it would have been received a bit more favorably.

    This chip reminds me of the FX 5800/ATI 2600 (or even Fermi, to a lesser extent, since at least Fermi was a good performer for the most part) because it's gobbling power, it's got inferior performance to even its predecessors at times, and it seems to have a successor a few scant months away.

    If I had been AMD, I would not have released this chip to the public as a CPU for personal use. I would have kept it as a server option and stopped there. Releasing it puts the stink of failure onto the re-launch of the FX line, a stink that never quite cleared off the Phenom branding after its disastrous launch. In fact, Phenom had a lot of that "AMD is gonna finally get back in the game!" talk around it, and when it came, it just fell flat. Phenom wound up later not being that bad a CPU, assuming you were more focused on value and less on high-end performance, and I'd even go so far as to say Bulldozer is likely to have a similar fate. But that wasn't what AMD was saying about Bulldozer, and it wasn't what the AMD faithful wanted to see, I guess. Having read a lot of comments elsewhere and here, it feels to me like AMD releasing this chip like this is the last straw for a group that has been waiting a long time for the promise of an AMD chip that would put them closer to the Intel 2600K range.

    There is a LOT of "That's it. I'm going Intel," or "I love AMD, but I can't wait any longer." Intel hasn't hurt themselves by offering the K line to facilitate overclocking and a line of cheaper i3s that look incredibly competitive for gaming-focused computers. They especially look great in the power usage/heat production area, areas that AMD seemed to pioneer back in the day when AMD led the charge to de-emphasize the importance of clockspeed in favor of product numbers.

    To go back on all of that and push out their version of Netburst seems peculiar when power usage and low power numbers have become the metric by which CPUs are judged. This will be further emphasized as Intel continues to push TDP lower and lower while AMD is tied to an architecture that seems destined to stay up in the stratosphere of power, no matter how much performance they wring from future iterations of the Bulldozer-inspired line. That's probably one reason why Intel is doing it: it makes them look so much better when their chips outperform their chief competitor's AND are considerably less power hungry while doing so.

    I do think AMD used Bulldozer and its initial design as a way to separate some functions out so they can later be merged into the GPU, enabling more modules to be slapped in to replace the duplicate functions that get merged across GPU and CPU. The FPU seems especially ready for that, given its decoupling from the INT units. I also can't help thinking that the emphasis on higher CPU clockspeeds is something that will benefit GPU functions later as they're brought into the CPU (no more slapping a GPU alongside the CPU haphazardly). I imagine AMD eventually doing what NVIDIA did with Kal-El: having a low-power, low-performance core for system overhead/processes and a series of high-power, high-performance modules for when more oomph is required. Having modules means they could create a slower module to install side by side with the faster modules of the current FX series, helping reduce power and heat in the long run.

    But releasing this chip in this state, in this way, colors it with the same meh that Phenom had for its entire lifespan, and the same "Ehhhh…" that it passed down to Phenom II for its lifespan. The new branding was meant to distance AMD from that, but instead they've painted their new line with an even worse stench. You have the A4/A6/A8 line that stinks of "old technology," and now the FX line with its "We waited YEARS for AMD's rehash of the Netburst debacle?" Only the E and C line CPUs wound up not being a massive disappointment.

    Meanwhile, Intel's over there offering CPUs with better performance, a fuller line, an insanely better power profile, and the promise of even better power profiles at higher performance levels in the very near future.

    Releasing this chip in this way was a serious miscalculation on AMD's part. They'd have been better off selling these chips as server chips and announcing that they had cancelled the consumer version of Bulldozer before mass production could begin, deciding that with Piledriver so close it was better for everyone to just move to that instead. Fans would have been disappointed, but not THIS disappointed. Everyone else would have read between the lines and known that AMD skipped a flop, like Intel did with Larrabee.

      • smilingcrow
      • 8 years ago

      I hear a lot of talk about Piledriver being so close, but going by past experience I would say let's wait until we have a FIRM release date before jumping to conclusions. If Piledriver were that close, maybe AMD would have pulled the BD release, but they may not have been able to do that if they had contracts to fulfil.
      I'm not meaning to be completely negative, as PD may have improvements on multiple fronts that turn it into a mediocre CPU.

    • Chrispy_
    • 8 years ago

    There needs to be a fifth option:
    [b<]THREE YEARS LATE[/b<]

    If this had released on time, it'd have been exceptionally competitive with first-gen i7s and i5s. As such, it's the technology that AMD needed in 2009.

    Yes, it's an architecture that has potential in the future, but it's no good at the moment and needs work on software support as well as some serious ironing out of the kinks at GlobalFoundries. Lower IPC isn't necessarily a bad thing, but AMD are clearly dealing with a hotter, leakier, lower-yielding chip than they designed, which is stopping them from reaching the clockspeeds necessary for their lower-IPC, higher-clock design to be successful.

    • just brew it!
    • 8 years ago

    Voted “disappointing but serviceable”.

    I'll consider getting one when they either get some of the issues fixed or lower the price to be more in line with its performance.

      • ludi
      • 8 years ago

      Same vote here.

      Unfortunately for AMD, I’ve already been putting off my next system upgrade for a couple years, and there just wasn’t any reason, price or performance, to pass over the i5-2500k.

      Sometime next week I will begin converting my primary desktop system over to Intel for the first time ever.

        • Chrispy_
        • 8 years ago

        I switched back to Intel when Core2 was new and shiny. I can’t say I really noticed the extra performance over my A64, but I did notice that it was a cooler-running, quieter solution.

        I never cared about having a silent PC until then, and it sparked my first fanless build, which was fun for a pet project. Still running that machine as an HTPC…

    • TaBoVilla
    • 8 years ago

    Honestly, it’s more than a slight disappointment =( What troubles me most are the following:
    [list<]
    [*<]Large die size, huge transistor count (not cheap to make)[/*<]
    [*<]Apparent frequency barrier[/*<]
    [*<]Not meeting TDP budget on a smaller process[/*<]
    [*<]Longer pipeline, so higher clocks are needed[/*<]
    [*<]Design advantages are nowhere to be seen on current code[/*<]
    [*<]Inconsistent performance, slow on single/dual-threaded stuff[/*<]
    [*<]It's difficult to position within the market; pricing will be off in most cases against existing AMD products and the competition[/*<]
    [*<]Come on! It only matches 1.5-year-old X6's most of the time. As mentioned before, AMD should have had a die-shrunk X6 with higher clocks instead[/*<]
    [/list<]

    AMD needed to pound the table here; they've barely justified having a chair in the upper-end CPU market. This honestly looks worse than the Barcelona launch, but if AMD can pull it together and ship quick, better iterations of BD chips, just like they did with the Phenoms, they might be competitive in the near future. Worse, Ivy Bridge is lurking dangerously close =(

      • HighTech4US2
      • 8 years ago

      AMD designed a Netburst-style processor (longer pipeline, so higher clocks are needed) after Intel already failed with that approach.

      Why did they think it would come out any different?

      —-

      Higher clocks require higher voltages, and both of those result in CPUs behaving like furnaces.

      Take a look at some of those 4-5GHz OCing results, with the CPU burning 300+ watts.

      • HighTech4US2
      • 8 years ago

      > Ivy Bridge is lurking dangerously close

      True. That is what my next build will have in it, as it will offer higher performance than both SB and BD, and lower wall power to boot.

      I will be replacing my Q6600 system, which I built when it was clear that AMD's first quad core would be late (and it turned out to underperform as well, kind of like how BD is being seen now).

      I was a long-time AMD builder before the Q6600, but I lost my AMD religion with the above-mentioned disasters, and AMD just continues to be a train wreck.

      • CBHvi7t
      • 8 years ago

      [list=1<]
      [*<]Large die size, huge transistor count (not cheap to make)[/*<]
      [*<]Apparent frequency barrier[/*<]
      [*<]Not meeting TDP budget on a smaller process[/*<]
      [*<]Longer pipeline, higher clocks are needed[/*<]
      [*<]Design advantages are nowhere to be seen on current code[/*<]
      [*<]Inconsistent performance, slow on single/dual-threaded stuff[/*<]
      [*<]It's difficult to position it within the market; pricing will be off in most cases[/*<]
      [*<]Against existing AMD products and the competition, it only matches 1.5-year-old X6's most of the time; as mentioned before, AMD should have had a die-shrunk X6 with higher clocks instead[/*<]
      [/list<]

      [list=1<]
      [*<]That is the big trouble: they cannot even sell it on price.[/*<]
      [*<]That can get better.[/*<]
      [*<]See bullet one.[/*<]
      [*<]No idea why they did it.[/*<]
      [*<]That will not change.[/*<]
      [*<]What do you mean?[/*<]
      [*<]It completely misses the consumer market.[/*<]
      [*<]That is the good part: they could keep the old chips and squeeze some more profit out of them.[/*<]
      [/list<]

      They need to make massive changes to this product to fit the mass market. If they lose two modules and half the L2 cache, they could spend the power budget on a higher clock rate and sell the thing for $100. Not an enthusiast product, but a mid-range deal that will find customers.

      • clone
      • 8 years ago

      Does a high transistor count matter to anyone if it costs less to buy… not to make, but to buy?

      Overclocked Bulldozers set a record a month ago reaching 8400MHz, and you are mad about a "frequency barrier"?

      Does it matter to anyone that a CPU doesn't meet its TDP budget on a smaller process if it works as rated and overclocks 1000MHz higher than stock, on air no less? Are we getting that fussy nowadays about $200 CPUs? And lastly, does it matter how the CPU is made so long as the price is right to reflect its performance?

      I agree the supposed design advantages are barely showing a tangible gain atm, but is Bulldozer a failure…. only kind of. AMD cranks out Bulldozer, which really isn't a great desktop part, and offers nothing else. Had AMD done a die shrink and core revision on Thuban to boost IPC as much as possible while also lowering its thermals to allow higher clocks, then they would at least be covering both sides of the coin, if inefficiently, with two product lines.

      Bulldozer looks to be a server part. I've read a lot of talk that AMD is moving away from desktop, which they likely won't admit publicly for a while, but after seeing the numbers on Bulldozer, that is the direction they seem to be headed. Given AMD's size, they can't afford to cover all the bases like Intel, and they may be moving with Nvidia out of desktop, where they see little opportunity.

      It's obvious atm that BD won't fill the gap; it's a pretty solid yawn on the desktop front while looking promising on the server side.

      All of that said, I may still buy an AMD Thuban, as the perf is pretty good and I'm not looking to go high end on the CPU when I can get more out of graphics, an SSD, and memory; my budget for the next system just won't allow for it…… real drag that I waited for BD, though. I would have preferred to get the new system 6 months ago.

        • smilingcrow
        • 8 years ago

        "Overclocked Bulldozers set a record a month ago reaching 8400MHz, and you are mad about a 'frequency barrier'?"

        After seeing how poorly BD performs, it's totally embarrassing that they pulled such a lame promotional stunt. It's totally irrelevant anyway, even if BD were a good CPU. It's akin to the fat kid at school who was useless at sports bragging to the jocks that he holds the world record in tying his shoelaces. They should sack the marketing twit who came up with that idea.

        "Does a high transistor count matter to anyone if it costs less to buy… not to make, but to buy?"

        It sure as hell matters to AMD in the long run, for a company that has been struggling financially for years. If they go out of business, that will impact all buyers down the line.

    • mark625
    • 8 years ago

    Thanks for the review.

    (deleted incorrect 890gx comments)

    I was really hoping for a greater improvement in the memory controller and caches. The performance there does not seem to be in line with expectations, so hopefully that will improve over the next stepping or two. Also, keep in mind that AMD already has the newer Piledriver cores working in the next APU, the so-called Trinity.

    I voted that this initial release of Zambezi is a decent start for a completely new generation of processors. AMD should be rewarded for expanding our choices and bringing new technology to the market.

    • ronch
    • 8 years ago

    I went over the numbers here again and I realized that BD isn’t as bad as I initially thought it was. I presented my case in the Forums. You guys are welcome to check it out.

      • mesyn191
      • 8 years ago

      Nah, it's pretty bad. BD = AMD's Netburst. The ideas behind BD are sound, but AMD screwed up the implementation, and GF's process isn't very good either.

      Perhaps if BD were much, much cheaper you could say it was OK. But the die size savings aren't even there, and that was the whole point of BD's module approach in the first place. The damn thing is bigger than SB!!

      It's very hot, uses lots of power, has no price advantage, and performance isn't much better than PhII in most things.

        • HighTech4US2
        • 8 years ago

        Big, Hot, Slow, Unfixable

        Where is Charlie with his famous quotes? BD sure does fit the one above to a T.

      • sschaem
      • 8 years ago

      It's not as bad if:

      1) AMD truly had to overvolt to clock the chip competitively, and so will be able to reduce the voltage in future refined versions
      2) AMD had to propagate all writes to all cache levels to address a coherency issue across cores
      3) The L3 cache is running at half speed because of manufacturing issues
      4) The memory controller's write performance is borked, maybe related to how the B2 stepping has to handle writes

      If all four are fixed (alongside the OS scheduler patch), the result could be dramatically better.

      Right now AMD is on the ring floor, wearing the green shorts, hearing the countdown to a K.O… but it could be a Rocky situation: get back up with more punching power and reverse the fight 🙂

      AMD deserves all the "epic fail" it's getting… but man, I don't want AMD to just stay down for the count.

    • ronch
    • 8 years ago

    Ever since I read the Bulldozer reviews all over the Net, I've been feeling down. It's just so sad. I feel sad for AMD, Bulldozer, and the design team led by Mike Butler. They worked really hard on this chip for six years, and now that the numbers are out, they're getting a big bashing. On the chip's merits alone, it is one very nice design with ample performance for anyone; it's just that when you compare it to Intel, with its boatloads of R&D money and legions more engineers with a zillion more Ph.D. degrees under their belts, you realize what kind of competitor those AMD engineers are up against. I sincerely hope they can get things right really soon and fight back another day.

    Good luck, Mr. Butler. May the Force be with you.

    • JLW777
    • 8 years ago

    Can someone pair an overclocked BD with tri-SLI GTX 480s (OC'd too) and make the rig run under full load? Just curious to see the total power draw. Tbh, I never thought considering PSU capacity as a major concern when selecting a CPU would be an issue in late 2011…

      • ronch
      • 8 years ago

      This may help.

      [url<]http://www.msi.com/service/power-supply-calculator/[/url<]

        • travbrad
        • 8 years ago

        Combining that with HardOCP's numbers, it looks like you'll need 1500W at the bare minimum (and even that's pushing it). They showed the 8-core BD @ 4.6GHz using 450W when only running Prime95 small FFTs (i.e., not stressing the GPU or memory).

        That link shows 894W for three stock GTX 480s, so OCing them will add at least another couple hundred watts.

        So roughly 1100W for the GPUs + 450W – the idle power of the GTX 570 (the card used in the CPU test) = about 1500W. It's not good to run your PSU at 100%, either.
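
        As a sanity check on the arithmetic above, here is a rough Python sketch of the same estimate. The 450W and 894W figures come from the comment; the ~200W overclocking margin and the ~75W idle draw assumed for the GTX 570 in the CPU-only test are illustrative guesses.

        [code<]
        # Back-of-the-envelope PSU estimate for the hypothetical rig above.
        cpu_system_load_w = 450   # HardOCP: FX-8150 @ 4.6GHz, Prime95 small FFTs
        gtx480_tri_sli_w = 894    # MSI calculator: three stock GTX 480s
        gpu_oc_margin_w = 200     # assumption: OC adds a couple hundred watts
        gtx570_idle_w = 75        # assumption: idle GPU baked into the CPU test

        load_w = cpu_system_load_w + gtx480_tri_sli_w + gpu_oc_margin_w - gtx570_idle_w
        print(f"estimated load: {load_w}W")               # ~1469W, i.e. about 1500W
        print(f"with 20% headroom: {load_w / 0.8:.0f}W")  # don't run a PSU at 100%
        [/code<]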

    • Headloser
    • 8 years ago

    Man, this sucks.
    I don't need an 8-core or even a 6-core CPU. I just want a 2-core CPU, for crying out loud. NO GAME ON EARTH EVEN USES an 8-core or 6-core CPU.
    WHAT the hell were they thinking?
    At least they are backward compatible with the AM3 motherboard. That's the only good thing for now.
    I'll just wait and see.
    Okay, okay: a four-core CPU for some people who seem to like it. And yes, you are right, ronch, it's the AM3+ mobo that will take the new CPU. Thank you for pointing that out to me. I should have checked the info first.
    Still, the Core i5 is beating the crap out of AMD's newest CPU. Why haven't they used any of the ATI technology? They're doing well with the 6000 series, and the 7000 series is coming out soon.

      • ronch
      • 8 years ago

      As I understand it, it won't fit in an AM3 mobo. You need AM3+.

      • cygnus1
      • 8 years ago

      Not accurate. Civ 5, Supreme Commander, and SC2 all come to mind as games that use more than two cores.

        • khands
        • 8 years ago

        Using a quad-core is almost the norm, and probably will be next year.

          • Bensam123
          • 8 years ago

           They do use more than two cores, but they're terribly inefficient at using them. Leave your task manager open next time you play one of them, set to a low refresh speed. They use portions of different cores, but nowhere close to the whole thing. The same can be said about Source games: they use more than two cores, but not efficiently at all.

           Most game developers making consolized PoS titles design them around the lowest common denominator, and consoles don't have quad cores.
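
           For anyone who would rather script this than eyeball the task manager, a small illustrative Python sketch (it assumes the third-party psutil package and is not tied to any particular game):

           [code<]
           # Print per-core CPU utilization once a second while a game runs.
           # Cores hovering well below 100% suggest threads are only using
           # portions of different cores, as described above.
           import psutil  # pip install psutil

           def watch_per_core_load(samples=30, interval=1.0):
               for _ in range(samples):
                   per_core = psutil.cpu_percent(interval=interval, percpu=True)
                   print(" ".join(f"{p:5.1f}%" for p in per_core))

           if __name__ == "__main__":
               watch_per_core_load()
           [/code<]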

        • XA Hydra
        • 8 years ago

        Supreme Commander FTW! The difference between an E6600 and a Q6600 was pretty noticeable. That said, I'm happy parking at four cores for a while. There is always going to be someone who benefits from more, however. I remember when review sites and magazines still recommended a single over a quad core, and now some games are barely playable with one.

        It’s dangerous to go alone. Take extra cores. 😛

      • dashbarron
      • 8 years ago

      A lot of people made similar arguments when quad cores first came out, lamenting that nothing used quad cores and that people should just buy dual-cores. You need to have them before developers start programming for them. Now we've got a good batch of games running on, or in development for, four cores. The market creates a need for something, and then a company fills that demand.

      • ronch
      • 8 years ago

      [quote<]And yes you are right ronch, it the AM3+ mobo that will only take the new CPU. Thank you for pointing that out to me. I should have check the info out first.[/quote<] Sure, no problem.

      • maxxcool
      • 8 years ago

      BF2 and BF3 will use every core possible

    • rechicero
    • 8 years ago

    I'd say AMD executed half of what we expected. A lot of Bull, but so much for the Dozer part.

    • puppetworx
    • 8 years ago

    Limiting the scale to between "Okay" and "Terrible" will slant the results of any poll. I'm severely disappointed in your methods, TR, and hope this lapse from a balanced approach ends here.

    • StuG
    • 8 years ago

    Bulldozer was a real letdown. Enough so that I will now be buying my first Intel processor.

    • Krogoth
    • 8 years ago

    A server chip being shoehorned into a desktop position.

    TR's benchmarks show hints that Bulldozer will fare much better in the server arena. Unfortunately, that doesn't do anything for desktop users. We are going to have to wait until AMD retools Bulldozer into a desktop configuration (removing L3 cache, throwing in an integrated GPU, tweaking L1/L2 for better performance, etc.).

    Bulldozer's real enemy is the upcoming Sandy Bridge-Es.

      • dragosmp
      • 8 years ago

      It's a real server chip; look at the SPECviewperf 11 benches in the Legion review:
      [url<]http://www.legionhardware.com/articles_pages/amd_fx_8150fx_8120fx_6100_and_fx_4170,1.html[/url<]

      In server loads it thoroughly bests the i7-2600K and will probably be competitive at least up to the midrange with SB-E. Despite all its deficiencies, Bdozer may be a successful server CPU. In that same review they simulated lower-end FX chips by disabling some cores and tuning the multipliers. FX-4xxx CPUs seem pretty competitive with the i3, so it may not be a completely lost cause.

        • mesyn191
        • 8 years ago

        Power usage still sucks though. Many server guys will still stick with SB and IB variants over BD/PD. AMD’s server market share is likely to go down more rather than maintain its abysmal ~5% or go up.

          • khands
          • 8 years ago

           Nah, Interlagos is/will be far more power efficient than Zambezi, since it doesn't have to hit the same clock speeds in a server environment. I bet you can undervolt them pretty well too once you bring the clocks down.

          • OneArmedScissor
          • 8 years ago

           Have you actually seen a power test of the Opterons?

           Bulldozer runs over 1.4V stock in turbo mode – just like existing Phenom IIs. This is where those 125W TDPs come from. However, AMD's lower-power server CPUs, using the same Phenom II die, run at 1.0V, allowing for 65W dual-die chips, with the 12 cores, extra memory channels, cache, and all.

           That CPU wasn't even designed for that sort of power range. Customers were asking for it, so further down the line, they started offering it, and it still worked well enough.

           Bulldozer, however, was designed for exactly that. The problem with its power usage in desktops is that it's like overclocking an Atom to 3GHz: it's so far out of spec for what it was designed to do that the advantages are thrown out the window.
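
           A quick back-of-the-envelope check of the voltage point above, using the usual dynamic-power approximation P ≈ C·V²·f. Leakage, clock differences, and the single- vs. dual-die distinction are ignored, so this is only a rough illustration of why 1.0V vs. 1.4V matters so much:

           [code<]
           # Scale the cited 125W desktop figure down to the cited 1.0V server
           # voltage; dynamic power scales roughly with voltage squared.
           desktop_tdp_w = 125
           v_desktop = 1.4
           v_server = 1.0

           scaled_w = desktop_tdp_w * (v_server / v_desktop) ** 2
           print(f"{scaled_w:.0f}W")  # ~64W: right around the 65W server parts
           [/code<]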

          • sschaem
          • 8 years ago

           Server chips won't be clocked at 4.2GHz turbo, and will use the bins that allow AMD to run normal voltage.
           Those FX chips are surely overvolted, and that seems to be the reason the power usage is so high.

        • kureshii
        • 8 years ago

        The Legion/Techspot numbers look pretty suspicious:

        [url<]http://www.legionhardware.com/images/review/AMD_FX-8150_FX-8120_FX-6100_and_FX-4170/Synthetic_02.png[/url<]

        They don't match the numbers posted on other sites, such as BenchmarkReviews (http://benchmarkreviews.com/images/reviews/processor/fx-8150/specviewperf.png) and HardwareCanucks (http://images.hardwarecanucks.com/image/mac/reviews/AMD/Bulldozer/31.jpg).

        I'd take all posted SPECviewperf numbers with a pinch of salt until test settings are clarified.

        • djgandy
        • 8 years ago

        Erm, since when is specviewperf a server workload?

      • forumics
      • 8 years ago

      A GPU does floating-point calculations, doesn't it?
      What if AMD took out all the FPUs and put in one large GPU? Would that give better performance?

        • khands
        • 8 years ago

        They want to do that eventually, but fusing the two together is a huge undertaking, and both Intel and AMD have to take baby steps to get there.

      • esterhasz
      • 8 years ago

      [quote<]We are going to have to wait until AMD retools Bulldozer into a desktop configuration (removing L3 cache, throwing in an integrated GPU, tweaking L1/L2 for better performance, etc.).[/quote<]

      That's Trinity, then.

      I can't help thinking that AMD is (consciously or not) abandoning the high-end desktop. It seems like they don't have the resources to compete in all market segments anymore, and the enthusiast desktop market is probably the logical area to fade out of (stagnant market, lots of hassle, high R&D costs, etc.). It's a bit sad, but I'd probably make the same choice.

      • HighTech4US2
      • 8 years ago

      Intel has designed a single CPU that can be used for both Desktop and Server.

      AMD somehow can’t do the same.

      Looks like the problem is all AMD.

        • OneArmedScissor
        • 8 years ago

        Really? That’s funny. The future Sandy Bridge E/EN/EX is configured just like Bulldozer – and not at all like the PC iteration of Sandy Bridge or Trinity.

        Intel’s server CPU right now doesn’t even use the Sandy Bridge core. There are several different platforms, and they’re all still Nehalem. Nothing you said makes a lick of sense.

        • Krogoth
        • 8 years ago

        [double facepalm]

        What are you smoking?

        Intel has designed several different chips for different markets.

        LGA1366 (Bloomfield/Gulftown) = workstation/server platform, triple-channel DDR3, no IGP, no on-die PCIe controller.

        LGA1156 (Lynnfield) = desktop platform, dual-channel DDR3, no IGP, on-die PCIe controller

        LGA1155 (Sandy Bridge) = desktop platform, dual-channel DDR3, IGP, on-die PCIe controller

        LGA2011 (SB-E) = workstation/server platform, quad-channel DDR3, no IGP, on-die PCIe controller.

        If you want to use ECC memory (a must for any enterprise-class server), you have to get the Xeon versions of LGA1366 and LGA2011. This isn't the case for AMD, where almost the entire line-up has ECC support. Motherboard support is a different matter, though, for both platforms.

          • ermo
          • 8 years ago

          Actually, the [url=http://ark.intel.com/products/52277/Intel-Xeon-Processor-E3-1275-%288M-Cache-3_40-GHz%29<]Xeon versions of SB/LGA1155 support ECC as well[/url<] -- for single-socket servers with up to 8 threads, they're pretty decent (think MS Small Business Server, for instance).

          AMD is not especially competitive in that segment either, as both BD and the X6s use quite a bit more power, which is not ideal for a 24/7 server that doesn't idle much. The E3-1275 SB Xeon even has integrated graphics, if the motherboard chipset supports it.

      • kamikaziechameleon
      • 8 years ago

      “Bulldozer’s real enemy is the upcoming Sandy Bridge-Es.”

      Bulldozer's real enemy is AMD; last-gen chips fare better than this one in many applications. The fact that the flagship chip doesn't contend with anything in its price range and doesn't overclock is an indication that there are many issues.

        • sschaem
        • 8 years ago

        The biggest hurdles to AMD's recovery: the AMD marketing department headed by Nigel Dessau, and GlobalFoundries.
        The first issue is easy to fix; the second… we don't know what's going on behind closed doors, just that the situation on Sept. 28th was "no solution".

          • smilingcrow
          • 8 years ago

           Their marketing department is the least of their worries; they just need to fire the design team.

      • CBHvi7t
      • 8 years ago

      [quote<]wait until AMD retools Bulldozer into a desktop configuration (removing L3 cache, throwing in an integrated GPU, tweaking L1/L2 for better performance)[/quote<]
      Remove the L3? Certainly not a good idea.
      Removing half the L2 would be a good idea, because it would not only save space but also improve the poor cache performance.
      What do you need an integrated GPU for? This is not AMD's low-performance, low-power, low-cost part.

    • j1o2h3n4
    • 8 years ago

    We desperately need a Hitler version of the Bulldozer fail, like the famous iPhone 4S one:
    [url<]http://www.youtube.com/watch?v=Lxn6Ag0mmhs[/url<]

      • Manabu
      • 8 years ago

      It didn't take long for Hitler to hear about it: [url<]http://www.youtube.com/watch?v=SArxcnpXStE[/url<]

        • chuckula
        • 8 years ago

        Considering Bulldozer is made in Germany, that video is twistedly appropriate.

        • smilingcrow
        • 8 years ago

        First time I’ve seen one of these types of vids and I laughed out loud.

      • LoneWolf15
      • 8 years ago

      Ahhhh…. Godwin’s Law in action again.

      However, it was very funny.

        • chuckula
        • 8 years ago

        Technically they skirted around Godwin’s Law… they never said that AMD were Nazis, just that Hitler was unhappy with AMD….

        • ludi
        • 8 years ago

        Properly executed Downfall meme videos do not trigger a Godwin violation.

    • cheerful hamster
    • 8 years ago

    I build DAWs, and before Sandy B I had a few customers build AM2/AM3 systems to save a few bucks in situations that didn’t require the best available performance. Since then it’s all been Sandy, and BD will do nothing to change that. You can’t even pop one into an AM3 board for a cheap upgrade. The word that comes to my mind is not “fail” but “tragedy”.

      • destroy.all.monsters
      • 8 years ago

      Possibly – but if you need cores and don't have the cash for a 990X, they're just fine. It depends on how well the DAW software uses cores, certainly, and how much you're doing at one time.

      If you're running samples, ReWire-ing, etc. while recording, the cores are a benefit that Intel can't match – certainly not at that price point.

      On certain motherboards you can pop a BD into an AM3 mobo; just make sure the BIOS is updated.

      • sschaem
      • 8 years ago

      The fact that AMD did nothing to promote cheap AM3 upgrades is not tragic, it's retarded.

      It's like shooting yourself in the foot, and then shooting a second time to stop the bleeding.

      AMD was simply too lazy to validate 4-core BD on their AM3 platform…

    • 5150
    • 8 years ago

    It stinks.

      • tfp
      • 8 years ago

      [url<]http://www.youtube.com/watch?v=RiMOKmp0uZc&feature=related[/url<]

    • Hsew
    • 8 years ago

    Nobody on the entire interwebz seems to know what he is talking about! We all know AMD has some super-special-secret BIOS SAUCE guaranteed to boost performance into the next decade! Right? Guys? Anyone out there?

    • Lianna
    • 8 years ago

    You know, I really like to read TR, because after interesting news with in-house analysis I can read meaningful comments that often complement the information, add facts, and show other points of view. Today I have doubts. I consider myself an Intel fan, and all my PCs (but two) in the last 18 years were Intel inside (though I work on a lot of different machines), but the universal Bulldozer bashing I see today is very, very strange. Beyond TR's, Anand's, and Xbit's articles, which I would usually read, all this bashing made me read 12 different reviews to check others' opinions.

    Your – commenters' – expectations were colossal. Reading what you guys write, Bulldozer was supposed to have a die smaller than Atom's, use less power than SNB, have consistently better performance than the 2600/990X (whichever's better) in every benchmark and especially in games, overclock great without using more power, be cheaper than an X4 640, and fit AM2. If so, I understand you're disappointed, but some people had more reasonable expectations. BD is faster than the previous generation, and uses less energy at idle, and less under load than Intel's 9x0 series. It is fast enough for games. It is faster than the 2500 (and often the 2600) in most media/rendering/transcoding, where you really have to wait and can easily notice the difference in performance. It has all the new features, even in lower models (compare with support for AES-NI in Intel desktop/laptop CPUs – only i7s and the highest i5; I got burned by that). It is priced reasonably. And that AMD's marketing would hype BD? It's their job. When did TR's readership start believing in ANY marketing hype?

    Yeah, BD may be not enough for you. But writing about "epic fail" or "junk" is definitely out of proportion.

      • maxxcool
      • 8 years ago

      If we have any beliefs, they are as follows:

      1) 50% of TR is a rabid fan of one of the two camps. Sooo get used to it, and welcome to the internetz.
      2) 25% of TR waits until it's benchmarked by Damage… then proclaims fail/pass/win.
      3) The other 25% are here because it's a great site; they don't have a terribly great clue and may have "expectations"…. but those expectations were created BY AMD themselves.

      As for this release: you clearly are in marketing and don't get it. It took way too long to develop, only to produce a chip that merely matches current last-gen tech from Intel, and in some cases loses to much older Stars (K10.5) tech from the now VERY old Phenom X4/X6 CPUs. Unless you're buying a whole new computer for the first time…. or coming from something from 2009…. this CPU is not worth upgrading to. A 4-core Phenom overclocked to 4GHz, or in my case a 6-core 1090T running at 4GHz, costing much less cash and not requiring a platform upgrade, rubs BD's nose in the cost/value segment… where AMD used to live.

      Right now, they underperform compared to last-gen Intel and last-gen Phenom X4/X6… and cost more $. This is a fail…. it is not a pass and not a win.

        • kamikaziechameleon
        • 8 years ago

        ^^^THIS!!!

        • paulWTAMU
        • 8 years ago

        I'm in group 3 – not terribly tech-savvy by the standards of this site, but I like to dabble in it and read.

        But really, is it expecting too much to want it to outperform the last-generation Phenom lines?

      • kamikaziechameleon
      • 8 years ago

      Making a product that has no place in the market typically qualifies it as… junk. Saying that a cheaper product that has been on the market for 1-2 years is better than the latest and greatest, in a tech industry where things grow by leaps and bounds, is RIDICULOUS! AMD didn't just lose to Intel in the benchmarks and reviews; it lost to AMD, lol. Moving backwards in performance is not typically something nerds celebrate.

      • kroker
      • 8 years ago

      Yeah, that, or… this really is a failure, and some of us were hoping AMD would revitalize competition in the market. Will this launch force Intel to lower its prices and introduce faster-clocked SB processors? Does it bring things forward? In fact, does it change or affect the x86 landscape in any significant way?

      • XA Hydra
      • 8 years ago

      "Yeah, BD may be not enough for you. But writing about "epic fail" or "junk" is definitely out of proportion."

      No it isn’t.

      • sschaem
      • 8 years ago

      "BD is faster than previous generation" – lie.
      "Uses less energy at idle and under load less than Intel's 9x0 series" – deception; the 9x0 is much faster and barely uses more energy.
      "Is fast enough for games" – but you can get more for cheaper with a good old i5-2500K, hence the "epic fail" you see all over the web.
      "Is faster than 2500 (and often 2600) in most media/rendering/transcoding" – lie.
      "Is priced reasonably" – $245; looking at the performance results, that's nearly $100 overpriced. The 1100T is $175.

      AMD's marketing of BD is a circus of deceit. AMD is not a fly-by-night operation; they will have to live with these words for many years.
      AMD marketing destroyed what was left of AMD's image.
      So no, it is not the job of a marketing department to destroy a company's image.

      AMD marketing could have hyped Bulldozer the right way: not as a desktop CPU powerhouse, but as the introduction of a new architecture for servers, also made available as a desktop chip.
      All the slides they released in the past 12 months are going to haunt them… You can be cocky and arrogant, but you need to deliver.

      Remember the "Ready and Able" smear campaign against Intel earlier this year? That was smart 🙁
      AMD needs to get rid of the entire marketing department; it's broken, and it is hurting AMD… bad.

      What worries me is all the recent crap they are pushing about the FX, all of it endorsed by the new CEO… More of the same.

      AMD's image is being relegated to that of a garage operation with second-rate engineering. And it's not Intel's fault; it's all AMD's doing.

      In closing: AMD will survive, despite its marketing department.

      • Lazier_Said
      • 8 years ago

      It isn't that BD doesn't live up to SB. SB is an extraordinarily well-designed and well-executed product. AMD doesn't have the development resources to match that. That's fine.

      But BD doesn't even live up to AMD's own last generation. A massive 2B-transistor chip on a new process shrink… and outside of niche synthetic benchmarks that leverage 8 integer cores, it essentially runs even with 45nm K10 from three years ago.

      Just how badly does a product have to fail to earn the title? It's expensive, it's hot, it's slow, it doesn't overclock. Why would anyone buy this?

      • blitzy
      • 8 years ago

      BD doesn't need to be best in all aspects; it just needs to be best from one useful perspective.

      e.g. better performance but higher power consumption
      better value for similar performance
      high overclocking potential
      low power consumption for similar performance

      BD fails because it is not better than the competition from any perspective that is meaningful to me. It may not be that far behind Sandy Bridge, but why would I care? I would just get an i5-2500K and be done with it if I were building right now. It may be a sign of my age, but I just want the best bang for the buck and no hassles.

      • maxxcool
      • 8 years ago

      For a dash of salty defense on AMD's part… it's not an "EPIC fail"… just not a "pass" or a "win".

    • Mr.Lif
    • 8 years ago

    I bought one despite the questionable reviews. I'm not sure if I'll regret it or not, though. I was going to get the 8150 but went for the 8120 in its stead; I'm not sure if that extra 400MHz is worth it. I more or less made a promise to myself that I'd buy a Bulldozer processor, all of what, three years ago? When they were SUPPOSED to be right around the corner? Anyway, I've waited long enough and am going to damn well enjoy my CPU.
    At least it'll be an upgrade from my heavily overvolted, unlocked Phenom II X2. I imagine even the FX processor's high power consumption is still less than that Phenom's at 1.5V.

    Oh well, here goes nothing I guess.

      • Metonymy
      • 8 years ago

      C'mon, don't feel bad: no SB is going to keep you warm during this cold winter like the 8120 will.

    • oldog
    • 8 years ago

    So, I have a question for the intelligentsia after reading the Bulldozer reviews. Why are Intel chips better than AMD's? Better engineering? Better manufacturing? Tighter integration with software makers? AMD seems able to compete well with NVIDIA, but not with Intel.

    Something else?

    Inquiring minds and all that…

      • ronch
      • 8 years ago

      Intel has half the world's Ph.D.s. AMD keeps losing the few it's got.

      • mnecaise
      • 8 years ago

      Resources. Intel has huge resources. AMD’s resources are limited.

        • kamikaziechameleon
        • 8 years ago

        AMD used to have its own fabs; that was then, though.

          • mnecaise
          • 8 years ago

          Intel is the king of process. Very few can compete with Intel’s fabs and AMD was no exception. They knew this. Trying to keep up was killing them so they made the decision, for good or bad, to go fabless.

            • khands
            • 8 years ago

            At the time of the original FXs, AMD actually had a better process than Intel. They obviously learned their lesson, though, and will likely never give that up again (at least not until the memory of it has faded through several generations of executive turnover, but by then it likely won't matter).

      • k00k
      • 8 years ago

      They've got incredible economies of scale, not to mention cutting-edge fabs. That counts for a lot: think about it – over the course of, say, SNB's life from introduction, the process matures and more production fabs come online, all while prices stay mostly the same. Economics would tell you that prices for a coveted product are bound to get lower as it stays longer in the market and gains the benefits of production scaling, but no. If that's not a virtual monopoly, I don't know what else to call it; on the flip side, this is a monopoly that still manages to innovate. What Intel does to keep their pricing tiers is pure genius: they harvest parts that fail higher-end validation and introduce them into lower value tiers/price points, all while keeping cushy profit margins and the same products at largely the same price points. They do harvesting much, much better than Nvidia or ATI/AMD.

      They simply get a lot more out of their investment, especially well into a process/product's lifetime. And since they get a lot more, they can afford to throw more money and talent at silicon engineering problems.

      AMD has none of those, sadly. Whether by sheer incompetence, missed execution targets, or the confluence of other factors, we'll never know. Maybe IBM can lend them a hand once again, like in the K8 days?

      Man, I wonder what Jerry thinks of BD.

      • XA Hydra
      • 8 years ago

      I have no idea. I question Intel's size/cash/personnel/manufacturing as the culprit (though it couldn't hurt), since AMD went up against all that ten years ago and gave them a black eye. Intel was much larger than AMD then as well… Some really smart people just seem to have made some of the wrong design decisions (with Netburst… and arguably Bulldozer) – or were strong-armed into following some suit's orders, backed by business experience rather than engineering.

      Sooo… I suppose I will just go with quantum unpredictability 😛

        • bhtooefr
        • 8 years ago

        Intel had their resources spread multiple directions, too.

        In the time period that Netburst was designed and released in, they were coming out with three completely unrelated families of chip.

        September 2000, the XScale 80200 came out. This was designed by an ex-DEC team, but it still took Intel resources.
        November 2000, Willamette came out.
        April 2001, Tualatin came out. Mainly just a die shrink of Coppermine, but it was the beta test for Northwood’s process, so...
        June 2001, Merced finally came out after plenty of delays. HP had a ton of involvement in this one, but again, lots of Intel resources, and Intel was pushing Merced quite heavily, too.

        By 2003, Intel was actively supporting four processor families: the various XScales, extensions to Northwood plus the development of Prescott, Banias and Dothan, and Madison and its variants.

        At the same time, AMD was only actively supporting two processor families: K8 and Geode LX. Keep in mind that Geode LX is based on the ancient Cyrix 5x86 core, so it’s not exactly like they were doing much there.

        Nowadays, Intel has three actively maintained processor families: Atom, Ivy Bridge, and Poulson.

        AMD has two actively maintained processor families, but this time they’re both major designs: Bulldozer and Llano. So, resources are spread thinner at AMD and thicker at Intel.

          • XA Hydra
          • 8 years ago

          It’s too bad that the DEC Alpha got strangled... I’ve always wondered how things would’ve turned out... x86 was growing like crazy regardless, mind you, but that was an awesome piece of sand in its day.

      • OneArmedScissor
      • 8 years ago

      The most straightforward reason is that they’re too small to focus on everything at once. The Bulldozer core and design concept are not done for. It’s just that this iteration has no business in desktops.

      • Rza79
      • 8 years ago

      Let’s not forget that they bought some of the best processor makers in the world (e.g. Alpha, Elbrus, ...). I’m sure those architectures gave Intel some very good ‘ideas’.
      Alpha had been developing threaded architectures for years when Intel bought them, and Elbrus was a very wide monster of a CPU.
      BTW, AMD doesn’t compete with nVidia; ATI does. Just because AMD bought them doesn’t mean those GPUs aren’t developed in Canada anymore.

        • oldog
        • 8 years ago

        NVIDIA stepped on Intel’s toes and got a black eye correct?

        Is AMD gun shy?

          • HighTech4US2
          • 8 years ago

          You got that backward. Nvidia got $1.5 billion and a cross-license from Intel for Intel’s misdeeds over the chipset license.

          The deal:

          Nvidia can no longer make chipsets for Intel, but Nvidia doesn’t care, as that was a dying low-margin/low-revenue product. They are more interested in Tegra for revenue growth.

          Nvidia cannot make an x86 processor nor emulate the x86 instruction set. Again, Nvidia doesn’t care, as Tegra (ARM) is their processor of choice. See Project Denver.

          Nvidia gains $1.5 billion and a cross-license from Intel that allows Nvidia to produce ARM CPUs, both low-power and high-performance ones (again, Project Denver).

          Future supercomputers will feature Nvidia exclusively (i.e. no x86 from either Intel or AMD), as one Project Denver derivative will pair a high-performance ARM CPU with a high-performance GPU (Tesla) on the same chip.

          Also expect other Project Denver derivatives to show up in notebooks, set-top boxes, cars, TVs, tablets, and other mobile computing devices.

          As for AMD’s problems, they are mostly self-inflicted.

    • kroker
    • 8 years ago

    I voted “Bad enough it’ll be hard to salvage,” not just because of the disastrous performance per watt in many applications, but also because to me AMD’s reputation as a CPU developer is damaged beyond salvaging. The wait for Bulldozer has been excruciating, only for it to turn out to be a flop.

    Maybe Bulldozer will be better for servers, but I don’t run servers so I don’t really care.

    • StashTheVampede
    • 8 years ago

    Love the TR review, and it generally agrees with the others on the ’net (yay). This FX chip looks more like a “value” proposition than anything else. So the chip is $20-ish cheaper than the better (in most benches) Intel part, but what about the cost of the motherboard? Save another few bucks there and *maybe* you’re looking at not such a bad deal for buying a newer computer (or upgrading your existing AM3+ setup).

    The only real reason to buy this chip is if your current workload is heavily core+thread dependent. Are you a desktop user running a bunch of VMs? This chip will look pretty good, no? Are you encoding video on your single-socket system? Yup, not too bad. Maybe you’re running some SQL server queries that eat up a ton of CPU cycles. Alright, not the worst decision you could make.

    [H] used the BF3 beta benchmark to show that the fastest FX isn’t far behind the 2500/2600 in FPS. Threading is where BD will do better, and a lot of applications are moving away from being single-threaded altogether.

      • indeego
      • 8 years ago

      [i<]The only real reason to buy this chip is if your current workload is heavily core+thread dependent. Are you a desktop user running a bunch of VMs? This chip will look pretty good, no? Are you encoding video on your single-socket system? Yup, not too bad. Maybe you're running some SQL server queries that eat up a ton of CPU cycles. Alright, not the worst decision you could make.[/i<] All decisions you probably made in late 2010/early 2011, when you purchased whatever chip was available then.

        • LoneWolf15
        • 8 years ago

        Exactly. BD gives no incentive to upgrade for performance or power usage.

    • ronch
    • 8 years ago

    Comments on FX-related articles and forum topics filling up faster than you could flush and reload a Bulldozer pipeline after a mispredicted branch.

      • Kaleid
      • 8 years ago

      We need to overclock but do we have the power?

        • ronch
        • 8 years ago

        Of course we have the power. Be prepared for a surprise when your electric bill comes though.

    • Kaleid
    • 8 years ago

    Meh = PHENOMENAL FAIL

    I hope it still sells though for the clueless crowd.

    8x cores!!!!

    • Vulk
    • 8 years ago

    Well… As a workstation part, this kind of blows. As a server part it’s really kind of awesome, depending on the workload… So I’m kind of at a loss. The integer performance on these can be insane, and disabling the second core in each module to give the remaining core the full 256-bit FPU can make them monsters for certain FPU-intensive jobs… But yeah, I wouldn’t want one on my desktop. That’s disappointing, to say the least.

    • kamikaziechameleon
    • 8 years ago

    After thinking about it more, I’d vote “Bad enough it’ll be hard to salvage” instead of “Disappointing but serviceable.”

    It’s actually weaker than the last-gen offerings in many applications; that is simply unacceptable. I already have a 6-core 1090T and have no need of Bulldozer.

    • ultima_trev
    • 8 years ago

    Clock for clock it’s slower than the original Conroe/Kentsfield-era Core 2 processors of FIVE YEARS AGO, despite being fabricated on a semiconductor process half the size, and it draws more power. There is nothing to describe this other than EPIC FAIL.

    Oh AMD, where art thou? Oh mighty slayer of Pentium III and NetBurst with thy majestic K7s and K8s, why hast thou abandoned thy faithful?

      • dpaus
      • 8 years ago

      Well, you haven’t sacrificed enough virgins lately, have you?

        • ronch
        • 8 years ago

        ROFL

        • ultima_trev
        • 8 years ago

        You are absolutely correct, sir!

        Hail Satan!

    • ronch
    • 8 years ago

    As much as I’d hate to admit it, there are just too many things that are wrong with Bulldozer and AMD’s marketing.

    Let’s go through them one by one. Note that I took some of these from other review sites.

    1. Large die size. And we’re talking about it being built on 32nm and with no GPU.
    2. Inconsistent and often unimpressive performance.
    3. Power efficiency unimpressive as well, 32nm and power gating and all.
    4. Turbo Core seems to be sacrificing too much power to push performance.
    5. Overclocking doesn’t seem to improve performance a lot as compared to SB.
    6. Unlike previous generations, you’re required to buy a new motherboard to use it.
    7. AMD marketing sure hyped it to insane levels for a long time, and continues to hype it up even after we’ve seen the numbers. Check their website.
    8. It’s late.
    9. It’s priced like AMD doesn’t have a clue how disappointing it is for most of us.
    10. Big chance FMA4 and XOP won’t gain widespread adoption in the desktop space.

    On the good side…

    1. It’s still usable.
    2. We had a lot of interesting academic discussions about it, before and after launch.
    3. AES-NI implementation is robust (see the sketch after this list).
    4. It has an interesting design.
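
    For the curious, here is what that AES-NI point means in practice: a minimal C sketch (mine, not AMD’s code; the block and round-key values are made-up placeholders) of the kind of work the hardware accelerates. Each aesenc instruction retires an entire AES round that would otherwise take dozens of table lookups in software, which is why FX does well in encryption benchmarks. A real AES-128 encryption also needs key expansion and ten rounds.

    [code<]
    /* Minimal AES-NI sketch -- NOT a full AES implementation.
       Build with: gcc -maes aesni_demo.c */
    #include <wmmintrin.h> /* AES-NI intrinsics */
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint8_t block[16] = {0};          /* placeholder plaintext block */
        uint8_t rk[16]    = {0x2b, 0x7e}; /* placeholder round key */

        __m128i state = _mm_loadu_si128((const __m128i *)block);
        __m128i key   = _mm_loadu_si128((const __m128i *)rk);

        /* One aesenc = one full AES round (SubBytes, ShiftRows,
           MixColumns, AddRoundKey) in a single instruction. */
        state = _mm_aesenc_si128(state, key);
        state = _mm_aesenclast_si128(state, key); /* final round, no MixColumns */

        uint8_t out[16];
        _mm_storeu_si128((__m128i *)out, state);
        for (int i = 0; i < 16; i++) printf("%02x", out[i]);
        printf("\n");
        return 0;
    }
    [/code<]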

      • kamikaziechameleon
      • 8 years ago

      well put ^^^this

      • ronch
      • 8 years ago

      It would be really painful for AMD’s pride if they’re going to drop prices this soon after launch, after all the uber-hype. Give it a week or two.

      • sschaem
      • 8 years ago

      We need a ‘post mortem’ article (still appropriate even though it’s the chip’s birthday) that just focuses on “what the heck happened here” and goes into detail on those points with AMD commentary.

      I would love to hear what the BD design team has to say about this B2 stepping.

      Cache and memory performance are crucial for gaming, and something seems to be holding the chip back.
      Same goes for the voltage... Address both (plus the OS scheduler tweak) and it could be a complete reversal, where $245 makes sense.

      • forumics
      • 8 years ago

      5. Overclocking doesn’t seem to improve performance a lot as compared to SB.

      TR overclocked an SB to 4.5 GHz vs. a BD at 4.4 GHz, and the BD managed to equal the SB in the cases that were presented.

      • shank15217
      • 8 years ago

      It’s a large die because it’s a server chip with a massive cache and uncore.

        • mczak
        • 8 years ago

        That is partly true (there’s also quite a lot of unused space).
        If you tossed out the unneeded HT links and used the unused space too, you’d have enough room for a GPU. There are hints Piledriver will toss out the L3 cache completely as well.
        BUT the problem I have with the chip from a perf/area standpoint is that even assuming you do all that (and performance remains the same even without the L3 cache, which might be almost true for client workloads given the L3’s latency anyway), it would still seemingly end up comparable in die size to an SNB chip with the same number of modules/cores. So with roughly the same die area you’d get something similar in performance for multithreaded workloads but way slower for single-threaded ones, which is still not quite good enough.
        But I guess we’ll see how Piledriver actually does.

      • Risme
      • 8 years ago

      Well said.

      In my opinion, Bulldozer is a disaster. There’s no point sugar-coating it, and the sooner AMD honestly admits that without denial, the sooner they can start fixing whatever is wrong with it. So here are my initial thoughts.

      First, let’s take a look at this matter from business perspective.

      1.) R&D costs of the architecture, I’m not 100% sure but I think it has been in development since 2006. So the R&D costs must be pretty high.
      2.) The Orochi die, which all Bulldozer-based CPUs use, is 315mm² compared to Sandy Bridge’s 216mm², so Orochi must cost quite a bit more per die to manufacture.
      3.) Yields aren’t where they should be.
      4.) It’s not competitive with Sandy Bridge in terms of price/performance/W

      Second, let’s take a look at this matter from consumer or enthusiast perspective.

      1.) Performance is very inconsistent: worst case it loses to Thuban, especially in single or lightly threaded applications, and best case it wins over Sandy Bridge.
      2.) Most desktop workloads are single or lightly threaded, so who needs 8 cores on the desktop anyway?
      3.) It consumes way too much power, especially when overclocked.
      4.) Performance doesn’t scale as well with clockspeed compared to Sandy Bridge.
      5.) It’s priced way too high at the moment, and I don’t see how they can drop the price enough because the die is too damn big.

      There are much finer technical details behind why it performs the way it does, but I’m not going to go into those because I don’t have the technical knowledge to explain them. Besides, in the end the causes, though interesting, don’t matter to enthusiasts and matter even less to average-joe consumers. It remains to be seen how it performs in the server space, but unless it absolutely flies there, it won’t make up for the desktop shortfall and the R&D costs.

      Yields and power consumption will improve over time as GF’s 32nm process matures, but it won’t happen overnight. I don’t see how AMD could improve things enough with Piledriver, so maybe, just maybe, Steamroller will deliver, and that isn’t coming out until sometime in 2013.

      All in all I’m very disappointed, sad and frustrated by all this. Scott and Anand summarized the situation pretty well in their conclusions.

    • ronch
    • 8 years ago

    Double post. Apologies.

    • XA Hydra
    • 8 years ago

    Total disappointment. “Meh” performance with aircraft carrier power requirements and nuclear heat 🙁 Like many, I’ve quietly hoped AMD would claw their way back to the Athlon glory days with this one, but instead we have a lackluster “Prescott” style chip that tries something radical and falls on its face with a Prescott TDP to match.

    I hope the 28nm Radeon HD 7000x fares better…..

    • crsh1976
    • 8 years ago

    Disappointed; it’s not useless, it’s not total fail, but I definitely expected better.

      • khands
      • 8 years ago

      Option #3 myself, too many issues with this one.

      • kamikaziechameleon
      • 8 years ago

      The fact that this generation’s flagship loses to the flagship from last generation in so many benchmarks is a horrible sign.

    • JohnC
    • 8 years ago

    I voted “Epic fail”. I know that technically you shouldn’t call it that, and I don’t really wish for AMD to fail as a mass-market CPU provider (I don’t want to pay even more $$$ for Intel’s products), but that’s how I feel about it. Plus, more votes for that option might provide some extra “food” for personal amusement, as well as more creative apologies from AMD fanbots.

    • Meadows
    • 8 years ago

    Second option, no doubt about it. I wouldn’t call it a failure but I would deserve to be laughed at if I labeled it as anything but underwhelming.

    I see the potential, but:

    1) Currently I own a Phenom II-derived AMD quad that runs at 3.6 GHz, so I have no [i<]need[/i<] for an upgrade whatsoever.
    2) Even if I start needing the upgrade, I will wait for the second, facelifted revision (at least), because currently it's an [i<]alternative[/i<], not an [i<]upgrade[/i<].

      • JohnC
      • 8 years ago

      Good point about it being an alternative rather than an upgrade (for some current Phenom users).

        • just brew it!
        • 8 years ago

        I think what’s got a lot of people bent is the fact that people who already have AM3+ motherboards probably bought them in [u<]anticipation[/u<] of being able to [u<]upgrade[/u<].

          • travbrad
          • 8 years ago

          Yep, and to make it worse, AMD was telling them to “get ready for BD with an AM3+ board!” It’s the most I’ve ever seen AMD hype a product, and at the same time it’s one of their most disappointing products ever... Where was this marketing during the AMD64 days?

          • A_Pickle
          • 8 years ago

          I would expect that their AM3+ boards will also work with Meadows’ “second, facelifted version,” so it’s not too much of a worry.

          How sad is it that I want to upgrade to an AM3+ board just for the front panel USB 3.0 header, though? I have a Phenom II X6 1055T, and I’m generally happy with the performance… though I’ll admit: That AES-NI block on Bulldozer is mighty tempting. Still, I’m more interested in front-side USB 3.0 than Bulldozer… ouch…

          😀

            • smilingcrow
            • 8 years ago

            It’s a sign of the times, really. I have a 2500K in my desktop with a C300 128GB and a 2TB HDD, and the whole shebang only pulls 105W from the wall with the CPU fully loaded. So even with Ivy Bridge looking as if it will be much more power efficient, why should I care?
            I’d also love a front-mounted USB 3.0 connector in my case so I could lose the ridiculously long extension cable I’m using; I couldn’t find one shorter than 1m when I bought mine.
            If it weren’t for people wanting to edit 1080p video on their PCs, I’d say that increased CPU performance was becoming irrelevant to mainstream users.

            • Waco
            • 8 years ago

            105 watts with a discrete GPU? What anemic GPU could you possibly be running for that to be your real usage after losses in the PSU?

            • smilingcrow
            • 8 years ago

            Who said anything about a discrete GPU? I don’t play games.

            • Waco
            • 8 years ago

            Nevermind then. 😛 You must have an extremely efficient PSU and motherboard to get those kind of numbers though.

            • smilingcrow
            • 8 years ago

            80Plus Gold. 🙂

            • travbrad
            • 8 years ago

            [quote<]If it wasn’t for people wanting to edit 1080P video on their PC I’d say that increased CPU performance was becoming irrelevant to mainstream users.[/quote<] If you consider editing HD videos 'mainstream' I think gaming would have to be included too. I'd wager more people play PC games than edit HD videos by a wide margin. It's actually the more 'mainstream' games that benefit most from CPU power too, as a lot of those MMOs and F2P games don't utilize the GPU very heavily. The most mainstream of mainstream games for example: [url<]http://media.bestofmicro.com/N/1/310573/original/wow%201680.png[/url<]

            • smilingcrow
            • 8 years ago

            Games won’t generally benefit much when moving from 4 to 8 (true) core CPUs whereas editing 1080P video will get a massive boost.

            • travbrad
            • 8 years ago

            I didn’t say anything about core counts so…I’m not sure if that is directed at me?

            I said games benefit from CPU performance, as evidenced by WoW running 50% faster on a 4-core SB than on a 4-core Phenom. CPU performance != core count. If that were the case, Bulldozer wouldn’t be so slow.

    • ALiLPinkMonster
    • 8 years ago

    They better roll out a significant price drop asap if they still hope for it to be any kind of successful.

      • sschaem
      • 8 years ago

      Newegg sells it for $279.

      [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16819103960[/url<] That’s $100 above ‘fair value’ considering it’s a B2 stepping.

        • ronch
        • 8 years ago

        Who’s crazy enough to shell out that much when you can just grab an i5-2500K for less and get better, more consistent performance, lower power consumption, and a GPU plus QuickSync to boot? It’s not like you can stick it in your old AM3 board either.

      • Farting Bob
      • 8 years ago

      The problem is that the chips are HUGE for desktop CPUs, and GF’s 32nm process is apparently still not up to snuff, so yields will be lower than ideal. I wouldn’t be surprised if AMD isn’t making any money out of them as it is; they will be waiting for improved yields and better manufacturing overall.

      • ronch
      • 8 years ago

      Give it a week or two. Maybe they wanna capitalize on folks who know no better and jump the gun because it’s 8-core. When that dries up, expect prices to drop.

    • LiquidSpace
    • 8 years ago

    AMD is junk, Hopefully they won’t ruin ATI too.

      • shiznit
      • 8 years ago

      The high-end CPU is junk. The rest ranges from great to decent.

    • Derfer
    • 8 years ago

    The primary reason I wouldn’t consider this “serviceable” is the power consumption. Poor performance can be corrected for with price. The fact that a 4.6 GHz BD draws almost 200 watts more than a 4.8 GHz 2600K is just so far beyond acceptable. It’s a shame, because I was liking the way the boards were shaping up.

      • chuckula
      • 8 years ago

      I went with “serviceable” but the “serviceable” bit is assuming that Piledriver or other improved version gets put out pretty quickly to correct the major failures of Bulldozer. Bulldozer in and of itself is likely going to have a *short* lifespan.

      • XA Hydra
      • 8 years ago

      Agreed. Why would you buy a car with 150 HP that got 15 MPG if there was a sports car on the lot with 300 HP at 30 MPG? That is how ol’ Dozer seems to stack up against Sandy Bridge (sad to say). If you compare at SIMILAR performance numbers, the power consumption/temperature gap becomes even wider still!

      Imagine how this may look compared to Ivy in a few months... I fear it might shape up to be Intel’s “FINISH HIM” combo... Ouch.

      • setbit
      • 8 years ago

      I agree with your basic point, but where do you get 200 watts from? I see 209 watts peak vs. 144 for the 2600K.

      65 is very bad, but it’s not 200.

        • Derfer
        • 8 years ago

        [url<]http://www.hardocp.com/images/articles/1318034683VZqVQLiVuL_9_2.png[/url<] I guess "almost 200" makes it sound worse than it is... though 177 watts isn't much better.

      • flip-mode
      • 8 years ago

      Ditto.

      • can-a-tuna
      • 8 years ago

      Well, as far as I know the 2600K is a FOUR-core processor and Bulldozer is an EIGHT-core processor, so I don’t see the problem you describe. It’s not AMD’s problem if applications cannot multithread properly. I’m sure a 4-core BD would act about as fast as the FX-8150 in a basic test setup, but with considerably lower power consumption.

        • Derfer
        • 8 years ago

        Rather than getting into a “real core” debate, let’s look at it in terms of power and performance. The 2600K is doing more work and drawing more power than the 4-core 2500K at the same speeds for a reason. And rather than saying you’re “sure,” please consider waiting for reviews of the 4100 before touting its performance-per-watt ratio.

      • Kaleid
      • 8 years ago

      P4 or X2900 XT:

      Lots of juice, not enough power.

      • ronch
      • 8 years ago

      Agreed. I could still buy this even if it’s not as fast as SB or even if it’s not priced very well, just for the sake of curiosity for the architecture, but why should I pay more for electricity to run it? That’s a major bummer for me.

    • tbone8ty
    • 8 years ago

    This is basically a server chip for the desktop.
    AMD needs to stop making parts for the future and make CPUs for NOW!

      • shiznit
      • 8 years ago

      Ironically it fails as a server chip too, maybe more so. AMD is pretty much finished in the 2P market, and I don’t know anyone interested in 4P AMD either.

    • Ardrid
    • 8 years ago

    I started leaning towards ‘Disappointing’ but ultimately settled on ‘Epic fail.’ I went with that option given what can only be described as four years of waiting for extremely lackluster performance. There are times when this processor struggles to keep up with Phenom II, let alone Sandy Bridge. If we had gotten this processor last year, I have to think it would have been received much more favorably. As it stands now, though, Bulldozer can only be considered a colossal failure given the amount of time invested and the fact that Ivy Bridge is literally around the corner.

    I’ll be very interested to see how the architecture holds up with server workloads.

    • FuturePastNow
    • 8 years ago

    I think a straight shrink of Thuban would have been better and a lot cheaper for AMD.

      • sschaem
      • 8 years ago

      Llano is pretty much that: K10.5 cores on 32nm. Remove the GPU from Llano and you have room for 4 more cores. A Thuban x8?

      But I would prefer an X6 with AVX & FMA.

        • khands
        • 8 years ago

        Pretty much.

        • FuturePastNow
        • 8 years ago

        Exactly. AMD already did the R&D work for 32nm Stars cores.

        I don’t care what Bulldozer’s “potential” is. Its potential is irrelevant. And I don’t care how obsolete the old architecture is when the new one isn’t any faster. With a similar clock and higher IPC, a 32nm Phenom III X8 would be superior to the released FX processors, at least for desktop users. Superior for AMD’s profits, too.

          • kamikaziechameleon
          • 8 years ago

          Bulldozer is fine as potential but crap as a product. I’m annoyed they basically made early adopters beta testers; that’s the thing that defeats the notion of a good R&D process. It’s now apparent the reason BD was released like this is that the R&D needed some funds, lol. I personally agree with you guys: releasing a Phenom III X8 @ 3.6 GHz with a 4.0 GHz turbo clock would have done more for me.

    • swaaye
    • 8 years ago

    The power usage is the most disappointing part, but otherwise it is certainly a usable CPU. I can’t see myself buying one, though, considering the alternatives.

      • CheetoPet
      • 8 years ago

      Funny, I was gonna say the exact opposite. The power usage is fairly decent; it’s the performance and the price point that hurt.

        • forumics
        • 8 years ago

        I don’t think it’s any upgrade over my aging P2 940 after factoring in power vs. performance considerations.

    • Lianna
    • 8 years ago

    I’m not totally happy with where Bulldozer is, but definitely not disappointed either. I’m disappointed that what holds Bulldozer from being great is mostly not AMD’s fault.

    Gaming performance is worse than or similar to SNB’s, but as with any modern performance microarchitecture, it’s certainly more than enough. Not for bragging rights, but enough for any gaming, including Ultra CPU mode in StarCraft II. Long story short, gaming is a wash. So are general office, entertainment, and light productivity use, which are more constrained by the lack of SSDs than by even Brazos-class cores and frequencies.

    I’m slightly disappointed in GF’s 32nm process, which doesn’t allow the lower voltages that should give Bulldozer parity with SNB in idle power consumption. Higher yields and lower voltage would boost the frequency slightly, too. But BD is not a mobile processor yet, and by the time it is, yields will have come along as well.

    I’m heavily disappointed in Windows’ kernel, which after 9 years of exposure to Intel’s HT still has trouble assigning different threads to different modules; 5 years of exposure to Core 2 Quad’s dual dies did nothing to stop the kernel from hopping threads between cores too frequently, which eliminates the possibility of more power saving AND a better-performing Turbo Core or Turbo Boost (in both camps).
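
    To illustrate the workaround, here’s a minimal Windows sketch of doing the scheduler’s job by hand. It assumes the common enumeration where logical CPUs 2n and 2n+1 share a Bulldozer module; worker() and NUM_THREADS are hypothetical placeholders, not anything AMD or Microsoft ships.

    [code<]
    /* Pin one thread to the first core of each module so no two threads
       share a module's front end and FPU. Sketch only; assumes logical
       CPUs 2n and 2n+1 map to the same module. */
    #include <windows.h>
    #include <process.h>

    #define NUM_THREADS 4 /* one per module on an FX-8150 */

    unsigned __stdcall worker(void *arg) {
        (void)arg;
        /* ...the actual thread workload goes here... */
        return 0;
    }

    int main(void) {
        HANDLE threads[NUM_THREADS];
        for (int i = 0; i < NUM_THREADS; i++) {
            threads[i] = (HANDLE)_beginthreadex(NULL, 0, worker, NULL,
                                                CREATE_SUSPENDED, NULL);
            /* One bit set in the mask: core 0 of module i. */
            SetThreadAffinityMask(threads[i], (DWORD_PTR)1 << (2 * i));
            ResumeThread(threads[i]);
        }
        WaitForMultipleObjects(NUM_THREADS, threads, TRUE, INFINITE);
        return 0;
    }
    [/code<]

    Application code shouldn’t have to do this, of course; that’s exactly the scheduler awareness I’m complaining about.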

    I’m happy that BD finally gets AES-NI and a higher core count – it will be welcome in servers. AVX/XOP/FMA(C) plus much higher memory bandwidth will make the HPC crowd happy, both for the higher throughput and for the higher precision of FMA. Potential HPC DPFP performance is on par with SNB and twice that of the old architectures from both camps. Non-fused AVX vaddpd and vmulpd are worse than SNB’s, and that shows in AIDA and Sandra, both compiled for Intel compatibility rather than for using every bit of Bulldozer’s performance (and precision) with FMACs. Both these features further strengthen AMD’s (quite good) position in servers/cloud and workstations/clusters/supercomputers. These higher-margin markets will fund advances for the business/home market – the rest of us.
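
    To show what that FMAC point means in code, here’s a minimal sketch of the same a*x+y kernel written both ways. The function names and the divisible-by-4 length are my own assumptions; FMA4 (-mfma4) is the flavor Bulldozer actually shipped.

    [code<]
    /* Fused vs. non-fused multiply-add. Build with:
       gcc -O2 -mavx -mfma4 fma_demo.c  (n assumed divisible by 4) */
    #include <x86intrin.h>

    /* Plain AVX: separate vmulpd + vaddpd -- two instructions, two roundings. */
    void axpy_avx(const double *x, const double *y, double *out,
                  double a, int n) {
        __m256d va = _mm256_set1_pd(a);
        for (int i = 0; i < n; i += 4) {
            __m256d prod = _mm256_mul_pd(va, _mm256_loadu_pd(&x[i]));
            _mm256_storeu_pd(&out[i],
                             _mm256_add_pd(prod, _mm256_loadu_pd(&y[i])));
        }
    }

    /* FMA4: one vfmaddpd per four doubles, one rounding -- this is where
       the extra throughput and precision come from. */
    void axpy_fma4(const double *x, const double *y, double *out,
                   double a, int n) {
        __m256d va = _mm256_set1_pd(a);
        for (int i = 0; i < n; i += 4) {
            _mm256_storeu_pd(&out[i],
                             _mm256_macc_pd(va, _mm256_loadu_pd(&x[i]),
                                            _mm256_loadu_pd(&y[i])));
        }
    }
    [/code<]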

    When you consider that AIDA and Sandra (like all performance-oriented software) will eventually optimize for FMAC, possibly doubling performance, and you look not at gaming bragging rights but at the real time hogs – media, video, rendering, scientific applications – and at Bulldozer’s value for money, you see quite a positive image of AMD’s finest.

      • Lianna
      • 8 years ago

      If you don’t like my post, write what you disagree about and why.

        • JohnC
        • 8 years ago

        What for? Simply downvoting is much [i<]faster[/i<] and more [i<]efficient[/i<]. Understandably, these two things might not be viewed as positives by you, seeing that you're an AMD fanboy...

          • shank15217
          • 8 years ago

          Yeah, I don’t think downvoting was meant to be used as you describe. There are clear indications this architecture has strengths; it’s just not a very balanced one. AMD has a lot of work to do to improve desktop performance, and they will.

      • sschaem
      • 8 years ago

      It’s never AMD’s fault...

      If gaming is good enough, why not get the cheaper i5-2500K then?

      By getting an i5-2500K you can save money, as you don’t need as high-end a video card anymore.
      A GTX 460 + i5-2500K gets you the gaming performance of an FX-8150 + GTX 560.

      Nice savings by going Intel, right?

      Gaming is ruled by the i5-2500K; no, it’s not a wash. Check the 15 websites again.

      AMD knew about their core problem 3 years ago; how come it was only a few weeks ago that even Linux developers were informed of it? Only AMD is to blame.
      AMD was so protective that they refused to inform OS developers of their scheduling requirements.

      10 GB/s with DDR3-1600 is much higher memory bandwidth? The i7-2600K gets 19 GB/s with DDR3-1333, and dual-channel DDR3-1333 only has a theoretical peak of about 21.3 GB/s (2 channels x 8 bytes x 1333 MT/s), so Intel is measuring near the ceiling.
      Intel gets 2x the bandwidth with 20% slower memory... put down the crack pipe.

      AIDA was optimized for Bulldozer and AVX/FMA back in June 2011, and got another update this August.
      AIDA is optimized as much as it can be for Bulldozer.

      Sorry, but no matter how you want to spin it... the blame is on AMD. It will get better, but the B2 stepping is an “epic fail.”

        • shank15217
        • 8 years ago

        Gaming is one of the worst examples; lots of games are GPU-limited, so it hardly matters what CPU you choose as long as it’s decent. SB is a better choice for a whole host of other tasks, including many that Bulldozer was supposed to be better at.

    • indeego
    • 8 years ago

    Open browser–>Look at conclusion on TR and Anandtech–>Shrug and keep buying Intel.

      • maxxcool
      • 8 years ago

      LOL, that’s pretty much what I do, but I do look at DiRT 3 at least, since it’s a decent DX11 benchie.

      • 5150
      • 8 years ago

      ^ What he said.

        • Mourmain
        • 8 years ago

        ^ This.

    • Althernai
    • 8 years ago

    They’re going to have to work hard to make the next iteration worthwhile. The philosophy of the design was flawed: they should never have relied on high frequencies to attain the desired performance. I would have thought that since they had competed with NetBurst, they would have seen just how bad things can go when that kind of architecture doesn’t get the fabs to do what it needs, but apparently not.

    The jury is still out on the 4 modules, 8 “cores” approach. It’s useless for my workload because there are only 4 FPUs and they’re slow ones at that, but it might be useful for other server tasks. Unfortunately, in the current chips, the single-threaded performance is so pitiful that Bulldozer pretty much needs perfectly-threaded, integer-only workloads to win over the consumer versions of Sandy Bridge (and even then it doesn’t win by much). Also power consumption is high relative to Sandy Bridge — even at idle.

    I think it can still be salvaged, but they need a lot of things to go right.

    • dreamer77dd
    • 8 years ago

    It is a good APU for people not in the know, like your grandmother. It is cheaper than Intel and doesn’t need a new motherboard or GPU for a grandmother playing Facebook games as well as PC games. I believe Trinity will need a new motherboard; I wonder what that chip will be like.

      • flip-mode
      • 8 years ago

      It’s not an APU.

        • XA Hydra
        • 8 years ago

        *Schwarzenegger accent* ^^^

        • Kaleid
        • 8 years ago

        [url<]http://www.insidesocal.com/friendlyfire/Apu%27s%20octuplets.jpg[/url<] 8 of them and they all suck

      • NeelyCam
      • 8 years ago

      I really hope you don’t build a BD rig as a present for your grandma’s facebook needs

      • ronch
      • 8 years ago

      Sorry, you seem to have it wrong. FX does not include an on-die GPU.

      • Metonymy
      • 8 years ago

      ummm… yes… not in the know…

      • LoneWolf15
      • 8 years ago

      Llano is great for grandmother, costs far less, and is a real APU.

    • Ryhadar
    • 8 years ago

    Voted “Disappointing but serviceable”. AMD has had very forward-looking CPUs time and time again, though more often than not they’re too forward-looking for their own good (see Phenom, HD 2900).

    This architecture probably has a good life ahead of it, especially if we get anything like the 2900 -> 3870 -> 4870.

    That said, I think it would have been in AMD’s best interest, given the new fab process and the size of BD, to just make a friggin’ Phenom X8. I would have bought a Phenom X8 come upgrade time; I’m gonna pass on Bulldozer.

      • k00k
      • 8 years ago

      Count me in the same boat. This is forward-looking, but too forward-looking. The design is more big iron than desktop computing (think Sun/Oracle UltraSPARC thriving on many-threaded use cases).

      It doesn’t help at all that AMD’s rumored line of thinking with BD is that they don’t need a strong FPU because they’ve got a pretty powerful one they’re integrating more and more heavily: those GPUs can do floating point much, much better than traditional desktop designs. Hey AMD, that’s too far out. Where’s the product for the here and now? Let’s hope they’ve got Trinity and Piledriver in a much better stance than this one, because Ivy Bridge will certainly dash any hope of AMD getting back in the game.

      For AMD to survive, they need to ramp up Llano and Bobcat. I don’t care how they do it: give more work to Chartered, get it on TSMC, give the market what it wants. GloFo’s current fabs and process alone aren’t cutting it.

      Study Bobcat more and look at how it can be made to fit smartphone and tablet form factors. Nvidia seems to be making a tidy amount of money with Tegra. Shove your pride aside; you already sold Xilleon (now under Broadcom) and Imageon (now Qualcomm’s Adreno), so just do it. Intel’s doing x86 Android already. Jerry’s AMD would jump at that and try to win some designs. You’ve got pretty good support among Linux devs because of the device and driver specs you provide; now get more out of it.

    • yogibbear
    • 8 years ago

    If, instead of putting words to the different options you just went:

    9-10/10
    7-8/10
    5-6/10
    3-4/10
    1-2/10

    The results would be much more useful. As it stands, we have to assume everyone reads everything the same way, puts the same importance on the same words within each phrase, and then interprets the results the same way. Zero objectivity.

      • dashbarron
      • 8 years ago

      Way to kill the fun of it all, Buzz Killington.

      • Johnny5
      • 8 years ago

      Then we’d have to assume everyone interprets a rating scale the same way (is okay a 5 or a 7?). Zero objectivity.

        • yogibbear
        • 8 years ago

        Answer me this: which is less objective of the two? There you go.

          • Aspleme
          • 8 years ago

          Actually, if you do your research, you will find that there is little to no difference between people’s votes when given a 1-10 scale or a horrible-to-great scale. When you measure the curve, it ends up being the same. Furthermore, when you look at individual votes, a person’s vote changes more from day to day than it does when you change the type of selection.

          In fact, there has been some evidence that suggests surveys like this with more unique options are actually more accurate than the scale you suggested because they evoke an emotional response that binds the decision more strongly.

          Furthermore, you’re acting like the data actually matters. It’s an opinion poll, not a review.

            • yogibbear
            • 8 years ago

            Okay whatever, there are issues with a scaled poll. I agree. I just think the original poll has 3 bad options and 1 mediocre option. (I’m not saying I didn’t vote for one of the bad options).

            • Aspleme
            • 8 years ago

            That’s true about the options... but since we don’t have information on how the on-die GPU runs games/media or how asymmetric CrossFire works, there’s really little else to say. It’s more expensive than the i5-2500 and underperforms in almost every category. Anyone who is excited about this processor for its own sake is a little deluded.

    • OneArmedScissor
    • 8 years ago

    Where’s the, “Boy, I’m surprised an 8 core CPU makes no sense for desktops!” option? What was anyone planning to do with an 8 core CPU? Play four games at once while you rip a Blu Ray disc and render the animation for Transformers 3 in real time?

    You guys seem to be loading these troll polls as of late. That last one forced you into saying you didn’t care about “art” if you think high-resolution add-ons are stupid. Give me a break. The silly options were at least entertaining.

    And now I replied, and you got your extra page hit. Hook, line, and sinker!

      • Aspleme
      • 8 years ago

      Modern games are taking advantage of multicore processors. Are they using 8 yet? Not any that I’m aware of... but when dual cores first came out, games weren’t using both cores either. The hardware has to be available before programmers will use it. It doesn’t make sense to optimize your program for 32 threads when no one has that many yet.

      That being said, I still like Intel better. Make the best architecture, and then improve your core count.

        • StuffMaster
        • 8 years ago

        Yah, suppose your game is utilizing 4 cores, and you’ve got 2 handling other stuff. That leaves only 2 idle, and we all know what happens to those over time…

        • OneArmedScissor
        • 8 years ago

        That’s blindly assuming there’s something else in a game for more traditional cores to do. Integer cores are only good for so many things, and they’re not the only type of processor.

        Graphics cards have been capable of handling physics for a while, and with DX11, they can even handle the AI. The number of parallel tasks that traditional CPU cores handle is actually decreasing over time.

        And now, CPUs are all starting to come with GPUs in lieu of more cores, which are about to be standardized as DX11 GPUs. See where this is going?

        Say you have a desktop with a graphics card. On top of that, you will also want a CPU with an integrated DX11+ GPU. The physics and AI still go to your CPU, but the integrated GPU handles them. That’s the future, not piling on more of the same.

          • travbrad
          • 8 years ago

          [quote<]That's the future, not piling on more of the same.[/quote<] It may be the future, but it's not the present, which is where we currently reside.

            • OneArmedScissor
            • 8 years ago

            Ok. Thank you for agreeing with my original point, I guess?!?

      • ronch
      • 8 years ago

      Come on, don’t kid yourself. You KNOW you need 8 cores in your system! 🙂

        • XA Hydra
        • 8 years ago

        If you do get to that point, find a chip with 8 FULL cores, complete with a matching set of FPUs... It’s kinda like the old VIA chips that ran the FPU at half the clock of the rest of the CPU.

        This is an 8/4 design. AMD’s reasons for the design aside, it influences the performance figures big time, without a doubt.

      • xeridea
      • 8 years ago

      8GB of RAM is now common, and if you think about it, GB of RAM has (loosely) correlated with available CPU cores for the last 5 years or so. So most people won’t need 8 cores, but most don’t need 8GB of RAM either; they just get it because RAM is cheaper than dirt now.

      Coincidence? Maybe, but interesting.

      • maxxcool
      • 8 years ago

      It may be a limited example, but BF3 will use every core it can find for physics and rendering assistance. And if you’re archiving Blu-rays... well, 8 cores is a win... but judging from the benchies... not for BD 😛

      • kroker
      • 8 years ago

      This is AMD’s NetBurst, as someone said on another site. They’re using “core” count and frequency as marketing gimmicks even though the processor is inferior to the competition. Bulldozer has two integer cores per module; NetBurst had the Rapid Execution Engine (where the ALU effectively ran at double the speed of the processor, increasing integer performance). Both architectures have a longer pipeline and are slower per clock than the previous generation. Both are power-hungry and hot.

      NetBurst was a failure, but it still made a lot of money because it was backed by Intel’s formidable marketing machine and chip manufacturing capabilities. AMD has neither.

        • XA Hydra
        • 8 years ago

        The Pentium 4 similarities are downright scary. Same disbelief upon seeing the benchmarks against the last gen (P-III / Phenom II), too.

        Ivy Bridge has already been pushed into next year. While it is clearly my personal upgrade choice at this point, the lack of competition with genuine teeth is disturbing (insert “I find your lack...” Vader meme here), and Intel has free rein to keep prices stagnant (and high) enough to breed mosquitoes in, and as a for-profit business, why not? 🙁

      • mnecaise
      • 8 years ago

      Developer here. I can use the 8 cores.

      The compile might be (sadly) single-threaded, but in addition to the IDE, I’ll have a SQL server and VMs for testing all running in the background. I run a single heavily loaded workstation (think God box) rather than multiple machines or servers. I can crush a dual core and regularly eat my quad core.

      I know you were trolling, but you asked the question.

        • OneArmedScissor
        • 8 years ago

        Workstation ≠ PC

        I’m not trolling. People keep trying to come up with exceptions, and they keep failing. As I pointed out before, an 8-core CPU might make sense in a PC for video encoding, one task that is still time-constrained, but now there’s QuickSync, and it will just keep getting better. It ends there.

        Having the right tools for the job beats having too many tools. For a PC, Bulldozer is like carrying around a backpack that you have to go digging through when all you needed was the hammer and nails right there in your belt.

          • mnecaise
          • 8 years ago

          Fair enough. I misinterpreted your point and what you were trying to say, and you’re right.

          The average user would find a dual core adequate. My wife’s machine is a dual core. My kids are using an old single-core Athlon. I usually recommend a dual-core machine for the typical end user.

        • chuckula
        • 8 years ago

        I *really* wanted to see a Bulldozer given to phoronix where they could check GCC compilation performance.

        The closest I’ve found is the Chrome build under Windows using MSVC. Not my preferred environment, but it *is* a parallel build that uses all the cores on the CPU, and it ostensibly should be very favourable to Bulldozer since it’s practically all integer code... well, Anandtech ran the benchmark, and the results were *ugly*. The FX-8150 loses to a 2400(!) in the compile benchmark. It also loses to the old X6 by a wider margin than it loses to the 2500.

        Results here: [url<]http://www.anandtech.com/show/4955/the-bulldozer-review-amd-fx8150-tested/7[/url<] (about 75% of the way down the page) [quote<]Our compiler test has traditionally favored heavily threaded architectures, but here we found the Phenom II X6 1100T to offer a tangible performance advantage over Bulldozer. While AMD is certainly competitive here, this is an example of one of those situations where AMD's architectural tradeoffs simply don't pay off—not without additional clock speed that is.[/quote<]

    • maxxcool
    • 8 years ago

    The lack of better “choices” for the buttons REALLY makes me want to call this a “troll poll”

      • puppetworx
      • 8 years ago

      I still found the vote I wanted, but if you want an accurate poll result, TR needs to be more balanced than this. By only allowing a scale of reasonable-to-horrendous, you have slanted the results of the poll.

      Yes, that might sound irrational, but that’s how people behave, and it’s well understood in the world of polling science. For once I’m disappointed in your methods, TR.
