Analysts think AMD could pull out of manufacturing

It’s no secret AMD has taken a beating over the past couple of quarters. Multiple factors, like its aggressive price war with Intel and the delay of its R600 graphics processor, caused AMD to lose a net total of $1.185 billion between September 2006 and March 2007, and the company announced back in April that it would restructure its business model as a result. According to a report by InfoWorld, some analysts are now predicting AMD’s restructuring will go further than expected—far enough that the firm might pull out of the manufacturing business and become entirely fabless as early as next year.

This is all speculation, naturally, but there may be a hint of truth to it. InfoWorld quotes AMD spokesman Drew Prairie as saying the rumor arose from a “speculative” Goldman Sachs report, but that AMD is indeed attempting to focus more on chip development than manufacturing. Its research and development partnership with IBM, as well as a manufacturing deal with Chartered Semiconductor and the fabless nature of its new graphics division, could be but the tip of the iceberg.

“We’re looking to find ways to extend that model beyond research and development to the full range of the manufacturing supply chain,” Prairie said Monday. “That could run the range from increasing the amount of processors we send out for chartered manufacturing, and could also include things like establishing partnerships on the manufacturing side.”

InfoWorld also cites Citigroup analyst Glen Yeung, who argues the aforementioned plans may result in a “transitional move” by AMD on the manufacturing front. According to Yeung, that move could shape into AMD outsourcing low-end processors to Taiwanese fab TSMC and/or selling a share of its existing and future fabs.

Comments closed
    • pluscard
    • 14 years ago

    The semi industry is cyclical. The design cycles of the two competitors are out of sync. When Intel launched the Core 2 Duo, they had both a fresh design and a process shrink advantage. The 65nm process ran cooler, and the shared cache and other enhancements of the Core 2 Duo proved to be a superior design. What I’d like to know is why it took 3 quarters for Intel to get back in the lead. (AMD continued to gain market share in Q3 and Q4 of ’06.)

    Now both companies are on 65nm. AMD has an excellent mobile platform now, with strong battery life and better graphics than Intel. The 4-way Opterons still dominate large servers. The only areas where AMD is behind are gaming and workstations.

    The “Barcelona” is not initially targeted at gaming and workstations. It’s a follow-on CPU for the server market, doubling the number of cores while being Opteron socket compatible.

    Don’t count AMD out after one bad quarterly report, with the 2900 XT issues piled on. AMD picked up the Dell account last year, and Toshiba this year. They wouldn’t be getting these major OEM design wins if their products weren’t competitive.

    Hopefully their “gaming” cpu will be out by Christmas – but it’s close. It’s not the top priority for AMD – they’re quite content to rule server + mobile.

    • Wintermane
    • 14 years ago

    Fact is, it’s likely AMD can’t afford to keep its fabs.

    Just upgrading one to 32 nm will likely break the bank. We will know soon, as they can’t hold off for long before the fabs either have to go 32 nm or have to be sold before they lose value.

    • blastdoor
    • 14 years ago

    I don’t think there’s any way that AMD could give up all manufacturing and still compete with Intel in anything other than the bargain basement of the market. But maybe they could focus their manufacturing efforts on the more profitable high end of the market, and then have a close partnership to farm out the rest. Still, this would be nothing other than a desperation move by AMD, and it would hurt their competitive position. But I guess if it allows them to survive, then they’ll have no choice.

    • Snake
    • 14 years ago

    For example, in 2005 AMD had four separate socket architectures for its desktop machines: 462, 754, 939, and 940. Intel, on the other hand, only had two: 478 and 775. This meant that both MB manufacturers and retailers had to have twice as many designs to handle the AMD systems.

    When AMD was the “pet” of the gamers no one cared…but now, with Intel’s Core, people DO care. For some people upgrading to an Intel Core (in the early times) only meant doing a BIOS upgrade…while for AMD you had to buy a new MB and system to match, if you wanted to play around with the new CPUs. For some people this didn’t matter…but, for others, it does.

    So I think AMD’s choice has ended up hurting them because people just can’t buy a CPU upgrade and keep the money flowing into AMD’s pockets easily. They must change motherboards. And, for many people, if they are going to change motherboards, “I’ll go with the hottest thing today!”…which is seen to be Intel. Thereby losing a customer because of no easy upgrade path, due to a complex FSB architecture that is tied into the CPU core architecture.

    AMD ended up hurting themselves in the long run because their own architecture means that, with more upgrading of the system being necessary, AMD’s market share is more prey to the fickle market direction. Intel, by unlocking the two, kept money flowing more easily into their pockets AND, now that they are perceived to have a better design, AMD’s lack of an easy upgrade path means more people are simply leaving the AMD camp to purchase Intel. AMD has locked themselves out of the “easy” repeat business by making moderate upgrades somewhat difficult, leaving a door open to their competition.

    I might be wrong…but, IMHO, sales for the last year prove that I am RIGHT.

      • blastdoor
      • 14 years ago

      Sales for the last year show that C2D > K8, and not much else. It sounds to me like you’re making a huge stretch.

      Also, I fail to see the vaunted “flexibility” of the FSB actually employed by Intel. For example, why are there no dual-socket Xeon systems that use plain vanilla DDR3 — shouldn’t the miraculous flexibility of the FSB allow them to easily design a dual-socket system that uses something other than FB-DIMMs?

      • Anonymous Gerbll
      • 14 years ago

      https://techreport.com/onearticle.x/11064

      AMD: 7 / SS7 -> (Slot A -> 462) -> 754/939/940 -> 1207 FX {4x4} -> AM2/AM2+/(AM3?)
      Intel: 7 -> (Slot 1 -> 370) -> (423 -> 478 -> 775) -> 771 {V8} -> 715? -> 1366?

      • blastdoor
      • 14 years ago

      You know, the really fatal flaw with this entire argument is that the integrated memory controller is pretty much the only reason the K8 beat the P4. The K8 is basically the K7 with 64 bit extensions and an IMC. The 64 bitness doesn’t really matter, since most people run 32 bit Windows. It was the IMC that gave the A64 its performance advantage, particularly for games.

      • redpriest
      • 14 years ago

      Snake,

      Your characterization of Intel’s 2 sockets is far too simplistic, given the different VRM/processor support that each motherboard provides. Lest you forget, it was only a year ago that there was such a dearth of Intel motherboards at the Core 2 release that actually supported the processor.

      I remember specifically picking through dozens of different motherboards trying to find one that actually supported the VRM for Core 2. The 1st release of 775 != the last release of 775. The socket is the same but nothing else is. There is a lot of validation work in even just changing the VRM.

        • Snake
        • 14 years ago

        You are right – but what I am saying is multiply:

        AMD sockets

        by

        chipsets

        by

        feature sets

        It is far too much for any reasonable motherboard manufacturer to support once you do the computation.

          • Flatland_Spider
          • 14 years ago

          <i>I’m not saying I’ve got the answer – I’m just an average idiot. But apparently something was wrong with AMD’s market scheme…because it collapsed really, really quickly, with one stroke, after the competition came out swinging.</i>

          Maybe AMD got hit with a really great chip, and didn’t have a plan for a counterattack to retain all of the fair-weather fans. At least we now know how many people will just go with the best-performing product.

          AMD’s situation has nothing to do with how many sockets they have, and everything to do with their inexperience with being in the lead. It’s one thing to chase the leader, and another thing entirely to be the leader. One mistake and the lead gets blown. As the leader there is much less margin for error.

          Your argument only makes sense if everyone went out and bought bare motherboards. Most people don’t. Most people buy a Dell or HP, run it into the ground, and buy a new one. If Intel didn’t spend billions of dollars “educating” consumers about the right brand and put the little Intel Inside stickers everywhere, most people wouldn’t have a clue about what is inside their PCs. Just like people who buy Apples only care that it’s an Apple. (Not the greatest example, but I’ll take it.)

          AMD still has the stigma of being a budget brand. People didn’t buy AMD because they liked AMD; they bought them because they couldn’t afford Intel. I know several people who bought AMD while holding their nose, and dropped AMD like a used diaper as soon as the Core 2 came out.

          The bottom line is that AMD thought they had the same loyalty that Intel does, but they don’t. They have some, but then the PowerPC architecture has its adherents too; that doesn’t mean it’s a mainstream platform. They have to pump out great product after great product to stay in the game, as the only people they are going to pick up are opportunists.

            • Deli
            • 14 years ago

            i don’t think that’s true. Most ppl just don’t give a crap as long as it works. It could be Intel, AMD, Mickey Mouse – inside, they don’t give a shit.

            man, a long road of bad news for DAAMIT this year. so sad.

    • pluscard
    • 14 years ago

    “Compare Intel’s R&D budget to AMD’s total revenue.”

    And your point is what?

    • herothezero
    • 14 years ago

    “This is speculation of the highest order. I can see AMD selling off their excess capacity but not all of their capacity. Anything’s possible I guess.”

    Excess capacity? What excess capacity? AMD has practically destroyed their channel relationships because they didn’t have product for them after Dell started buying up all their production. AMD has never had adequate or timely production capacity.

    • pluscard
    • 14 years ago

    Goldman Sachs has a major position in Intel. Anything they say regarding AMD should be interpreted accordingly.

    AMD has already spent the money to go 45nm in both fabs.

    We’re in the calm before the storm… AMD is about to make its big move with Barcelona.

    Plus

    • alex666
    • 14 years ago

    If true, that would be pretty interesting given their investments in fabs in Germany and New York state.

    • flip-mode
    • 14 years ago

    This is speculation of the highest order. I can see AMD selling off their excess capacity but not all of their capacity. Anything’s possible I guess.

    • Dposcorp
    • 14 years ago

    Story from 2003.

    Only real men have Fabs.

    http://www.xbitlabs.com/articles/editorial/display/tsmc.html

    lol

      • fyo
      • 14 years ago

      MUCH older than 2003. My best recollection places it in the mid-90s.

      Sanders was referencing Cyrix with his quote, so this would have been BEFORE they were snapped up by NatSemi, whose website places that event in 1997.

      So no later than ’97, anyway.

    • herothezero
    • 14 years ago

    This would be a disaster for a company that has historically lost marketshare and mindshare for lack of timely, available product. You can’t compete with Intel if you don’t have the means of producing product–because Lord knows Intel has plenty of fabrication capacity and isn’t afraid to use it.

    • Sargent Duck
    • 14 years ago

    When I first heard that AMD was planning on buying ATI, I instantly thought it was a bad decision, and that the billions of dollars would have been better spent elsewhere: like, I dunno, building fabs, research and development, or cash for any potential shortfalls (like a price war with Intel). That, and AMD rested on their laurels during the heyday of the A64. Hate to say it, but AMD’s position is entirely of their own making.

      • Perezoso
      • 14 years ago

      The ATI buyout was corporate seppuku.

        • Sargent Duck
        • 14 years ago

        For anybody else who doesn’t know what seppuku is:

        “Ritual suicide by disembowelment formerly practiced by Japanese samurai. Also called hara-kiri.” – Dictionary.com

        • Bluekkis
        • 14 years ago

        Perhaps, perhaps not.

        The way I see it, buying ATI was a very important step for AMD. Since they sold off their share of the flash business, AMD has been dependent on their CPU business only. By buying ATI they got a very broad product portfolio, and as a result they are no longer so heavily dependent on the success of one product. In the short term it sure brought a lot of problems, but in the long term, say 2-5 years and more, it was a very wise choice. Also, since AMD can now offer the whole range of platform components (CPU, GPU, and chipset), it will in the future be important for gaining mindshare in corporate environments, where they like to buy all components from one source to avoid possible compatibility problems.

          • lolento
          • 14 years ago

          When did AMD sell off their flash business? I always thought Spansion was part of AMD, because Spansion is still inside AMD’s building in Santa Clara.

          I think leaving manufacturing would be a wise choice. Nvidia, Qualcomm, Marvell, and Broadcom are very good examples of successful companies that adopted the fabless business model (and ATI too). AMD hasn’t been able to catch up with Intel on their fab roadmap, while CSM and TSMC have been slowly catching up to AMD.

          Financially speaking, fab process R&D is very expensive compared to design R&D. If AMD’s fab department can get their act together and close the gap with Intel then it would be a different story, but at this moment it makes sense to go fabless. I believe market investors will see this as a positive.

          • blastdoor
          • 14 years ago

          Great theory. Too bad they didn’t buy a profitable business though — seems like that would have been necessary to make your theory work in reality.

          I think there’s no question that the ATI purchase was stupid. It might have long term strategic benefits, but the long term doesn’t matter if you die in the short term.

            • shank15217
            • 14 years ago

            ATI is a profitable business.

            • blastdoor
            • 14 years ago

            No, ATI is a break-even business. Some quarters have a tiny profit, others a loss.

            It is ridiculous to suggest that ATI can even remotely cover the losses of the CPU division in the way that Spansion once did.

            The ATI buy was, at best, a longer term strategic play.

            • wierdo
            • 14 years ago

            So AMD chose to risk its short-term success for long-term survival instead of progressing into long-term failure for the sake of short-term financial stability. That’s the glass-half-full view of it, but perhaps they overdid it if they can’t pull out of their short-term slump.

            Let’s hope we don’t go back to this:
            http://xgatech.com/uploads/photos/589.jpg

      • DASQ
      • 14 years ago

      Those billions of dollars would not have made a difference in any extended, strong price war with Intel. Compare Intel’s R&D budget to AMD’s total revenue.

        • blastdoor
        • 14 years ago

        I think that’s completely wrong. ATI cost over $5 billion. That’s enough money to pay for the 45nm transition and absorb losses between now and then. Those billions matter — a lot. If the bond market regains sanity and starts pricing risk appropriately, then the misuse of those billions of dollars will be even more costly for AMD (ie, they won’t be able to float billions of dollars in bonds that pay relatively low interest).

          • DASQ
          • 14 years ago

          Since when could you just pay for a transition? Throwing money at R&D doesn’t give you equivalent results.

            • blastdoor
            • 14 years ago

            Money may not be a sufficient condition, but it absolutely is a necessary condition.

          • shank15217
          • 14 years ago

          AMD bought NexGen, which gave them the experience to introduce an on-die memory controller. The on-die memory controller saved AMD’s ass. Let’s not forget the past. ATI is arguably the leader in GPU technology. ATI’s GPUs had been taking the general-purpose path for at least a year before Nvidia’s. Frame rates aren’t everything. They stumbled a bit on R600, but stay tuned for the revision. Reminds me a lot of the 1800-to-1900 revision.

    • slaimus
    • 14 years ago

    The story was probably exaggerated. AMD has put tons of money into upgrading their German fabs and planning their New York one. They will probably cancel the NY one, and that news got blown out of proportion into them no longer having fabs.

      • shank15217
      • 14 years ago

      Exactly, analysts are really shooting blanks these days. AMD has a manufacturing system that rivals Intel’s, and in many ways AMD is ahead of Intel’s manufacturing processes. They will not just get out of the fab business. They will just not scale up as fast as they planned.

        • NIKOLAS
        • 14 years ago

        You sound like you’re channelling the Intellectual Giant that is Tom Yager from InfoWorld, who said that AMD and IBM are redefining manufacturing.

        Perhaps he meant something other than CPUs. 😉

    • DukenukemX
    • 14 years ago

    I can see AMD using IBM to make their chips in the future.

      • bthylafh
      • 14 years ago

      Yeah, that worked really well for Apple.

        • Xylker
        • 14 years ago

        Apple did not DESIGN the chips.

          • blastdoor
          • 14 years ago

          so you think it would have been better if they did?

            • d2brothe
            • 14 years ago

            Clearly, one of the major reasons Apple switched away from PowerPC was that the chips were designed for something other than desktop and, especially, laptop computing. They put out too much heat, and that’s because IBM was designing them as server chips. Clearly that was important.

            • blastdoor
            • 14 years ago

            I agree that economies of scale are an issue here, but that would not have changed if Apple designed the chips. I think the major issue was IBM failing to execute on making the chips on time.

            Apple could have used 3 GHz G5s in the PowerMac, but 3 GHz never materialized. I don’t think that would have changed if Apple were designing the chips.

            Similarly, designs existed for faster G4ish laptops. But the problem was always IBM’s execution in building these things.

            • ludi
            • 14 years ago

            That’s because IBM had better things to do with the product line than make periodic rounds of chump change from a perennially fickle customer.

        • StashTheVampede
        • 14 years ago

        To be fair: Apple’s PPCs were customized processors that WEREN’T sold to others. IBM made “Apple”-only chips — a waste of time and effort on both sides.

      • slaimus
      • 14 years ago

      From the ones I remember:

      Cyrix 6×86: IBM sold their own version of the 6×86 and undercut Cyrix’s prices. Cyrix stopped using IBM and was bought by National Semi.

      Nvidia NV40: Poor yields and high costs. Nvidia stopped using IBM for other NV4x chips (except NV45, which is just NV40 with HSI), and never even respun the NV40 to fix the VP.

      Nvidia HSI: IBM did not want to manufacture it after a while; not sure who makes it now.

    • bthylafh
    • 14 years ago

    How long until DAMIT shows up on f*ckedcompany?
