Report: AMD R&D spending falls to near-10-year low

R&D investment is the growth engine of technology companies, and by that measure, AMD's quarterly spending doesn't paint a pretty picture. EXPreview has aggregated quarterly R&D investment data for AMD and Nvidia into a new report. According to the site's research, AMD's quarterly R&D spending has declined since 2012 and currently sits near 2004 levels.

The report shows how Nvidia and AMD have traded places in quarterly R&D spending over the past couple of years. Nvidia's spending trailed AMD's by over $100 million at the beginning of 2011, after which Nvidia opened its wallet wider. AMD's R&D spending remained flat through 2011 and began to decline in 2012. These days, Nvidia outspends AMD on R&D by margins similar to AMD's lead during 2011: $348 million per quarter for Nvidia, as opposed to $238 million for AMD. Nvidia spends a greater percentage of its revenue on R&D, too: 31%, as opposed to AMD's 20%, according to EXPreview.
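
As a rough sanity check on those figures, one can back out each company's implied quarterly revenue from its R&D spend and its R&D share of revenue. The short sketch below does that arithmetic in plain Python, using only the numbers quoted above; the helper function is ours and the results are rough estimates, not reported financials.

```python
# Back out implied quarterly revenue from R&D spend and R&D share of revenue.
# Figures are the ones quoted above (EXPreview); treat the output as a rough
# estimate rather than reported revenue.
def implied_revenue(rd_spend_millions, rd_share_of_revenue):
    return rd_spend_millions / rd_share_of_revenue

nvidia = implied_revenue(348, 0.31)   # roughly $1.12 billion per quarter
amd = implied_revenue(238, 0.20)      # roughly $1.19 billion per quarter

print(f"Implied quarterly revenue: Nvidia ~${nvidia:,.0f}M, AMD ~${amd:,.0f}M")
```

By this rough math, the two implied revenues land in the same ballpark, which suggests the gap is more about spending priorities than company size.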

Image: EXPreview

AMD's higher spending in years past continues to pay dividends today. The company's GCN GPU architecture came to market in 2011, and its low-overhead Mantle API essentially laid the course for the next generation of graphics APIs like Vulkan and DirectX 12.

AMD is still working on its next-generation Zen x86 CPU microarchitecture, the fruits of which we may see as soon as next year.

Of course, the elephant in the room is Intel, whose current quarterly R&D budget is nearly $3 billion. Since Intel both designs and manufactures semiconductors, its R&D budget encompasses the development of new chip fabrication processes, which are hugely expensive undertakings. Neither Nvidia nor AMD owns its own chip fabs, which might help to explain some of the disparity in R&D spending.


Comments closed
    • spugm1r3
    • 6 years ago

    [quote<]AMD's higher spending in years past continues to pay dividends today.[/quote<] I imagine the 7% of their staff that no longer works for them may beg to differ.

      • BIF
      • 6 years ago

      It’s a lot more than 7%, yes?

    • ronch
    • 6 years ago

    After rolling all this in my head, I’ve realized that maybe, just maybe, AMD is putting all their resources into future products. While lower R&D spending is not a good sign especially for a company you know is already lagging behind, it seems that for practically every product lineup they currently have, money isn’t being spent to move things forward and keep the current products evolving in any significant way. The FX is practically at a standstill. Chipsets too. GCN only sees minimal development. Embedded products are just respins of desktop/laptop chips. APUs only see minor changes. While Steamroller and Excavator have seen significant changes and may have burned some R&D money, I reckon they weren’t that expensive (SR, for example, merely had another set of identical decoders bolted on and EX is just the same arch put through some automated design tools geared for density).

    So what could this mean? AMD knows its current CPU products are less than stellar and they have dwindling funds. In this situation, they have to choose which limbs to let die and which ones to save. They need to choose their priorities very carefully. They know there’s no way they can revive Bulldozer so they’re practically leaving it at that and throwing all their weight into their next CPU and GPU designs. GPUs are one of their crown jewels so they continue to invest money on that but perhaps they needed to cut back on it too, so that’s why you see few changes to GCN (and then it’s also only available in higher end SKUs) while they concentrate on their next-gen parts.

    Just my $0.02. Like everyone else, I find this a real head-scratcher, but it's also somewhat to be expected from a company that's been on the brink of extinction for a while now. Running AMD can't be fun these days.

      • Chrispy_
      • 6 years ago

      It’s an optimistic theory.
      Given AMD's mismanagement issues I'd be stunned if they were actually doing that, but optimism is good, especially when the alternative is that AMD becomes as hopeless against Nvidia in GPUs as they already are against Intel in CPUs.

    • madtronik
    • 6 years ago

    I have read comments and articles about these graphs and I think they (the graphs) are mostly meaningless.

    First thing is, we don't have a clue how those expenses are computed, i.e. what counts as R&D and what doesn't, although we have some data.

    Intel's R&D is a lot higher because they need to spend an outrageous amount of money on cutting-edge fabs. That is their advantage, but it is a gargantuan fixed cost. That alone explains most of the difference between Intel and the NVIDIA/AMD duo, which are fabless and whose foundry costs are variable.

    For the “sky is falling” crowd regarding AMD, there are a bunch of things you do not understand, both here and in many of the articles I've read, where the editors seem clueless.

    First, as some comments say, most companies spend a % of their revenue as R&D. If you sell less you just spend less. The % is mostly fine.

    Of course, you can argue that as revenues fall and you spend less on R&D, you have less technology available in the future, which will suffocate your financials because your products are not competitive. Good point. The thing is, you have to know the nuances of the semi industry.

    When you think R&D, many of you think of big block diagrams and of caches, AGUs, ROPs and so on and so forth. That is only part of the story. A big part of the expense is implementing a chip on a given process (that is, laying out the actual transistors) and costs related to manufacturing (paying for the photomasks, the tapeout, etc.). The thing is, AMD is not making that many new chips right now.

    If you recall, the last high-end CPUs that AMD released came out in 2012 (I am not counting the factory-overclocked versions AMD released later). When was the last time you heard about a new Opteron processor being released? There have already been several years without new products.

    When your company is in dire straits because of falling sales and strong competition, you must pick your battles carefully. That is, you must spend your R&D money on products you know have the potential to sell, and you also need to spend the money you have efficiently. AMD is doing both (they have no other option).

    Bulldozer is a design that needs a great process to shine: high frequency (a lot of MHz). Current processes are crap for this (this was a BIG foresight error). Years ago, even after the spin-off of GloFo, AMD paid top dollar to GloFo to develop high-end processes specially tuned for them. In fact, they were the only customer of those high-end processes, like the ancient 32nm SOI that is still used in the FX line. The problem is that even working this way it was a costly proposition (only workable at high volumes), and they were tied to a single foundry. If the foundry screwed up big time or had a delay, you had no product. With Llano, the GloFo problems were disastrous.

    So, now AMD works with standard processes and can move their designs to other foundries if problems or delays arise. A lot of chips have already been produced at TSMC. In addition, they no longer spend money paying for special processes (less R&D); they use the same processes as everyone else and have lower costs. The problem now is that they get lower transistor performance.

    These lower-performance transistors make it pointless to try to design new high-performance chips based on Bulldozer. There is no way you can get decent performance out of them. That is why we don't have any 28 nm FX CPU; it might even be SLOWER than the 32nm SOI parts. So, what's the point in taping out new chips? (You can follow the exact same reasoning with the Opterons, which are the base for the FX.)

    They have also learned not to depend on bleeding-edge processes (using them when they are just in their infancy) that lay their designs to waste when the inevitable delays rear their ugly head. Does anyone remember this? (http://www.techpowerup.com/150691/amds-next-generation-wichita-and-krishna-apus-detailed.html) That was a lot of R&D down the toilet.

    In addition, AMD is spending less on GPUs. Not on new designs, on chips. There was a time when, whenever a new architecture was introduced, there were new designs across nearly the whole price/performance range, usually 3-4 different chips, variants of the same new architecture for different price points. Nowadays, AMD just sells you the older chips in a lower bracket, and the all-new chip is the high end (and maybe something totally new in the middle). A more opportunistic approach. This way you spend quite a bit less, but the high-end architecture is top-notch. I don't think the R&D for the new design is much reduced; you just juggle fewer variants.

    And then you have higher efficiency in the way you spend:
    [url<]http://news.synopsys.com/AMD-and-Synopsys-Expand-IP-Partnership[/url<]

    Things like this mean a lot LESS money spent on R&D without affecting the pipeline in a significant way. Of course they should be designing new architectures to start competing again, but the sad part is that without the new FinFET processes available there is little to be done. So, until 2016 there is no point in releasing many new things.

    In addition, a big chunk of AMD's revenues are semi-custom chips, with R&D paid for by the customers, which afterwards don't require new R&D (nor marketing, sales, etc.). There are new semi-custom chips in the pipeline right now. The R&D is paid by others and perhaps is not counted there; I do not know, I am not an expert accountant.

    Even if AMD doesn't churn out that many chips, nothing stops them from creating full new designs for the future. Making the designs is a lot cheaper than putting them in silicon. In fact, making a new chip is getting a lot more expensive each year. Expect to see less variety of chips even if AMD's fortunes turn around significantly.

    On the other hand you have NVIDIA, which spends a lot of R&D making chips for self-driving cars and also spends a bunch designing and marketing new consoles themselves, because the big fish are so cheap and mean they don't want to use their great technology. BTW, AMD has said that their priority right now is SERVER chips. Don't expect many new things for your AM1/FM2 rigs. A pity...

    • Srsly_Bro
    • 6 years ago

    For any awesome accounting people or finance people (awesome is exclusive to accounting), I did some quick research on the ATi acquisition. The purchase of ATi has been speculated to have included $3.2bn in goodwill and $1.2bn in intangibles. The analyst further speculates AMD has written off almost half of the $5.6bn purchase price of ATi in an 18 month period.

    AMD really overpaid for ATi.

    [url<]http://www.forbes.com/2008/07/11/advanced-micro-closer-markets-equity-cx_mp_0711markets40.html[/url<]

      • melgross
      • 6 years ago

      Except that, but for ATI, AMD would be much further in the hole than it is today. They might even be out of business.

        • NeelyCam
        • 6 years ago

        Not so sure… all that money could’ve been spent on server chip research – a profitable market where AMD was kicking Intel’s butt

      • ronch
      • 6 years ago

      1. They really needed to buy ATI, but they did overpay for it.

      2. If they hadn't spun off their fabs, AMD might very well be gone today. They simply don't have the silicon revenue to keep those fabs. Problem is, they inked a difficult agreement with the spun-off company, and have for years suffered one-time charges that never seem to end.

      3. Having overpaid for ATI, one would've thought they would at least hang on to all of it, but instead they sold off what would turn out to be a crown jewel to Qualcomm for peanuts.

      AMD sure is good at overpaying and undercharging, and looks pretty bad at negotiating deals.

        • Chrispy_
        • 6 years ago

        [quote<]AMD sure is good at overpaying and undercharging, and looks pretty bad at negotiating deals.[/quote<] Mismanagement. Let's hope Lisa T. Su isn't quite as incompetent as her predecessors.

          • ronch
          • 6 years ago

          I think Hector was really the guy who messed things up, and Dirk was better off as a top-flight chip engineer than as a CEO.

    • ronch
    • 6 years ago

    Is that graph adjusted for inflation? Because if it isn’t, it just means AMD is actually spending LESS on R&D nowadays than it did ten years ago.

    And as designs become more and more sophisticated, R&D expenditures tend to go higher. It seems either AMD has a very lean cost structure (highly unlikely given their history) or they are doing the nearly impossible: creating more advanced products with less money.

      • Srsly_Bro
      • 6 years ago

      Many people don't take inflation into account. Throughout that period, inflation rates have been quite low, and lower than historical averages, so I don't think factoring in inflation would change the graph much. AMD also seems to have had a steady decline in R&D spending throughout the graph, so the same story could have been written at almost any point since then.
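
      (For what it's worth, a quick way to check that is to deflate the article's 2014 figure into 2004 dollars. Below is a minimal sketch, assuming approximate annual-average CPI-U values; the exact numbers are on the BLS site, so treat this as illustrative only.)

```python
# Deflate AMD's 2014 quarterly R&D figure into 2004 dollars.
# CPI_2004 and CPI_2014 are approximate annual-average CPI-U values used only
# for illustration; check BLS data for the exact figures.
CPI_2004 = 188.9
CPI_2014 = 236.7

def to_2004_dollars(nominal_2014_millions):
    return nominal_2014_millions * CPI_2004 / CPI_2014

amd_rd_2014 = 238  # $ millions per quarter, from the article

print(f"Cumulative inflation, 2004-2014: about {CPI_2014 / CPI_2004 - 1:.0%}")
print(f"$238M in 2014 is roughly ${to_2004_dollars(amd_rd_2014):.0f}M in 2004 dollars")
```

      By that rough math, about a quarter of the nominal figure evaporates over the decade, so the real decline is somewhat steeper than the nominal graph shows.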

        • VincentHanna
        • 6 years ago

        Inflation in California has been “relatively low?” Relative to the hyper-inflation crisis of the 70s, or ______?

          • Srsly_Bro
          • 6 years ago

          Thanks for your comment. AMD, Intel, and Nvidia don't operate solely in the state of California. These are international companies, with R&D facilities located outside of California and even outside of the United States. The dollars spent aren't earned only in the United States, either. Your concern should be focused on inflation on a more global scale, as international companies operate globally and aren't confined to one state. With respect to taxation, that's a different argument.

          To answer your irrelevant concern with respect to California’s inflation, you should check out the BLS for California’s historical inflation. You’ll find your answer there.

          -srsly_bro

      • VincentHanna
      • 6 years ago

      That graph only goes back 5 years, fyi…

      They also do some of their R&D in India (now), and rely a great deal more on third-party R&D since they no longer make their own chips. AMD's research today and their research in 2005 isn't exactly an apples-to-apples comparison, dollar for dollar.

        • madtronik
        • 6 years ago

        I forgot to mention that in my comment as well. In the end, the graph gives very little useful information.

        • ronch
        • 6 years ago

        The graph doesn’t go back ten years but the article does mention that their Q4 2014 R&D spending was near 2004 levels.

    • ronch
    • 6 years ago

    So is the APU really a good idea? Combining CPU and GPU cores on one piece of silicon forces compromises on both, since they have to share a limited TDP and a manufacturing process that is ‘jack of all trades, master of none’.

    People who are pro-AMD always insist that APUs are the answer to a budget gamer's prayer, but are they really? A serious gamer will never settle for an APU (more likely he'll just save up a little more), and casual gamers would do just fine on a Pentium + cheap discrete combo. Office use? A Pentium will do great. The Core i3 blows the 7850K out of the water in general performance for about the same cost or less. If APUs were as great and attractive as AMD and some fanboys make them out to be, AMD probably wouldn't be in the red quarter after quarter. HTPCs are a good fit, but how many people are building those? Not enough to float AMD's boat, I think.

    Thing is, AMD should just offer GPGPU compute on a separate card that's unfettered by CPU cores. Intel has Phi, Nvidia has Tesla. Nobody buys APUs for serious GPGPU compute anyway, and the finalization of HSA took soooo long. Does anyone really think many devs will jump right on the HSA bandwagon? I don't think so. And even if they do, it'll take a long while for programmers to get cozy enough with it to make APUs sell like crazy and keep AMD's lights on. And of course, Intel is also doing their own thing, so it's more likely they'll get their ducks lined up sooner than AMD. It'll take Intel a while to get Phi rolling, but AMD is just too slow, and Nvidia has already been selling Tesla for a while now.

    I know some people swear by APUs, but sales numbers don’t lie. It’s a bad idea, a jack of all trades, master of none. HSA has promise, but given AMD’s pace it’ll miss the bus. With less money being spent on R&D, the future just doesn’t look very promising.

      • nanoflower
      • 6 years ago

      I can't see people jumping into HSA when they have DX12 to try to take full advantage of. That's going to occupy them for the next year. Plus, AMD doesn't currently have enough design wins to make it worthwhile for developers to spend a lot of time on HSA. Maybe next year, if AMD comes out with something special that lets them take design wins from Intel.

      • madtronik
      • 6 years ago

      Of course! AMD should be restricted to GPGPU via the slow PCIe connection… Do you know that the next-gen Phi is not a card but a regular CPU that fits in a socket? Do you know that NVIDIA is busy with their new NVLink connection, in partnership with IBM, so Intel doesn't blow them out of the water?

      Do you have a clue what HSA is? HSA is a complex hardware and software system that lets regular programmers use heterogeneous computing resources without having to learn the nuts and bolts of GPU programming. It should be one of the easiest ways to program an APU.

        • maxxcool
        • 6 years ago

        Phi is not a GPU.

        • ronch
        • 6 years ago

        Just because an APU will be easier than ever to program doesn’t mean everyone’s gonna jump on it already.

          • BIF
          • 6 years ago

          I think a lot are already worshiping at The Altar of Apoo. It’s all I hear about and frankly, I’ve had it up to “here”.

            • ronch
            • 6 years ago

            It’s gonna be 3DNow! 2.0.

    • the
    • 6 years ago

    This is a bit deceptive as 10 years ago, Global Foundries was still part of AMD. Researching semiconductor manufacturing takes a lot of money. Reflecting that as a percentage is another matter (not sure how the math works out off hand).

    AMD has also completed some major R&D for chips targeted at the console business. At this point, AMD would only be handling die shrinks of these chips and considering the delays of 20 nm and 16/14 nm FinFET from the foundries, they could be sitting on taped out shrinks and just waiting on manufacturing to catch up.

    Still, the drop in R&D is a bit concerning. It likely won't affect Zen that much. The design should be going through logic validation in simulators right now, with a tape-out later this year. What follows will be some test wafers and more validation before going into production for a 2016 release. Ditto for the K12. These stages are rather straightforward and only really fall back to the design phase when major bugs are found.

    The problem with reducing R&D is that the designs after Zen and K12 are the ones affected. Considering the cadence the industry moves at, the next iteration would likely be a tweaking of those chips and/or a process-node shrink. Not the most exciting things, but they still take money to accomplish and should be in the design phase right now. The iteration after that is where the loss of investment will really have an impact. Long term this isn't great, but right now AMD needs to focus on short-term survival. So I can't fault them for their current strategy, but Zen and K12 need to be a hit.

      • sweatshopking
      • 6 years ago

      Zen won't be a hit.

      • ronch
      • 6 years ago

      According to the graph, AMD's R&D expenditures started to decline in 2012. At that time, Zen would've been in the early/mid design phase.

        • the
        • 6 years ago

        At the same time, the console SoC’s were entering the validation stage to get ready for mass production.

        Similarly, AMD also cancelled several chips in that time frame. This is why we don’t have a Steamroller or Excavator based core design for the consumer AM3+ socket, or the server focused C32/G34 sockets. Such chips were originally planned and on the roadmap before being cancelled. These are the products we’d be seeing released right now.

        Zen and K12 on the other hand would have started their initial planning in 2012 or early 2013. This would have been after AMD started to reduce their R&D budget.

          • ronch
          • 6 years ago

          Seems to me AMD has canceled too many projects in the last 10 years or so. Rumor has it that the K8 we know was a quick band-aid project to replace the original K8 project (which was actually what Jim Keller worked on, not the released K8). Every canceled project is akin to a CPU instruction pipeline flushed by a branch misprediction: a lack of management direction and vision causes real rocket-science work and millions and millions of R&D dollars (not to mention the time spent on to-be-canceled projects) to get flushed down the drain. Then AMD fires the engineers, which just makes the next companies those engineers work for the beneficiaries of everything they learned at AMD before their projects were canned.

            • the
            • 6 years ago

            We got the K8 that Jim Keller worked on. The catch is that he left AMD before it was released. It also took a very long time to validate, as it was not just a new chip but contained many fundamental changes to the underlying ISA.

            Though ultimately, AMD did need better management. However, a few choices did pay off in hindsight, like spinning off their fabs to create Global Foundries. AMD simply would not be here today if they hadn't pulled off that move. AMD couldn't provide the volume to keep those fabs going and be economically feasible. Spinning them off and then opening them up has made sense in the long term. However, I do believe that AMD sold off their stake in Global Foundries a bit too early.

            • ronch
            • 6 years ago

            While spinning off GF gave them some breathing room in the short term, as we can see it does present entirely new challenges to the company. Broadwell, and all other Intel chips for that matter, simply wouldn't be as fast or as energy efficient if Intel didn't have complete control of its process tech. However, in due time AMD will learn how to adapt to the new paradigm, if they survive long enough, that is.

    • southrncomfortjm
    • 6 years ago

    I guess there could be worse things than Samsung buying AMD and throwing money at the problem.

      • ronch
      • 6 years ago

      Like the world blowing up?

    • ronch
    • 6 years ago

    I posted about this in this week’s Wednesday shortbread. You’re welcome.

    • HisDivineOrder
    • 6 years ago

    I just don't get why it's so hard for AMD to see the problems in their strategy. If you stop spending on R&D, you're essentially selling old technology again and again. How long do you think you can paper that over with bundles and rebrands before people stop buying your hardware? I mean, the R9 290/X wasn't even far and away ahead of nVidia's best at release, but it did have a decent price strategy going for it.

    Sitting on that for a year though? Planning to probably rebrand that when the R9 390/X comes out?

    Their APUs are a complete failure when they can't even price them low enough to actually beat the cost of a dual-core Intel CPU plus a low-end discrete card (not even what we'd have called low end years ago), because most people buying a PC who want an APU for GPU performance are going to be fine going a little bigger to get access to more GPU upgrades later. And then there's taking the once-venerated FX branding and just squatting down, spewing crap all over it with a Bulldozer-based chip they HAD to know was going to disappoint …everyone? They even fed into that disappointment by promising, right ahead of its release, performance that would NOT materialize.

    So how does AMD respond to this? Do they do an about-face and recognize the holes in their strategy? Do they double down on R&D investment and focus engineers on superior architectures? Nope. They can a great majority of their engineers, support, and marketing teams. These are the people who could have helped them turn the whole debacle around.

    Then they go and hire a few legends in the industry. But they just canned most of the incredibly talented people who could have helped those industry legends actually do anything. They then promise that a future architecture by said legends will save the day, while continuing to sell tech that is, by their own admission, disappointing.

    And the worst part? The CPU division is dragging down a GPU division that was once very profitable, even long after the CPU division had gone down in flames. If only AMD could find it within themselves to spin off or sell off the ATI part of AMD, then we’d all have competition for nVidia still after AMD’s CPU division implodes from the weight of poor (mis)management.

    Instead, they've gone on so long that now even the GPU division feels the brunt of the lack of R&D. The R9 290/X was originally scheduled to be revealed (or even released?) in January of the year it finally came out in October. AMD has been stretching their GCN-based GPUs across multiple years, rather than releasing them all at once like they once did, so they can look like they're doing more than they actually are. And they refuse to create a new reference cooler, for people who want or need a quality blower, to match the excellent one built by nVidia and used as recently as the Titan X.

    It’s like watching a married couple where the hot chick you just love is being driven down the road by a drunken, delusional, drug-addled senile old geezer who is absolutely certain he can get rich if he gets down to the Gas ‘n Sip just across APU Street. And you call out after the hot chick because if she’d just get out of the car, she wouldn’t get hit by the Mack Truck barrelling down a side road. You just know AMD’s not going to stop for the stop sign and the truck’s gonna hit them both, obliterating them.

    But you can’t do anything to stop it. You want to save that hot red head. She’s still got great hardware, even if she ain’t got the money to get the hottest new drivers, but she could do a lot more than she is.

    And she’s gonna die, leaving us with a choice between the spoiled, whiney, clingy rich, stacked green-haired chick and her loose blue-haired frenemy.

      • odizzido
      • 6 years ago

      it is a shame. When AMD first grabbed ATI I was kinda excited to see what would happen with them having access to some more advanced manufacturing and AMD’s other resources…..but then nothing interesting happened and things went downhill for AMD. Now I wish AMD never purchased them.

      • nanoflower
      • 6 years ago

      I wonder how much of the APU cost/performance issue is due to the engineering and how much is due to the fact that they are stuck on a 28 nm process while Intel keeps moving ahead. I'm sure they assumed they would at least be on a 22 nm process by this point when designing their APUs, so the gap between their initial design assumptions and the reality when they finally went to production must have had some negative effect.

      • ronch
      • 6 years ago

      AMD's current situation is a result of past mismanagement, specifically during Hector's watch. That was when Barcelona was in development (Jerry stepped down in 2001), and he was also CEO when the Bulldozer specs were frozen in 2005/6. Barcelona and Bulldozer have been, as we all know, less than blockbusters. AMD managed to keep fighting in 2009-10 by pricing competitively, but Bulldozer, while technically sound, embodied AMD's decision to ‘take a step back’ and go for the ‘optimal design point’ (which is another way of saying ‘spend less on R&D even if performance suffers’). The thing is, apart from the expense of buying ATI, 2003-2006 were AMD's best years ever (ATI was only bought in Q4 2006): they were able to charge almost a grand for some of their chips, just like Intel does, and enjoyed their best days on the server side. In other words, they had the money to pour into R&D and keep up their winning streak, but they didn't, because they consciously took a step back.

      Building a design that shoots for the stars doesn't guarantee success, so what do you expect when you decide to just shoot for the moon? They said going toe to toe with Intel almost killed them, but I reckon their best days were exactly when they did go toe to toe with Intel. When you compete with a big company, you need to keep up. You can't say you can't match them and decide to step back; in today's world, either you keep up or you keep out. In AMD's case, they thought they could go against the Mercedes of processors by offering a Toyota and hoping people would be OK with it. In some ways people are, but when that Mercedes sells for $20,000, you need to sell your Toyotas for a lot less to compete. And that can end you right there, while they can just weather the storm.

      • blastdoor
      • 6 years ago

      A couple of hundred million dollars can be a lot of money if it's spent wisely. Many successful start-ups that become multibillion-dollar success stories do so with a lot less early R&D spending than AMD's. (Also, I suspect Apple spent a lot less than this to develop the iPod, and that seemed to work out well for them.)

      Those industry legends may be exactly the people to make wise decisions in the spending of that R&D money. (edit — engineers are great, but they often get lost in the weeds. Without appropriate guidance/leadership, they can burn through billions and have nothing to show for it)

      But there’s no doubt this is do or die time for AMD. The clock is ticking loudly.

        • ronch
        • 6 years ago

        R&D spending can be done more wisely but AMD is not known for spending their money wisely.

          • blastdoor
          • 6 years ago

          Sadly, that is true, at least for the last decade.

          • the
          • 6 years ago

          Indeed.

          The obvious ones were the cancellations of the Steamroller and Excavator chips for sockets AM3, C32, and G34. Not only were the chips cancelled, but AMD ended up paying Global Foundries for the reduction in orders.

          AMD did a lot of research into memory extenders/buffers for socket G3 that was never released. This may have tied into a partnership with IBM, where they were rumored to be planning a common POWER/Opteron socket. While AMD killed off Socket G3, IBM did eventually use memory extenders/buffers for the POWER7 chip. How much of that stemmed from an AMD collaboration is an open question.

          A common socket for Opterons and Radeons for HPC workloads was rumored to be in development before being scrapped. I'm not sure how solid this rumor was, but the Radeon chip for it was not initially HSA compliant, with that feature coming later.

          There was also a rumor of AMD having Steamroller- and Excavator-based Opterons using a new socket that provided on-die PCIe connectivity and DDR3 memory. This was cancelled relatively early, since by the time it would have been released, DDR4 would have been on the horizon, so the choice was made to continue with sockets C32/G34. As it turned out, DDR4 took longer to appear than initially projected, but by the time the DDR4 roadmap slipped, AMD had cancelled Steamroller- and Excavator-based Opterons anyway, so the entire point was moot. This socket also likely tied into the above rumor of a GPU sharing an Opteron socket.

          Then there is the whole Z-RAM fiasco, where AMD sank millions into an SRAM replacement that never materialized. AMD also managed to do some research into eDRAM for caches but never used it either.

          Mantle can be put on this list as well: they spent the money developing it and then essentially gave it away to be rebranded as Vulkan. Granted, a few things were changed to make it more open for non-AMD GPUs to use, but they did the bulk of the grunt work. Spreading the development of Vulkan across other vendors would have been significantly more cost effective (though it also could have led to Vulkan arriving far in the future). Essentially, the concepts behind Mantle and subsequently Vulkan are important for the industry as a whole, but AMD paid most of the check to get it done at a time when they don't have that kind of money to spare.

          AMD also did a bit of software development that went nowhere. They have OS X drivers for their VLIW4-based GPUs, which is noteworthy because there was never a Mac version of the Radeon 6970. Conceptually Apple could have used it in the Mac Pro, but the more likely scenario is that Apple was considering Llano in a MacBook/MacBook Air. Regardless, AMD built a driver for a product that never saw the light of day.

          AMD bought SeaMicro to build a next-generation interconnect between SoCs. This has yet to be released, though it is also something we could see surface with Zen/K12. To date, the acquisition has been a bit of a waste.

        • sweatshopking
        • 6 years ago

        I'd say that until very recently Apple wasn't really in the research biz like the others. Certainly development, but they weren't actually inventing anything, just putting others' inventions together.

    • Generic
    • 6 years ago

    I thought DX12 had been in the works for a while. How did Mantle lay the way for Microsoft?

      • HisDivineOrder
      • 6 years ago

      AMD keeps repeating it, and if history is any indication, repeat something enough times and people will start to just go along with your take.

      The original reason people thought DirectX 12 wasn’t coming was an article by a guy at AMD who went on the record (and then recanted) that DX 12 wasn’t coming any time soon. Months later, AMD announced Mantle as a “response” to that.

      But all people remember is that “DirectX 12 wasn’t coming,” then Mantle came, and DirectX 12 followed. They don’t remember that the only person saying DX 12 wasn’t coming was AMD in the first place.

      • Ninjitsu
      • 6 years ago

      Yeah, I don’t know why Jeff decided to write that line.

        • nanoflower
        • 6 years ago

        Because everyone keeps saying it like it was the truth. I noticed that even the PCPer articles don't suggest that Microsoft could have been working on DX12 before Mantle. It's just taken as gospel that without AMD we wouldn't have Vulkan or DX12, never mind that everyone in the field (including people from AMD and Microsoft) says developers have been pushing for low-level access to the GPU for years.

    • UnfriendlyFire
    • 6 years ago

    “FYI, you dont need resources when you have great talent =D
    remember the Athlon 64… Radeon 9700, Radeon HD 3870 etc
    hell even the FX-8350 holds pretty well still and thats pretty old by now”

    Except that if you don't pay enough, you won't have enough engineers and researchers to go around. Intel has a wide variety of projects ranging from networking to fabbing to CPUs, and thus needs a huge R&D staff to run them.

    Also, having good equipment and other resources helps. Have you ever tried creating a 3D design using 2D methods in software from 1998?

    There's a community college that still uses that vintage engineering software because they won't pay the huge price tag for the latest 3D CAD software.

    EDIT: I have been on an understaffed project that had an extremely ambitious and short deadline, yet the people who set the requirements expected high quality work. They were in for a disappointment.

      • K-L-Waster
      • 6 years ago

      [quote<]I have been on an understaffed project that had an extremely ambitious and short deadline, yet the people who set the requirements expected high quality work. They were in for a disappointment.[/quote<] Fast, Good, and Cheap: pick 2. AMD certainly hasn't been doing fast. In GPUs at least they seem to be doing Good + Cheap....

        • UnfriendlyFire
        • 6 years ago

        Well, it appears they delayed their new GPUs due to a large inventory of older ones.

        AMD got quite punked by the crypto-coin mining craze. First, demand went through the roof and their GPUs cost a lot more than Nvidia's for the same gaming performance.

        Then, after they got supply up to match the high demand, a bunch of the mining GPUs got dumped onto eBay or Craigslist when the mining craze ended.

      • Sam125
      • 6 years ago

      I hear AMD is big on automating circuit layout for their CPUs while Intel isn't. So no, you don't know what you're talking about, because AMD isn't using outdated tech. Their current CPU just sucks. 😉

        • UnfriendlyFire
        • 6 years ago

        It costs much more to hire a team of electrical engineers to hand-craft critical circuits than to let automated software do the work (at the expense of less-optimized designs).

        • Klimax
        • 6 years ago

        The cost is lower performance, worse efficiency, and more transistors required (for example longer traces, because optimal layout and routing is not a solved problem).

    • wiak
    • 6 years ago

    FYI, you dont need resources when you have great talent =D
    remember the Athlon 64… Radeon 9700, Radeon HD 3870 etc
    hell even the FX-8350 holds pretty well still and thats pretty old by now

      • K-L-Waster
      • 6 years ago

      Technology is more of a “what have you done for me lately?” field….

    • wiak
    • 6 years ago

    To make it fair, it should have a combined AMD + GlobalFoundries line too.
    That makes more sense, as they were the same company half a decade ago.

      • UnfriendlyFire
      • 6 years ago

      I disagree. GF is more interested in making silicon processes aimed specifically at smartphone/tablet chips.

      To say that Bulldozer had some problems on processes not meant for high clock rates would be an understatement.

        • nanoflower
        • 6 years ago

        Isn't everyone in the chip-production business interested in the smartphone/tablet market? Samsung certainly is, and it seems that TSMC is too. The market is just too big to be ignored. That doesn't mean companies other than Intel won't push forward with processes that support high-speed chips, but it doesn't seem like that will be at the forefront again.

      • ronch
      • 6 years ago

      Except that GF also spends money building chips for other companies, which would muddy the numbers.

    • MetricT
    • 6 years ago

    AMD has great engineers, even one of Intel’s engineers admitted that much in a Reddit AMA.

    [url<]http://www.reddit.com/r/IAmA/comments/15iaet/iama_cpu_architect_and_designer_at_intel_ama/[/url<]

    AMD has two major technical problems: the fabs they used couldn't keep up with Intel's lithographic progress, and their CPU architecture assumed gains from frequency scaling and increased parallelism that didn't materialize. There's not much they could have done about the former, and the latter was compounded by the amount of debt they had. They couldn't afford to rearchitect on the fly.

    AMD may be down and out, or it may be on the upswing again. Between the Zen architecture and gaining access to Samsung's 14 nm process, they may actually become competitive again in 2016. Time will tell, and I wish them the best.

      • HisDivineOrder
      • 6 years ago

      AMD's always got a big performance gain just around the corner, and AMD's always got an excuse picked out for why the last "big performance gain" didn't show up. This has been true of AMD with regard to Bulldozer, Phenom II, and Phenom. Every iteration of Bulldozer has been about trying to bridge the gap with Intel CPUs from yesteryear.

      I want AMD to compete. A lack of R&D investment is a poor indication of a real commitment to doing that. It IS a sign of their current strategy of selling everything not nailed down and rebranding anything that hasn't already been sold across two generations so it gets sold for at least three generations total under different names and numbers.

      Oh, and APU’s. Because those are just selling gangbusters.

      • BobbinThreadbare
      • 6 years ago

      [quote<]their CPU architecture assumed gains from frequency scaling and increased parallelism that didn't materialize.[/quote<] If only they could have seen this coming. Like their biggest competitor doing a very similar thing a few years before. I guess absent that kind of warning though, it's a natural assumption to make.

      • jihadjoe
      • 6 years ago

      Nice mention of TR as one of the sites he likes to read, after RWT and Anandtech. The coverage here isn't quite as technical as AT's, let alone RWT's, so I can imagine why an engineer would prefer those, but the writing here is lighthearted and makes technical details digestible even for the lay person.

      I also thought AT’s quality would go down after Anand went away, but Ryan Smith has been doing a good job holding things down. Kudos to him.

    • jjj
    • 6 years ago

    The way to look at it is as a % of revenue.
    20% is normal for a chipmaker; Intel, Qualcomm, and MediaTek are at about 20% too.
    Nvidia last year was at 29%, but that's not so much because “they are investing in the future” as because their Tegra business is in terrible shape. They would be a lot happier if R&D were 25% of revenue but, for now, their revenue is trailing behind.
    For AMD the real problem is that they are just not selling anything. Their CPU and APU business is tanking hard, and in GPUs they have lost a lot of share.

    PS: Apple last year had R&D at 3.3% of revenue vs. Sales, General and Administrative at 6.65%. Chipmakers invest in R&D (and don't really get the credit for it), device makers in marketing.
    Google is worth noticing though: they were at 14.9% of revenue even though they wouldn't need much R&D for their core business. They really are investing in the future; it remains to be seen whether they actually manage to translate that R&D into sellable things or mess it up every single time.

    Edit: Figured I'd better explain why one must look at it as a % of revenue. Everyone has a business model and targets certain gross margins, certain spending, and so on. R&D is just part of that math. They don't always manage to reach their targets, but they try to. Nvidia or Marvell, for example, have their revenue too low right now. So the amount chipmakers spend on R&D is less relevant; deviations from their target model are what is worth noticing.
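
    (To make that framing concrete, here is a minimal sketch of the "deviation from target intensity" view, using the per-quarter figures from the article; the flat 20% target is an assumption for illustration, not a reported number.)

```python
# R&D "intensity" (R&D / revenue) vs. an assumed flat 20% chipmaker target.
# Spend and intensity figures are the quarterly numbers quoted in the article;
# implied revenue is simply backed out from them, and the target is an assumption.
TARGET_INTENSITY = 0.20

for name, rd_spend_m, intensity in [("AMD", 238, 0.20), ("Nvidia", 348, 0.31)]:
    revenue_m = rd_spend_m / intensity
    deviation_pts = (intensity - TARGET_INTENSITY) * 100
    print(f"{name}: ~${revenue_m:,.0f}M revenue/quarter, R&D intensity {intensity:.0%}, "
          f"{deviation_pts:+.0f} points vs. the 20% target")
```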

      • K-L-Waster
      • 6 years ago

      [quote<]Figured i better explain why one must look at it as % of revenue. Everyone has a business model and they target certain gross margins, certain spending and so on. R&D is just part of that math. They don't always manage to reach their targets but they try to.Nvidia or Marvell for example have their revenue too low right now.So the amount chipmakers spend on R&D is less relevant, deviations from their target model are what is worth noticing.[/quote<] From an accounting perspective, that's true. OTOH, in terms of how strong their tech will be in the future, absolute spend is more important because it determines how many engineers are working, what resources those engineers have to work with, how many parallel lines of inquiry you can follow, etc. Ultimately, the scale of those things is determined by absolute dollars, not % of revenue.

        • jjj
        • 6 years ago

        Except it’s not news when the R&D is proportional to the size and the sector of the company.
        Like here: AMD is in trouble (we already knew that) and R&D is not low, it just tracks with revenue, so this so-called report is pointless.
        One could look at AMD properly and show how troubled they are and that would be a relevant report. Or look at Google or Nvidia and point out the oversized R&D and try to understand it.
        This is just for clueless people by clueless people. It’s similar quality to … The Verge talking about CPU architecture.

          • UnfriendlyFire
          • 6 years ago

          Or WCCF Tech trying to predict the future.

          • Srsly_Bro
          • 6 years ago

          Who are you, who are so wise in the ways of accounting?

      • Sam125
      • 6 years ago

      As someone who would be considered “Smart Money” (although I'm not wealthy yet), I can say Intel is very wasteful with their R&D. They research a lot of topics that they dead-end or simply scrap because someone in management isn't in love with the technology. Internal politics decides which research path is taken, instead of which research path will lead to the most fruitful results. That happens at every large innovation-oriented company, but I've heard it's especially bad at Intel. Their stock price reflects this.

      Plus a behemoth like Intel needs to throw money at R&D or else they will definitely lose scientists and engineers to smaller companies like AMD or be lured away by secondary competitors like Apple, Microsoft, etc. Heck, I bet even Korean companies like Samsung or LG would be poaching employees left and right if Intel left that kind of an opening with their R&D staff. That’s just the nature of a competitive market for labor. So reports like these are “meh”. They don’t really prove anything, other than how much money they can afford to toss at “R&D”.

      • Deanjo
      • 6 years ago

      [quote<]PS; Apple last year had R&D at 3.3% of revenue vs Sales General and Administrative at 6.65%. [/quote<] Apple's R&D at 3.3% is still $6 billion invested in R&D, double that of Intel. Percentages are a horrible way of evaluating a company's investment in R&D. All a low R&D % with massive revenue shows is that the company is getting better bang for its R&D buck. In Apple's case, focusing their R&D dollars on a narrow field has given them a product they can reap huge margins on, but make no mistake, the total dollar investment in that narrow field is huge.

        • takeship
        • 6 years ago

        Just to point out – Apple’s R&D listing for full year 2013 was ~6B while Intel’s was 10.6B. The numbers above are quarterly numbers, not full year. Kudos to Apple though, who has more than doubled their R&D numbers since 2012.
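
        (A quick units check, since the thread mixes quarterly and full-year figures; the Intel quarterly value below is the article's rounded "nearly $3 billion", so treat the annualized number as approximate.)

```python
# Quick units check: the article's Intel figure is quarterly, while the Apple
# and Intel numbers in the comments above are full-year 2013 figures.
# All values are in $ billions; the quarterly value is the article's rounded number.
intel_quarterly = 2.9
intel_annualized = 4 * intel_quarterly

apple_full_year_2013 = 6.0
intel_full_year_2013 = 10.6

print(f"Intel annualized from the quarterly figure: ~${intel_annualized:.1f}B")
print(f"Full-year 2013, Apple vs. Intel: ${apple_full_year_2013:.1f}B vs. ${intel_full_year_2013:.1f}B")
```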

      • melgross
      • 6 years ago

      When companies get to the size of an Apple, which is about $200 billion in sales, percentages invested in R&D don’t matter as much. Last year, Apple invested $6 billion in R&D. This year, that is increasing by a good 20%. Apple also invests in plants its suppliers build. They buy machinery for them, and even help pay for employee training. They just paid over $1.2 billion for an LCD manufacturing plant, as an example. Apple will pay over $13 billion this year on investments such as that. That counts as massive capital investment.

      How much is AMD spending on that?

    • crabjokeman
    • 6 years ago

    It's interesting to see Intel graphed in comparison, but wouldn't the graph be more informative without it?

      • K-L-Waster
      • 6 years ago

      If you were only interested in GPU and ARM development, yes – if you are interested in the overall desktop space, though, ignoring Intel is going to leave a huge gap in the picture.

    • LocalCitizen
    • 6 years ago

    AMD’s R&D spending cut is consistent with its plan to produce high performance CPU microarchitecture and GPUs.

    you know:
    1 – R&D cut
    2 – magic
    3 – great products
    4 – future growth

    And how have the resources wasted on Mantle given them any competitive advantage?

    • brothergc
    • 6 years ago

    Sorry, but I have just lost faith in AMD. Years of waiting for something that would beat Intel has left me running to Intel. They screwed the pooch with their Bulldozer BS.
    I used to be an AMD-only user, but that was years ago in a land far, far away. Nowadays I avoid AMD like the plauge!

      • albundy
      • 6 years ago

      same here. but hey, whatever has a beginning must have an end… to some means. disappointments come and go. let's hope another company jumps in and starts competing.

      • TopHatKiller
      • 6 years ago

      Not trying to be rude, but I avoid the "plauge" desperately myself. I'd much rather spend my
      [not particularly] hard-earned on companies that stifle competition, innovate nothing they're not dragged kicking-and-screaming into, and screw every last penny they can out of their trapped customers… OR spend my readies on companies that bluster, lie, extort, bribe and try to [i<]trap[/i<] their customers in proprietary upgrade paths. There is a 50/50 chance that 2016 belongs to that other company. PS. The financial analysis of R&D spend in the quoted article is nonsense; I'm under the impression its figures are motivated by analysts attempting stock-price manipulation. So there.

        • Klimax
        • 6 years ago

        Supposedly stifling innovation. Nice joke there…

      • ronch
      • 6 years ago

      Someone here posted a link to a Reddit AMA by an Intel engineer, and reading it was very enlightening. It shows the quality of Intel's people and the work that they do. AMD also has really remarkable people, and the Intel guy did acknowledge that they're pretty awesome considering how they manage to compete in the same ballpark as Intel with the limited resources at their disposal. I echo the same sentiments.

      There was also an AMD guy who acknowledged that AMD engineers have great respect for Intel engineers (the Intel engineer returned the compliment) and many engineers in both companies know and respect each other. It’s the marketing/upper management guys who often have some disdain for the competitor.

      While I bash AMD [s<]from time to time[/s<] often, it's mainly because I think they need to ditch the monkeys running the show or the stupid marketing people like that clown Adam Kozak who looks funny and had to twist the truth when Bulldozer came out. If you haven't read the blog he wrote back then, consider yourself lucky. Products that generally don't hold up very well to the competition are bad enough, but twisting the truth and being utterly un-classy and insisting that your products are awesome are, well... marketing crap from crappy marketing people.

    • Sam125
    • 6 years ago

    The disparity in R&D costs is because Intel owns a dozen or so fabs. Lithography shrinks are only becoming more expensive, which is why most of the industry has been stuck at 28nm for a long time.

    Also, I have nothing to substantiate this, but I think 2016 and 17 are going to be banner years for AMD. I believe the term is, it’ll be a breakout year. With AMD, I’ve noticed that there’s always a ton of bad news for the company before they do something semi-amazing. 😉

      • NeelyCam
      • 6 years ago

      [quote<][b<]semi[/b<]-amazing[/quote<] I see what you did there

    • Deanjo
    • 6 years ago

    [quote<]AMD's higher spending in years past continues to pay dividends today. The company's GCN GPU architecture came to market in 2011, and its low-overhead Mantle API essentially laid the course for the next generation of graphics APIs like Vulkan and DirectX 12.[/quote<] Not sure how you can say that it pays dividends, literally or figuratively, looking at their financials.

      • sweatshopking
      • 6 years ago

      Their GPU section has been profitable?

        • Deanjo
        • 6 years ago

        That's hard to tell, considering they lump it in with their CPU/chipset numbers (which were in the red again last quarter).

        All signs show them losing market share in graphics as well over the last year, especially since the mining craze is done.

        [url<]http://www.guru3d.com/news-story/nvidia-76-amd-24-gpu-marketshare-q4-2014.html[/url<]

          • sweatshopking
          • 6 years ago

          Sure. I’m not sure, but it is assumed.

            • Deanjo
            • 6 years ago

            You know what they say of “assume”…

            One would assume a 290 would gobble up a GTX-960 by the specs but…..

            [url<]http://www.phoronix.com/scan.php?page=article&item=nvidia-titan-x&num=1[/url<]

      • Klimax
      • 6 years ago

      Their HW outpaced their SW…
