Microsoft catapults datacenter performance with FPGAs

Gamers and content creators are not alone in their frustration with the last decade's slowing increases in CPU performance. Massive datacenter operators like Microsoft are coming up with new ways to increase server performance using unconventional hardware. Microsoft in particular is upgrading 34 datacenters around the world with field-programmable gate arrays (FPGAs) in an operation the software giant calls "Project Catapult."

Microsoft says its researchers have been planning these upgrades since 2010 and have done pilot rollouts since 2012. The company says it evaluated GPUs and application-specific integrated circuits (ASICs) as well as FPGAs in its acceleration research. Microsoft claims its FPGA deployment strategy results in a performance increase of an order of magnitude compared to CPUs, with a comparatively modest 30% increase in cost and 10% increase in total power consumption. Wired reports a 40-fold performance increase compared to a CPU when running Bing algorithms.
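
Taken at face value, those figures describe a gain in efficiency as well as raw speed. Here's a quick back-of-the-envelope sketch in Python using only the numbers above, with the CPU baseline normalized to 1 and "an order of magnitude" treated as a flat 10x for simplicity:

```python
# Back-of-the-envelope check using only the figures quoted above;
# the CPU baseline is normalized to 1 for performance, cost, and power.
fpga_speedup = 10.0   # "an order of magnitude" over CPUs
cost_factor = 1.30    # 30% higher cost
power_factor = 1.10   # 10% higher total power consumption

perf_per_dollar = fpga_speedup / cost_factor   # ~7.7x the CPU baseline
perf_per_watt = fpga_speedup / power_factor    # ~9.1x the CPU baseline

print(f"performance per dollar: {perf_per_dollar:.1f}x")
print(f"performance per watt:   {perf_per_watt:.1f}x")
```

By that rough math, the deployment buys roughly 7.7x the performance per dollar and 9x the performance per watt of a CPU-only server.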

Microsoft's "acceleration fabric" deployment strategy calls for the distributed integration of FPGAs into nearly every new box in its datacenters rather than a concentration of many FPGAs into a smaller pool of servers. For that purpose, Microsoft is using Stratix V D5 FPGA daughtercards from Altera inside Intel Xeon-based machines. Intel completed its purchase of Altera for $16.7 billion in cash and shares at the end of December 2015. Microsoft researchers are surely drooling at the prospect of using rumored Xeons with on-package FPGAs in the future.

Comments closed
    • moog
    • 4 years ago

    I really like the daily Bing-powered desktop photos.

    • odizzido
    • 4 years ago

    Neat. I bet intel execs cry when people look for alternatives or don’t upgrade because they don’t/can’t provide significant performance increases anymore.

      • MathMan
      • 4 years ago

      Microsoft is using Altera FPGAs. Altera is owned by Intel.

        • emredjan
        • 4 years ago

        Even more, Microsoft’s interest in FPGAs was the main reason Intel decided to buy Altera.

        So, no, Intel execs don’t cry; they snatch up the other providers so that the alternative is also Intel.

    • Chrispy_
    • 4 years ago

    All this effort into research and compute, but why do Bing search results still suck the ass of a thousand camels?

      • just brew it!
      • 4 years ago

      But now you can get your sucky results 40x faster!

        • morphine
        • 4 years ago

        Philosophical question: if you have a million computers producing a million different sets of terrible Bing search results, will they eventually combine to a single good one?

          • Ninjitsu
          • 4 years ago

          Two wrongs don’t make a right, but a million wrongs make Bing.

            • alrey
            • 4 years ago

            product of 2 negatives is positive so they should put FPGAs in even numbers 🙂

        • CuttinHobo
        • 4 years ago

        40x faster *or* the asses of 40,000 camels instead of a mere 1,000. Choose wisely.

          • Chrispy_
          • 4 years ago

          That’s a lot of camel ass.

            • Redocbew
            • 4 years ago

            If I was working on Bing and for that reason responsible for the sucking of 1,000 camel asses I think I’d want to make it go 40x faster also.

      • sweatshopking
      • 4 years ago

      Given your views of ms generally, I can’t imagine Bing could ever make you happy.

        • Srsly_Bro
        • 4 years ago

        Sup. You done traveling the world and hosting weddings?

          • sweatshopking
          • 4 years ago

          Yeah. Back in NS for now. Should be here for the winter, then idk where.

        • Chrispy_
        • 4 years ago

        I’m generally disappointed with Microsoft, it is true.

        If they stopped being disappointing, I would be more enthusiastic in my comments and perhaps use Auxy-like Emojis.

      • Krogoth
      • 4 years ago

      Bing is MSN search re-branded.

      Secondly, Bing is almost as good as Google as far as search engines are concerned. Google has a far superior web suite though.

      • EzioAs
      • 4 years ago

      I don’t know what problems you’ve had but I’ve been using Bing for quite some time (a bit over a year, probably) and it’s fine by me. Can’t really say I miss Google Search since I like Bing’s Image and Video search better.

        • sweatshopking
        • 4 years ago

        Yeah, I ditched google years ago too. Bing is perfectly capable.

        • RAGEPRO
        • 4 years ago

        I mean, the thing with Google is that you can put in wildly tangential queries and have the very first result be exactly what you were looking for. It’s uncanny sometimes, and I rely on it.

        When Bing can provide that, I might start using it. Bing frequently fails to give me the correct results with a very specific query, and it either has a lot more obnoxious paid results or more of them get through my filters. Either way, the experience of using Bing is typically tedious at best and an active waste of my time at worst.

          • sweatshopking
          • 4 years ago

          It doesn’t search the same way as google. You have been trained to search a specific way over the last decade by google, and you’re right, Bing doesn’t provide results in the same way. As for more paid ads, not sure what that means. And don’t you use an ad blocker and white list anyway?

            • Redocbew
            • 4 years ago

            You’re searching it wrong.

        • Rza79
        • 4 years ago

        I feel Google and Bing really complement each other nicely.
        I like the general Google search more but I prefer the picture and video search of Bing, especially since you can’t turn off SafeSearch anymore in Google.
        Google’s image search is amazing, though.
        So I use both.

      • divide_by_zero
      • 4 years ago

      Psshh. Like they even need a single one of these for all of Bing’s traffic combined.

        • BurntMyBacon
        • 4 years ago

        Where do you think all the Cortana feeds are being processed? Note: the user doesn’t necessarily need to be actively using Cortana for Cortana to make an inquiry … preemptively … based on something Cortana thinks you may want given everything else Cortana has been monitoring you doing.

    • TwistedKestrel
    • 4 years ago

    I don’t suppose we’re ever going to get a more specific description of the roles of these FPGAs, and why they are much more suited to whatever that workload is?

      • TheRazorsEdge
      • 4 years ago

      FPGAs are programmable. They are inherently more suited to the customer’s purpose because the customer configures them.

      The trade-off is lower efficiency than dedicated ASICs. However, if your software and electrical engineers can use that flexibility, then FPGAs can be better in the long run.

    • WhatMeWorry
    • 4 years ago

    So I can now get unsatisfactory search results from Bing 40x faster?

      • Wirko
      • 4 years ago

      On the contrary. You get 40x as many in the same amount of time.

    • xeridea
    • 4 years ago

    They don’t have a comparison to GPUs. For crypto, anyway, FPGAs weren’t any faster than GPUs, though they did use less power. FPGAs cost a lot more than GPUs, though. They say they are 10x faster than CPUs, but GPUs will be at least that much faster, and in some cases 100x+ faster than CPUs. Of course it is highly dependent on workload. So without a GPU comparison, who knows if it is really better?

      • sweatshopking
      • 4 years ago

      "So without a GPU comparison, who knows if it is really better?" Probably Microsoft’s engineers who did the tests and the rollout?

        • xeridea
        • 4 years ago

        Well sure, but they don’t compare it in the report, so what good is them knowing if they don’t publish it? It is not that hard to be faster than a CPU.

          • sweatshopking
          • 4 years ago

          I’m sure that information is kept internally

      • chuckula
      • 4 years ago

      GPUs are great at doing embarrassingly parallel number crunching, although as coin-of-the-week miners can attest, once an ASIC for the coin algorithm is developed, the ASIC wins every time.

      Search engine algorithms — while certainly exhibiting parallelism — are not necessarily in the category of embarrassingly parallel number crunching. On top of that, as mentioned in the article, building an ASIC for a complex algorithm that changes multiple times a year isn’t a winning proposition. In this case, the FPGA gives the right level of flexibility to attack the problem while retaining the reprogrammability needed to update for the next iteration of the algorithm.

      • Pitabred
      • 4 years ago

      Even if GPUs did it 1000x faster, the FPGA is only a 10% power increase for 10x performance (assuming linear scaling, 100x performance for a doubling of power budget). GPUs at 100x are still a 200%+ power increase over a CPU alone, and power is a BIG concern in a datacenter.

      Point being, even if it’s not the absolute fastest compared to a GPU, it is by far a more efficient use of power, which is the name of the datacenter game.

        • xeridea
        • 4 years ago

        So if the FPGA is 10x faster at 10% more power, it is ~9x better than a CPU in performance per watt.

        If a GPU is 100x faster at 100% more power, it is 50x better than a CPU, or ~5x better than the FPGA.

        You could also use a smaller GPU, so it would be 50x faster at the same power consumption, while the FPGA is 10x faster at similar power consumption.

        It is all highly dependent on workload. There is a tradeoff of developer time, though: it is much quicker to write OpenCL or CUDA than it is to program an FPGA.

      • TheRazorsEdge
      • 4 years ago

      The benefit of FPGAs is that they are reprogrammable. If you suddenly need 2X as many dedicated SSL-acceleration blocks, you simply push the config. Or if you no longer need it, you can repurpose those circuits on the fly.

      GPUs are really good at a very specific type of work. If you need anything beyond that, well, too bad for you.

      ASICs and FPGAs are both customizable, with the notable difference that FPGAs can be recustomized after they’re put into production. That’s a huge strategic advantage in exchange for some efficiency (a well-designed ASIC will always outperform a comparable FPGA).
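
      To make the "push the config" point concrete, here is a minimal, purely hypothetical sketch; the bitstream file names, the flash() helper, and the demand numbers are all invented for illustration and do not correspond to any real Altera or Microsoft API:

      ```python
      # Hypothetical sketch: retask FPGAs toward the busiest workload by
      # flashing a different bitstream. All names here are invented.
      BITSTREAMS = {
          "ssl_offload": "ssl_offload_v3.rbf",  # invented file names
          "ranking": "bing_rank_v7.rbf",
      }

      def flash(fpga_id, bitstream):
          """Stand-in for a real reconfiguration call."""
          print(f"reprogramming FPGA {fpga_id} with {bitstream}")

      def rebalance(fpgas, demand):
          """Point every FPGA at the workload with the highest demand."""
          busiest = max(demand, key=demand.get)
          for fpga in fpgas:
              if fpga["role"] != busiest:
                  fpga["role"] = busiest
                  flash(fpga["id"], BITSTREAMS[busiest])

      # When SSL traffic spikes, a ranking FPGA gets repurposed on the fly:
      rebalance([{"id": 0, "role": "ranking"}], {"ssl_offload": 9, "ranking": 2})
      ```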

        • xeridea
        • 4 years ago

        GPUs already give massive acceleration to a variety of workloads. They aren’t quite as flexible as CPUs, but they are far from limited. Surely there will be many cases where an FPGA is better, with enough of an efficiency gain that it is worth the additional programming time; I am just saying that without a GPU comparison the article is kind of useless. We already know CPUs aren’t the fastest, but GPUs already offer a giant leap in performance.

        It’s like saying LED is hands down better than incandescent without mentioning CFL. Yes, it is a lot better than Edison’s bulbs, but it’s not that much more efficient than CFL, so the added cost isn’t worth it for many (LEDs are a lot cheaper now, but for many years CFL was still the most cost-efficient).

          • Srsly_Bro
          • 4 years ago

          You made the assumption the task can be run on GPUs.

          It’s like saying it can without knowing if it is able.

      • flip-mode
      • 4 years ago

      "The company says it evaluated GPUs and application-specific integrated circuits (ASICs) as well as FPGAs in its acceleration research." Microsoft did the comparison you speak of. Looks like GPUs lost.

      • superjawes
      • 4 years ago

      Given a specific task, the right FPGA will ALWAYS be better than a GPU or CPU. You program an FPGA to perform a single task at the greatest efficiency. However, since you have the ability to reprogram the FPGA for a different task later, the *cost* of the FPGA will be greater. (An ASIC would also have the greatest efficiency at a lower cost than the FPGA, but you lose the ability to reprogram later.)

      IOW, adding GPUs to these datacenters *might* have offered a better performance increase than adding CPUs, but MS determined the FPGAs offered the best value in terms of efficiency and utility (utility to change the circuits later).

      The benefit of CPUs and GPUs is that you can perform many different tasks without ever having to change the circuits. Again, they might not be as efficient as an FPGA, but they are more flexible.

      EDIT: also, what flip-mode said about MS looking into GPUs as an option.

        • xeridea
        • 4 years ago

        Flexibility depends on definition. Yes, you can program them to do whatever you want, but at a greater labor cost. I am not disagreeing that an FPGA would have a big benefit in many situations; I am just saying that without a comparison to GPUs, the article is of little use.

        An FPGA isn’t going to be better than a GPU at everything. You don’t see FPGAs used for deep learning, for instance, and some tasks are too complex, or change too often, to reasonably adapt to an FPGA.

          • Rza79
          • 4 years ago

          I don’t understand you. This is not a review. It’s just MS saying that they started to use FPGAs in their datacenters after 6 years of research, not an attempt to inform the casual consumer of what they researched or how GPUs compare to FPGAs.
          You have no clue what the accelerated ‘Bing algorithms’ are or whether they can even be accelerated on a GPU. You don’t know how parallel they are.

            • xeridea
            • 4 years ago

            If a major contender is ignored, it isn’t a good assessment. Like when Intel said their tech was super awesome at deep learning but didn’t mention they were comparing to technology from 2 years ago, and Nvidia had to correct them. Surely in 6 years they did some GPU testing and have stats on it, but they are choosing to ignore it. MS has been known to have skewed results from their “studies”. I am not saying one or the other is better, just that it is fishy, or they are trying to make it seem better than it is.

            • superjawes
            • 4 years ago

            They gave a comparison to CPUs. That is literally all you need to understand why they are deploying a non-CPU solution in datacenters. Mentioning GPU results wouldn’t do anything but add noise to the announcement.

            • xeridea
            • 4 years ago

            I don’t see how it would add noise. Typically when comparing tech, details and methods are given. Giving the results would mean they were more thorough. Obviously FPGAs can give a speedup; that is kind of the whole point of them, and we knew that ages ago, so their announcement is old news.

    • DPete27
    • 4 years ago

    (Looking for RGB LEDs) Is it that green strip on the right?

      • willmore
      • 4 years ago

      That looks like a 40 character x 2 line character cell LCD display. This card is clearly a developer board and not a production one. Unless Microsoft suggests that servers normally have their cases off so that people can push buttons on them and read the little LCD display.

        • just brew it!
        • 4 years ago

        Yup, I imagine that’s just a stock Altera photo of the developer kit hardware they used to prove out the original concept.

        Edit: Confirmed. Exact same photo appears here: https://www.altera.com/products/boards_and_kits/dev-kits/altera/kit-sv-gx-host.html (scroll down about halfway)

    • techguy
    • 4 years ago

    "Intel completed its purchase of Altera for $16.7 in cash and shares at the end of December 2015." $16.7! Man, I should've gotten in on that action...

      • morphine
      • 4 years ago

      Apologies for that error. I was thinking of Samsung Note 7 prices on eBay.

        • Srsly_Bro
        • 4 years ago

        Joke is on him paying the taxes on the gain in a bargain purchase!

        • superjawes
        • 4 years ago

        Burn!
