Intel moves to achieve deep-learning Nervana with acquisition

Deep learning is one of the buzziest, fastest-growing fields in tech right now. We've long heard about the technology's potential from Nvidia, which has positioned its GPUs as ideal platforms for applications like self-driving cars and neural-network-powered image identification, to name just two. Today, Intel is trying to get a leg up in the game by purchasing Nervana Systems, a deep-learning hardware and software provider, for an undisclosed sum.

Intel says it'll be using Nervana's software to refine its Math Kernel Library, and the company's hardware expertise will also be put to work in the development of Xeon and Xeon Phi chips.

Nervana's hardware page says it's developed a deep-learning application-specific integrated circuit (ASIC) in an interposer package with 32GB of HBM on board. That ASIC also includes six bi-directional proprietary interconnect links that allow clusters of chips to be connected intra- or inter-server for extra speed. The company also makes an SDK called Neon for implementing deep-learning applications. In tandem, these products are meant to provide big speedups for the field. We'll probably hear much more about this acquisition at IDF next week.
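For a sense of what the Neon side of that stack looks like, here's a minimal sketch patterned loosely on the multilayer-perceptron walkthrough in Neon's own documentation, building and training a small fully-connected network on MNIST. Module paths and arguments have shifted between Neon releases, so treat the specifics as illustrative rather than definitive:

# Minimal Neon sketch (illustrative): a two-layer MLP trained on MNIST.
# Module paths follow Neon's 1.x-era documentation and may differ by release.
from neon.backends import gen_backend
from neon.callbacks.callbacks import Callbacks
from neon.data import MNIST
from neon.initializers import Gaussian
from neon.layers import Affine, GeneralizedCost
from neon.models import Model
from neon.optimizers import GradientDescentMomentum
from neon.transforms import CrossEntropyMulti, Misclassification, Rectlin, Softmax

be = gen_backend(backend='cpu', batch_size=128)  # 'gpu' if a CUDA device is available

mnist = MNIST(path='data')
train_set, valid_set = mnist.train_iter, mnist.valid_iter

init = Gaussian(loc=0.0, scale=0.01)
layers = [Affine(nout=100, init=init, activation=Rectlin()),
          Affine(nout=10, init=init, activation=Softmax())]

mlp = Model(layers=layers)
cost = GeneralizedCost(costfunc=CrossEntropyMulti())
opt = GradientDescentMomentum(0.1, momentum_coef=0.9)

mlp.fit(train_set, optimizer=opt, num_epochs=10, cost=cost,
        callbacks=Callbacks(mlp, eval_set=valid_set))
print('Misclassification error:', mlp.eval(valid_set, metric=Misclassification()))

Presumably, part of the appeal of Nervana's silicon is that code written against an API like this could be pointed at the company's accelerators rather than a CPU or GPU backend.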

Comments closed
    • ronch
    • 4 years ago

    Look at that nicely done image of that die and HBM and all. So DEEP LEARNING stuff! /s

    • raddude9
    • 4 years ago

    How about they apply this “Deep Learning” chip to the question of whether these big acquisitions are a good idea or not. I wonder what it would find?

    • Voldenuit
    • 4 years ago

    Care Bear Stare!

    Oh wait, that was Nelvana.

    • Tirk
    • 4 years ago

    Besides the deep learning, another interesting aspect is Nervana’s work with HBM memory.

    I thought Intel largely shunned HBM in favor of the competing HMC memory. Does anyone have more information on Intel’s current stance between HMC and HBM? Does this mean a shift for Intel away from HMC and towards implementing HBM in Xeon and Xeon Phi chips?

      • ImSpartacus
      • 4 years ago

      Yes, I believe Intel shunned HBM for the most part.

      It could be as simple as Intel not finding a “perfect” candidate to acquire.

        • Tirk
        • 4 years ago

        True, it does not necessarily mean they are contemplating a move to HBM. But the article’s statement that “the company’s hardware expertise will also be put to work in the development of Xeon and Xeon Phi chips,” taken together with Nervana’s experience implementing HBM (“Nervana’s hardware page says it’s developed a deep-learning application-specific integrated circuit (ASIC) in an interposer package with 32GB of HBM on board”), has me intrigued about whether that expertise will influence Xeon and Xeon Phi chips moving forward, hence my questions.

      • chuckula
      • 4 years ago

      [quote<]I thought Intel largely shunned HBM [/quote<] More like AMD shunned HMC, which was around first.

        • Tirk
        • 4 years ago

        Nervana is mentioned in the article, Intel is mentioned in the article, HBM as used by Nervana is mentioned in the article, and Nervana’s hardware expertise being put to work on Xeon and Xeon Phi chips is mentioned in the article. Please tell me why it is incorrect to ask questions about what the article actually covers. HMC implementation has seemed expensive thus far, and moving to a potentially more cost-effective HBM solution might be the right move for Intel, hence my question. If you have evidence to the contrary then please let us continue with a civil discussion. Everything I mentioned in my post pertained to what was written in the article, but you……..

        Nowhere in my post or the article was AMD mentioned, and yet somehow, in your infinite bias, you found a way to include an attack against AMD. Good job. Let’s forget that SK Hynix also helped develop HBM and only blame AMD. But let’s use your logic from now on:
        -Fairchild was around before Intel, anyone who used Intel must clearly be using the inferior tech
        -Intel was around before AMD, anyone who uses AMD must clearly be using the inferior tech
        -AMD was around before Nvidia, anyone who uses Nvidia must clearly be using the inferior tech
        -Clearly anyone who owns an Nvidia card shuns the superior technology created at Fairchild, Intel, and AMD.

        I’ll await your announcement denouncing Nvidia as the inferior choice any minute now…….

    • Redundant
    • 4 years ago

    Think the busiest bee could replace the buzz in the opening sentence? Although, I do love me some honey.

    • NTMBK
    • 4 years ago

    I am sure this will go as well as their Infineon, Altera and McAfee acquisitions…

      • WhatMeWorry
      • 4 years ago

      Totally agree. I wonder if they will use Deep Learning to figure out if buying this company was a good idea.

      • tipoo
      • 4 years ago

      Project Offset, RIP 🙁

    • chuckula
    • 4 years ago

    Now we’re spending and acquiring.
    And we don’t know, what we’re buying.

    Crank the buzzwords, and the powerpoints.
    But oh wait yeah, we’re still firing.

    We’re so loud and incoherent
    Boy, this oughta bug your shareholders.

    Yeah!

      • sweatshopking
      • 4 years ago

      doesn’t bug shareholders. they love pointless acquisitions.

      • pranav0091
      • 4 years ago

      How come, of all people, you criticize the big blue, chuck?

    • sweatshopking
    • 4 years ago

    These kinds of markets (cloud, big data, deep learning, etc.) seem like buzzword bubble industries which remind me of the last tech bubble: companies just buying stuff for the sake of being in the industry so they can tell shareholders they’re part of the new hotness buzzword industry.

      • blastdoor
      • 4 years ago

      [url=http://vignette3.wikia.nocookie.net/mst3k/images/2/23/TV%27s_Frank_%26_Dr_Forrester%3B_DEEP_HURTING%21.jpg/revision/latest?cb=20140312214612<]Deep Hurting[/url<]

      • anotherengineer
      • 4 years ago

      Buzzword people

      [url<]https://www.youtube.com/watch?v=YgSPaXgAdzE[/url<]

      • tipoo
      • 4 years ago

      That was a joke we had when putting thought bubbles on a whiteboard. “Look, now we’re a cloud company!”

      • Anonymous Coward
      • 4 years ago

      I can assure you that “cloud” computing is not only real, but it is excellent.

      As for machine learning, there is a real problem with information overload, and anyone who can extract something of value from it has an edge, maybe even a fat profit. Machine learning is also, so far as I know, the only feasible approach to certain types of problems.

        • Waco
        • 4 years ago

        Unless you care about data integrity, and then you’ll never trust the “cloud”. 🙂

          • Anonymous Coward
          • 4 years ago

          Proper design goes a long way towards data integrity, perhaps it goes the whole way; I don’t work for a bank. The rate of advancement in the cloud, the remarkable flexibility, the ease of achieving something that closely approximates 100% uptime: these are all very compelling. I have seen that low cost is not necessarily something the cloud offers to all customers, though with its great flexibility, there are ways. The scale of the cloud computing platforms is warping the economics of server design and network design; it will be interesting to see how far it goes. Right now the trend is going very much against in-house server rooms.

          Also, it’s not simply a matter of switching from VMware and SAN to some drop-in replacements. Teams who get into it can run major applications without managing a single server directly… and probably show greater uptime and security than an average team would achieve running in-house.

            • Waco
            • 4 years ago

            I know of no system available that guarantees data integrity at scale for the entire dataset. No currently available system is secure enough for long-term storage when your single dataset is in the petabyte range, let alone larger.

            I’d love to be proven wrong, it’d make my job a lot easier. 🙂

            • Anonymous Coward
            • 4 years ago

            OK well also if you have PB-scale data, the economics might favor storing it yourself. 🙂

            • Waco
            • 4 years ago

            Considering nobody (Cloud/storage vendors) seems to know how to do it, that’s exactly what my job is. :P. We’re even to the point of designing and deploying our own filesystem to do it…

            • Anonymous Coward
            • 4 years ago

            Hmm, when you need to implement a filesystem to ensure data integrity, you’re a corner case. You also have to be quite confident in yourself to trust your own filesystem to a greater extent than the existing implementations. (Or at least be prepared to do data recovery yourself.)

            • Waco
            • 4 years ago

            MarFS, if you want to read about it. Simpler compared to almost any full FS, soon to have its own erasure embedded within. Trusts the lower tier, currently, but that’s not enough.
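
            For anyone unfamiliar with the erasure idea mentioned here, below is a toy single-parity sketch in Python; it is generic RAID-5-style XOR parity for illustration only, not MarFS's actual erasure scheme. Lose any one chunk and the XOR of the surviving chunks plus the parity rebuilds it:

            def split_chunks(data: bytes, k: int) -> list:
                """Pad data to a multiple of k and split it into k equal-size chunks."""
                chunk_len = -(-len(data) // k)  # ceiling division
                data = data.ljust(chunk_len * k, b"\x00")
                return [data[i * chunk_len:(i + 1) * chunk_len] for i in range(k)]

            def xor_parity(chunks) -> bytes:
                """Parity chunk: byte-wise XOR across all chunks."""
                parity = bytearray(len(chunks[0]))
                for chunk in chunks:
                    for i, b in enumerate(chunk):
                        parity[i] ^= b
                return bytes(parity)

            # Stripe some data across four chunks plus one parity chunk.
            chunks = split_chunks(b"object data striped across storage targets", k=4)
            parity = xor_parity(chunks)

            # "Lose" chunk 2, then rebuild it from the survivors plus the parity.
            survivors = [c for i, c in enumerate(chunks) if i != 2]
            rebuilt = xor_parity(survivors + [parity])
            assert rebuilt == chunks[2]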

        • PBCrunch
        • 4 years ago

        I took a machine learning (ML) class in school. It is tough to wrap your head around, and tougher to explain, but apparently it has many uses. A manager from a local company that performs these tasks for other businesses explained three current uses:

        1. When people obtain health services, the services are often billed to multiple insurers. The company applies ML algorithms to incoming bills to flag bills with certain combinations of features for further investigation. The ML algorithm reduces the number of bills for investigation.

        2. Automated phone systems for cable companies and the like are very annoying, but they save companies a lot of money. The company uses ML algorithms to reorganize the order of options and the flowcharts of the phone prompt systems.

        3. Outbound telemarketers like to call people when they are at home (or work). ML algorithms can assist companies in timing their outbound calls to actually reach the intended person.

        Yes, two of these three uses for ML are actively making the world a worse place. It was interesting to hear three very different current, real-world uses for this technology.

        For our class project, we used ML algorithms and the leaked Patreon campaign data to identify Patreon campaigns likely to produce NSFW content. The algorithm couldn’t do a very good job of picking the accounts that ~were~ likely to produce NSFW content, but it did an EXCELLENT job of picking out accounts not worth investigating. Like 99.99% accuracy in picking out accounts not worthy of further scrutiny. That was with a pretty small data set in the grand scheme of data mining and ML.

        For a project in another class I tried to use clustering to attempt to identify proteins encoded in the nuclear DNA that would be localized to mitochondria based on codon usage patterns. I am sad to say that the technique I chose did not work at all. There are many examples of ML algorithms used within the life sciences to come up with some interesting results.

        The small-time work I was doing for class projects didn’t really need any special hardware, but I can definitely see how someone working with the browsing habits of the entire American public might want some special chips to chomp all that data. </tinfoil>
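
        As a rough illustration of the flag-versus-ignore setup described above, here is a small scikit-learn sketch on synthetic data; it is not the commenter's actual code, features, or dataset. With a heavily imbalanced label, even a simple model tends to look excellent on the "not worth investigating" class while struggling on the rare positive class, which is the pattern described for the Patreon project:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import classification_report
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for campaign features (e.g. goal, patron count, keyword score).
        n = 20_000
        X = rng.normal(size=(n, 3))
        # Rare positive class (a few percent), loosely tied to the third feature.
        y = (X[:, 2] + rng.normal(scale=2.0, size=n) > 5.0).astype(int)

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.25, stratify=y, random_state=0)

        clf = LogisticRegression(class_weight="balanced").fit(X_train, y_train)

        # Per-class precision/recall: the majority "ignore" class is easy to get right,
        # while the rare "investigate" class is where the model struggles.
        print(classification_report(y_test, clf.predict(X_test), digits=3))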

          • Anonymous Coward
          • 4 years ago

          Sad to hear that clever people are tasked with applying machine learning to such garbage as phone menus and telemarketing. Spammers stay at the cutting edge I guess.

          I imagine there are both government and non-government groups that possess a huge browsing-habit dataset they’d like to crawl. “Big Data” seems to usually involve parsing web logs, often without connection to machine learning.

          Machine learning has also shown up in fraud detection, image recognition & visual search, and AI in general.

          Apparently GPUs feature prominently in machine learning these days, as does… the cloud.

      • ronch
      • 4 years ago

      Totally agree. Everyone just seems to be going crazy looking for the next big thing. They scramble to snap up every possible acquisition just so they can feel good about keeping up with the latest trends, and then a few years from now they shut down the operation because it was all hype and there wasn’t enough demand for it. In other words, it wasn’t the next big thing after all.
