Intel unveils XMM 8160 5G modem for a 2H 2019 arrival

Intel is shaking up its plans for its rollout of 5G modems today. Just under a year ago, the company took the wraps off its XMM 8060 modem. The XMM 8060 promised support for the 5G New Radio (5G NR) standard in both Standalone (SA) and Non-Standalone (NSA) forms, as well as backward compatibility with 2G, 3G (including CDMA), and 4G networks. Today, the company announced that it plans to pull in the launch of its new XMM 8160 modem by more than six months, to the second half of 2019, according to its press release. The XMM 8160 claims support for 5G speeds of up to 6 Gbps.

Past that, though, the XMM 8160 doesn't seem to offer many new capabilities versus those announced for the XMM 8060. It has the same multi-mode chops promised by the 8060 across 2G, 3G, and 4G legacy networks, in addition to support for 5G NR SA and NSA across sub-6-GHz and mmWave spectra. Intel highlights the fact that the XMM 8160's backward and forward capabilities are all wrapped up in a single chip, though, and that could be an important distinguishing point for the blue team.

In contrast, Qualcomm's Snapdragon X50 modem has to piggyback on Snapdragon SoCs that have modems for LTE and other legacy standards baked in to let a Snapdragon smartphone cover all its bases. Intel's purported single-chip approach could be important for companies who want to integrate a complete backward- and forward-compatible modem into products that don't necessarily have LTE support to begin with. That covers most every non-Qualcomm notebook PC sold today, to name just one attractive market that Intel might want to help itself to.

Intel also anticipates that implementing the XMM 8160 and its supporting transceivers in products might be less demanding of board area than other early 5G implementations, an important consideration in smartphones and tablets. The company produced a not-to-scale graphic to demonstrate that a complete XMM 8160 implementation will need just the modem itself, a 5G mmWave transceiver, and a seven-mode RF transceiver to cover sub-6-GHz operation. Whether the company's supporting graphic is meant to show the advantages of the XMM 8160 versus what would have been required to implement the XMM 8060 isn't made clear, however. It's also not clear whether Intel might be illustrating what it thinks would be required for Qualcomm's partners to implement the Snapdragon X50 modem alongside another Snapdragon SoC.

It's hard not to see this graphic as a dig at Qualcomm, however, as that company's recently-introduced QTM052 5G RF transceivers only claim support for the mmWave spectrum, not sub-6-GHz bands. Early Qualcomm 5G phones might need separate RF transceivers for sub-6-GHz 5G operation and other legacy network standards as a result, but until real devices hit the market, we won't know for sure. It is clear that Intel is fudging a bit by leaving the primary SoC of any mobile device that implements the XMM 8160 out of the picture, though, while the legacy modem for Snapdragon devices is already part of the board area occupied by those SoCs. That omission might make Intel's graphic more dramatic than actual implementations will be.

Whatever its implementation details may be, the timeline for the introduction of the XMM 8160 is fascinating on its own. Apple is Intel's largest client for modems, and it regularly releases new iPhones in the fall of most years. Intel's second-half-of-2019 release window for the XMM 8160 could suggest that the first 5G-capable iPhones are coming in that time frame. As with so much else about 5G, we'll just have to wait and see.

Comments closed
    • Unknown-Error
    • 1 year ago

    Intel the SSD company (plus Xpoint)
    Intel the Modem company
    Soon Intel the GPU company.

    What next for Intel?

      • NTMBK
      • 1 year ago

      Intel the 10nm company, hopefully…

        • tipoo
        • 1 year ago

        *By some definitions of 10
        **which is really just bringing them more in line with existing crappy fab naming by everyone else

    • NTMBK
    • 1 year ago

    Sooo… The 8060 is cancelled?

      • DancinJack
      • 1 year ago

      That’s what it feels like to me. 10nm 8160 instead of 14nm or 10nm 8060 because Apple paid a ton of money to get them to bust their butts for 2H 2019.

        • blastdoor
        • 1 year ago

        Where’s the 8086?

    • Srsly_Bro
    • 1 year ago

    Still not going to beat AMD’s Rome. Intel releasing more trash.

      • Shobai
      • 1 year ago

      Well, yeah, except that this has the full rate AVX-512 unit and Rome won’t.

        • chuckula
        • 1 year ago

        What are you talking about! Rome is clearly a superior 5G modem!

      • krazyredboy
      • 1 year ago

      Guys, guys, guys. We all know that Rome finally fell around 476, and it was Romulus’ fault… not Intel!

    • Chrispy_
    • 1 year ago

    Is anyone else underwhelmed by 5G?

    The only place in the world where I’ve seen 4G speeds even remotely close to 4G’s promises was halfway up an Alp in France, with clear line of sight to the mast and very low RF interference up in the mountain wilderness.

    In the real world, it’s uncommon for anyone living in a city to get more than 5% of the promised mobile data speeds. The problem is massive oversubscription, and that’s been my experience in every European city I’ve ever visited, as well as everywhere I’ve been in Japan and my road trips across the eastern quarter of the US.

    When I first joined the 4G network as an early adopter, it was almost empty – I managed to get 18mbit/s of the promised 42mbit/s. Massively inaccurate promises, but 18mbit/s on a mobile device is still decent by today’s standards. Today, however, with almost everyone on the 4G network and 3G relegated to a backup network, my 4G download speeds in the same location on the same network with 4/4 signal bars are just under 1Mbit/s.

    Yeah, I’m getting 2% of what 4G promises and the latency jitter from the overcrowded masts means that it’s unusable for gaming, VoIP, and anything else that requires consistent, low ping times.

    5G will be great, but the contention ratios mean that even 6G or 7G will not manage to reach the promised speeds that even 4G is capable of (because the networks are greedy bastards who don’t buy enough bandwidth and don’t have enough masts to cover population centers adequately).

      • Srsly_Bro
      • 1 year ago

      Mbit*

      Scrub

      • DancinJack
      • 1 year ago

      Though obviously I don’t get this ALL the time, I have hit ~70-90 Mbps on Verizon LTE pretty often. It’s generally closer to 25-40 Mbps (down) most of the time, though.

      I’ve lived in major cities on the east coast (Boston/NYC) and Texas (Austin) and spent a lot of time in Kansas City, San Francisco, and LA. There are occasional slowdowns where I’ll drop down into the single digits, but for the most part Verizon’s LTE has been pretty amazing for me. Even if I have crap signal it’s still entirely usable.

      Seriously though, UNDER 1 Mbps with full bars? There’s something seriously wrong there. Like, it’s hard to believe network operators would let that happen at all.

      • Goty
      • 1 year ago

      It’s also definitely provider/network dependent. I’m sitting in the middle of the Atlanta suburbs buried in an office building and getting 35 Mbps on T-Mobile LTE band 2 (1900 MHz). If I switch over to Sprint on band 25 (still 1900 MHz), I get less than 5 Mbps. Both signals are around -110 dBm.

    • tay
    • 1 year ago

    Hi – I am getting ear wax ads with graphic images of ear wax. Is there any chance TR will reconsider their ad providers? Thank you.

      • cegras
      • 1 year ago

      Same. I’ve turned on adblocker.

      • BiffStroganoffsky
      • 1 year ago

      Ear wax ads would be an upgrade from the political swill I get. Wanna trade?

    • chuckula
    • 1 year ago

    This sucks, it’s not 7nm and Intel will be bankrupt before Epyc 2 even launches in 2 weeks.

    With that out of the way, it’s a sad day: RIP Stan Lee.

    [url<]https://www.npr.org/2018/11/12/106322838/a-marvel-of-a-man-stan-lee-dead-at-95?t=1542055347142[/url<]

      • Gadoran
      • 1 year ago

      Reportedly it is on Intel 10nm. XMM 7670 will be on 14nm.

      By the way, this means 10nm is yielding better now, so Intel can accelerate.

        • chuckula
        • 1 year ago

        Oh Contraire Mon Frair! Haven’t you learned anything?!!?

        [quote<]Intel highlights the fact that the XMM 8160's backward and forward capabilities are all wrapped up in a single chip, though, and that could be an important distinguishing point for the blue team.[/quote<]

        A primitive [b<]SINGLE CHIP[/b<] failure?!!? What kind of crap is this? Hasn't AMD taught Intel anything?!?!

        Until they can get their crap together and produce a 6-chip MCM (that's one chip per-G plus the Holy-of-Holy Miraculous I/O hub that better darn well be on 22nm because we all know I/O can't scale) then I declare these products to be a failure and this just confirms that the iPhone is going AMD.

          • highlandr
          • 1 year ago

          You’re leaking into cellular now?

          Good thing AMD leaves the cellular “droppings” for Qualcomm and Intel; if they so chose, they would obliterate the blue team in every area!

            • chuckula
            • 1 year ago

            Step 1: AMD single-handedly invents the GPU in-house without any outside influences unlike Ngreedia who had to buy out a graphics company for every chip it makes.

            Step 2: AMD single-handedly invents mobile graphics.

            Step 3: AMD sells mobile graphics to Qualcomm because it realizes that having that much POWAR concentrated in just one company would destabilize the Earth’s orbit.

            Step 4: Qualcomm makes 5G modems.

            Conclusion: AMD INVENTED 5G…. CONFIRMED!

        • Goty
        • 1 year ago

        I wonder if the process has improved or Intel has decided to get more revenue out of the process by bringing forward the production of significantly smaller chips?

          • chuckula
          • 1 year ago

          Yes, those idiots at Intel were going to make GIGANTIC 10nm modems but then decided to make the same modem on 14nm… just small.

            • Goty
            • 1 year ago

            Maybe if you’d set aside your little vendetta for a second, you wouldn’t jump to ridiculous interpretations of a completely sensible statement. Given the known yield problems Intel has on 10nm, pulling a small chip like this in on the schedule (as opposed to whatever nonsense you interpreted my comment to mean) would increase the amount of chips produced on said problematic process, creating revenue that would otherwise be lost. If everything was peachy, what would the benefit be to take up valuable wafer starts with something as low-margin as this chip probably is when they could be making high-margin products for the datacenter? I think the fact that Intel is already doing something similar with their dual core i3s just bolsters this argument. They’d probably have been making these already if the design work was ready.

            • DancinJack
            • 1 year ago

            lol @ this nuanced, thoughtful response to Chuck’s trolling

            • Goty
            • 1 year ago

            Is it still trolling if he’s incapable of responding in a mature manner to a serious post? Honestly, he’s no better than the people he’s mocking anymore.

            • Beahmont
            • 1 year ago

            I’ll say it again.

            The problem for 10nm isn’t yields directly, but clock speed and power at the high end. They can get good yields, and have for years, but the damn things won’t clock anything close to acceptable for all but the most low end low power parts. It’s not a size issue with larger chips having more defects, beyond the expected, that’s keeping 10nm down, but that the chips can’t hit their frequency targets with any reliability and in some cases at all.

            If it was a ‘simple’ issue with larger chips not being produced correctly in enough numbers, Intel could and would have solved the issue years ago or just brute forced the issue by burning wafers for limited run products like Broadwell.

            • Goty
            • 1 year ago

            That would make sense (and would obviously derail my hypothesis above), but would that also cause Intel to only ship parts with disabled GPUs? I feel like that points more to defect density issues than clock issues (not that they can’t be having more than one type of problem.)

            • Beahmont
            • 1 year ago

            I don’t know why Intel is disabling the GPU’s, heat issues perhaps? However, I do know that the full chips with a GPU on the chip are around the same size as some of the modems they are planning on making.

            It’s not a perfect comparison as the modems have a small bit of extremely complex logic circuits and then a lot of densely packed low complexity bits and bobs, but it’s around the same total size if not slightly bigger than a CPU because some of the transistor arrangements actually have to be a certain physical size because the physics demand it. However, none of it is very clock sensitive or really creates that much heat.

            That to me points towards clocking yield issues and not towards size yield issues. Though there could also be some kind of unsolved issue with certain transistor setups, like in the GPU, not forming correctly, while other large transistor setups form well enough or even perfectly.

            What I’ve consistently heard since Intel ran into the first problems with 10nm, is that it’s clock speeds and power curve issues, but they are getting the expected defect rate. Who knows what the ‘expected defect rate’ is for 10nm though.

    • DancinJack
    • 1 year ago

    herrrrrrrrrrrrrrrrrrrro iPhone 2019.

      • DancinJack
      • 1 year ago

      lol at you haters. i’m gonna be rocking my iPhone XX with 5G and 27x the performance of your Snapdragon 855, leaving you in the dust. It’s going to be freakin’ magical.

      • tipoo
      • 1 year ago

      If they’re still using Intel modems by 2020, I’m curious when their home grown ones will be out. Modems are hard though, could take some time.

        • derFunkenstein
        • 1 year ago

        I doubt they want to wade into the CDMA water too much. They’ll do their own modems when Verizon and Sprint retire their legacy network.

          • DancinJack
          • 1 year ago

          Honestly they could release a VZW phone that’s all LTE right now. There is enough coverage of LTE to not REALLY make a difference. Sprint, on the other hand, is another story.

      • chuckula
      • 1 year ago

      Why all the Intel shilling?
      Why can’t Apple’s 2019 Miracle Arm chip have the 5G modem built right in??
