Rumor: Intel 300-series chipsets pack USB 3.1 and Wi-Fi support

Benchlife is showing off another purported Intel slide with some pretty juicy details about that company's next desktop chipset series. The image appears to describe the upcoming 300-series chipsets intended for use with the chip giant's "Cannon Lake" processor family. Most notably, the chips are rumored to include support for USB 3.1 (10 Gbps) and 802.11ac Wi-Fi.

Source: Benchlife

Aside from the two new features, the rumored 300-series chipsets look to be pretty similar to existing 200-series offerings. That means they include connectivity for up to 24 lanes of PCIe 3.0, up to 10 USB 3.0 ports, and six 6Gbps SATA ports. Benchlife goes on to say that much like previous generations, there will be a range of chipsets in the 300-series including the whole alphabet soup of Z, Q, H, and B-series models with progressively-reduced capabilities. If this info checks out, you can bet that Wi-Fi will be one of the first things to go as you move down the product stack.

The site also says that Intel's next Core chips will again be built on 14-nm technology, something we heard earlier this year. According to that earlier report, it's possible that Cannon Lake desktop processors could come out on a 14-nm process, while mobile processors use a 10-nm process. The new rumors today pertain to the desktop chipsets ("-S" platform), but the benefits of integrating Wi-Fi into the mobile chipsets should be obvious. Benchlife says the desktop Cannon Lake processors are scheduled for release in Q4 of this year.

Comments closed
    • DavidC1
    • 3 years ago

    One of the biggest reasons Intel platforms can’t compete with ARM ones in mobility is due to lack of integration. This is a very welcome change. They are one of the top quality makers of ethernet and WiFi modules anyway. Their software support is also great. They don’t get obsolete after 3 updates. I use an Intel WiFi module in my desktop. This is a non-marginal BoM and power reduction.

    If you are complaining of the lack of upgrades in laptops, think of a Tablet. I am pretty sure the needs of those that want upgradeable laptops will be better served by making a purpose built one rather than making everything discrete. Integration is inevitable for a device that follows Moore’s Law. You need to put the extra space to use.

    In the big picture though, we are just arguing only our own opinions are correct. Actually, that’s not the case. Your usage scenarios just happen to be unique that’s all.

    • willmore
    • 3 years ago

    This may make sense for laptop chipsets. Do people use wifi for desktop machines?

    If Intel wants to do something nice for desktop chipsets, how about a 10GigE port?

      • Krogoth
      • 3 years ago

      10 Gigabit Ethernet and beyond has several problems going against it. It is a PITA to transmit data reliably over copper at significant distances. Cat5e cable doesn't cut it and CAT6 cabling is only good for short runs. That's part of the reason why it remains enterprise/datacenter tier. The masses prefer wireless over wired Ethernet, so mainstream demand is more focused on wireless solutions.

      2.5Gbps/5Gbps a.k.a. 802.3bz (N-Base-T) was designed as a compromise (it still works with existing CAT5e/CAT6) but it was only ratified back in late 2016, so you will not see any significant SMB adoption until the later part of 2017 into 2018. Mainstream adoption will follow after that.

        • willmore
        • 3 years ago

        How is 100m not long enough? Who's going to have runs longer than that in their house? I think my longest run is barely 20m. Sure, in office environments you may need longer runs, but then again, in those environments you can pull cable and put switches wherever you need them.

        For the few longer links you need, pull fiber.

          • Krogoth
          • 3 years ago

          CAT6 can only do 10Gbps up to ~55m, and that's assuming the cable run is done right; in the real world, expect ~40m.
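
          For reference, a minimal sketch of the nominal reach figures usually quoted for twisted-pair Ethernet. These are spec values (the ~40m real-world number above is more conservative), so treat them as ballpark only.

[code]
# Nominal (spec) reach in metres for twisted-pair Ethernet, by cable category.
# Ballpark reference values only -- real-world runs depend on cable quality,
# termination and crosstalk. None = not rated for that speed.
copper_reach_m = {
    "1000BASE-T (1 Gbps)":  {"Cat5e": 100,  "Cat6": 100, "Cat6a": 100},
    "2.5GBASE-T (802.3bz)": {"Cat5e": 100,  "Cat6": 100, "Cat6a": 100},
    "5GBASE-T (802.3bz)":   {"Cat5e": None, "Cat6": 100, "Cat6a": 100},
    "10GBASE-T":            {"Cat5e": None, "Cat6": 55,  "Cat6a": 100},
}

def max_run(speed: str, cable: str):
    """Return the nominal maximum run in metres, or None if the cable isn't rated."""
    return copper_reach_m[speed][cable]

print(max_run("10GBASE-T", "Cat6"))    # 55
print(max_run("10GBASE-T", "Cat6a"))   # 100
[/code]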

            • hansmuff
            • 3 years ago

            Having my servers and router in the middle of the house, I can do that. Bring it, Intel!

            Seriously though, I could use that bandwidth for Remote-effin-Desktop. I use it when I work from home; the laptop screen is tiny and shitty and on RDP I can use my two 24″ screens. But even at 24bit color it’s a little laggy. I would not mind a 6-9x improvement.

            But for that, my laptop also would need a 10Gbps connection. I guess I’ll check again in 5 years.
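
            Rough math on why, as a back-of-the-envelope sketch (assuming 1920x1200 panels at 60 Hz, which is my guess, not a measured figure):

[code]
# Back-of-the-envelope: raw (uncompressed) framebuffer bandwidth for remote
# desktop across two screens. Resolution and refresh rate are assumptions.
screens       = 2
width, height = 1920, 1200
bits_per_px   = 24
refresh_hz    = 60

raw_bps = screens * width * height * bits_per_px * refresh_hz
print(f"uncompressed: {raw_bps / 1e9:.1f} Gbit/s")   # ~6.6 Gbit/s

# A 1 Gbit/s link carries a fraction of that, so RDP compresses hard (hence
# the lag); a 10 Gbit/s link could carry the raw stream with headroom.
print(f"1 GbE deficit: {raw_bps / 1e9:.1f}x over capacity")
print(f"10 GbE headroom: {10e9 / raw_bps:.1f}x")
[/code]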

            • curtisb
            • 3 years ago

            Apparently N-Base-T for 2.5Gbps and 5Gbps has the same length limitation as 10Gbps on CAT6.

            • Andrew Lauritzen
            • 3 years ago

            But again, even 40M is *plenty* for almost all households. And I’m not sure I agree with the notion that “in real world, expect 40M” with CAT6… in the “real world” even the Cat5e in my house walls works fine @ 10Gbit.

            Granted you never know, but I’ve found in practice that even decent quality Cat5e gets you further than you’d expect, and if you’re running new cables Cat6a isn’t really much more expensive.

            The reason you’re unlikely to see this on consumer boards is that 10Gbase-T is still $100/port or more… it’s not clear that’s going to come down much either since it is both fairly complex and there’s no real mass market pressure.

            I do agree that the consumer trend is waaaaay more towards Wifi though and people would rather spend $500+ on a Wifi access point to maybe get a tiny bit more performance/range than do anything involving wires. Sorta sad (especially for gamers for whom wifi reliability and latency can be noticeable) but just have to accept it.

        • Lemonsquare
        • 3 years ago

        What? Use Cat 6A cabling, which is still relatively affordable and good for 100-metre runs at 10G. It remains enterprise because whose home-consumer Internet package from their ISP includes a 10 Gigabit option?

        I use an ASUS PCI-E WiFi card in my desktop at home out of necessity, and it works flawlessly compared to wired. I live in a poorly maintained '30s-era building where electrical outlets are extremely scarce and the electrical system is too poor for EoP, the room/furniture layout puts the desktop awkwardly far from the ISP's box and router for cabling, and since it's a rental, I'm not running cable through the walls.

      • MOSFET
      • 3 years ago

      [quote]Do people use wifi for desktop machines[/quote] I hope not, for their sake. I know I don't. This would be a great place to insert that wired 10G, or even the 5G and 2.5Gb/s vaporware.

        • EzioAs
        • 3 years ago

        I still use it and I don’t see any problem with it based on my usage. 🙂

        • Krogoth
        • 3 years ago

        5Gb/2.5Gb isn’t vaporware. The bloody standard was just finalized a few months ago for goodness sake. It takes time for network guys to engineer and develop new NICs, switches and routers that can support it.

      • Ninjitsu
      • 3 years ago

      yeah, they do, sometimes it’s not possible to get a wired connection.

      • NTMBK
      • 3 years ago

      Plenty of people live in rented flats/houses where they can't just go drilling holes in walls and installing ethernet. And plenty more people simply don't [i]want[/i] to go drilling holes in their walls. If you don't play low-latency online games, I honestly don't see the need for a wired connection these days. Wifi works just fine.

      • Andrew Lauritzen
      • 3 years ago

      I don’t use it for anything serious but it’s still useful to have as an option if I move machines somewhere, etc. And bluetooth is useful, even on a desktop. I generally wouldn’t pay extra for it but I’ve found that having wireless stuff “just there and working” is one of the nice things about NUCs.

        • MOSFET
        • 3 years ago

        I definitely wasn't saying I don't use wireless, just not on a [b]stationary desktop machine[/b]. If you do and it works for you, fantastic. I move my NUC around a lot, so I don't consider it stationary, probably just me. Laptop, tablet, phone, sometimes NUC -- all great candidates for wireless.

    • hasseb64
    • 3 years ago

    No buy!

    • not@home
    • 3 years ago

    I am surprised it took Intel this long to integrate Wifi into their chipsets. I would have thought that would be the first thing when they were on their “make everything smaller so it fits in a phone or tablet” kick.

    • End User
    • 3 years ago

    Consumers don’t need USB 3.1 Gen 2. USB 2.0 is all consumers need. Leave USB 3.1 Gen 2 for the professionals.

      • RAGEPRO
      • 3 years ago

      As a consumer, I absolutely do need USB 3.1. Thanks for telling me what I need, though.

        • End User
        • 3 years ago

        I totally agree with you. 😛

          • RAGEPRO
          • 3 years ago

          Ahh, my bad. Stressed out and I missed the sarcasm.

          Ironically if you had said 3.0 instead of 2.0 I could get behind that idea, except that “leaving things to the professionals” is crap and should never happen, haha.

        • derFunkenstein
        • 3 years ago

        What's funny is I'm about 99% positive that he's joking. Since he's (apparently) telling other people to throw away entire PCs in some [url=https://techreport.com/forums/viewtopic.php?p=1348014#p1348001]forum threads[/url], however, I'm not totally at 100%. It's possible he's serious.

          • End User
          • 3 years ago

          Well, ronch is using a 125 W TDP CPU to play YouTube videos when all you need is an ARM CPU these days.

          As far as my consumer/pro post above goes, it's me having fun with the absurd suggestion made by another that only "pros" need high-bandwidth connections. As the downvotes indicate, that is not a popular viewpoint.

            • derFunkenstein
            • 3 years ago

            It’s not as if a high-power CPU is drawing its full load at all times or something.

            • End User
            • 3 years ago

            It’s amazing to me that something half the size of a hockey puck can replace a PC. Doubly so when you consider the Chromecast delivers 4K content.

            There is a reason I have an i7-920 build sitting around doing nothing.

      • LostCat
      • 3 years ago

      Ahh USB 2.0, back when it was Useless Serial Bus (except for the mouse and keyboard.)

      • smilingcrow
      • 3 years ago

      Are you implying that USB 1.0 was inadequate in some way?

        • ImSpartacus
        • 3 years ago

        Are you implying my usb version?

      • ImSpartacus
      • 3 years ago

      I just want to commend your balls. Sarcasm tags are for suckers. Stay strong!

      • Vhalidictes
      • 3 years ago

      Professionals don’t need USB 3.1 Gen 2 either, because they use Thunderbolt. It has better throughput to the Unicorn adapter, and “time is money, friend”!

        • End User
        • 3 years ago

        Hmmmm. There is no mention of Thunderbolt 3 support. That is depressing (I’m being totally serious).

        Unicorn adapter? Thunderbolt 3 uses the Type-C connector.

      • Krogoth
      • 3 years ago

      Pretty much true for mainstream-tier peripherals. The only thing that made USB 3.0 mainstream was external HDD enclosures. There's still no killer mainstream app that makes USB 3.1 Gen 2 useful for the masses.

      Screw Apple and their shitty strong-arming of the USB-IF group into making the USB 3.1 spec a mess of confusion that marketing types love. It is parallel SCSI (tons of specs with different speeds and limitations) back in its heyday again. They set back USB 3.1 adoption by at least five years.

        • End User
        • 3 years ago

        There it is!! The downvotes were worth it.

        Can you give me more info on your Apple and the USB-IF comment? And by info I mean facts. And by facts I mean published work.

          • Krogoth
          • 3 years ago

          It is no secret that Apple used its massive clout to “persuade” USB-IF to publish different USB 3.1 standards for its Macbook series back in the early 2010s. Apple wanted to mask the fact that their ports couldn’t handle 20/10Gbps despite the USB 3.1 name stamp on them.

            • End User
            • 3 years ago

            Proof?

            • the
            • 3 years ago

            USB 3.0 speeds in the USB-C connector. The Type C connector was introduced in the USB 3.1 spec. Hence USB 3.1 Gen 1 became a thing.

            This arcane configuration was because Apple needed a small connector to fit the MacBook chassis. USB Type A was too big.

            • End User
            • 3 years ago

            How do USB 2.0 Type-C connectors fit into your conspiracy theory?

            Are you suggesting that Apple forced the awesome reversible Type-C connector on the world? Where is the proof?

            Do you think that Type-C is arcane?

            Are you backing up Krogoth’s statement that Apple somehow delayed next gen USB by 5 years? I need proof.

            • Krogoth
            • 3 years ago

            No, it is Apple's strong-arming in the early 2010s that created the confusion which makes customers, manufacturers, and vendors reluctant to pick up on USB 3.1 en masse when there are so many conflicting standards and form factors.

            • End User
            • 3 years ago

            Provide proof of Apple’s strong-arming or cut it out.

            USB.org is dominated by the classic PC industry:

            [url]http://www.usb.org/about[/url]

            • the
            • 3 years ago

            Due to the various alt modes, combined with the ability to do additional conversion on top of those alt modes, USB-C is nothing but arcane.

            For example, the HDMI alt mode has only existed for half a year, but USB-C to HDMI converters have existed since USB-C started shipping in products. This is possible by leveraging DP alt mode combined with an active DP-to-HDMI converter. Or it could be done using MHL, which also has an alt mode.

            The USB 2.0 Type-C connection was actually defined in the initial USB-C spec to provide guaranteed USB connectivity while using various alt modes. This is important, as the wire pairs used for USB 3.0/3.1 speeds can be reassigned to alt modes. So if you have a DP 1.3 alt-mode host and a DP 1.3 display, you can connect a 4K display and use 10 Gbit/s USB simultaneously over a single cable, since not all of the high-speed lanes are necessary to drive 4K using DP 1.3. However, if either the host or the display operates using DP 1.2 alt mode, the maximum USB speed is 480 Mbit/s. USB-C ports that only operate at USB 2.0 speeds were never intended, but the smartphone industry had other ideas.
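
            A toy sketch of that lane math (the two-vs-four lane split follows the DP bandwidth reasoning above; the names and structure are illustrative, not lifted from the spec text):

[code]
# Toy model of USB-C high-speed lane sharing under DP alt mode. A full-featured
# Type-C link has four high-speed lane pairs plus dedicated USB 2.0 pins; the
# lane counts below follow the reasoning above (DP 1.3/HBR3 can drive 4K on
# two lanes, DP 1.2/HBR2 needs all four).
TOTAL_HS_LANES = 4

def usb_speed_left(dp_lanes_used: int) -> str:
    """USB speed remaining after DP alt mode claims some of the high-speed lanes."""
    free = TOTAL_HS_LANES - dp_lanes_used
    if free >= 2:
        return "10 Gbit/s (USB 3.1 Gen 2 over the spare lane pairs)"
    return "480 Mbit/s (falls back to the dedicated USB 2.0 pins)"

print("DP 1.3 alt mode driving 4K:", usb_speed_left(dp_lanes_used=2))
print("DP 1.2 alt mode driving 4K:", usb_speed_left(dp_lanes_used=4))
[/code]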

            What Apple has done is make the whole USB-C confusion even worse by branding 5 Gbit/s speeds as USB 3.1 Gen 1 instead of something more logical or distinguishing.

            • End User
            • 3 years ago

            USB.org is dominated by the PC industry (not Apple):

            [url]http://www.usb.org/about[/url]

            If anyone is to blame for the USB 3.1 Gen 1/2 confusion it is USB.org as a whole.

            • the
            • 3 years ago

            Apple is such a member and was in fact USB’s first big promoter by making the first iMac exclusively USB for general data IO.

            You'd have to look at raw time frames but Apple was using USB 3.1 Gen 1 terminology [i]before[/i] the USB org adopted it. Apple's influence in this matter is rather clear.

            • End User
            • 3 years ago

            Of course Apple is a member. One of many. I said nothing to the contrary. Ultimately the final decisions of the USB.org are a group effort.

            You’ve provided us with nothing to back up your claims. Show me proof that Apple pushed for USB 3.1 Gen 1/2 terminology or did anything to “delay” USB development by “five years”.

            • Krogoth
            • 3 years ago

            Apple's shenanigans didn't delay USB development. It delayed market adoption of USB 3.1, both in ports and peripherals, because of confusing standards and labeling. Intel also had a hand in it by being reluctant, because they wanted to push Thunderbolt and quietly retire the USB standard on their platforms.

            The damage is painfully apparent: USB 3.1 ports should be commonplace in today's laptops and desktops, not an uncommon sight.

            • End User
            • 3 years ago

            Proof. Proof. Proof. Where is the proof.

            I’m not disagreeing with you. All I need is proof that what you say is true.

            • the
            • 3 years ago

            Order of operations. Apple started referring to USB 3.1 Gen 1 with the release of the single-port Retina MacBook and its USB 3.1 Gen 1 port on March 9th, 2015.

            [url=http://arstechnica.com/apple/2015/03/explaining-the-usb-3-1-gen-1-port-in-the-retina-macbook/]Articles indicating the USB org had retroactively renamed USB 3.0 to USB 3.1 Gen 1 appeared a week later.[/url]

            • End User
            • 3 years ago

            Apple was the first to adopt the USB 3.1 Gen1 naming convention as outlined by the governing body.

            If you want to convince me there was something more to this then you had better dig deeper.

        • the
        • 3 years ago

        10 Gbit Ethernet adapters and external SSD storage will help drive USB 3.1 Gen 2 speeds. Widespread adoption won't happen until this becomes part of Intel's chipsets, which, as noted in the article, is due with the 300-series chipsets.

        I don't think the stupid naming convention (and I will agree it is pretty stupid) has really set it back, since support hasn't yet been integrated into chipsets.

    • NTMBK
    • 3 years ago

    Nice, integrated wifi should help simplify and shrink ultrabook motherboards. Apple will be happy.

      • derFunkenstein
      • 3 years ago

      Apple has been using Broadcom Wi-Fi chips for a long time, even when Intel's were available. I'd be kind of surprised to see them use Intel's now.

      • DoomGuy64
      • 3 years ago

      Yeah, but you won’t be able to update your wifi modules anymore or use mPCI-e adapters to hack external graphics cards into a laptop. Integrated wifi seems more like a ploy to make laptops obsolete quicker than something that actually improves anything. Wifi modules are already so small that you don’t actually need to shrink them further, and costs are negligible, so the real goal is to increase planned obsolescence.

        • chuckula
        • 3 years ago

        Integrating WiFi in the chipset makes having another WiFi card literally impossible to exactly the same degree that onboard Ethernet makes adding in an extra NIC literally impossible.

          • DancinJack
          • 3 years ago

          If AMD had integrated Wi-Fi into any insertion point of Ryzen he’d be praising it. He has the oddest POV on the tech industry I have ever seen.

            • chuckula
            • 3 years ago

            Especially considering that RyZen literally integrates some components from the southbridge directly onto the CPU die itself, making the supposedly “anti-consumer” practice of putting something onto the chipset look like child’s play in comparison.

            • DoomGuy64
            • 3 years ago

            ASRock has Ryzen boards with wifi, and that’s not even the point. Nice straw man.

            I’m saying you lose the mPCI-e slot. Period. It’s not about how you include wifi.

            • derFunkenstein
            • 3 years ago

            What? Integrating it into the chipset means you can’t have an mPCI-e slot? Nice straw, man.

            • ImSpartacus
            • 3 years ago

            Not can't, [i]won't[/i]. You're delusional if you think laptops aren't becoming more and more integrated. If a laptop maker doesn't need to include mPCIe for WiFi, they won't (again, not "can't") include it. You can say that this guy is a baby for not coming to terms with the increasing integration of modern mobile devices, but it's equally silly to act as if it matters that device makers technically aren't forced to make ever more integrated devices.

            • DoomGuy64
            • 3 years ago

            This. I would be perfectly happy to have everything integrated, so long as manufacturers left us some spare upgrade slots.

            The problem is that they won’t as soon as they don’t have to. Otherwise, it’s up to consumers to vote with their dollar, and most of them don’t know or care. The laptop market is going to eventually destroy itself, and get replaced by tablets like the surface.

            • derFunkenstein
            • 3 years ago

            From a laptop perspective, most people want lower prices. A smaller number of people are willing to pay more for an mPCI-e card slot so they can upgrade it. But notebook makers have been taking potshots at those upgraders for years in the form of BIOS whitelists. To say this is a game-changer is short-sighted.

          • DoomGuy64
          • 3 years ago

          Yeah, no. Most cheap laptops are integrated so thoroughly that the only upgradable slot in the machine is the wifi. Once wifi is integrated, there won’t be anything left to upgrade, other than your ram or hdd.

          I’m not saying you can’t add a usb wifi stick, I’m saying you lose the damn mPCI-e slot.

            • chuckula
            • 3 years ago

            [quote]Most cheap laptops are integrated[/quote] Yeah and you could have stopped there without the added conspiracy theory. A lot of "cheap" notebooks don't even have upgradeable wifi since it's soldered onto the motherboard anyway.

            • DoomGuy64
            • 3 years ago

            True, but nobody will bother to include it at all once it’s integrated.

            • BillyBuerger
            • 3 years ago

            Not sure about other laptops but the Lenovo thinkpad laptops we have at work won’t let you just swap whatever wifi card you want anyways. Supposedly for FCC requirements the laptop will balk if you put in a wifi card without their specific information on it. You can only buy approved wifi cards making swapping them basically a non-starter to begin with. So I could care less. I’ve had very little reason to swap out a wifi card in many years anyways.

            • DPete27
            • 3 years ago

            And of the wifi module, RAM, or HDD, which one do you suppose is least frequently upgraded/swapped?

            • willmore
            • 3 years ago

            HDD. I always upgrade the RAM and Wifi on laptops.

          • ImSpartacus
          • 3 years ago

          I think he’s suggesting that the reality is that a space-constrained laptop is less likely to implement its own wifi if there’s something steady on the chipset.

        • DancinJack
        • 3 years ago

        No.

        • w76
        • 3 years ago

        The only product cycle that moves slower than Intel these days is Wifi. 802.11n is still a serviceable, widely compatible standard, and appears likely to remain so for many years, and it's approaching its 8th birthday if memory serves me right. Integrating 802.11ac (not that integrating it into the CPU chipset versus a separate chip makes any difference at all to the consumer) doesn't impact obsolescence at all. In fact, it'll probably be one of the last things to feel antiquated.

          • DoomGuy64
          • 3 years ago

          Very few new laptops use 802.11n. Mostly Atom tablets, which is where 802.11ac should be integrated first if anything.

          I’m not complaining about the wifi dude, just losing the expansion slot for it. Reading comprehension. It helps if you read the whole post and don’t make up your own imaginary discussion inside your head.

          Also, if you're going to integrate wifi, standard ac is getting dated, and you need actual hardware/software support for MU-MIMO. Maybe if Intel were including that, I would be happy about it. Otherwise, it's just a minor cost-cutting measure.

        • Ninjitsu
        • 3 years ago

        OTOH, if this had been in Haswell's complementary PCHs, I wouldn't have had to spend 25 EUR to replace Dell's inadequate wireless module with a stronk Intel module.

    • chuckula
    • 3 years ago

    That could be a good upgrade, but the problem with USB 3.1 is the BSey “Gen 1” and “Gen 2” nomenclature where a port that’s running at USB 3.0 speeds can be called USB 3.1 “Gen 1”.

    Even if only one or two of those ports are “real” USB 3.1 that’s still a very nice feature, so we’ll see how it shakes out.

      • The Egg
      • 3 years ago

      It’s nice, but I was expecting to see USB 3.1 Gen2 back on the 200 series chipset, so it would’ve been pretty awful to not have it for Cannon Lake.

      Also, the wording of them being “capable” makes me raise an eyebrow. Sounds like there may be some caveats.

        • chuckula
        • 3 years ago

        “Capable” might also refer to other parts of the USB specification like power delivery for the high-wattage USB 3.1 ports.

        Obviously the chipset isn’t pushing the power through the port, so to meet every element of the full USB 3.1 spec (high-power delivery is optional IIRC) you have to look at how the device is configured as a whole.
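
        For a sense of scale, the "high-wattage" part comes from USB Power Delivery, which is negotiated per port on top of the data spec; a minimal sketch of the standard fixed rails (my summary of the PD power rules, not anything from the leaked slide):

[code]
# USB Power Delivery fixed-supply rails (PD 2.0/3.0 power rules). Wattage is
# just volts x amps; the chipset's USB 3.1 data lanes are independent of this,
# which is why "capable" depends on how the whole board is configured.
pd_fixed_rails = [(5, 3.0), (9, 3.0), (15, 3.0), (20, 5.0)]  # (volts, max amps)

for volts, amps in pd_fixed_rails:
    print(f"{volts:>2} V @ {amps} A -> {volts * amps:.0f} W")
# 5 V @ 3.0 A -> 15 W ... 20 V @ 5.0 A -> 100 W
[/code]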

          • The Egg
          • 3 years ago

          Sounds likely. It's also possible that 10Gbps speeds only apply when a single port is in use, or that there's some other stipulation.

            • willmore
            • 3 years ago

            You mean like the ports in the Macintosh laptops, where there isn't enough BW for all the ports running to the chip that controls them?

        • the
        • 3 years ago

        "Capable" likely means that only certain lanes in Intel's Flex I/O scheme can support the higher-speed USB 3.1. There are plenty of caveats with this (you can't have everything enabled at once) but nothing that we haven't seen before.
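
        Something like this, conceptually (the pool size and per-port lane costs below are made-up numbers just to illustrate the sharing, not Intel's actual HSIO map):

[code]
# Toy model of Flex I/O (HSIO) sharing: a fixed pool of high-speed lanes is
# multiplexed between PCIe, SATA and USB, so enabling everything at once can
# blow the budget. All figures are hypothetical, for illustration only.
HSIO_LANES = 30  # hypothetical pool size

lane_cost = {"pcie_x4_slot": 4, "sata_port": 1, "usb3_port": 1, "usb31_gen2_port": 1}

def lanes_used(config: dict) -> int:
    """Total high-speed lanes consumed by a given port configuration."""
    return sum(lane_cost[item] * count for item, count in config.items())

wishlist = {"pcie_x4_slot": 4, "sata_port": 6, "usb3_port": 10, "usb31_gen2_port": 4}
used = lanes_used(wishlist)
print(f"{used}/{HSIO_LANES} lanes -> {'fits' if used <= HSIO_LANES else 'over budget, something has to give'}")
[/code]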

      • curtisb
      • 3 years ago

      Blame Apple for the USB 3.1 Gen 1 nonsense. No one else, including Intel, has started calling USB 3.0 anything but USB 3.0. That’s not to say they won’t start that here, but they haven’t thus far.

        • ImSpartacus
        • 3 years ago

        Apple doesn’t make the standards.

        At least Apple plainly and clearly communicates that it's "Gen 1" while other sources (including this very article) often neglect that distinction.

        It’s shitty that there’s such a distinction to be made, but you can’t blame Apple for it.

          • the
          • 3 years ago

          Umm… Apple does make standards. MiniDP came from them; it wasn't part of the DP spec until after they were first shipping systems with it. OpenCL is also an Apple creation which they gave to the Khronos Group.

          • curtisb
          • 3 years ago

          They may not make the standards, but you better believe they can certainly influence them the same as Intel, AMD, Microsoft, Google, etc. Apple was the first to use the USB 3.1 Gen 1 designation, and it's quite honestly the only place I've ever seen it. Everyone else, including Intel, stuck with USB 3.0. It was a mess that was created when Apple put a USB Type-C port on the Retina MacBook but it only had 3.0's 5Gbps throughput as opposed to the full 10Gbps throughput of 3.1. [i]That[/i] is why I said "blame Apple."

        • Krogoth
        • 3 years ago

        They strong-armed the USB-IF group into making the USB 3.1 spec a mess of confusing standards.

      • RAGEPRO
      • 3 years ago

      It does explicitly say “USB 3.1 (10 Gbps)” up there in the chart, boss.

      • meerkt
      • 3 years ago

      I heard they’re going to fix the naming with USB 4.

      There will be USB 4.0 Lite (=USB1), USB 4.0 Legacy- (=USB2), USB 4.0 Legacy (=USB3), USB 4.0 Legacy+ (=USB3.1) and USB 4.0 SuperHiSpeedSATAIII (=Bluetooth 4/LE compatibility mode).

      • TwistedKestrel
      • 3 years ago

      The slide directly addresses that, though. What I actually love about it is that it manages to address it without invoking the idiotic Gen 1/2 nomenclature at all.
