Atheros, Wilocity collaborate on 60GHz wireless PCIe

Exciting things are afoot in wireless networking. Based on the work of the WiGig Alliance (which we’ve already told you all about), Atheros and Wilocity plan to make devices that use 60GHz wireless technology not just for ultra-high-speed networking, but also for wireless PCI Express connectivity.

These devices should start sampling next year, and based on what Wilocity told us, they’ll allow notebooks to connect to displays, storage devices, and even auxiliary graphics processors, all wirelessly. The same 60GHz wireless adapters will support the 2.4GHz and 5GHz bands, as well, so they’ll let you connect to existing Wi-Fi networks.

Here, the key to wireless peripheral connectivity is wireless PCI Express (wPCIe), a technology Wilocity developed (and subsequently trademarked). The company’s whitepaper has all the nitty-gritty details, but the diagram below does a pretty good job of summing it up:

Essentially, wPCIe specifies a PCI Express switch with local and remote components that talk over a 60GHz connection. The remote components go in the DockingZone (another Wilocity trademark), which can include any number of PCIe-compatible devices or controllers. A hypothetical DockingZone might have, say, USB 3.0, eSATA, and FireWire controllers sitting alongside a graphics processor. The neat part is that, in Wilocity’s words, “the switch appears as if it is co-located in a single location. Therefore, the software used to configure and manage the switch is identical to that of legacy switches/bridges.” Translation: the operating system doesn’t need to know there’s any wireless tomfoolery going on.
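
To make that “co-located” claim concrete, here’s a toy sketch in Python (our own illustration, not anything from Wilocity’s software): enumerating the device tree looks the same whether the switch’s downstream half sits on the motherboard or across the 60GHz hop in a DockingZone. The device names mirror the hypothetical DockingZone above; the link labels are ours.

```python
from dataclasses import dataclass, field

@dataclass
class PcieDevice:
    name: str
    link: str = "copper"               # "copper" or "60GHz wPCIe" -- purely illustrative label
    children: list = field(default_factory=list)

def enumerate_tree(dev, depth=0):
    """Walk the device tree the way an OS enumerates any PCIe switch;
    nothing here depends on which transport carries each downstream port."""
    print("  " * depth + dev.name)
    for child in dev.children:
        enumerate_tree(child, depth + 1)

# Hypothetical DockingZone from the article: PCIe controllers behind the
# remote half of the switch, reached over the 60GHz link.
docking_zone = PcieDevice("remote switch half", link="60GHz wPCIe", children=[
    PcieDevice("USB 3.0 controller"),
    PcieDevice("eSATA controller"),
    PcieDevice("FireWire controller"),
    PcieDevice("graphics processor"),
])
root = PcieDevice("root complex", children=[
    PcieDevice("local switch half", children=[docking_zone]),
])

enumerate_tree(root)   # same traversal you'd get if every link were copper
```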

Wilocity told us that wPCIe can push bits at up to 5Gbps (625MB/s), and that the spec should move “quickly” to 7Gbps (875MB/s). Assuming that doesn’t account for the 8b/10b encoding used by PCI Express, you can expect PCIe transfer rates of up to 500MB/s at first, which is equivalent to two lanes of first-gen PCI Express connectivity. That’s probably not enough for a proper graphics card, but as we found while reviewing Zotac’s Zbox HD-ID11 nettop, an Nvidia Ion GPU hooked up over a single gen-one lane can still crank out decent frame rates in casual games at 1080p.
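
If you want to check our arithmetic, the conversion from raw line rate to usable bandwidth is straightforward; the 8b/10b overhead here is our assumption, borrowed from first- and second-generation PCI Express signaling.

```python
def usable_mb_per_s(raw_gbps, encoding_efficiency=8 / 10):
    """Raw line rate in Gbps -> usable MB/s after 8b/10b encoding overhead."""
    return raw_gbps * 1000 / 8 * encoding_efficiency   # Gbps -> MB/s, minus overhead

print(usable_mb_per_s(5))   # 500.0 MB/s -- two first-gen PCIe lanes (2 x 250MB/s)
print(usable_mb_per_s(7))   # 700.0 MB/s once the spec hits 7Gbps
```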

Of course, cheap netbooks won’t be the first systems to start toting 60GHz adapters—Wilocity expects the technology to debut in high-end systems. We’re told there’s nothing inherently more costly about 60GHz technology, though, so support may not take long to trickle down to cheaper machines.

Comments closed
    • Bensam123
    • 10 years ago

    What is… latency?

    This is generally why young geekians were told back in the day that using a supercomputer for gaming was a bad idea, and it’s also why wireless mice are still a bad idea for hardcore gamers.

      • Krogoth
      • 10 years ago

      Wireless mice are bad for gaming because of the batteries not because of the latency. Your nervous system operates at a much lower speed than wireless connections.

        • Waco
        • 10 years ago

        That may be theoretically true but I’ve yet to use a wireless mouse that didn’t have an additional noticeable amount of lag inherent in it. Adding extra lag that doesn’t need to be there (especially given the already almost noticeable lag from many monitors) seems pretty useless to me.

        • Bensam123
        • 10 years ago

        As the post above me said, you ‘feel lag’ even if it’s not noticeable on a one-to-one basis. With every wireless mouse I’ve used, it feels like the mouse is trailing behind where the cursor really should be. This is extremely noticeable if you’re used to a high-DPI, high-polling-rate mouse. It feels like mouse trails are on.

        I’ve had similar experiences with LCDs, and a long time ago when they first introduced culling, I believe, where video cards don’t draw objects around corners to save processing power (Radeon 9700). I knew when they started implementing it because, in FPSes, there was an extremely small delay between when people appeared around corners and when they were drawn. That was ironed out effectively in newer video cards, to the point where I can’t notice it anymore (or I got old).

        That said, attach a processor/video card wirelessly and see what happens to your system responsiveness. Be my guest and game over a wireless network too (right now). I’m sure you’ll have a fairly decent ping, but you’ll play like utter ass.

    • Krogoth
    • 10 years ago

    I can’t really see it being useful to the average desktop. The distance probably isn’t that good if you want decent bandwidth. IMO, for short runs (less than 3m), a wired connection makes more sense. The key is making it useful for 5-15m, because it would make a nice alternative to docking stations and computing cluster connections.

      • designerfx
      • 10 years ago

      uh, no.

      think of what this means. It’s the same idea as the wireless rechargers for cell phones. You know, that “charge 3 devices at once” thing. However, now you have a wireless connector for *all devices* irrespective of what format they use. Basically a glorified version of USB3 that works faster and you don’t have to plug it in at all. Just plop it on top of the “Dock mat”. poof.

        • Krogoth
        • 10 years ago

        Basically, my alternative “dock station” idea. 😉

    • SomeOtherGeek
    • 10 years ago

    Cool. What’s next, wireless electricity? Tesla called and wants his technology back…

    • ltcommander.data
    • 10 years ago

    Maybe this will be more successful than Wireless USB. Whatever happened to that anyways?

      • Saber Cherry
      • 10 years ago

      Superseded by Bluetooth. Devices that need more bandwidth than Bluetooth can handle are likely to need wires anyway, for power.

    • jackaroon
    • 10 years ago

    I can’t imagine using that, on security grounds.

      • ew
      • 10 years ago

      From what I understand, 60GHz attenuates pretty quickly in air. I’d be more worried about wireless’s general unreliability than anything else.

      • Majiir Paktu
      • 10 years ago

      I expect wireless traffic would be encrypted. Then again, it wouldn’t surprise me if this company found the hardware to symmetrically encrypt at 500MB/s to be too expensive.

        • Saber Cherry
        • 10 years ago

        HDMI, with similar bandwidth, can be encrypted with fairly cheap components. Encryption can be accomplished with a trivial amount of processing if you don’t use a very secure algorithm. I agree that it is very unlikely that they will use a strong cipher (such as AES) on 500MB/s streams, but personally, I would not even consider a product like that if it didn’t include some form of data obfuscation.

          • Majiir Paktu
          • 10 years ago

          A strong cipher doesn’t necessarily demand tremendous computing power. AES is pretty fast, despite its strength.

          You’re right about HDMI, I hadn’t really considered that. The HDCP implementation itself is insecure, but only because its key exchange is flawed, not because it uses an insecure cipher.
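
As a rough, hedged illustration of the point about cipher throughput in the exchange above, the sketch below times AES-256 in CTR mode over a 64MiB buffer using Python’s third-party cryptography package; the number it prints depends entirely on the CPU it runs on and says nothing about what dedicated hardware could do.

```python
import os
import time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, nonce = os.urandom(32), os.urandom(16)
buf = os.urandom(64 * 1024 * 1024)                 # 64MiB of test data

encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
start = time.perf_counter()
encryptor.update(buf)                              # encrypt the whole buffer in one call
encryptor.finalize()
elapsed = time.perf_counter() - start

# Compare against the article's 500MB/s figure for wPCIe.
print(f"AES-256-CTR: {len(buf) / 1e6 / elapsed:.0f} MB/s on this CPU")
```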

      • designerfx
      • 10 years ago

      you do realize this will work at an excessively short distance, right? This isn’t exactly the kind of thing you do to send a signal across a mountain.

      • Wajo
      • 10 years ago

      AFAIK, one of the reasons for using the 60GHz band is that it is the resonance frequency for water vapor suspended in the air, and therefore signal attenuation is dramatic, ensuring the signal does not interfere with other devices nearby.

      That’s what I remember from my RF classes, but I might as well be wrong 😛

    • kamikaziechameleon
    • 10 years ago

    very interesting. Tech like this will revitalize the desktop space in a very unique way while also promoting and growing tablets, netbooks, and notebooks.

    • Kurotetsu
    • 10 years ago

    Wow, that diagram is outdated. Intel and AMD have both ditched the FSB architecture, desktops are using DDR2 and DDR3, a single PCI-E x1 link for the Southbridge is weaksauce, and the Northbridge has been moved onto the CPU (Intel has already done it, AMD will be doing it with Bulldozer I think)! <– Can you tell that I’ve completely missed the point of the article?

      • khands
      • 10 years ago

      AMD did it before Intel actually.

        • Atradeimos
        • 10 years ago

        I think he’s talking about the entire northbridge, not just the memory controller (which is what AMD did with the Athlons).

    • Duck
    • 10 years ago

    We are getting pretty close to microwave radiation here. 100GHz and you can reheat your dinner wirelessly… or heat up your eyeballs lol!

    If 60GHz = 6×10^10 Hz, then that puts microwaves in the 100GHz to 1000GHz range.

      • Atradeimos
      • 10 years ago

      Just to correct a misconception… Most household microwave ovens actually work at 2.4 GHz, so your router is technically a microwave oven about 10,000 times less powerful than the real thing.

      That’s also why keeping your router near a microwave is a bad idea.

        • excession
        • 10 years ago

        Ooh, beat me to it! 🙂

          • Atradeimos
          • 10 years ago

          Lol. I figured I’d look at wikipedia too, just to make sure…

      • excession
      • 10 years ago

      Er, microwave ovens radiate at ~2.4GHz. 🙂

      http://en.wikipedia.org/wiki/Microwave_oven#Principle

        • ShadowTiger
        • 10 years ago

        Phones also use 2.4GHz… which might be why some studies suggest that using wireless phones is bad for you!

          • Atradeimos
          • 10 years ago

          Cordless phones do, not mobile phones. But who even uses those anymore?

    • tay
    • 10 years ago

    So 5Gbps wireless = 1Gbps physical? And 60GHz means 5′ range?
