USB Type-C is getting safer with cryptographic authentication

It’s been nearly three years in the making, but the USB Implementers Forum (USB-IF) has launched the USB Type-C Authentication Program. It’s a way for OEMs to implement the USB Type-C Authentication spec, which defines the cryptography-based authentication methods used to protect systems from firmware, hardware, or software threats, and even power surges from non-compliant chargers.

OEMs get control over the actual security policies that they put in place, but the specification allows them to use a standard protocol to do so. “Using this protocol, host systems can confirm the authenticity of a USB device, USB cable or USB charger, including such product aspects as the capabilities and certification status,” said the USB-IF in a press release. The authentication occurs at the moment the wired connection happens, the group said, “before inappropriate power or data can be transferred,” and it can take place over the USB data bus or USB PD communications channels.

The spec uses 128-bit security and “existing internationally-accepted cryptographic methods for certificate format, digital signing, hash and random number generation.” DigiCert is managing the Public Key Infrastructure (PKI) and certificate authority services for the program, a fact the group touted in its own press release. Participation in the program is voluntary for OEMs.
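To make that a bit more concrete, here is a minimal sketch of the kind of challenge-and-response exchange such a scheme implies, written in Python with the cryptography package. It is illustrative only: the actual spec defines its own message formats, certificate chains, and slots, and the choice of ECDSA over P-256 with SHA-256 below is an assumption consistent with the quoted 128-bit security figure, not something lifted from the spec itself.

    # Hypothetical sketch of a plug-in-time challenge-response check.
    # Assumes ECDSA P-256 + SHA-256; the real spec's formats and flows differ.
    import os

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    # The accessory (cable, charger, or device) holds a private key whose public
    # half would normally arrive inside a certificate chained back to the
    # program's root certificate authority.
    accessory_key = ec.generate_private_key(ec.SECP256R1())
    accessory_cert_pubkey = accessory_key.public_key()  # stand-in for the verified cert

    def host_challenge() -> bytes:
        # Host picks a fresh random nonce the moment the connection is made.
        return os.urandom(32)

    def accessory_response(challenge: bytes) -> bytes:
        # Accessory proves it holds the certified private key by signing the nonce.
        return accessory_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

    def host_verify(challenge: bytes, signature: bytes) -> bool:
        # Host checks the signature against the public key from the (already
        # chain-validated) certificate before allowing full power or data.
        try:
            accessory_cert_pubkey.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
            return True
        except InvalidSignature:
            return False

    nonce = host_challenge()
    sig = accessory_response(nonce)
    print("trusted accessory" if host_verify(nonce, sig) else "apply restricted policy")

In the real program, the host would presumably first walk the accessory’s certificate chain back to the root that DigiCert’s PKI anchors, and a failed check need not mean refusing the connection outright; it simply triggers whatever restricted policy the OEM has chosen.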

There are obvious practical applications for this technology. One is avoiding wonky USB Type-C chargers, which are a well-documented and immensely frustrating hazard of owning a USB Type-C device. Instead of playing Russian roulette with a third-party charger, whether it’s one you buy from Amazon or a charging port at a public terminal, the authentication should head off any dangerous connections. Sure, you may end up without the ability to juice up your phone at the airport, but it’s better than the alternative.

Another scenario that the USB-IF noted is when an organization needs to secure its IP (or state secrets, or whatever) on laptops that travel with their personnel. They can implement a policy that will allow only verified storage devices to connect with the systems.

That is all marvelous, but because every tool can be a weapon when used improperly and we can’t have anything nice, there are fears that this program is essentially a DRM mechanism. Indeed, it gives OEMs a lot of control over those ports, and that control could be abused to, for example, lock out competitors’ devices. The practicality of such restrictions seems meager, though; that would be an awfully aggressive vertical integration strategy.

It does, though, follow the USB-IF’s strategy of making USB Type-C an eminently flexible connector, made more (or less) powerful by the protocols a port supports. The whole USB Type-C plan was about using one connector to support all manner of protocols, from Thunderbolt 3 to Power Delivery, and letting OEMs figure out which ones they wanted to implement.

This has backfired a bit, though, because you can’t tell by looking at a given USB Type-C port what it actually offers. Does it have USB 3.1 Gen 2? Thunderbolt 3? And so on. (You’re supposed to be able to tell at a glance, but OEMs don’t always use the labeling code that the USB-IF created. Sigh.)

It’s ultimately probably for your own good, but now you can add “Are these two devices that share a universal physical connector actually compatible, or is the authentication failing?” to your list of troubleshooting questions to ask when things go awry.

Comments closed
    • mikewinddale
    • 9 months ago

    So what will it take to enable this on the host side? BIOS update? Chipset driver update? Windows update?

    • Ummagumma
    • 9 months ago

    This idea is probably just another “feel good” attempt at security, like TSA for “airport security”.

    I don’t expect this approach to remain “secure” for longer than 6 months, 12 months at the outside, before it is hacked in some way, either by “anonymous hax0rs” or “state actors”, because we have seen SSL certificates hacked before. Stuxnet anyone?

    What about the recent discoveries of “signed malware”? Yep, malware with real (forged or stolen) authentication certificates.

    • DoomGuy64
    • 9 months ago

    Oh boy, just had a realization. You don’t need to buy a hardware keylogger anymore, as it’s built into the cable. Great. China will love this, as it makes stealing IP so much easier. Also, I’m sure companies like Sony would love to install more rootkits as well.

    People already mentioned expired certificates and vendor lock-in, so we have that as well. Not to mention chips that break or burn out will render the cable useless.

    You know what the real solution is? Auto-sensing USB ports. Copy the technology used in vape mods that auto-detect coils, and you fix all the problems without needing to put chips into the cable itself. Chipping the cable is just stupid on a mass level, unless the true goal is DRM. Meanwhile, auto-sensing USB ports will not only fix the problem, but keep costs down while exposing manufacturers that cheap out.

    • helix
    • 9 months ago

    Will this enable hardware manufacturers to make their USB devices only work if the bundled crappy app provides the right crypto key?

    (I’m talking about that stuff on the bundled CD you get with all hardware, which I throw away much faster than the box it came in. Windows only, never to see an update, riddled with security holes, mostly there to make sure you saw the first page of 100 pages of legalese no one in their right mind would ever agree to.)

    • Wirko
    • 9 months ago

    One of the funny things a digital certificate can do is to expire.

    • psuedonymous
    • 9 months ago

    Before anyone declares that the sky is falling: how many devices have implemented restrictive whitelists for the existing electronically marked Type C cables?

    • BorgOvermind
    • 9 months ago

    Oh, the retards with ‘new ideas’ again.

    A charger must have the + and – terminals and that’s it. No danger, no security breach, no problem. But most stupidphone manufacturers wanted to not allow charging if the data pins are not connected, which in turn causes the problems mentioned above and a few more. Now they want to super-over-complicate things, which will in turn cause even more issues.

    No, thanks. USB? TB? Pass. I’ll take a red and a black wire. My phones and chargers work with power, not data, and I can charge any of my phones from any DC source that has above 4.25 V. So good luck fighting the exponentially growing problems.

    • jensend
    • 9 months ago

    “Manufacturers can now make cell phones that will rely on DRM to ensure you can only quick-charge with their own pricey OEM accessory cable. This is good for you because, um, safety! Terrorists! You’ll soon get to carry around a different USB cable for every device—hurrah for standards!”

    It has seemed to me the USB-C standard was prematurely rushed out the half-bakery in the first place.

    • Flying Fox
    • 9 months ago

    Based on the latest shenanigans about “fast” wireless charging on the Pixels, I am betting that Google will be the first to do this and shut out a bunch of third-party chargers and accessories (even well-built ones).

    • Usacomp2k3
    • 9 months ago

    I’d love an improved ability to go into “charger only” mode that would prohibit data transfer by default unless unlocked. Maybe the iPhone does that and I didn’t realize it?

      • tu2thepoo
      • 9 months ago

      iPhone does that by default after one of the iOS 11 updates, unless you specifically disabled it: https://support.apple.com/en-us/HT208857

      "Starting with iOS 11.4.1, if you use USB accessories with your iPhone, iPad, or iPod touch, or if you connect your device to a Mac or PC, you might need to unlock your device for it to recognize and use the accessory. Your accessory then remains connected, even if your device is subsequently locked. If you don’t first unlock your password-protected iOS device—or you haven’t unlocked and connected it to a USB accessory within the past hour—your iOS device won’t communicate with the accessory or computer, and in some cases, it might not charge. You might also see an alert asking you to unlock your device to use accessories."

        • Usacomp2k3
        • 9 months ago

        I guess I was wanting something more granular. If I’m sitting there using my phone and want to charge it at the same time, there is no way to force it into “charge only” mode.

        • psuedonymous
        • 9 months ago

        It’s also the stock behaviour as of Android 6 (though of course some OEMs screwed it up).

    • cygnus1
    • 9 months ago

    I find this highly unlikely to stop shady factories from producing junk USB things (chargers or cables or whatever). They will simply source a chip from another shady factory that has somehow gotten hold of trusted certificates. They will pump out a million crappy USB things that might marginally work for some period of time until the cert is revoked/blacklisted, and then it’ll just stop working with some devices. Users won’t have any clue what’s happening and will just buy another cheap USB thing. Rinse and repeat.

    • DragonDaddyBear
    • 9 months ago

    “Another scenario that the USB-IF noted is when an organization needs to secure its IP (or state secrets, or whatever) on laptops that travel with their personnel. They can implement a policy that will allow only verified storage devices to connect with the systems.”

    The corporate security professional in me loves this; the consumer in me is very anxious. I mean, it would be nice to have device authentication for authorized devices, but I can see someone like Apple locking out manufacturers who do not pay for a license (I’ve seen errors on iPhones with multi-plug charging cables because the cable wasn’t theirs).

      • liquidsquid
      • 9 months ago

      This is exactly the first thought that comes to mind. It is a way to lock out generic chargers, devices, and competition to keep prices high. In addition, I can foresee this causing a lot of confusion for end users when peripherals work on one machine and don’t/won’t on another.

      We will go right back to the days when only a Samsung charger will give you full high-speed power and features. Everything else will result in a measly 100 mA trickle charge. On a business trip and forgot your charger? Tough crap.

      Where this may be quite useful is as a simple method of node-locking software to a cheaper hardware dongle without requiring special crappy drivers that are a nightmare to support in the engineering world, plus the obvious case you mentioned with encrypted storage.

        • frenchy2k1
        • 9 months ago

        Can’t say it won’t happen, but today’s problem is devices that did not follow specs and were reporting their capabilities erroneously.

        I mean, a Google engineer had to test and report on early USB-C cables, as those would fry connected devices (plugs were wired incorrectly) or not work at all:
        https://bensonapprovedcom.wordpress.com/

        Now with identification, the device itself can report on unrecognized cables/chargers. This can prevent accidents and/or lock devices to official accessories. It will all depend on the implementation in each device.

    • Krogoth
    • 9 months ago

    This is a pointless gesture. If your attacker has physical access, the game is already over.

      • homerdog
      • 9 months ago

      It’s probably intended to protect users from themselves more than anything else.

      • derFunkenstein
      • 9 months ago

      You clearly don’t understand this.

      “before inappropriate power or data can be transferred”

      The “power” part of that should signal that this is at least in part about preventing 60 or 80 watts from being sent down a cheap, under-specced cable and causing physical damage.

        • Krogoth
        • 9 months ago

        That has nothing to do with putting in encryption/authentication, though.

          • derFunkenstein
          • 9 months ago

          Sure it does, at least in part. For example, the cable would have to pass authentication to carry power delivery.

          • frenchy2k1
          • 9 months ago

          They basically had to add a cryptographic signature to identify and verify what the device is saying *BECAUSE* devices were lying.

          The USB group created a very flexible spec letting the source, sink, and cable all declare what their capabilities are, but cheap manufacturers have been flooding the market with cheap devices (power supplies and cables in particular) that do not respect that spec.

          So, the solution they chose to solve that problem is to certify the devices and add a key so that you can verify the certification.

          It’s a ham-fisted solution to a human problem. It will work, but could also be abused.

          It’s still a working solution to a problem that was causing devices to go up in smoke.

    • chuckula
    • 9 months ago

    Least important: the number of bits in a hash/signature/etc.

    Most important: Whether or not the trust authority is worth a darn.

    Here’s a hint: If the NSA wants to snoop on your encrypted SSL sessions, do you think they spend billions on exotic quantum computers that may not even be possible, or do they just create or subvert one of the ridiculously large set of trusted key authorities that are blindly accepted by your web browser? The same concept applies here.

      • DoomGuy64
      • 9 months ago

      Absolutely, but the more direct problem is DRM. What is the true ramification here? How are we going to be affected in general? Higher prices for sure, and maybe even locking out competitors. A Dell-only ecosystem, for example.

      • cygnus1
      • 9 months ago

      The NSA breaking this wouldn’t be about snooping network traffic. Not directly anyway. NSA (or other bad guys) breaking this would really only let them spoof hardware signatures.

        • Beahmont
        • 9 months ago

        Think Stuxnet. Slipped into the Iranian facilities by people sticking random USB flash drives into ports. If there was authentication for USB drives, that wouldn’t happen…

        Unless the NSA spoofs the hardware signature of a valid USB device.

          • cygnus1
          • 9 months ago

          Yeah, the NSA won’t have any issue creating devices with trusted certs, of that I have no doubt.
