AMD, Nvidia, VIA could make separate USB 3.0 spec

When it shows up next year, the USB 3.0 specification should bring both vastly improved transfer speeds and backward compatibility with USB 2.0. That all sounds good in theory, but the way Intel is handling things reportedly has AMD and others riled up.

News.com quotes one source close to AMD as saying, "The challenge is that Intel is not…giving the specification to anybody that competes with CPUs and chipsets." To be more specific, Intel does plan to release the spec—but only at the end of the year or the beginning of 2009. The same source claimed that would give Intel free rein over the USB 3.0 market for "six to nine months," during which competitors would have to sit around and wait. The source believes Intel has no good reason to withhold the spec, since much of the intellectual property behind USB 3.0 comes from the PCI Special Interest Group.

What does Intel have to say about all this? News.com also spoke with a source close to Intel, who chimed in, "We do the work–at this point it’s not an industry effort anymore–and then (we) hand over the work for free without any licenses." That particular source added, "Intel only gives it out once it’s finished. And it’s not finished. . . . If it was mature enough to release, it would be released."

News.com says AMD, Nvidia, and VIA could end up creating their own USB 3.0 specification in the meantime. Such a spec would be compatible with the Intel standard in theory, but users could run into snags. The source close to AMD said on the subject, "This is not good for users. But we have no choice."

Comments closed
    • jstern
    • 12 years ago

    I hate the USB plug, I think it’s called the “A” plug. I’d just rather have something like the connectors on a GameCube controller, where you can tell which way is up just by feeling it, without having to look at it. Something like the GameCube controller but smaller. I know you can feel current USB cables, but it doesn’t compare.

      • Saber Cherry
      • 12 years ago

      Yep, it’s an example of really poor design. I usually insert my thumbdrive the wrong way first.

      Also, the plug’s retention is inferior, and things get knocked/pulled out easily.

        • indeego
        • 12 years ago

        Did you know you can fit a USB cable into an RJ-45 slot? Seems kinda whack. In the dark I’ve mistakenly plugged a USB cable into the wrong slot, thankfully no short happened <.<

          • Palek
          • 12 years ago

          Indeed. Experienced this first-hand on my Thinkpad X40 which has the RJ45 socket right next to a USB socket. Plugged in mouse in the dark and wondered for a few seconds why it wouldn’t respond!

            • d2brothe
            • 12 years ago

            You can complain all you want about the USB connector, I would still note it’s so much better than serial or parallel ports, and even PS/2 ports, in terms of plugging stuff in: no screws to do up, no pins to bend, etc.

      • Usacomp2k3
      • 12 years ago

      The flat side (not the cracked side) goes up on 90% of computers. It’s incredibly easy to manage. (up = away from the bottom of the case for computers standing vertical).

    • Hattig
    • 12 years ago

    I think a mixture of eSATA and Firewire would do for most people. They are much closer to these companies’ viewpoints on the market, whereas USB3 is just going to be “more work for the CPU, so we can sell more” from Intel. Firewire controllers have their own CPU which takes the work off the main CPU, in essence a “bus accelerator” if you think of it that way. VIA, nVidia, AMD should adopt and integrate Firewire into their core chipsets for general connectivity, and add eSATA for really fast external storage (although FW800 is fast enough to be honest, and FW3200 would beat eSATA).

    • Crayon Shin Chan
    • 12 years ago

    Now THAT’s the Intel I know!

    • xii
    • 12 years ago

    Oh no, not another CD+R/CD-R Bluray/HDDVD Gnome/KDE fiasco… I hope manufacturers jump over to Firewire then.

      • d0g_p00p
      • 12 years ago

      Firewire is dead.

        • derFunkenstein
        • 12 years ago

        Firewire is on point.

        It’s not dead, but it’s a very niche product – IEEE1394b is still kicking it at 800mbit on professional video setups.

          • flip-mode
          • 12 years ago

          firewire is soooo on point.

            • eitje
            • 12 years ago

            *how on point is it?*

            • flip-mode
            • 12 years ago

            Not soo, not sooooo, but soooo.

            • BobbinThreadbare
            • 12 years ago

            It’s over 9000!

      • boing
      • 12 years ago

      What’s CD+R?

        • Meadows
        • 12 years ago

        I’m sure he meant DVD+R versus DVD-R. Quite frankly I don’t care which one I’m using, but it’s worth noting that DVD+R has higher fault- and speed tolerance. That’s about it.

    • crazybus
    • 12 years ago

    I like to keep data storage in an external enclosure, so additional speed over USB2 is very nice. Getting the 75MB/s+ hard drives are capable of vs. the 32MB/s of USB is quite a big deal.

    In its current incarnation, eSATA isn’t exactly ideal for laptops due to the lack of support for bus powered operation.

      • moritzgedig
      • 12 years ago

      Then get yourself an iLink/Firewire/1394 enclosure.
      USB isn’t meant to power 3.5″ drives either.
      FW can supply more power than USB.

    • PetMiceRnice
    • 12 years ago

    I’ll just wait for the 3.1 spec to come out that everyone agrees on and avoid a Beta vs. VHS or HD DVD vs. Blu-Ray sort of war. I’ll continue to stick with Firewire for my external hard drives as I have for several years now. I don’t have any other devices that would remotely need anything faster than USB 2.0.

    • Mystic-G
    • 12 years ago

    Nvidia still holds SLI, which is a mixed blessing: they will receive more enthusiast sales at the moment, but if they were to share, they could easily work up a deal with Intel to receive both USB 3.0 and Nehalem compatibility.

    As for AMD, not much to say lol.

    But yea, Intel needs to stop being a d***head and be a more friendly company. Enthusiasts are the ones informed about these things so if you want to appeal to them try not to fragment the market.

      • greeny
      • 12 years ago

      of course then there would be no reason to buy a buggy nvidia board and their core logic business would be finished

      • pogsnet
      • 12 years ago
        • mad dog
        • 12 years ago

        yet we all endorse it .. this must mean we’re not friendly either

    • PeterD
    • 12 years ago

    Euh… different specs… There was something with different specs recently…. euh… what was it? … Oh, yes: HD-DVD and Blu-Ray. Both failed.

      • firerules16
      • 12 years ago

      Oh really? Blu-ray failed? Hmm, I seem to recall buying a brand new blu-ray just the other day. I even think they’re releasing new ones on a weekly basis, if I remember correctly!

        • indeego
        • 12 years ago

        The adoption rate isn’t anywhere near where they expected it to be, especially in the States. This is about more than one format succeeding over another; I’m quite sure the next big format is being worked on to eclipse Blu-ray, so time is not on its side <.<

          • Mystic-G
          • 12 years ago

          On the contrary, the longer Blu-ray is around (by itself), the more time it has to weave itself into the public along with companies. Must I remind you why HD-DVD even died? Not because it was inferior, but because Warner Bros went exclusive to Blu-ray. If there were a product being worked on to be even better, it’d only be in vain at this point in time.

          Intel is a huge name, and whatever standard they come up with, the others usually have to follow suit depending on the topic at hand. As for their tactics along with it, that’s a whole ’nother story.

      • d0g_p00p
      • 12 years ago

      Blu-Ray failed? It’s the standard next gen high def now.

    • Pax-UX
    • 12 years ago

    More stupidity by big business. I’m more interested in a wireless USB standard; we don’t need a USB3. Things are wireless now!!! Remember wireless? We don’t want wires!!!

      • Mourmain
      • 12 years ago

      I want wires, thank you very much. I can do with taking 5 seconds to plug something in, instead of having to deal with issues of security, frequency conflicts and not least the health hazards.

      Plus, the exercise will do you good.

      • Resomegnis
      • 12 years ago

      Wireless is not the way to go. When too many things are wireless it creates problems. Not to mention wireless will NEVER be as fast as a hardwired connection.

      Some things need wireless, mice, internet, etc., but I don’t want EVERYTHING to be.

      I’m more interested in e-SATA w/power.

        • Pax-UX
        • 12 years ago

        Well, I was actually thinking keyboard and mouse without a USB dongle mostly, plus other things if they wanted. Just a wireless USB standard where everybody makes wireless stuff to one standard. Bluetooth is great but just not enough things support it. As for security, that’s always a problem, but it depends on the device. Not overly worried about it though; most wireless KB+MS can be remotely read anyway.

    • tygrus
    • 12 years ago

    Can they please keep a common low-level interface and low-level driver compatibility, so the OS can get some functionality and interrogation of details & features without the need for separate drivers per USB device (host bus or device)? I hate Windows needing you to work out what the device is and load the driver/inf before Windows can actually work out what the device is. Why does the OS need so many custom drivers for so many devices that are so similar (incl. same ASIC but different model)? Plug & Play should mean basic config and basic functions work with a generic driver that can get full details of brand, model and features.

    • Shining Arcanine
    • 12 years ago

    We do not need USB 3.0 right now. I have no idea why AMD, Nvidia and VIA would waste their time making a separate USB 3.0 specification if Intel refuses to release the specification until it is done. It is a complete waste of time and resources.

      • davidedney123
      • 12 years ago

      We very much DO have a need for it – faster transfer from HD camcorders, faster transfer rates for other external storage devices (eSATA is shite – the last thing we need is another port standard), external HD tuners. On top of those (and a pile of others) it opens the door for all sorts of innovative products – a more usable version of the external USB video card, for example.

      Just because you don’t need it doesn’t mean nobody else does. Anyone who ever argues that higher performance isn’t necessary obviously has no idea what they are talking about.

      That said I do agree that it would be a waste of time for these companies to develop their own standard for the sake of a 6 month wait, and I doubt AMD could afford to do so anyway – they’re just sabre rattling.

      Dave

        • Shining Arcanine
        • 12 years ago

        Most storage solutions do not exceed 60MBps transfer rates. There is no present need for USB 3.0 for HD camcorders.

        Internal TV Tuners are superior to External TV Tuners.

        Ceil(1920 * 1080 * 30 * 4 / 1024^2) = 238MBps

        There is a need for more bandwidth for External TV Tuners, but present PCI Express x1 slots allow for 250MBps transfer rates and existing external tuners will not benefit from the technology. The only thing that the creation of USB 3.0 hardware will accomplish is the creation of the need for USB 3.0. There is no present need for it now.

        As for a more usable USB video card, the bandwidth will only be 600MBps whereas PCI Express 16x provides 4GBps of bandwidth. I have a cousin who has a Dell computer that uses the BTX specification, integrated graphics and has no PCI Express or AGP slots. He cannot upgrade his video card to anything decent. In the future, more computers will be made like this. Not only that, but our games will be more demanding. I sincerely doubt that the 600MBps of USB 3.0 will provide any improvement in tomorrow’s games over integrated graphics that the 133MBps of PCI presently provides in today’s games.
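(Editor's note: the 238MBps figure a few paragraphs up checks out; a minimal sketch of the same arithmetic, assuming uncompressed 1080p at 30fps with 4 bytes per pixel:)

```python
import math

# Uncompressed 1080p video: 1920x1080 pixels, 30 frames/s, 4 bytes per pixel
bytes_per_second = 1920 * 1080 * 30 * 4        # 248,832,000 B/s
mbps = math.ceil(bytes_per_second / 1024**2)   # convert to MB/s, rounding up
print(mbps)  # 238
```

That puts a single uncompressed 1080p stream just under the ~250MBps of a PCI Express x1 slot and well above what USB 2.0 can deliver, which is the comparison the comment is drawing.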

    • albundy
    • 12 years ago

    hope the ftc fries them good. they abuse their power whenever they can. eSATA forever. do they really think manufacturers will choose their expensive crappy spec? is it even wireless for that matter? or are they just gonna peddle utter cr@p.

    • cerenity
    • 12 years ago

    It might be a clueless question, but in the spirit of interoperability, how does this differ from Nvidia keeping their SLI specs to themselves? If it is a clueless question, please do clarify and correct me.

      • Smurfer2
      • 12 years ago

      Interesting comparison. What I thought when I read this is that this is bad. I can’t think of an input standard that is more important than USB. We use it for mice, keyboards, printers, flash drives, external hard drives (sometimes), scanners, digital cameras, etc… So messing or fooling around with an update of the standard is a lose-lose. The other companies seem to have no real choice other than to wait…

      Okay, with that said, your comparison is insightful. However, the difference between SLI and USB is the fact that most computer users will not be using SLI, either because they don’t know what it is or have no use for it. (like me, I game, but prefer one good card over 2 mediocre ones) While USB is used by almost every major input device. So, I wouldn’t say that is the greatest analogy.

      • alex666
      • 12 years ago

      What smurfer said. USB is so widely used by so many, easily well over 50% of all PC users if I had to venture a guess. In contrast, SLI likely is used by less than 1%. So the USB 3.0 conundrum has the potential to affect many many more people.

    • bdwilcox
    • 12 years ago

    I can’t wait for the BluBus vs. HDBus war.

    • Nitrodist
    • 12 years ago

    I don’t care that Intel is “hogging” all of the work that they did. For heaven’s sake, don’t split any standards ever… ever… evverrrrr….. ever……………

    ever………

    ever….

    USB 3.0 will fail……..

    Damn it!

    And this is just pure speculation from some random person!

      • Smurfer2
      • 12 years ago

      Wow, that quote is simply amazing! 😛

    • ludi
    • 12 years ago

    If it were just AMD and/or VIA making noise, this would suffer crib death. With Nvidia signed up, Intel will have to cough up or get subrogated into whatever the other three come up with, especially with the other three having substantial presence in both portable devices and industrial systems.

      • eitje
      • 12 years ago

      psch, why does nvidia matter? are they going to leverage their ultra-influential ESA spec against Intel? 😛

        • flip-mode
        • 12 years ago

        Prolly cuz Nvidia usually doesn’t fail to deliver.

          • ludi
          • 12 years ago

          Heh…that, and the fact that an Nvidia/AMD/VIA triple play pretty much covers any x86 chipset that isn’t made by Intel, or in other words, the other half of the platform industry. AMD and/or VIA don’t have enough clout to push it alone or as a duo, so if Nvidia merely said “we’ll wait and see”, Intel wins.

    • Krogoth
    • 12 years ago

    Great, it is SATA-I all over again.

    >_<

    Besides, what makes USB 3.0 so darn special? There are quite a few high-bandwidth interfaces already available on the market.

      • ish718
      • 12 years ago

      I think it’s the reputation lol

    • indeego
    • 12 years ago

    Internet meme that died before it started <:<

    *“____ is on point.”*

      • eitje
      • 12 years ago

      you know where you are, indeego.

        • Meadows
        • 12 years ago

        Indeed, he knows that he’s *on point*.

      • ludi
      • 12 years ago

      Uh-oh…indeego is *on point*.

    • Kurotetsu
    • 12 years ago

    Intel tried this before with USB 1.0. It was resolved when a bunch of competitors got together and eventually released USB 1.1, making Intel’s implementation worthless. The Inquirer talks a bit about it (I know, I know, it’s The Inquirer, but I don’t think they can mess history up THAT much).

    http://www.theinquirer.net/gb/inquirer/news/2008/05/09/intel-plays-games-usb3

    Though now it’s even worse, since eSATA is already out there being implemented on a ton of boards, plus the ‘Power over eSATA’ revision should be coming soon and will be backwards compatible. I could see board makers just giving USB 3.0 the finger and going all the way with eSATA.

      • doobtribe
      • 12 years ago

      I believe I speak for many when I say I would love to see that happen!

        • Deli
        • 12 years ago

        seems like only mid-high to high-end boards have eSATA, which is unfortunate.

      • BobbinThreadbare
      • 12 years ago

      Yeah, but I don’t think SATA is a good interface for printers, cameras, mice, keyboards, etc.

        • Kurotetsu
        • 12 years ago

        Firewire gets another chance at life?

          • Peffse
          • 12 years ago

          I would think most would keep to USB 2.0 for printers, scanners, etc… versus paying high royalties to Apple.

        • Corrado
        • 12 years ago

        Do you really think your printer needs more than 480mbit/sec? Or your mouse and kb? Let’s be honest here… the USB 3.0 spec is for storage.

        • Forge
        • 12 years ago

        Yeah, cause the printers of the world are just STARVING on 60MB/s.

        I had this conversation earlier tonight already.

        The only devices that are really suffering under USB 2.0 that USB 3.0 might help would be external HDDs.

        What are external HDDs doing about it? eSATA.

        Why do we need USB3 again?

          • data8504
          • 12 years ago

          You seen many laptops with eSATA connectors recently?

            • Farting Bob
            • 12 years ago

            Laptops may not have eSATA ports right now, but apart from the very high-end laptops, do you see USB3 being adopted any faster? It’s not coming out for a long time yet, and there are likely to be only a few products that support it initially. eSATA in the meantime will gain a bigger foothold.

            • bthylafh
            • 12 years ago

            Who honestly needs external HDs to be that fast? Is anybody really using them for running big applications or data sets?

            I’d expect by the time you’re saturating USB2, you’re also hogging the processor pretty badly. More’s the pity that Apple couldn’t have made Firewire a non-royalty tech to begin with.

          • davidedney123
          • 12 years ago

          So that we can stick with just one port for everything? I seem to be alone in thinking eSATA is a fudge and can’t die soon enough.

          Dave

      • random_task
      • 12 years ago

      According to Wikipedia (http://en.wikipedia.org/wiki/Serial_Attached_SCSI#SAS_vs_SATA) both ATA and SATA (they both use the same instruction set) only support disk drives and hard drives. Apparently SAS can support printers and stuff though… odd.

        • evermore
        • 12 years ago

        Serial Attached SCSI. SCSI has always been able to support a wider array of devices (and both internal and external to boot). The same commands/protocol is used for SAS so it makes sense the same hardware would be supportable (I doubt anybody’s making SAS scanners though). SATA doesn’t implement enough of the SCSI technology to let it handle all those devices.

    • Mithent
    • 12 years ago

    This doesn’t sound like a terribly good development, unless both are entirely compatible.. even then, it’s likely to confuse people if they go by different names. USB’s been so successful because it’s so easy, you just plug in almost any device to the same ports and it works. If this leads to only certain devices working with certain computers it’s not going to be beneficial.

    Intel having too much influence is bad, but fragmenting the USB standard could be worse.

      • TheTechReporter
      • 12 years ago

      Yes, fragmenting USB would be _much_ worse.
      AMD says “we have no choice”, but they _could_ just wait and avoid totally screwing over all their loyal customers. Yes, waiting would be the correct choice.

        • d2brothe
        • 12 years ago

        They could wait, and have nobody buy their chipsets for lack of USB 3.0 for six to nine months; yes, that sounds like a very sound business decision. A very small percentage of the population would know, let alone understand, that there are two different standards and what that means.

    • ish718
    • 12 years ago

    Intel is $#*^
    They like to abuse their power too much even if it hurts users in the end

    • Sargent Duck
    • 12 years ago

    /[

      • AMDisDEC
      • 12 years ago

      The answer is a BIG, NO!
      AMD doesn’t assign any engineers to industry standards bodies and one of the primary reasons they fail to succeed in advancing industry standards.
      It’s not that AMD doesn’t create some useful standards on their own, but rather, they are very piss poor at building the ecosystems required to make them succeed in the global market place.

      Intel, on the other hand, has engineers assigned to each major industry standards group.

        • ludi
        • 12 years ago

        Well, if that’s the case, it’s yet another reason why Ruiz needs to go. AMD might not be able to lend out very much support but they made enough money from K7/K8 on Ruiz’ watch, they should have at least been able to assign one engineer to all of the major standards groups that directly affect AMD’s interests in the marketplace. That person would at least be in the position to give direct input and report back to AMD what was going on.

        Running the place like a boistrous second-source shop was JSIII’s job, and he knew how to do that well and when to get out of the way. Ruiz had a chance to move beyond that, and so far his most noteable move was the necessary, but sloppily handled, ATi acquisition.

          • ssidbroadcast
          • 12 years ago

          ludi is on point. Yeah, I already read Indeego’s comment.

            • AMDisDEC
            • 12 years ago

            🙂 ,,,,,….

          • AMDisDEC
          • 12 years ago

          AMD’s opinion on this is to let other companies define and fund specs, while they follow IF the spec looks like it will be successful.
          This is AMDs version of risk mitigation.
          The down side of this is, you don’t get to define or influence the spec, and you are never first to market.

          The favorite weak line deployed by AMD managers is, we get the low hanging fruit. They don’t seem to grasp that low hanging fruit comes with low average selling prices and low profit margins.

          In the meantime, AMD executive overhead is one of the largest in the industry. I can’t see how they will ever recover.
          Instead of asset Light, they should be management lite by outsourcing their upper management to China.

        • AMDisDEC
        • 12 years ago

        i ta ki yo.(1a) i po Ki se. (1b) i ta ki yo.

        • AMDisDEC
        • 12 years ago

        Pure King Yo, is pure moral grandeur (and) a pure Propriety.

    • henfactor
    • 12 years ago

    Makes me feel mad/guilty about recommending that E8400 for my friend.

      • Dposcorp
      • 12 years ago

      and this is just one more of the reasons I just bought a second Phenom setup.
      (I’ll post about it soon as I finish the install)

      Consumers can only speak and be heard with the cash they spend.

      My spending says I refuse to let Intel go back to the Intel of old.

      For how many years did we put up with Intel’s overpriced, same old stuff?

      It took AMD to bring out cheap 64bit, dual core CPUs to really push Intel.

    • Hdfisise
    • 12 years ago

    The question is, are Intel purposely excluding AMD, Nvidia and VIA in order to keep control of the specification a bit longer, or have AMD etc. just not bothered to help Intel? I expect the former.

      • Master Kenobi
      • 12 years ago

      I would find it highly unlikely that AMD, nVidia, or VIA contributed any funds/engineers to developing USB 3.0. Intel developed the spec, and the others will get it before it even takes off. This just gives Intel a slight advantage out of the gate. By the time USB 3.0-compliant devices start showing up at stores, nVidia, VIA, and AMD will have it.

        • d2brothe
        • 12 years ago

        Yes, but in terms of chipsets with USB 3.0 support, people will want those…even before devices are available. Intel will have an advantage here.

    • BoBzeBuilder
    • 12 years ago

    AMD, Nvidia, and VIA should unite and crush Intel.

    BoBzeBuilder is on point.

    Thanks.

    Bob

      • eitje
      • 12 years ago

      can you put yourself on point?

      *starts flipping through the TR rulebook*

        • Price0331
        • 12 years ago

        Yes, you can, the internet makes those kinds of things possible, even if they shouldn’t be.

        But nonetheless, I do hope they come together and release a 3.1 standard, and not let Intel have the specs for it either.

          • eitje
          • 12 years ago

          ah, yeah, you’re right – there it is, on page 45.

      • marvelous
      • 12 years ago

      Crush Intel? Hahaha… You made my day.

      • albundy
      • 12 years ago

      yeah, but then you’ll be left with shitty processors.
