Nvidia sees itself making x86 SoCs eventually

With both AMD and Intel playing up CPU-GPU integration and Intel prepping its own discrete graphics processor, Nvidia is in a precarious position—it has graphics processors but no x86 microprocessors to go with them. Nvidia CEO Jen-Hsun Huang didn’t sound too worried back in April 2008, saying GPUs were the future yet Nvidia hardware “simply isn’t for everyone.” He added, “I would build CPUs if I could change the world [in doing so].”

Nvidia Senior VP of Investor Relations Michael Hara spoke in a somewhat different tone at the Morgan Stanley Technology Conference in San Francisco this week. According to Bit-tech, the executive was asked whether Nvidia would ever launch a microprocessor. He replied, “The question is not so much I think if; I think the question is when.” Bit-tech jotted down the rest of his response:

“I think some time down the road it makes sense to take the same level of integration that we’ve done with Tegra,” said Hara. “Tegra is by any definition a complete computer on a chip, and the requirements of that market are such that you have to be very low power, very small, but highly efficient. So in that particular state it made a lot of sense to take that approach, and someday it’s going to make sense to take the same approach in the x86 market as well.”

Hara made it clear that Nvidia is aiming for a “highly integrated system-on-chip” rather than a high-end x86 processor that can compete with Intel’s latest and greatest. (Tegra already fits the system-on-a-chip description, but it includes an ARM processor that can’t run x86 software.) Hara expects such a product to come out in two or three years.

Comments closed
    • pogsnet
    • 11 years ago
    • Pax-UX
    • 11 years ago

    Not really a big deal for the desktop, but since computers are moving away from desktops to laptops and smaller form factors, nVidia could be in for some hurt; once you leave the desktop, it’s all about low power. The other two can create complete end-to-end solutions while nVidia can’t, which makes it a harder sell.

    • kitsunegari
    • 11 years ago

    It’s ironically interesting that iNTEL’s performance crown in the CPU game derives not from the monolithic stature and resources so often uniquely attributed to the company, but rather from the very same attribute that facilitates its chief competitors’- nVIDIA and ATi (or for that matter any business’)- market relevance: FOCUS.

    By heavily out-investing everyone else in the x86-core IPC market iNTEL itself created through cutting-edge process-node transitions, they can effectively maintain control of the PC platform from the “top down”.

    This resource allocation comes at the expense of their chipsets and graphics, which have no doubt long since played second fiddle in both performance and R&D spending.

    This fiscal paradigm all changes with “Larrabee,” which seeks to reconcile the R&D returns of their CPU investments with their graphics/HPC endeavors by effectively marrying/consolidating the x86 IPC with those products.

    Thereby- and this is in iNTEL’s wildest wet dreams mind you- ushering in a veritable vertically integrated IPC evolutionary monoculture for most of the computing industry.

    This “top to bottom” approach to market dominance that iNTEL has taken is, I believe, in contrast to that of AMD. By starting with HPC (look no further than their success in Opteron-based enterprise servers prior to i7) and graphics (ATi), we can see the rationale behind not only the “ball-busting” purchase of ATi but also their purported emphasis on this idea (i.e. marketing) of “platform”.

    Put simply, iNTEL is the karate chop strategy; AMD is the uppercut.

    The opportunity nVIDIA has as a fabless “odd man out” company stems from marrying HPC graphics with non-x86 RISC- read ARM- based architectures (i.e. future iterations of TEGRA) and enjoying efficiency scaling that way. Sure, ATi- I mean AMD 😉 -could do that too, but they’ve got their bets in the same x86 hedge as iNTEL, so I see more plausible market possibility of “team green” rallying the necessary power players to form a triumvirate that challenges the inevitability of x86 everywhere: graphics- nVIDIA, IPC- ARM, and OEM- APPLE. (Of course, recent news indicates they might egregiously forfeit that potentiality and decide on fielding their own x86 chips- seriously Chen, either buy VIA or spend on an ARM SoC.)

    As the computing power needs of the vast majority of consumers in the 21st century (read: anyone with electricity) level out against the various IPCs, the opportunity for core-logic change is ripe for the picking by any company with the foresight to capitalize on the intersection of performance, price, and subsequent inter-polarity.

      • elpresidente
      • 11 years ago

      Nvidia is the name

    • fpsduck
    • 11 years ago

    If Nvidia can make x86 things,
    then a dual-socket CPU from them shall be called… er, an SLI CPU?

      • Meadows
      • 11 years ago

      I can’t wait for their proprietary SLI+SLI system optimisations.

    • ssidbroadcast
    • 11 years ago

    Cyril, is that render provided by nVidia, or something you whipped up in 3ds Max?

      • ludi
      • 11 years ago

      Since it has an Nvidia logo on it, I’d reckon it most certainly came from the manufacturer. It wouldn’t be wise for a site like TR to go generating and posting in-house images with a brand logo on them unless the brand owner had solicited and approved them.

      • ecalmosthuman
      • 11 years ago

      I thought the same thing for 2 seconds.

    • ThomasDM
    • 11 years ago

    My guess is NVIDIA will use the Intel Atom; remember the announcement that TSMC will be making Atom SoCs for Intel?

      • eitje
      • 11 years ago

      I’m backing you as my candidate.

      • UberGerbil
      • 11 years ago

      I doubt it. Intel doesn’t want nVidia to be offering a separate platform that challenges their own Moorestown. What they want is to enable their customers to take Intel’s platform and modify it to meet their needs. Now, if one of the big handset mfrs — Samsung/Sony/LG etc — came to them and said “Hey, we want to use your Lincroft assets with some of nVidia’s Ion assets to put together something that makes our product uniquely great” Intel might say yes, but that’s because the result is a black box not an open platform. If nVidia came to them with the same proposal with the idea of turning around and selling the result to Samsung/Sony/LG etc, Intel would say no.

      Actually, the customer most likely to get Intel to say yes to this kind of shotgun marriage is probably Apple. (And that…)

        • Rurouni
        • 11 years ago

        Basically, the concept of an SoC is integrating everything into a single chip… If Nvidia wants to license Atom and integrate it with their GPU, I believe nothing will stop them. The problem is that they probably can’t brand it like it’s their chip, meaning it won’t be an Nvidia x86, but an Intel x86 + Nvidia GPU SoC.

          • UberGerbil
          • 11 years ago

          Intel will stop them, if they choose to. Intel doesn’t have to license it to anybody they don’t want to. Of course, nVidia might offer them some of their GPU patents to get it, but I don’t know that they want to do that either.

    • DrDillyBar
    • 11 years ago

    But will it play Crysis?
    *ducks*

      • ludi
      • 11 years ago

      I’m sure it will play Crysis just fine, given sufficient AI programming. Now, I rather doubt it will…

      • ReAp3r-G
      • 11 years ago

      I lol’d.

      If it does, I’ll take 2 😀

    • HighTech4US
    • 11 years ago

    nVidia does not have to develop an x86 core. They can either buy out VIA/Cyrix or buy the IP from the now-defunct Transmeta or from VIA/Cyrix.

    Oh, and Charlie is still a first-class idiot with a grudge.

      • HurgyMcGurgyGurg
      • 11 years ago

      And I thought maybe Charlie had a change of heart when he wrote the AMD X3 unlocking article…

      Some things don’t change.

      • kilkennycat
      • 11 years ago

      Yep, Charlie is one of the world’s biggest idiots. Appropriate journalistic talent for the (National) Inquirer… Did you see his article dissing the 9400M in every paragraph as “yawn, boring, we’ve seen all this before”? Maybe Steve Jobs should have read Charlie’s article before embedding the 9400M in all of the new MacBooks and killing 20% of Intel’s laptop core-logic business in one swell foop. Intel’s current vendetta against the nVidia Ion and their lawsuit attempt to prevent nVidia building peripheral silicon for the Core i7/i5 signal their fear of open exposure to their Achilles’ heel — their long-proven inability to design a decent IGP. It’s a wonder that Charlie has survived so long without being run over by a bus in a one-way street with him looking the wrong way.

        • MadManOriginal
        • 11 years ago

        I think that most tech-savvy readers of TR and the like know to take his ‘articles,’ especially ones pertaining to NV or graphics, with a big old salt lick. Having said that, I’m not surprised he hasn’t been run over… I haven’t seen any buses with NV advertising on them :p

        • ludi
        • 11 years ago

        He likes the attention, and readers like you can’t stop giving it. Sounds like a successful marriage to me.

    • PRIME1
    • 11 years ago

    At this point NVIDIA could just buy AMD for half the price that AMD bought ATI for.

      • alex666
      • 11 years ago

      I doubt anyone will be buying out anybody else in the near future given today’s economy and the credit crunch.

      • ThomasDM
      • 11 years ago

      Why would they? The x86 license can’t be transferred, and AMD no longer has any fabs either.

        • PRIME1
        • 11 years ago

        I don’t think the license would be an issue. If it was, Intel would have pulled it already, because AMD sold off its fabs.

          • poulpy
          • 11 years ago

          The foundry will be producing the chips, not designing x86 CPUs, which stays the privilege of AMD, AFAIK.
          Pretty sure Intel watched that space…

            • Hattig
            • 11 years ago

            I wonder what the legal situation would be regarding a dual-die MCM package of an nVidia chipset and a VIA Nano CPU… a package is just a tiny, complex motherboard with lots of connectors, ultimately!

          • UberGerbil
          • 11 years ago

          Selling off fabs does not constitute a material “Change of Control” in the entity (AMD) with which Intel holds the cross-licensing agreement. In the event of a merger in which AMD was not the surviving partner, there would be such a change and the licenses would be revoked.

          As it is, Intel did some saber-rattling when it was announced (http://www.engadget.com/2008/10/09/intel-patent-attorneys-kick-some-dirt-at-amd/), but I imagine that was just for PR purposes. Splitting off the fabs into a JV clearly doesn’t violate section 6.2(b)(7), whereas a purchase of AMD by nVidia clearly would (http://contracts.corporate.findlaw.com/agreements/amd/intel.license.2001.01.01.html). And that’s setting aside the FTC (and possibly DOJ) review that such a merger would trigger, since the resulting company would have a lock on virtually all the discrete GPU market. They would probably be required to spin off the former ATI assets to a 3rd party. It would be a bit of a mess and distract nVidia for quite a while. Even assuming the result would be worth it, which is dubious at best.

      • asdsa
      • 11 years ago

      Soon AMD will be the one doing the buying.

    • BoBzeBuilder
    • 11 years ago

    Rebrand the 8800GT to Tegra.

      • SecretMaster
      • 11 years ago

      Congratulations sir! You’ve just been hired as lead designer for the Tegra project due to your bold thinking.

      • kvndoom
      • 11 years ago

      I was leaning more towards the nTium.

        • Rakhmaninov3
        • 11 years ago

        I like this one. nTium.

    • Big-Mac
    • 11 years ago

    Intel has become the biggest roadblock in this industry.

      • pogsnet
      • 11 years ago
    • SoulSlave
    • 11 years ago

    Lawyer Wars!!!!!!!!!!

    • maxxcool
    • 11 years ago

    Shocker 🙁 … didn’t a whole slew of people already call this?
