Nvidia CEO talks down CPU-GPU hybrids, Larrabee

Yesterday at its Financial Analyst Day conference, officials from Nvidia talked for quite a while (about six hours) about the firm’s position in the market. They addressed, at some length, how Nvidia plans to counter the AMD and Intel CPU-GPU hybrids and Intel’s upcoming Larrabee graphics processor. Nvidia CEO Jen-Hsun Huang was quite vocal on those fronts, arguing hybrid chips that mix microprocessor and graphics processor cores will be no different from systems that include Intel or AMD integrated graphics today.

According to Huang, development cycles for processors are longer, and customers will continue to find value in pairing integrated or discrete Nvidia graphics with a processor that already includes a graphics core. Besides, Huang believes Intel’s promise of ten-times-greater integrated graphics performance by 2010 will yield hardware barely at the level of current mainstream Nvidia GPUs. Nvidia’s vision is that consumers will need relatively powerful GPUs as raw processing devices in the future, too. Later in the conference, software firm Elemental Technologies gave a demonstration of its H.264 video transcoding software—the kind you need to convert videos for an iPod—running about 19 times faster on an Nvidia GPU than on a quad-core processor. Huang thinks video transcoding in such instances ought to be “instantaneous” in the future.

Huang showed a slide suggesting consumers spent 7% less on processors and 14% more on GPUs last year compared to 2005. Nvidia believes users will get a better experience from pairing a less powerful CPU with a more powerful GPU, and Huang cited the commercial success of Gateway’s P-6831 FX—a $1,349 laptop that pairs a 1.66GHz Core 2 Duo with a GeForce 8800M GTS—as evidence. The laptop has been hailed as the “best midrange gaming notebook ever” by Anandtech, and it reportedly sold out well ahead of schedule at Best Buy. The success of GPUs will only increase in the future, Huang projected.

Of course, all this talk about powerful graphics processors capturing more of the market wouldn’t be complete without a mention of Larrabee, Intel’s upcoming discrete GPU.

Huang summed up Nvidia’s position on Larrabee in one sentence: “We’re gonna open a can of whoop-ass [on Intel].” The Nvidia CEO’s other arguments weren’t quite as strongly worded, but he explained that Nvidia is continuously reinventing itself and that it will be two architectural refreshes beyond the current generation of chips before Larrabee launches. Huang also raised the prospect of application and API-level compatibility problems with Larrabee. Intel has said Larrabee will support the DirectX 10 and OpenGL application programming interfaces just like current AMD and Nvidia GPUs, but Huang seemed dubious Intel could deliver on that front.

Huang even argued that Larrabee’s x86-derived architecture is a rehash of Itanium—the ill-fated “successor” to x86 processors—in the way it throws away “billions of dollars of investment” on current graphics architectures. Huang said CUDA, Nvidia’s C-like application programming interface for general-purpose GPU computing, is already very successful, and suggested Intel might have a hard time displacing it with a completely different product based on a radically different architecture. Nvidia VP Tony Tamasi chipped in: “Larrabee is a PowerPoint slide, and every PowerPoint slide is perfect.”
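For a sense of what that “C-like” label means in practice, here is a minimal, hypothetical CUDA sketch (the kernel name, frame size, and launch parameters are illustrative rather than taken from Nvidia’s materials). Each GPU thread processes a single pixel, the kind of fine-grained data parallelism that lets workloads such as video processing spread across a GPU’s many cores:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical kernel: brighten every pixel of a grayscale frame.
// Each GPU thread handles exactly one pixel, so the work spreads
// across however many cores the GPU provides.
__global__ void brighten(unsigned char *pixels, int count, int amount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < count) {
        int v = pixels[i] + amount;
        pixels[i] = (unsigned char)(v > 255 ? 255 : v);  // clamp to 8-bit range
    }
}

int main()
{
    const int count = 1280 * 720;        // one 720p grayscale frame
    unsigned char *d_pixels;

    cudaMalloc(&d_pixels, count);        // allocate the frame on the GPU
    cudaMemset(d_pixels, 128, count);    // fill with mid-gray test data

    int threads = 256;
    int blocks = (count + threads - 1) / threads;
    brighten<<<blocks, threads>>>(d_pixels, count, 40);
    cudaDeviceSynchronize();             // wait for the kernel to finish

    cudaFree(d_pixels);
    puts("done");
    return 0;
}
```

Scaled up to the heavier stages of a video pipeline, that per-element parallelism is the basis for the kind of GPU transcoding speedups Elemental demonstrated.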

Tamasi went on to shoot down Intel’s emphasis on ray tracing, which the chipmaker has called “the future for games.” He prefaced his criticism by mentioning that the popular mental ray ray tracing renderer is owned by Nvidia, and that he has nothing against ray tracing in particular. However, his view is that ray tracing is just another tool in the toolbox and that the future of real-time graphics will involve a mix of rasterization and ray tracing—the very same approach used in movie rendering by the likes of Pixar. Additionally, Tamasi believes rasterization is inherently more scalable than ray tracing. He said running a ray tracer on a cell phone is “hard to conceive.”

So, with Intel about to jump in the discrete GPU market, will Nvidia counter by moving into the microprocessor market? Huang hinted such a move isn’t part of Nvidia’s strategy for the time being. “We’re gonna be highly focused on bringing a great experience to people who care about it,” he explained, adding that Nvidia hardware simply isn’t for everyone. He illustrated that point with two examples. One was Google, which he said doesn’t need to make an operating system to compete with Microsoft. Another was the iPhone, which Huang thinks isn’t as good as a Blackberry for e-mail, even though it does have e-mail functionality. Nonetheless, Huang added, “I would build CPUs if I could change the world [in doing so].”

Even if Nvidia doesn’t care to make CPUs, it doesn’t mind taking microprocessor market share away from Intel—either directly or indirectly. One crucial battlefield in the coming months and years is the low-cost and mobile space, and Huang argued Nvidia is armed and ready on that front. First, Nvidia is readying a platform to accompany VIA’s next-generation Isaiah processor, which should fight it out with Intel’s Atom in the low-cost notebook and desktop arena.

Another of Nvidia’s weapons is APX 2500, the applications processor for mobile devices scheduled to hit production later this quarter. The APX 2500 squeezes an ARM11 microprocessor core, a high-definition audio/video processor, a GeForce graphics processor, and a DDR memory interface into a package smaller than a dime, with power consumption low enough to allow for up to ten hours of continuous 720p high-definition video playback. The APX 2500 (or a future iteration of it) may compete with the system-on-a-chip successor to Intel’s Atom, which is code-named Moorestown and is scheduled to appear in 2009 or 2010.

Comments closed
    • pani_alex
    • 12 years ago

    Is this VIA CN + GeForce combo available, or is it just a plan?
    That’s what I was waiting for: a low-power PC with Nvidia graphics.

    • Fighterpilot
    • 12 years ago

    Intel might be reaching to make one as fast as Nvidia’s best, but as for having the know-how to do it… couldn’t they just take a G92 to their engineers and say “make a better one”?

      • Chrispy_
      • 12 years ago

      No. GPU design is the easy part.
      The hard part is getting your design to do what it’s supposed to do, getting it to a stage where it’s suitable for fabrication, and making sure it’s stable at a die size and with yields that will be economically viable. Even when that much is done, it needs optimisation; otherwise it’ll likely be slower than the GPU it replaces.

      Then you’ve got drivers to write 😐

        • Fighterpilot
        • 12 years ago

        Actually I think GPU design is the /[

          • gtoulouzas
          • 12 years ago

          Intel’s first venture into “3d acceleration”, the i740 chipset (roughly equivalent to a Riva128, iirc), was positively riddled with driver problems. I’m not saying it has to be this way with larrabee, just that the intel brand is no guarantee of a lean-mean graphics driver team.

          Edit: Oops! Now I see that you were specifically suggesting that the change of architecture would allow for such an improvement. My error.

      • kilkennycat
      • 12 years ago

      There might be just the minor issue of patents. Both nVidia and ATi have been developing substantial portfolios of GPU-related patents for many years. And nVidia is no doubt building a further set of patents with regard to architectures and implementations in the massively parallel processing domain of their Tesla (GPGPU) initiative.

    • kilkennycat
    • 12 years ago

    Money and market dominance do not a winner make… in a new realm. And for Intel, simultaneous state-of-the-art graphics together with serious number-crunching (AI, physics, etc.) in a SINGLE desktop or mobile-domain CPU is surely new territory. Larrabee will not be customer-available until 2010 at the earliest, so Intel’s current graphics-dominance rant is only an Intel Marketing “paper tiger.” Yes, they are throwing pots of money at Larrabee, rapidly building the Larrabee engineering team, etc., but regardless of hardware architecture, their struggle is uphill on at least two fronts: power management and efficient high-performance graphics/parallel-processing software.

    On the subject of power management, take for example current state-of-the-art gaming technology such as Crysis. This game loads BOTH multicore CPUs and graphics engines to the full. Even with 45nm CPU technology and 55nm GPU technology, we are looking at over 300 watts of combined power (with multiple GPUs, and that’s discounting ALL the memory and motherboard chipset power) to run this game at 1600×1200 with the graphics maxed. And regardless of nay-sayers to the contrary, Crysis is very efficiently coded. So Larrabee is going to have this capability built into a SINGLE silicon die, even at 32nm? Mandatory water cooling for all desktop CPUs? Pull the other one, it might have bells on it. Both nVidia and ATi pull all sorts of on-chip power-management tricks to keep GPU dissipation down. Think Intel has anything better in that department up its sleeve? Then think again. Both nV and ATi are way ahead of Intel in the silicon power-management department. They have had to be, since (a) they do not have access to the most power-efficient silicon processes and (b) their GPUs are massively parallel processing engines with vastly higher numbers of computations per clock cycle than ANY Intel processor, including Itanium.

    And on the subject of parallel-processing software/hardware combinations for both graphics and intense number-crunching, look no further than nVidia’s CUDA and its employment with nV’s GPUs in the Tesla initiative. Industrial number-crunching at the desktop is an exponential-growth business for nVidia, with huge profit margins. Or look at nVidia’s near-total ownership of real-time broadcast special-effects hardware and software with the Quadro series. Check around the National Association of Broadcasters show now opening in Las Vegas for confirmation of nV’s penetration.

    By the time Larrabee arrives, nV will be two more hardware generations ahead (and on TSMC’s 45nm process) in both GPU graphics and the application of GPUs as GPGPUs to computationally intensive tasks. Whether in industry or games, nVidia will have both hardware and MATURE software toolsets to handle those tasks. No doubt these tools will also take advantage of whatever Nehalem or Larrabee will offer as general-purpose compute engines, but the notion of a CPU such as Larrabee SOLELY addressing the demands of high-performance graphics in combination with the advanced AI and advanced physics demanded by games in 2010 is purely Intel marketing hot air.

    Maybe Larrabee will make some inroads as the combo CPU/graphics core of the successor to the Nintendo Wii, but as for addressing the gaming needs for the successors to the PS3 and Xbox360 in 2011-2012, expect a CPU core still in combo with a GPU core… probably from nVidia, if AMD/ATi does not survive the current financial turmoil.

    • chinese_farmer
    • 12 years ago

    ah, pride comes before a fall….

    When will the fall come? When Intel decides to make it painful.

    And if we don’t have AMD around… ohh there will be a very monopolistic Intel on the CPU side…

    • tejas84
    • 12 years ago

    Jen-Hsun Huang is a pretty inspiring guy. I am quite shocked that I am saying that since I have been until recently an ATI guy! He is spot on though and I look forward to Nvidia eventually acquiring VIA and taking on Intel and crushing AMD

    • Protip
    • 12 years ago

    I think it’s AMD’s graphics+cpu hybrid that looks promising. Just judging from Intel’s integrated graphics, I can see why nVidia doesn’t feel too threatened.

    That said I think Intel’s push to be a more serious graphics contender can only improve the graphics market. If some of their ray tracing tech pans out you can bet nVidia and AMD will want to adopt it. I just hope a patent war doesn’t crop up.

      • Entroper
      • 12 years ago

      I think AMD’s strategy is to complement, not replace, GPUs, and that seems to be the smarter thing to do.

    • AMDisDEC
    • 12 years ago

    As usual, Huang’s analysis and strategy are dead on. His company will continue to prosper and grow for the next decade.

      • Grigory
      • 12 years ago

      Haha, you had me for a second! 😀

    • Krogoth
    • 12 years ago

    Nvidia is worried that the average Joe and the kid gamer will lose all interest in discrete solutions, because Intel’s integrated GPUs are starting to perform decently instead of being made of fail at 3D graphics.

    The hardcore market is simply not enough to keep discrete GPUs alive. Look at good old 3dfx, which tried to stay afloat with only hardcore users while ignoring the mainstream market.

    • DrDillyBar
    • 12 years ago

    I’d love Intel to surprise them in this.

    • tfp
    • 12 years ago

    Wow I would have expected him to say that Intel is right and Nvidia is doing everything wrong…

    • no51
    • 12 years ago

    It’d be awesome if AMD/ATI came out of nowhere in this argument.

      • srg86
      • 12 years ago

      I have to admit it would be good to see the smile wiped off his face.

    • UberGerbil
    • 12 years ago

    When you start talking pre-emptively about your competitor’s vapor, you’re officially worried.

      • Jigar
      • 12 years ago

      Exactly my thoughts. They seriously want to know what’s going on under Intel’s hood.

    • eitje
    • 12 years ago

    q[

      • UberGerbil
      • 12 years ago

      That’s typical Huang, though. Ego wins over “professional decorum” every time. Not that that’s unusual among tech CEOs (see Larry Ellison, Scott McNealy, etc).

        • gtoulouzas
        • 12 years ago

        …and Steve Ballmer! Even after all these years, his orangutan antics still bring a smile to my face.

    • provoko
    • 12 years ago

    That pie chart of the GPU is ridiculous. Nice try, Nvidia; maybe if your chipsets didn’t run so hot and GPUs weren’t so expensive, people would actually believe you.

      • lolento
      • 12 years ago

      A 1:1 ratio of CPU to GPU in any given PC may not be far-fetched. My last five computers, going back to 1998, have been in this ratio. Add to that the fact that SLI and CrossFire configurations seem to be maturing, and I think this unity ratio should be achievable (maybe even before 2012).

    • A_Pickle
    • 12 years ago

    I agree with Huang that a lower-end CPU paired with a higher-end GPU will be a superior experience for users, and may bring PC gaming back into competition with consoles (especially if laptops like that sell in greater numbers), but…

    …I have a hard time respecting the man when Nvidia’s drivers suck hardcore, and when, curiously, at the same time that ATI is having trouble competing, the newly released GeForce 9 series is breathtakingly… underwhelming.

    Hmm.

      • srg86
      • 12 years ago

      As I mostly run compilers and other CPU based stuff rather than play games, I’m quite happy to have a lesser GPU and a more powerful CPU.

      I agree on the nVidia driver front. My dad’s nForce 3-based machine will probably never run Vista because of a lack of drivers.

        • A_Pickle
        • 12 years ago

        And I can see eye to eye with that, too. The only people who really want crazy fast graphics cards are gamers — other people have /[

          • bfellow
          • 12 years ago

          Those with super-fast CPUs also consume more power than super “slow” CPUs. I mean, what difference do you see between a 9100LE and a 9850BE in regular day-to-day use, or between an E8200 and an X6800?

            • srg86
            • 12 years ago

            I agree with this too. For day-to-day stuff on XP with a decent amount of RAM, you won’t really see a difference between a Core 2 and a Pentium II.

          • srg86
          • 12 years ago

          The only other people, I think, would be those who want media PCs. Then again, the nice video encode/decode stuff (although I don’t need fast 3D, that would be a useful feature to me) can still be implemented on low-end, relatively low-power cards (only used when needed).

          • eitje
          • 12 years ago

          i think as we see more graphics-rich desktops (think compiz and aero), low-to-midrange GPUs will become almost necessary. i think that’s why they’re talking about the elimination of chipsets, though – the era of the headless chipset is coming to a close.

        • l33t-g4m3r
        • 12 years ago

        The nForce4 needs some driver updates too.

    • ReAp3r-G
    • 12 years ago

    i love the image! the NV gray eye…but better! where did you guys get that? i want it! :p
