Nvidia’s Project Denver and the fate of x86 explored

Ars Technica’s Jon Stokes has penned a thought-provoking piece that looks at Nvidia’s Project Denver and what it means for ARM, Intel, and the fate of x86. Many have wondered why Nvidia elected to go the ARM route instead of developing its own x86 processor. In the article, CEO Jen-Hsun Huang characterizes the technical challenges associated with going the x86 route as "solvable" and says that other factors motivated Nvidia’s decision to go in another direction. Among them: the fact that it simply didn’t make business sense to challenge Intel in the x86 market given that AMD has struggled mightily to make a profit doing so. Fair point.

Stokes isn’t convinced that any ARM-based CPU vendor can match the performance of Intel’s x86 designs for the next 5-10 years. He also points out that traditionally low-power ARM designs will have to sacrifice some power efficiency as their performance climbs. This snippet sums it up nicely:

The overall point is that if you absolutely must run some non-enormous number of compute-intensive instruction threads on the CPU (not on the GPU), then you’ll want an x86 CPU for the foreseeable future. If you can offload some of that work to the GPU, then you’ll want an x86 CPU and a good GPU. In either case, the x86 CPU and GPU combo will give you better performance and performance per watt in such a scenario than a comparable ARM chip—again, this is because of Intel’s process leadership, and because Intel has been working on exactly this problem for much longer than any of the upstart ARM guys, with ARM being a total newcomer to the world of performance and out-of-order execution.

Of course, Stokes also notes the rising number of consumers in the market for devices with long battery life and a little CPU horsepower coupled with a powerful GPU. In the next 5-10 years, the popularity of smartphones, tablets, ultraportable notebooks, and even home-theater PCs seems destined to outstrip that of traditional desktops.

With ARM well-suited to those potentially massive markets, Stokes sees two options for Intel. The chip giant could get into the ARM business itself or become more of a "mobile device and consumer electronics company" that sells reference designs that in effect compete with the ARM ecosystem as a whole.

Comments closed
    • Anonymous Coward
    • 12 years ago

    I think the best example of a RISC-ish processor taking on x86 in the desktop market is the IBM chip Apple was calling the G5. It was a mass-market-appropriate derivative of the Power4. Maybe it was competitive with Opteron (at the time) in floating point; if I recall correctly, it lost in integer tasks, and it had no power-efficiency edge. Clock speeds and core counts were comparable to Opteron. Overall an interesting third high-performance desktop CPU choice, but not a winning product.

    • swaaye
    • 12 years ago

    x86 CPUs are mostly RISC designs too and have been since the AMD K5 and Intel Pentium Pro.

    • WaltC
    • 12 years ago

    One of my rare, possibly scathing rants:

    It’s just so amusing to see, every few years, the utter misunderstandings people develop about various technologies--in this case, ARM...;) It never fails that some podunk somewhere rolls out the incredibly uneducated opinion that "this new technology X" is going to replace *everything*! And you know what? It pretty much *never* happens that way. I could recite litanies about how "consoles have killed the PC," or how "iPhones have killed off every other smartphone made," or how "Apple is challenging Microsoft's market share" (when the two companies rarely if ever compete in the same markets), or how "Larrabee means real-time ray tracing and the death of the rasterizing GPU," ad infinitum. And none of these smug, illiterate prognostications turned out to be true in the slightest. None of them.

    OK, let's take that last little tidbit and expound on it for a second. It is particularly relevant here because Jon Stokes at Ars (and I like the Ars site and have frequented it for years) was a media champion for "Larrabee the Real-Time Ray Tracer and Killer of Rasterizing GPUs." Stokes wrote countless (it seemed) articles and article-ettes on "This is What Larrabee Is." He was either at the forefront of the insane and wildly insensate rumor mill concerning Larrabee, or merely one of many running with the same basic idiotic rumors. One thing was certain about Larrabee: Intel, in public or private, never ratified, verified, or otherwise certified the veracity of any of those rumors! Indeed, the few and sparse on-the-record comments Intel ever made about Larrabee were almost always directly contradictory to what the rumor mill was putting out.

    It was almost comical--why lie? It was hilarious!--to watch the spasmodic reactions of all of these Larrabee pundits when their houses of cards came crashing down with Intel's utter and final cancellation of the project, and to watch them engage their various modes of CYA damage control to try and "save" their shredded reputations as "pundits"...;) First it was "Larrabee's cancelled, but not really!", then "Watch for Larrabee 2!", and when that wasn't forthcoming from Intel either, they finally just let Larrabee go to the Unrequited Vaporware Technology Garden in the Sky. Geesh, talk about mania--I almost felt sorry for Intel for all of the garbage speculation written about Larrabee that had no chance of being even partially correct. Larrabee never had a chance of meeting the insane expectations the "Larrabee pundits" created for it; rather than see it miserably fail to meet all of that crazy speculation, Intel wisely just cancelled the whole thing before it was born.

    There really is a lesson here for folks who see conflict around every corner and every technology threatening to usurp every other technology. It just doesn't happen that way in real life. ARM, for instance, was around for years and years before Apple used one in an iPad. But of course it is par for the course that Apple people and pundits have extreme tunnel vision, and the only light they ever see at the end of any tunnel is Apple...;) For these folks, ARM CPUs simply did not exist until Apple deigned to use them in one of Apple's product lines--ditto USB, PCI/PCIe, x86, IDE, etc., ad infinitum. That is why we keep hearing that ARM is "new." It's only "new" to people who have never before heard of it, and it really is appalling to see such ignorance flaunted in such a public fashion.

    The business of the ARM CPU company, long before Apple's involvement, was to create low-power, low-capability CPUs for use in embedded hardware applications. That is *precisely why* Apple became involved with ARM in the first place! ARM greatly predates iPads and the rest of Apple's hardware that employs ARM CPUs. So what Apple is doing is simply leveraging an older technology inside some newer Apple devices--there's just not going to be any direct competition with what Intel and AMD are doing--ever. But that's OK, because those are two disparate markets, and markets which have traditionally coexisted without so much as a qualm.

    So why do these pundits feel compelled to speculate to such a nonsensical degree? Simple: page hits. The wilder and more controversial the topic, the more conflict is journalistically created, which means even more page hits with heavier forum activity, and so on. Caveat emptor. It's sometimes fun reading and responding to the mindless drivel of pundits determined to be sensational--I've been doing it for years, after all--so there must be some socially redeeming quality in there somewhere...;)

    One last word: if only I had a nickel for every time I have read the phrase "People just don't need X or Y technologies!" I would, well, be rich...;) The amazing thing is that, that sentiment notwithstanding, people keep right on buying X or Y technology, and they keep on doing it in ever-increasing numbers! That is fact, not speculation or rumor. Fact is, if either x86 or ARM CPU technology were going to usurp the other's place in the market, it would have happened long before now. It simply is never going to happen, because the uses for both roads of technology are legion. One size does not now, and will not ever, fit all.

    • ET3D
    • 12 years ago

    I can’t really imagine any new software that’d be very CPU hungry, certainly not what you suggest. The reason most people have moved to notebooks and netbooks, and PC prices have fallen a lot, is that there are very few applications which require more than low-end PCs currently supply. For those applications, dedicated hardware would be less costly and less power hungry.

    • Madman
    • 12 years ago

    I don’t quite understand why everyone says ARM will be as power hungry at higher performance levels as x86/x64. ARM is a RISC-like architecture, isn’t it? So basically there is way less hardware logic in the chip, and therefore a large-scale CPU will still be way less complex?

    • Madman
    • 12 years ago

    A little bit of .NET, Silverlight, Java, and the saying “premature optimization is the root of all evil” interpreted in all the wrong ways, and yeah, Notepad will weigh 5GB and crawl like a turtle.

    • dpaus
    • 12 years ago

    Yes, run by an advertising conglomerate so they could target you with laser precision….

    • mutarasector
    • 12 years ago

    I’m still waiting for the new ‘Power ARM’ or ‘pARM’ processors. I understand there are a few different variants in the “pARM” architecture:

    “Chicken pARM” – lower power, single core – less expensive

    “Meatball pARM” – No Mozzarella, but similarly, a tad saltier in price due to multi-meatbal…er, ‘cores’

    “Veal pARM” – A premium priced scaled up performance multicore ARM processor.

    • bhtooefr
    • 12 years ago

    It took 10 years after the release of the 80386 for a native 32-bit OS to go into the mainstream consumer market, 7 years for a native 32-bit OS to go into the mainstream business market, and about 5 years for DOS extenders to allow native 32-bit programs to run.

    Meanwhile, it only took 2 years for a native 64-bit OS to go into the mainstream business market, and 3 years for the same OS to go into the mainstream consumer market. It’ll take a while for the software to come, especially because most software doesn’t need 64-bit, and there’s a tradeoff between the higher memory usage (and higher required memory bandwidth) of 64-bit and the added registers of 64-bit on x86 CPUs. (However, there’s now work on running 64-bit CPUs in a "hybrid mode," where the CPU runs in full 64-bit mode but only 32-bit pointers and values are used. Then you get none of the disadvantages of 64-bit (well, other than needing a 64-bit OS and therefore losing 32-bit drivers) along with its biggest advantages, for applications that don’t need the added address space.)
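
    To make the memory-footprint half of that tradeoff concrete, here is a minimal C sketch (the struct is purely illustrative, not from the comment above) showing why pointer-heavy structures grow on a conventional LP64 64-bit build, where pointers are 8 bytes, compared to a 32-bit or x32-style "hybrid" build, where they are 4:

        #include <stdio.h>

        /* An illustrative pointer-heavy node, typical of linked structures. */
        struct node {
            struct node *next;
            struct node *prev;
            void        *payload;
            int          key;
        };

        int main(void)
        {
            /* On an LP64 build pointers are 8 bytes, so this struct is roughly
               twice the size it would be on a 32-bit build with 4-byte pointers:
               more memory touched per node, and more bandwidth spent walking it. */
            printf("pointer size: %zu bytes\n", sizeof(void *));
            printf("node size:    %zu bytes\n", sizeof(struct node));
            return 0;
        }

    Building the same file with gcc -m32 versus gcc -m64 shows the difference directly; keeping data and pointers 32-bit while still using the extra x86-64 registers is exactly what the "hybrid mode" idea is after.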

    • dashbarron
    • 12 years ago

    Yes you were, Neely! Every day I watch the comments and take notes, then comb through the news ever so carefully to see if your prophetic predictions come true, and you never fail to deliver! Don’t let us down, and please keep reminding us when we fail to catch your successful visions.

    • blastdoor
    • 12 years ago

    That’s a great example of why people *don’t* need massive performance in a home theatre pc. The functionality you describe could be more efficiently provided by a server farm somewhere.

    • NeelyCam
    • 12 years ago

    I meant the comments about ARM having trouble scaling up the performance without losing power efficiency… it was a while ago.

    ARM is doing things better and better, but the big question is “can they do it fast enough”? Note that “fast enough” is a moving target… Everyone and “The Market” will determine what’s fast enough for them given the sacrifices, and we’ll see where the whole thing will land.

    • OneArmedScissor
    • 12 years ago

    About what, lol? The article is saying that Intel can’t keep up their “We can put our CPUs in anything and it works the best!” charade.

    It doesn’t matter one bit if ARM CPUs never become as powerful as Intel’s. They continue to become even better at the job they already do better, and that’s all most people are going to need, even out of “PCs,” which will be basically anything EXCEPT desktops from here on out.

    • NeelyCam
    • 12 years ago

    You’re scaring me… because you’re probably right.

    • NeelyCam
    • 12 years ago

    Was NeelyCam right… again?

    Answer: yes

    (Edit: it’s about to rain minuses)

    • NeelyCam
    • 12 years ago

    Sorry, had to vote you down for voting him down

    • Incubus
    • 12 years ago

    Who knows what kind of wonders AMD will bring to the ultraportable markets in the upcoming years.
    I believe AMD showcased a 5-watt Brazos-based chip somewhere in Asia a couple of months ago.
    I have a 1GHz Hummingbird smartphone, and it lags on big PDF files and some casual Flash games like nobody’s business.
    ARM is not the future, unless Nvidia can come up with an x86 design.

    • steddy
    • 12 years ago

    Platform power consumption.

    • Game_boy
    • 12 years ago

    If you have a large number of cores on a chip then you need a high-speed, low-latency interconnect (like Intel’s ringbus) to keep them all supplied with data. Then you need a high performance memory controller (because of the tens of times more data coming in), and because that’s still a bottleneck for tens of cores, lots of L2 and L3 cache.

    You’d end up with something like Larrabee or Intel MIC or Beckton. And it would eat power and die area because of all of the infrastructure; look at a Sandy Bridge die and see how little space the cores take up. I’m not sure it would be more efficient than an x86 cluster of many 16-core BD or 10-core Westmeres.

    BD will be 4.8W/core, including infrastructure, at probably 2GHz speeds.
    Could a 100-core ARM chip deliver a BD core’s worth of performance for each 4.8W?
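
    For what it’s worth, here is a back-of-the-envelope sketch of that question in C. The 4.8W/core Bulldozer figure comes from the comment above; the ARM per-core power and relative-performance numbers are pure placeholders (assumptions for illustration, not measurements), and the math assumes the workload scales perfectly across cores:

        #include <stdio.h>

        int main(void)
        {
            /* Figure quoted in the comment above: Bulldozer at ~4.8 W per core. */
            const double bd_watts_per_core = 4.8;

            /* Hypothetical ARM figures, purely for illustration: assume one
               low-power core draws 0.5 W and delivers 15% of a BD core's
               throughput on the workload in question. */
            const double arm_watts_per_core = 0.5;
            const double arm_relative_perf  = 0.15;

            /* Cores and power needed to match one BD core, assuming the work
               parallelizes perfectly across however many cores you throw at it. */
            const double cores_needed = 1.0 / arm_relative_perf;
            const double watts_needed = cores_needed * arm_watts_per_core;

            printf("ARM cores to match one BD core: %.1f\n", cores_needed);
            printf("power for those cores: %.2f W (vs. %.2f W per BD core)\n",
                   watts_needed, bd_watts_per_core);
            return 0;
        }

    With those made-up inputs the ARM side wins on paper (about 3.3 W versus 4.8 W), but that per-core math leaves out exactly the uncore--interconnect, memory controller, caches--that the comment above says dominates once you have tens of cores.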

    • fredsnotdead
    • 12 years ago

    He seems to state (in the full article) that as you go to higher performance, power efficiency drops, but techreport’s data seems to contradict that; look at the third graph on the fifth page of the recent E-350 review:
    https://techreport.com/articles.x/20401/5

    • derFunkenstein
    • 12 years ago

    Any time you make nerds think of the possibility of getting laid, they always go “thumbs” up.

    • thesmileman
    • 12 years ago

    It is going to be more cores, I think, not higher speeds.

    • thesmileman
    • 12 years ago

    While I am big on irony and puns, you are right: they don’t usually go over well. Maybe you need a new angle, because my post about being a chubby chaser on the TR XXL launch post went over pretty well.

    • thesmileman
    • 12 years ago

    The way I see ARM excelling is lower-watt cores, but a large number of them. This will really help them in the server and workstation market. We constantly used the Power architecture from IBM because of the vector processing, and I believe well-written code running on an ARM processor with 100 low-voltage cores would be far better (better in this case meaning for our purposes and servers) than two hexa-core x86 chips (which is something we currently use).

    • kamikaziechameleon
    • 12 years ago

    Irony and puns are among the highest forms of humor, and for some reason Tech Report seems not to take so well to them 😛

    I get thumbed down every time I make a pun joke.

    • Hattig
    • 12 years ago

    Firstly, cache doesn’t use a lot of power.

    And secondly, what 2W ARM core are you looking at? There’s a 40nm 2GHz dual-core Cortex-A9 hard macro for performance fabrication processes (one that also includes the full NEON vector unit) with a 1.9W TDP. Is that what you are talking about?

    • tejas84
    • 12 years ago

    Thaddeus (the Anglicised version of his Polish name) is right

    It is indeed an ARM race!

    Thumbs up for Taddeusz

    • sschaem
    • 12 years ago

    Can anyone point to SpecInt and SpecFP benchmarks done on the latest and greatest ARM chip?

    ARM’s website seems to show that their designs’ efficiency drops dramatically, by over 60%, when running at higher MHz. And even a chip with only 32K of data cache uses 2 watts of power, and that’s with no extra modules (no floating-point support of any kind), just integer processing!

    That’s 32 times less cache than the 9-watt Zacate, and that chip also includes a powerful DX11 GPU with 80 cores.

    My guess is that by the time ARM adds floating-point support, a decent GPU, a SIMD array, and all the other SoC features needed, you get a slower chip than an x86 part that might consume only 50% less.

    Saving 4 watts in a tablet or netbook is not that great if you have to sacrifice performance.

    • dpaus
    • 12 years ago

    Sorry, had to vote you down for sayin’ it.

    • Taddeusz
    • 12 years ago

    It’s an ARM race!

    Sorry, had to say it. 😛

    • dpaus
    • 12 years ago

    Never underestimate the ability of new software to choke existing hardware. Just wait until your home theatre PC is also running an AI client that’s hunting for content tailored to your fickle interests…

    • anotherengineer
    • 12 years ago

    “Many have wondered why Nvidia elected to go the ARM route instead of developing its own x86 processor…. CEO Jen-Hsun Huang characterizes the technical challenges associated with going the x86 route as ‘solvable’ and says that other factors motivated Nvidia’s decision to go in another direction…. Among them: the fact that it simply didn’t make business sense to challenge Intel in the x86 market given that AMD has struggled mightily to make a profit doing so. Fair point.”

    True enough, but what about obtaining an x86 license from Intel, too?

    Who knows where it will be in 5-10 years; by then ARM could quite well be in a lot of netbooks or even desktops… especially 10 years from now. Or maybe some new architecture not even out yet.

    Everyone knows that 10 years in tech is like a millennium elsewhere (hardware anyway; software seems to be very slow, relatively speaking… when was the A64 released? 2004? 7 years later, where are my native 64-bit programs?)
