Personal computing discussed

codedivine
Gerbil Elite
Topic Author
Posts: 714
Joined: Sat Jan 24, 2009 8:13 am

Its official: Larrabee GPU is dead

Tue May 25, 2010 3:23 pm

http://www.anandtech.com/show/3738/inte ... -to-market
"Intel kills Larrabee GPU, will not bring a discrete GPU to market"
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Its official: Larrabee GPU is dead

Tue May 25, 2010 3:39 pm

This isn't really all that much different from what they were saying a few months ago.
Nostalgia isn't what it used to be.
 
morphine
TR Staff
Posts: 11600
Joined: Fri Dec 27, 2002 8:51 pm
Location: Portugal (that's next to Spain)

Re: Its official: Larrabee GPU is dead

Tue May 25, 2010 7:25 pm

What a surprise!

Oh, wait.... :)
There is a fixed amount of intelligence on the planet, and the population keeps growing :(
 
kvndoom
Minister of Gerbil Affairs
Posts: 2758
Joined: Sat Feb 28, 2004 11:47 pm
Location: Virginia, thank goodness

Re: Its official: Larrabee GPU is dead

Tue May 25, 2010 7:55 pm

I should be able to count all the people who are surprised by this on my middle finger. If even that many. Intel never will offer anything besides their integrated garbage.
A most unfortunate, Freudian, double entendre is that hotel named "Budget Inn."
 
UberGerbil
Grand Admiral Gerbil
Posts: 10368
Joined: Thu Jun 19, 2003 3:11 pm

Re: Its official: Larrabee GPU is dead

Tue May 25, 2010 9:16 pm

"Never" is a long time (Intel was "never" going to do 64bit x86 either). "Garbage" is open to debate also, if your timeframe is "forever." You never know: they could fix their drivers. They could get ATI tech in a patent swap with AMD. They could buy nVidia. Integrated graphics may never be a gamer's first choice; but the time may come when everybody is gaming on iWhatevers and consoles, and all the people who are still gaming on PCs with discrete graphics could fit into one room at PAX (hey everybody, it's the diehard LAN party! Just down the hall from the Amiga fanatics!)

Larrabee was an interesting experiment; it's too bad it could not have remained in stealth mode within the company, hidden away from the corporate hype machine. Like Itanium, it was a gamble on a hardware design that was ultimately and irretrievably dependent on breakthroughs in software -- breakthroughs that didn't arrive. But experiments often end in failure; that's no reason to stop attempting them -- to the contrary, you often learn the most from your failures. The real failure is that Intel was already trying to build and flog products before the experiment was complete and its outcome known.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Its official: Larrabee GPU is dead

Tue May 25, 2010 10:07 pm

Drivers for Intel's integrated GPUs are actually quite reasonable on Linux, as long as you stay away from the GMA 500 chipset (which is used in some netbooks). At work we've been using Intel IGPs for a couple of years to run embedded OpenGL applications, and it has worked out well. The one video driver bug we hit that could've been a potential show-stopper for us was fixed in-house, and our patch was subsequently kicked upstream to the Xorg driver maintainers to be incorporated into the official driver. (IMO this is an excellent example of the Open Source development paradigm working the way it is supposed to.)
Nostalgia isn't what it used to be.
 
MadManOriginal
Gerbil Jedi
Posts: 1533
Joined: Wed Jan 30, 2002 7:00 pm
Location: In my head...

Re: Its official: Larrabee GPU is dead

Tue May 25, 2010 10:53 pm

kvndoom wrote:
I should be able to count all the people who are surprised by this on my middle finger. If even that many. Intel never will offer anything besides their integrated garbage.


I can count at least one person who doesn't understand the phrase '..in the short-term' on mine too - you! :) If you actually bothered to read more than the headline you'd understand the nuances and the fact that the project is not being killed. Perhaps that's too much to ask in these dumbed down soundbite days though.
 
reactorfuel
Gerbil
Posts: 72
Joined: Thu Feb 28, 2008 5:58 am

Re: Its official: Larrabee GPU is dead

Tue May 25, 2010 11:15 pm

UberGerbil wrote:
But experiments often end in failure; that's no reason to stop attempting them -- to the contrary, you often learn the most from your failures. The real failure is that Intel was already trying to build and flog products before the experiment was complete and its outcome known.

Of course, building something like Larrabee is a real "in for a penny, in for a pound" situation. Simply "running the experiment" - designing and implementing a radically new graphics architecture - is tremendously difficult and expensive. Throwing together a small hype machine is, by comparison, ridiculously cheap and easy. If the experiment pans out, people are going to be excited to buy your hot new stuff, and you don't have to work up demand at the last second before it hits the shelves. If it doesn't work so well, nobody's going to stop the presses if it turns out that a corporate PR flack was a bit optimistic.

Also, you could even say that a bit of hype is necessary for the experiment itself. If you develop a radically new and revolutionary graphics architecture, capable of doing something (in this case, real-time raytracing) that nothing else can do, you're still going to need people to develop for it. No developer support, and you're kinda dead in the water for anything but your own tech demos. This is a big part of why Intel did most of their Larrabee talking at developer conferences. The hype we, the public, saw was mostly incidental to that.
 
kvndoom
Minister of Gerbil Affairs
Posts: 2758
Joined: Sat Feb 28, 2004 11:47 pm
Location: Virginia, thank goodness

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 4:56 am

MadManOriginal wrote:
kvndoom wrote:
I should be able to count all the people who are surprised by this on my middle finger. If even that many. Intel never will offer anything besides their integrated garbage.


I can count at least one person who doesn't understand the phrase '..in the short-term' on mine too - you! :) If you actually bothered to read more than the headline you'd understand the nuances and the fact that the project is not being killed. Perhaps that's too much to ask in these dumbed down soundbite days though.

Oh please. The longer they wait, the more powerful nvidia and ati GPU's get and the farther behind their garbage (yes, garbage) falls. I tell you what. "WHEN" it comes out, reference this post and I'll paypal you 15 dollars. Until then, enjoy your Kool-aid.
A most unfortunate, Freudian, double entendre is that hotel named "Budget Inn."
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 5:18 am

Larrabee's development always seemed backwards to me.

For AMD and Nvidia to arrive at the GPGPU monsters they have today, they went through a multitude of fixed-function products that did nothing except render. I believe in Intel; they contribute greatly to the overall science of turning sand into computational power. But the idea of creating a massive general-purpose x86 (of all God-awful things) processor and then trying to turn it into a renderer with software just seemed a little silly. I'd hoped that they could do it, I'd hoped that they could do something, but they failed.

Heck, the GPGPU scene hasn't really taken off like it was expected to. I hope it does, as the prospect of large-scale FPU-intensive calculations bypassing SSE units and moving onto massively parallel GPUs sounds awesome to me.
 
MadManOriginal
Gerbil Jedi
Posts: 1533
Joined: Wed Jan 30, 2002 7:00 pm
Location: In my head...

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 8:21 am

kvndoom wrote:
MadManOriginal wrote:
kvndoom wrote:
I should be able to count all the people who are surprised by this on my middle finger. If even that many. Intel never will offer anything besides their integrated garbage.


I can count at least one person who doesn't understand the phrase '..in the short-term' on mine too - you! :) If you actually bothered to read more than the headline you'd understand the nuances and the fact that the project is not being killed. Perhaps that's too much to ask in these dumbed down soundbite days though.

Oh please. The longer they wait, the more powerful nvidia and ati GPU's get and the farther behind their garbage (yes, garbage) falls. I tell you what. "WHEN" it comes out, reference this post and I'll paypal you 15 dollars. Until then, enjoy your Kool-aid.


I certainly will :D and I *will* count Larrabee-based HPC parts or IGPs. Hopefully you don't try to back out, since you haven't specified what 'it' is that counts as coming out.

I just don't get the overboard Intel hate is all. Did Intel rape your mom or something?
 
morphine
TR Staff
Posts: 11600
Joined: Fri Dec 27, 2002 8:51 pm
Location: Portugal (that's next to Spain)

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 8:28 am

MadManOriginal wrote:
I just don't get the overboard Intel hate is all. Did Intel rape your mom or something?

It's quite simple: Intel makes the majority of integrated chipsets. Thus they have the necessary leverage to also have the majority of integrated graphics. Which are notoriously crap. They could do a lot better, but they don't bother, because of their market share.
There is a fixed amount of intelligence on the planet, and the population keeps growing :(
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 9:27 am

Are you sure that it is market share, or demand?

If better than current (i3/i5) integrated graphics were needed today, wouldn't consumers be flocking to AMD-based systems, given that the performance of the CPU is largely irrelevant for daily tasks? What's Intel's motivation to improve, aside from being completely outclassed (which I believe they're not)?

WoW and LotRO both play functionally on Core 2 era Intel IGPs, and current iterations are faster as well as having full HD acceleration. I don't think I could ask much more from the bottom of the barrel, and yet they're improving it, based on their own desire and expectation of market needs, not necessarily current needs.
 
Shining Arcanine
Gerbil Jedi
Posts: 1718
Joined: Wed Jun 11, 2003 11:30 am

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 5:10 pm

kvndoom wrote:
I should be able to count all the people who are surprised by this on my middle finger. If even that many. Intel never will offer anything besides their integrated garbage.


You can include me in that count. I was hoping that something would come of Larrabee. I was confident that Intel would realize their mistake in using x86 and move to something else for Larrabee, but it seems that they refuse to do that.
Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 5:31 pm

Shining Arcanine wrote:
You can include me in that count. I was hoping that something would come of Larrabee. I was confident that Intel would realize their mistake in using x86 and move to something else for Larrabee, but it seems that they refuse to do that.

But the whole point of Larrabee was to build a GPU on top of x86.
Nostalgia isn't what it used to be.
 
morphine
TR Staff
Posts: 11600
Joined: Fri Dec 27, 2002 8:51 pm
Location: Portugal (that's next to Spain)

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 5:54 pm

just brew it! wrote:
But the whole point of Larrabee was to build a GPU on top of x86.

Square peg, round hole? :)
There is a fixed amount of intelligence on the planet, and the population keeps growing :(
 
Shining Arcanine
Gerbil Jedi
Posts: 1718
Joined: Wed Jun 11, 2003 11:30 am

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 6:38 pm

just brew it! wrote:
Shining Arcanine wrote:
You can include me in that count. I was hoping that something would come of Larrabee. I was confident that Intel would realize their mistake in using x86 and move to something else for Larrabee, but it seems that they refuse to do that.

But the whole point of Larrabee was to build a GPU on top of x86.


But that is a stupid idea.

Really, x86 is pure bloat and if Intel modified their current processors to remove the x86 decoder and instead expose the micro-ops as the ISA, their processors would become much more energy efficient. x86 only requires a few million transistors, but those few million transistors need to be running nearly 100% of the time, so when almost everything could be powered off such that only 1 million other transistors are running, those few million will account for the bulk of the power consumption. If they were not there, power consumption would drop precipitously.
Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.
 
SNM
Emperor Gerbilius I
Posts: 6209
Joined: Fri Dec 30, 2005 10:37 am

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 6:44 pm

Shining Arcanine wrote:
Really, x86 is pure bloat and if Intel modified their current processors to remove the x86 decoder and instead expose the micro-ops as the ISA, their processors would become much more energy efficient. x86 only requires a few million transistors, but those few million transistors need to be running nearly 100% of the time, so when almost everything could be powered off such that only 1 million other transistors are running, those few million will account for the bulk of the power consumption. If they were not there, power consumption would drop precipitously.

I'm glad you know that x86 instructions are just translated to micro-ops these days, but I think you need a refresher on chip design.

Edit: Let me elaborate. Since the x86 instructions are just translated, and there are no latency guarantees in x86, the use of the x86 instruction set doesn't constrain which transistors are allowed to be powered down. Not to mention that in the last chip layout I saw, the power control took up a few million transistors, so even if x86 mysteriously did require a few million transistors to remain powered up, you couldn't cut idle power by more than 50% by removing them.
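The bound argued in that edit can be sketched numerically, treating transistor count as a crude proxy for idle power (both figures below are rough guesses from this thread, not measured numbers):

```python
# Upper bound on idle-power savings from removing the x86 decoder,
# assuming (as in the thread) the always-on power-control logic is
# roughly the same size as the decoder. All figures are guesses.
decoder = 3e6      # assumed always-on x86 decoder transistors
power_ctrl = 3e6   # power-control logic of similar size (assumed)

idle_total = decoder + power_ctrl
savings = decoder / idle_total  # fraction of idle draw you could drop

print(savings)  # 0.5 -> at most ~50% of idle power
```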
Core i7 920, 3x2GB Corsair DDR3 1600, 80GB X25-M, 1TB WD Caviar Black, MSI X58 Pro-E, Radeon 4890, Cooler Master iGreen 600, Antec P183, opticals
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 6:54 pm

Shining Arcanine wrote:
just brew it! wrote:
But the whole point of Larrabee was to build a GPU on top of x86.

But that is a stupid idea.

Bingo.
Nostalgia isn't what it used to be.
 
Shining Arcanine
Gerbil Jedi
Posts: 1718
Joined: Wed Jun 11, 2003 11:30 am

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 7:56 pm

SNM wrote:
Shining Arcanine wrote:
Really, x86 is pure bloat and if Intel modified their current processors to remove the x86 decoder and instead expose the micro-ops as the ISA, their processors would become much more energy efficient. x86 only requires a few million transistors, but those few million transistors need to be running nearly 100% of the time, so when almost everything could be powered off such that only 1 million other transistors are running, those few million will account for the bulk of the power consumption. If they were not there, power consumption would drop precipitously.

I'm glad you know that x86 instructions are just translated to micro-ops these days, but I think you need a refresher on chip design.

Edit: Let me elaborate. Since the x86 instructions are just translated, and there are no latency guarantees in x86, the use of the x86 instruction set doesn't constrain which transistors are allowed to be powered down. Not to mention that in the last chip layout I saw, the power control took up a few million transistors, so even if x86 mysteriously did require a few million transistors to remain powered up, you couldn't cut idle power by more than 50% by removing them.


Well, there are two different chip designs to discuss here. One is Larrabee. The other is Nehalem. Nehalem has the power monitoring circuitry you mention, and I believe that the x86 decoder in Nehalem uses a similar number of transistors. I used a few million transistors as an educated guess for the size of one of Larrabee's x86 decoders, because I expect Intel to minimize the extensions to x86 that Larrabee supports in order to minimize the size of its x86 decoder. I also believe that Larrabee lacks that kind of circuitry, so it would not have that constant draw, but even if it did, the next point I make renders that irrelevant.

In Nehalem, the x86 decoder can only be powered down when the Loop Stream Detector is in use. The majority of code I have seen does not use the LSD, and I doubt Larrabee would have an LSD given its tight per-core transistor budget, so the x86 decoder is running all the time in Larrabee and almost all the time in Nehalem. Multiply the cost of the x86 decoder by your number of cores and you have a substantial amount of wasted power. In Nehalem, there are only 4 cores, so this issue is not the end of the world, but in Larrabee, there are 80 cores, so this issue becomes particularly pronounced.

The reason I said that the x86 decoders' power draw renders the power draw of any uncore Larrabee might have irrelevant is this: assume Larrabee includes a dedicated uncore for power management that uses 20 million transistors, and that each of its x86 decoders uses 2 to 3 million transistors. Multiply the per-core decoder count by 80 cores and you get 160 million to 240 million transistors per chip that must always be in use, not counting the uncore. That renders the uncore's contribution to power consumption insignificant, assuming Larrabee has an uncore at all.
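The arithmetic in that paragraph can be written out explicitly (the core count, decoder size, and uncore size are all the thread's assumed figures, not published specs):

```python
# Back-of-the-envelope totals for always-on decoder transistors,
# using the figures assumed in this thread (not official numbers).
cores = 80                         # assumed Larrabee core count
decoder_lo, decoder_hi = 2e6, 3e6  # guessed transistors per x86 decoder
uncore = 20e6                      # hypothetical power-management uncore

always_on_lo = cores * decoder_lo  # 160 million transistors
always_on_hi = cores * decoder_hi  # 240 million transistors

# Even the low estimate is 8x the hypothetical uncore's budget, so the
# uncore's share of the always-on transistor count is comparatively small.
print(always_on_lo / uncore, always_on_hi / uncore)  # 8.0 12.0
```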

Perhaps I should have been more clear that my first sentence applied to Nehalem and my subsequent sentences applied to Larrabee. I know that I did a very poor job of that and I apologize for any confusion that might have caused.
Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.
 
MadManOriginal
Gerbil Jedi
Posts: 1533
Joined: Wed Jan 30, 2002 7:00 pm
Location: In my head...

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 10:22 pm

Airmantharp wrote:
Are you sure that it is market share, or demand?

If better than current (i3/i5) integrated graphics were needed today, wouldn't consumers be flocking to AMD-based systems, given that the performance of the CPU is largely irrelevant for daily tasks? What's Intel's motivation to improve, aside from being completely outclassed (which I believe they're not)?

WoW and LotRO both play functionally on Core 2 era Intel IGPs, and current iterations are faster as well as having full HD acceleration. I don't think I could ask much more from the bottom of the barrel, and yet they're improving it, based on their own desire and expectation of market needs, not necessarily current needs.


Exactly. The only people who nerd-rage over 'crap Intel integrated graphics holding back games' are, you guessed it, gamers. Meanwhile people who use their PCs for actual productivity like in Corp IT or 'office task' home users don't really care as long as it runs Windows and whatever desktop programs they use. They are the largest market whether gamers like it or not.
 
MadManOriginal
Gerbil Jedi
Posts: 1533
Joined: Wed Jan 30, 2002 7:00 pm
Location: In my head...

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 10:28 pm

Shining Arcanine wrote:
just brew it! wrote:
Shining Arcanine wrote:
You can include me in that count. I was hoping that something would come of Larrabee. I was confident that Intel would realize their mistake in using x86 and move to something else for Larrabee, but it seems that they refuse to do that.

But the whole point of Larrabee was to build a GPU on top of x86.


But that is a stupid idea.

Really, x86 is pure bloat and if Intel modified their current processors to remove the x86 decoder and instead expose the micro-ops as the ISA, their processors would become much more energy efficient. x86 only requires a few million transistors, but those few million transistors need to be running nearly 100% of the time, so when almost everything could be powered off such that only 1 million other transistors are running, those few million will account for the bulk of the power consumption. If they were not there, power consumption would drop precipitously.


How does the compiler fit into the equation under the idea of 'remove the x86 decoder and instead expose the micro-ops as the ISA'?

You'd need a compiler that takes a high-level language and translates it straight to micro-ops instead of to x86, right? What would be the other advantages/disadvantages of doing this? Is there any trade secret reason that Intel might not want to expose their micro-ops?
 
morphine
TR Staff
Posts: 11600
Joined: Fri Dec 27, 2002 8:51 pm
Location: Portugal (that's next to Spain)

Re: Its official: Larrabee GPU is dead

Wed May 26, 2010 11:30 pm

MadManOriginal wrote:
Exactly. The only people who nerd-rage over 'crap Intel integrated graphics holding back games' are, you guessed it, gamers. Meanwhile people who use their PCs for actual productivity like in Corp IT or 'office task' home users don't really care as long as it runs Windows and whatever desktop programs they use. They are the largest market whether gamers like it or not.

It's not quite as clear-cut as that. What pisses people off is that Intel integrated graphics could be a lot better if only they bothered even a bit. Nvidia's and AMD's chipsets are proof of that. It's mediocrity born out of market share.

Going even further, it's simply a chicken-and-egg problem. Corporate customers aside, and among a number of other reasons, people are bothering less and less about gaming on personal computers because of upgrades / software annoyances / etc. If a reasonably-priced computer (in other words, anything except the bottom of the barrel) came with a minimally decent graphics card/chipset by default, you'd see a lot more of those people gaming.

Hell, I still remember that "back in the day" you had two kinds of mass-market computers to buy: those that were good enough to game on (affordable) and those that were good for reasonably serious work (expensive). As the years went on, the situation just about inverted: nowadays you need a minimally decent PC to game, but not to surf the web.

The Xbox360 and the PS3 GPUs are roughly the power of a 7900GT. It's 2010 now and we're down to 40nm processes. How can anyone honestly say that such a GPU can't be an integrated, run-of-the-mill one?
There is a fixed amount of intelligence on the planet, and the population keeps growing :(
 
Shining Arcanine
Gerbil Jedi
Posts: 1718
Joined: Wed Jun 11, 2003 11:30 am

Re: Its official: Larrabee GPU is dead

Mon May 31, 2010 1:57 pm

MadManOriginal wrote:
How does the compiler fit into the equation under the idea of 'remove the x86 decoder and instead expose the micro-ops as the ISA'?

You'd need a compiler that takes a high-level language and translates it straight to micro-ops instead of to x86, right? What would be the other advantages/disadvantages of doing this? Is there any trade secret reason that Intel might not want to expose their micro-ops?


Intel's micro-ops are a RISC-like language as far as I am aware, although they are probably somewhat hackish because the outside world never sees them. RISC ISAs have the property that they make automated code generation by compilers (or "automated programmers", if you use 1960s terminology) easier and more efficient, so if Intel produced an ISA based on their current micro-ops, the task of writing compilers (and efficient optimizers) would be made much simpler. This would not only result in a decrease in power consumption, but also a possible increase in performance, particularly in cases where current compilers have difficulty producing efficient assembly code for regularly executed routines.

The only technical disadvantage of Intel exposing its micro-ops as an ISA is the fact that all software would need to be recompiled for the new ISA to run on it. At the same time, doing this would require other companies like AMD and VIA to obtain new licenses to produce compatible processors, because their x86 licenses would not apply. These are always issues when you change ISAs.
Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: Its official: Larrabee GPU is dead

Mon May 31, 2010 2:12 pm

The biggest hurdle would be convincing Microsoft to support it. Intel was essentially forced to adopt AMD's 64-bit x86 extensions because users were starting to demand 64-bit, and Microsoft didn't want to support yet another 64-bit ISA (they were already supporting AMD-64 and Itanium).

Also, having micro-ops that are designed to allow efficient translation from x86 is not necessarily the best approach for an architecture you want to target with high-level language compilers. They'd probably be better off designing a new RISC core from scratch.
Nostalgia isn't what it used to be.
 
Shining Arcanine
Gerbil Jedi
Posts: 1718
Joined: Wed Jun 11, 2003 11:30 am

Re: Its official: Larrabee GPU is dead

Mon May 31, 2010 2:19 pm

just brew it! wrote:
The biggest hurdle would be convincing Microsoft to support it. Intel was essentially forced to adopt AMD's 64-bit x86 extensions because users were starting to demand 64-bit, and Microsoft didn't want to support yet another 64-bit ISA (they were already supporting AMD-64 and Itanium).

Also, having micro-ops that are designed to allow efficient translation from x86 is not necessarily the best approach for an architecture you want to target with high-level language compilers. They'd probably be better off designing a new RISC core from scratch.


That depends on how the existing RISC core is designed. It might be possible to make some tweaks so that it can be used as a stand-alone ISA with all of the merits of the other RISC ISAs on the market.

By the way, just as the Ottoman Empire was the sick man of Europe at the beginning of the 20th century, Microsoft is the sick man of the software industry in the 21st. Intel does not need Microsoft's support to produce a new ISA. Supercomputers, for instance, use whatever ISA delivers the highest performance from scientists' perspective, and with their support a new ISA could last in the market indefinitely, while Microsoft will not. Intel could afford to wait for Microsoft to die while it produces chips based on the new ISA; those chips would filter down to consumer devices eventually, perhaps sooner rather than later considering all of the ARM devices coming onto the market this year without support from Microsoft. They would also have the benefit of the core sharing logical components with their x86 chips on the market, lowering development costs.
Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.
 
jhtrico1850
Gerbil In Training
Posts: 6
Joined: Tue Mar 27, 2007 2:20 pm

Re: Its official: Larrabee GPU is dead

Mon May 31, 2010 4:30 pm

http://www.intel.com/pressroom/archive/ ... 31comp.htm
The HPC part from the Larrabee experiment is here.
 
SNM
Emperor Gerbilius I
Posts: 6209
Joined: Fri Dec 30, 2005 10:37 am

Re: Its official: Larrabee GPU is dead

Mon May 31, 2010 4:54 pm

Shining Arcanine wrote:
By the way, just as the Ottoman Empire was the sick man of Europe at the beginning of the 20th century, Microsoft is the sick man of the software industry in the 21st. Intel does not need Microsoft's support to produce a new ISA. Supercomputers, for instance, use whatever ISA delivers the highest performance from scientists' perspective, and with their support a new ISA could last in the market indefinitely, while Microsoft will not.

Yeah, that worked real well with Itanium.
Microsoft isn't the only one with an investment in x86.
Core i7 920, 3x2GB Corsair DDR3 1600, 80GB X25-M, 1TB WD Caviar Black, MSI X58 Pro-E, Radeon 4890, Cooler Master iGreen 600, Antec P183, opticals
 
reactorfuel
Gerbil
Posts: 72
Joined: Thu Feb 28, 2008 5:58 am

Re: Its official: Larrabee GPU is dead

Mon May 31, 2010 9:38 pm

SNM wrote:
Yeah, that worked real well with Itanium.
Microsoft isn't the only one with an investment in x86.

No kidding. 450 of the 500 fastest supercomputing sites in the world use x86 CPUs.
 
tfp
Grand Gerbil Poohbah
Posts: 3413
Joined: Wed Sep 24, 2003 11:09 am

Re: Its official: Larrabee GPU is dead

Mon May 31, 2010 10:03 pm

Shining Arcanine wrote:
The only technical disadvantage of Intel exposing its micro-ops as an ISA is the fact that all software would need to be recompiled for the new ISA to run on it. At the same time, doing this would require other companies like AMD and VIA to obtain new licenses to produce compatible processors, because their x86 licenses would not apply. These are always issues when you change ISAs.


And what you're saying sounds like what Intel tried to do with Itanium, which everyone hated and complained about.

Now what would be funny is if their chips started translating x86 into IA-64 "micro-ops" and then running those, only with the "x86" chips getting OoO processing and the rest. They could then expose the IA-64 "micro-ops" down the road, and everyone would be happy. Old programs that aren't going to be recompiled could just run through an x86 emulator, kind of like what Alpha had back in the day... oh wait.
