kvndoom wrote:I should be able to count all the people who are surprised by this on my middle finger. If even that many. Intel never will offer anything besides their integrated garbage.
UberGerbil wrote:But experiments often end in failure; that's no reason to stop attempting them -- to the contrary, you often learn the most from your failures. The real failure is that Intel was already trying to build and flog products before the experiment was complete and its outcome known.
MadManOriginal wrote:kvndoom wrote:I should be able to count all the people who are surprised by this on my middle finger. If even that many. Intel never will offer anything besides their integrated garbage.
I can count at least one person who doesn't understand the phrase '...in the short term' on mine too - you! If you actually bothered to read more than the headline, you'd understand the nuances and the fact that the project is not being killed. Perhaps that's too much to ask in these dumbed-down sound-bite days, though.
kvndoom wrote:MadManOriginal wrote:kvndoom wrote:I should be able to count all the people who are surprised by this on my middle finger. If even that many. Intel never will offer anything besides their integrated garbage.
I can count at least one person who doesn't understand the phrase '...in the short term' on mine too - you! If you actually bothered to read more than the headline, you'd understand the nuances and the fact that the project is not being killed. Perhaps that's too much to ask in these dumbed-down sound-bite days, though.
Oh please. The longer they wait, the more powerful Nvidia and ATI GPUs get, and the farther behind their garbage (yes, garbage) falls. I tell you what: "WHEN" it comes out, reference this post and I'll PayPal you 15 dollars. Until then, enjoy your Kool-Aid.
MadManOriginal wrote:I just don't get the overboard Intel hate is all. Did Intel rape your mom or something?
kvndoom wrote:I should be able to count all the people who are surprised by this on my middle finger. If even that many. Intel never will offer anything besides their integrated garbage.
Shining Arcanine wrote:You can include me in that count. I was hoping that something would come of Larrabee. I was confident that Intel would realize their mistake in using x86 and move to something else for Larrabee, but it seems that they refuse to do that.
just brew it! wrote:Shining Arcanine wrote:You can include me in that count. I was hoping that something would come of Larrabee. I was confident that Intel would realize their mistake in using x86 and move to something else for Larrabee, but it seems that they refuse to do that.
But the whole point of Larrabee was to build a GPU on top of x86.
Shining Arcanine wrote:Really, x86 is pure bloat, and if Intel modified their current processors to remove the x86 decoder and instead exposed the micro-ops as the ISA, their processors would become much more energy efficient. The decoder only requires a few million transistors, but those few million transistors need to be running nearly 100% of the time. So at idle, when almost everything else could be powered off and only a million or so other transistors are running, those few million account for the bulk of the power consumption. If they were not there, power consumption would drop precipitously.
Shining Arcanine wrote:just brew it! wrote:But the whole point of Larrabee was to build a GPU on top of x86.
But that is a stupid idea.
SNM wrote:Shining Arcanine wrote:Really, x86 is pure bloat, and if Intel modified their current processors to remove the x86 decoder and instead exposed the micro-ops as the ISA, their processors would become much more energy efficient. The decoder only requires a few million transistors, but those few million transistors need to be running nearly 100% of the time. So at idle, when almost everything else could be powered off and only a million or so other transistors are running, those few million account for the bulk of the power consumption. If they were not there, power consumption would drop precipitously.
I'm glad you know that x86 instructions are just translated to micro-ops these days, but I think you need a refresher on chip design.
Edit: Let me elaborate. Since the x86 instructions are just translated, and x86 makes no latency guarantees, using the x86 instruction set doesn't constrain which transistors are allowed to be powered down. Not to mention that in the last chip layout I saw, the power-control logic itself took up a few million transistors, so even if x86 mysteriously did require a few million transistors to stay powered up, removing them couldn't cut idle power by more than 50%.
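For illustration, here is a back-of-envelope version of that arithmetic in C. Every transistor count below is an invented placeholder, and it naively treats the number of awake transistors as a proxy for idle power; it only shows why removing one always-on block can't save more than that block's share of what stays awake:

    #include <stdio.h>

    int main(void) {
        /* Invented placeholder counts, in millions of transistors. */
        double decoder   = 3.0;  /* x86 decode logic (claimed always-on)        */
        double power_ctl = 3.0;  /* power-gating control (also always-on)       */
        double other     = 1.0;  /* everything else still awake at idle         */

        double before = decoder + power_ctl + other;  /* awake today            */
        double after  = power_ctl + other;            /* awake, decoder removed */

        /* Naive proxy: idle power scales with awake transistors. */
        printf("Idle power cut from dropping the decoder: %.0f%%\n",
               100.0 * (before - after) / before);   /* prints ~43%, under 50% */
        return 0;
    }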
Airmantharp wrote:Are you sure that it is market share, or demand?
If better-than-current (i3/i5) integrated graphics were needed today, wouldn't consumers be flocking to AMD-based systems, given that CPU performance is largely irrelevant for daily tasks? What's Intel's motivation to improve, aside from being completely outclassed (which I believe they're not)?
WoW and LotRO both play functionally on Core 2-era Intel IGPs, and current iterations are faster as well as having full HD acceleration. I don't think I could ask much more from the bottom of the barrel, and yet Intel keeps improving it, based on its own desire and expectation of future market needs, not necessarily current ones.
MadManOriginal wrote:Exactly. The only people who nerd-rage over 'crap Intel integrated graphics holding back games' are, you guessed it, gamers. Meanwhile people who use their PCs for actual productivity like in Corp IT or 'office task' home users don't really care as long as it runs Windows and whatever desktop programs they use. They are the largest market whether gamers like it or not.
MadManOriginal wrote:How does the compiler fit into the equation, following the idea of 'remove the x86 decoder and instead expose the micro-ops as the ISA'?
You'd need a compiler that takes a high-level language and translates it straight to micro-ops instead of to x86, right? What would be the other advantages/disadvantages of doing this? Is there any trade-secret reason that Intel might not want to expose its micro-ops?
just brew it! wrote:The biggest hurdle would be convincing Microsoft to support it. Intel was essentially forced to adopt AMD's 64-bit x86 extensions because users were starting to demand 64-bit, and Microsoft didn't want to support yet another 64-bit ISA (they were already supporting AMD-64 and Itanium).
Also, having micro-ops that are designed to allow efficient translation from x86 is not necessarily the best approach for an architecture you want to target with high-level language compilers. They'd probably be better off designing a new RISC core from scratch.
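To make the compiler question concrete: a compiler back end emits the chip's public ISA, and the hardware's x86-to-micro-op translation is invisible to it. A minimal sketch, assuming ordinary gcc on x86-64 (the assembly in the comment is typical -O2 output; an exposed micro-op ISA would need a brand-new back end emitting something else entirely):

    /* add.c -- the compiler's job ends at the public ISA; the CPU's
     * x86-to-micro-op cracking happens below this level, unseen. */
    int add(int a, int b) {
        return a + b;
    }

    /* Typical x86-64 output from "gcc -O2 -S add.c" looks like:
     *
     *   add:
     *       leal (%rdi,%rsi), %eax   # one public x86 instruction
     *       ret
     *
     * The CPU privately cracks instructions like these into micro-ops.
     * Exposing micro-ops as the ISA would mean writing and maintaining
     * a new back end (and ABI) that emits them directly. */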
Shining Arcanine wrote:By the way, just as the Ottoman Empire was the sick man of Europe at the beginning of the 20th century, Microsoft is the sick man of the software industry in the 21st. Intel does not need Microsoft's support to produce a new ISA. Supercomputers, for instance, use whatever ISA is implemented in the chip that delivers the highest performance from scientists' perspective, and with their support a new ISA could last in the market indefinitely, while Microsoft will not.
SNM wrote:Yeah, that worked real well with Itanium.
Microsoft isn't the only one with an investment in x86.
Shining Arcanine wrote:The only technical disadvantage of Intel exposing its micro-ops as an ISA is that all existing software would need to be recompiled to run on it. At the same time, doing this would require other companies like AMD and VIA to obtain new licenses to produce compatible processors, because their x86 licenses would not apply. These are always issues when you change ISAs.
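A small sketch of that recompilation cost, using standard SSE intrinsics as the example of ISA-tied code: the portable function below survives an ISA switch with a mere recompile, while the intrinsic version is welded to x86 and would have to be ported by hand:

    #include <xmmintrin.h>  /* SSE intrinsics -- x86-only by definition */

    /* ISA-specific: tied to x86; porting to a new ISA means rewriting. */
    void add4_sse(float *dst, const float *a, const float *b) {
        _mm_storeu_ps(dst, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
    }

    /* Portable: survives an ISA change with nothing but a recompile. */
    void add4_c(float *dst, const float *a, const float *b) {
        for (int i = 0; i < 4; i++)
            dst[i] = a[i] + b[i];
    }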