For many months now, Intel has told the public its upcoming discrete graphics processor would come out either late this year or in early 2010. Well, PC Magazine says the company has now “sharpened” that release time frame, burying the prospect of a 2009 launch.
The news comes from Paul Otellini himself. PC Magazine says the Intel CEO stated in a recent conference call, “I would expect volume introduction of this product to be early next year.” An Intel spokesman later elaborated:
“We always said it would launch in the 2009/2010 timeframe,” Intel spokesman Nick Knupffer said in an email Friday. “We are narrowing that timeframe. Larrabee is healthy and in our labs right now. There will be multiple versions of Larrabee over time. We are not releasing additional details at this time.”
Intel has stayed mum on many aspects of Larrabee, but a recently revealed shot of the GPU’s silicon allowed us to speculate about its guts yesterday. That said, the chipmaker has hinted that different versions of the chip are in the works.
They’ve already got the silicon… now they have to write the drivers to make it work! (and most of us know how Intel graphics drivers work…)
I’m going to weigh in and go on record saying “Larrabee will be a dud”.
That is, unless they can pay all the card mfgs to drop ATI/Nvidia gpus.
Hey, when something works for you, you stick with it, right?
It took you 2 days to come up with that? You’ve been slacking
I wonder how Intel expects a general-purpose CPU to beat, at graphics, hardware designed and refined for that specific purpose (ASICs).
Very pretentious, or very stupid.
Sadly, many people don’t think for themselves.
Here’s an example to make it clear: Intel has a duck (walks, swims, and flies; general purpose) and expects it to beat an eagle (which mostly just flies; specific purpose). Not very smart.
It’s not QUITE that simple. It’s still a specialized, highly parallel processor with its own memory and all that. But we haven’t actually seen it in action in its final form, so it’s hard to really say.
It doesn’t appear that it’s going to blow everything out of the water, or really have ANY particular advantage, just because it uses a different type of design, though.
The only advantage is that the design makes it easier for them to get into the market without an established product they’ve been building up for years, the way ATI and Nvidia have.
Even if it only has roughly the same performance at launch, Intel also has a HUUUGE advantage in its manufacturing tech.
I dunno, those big fabs-for-rent are doing pretty good these days.
Have you actually read and understood the Larrabee project?
READ:
=> http://www.ddj.com/hpc-high-performance-computing/216402188
=> http://software.intel.com/en-us/articles/prototype-primitives-guide/
=> http://isdlibrary.intel-dispatch.com/isd/2499/GamePhysicsOnLarrabee_paper.pdf
Each core is a modified Pentium with an L2 cache and a vector processor unit… The key to Larrabee is the vector unit.
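For anyone who hasn’t followed the links: the programming model those articles describe boils down to many in-order x86 cores, each with a slice of L2 and a 16-float-wide vector unit. A minimal sketch of that model in plain C++ (this is not the real LRBni intrinsics or prototype-primitives API, just a stand-in to show the 16-wide idea):

```cpp
// Illustrative only: a stand-in 16-wide vector type, NOT Intel's actual
// LRBni intrinsics or prototype primitives.
#include <cstddef>

struct vec16 {               // one Larrabee-style vector register: 16 floats
    float lane[16];
};

// a = a * b + c across all 16 lanes; on Larrabee this would map to a single
// fused multiply-add vector instruction issued by one x86 core.
inline void vmadd(vec16& a, const vec16& b, const vec16& c) {
    for (int i = 0; i < 16; ++i)
        a.lane[i] = a.lane[i] * b.lane[i] + c.lane[i];
}

// Each core walks its own slice of the data, 16 elements per step,
// streaming through its own L2 cache.
void scale_and_bias(vec16* data, const vec16& scale, const vec16& bias, size_t n) {
    for (size_t i = 0; i < n; ++i)
        vmadd(data[i], scale, bias);
}
```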
Ignorance?
The Pentium cores & vector units are doing a lot of work that GPUs have dedicated transistors for, so what is done in parallel on a GPU needs to be serialized on Larrabee.
Intel even realized that common GPU functionality can’t be ’emulated’ without a huge speed drop.
So you can’t just look at raw power… it would be a huge piece of code to emulate a texture unit doing a trilinear texture fetch from an sRGB volume texture (see the rough sketch after this post).
So at least Intel put texture units in there… but how about the rasterizer setup? For small triangles that can be a huge overhead.
But Intel is betting that future designs will make this irrelevant.
And I think they are right… and on top of it all, Larrabee could be the ‘SSE’ killer.
Yet Larrabee is only a bit more flexible than next-gen GPUs, so dominance in ‘stream computing’ and gaming is not in Intel’s cards just yet.
For example, I’m not sure Larrabee will be much better, if at all, than Nvidia’s upcoming GPU at running OpenCL code.
And for sure Nvidia tweaks their HW for today’s APIs & game engines, not raytracers…
Stephan
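To put some flesh on the texture-unit point above, here is a rough, hypothetical sketch of what just a single-channel bilinear fetch with sRGB decode looks like as ordinary code (the Texture2D layout and names are invented for illustration). A GPU’s fixed-function sampler does all of this per pixel essentially for free; full trilinear filtering needs two of these (one per mip level) plus a final lerp, and a volume texture needs eight taps per level instead of four.

```cpp
// Sketch only: software emulation of one bilinear, sRGB-decoded texture tap.
#include <algorithm>
#include <cmath>
#include <cstdint>

struct Texture2D {
    int w, h;
    const uint8_t* texels;   // 8-bit single-channel, sRGB-encoded
};

static float srgb_to_linear(uint8_t v) {
    float c = v / 255.0f;
    return (c <= 0.04045f) ? c / 12.92f : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

float sample_bilinear(const Texture2D& t, float u, float v) {
    // address calculation + clamping, done for every fetch
    float x = u * t.w - 0.5f, y = v * t.h - 0.5f;
    int x0 = std::clamp((int)std::floor(x), 0, t.w - 1);
    int y0 = std::clamp((int)std::floor(y), 0, t.h - 1);
    int x1 = std::min(x0 + 1, t.w - 1);
    int y1 = std::min(y0 + 1, t.h - 1);
    float fx = x - std::floor(x), fy = y - std::floor(y);

    // four taps, each with an sRGB decode
    float c00 = srgb_to_linear(t.texels[y0 * t.w + x0]);
    float c10 = srgb_to_linear(t.texels[y0 * t.w + x1]);
    float c01 = srgb_to_linear(t.texels[y1 * t.w + x0]);
    float c11 = srgb_to_linear(t.texels[y1 * t.w + x1]);

    // two horizontal lerps, one vertical lerp
    float top = c00 + (c10 - c00) * fx;
    float bot = c01 + (c11 - c01) * fx;
    return top + (bot - top) * fy;
}
```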
DX and OpenGL can be run on Larrabee through an interpreter, although you can code directly for Larrabee using Larrabee C for the best performance.
AMD and Nvidia have been building less specialized hardware with more and more ability to do general work, and Intel has been adding more specialized units to its general-purpose processors, so I think the two extremes are going to meet at some point, and there will be thunder.
Too bad you don’t need the whole damn x86 instruction set to do some float calculations.
Which is exactly why Intel added special-purpose units.
CPUs and GPUs have been converging for several years already. GPUs now have the ability to run the sort of code which would’ve previously run on CPUs, and CPUs are getting ever-more sophisticated SIMD processing capabilities (which are actually quite similar to what happens in a GPU rendering pipeline). So the concept of a GPU with x86 cores on it isn’t as crazy as it seems at first glance.
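As a tiny, concrete example of that convergence (function and variable names here are mine, just a sketch): the kind of per-element math a pixel shader runs can already be expressed on a CPU with plain SSE intrinsics, four lanes per instruction; Larrabee simply widens that to sixteen.

```cpp
// SSE doing GPU-style data-parallel work: out[i] = in[i] * gain + bias,
// four floats at a time.
#include <immintrin.h>
#include <cstddef>

void shade_buffer(float* out, const float* in, float gain, float bias, size_t n) {
    __m128 vgain = _mm_set1_ps(gain);
    __m128 vbias = _mm_set1_ps(bias);
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 v = _mm_loadu_ps(in + i);
        v = _mm_add_ps(_mm_mul_ps(v, vgain), vbias);
        _mm_storeu_ps(out + i, v);
    }
    for (; i < n; ++i)              // scalar tail for leftover elements
        out[i] = in[i] * gain + bias;
}
```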
I guess Intel is still trying to perfect its process that uses ten foot wafers.
Wow, hope Intel fixes all the bugs in the time left before launch.
I suspect software is a bigger risk to their schedule than the hardware.
Yeah, if Intel were smart, they would spend some of their billions and hire some seasoned programmers… They’d get all the money back if they kicked ass in both departments! Hello, Intel? Get the hint?
Intel hires more software engineers than hardware engineers according to comments made by (I think) Otellini a few months ago.
NWA rappin’
Reply failin’
Me lollin’.
LOL havin’
Pissed my pants, did!
Mass production in 2010… So, that means you and TR will get a prototype to WOW! us with some beachmarks in 2009?
NDA allowing
PDA loving.
BBQ loving.
beachmarks would be awesome. White sand vs black sand vs tidewater pull.
Review when?
Cooling could be a problem, though. Fans don’t like sand.
But from the looks of things, the initial Larrabee will be a lot of silicon. And what else is made of silica?
Hippies!
…yeah ok I couldn’t think of anything except that giant crystal space lifeform on Star Trek TNG that the angsty lady doctor shattered.