Larrabee won’t come out until early 2010, Intel says

For many months now, Intel has told the public its upcoming discrete graphics processor would come out either late this year or in early 2010. Well, PC Magazine says the company has now “sharpened” that release time frame, burying the prospect of a 2009 launch.

The news comes from Paul Otellini himself. PC Magazine says the Intel CEO stated in a recent conference call, “I would expect volume introduction of this product to be early next year.” An Intel spokesman later elaborated:

“We always said it would launch in the 2009/2010 timeframe,” Intel spokesman Nick Knupffer said in an email Friday. “We are narrowing that timeframe. Larrabee is healthy and in our labs right now. There will be multiple versions of Larrabee over time. We are not releasing additional details at this time.”

Intel has stayed mum on many aspects of Larrabee, but a recently revealed shot of the chip’s silicon allowed us to speculate about its guts yesterday. The chipmaker has also hinted that different versions of the chip are in the works.

Comments closed
    • juampa_valve_rde
    • 12 years ago

    they already got the silicon… now they have to write the drivers to make it work! (and most of us know how Intel graphics drivers work…)

    • pluscard
    • 12 years ago

    I’m going to weigh in and go on record saying “Larrabee will be a dud”.

    That is, unless they can pay all the card mfgs to drop ATI/Nvidia gpus.

    Hey, when something works for you, you stick with it, right?

      • tfp
      • 12 years ago

      It took you 2 days to come up with that? You’ve been slacking.

    • snakeoil
    • 12 years ago

    I wonder how Intel expects a general-purpose CPU to beat, at graphics, hardware designed and refined for exactly that purpose (ASICs).
    Very pretentious or very stupid.
    Sadly, many people don’t think for themselves.
    I’ll make an example so you understand: Intel has a duck (walks, swims, and flies; general purpose) and expects it to beat an eagle (which mostly just flies; specific purpose). Not very smart.

      • OneArmedScissor
      • 12 years ago

      It’s not QUITE that simple. It’s still a specialized, highly parallel processor with its own memory and all that. But we haven’t actually seen it in action in its final form, so it’s hard to really say.

      It doesn’t appear that it’s going to blow everything out of the water, or really have ANY particular advantage, just because it uses a different type of design, though.

      The only advantage is that the design makes it easier for them to get into the market without an established product they’ve been building up for years, as ATI and Nvidia have.

        • zima
        • 12 years ago

        Even if it only has roughly the same performance at launch, Intel also has the HUUUGE advantage of its manufacturing tech.

          • Anonymous Coward
          • 12 years ago

          I dunno, those big fabs-for-rent are doing pretty good these days.

      • stmok
      • 12 years ago

      Have you actually read and understood the Larrabee project?

      READ:
      => http://www.ddj.com/hpc-high-performance-computing/216402188
      => http://software.intel.com/en-us/articles/prototype-primitives-guide/
      => http://isdlibrary.intel-dispatch.com/isd/2499/GamePhysicsOnLarrabee_paper.pdf

      Each core is a modified Pentium with L2 cache and a vector processor unit… The key to Larrabee is the […]

        • sschaem
        • 12 years ago

        Ignorance?

        The Pentium & vector unit are doing a lot of work that GPUs have dedicated transistors for. So what is done in parallel on a GPU needs to be serialized on Larrabee.

        Intel even realized that common GPU functionality can’t be ‘emulated’ without a huge speed drop.
        So you can’t just look at raw power… it would be a huge piece of code to emulate a texture unit doing a trilinear texture fetch in an sRGB volume texture (see the sketch below).
        So at least Intel got texture units in there… but how about the rasterizer setup? For small triangles that can be huge overhead.

        But Intel is betting that future designs will make this irrelevant.
        And I think they are right… and on top of it all, Larrabee can be the ‘SSE’ killer.
        Yet Larrabee is only a bit more flexible than next-gen GPUs, so dominance in ‘stream computing’ and gaming is not in Intel’s cards just yet.

        For example, I’m not sure Larrabee will be much better, if at all, than Nvidia’s upcoming GPU at running OpenCL code.
        And for sure Nvidia tweaks its hardware for today’s APIs & game engines, not raytracers…

        Stephan
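
        As a rough illustration of the texture-unit point above, here is a minimal sketch in C. It is entirely illustrative, not Intel’s code: a 2D, single-channel simplification of the sRGB volume-texture case, with every type and helper name made up for the example. In software, one trilinear fetch works out to eight texel reads, eight sRGB decodes, and seven lerps; a GPU’s texture unit does all of it in fixed function.

        #include <math.h>

        typedef struct { const float *texels; int w, h; } Mip; /* hypothetical single-channel mip level */

        /* The sRGB-to-linear decode a texture unit does for free on each texel. */
        static float srgb_to_linear(float c) {
            return (c <= 0.04045f) ? c / 12.92f : powf((c + 0.055f) / 1.055f, 2.4f);
        }

        static int clampi(int v, int lo, int hi) { return v < lo ? lo : (v > hi ? hi : v); }

        /* One texel read: clamp-to-edge addressing plus decode. */
        static float texel(const Mip *m, int x, int y) {
            x = clampi(x, 0, m->w - 1);
            y = clampi(y, 0, m->h - 1);
            return srgb_to_linear(m->texels[y * m->w + x]);
        }

        /* One bilinear sample: four reads, three lerps. */
        static float bilinear(const Mip *m, float u, float v) {
            float x = u * m->w - 0.5f, y = v * m->h - 0.5f;
            int x0 = (int)floorf(x), y0 = (int)floorf(y);
            float fx = x - (float)x0, fy = y - (float)y0;
            float top = texel(m, x0, y0) * (1.0f - fx) + texel(m, x0 + 1, y0) * fx;
            float bot = texel(m, x0, y0 + 1) * (1.0f - fx) + texel(m, x0 + 1, y0 + 1) * fx;
            return top * (1.0f - fy) + bot * fy;
        }

        /* Trilinear: blend bilinear samples from two adjacent mip levels. */
        float trilinear(const Mip *mip0, const Mip *mip1, float u, float v, float lodFrac) {
            return bilinear(mip0, u, v) * (1.0f - lodFrac) + bilinear(mip1, u, v) * lodFrac;
        }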

          • ish718
          • 12 years ago

          DX and OpenGL can be run on Larrabee using an interpreter, although you can code directly to Larrabee in Larrabee C for the best performance.

      • Meadows
      • 12 years ago

      AMD and Nvidia have been creating less-specific hardware with more and more ability to do general work, and Intel has been adding more specific units to its general processors, so I think the two extremes are going to meet at some point, and there will be thunder.

        • armistitiu
        • 12 years ago

        Too bad you don’t need the whole damn x86 instruction set to do some float calculations.

          • Meadows
          • 12 years ago

          Which is exactly why Intel added specific-purpose units.

      • just brew it!
      • 12 years ago

      CPUs and GPUs have been converging for several years already. GPUs now have the ability to run the sort of code which would’ve previously run on CPUs, and CPUs are getting ever-more sophisticated SIMD processing capabilities (which are actually quite similar to what happens in a GPU rendering pipeline). So the concept of a GPU with x86 cores on it isn’t as crazy as it seems at first glance.
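
      To make the SIMD half of that concrete, here is a minimal sketch (my own illustration, not from the article) of plain x86 SSE doing the kind of data-parallel, shader-style math described above: scale, bias, and saturate four pixel channels at once.

      #include <stdio.h>
      #include <xmmintrin.h> /* SSE intrinsics */

      int main(void) {
          float rgba[4] = { 0.2f, 0.4f, 0.6f, 1.0f };

          __m128 px   = _mm_loadu_ps(rgba);            /* load 4 floats into one register */
          __m128 gain = _mm_set1_ps(1.5f);             /* broadcast a scale factor */
          __m128 bias = _mm_set1_ps(0.1f);             /* broadcast an offset */

          px = _mm_add_ps(_mm_mul_ps(px, gain), bias); /* px = px * gain + bias, 4 lanes at once */
          px = _mm_min_ps(px, _mm_set1_ps(1.0f));      /* saturate to [0, 1], shader-style */
          px = _mm_max_ps(px, _mm_setzero_ps());

          _mm_storeu_ps(rgba, px);
          printf("%.2f %.2f %.2f %.2f\n", rgba[0], rgba[1], rgba[2], rgba[3]);
          return 0;
      }

      A GPU runs essentially this pattern across thousands of fragments in parallel; Larrabee’s much wider vector units push the same idea further.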

    • bdwilcox
    • 12 years ago

    I guess Intel is still trying to perfect its process that uses ten-foot wafers.

    • casingh
    • 12 years ago

    Wow, hope Intel fixes all the bugs in the time left before launch.

      • UberGerbil
      • 12 years ago

      I suspect software is a bigger risk to their schedule than the hardware.

        • SomeOtherGeek
        • 12 years ago

        Yeah, if Intel were smart, they would spend some of their billions and hire some seasoned programmers… They’d get all the money back if they kick ass in both departments! Hello, Intel? Get the hint?

          • MadManOriginal
          • 12 years ago

          Intel hires more software engineers than hardware engineers according to comments made by (I think) Otellini a few months ago.

    • indeego
    • 12 years ago

    NWA rappin’ <.<

      • indeego
      • 12 years ago

      Reply Failin’ <.<

        • Buzzard44
        • 12 years ago

        Me lollin’.

          • ssidbroadcast
          • 12 years ago

          LOL havin’

            • SomeOtherGeek
            • 12 years ago

            Pissed my pants, did!

    • SomeOtherGeek
    • 12 years ago

    Mass production in 2010… So, that means you and TR will get a prototype to WOW! us with some beachmarks in 2009?

      • DrDillyBar
      • 12 years ago

      NDA allowing

        • ssidbroadcast
        • 12 years ago

        PDA loving.

          • Nitrodist
          • 12 years ago

          BBQ loving.

      • d0g_p00p
      • 12 years ago

      beachmarks would be awesome. White sand vs black sand vs tidewater pull.

      Review when?

        • UberGerbil
        • 12 years ago

        Cooling could be a problem, though. Fans don’t like sand.

        But from the looks of things, the initial Larrabee will be a lot of silicon. And what else is made of silica?

          • Scrotos
          • 12 years ago

          Hippies!

          …yeah ok I couldn’t think of anything except that giant crystal space lifeform on Star Trek TNG that the angsty lady doctor shattered.
