Nvidia’s Kirk: Larrabee is the GPU a CPU designer would build

We’ve heard Nvidia discuss its vision for the future extensively these past few weeks, the latest example being the company’s six-hour-long Financial Analyst Day conference earlier this month. Taking advantage of this talkativeness, the guys at bit-tech.net have spent some time with Nvidia Chief Scientist David Kirk, asking him about everything from CUDA’s future to Intel’s Larrabee.

Interestingly, Kirk isn’t opposed to the idea of licensing CUDA to competitors like AMD. He told bit-tech.net, “We do take every opportunity to discuss the ability to run CUDA with anyone who’s interested. It’s not exactly an open standard, but there’s really not very much that is proprietary about it. . . . The pieces of the tools we build are made available to whoever is interested in using them.” Kirk also envisions a future where CUDA applications can run on both CPUs and GPUs, and “we can start adding GPUs or CPU cores to increase performance.”
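
To make that CPU-or-GPU idea concrete, here is a minimal, hypothetical sketch (our illustration, not anything Nvidia has shipped): the same element-wise computation written once as a CUDA kernel and once as a plain CPU loop, with a runtime check deciding which one runs. The names and dispatch logic here are our own.

    // Hypothetical sketch: one data-parallel routine, run on the GPU if a
    // CUDA device is present, otherwise on the CPU. Error checking trimmed.
    #include <cstdio>
    #include <vector>
    #include <cuda_runtime.h>

    __global__ void saxpy_gpu(int n, float a, const float* x, float* y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];
    }

    static void saxpy_cpu(int n, float a, const float* x, float* y)
    {
        for (int i = 0; i < n; ++i)   // straightforward serial fallback
            y[i] = a * x[i] + y[i];
    }

    int main()
    {
        const int n = 1 << 20;
        std::vector<float> x(n, 1.0f), y(n, 2.0f);

        int devices = 0;
        if (cudaGetDeviceCount(&devices) == cudaSuccess && devices > 0) {
            float *dx, *dy;
            cudaMalloc(&dx, n * sizeof(float));
            cudaMalloc(&dy, n * sizeof(float));
            cudaMemcpy(dx, x.data(), n * sizeof(float), cudaMemcpyHostToDevice);
            cudaMemcpy(dy, y.data(), n * sizeof(float), cudaMemcpyHostToDevice);

            saxpy_gpu<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

            cudaMemcpy(y.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);
            cudaFree(dx);
            cudaFree(dy);
            printf("ran on the GPU: y[0] = %f\n", y[0]);
        } else {
            saxpy_cpu(n, 3.0f, x.data(), y.data());
            printf("ran on the CPU: y[0] = %f\n", y[0]);
        }
        return 0;
    }

The data-parallel part of the program is the same either way; only the launch mechanism differs, which is roughly what a "CUDA on both CPUs and GPUs" model would have to deliver.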

On Larrabee, Intel’s discrete graphics processor project, Kirk said Nvidia sees it as “the GPU that a CPU designer would build, not the GPU you’d build if you were a GPU designer.” Kirk repeated Nvidia CEO Jen-Hsun Huang’s line that Larrabee is just a slide in a presentation, and slides tend to look perfect. “I’m not going to get into all of the details especially for Larrabee, but they’re missing some pretty important pieces about how a GPU works,” he added.

Kirk also talked a little bit about AMD, revealing a pessimistic view of the company’s future. “AMD has been declining because it hasn’t built a competitive graphics architecture for almost two years now—ever since the AMD/ATI merger. They’ve been pulling engineers [from the GPU teams] to Fusion, which integrates GPU technology onto the CPU. They have to do four things to survive, but I don’t think they have enough money to do one thing.”

Check out Kirk’s full interview at bit-tech.net for more details about CUDA, ray tracing, and the like.

Comments closed
    • ThorAxe
    • 11 years ago

    This is somewhat bemusing since there were 90 MILLION sales of discrete graphics cards in 2007 with an […]

    • thermistor
    • 11 years ago

    I think the biggest short-term and long-term problem for Nvidia is that AMD is using the former ATI not just for the integrated graphics portion of their PC offering, but is moving all their core logic to ATI. That includes SERVERS.

    Weren’t Nvidia chipsets a mainstay in Opteron servers not more than 18 months ago? Yeah, AMD’s server share is down, but they weren’t using (junky) ATI chipsets, as far as I recall.

    Part of AMD’s fabled ‘open’ approach was a strong reliance on Nvidia…is that not right? And that revenue stream is disappearing due to AMD’s desire to have a ‘platform’ in house.

    • bogbox
    • 11 years ago

    Intel has stated that the future is something similar to Fusion (or Intel’s counterpart to it) […]

      • SPOOFE
      • 11 years ago

      My suspicion: For all Intel’s talk about putting out discrete boards, I think they realize that the real money to be made with GPUs is not gaming; I think they’re anticipating a greater prevalence of 3D acceleration for pretty much every mundane task a computer can do and are planning accordingly.

      The discrete boards they’re going for are to be the gaming versions, faster with more bandwidth and memory and shader/texture/etc. power… but their integrated stuff, spun off from the discrete chips, is probably meant to just be “faster integrated graphics”, nothing more.

    • 0g1
    • 11 years ago

    Fusion and Larrabee+CPU are quite similar, except Larrabee is an external solution.

    This reminds me of Sony’s PS3. The Cell CPU was supposed to replace the graphics card at first, but once they realized it would never perform 3D graphics like a GPU can, they went to nVidia for support. But in the end, they have a very powerful CPU for physics and I guess AI to some extent. The problem with the Cell CPU, though, is that it’s not based on the x86 architecture and it’s more complicated to program for.

    I think Fusion and Larrabee+CPU will be nice; however, you’ll always need a graphics card if you want to run large amounts of pixel or vertex shader work (even voxel-based raytracing could take advantage of this by detailing the voxels). I don’t think we’re ever going back to software rendering (i.e. all on the CPU), at least not completely.

    Kirk says that AMD hasn’t made a competitive GPU for 2 years now … well, just recently the 3870 X2 was the fastest GPU available until the 9800 GX2 was released. Not to mention the 3DMark06 records are held by quad CrossFire.

    Kirk also seems to think AMD won’t be competitive in the future of GPUs because they’ve pulled engineers into Fusion? I might be guessing here, but I think it’s a good guess that the 4870 will be much more competitive in terms of bang for buck than the 9900 GTX 512-bit-bus card nV has planned.

      • ish718
      • 11 years ago

      I completely agree with you, you make some valid points…

      • Flying Fox
      • 11 years ago

      Actually the Nehalem variants with the integrated graphics core are closer to Fusion.

    • Krogoth
    • 11 years ago

    Nvidia is officially scared. They are trying to throw senseless FUD around.

    They know quite well what will happen if Intel manages to develop GPU solutions that do not suck at gaming.

      • PRIME1
      • 11 years ago

      It’s AMD who should be scared. It’s very doubtful that Intel will launch a GPU on the level of the G90 or GT200. However they could launch a decent mid-range card and further dilute the only refuge that ATI has right now.

      Their market share is already way down; I doubt they could survive on any less. With their CPU division struggling as well, it may just be a matter of time before someone (even NVIDIA) buys them out.

        • Krogoth
        • 11 years ago

        That is the point, Intel does not give a hoot about the high-end segment.

        They are after the mid-range and value segments. That’s the real bread and butter of the discrete GPU guys, even Nvidia.

        Guess what happens if the Intel discrete GPU that comes with the average Joe’s or kid gamer’s Dell does not suck for gaming?

        They’d have no real reason to get a discrete solution from anyone else, and those folks are what make up the bulk of discrete GPU sales.

        At the same time, Intel can easily secure its position to be a desirable hardware platform for future gaming consoles.

        Both AMD and Nvidia are very afraid of those prospects.

          • Silus
          • 11 years ago

          I have my doubts that Intel can pull off a mid-range card as good as the 9600 GT or HD 3870. I even have doubts they can match the performance of the now-lowly 8600 GT. Maybe in a few years, after the first incarnation of Larrabee, they can manage that, but at first? I highly doubt it.
          Intel has been around for years, and it’s not just because of Fusion that they’re getting into the discrete graphics card market. They certainly had the funds to do it before, so why didn’t they? Simply because they knew they couldn’t compete with NVIDIA and ATI (at the time). It’s no different now.

          They just want to extend their “tentacles” into another market, now that they dominate the CPU one. Maybe they can indeed pull off an 8600 GT-class chip, or maybe a bit better, but that’s going to be at NVIDIA’s or AMD’s low-end performance level by the time Larrabee is out. I agree with you that they want to go for the mid-range market, but I have my doubts they’ll be able to with Larrabee’s first incarnations.

          • PetMiceRnice
          • 11 years ago

          Indeed, quite right about Intel. They don’t […]

        • ludi
        • 11 years ago

        Unlikely that it would be Nvidia. First, the SEC scrutiny would be pretty brutal. Second, buying AMD outright would mean acquiring fabs and a whole lot of debt, and Nvidia is sitting quite pretty without either one.

        If AMD went completely into point-of-no-return bankruptcy and their assets were fire-saled, then sure, Nvidia would happily grab as much graphics and chipset IP as possible, and cherry-pick the best engineering talent from those same divisions.

        • BabelHuber
        • 11 years ago

        I think that AMD’s biggest problem by far is the lousy results they got with Barcelona/Phenom:
        First, AMD was half a year too late to market. Then, when working silicon arrived, it turned out to be slower than the glued-together C2Q. After that, it turned out that the silicon was not working 100%.

        AMD has to build a winner with the 45nm K10 now.

        On the graphics side of things, it’s actually not so bad at all: the 3800s aren’t as fast as Nvidia’s offerings, but they aren’t bad either. And AMD’s drivers have also become good in the meantime, probably even better than Nvidia’s.

        Why are the 3850/3870 based on the RV670 chip? Where is the R670? Wasn’t this supposed to be the high-end chip, the successor to the R600?

        My guess is that AMD cancelled the R670. I don’t know what ATi would have done on its own; perhaps they would have released some Phantom Edition high-end cards again, which AMD didn’t want to do. Just a thought.

          • Silus
          • 11 years ago

          Make no mistake. R600 was intended to take the performance crown. G80 was just too powerful for it to do that, so they had to sell the HD 2900 XT at a price similar to that of its main competitor: the 8800 GTS 640.

          There was never anything else from ATI to compete with G80. RV670 was just a way for them to reduce production costs after the R600 failure. Performance-wise, it’s pretty much the same; it’s just at an appealing price.

      • derFunkenstein
      • 11 years ago

      It’d be nice if nVidia couldn’t just rebadge an 8800GT into the 9800GT and maybe bump the RAM and core speeds a tiny bit. So yeah, I could see nVidia being kinda scared.

        • SPOOFE
        • 11 years ago

        The fact that nVidia CAN do that – successfully! – shows quite decidedly that they’re NOT scared, in my opinion.

          • derFunkenstein
          • 11 years ago

          No, I don’t think so. They’re doing it now because AMD is twiddling their thumbs. Larrabee is a ways off yet, and they’ll have their next-gen GPU ready to go. I think it’s ALREADY set to go and they’re sitting on it because they have no competition NOW.

            • SPOOFE
            • 11 years ago

            Yes, AMD is “twiddling their thumbs”. They’re certainly not “working as hard as possible to get a competitive product out the door”, no… they’re “twiddling their thumbs.”

            There’s always a certain fear in business; when the 800 pound gorilla says it’s moving, everyone tenses up for it. That’s hardly the same thing as “scared”… Intel’s history shows quite clearly that their interest is in getting the masses, not the niche gamer markets. Maybe this time will be different, but to imply any sort of panic on nVidia’s part is foolish… there’s no justification for that assertion.

      • gtoulouzas
      • 11 years ago

      I’ve heard that line of reasoning before. It was in 1997, when Intel announced its first entrance into the 3D graphics market. I even remember the PC Format caricature with 3dfx and nVidia “shaking in their boots” at the news of mighty Intel entering their market.

      My point being, yours is a big IF. 🙂

        • Krogoth
        • 11 years ago

        At the time, Intel’s strategy was not about getting a foothold in the discrete market. It was to help speed up AGP adoption. IIRC, it worked quite well. The i740 was an affordable, decent-performing first-generation AGP card.

    • MadManOriginal
    • 11 years ago

    I guess having smooth and non-abrasive public statements is not a job requirement for being chief scientist.

      • Flying Fox
      • 11 years ago

      When the CEO actually uses the term “can of whoop ass” does anyone else in the company need to be discrete?

    • mentaldrano
    • 11 years ago

    Intel could say that a GPU is what you get when you discard accuracy for speed. Hence the whole “our GPUs are great for scientific calculations!” pitch, even though they don’t support the IEEE standard for floating-point ops.

    Yes, Nvidia makes great GPUs, which work great for games. No, Intel is probably not going to blast them out of the water with Larrabee, at least when it comes to game performance. I do think that Larrabee will eat Nvidia for lunch when it comes to scientific computing and other supercomputer workloads, CUDA or not.
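
    One concrete, hedged example of that IEEE gap: the CUDA GPUs of this era (compute capability 1.x) flush single-precision denormals to zero, so a product that falls below FLT_MIN comes back as 0.0 from the GPU but not from the CPU. The sketch below is an illustration with made-up values only; later GPU generations keep denormals by default, so the gap shows up only on the older parts or when compiling with -ftz=true.

        // Illustrative only: a subnormal single-precision product, computed on
        // the CPU and on the GPU. Compute-1.x hardware flushes the GPU result
        // to zero; newer GPUs generally do not. Error checking omitted.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void tiny_product(float a, float b, float* out)
        {
            out[0] = a * b;   // result is below FLT_MIN (a denormal)
        }

        int main()
        {
            const float a = 1.0e-38f, b = 1.0e-2f;  // product is about 1e-40

            float cpu = a * b;                      // CPU keeps the subnormal

            float* d_out;
            float gpu = 0.0f;
            cudaMalloc(&d_out, sizeof(float));
            tiny_product<<<1, 1>>>(a, b, d_out);
            cudaMemcpy(&gpu, d_out, sizeof(float), cudaMemcpyDeviceToHost);
            cudaFree(d_out);

            printf("CPU: %g  GPU: %g  (0 means the GPU flushed the denormal)\n",
                   cpu, gpu);
            return 0;
        }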

    As for AMD, I sure hope they shine like a diamond, and SOON. Intel vs Nvidia with no AMD makes me shiver.

    • gtoulouzas
    • 11 years ago

    “AMD has been declining because it hasn’t built a competitive graphics architecture for almost two years now—ever since the AMD/ATI merger. They’ve been pulling engineers [from the GPU teams] to Fusion, which integrates GPU technology onto the CPU”.

    If pulling off this strategy has been the reason behind ATI’s lackluster releases, it is insane. Has it ever occurred to AMD why Intel is not taking the CPU-GPU integration route? Nothing competitive will come out of this compromise.

    Cyrix MediaGX, part deuce, here we come.

      • indeego
      • 11 years ago

      I don’t know. If AMD can bring a new class of business applications (3D enabled) to the masses cheaply, and before Intel, they have a potential mass market there while Intel sits on its driver team.

      Potential. AMD isn’t known for its execution lately, and I have little confidence in them capitalizing on this.

      • Krogoth
      • 11 years ago

      You are smoking some fine stuff there buddy.

      Intel intends to beat AMD to the whole CPU+GPU integrated solution. Fusion is the reason Intel gives a flying hoot about GPU development at all.

      • dragmor
      • 11 years ago

      PCs are moving towards SoC designs. This leads to cheaper and simpler designs, which lead to more OEM wins.

      For the average user, any CPU sold today is enough. There will still be a high end, but if 90% of users can be covered by a low-power integrated solution, why not build it?

    • Shinare
    • 11 years ago

    I don’t care if it’s a GPU a CPU designer would come up with; what he expects us to believe is that this is somehow a relevant point. All I care about is performance, not the design on the inside. If Looloobee performs with the best of them, or better, then who cares?

    • Saber Cherry
    • 11 years ago

    g{<"I'm not going to get into all of the details especially for Larrabee, but they’re missing some pretty important pieces about how a GPU works," he added.<}g David Kirk is being so sneaky... but even if he stays quiet, eventually Intel will realize they forgot *[

    • Majiir Paktu
    • 11 years ago

    “Larrabee is the GPU a CPU designer would build”

    Really? Gee, I hadn’t thought of this. I mean, now that you point it out, Larrabee […]

      • Meadows
      • 11 years ago

      Because it has fewer, longer pipelines instead of the short, stout, rapid ones a GPU has a bazillion of.

        • Majiir Paktu
        • 11 years ago

        Indeed. Almost reminds me of the teraflops prototype Intel built a while back: what was it, 80 non-x86 FPUs on a single die?

        I really wish multithreading would develop to the point where many simple cores could thrive in a computing environment. Who needs out-of-order speculative execution when you can have the raw power of four times the cores?

    • ludi
    • 11 years ago

    Mis-post, whoops.

    • pogsnet
    • 11 years ago
      • ludi
      • 11 years ago

      Yeah, well, VIA hasn’t […]
