AMD demos Llano, confirms mid-2011 release

At today’s AMD Technical Forum and Exhibition in Taipei, Taiwan, corporate VP and general manager of AMD’s client division Chris Cloran revealed that the much-anticipated Llano APU will arrive in "mid 2011." Llano started sampling way back in April and was initially expected to arrive later this year or early next. Cloran went on to show off one of the first Llano wafers, built on GlobalFoundries’ 32-nm fabrication process.

Shiny wafers are neat to look at and all, but the demo that followed was far more interesting. AMD showed a quad-core Llano system playing a high-def video clip while running a DirectCompute-enabled celestial body simulation and a four-way instance of HyperPi. Video playback was smooth despite the fact that all four CPU cores were pegged at full utilization during the demo.

TR regulars will know Llano is a desktop-bound APU that consolidates a graphics processor and four Phenom II-derived CPU cores on a single die. You won’t have to wait until the middle of next year to get your hands on AMD’s first Fusion offering, though. According to Cloran, AMD’s Bobcat-based Brazos APU is still on track to infiltrate the mobile world in the "first part of 2011." Cloran was particularly enthusiastic about Brazos, whose first silicon was solid enough to boot three operating systems within 48 hours of coming back from the fab. Brazos won’t just be for notebooks, either. AMD’s first APU will also make its way into nettops and desktop motherboards, presumably of the small-form-factor variety.

Comments closed
    • Bensam123
    • 9 years ago

    Does he look like ‘I hate my job and I hate all you people’ to anyone else?

      • Palek
      • 9 years ago

      Looks to me like an actor straight off a “Mad Men” set.

    • UberGerbil
    • 9 years ago

    Dear AMD: this “APU” term isn’t going to fly, and the sooner you accept that the sooner you can direct resources to fights you can win. I realize you want your combined CPU+GPU to stand out from what Intel is doing, so perhaps we can compromise: you can call it a “composite processing unit” and then we’ll shorten that to “CPU”

      • ClickClick5
      • 9 years ago

      More ridiculous terms have caught on before. “Jailbreak” is one of them. The term “modding” or “unlocking” works just as well, and they sound less demented too.

      So silly-sounding terms do catch on. I believe AMD realizes this….

      Don’t be shocked if “APU” catches on.

      • ssidbroadcast
      • 9 years ago

      Dear UberGerbil:

      Thanks so much for your invaluable feedback. Did you know that sharks don’t have any bones in their body? Pretty neat, huh?

      Sincerely,

      AMD Marketing Dept.

      • ludi
      • 9 years ago

      Thank you, come again.

      • amirol
      • 9 years ago

      Dear AMD: you are not Intel, with the resources to do everything and milk the water. You are not GeForce, able to force money-making policies. In your situation the end doesn’t justify the means. Please do what you did before: make CPUs for the average user. They need workstations, not money-drawing stations. Be as we are, do as we need. Thank you.

      • ronch
      • 9 years ago

      It should be called ‘Complete Processing Unit.’ Maybe AMD can throw in audio functionality as well to make it really complete.

    • sschaem
    • 9 years ago

    The n-body result was 30 GFLOPS. What does that match, a 5650?

      • Game_boy
      • 9 years ago

      It has ~400 shaders looking at the die shot. So, if it isn’t memory bandwidth limited, that performance region is expected.

      • Voldenuit
      • 9 years ago

      I’d say that the system was choked by running a HD video stream and multiple HyperPi instances at the same time.

      Because otherwise, a N-Body result of 30 GFLOPs is pretty pathetic for a GPU (Zacate scored 23 GFLOPs, and it only has an 80-shader GPU compared to Llano’s rumored 400).

        • sschaem
        • 9 years ago

        My understanding:
        – The n-body DirectCompute test is 99.9% GPU bound
        – Video decoding doesn’t use the shader hardware, but dedicated video decoding logic (very low memory overhead)
        – HyperPi is 100% CPU bound (L1 cache)?

        This way they run all parts of the chip in a discrete fashion, giving most of the system bandwidth to the n-body app.

        So what is making the n-body run possibly 3x to 4x slower than expected?

        Could it be that the Llano GPU has 100 shader units, not 400?

          • Game_boy
          • 9 years ago

          Memory bandwidth would be similar to an IGP unless AMD are very clever. No 128-bit GDDR5 setup like 400 shaders needs. Not even Sideport looking at the information on the southbridge.

          • Voldenuit
          • 9 years ago

          It could be that the system is memory bandwidth bound, or that the 1% that is CPU dependent is stuck waiting on the CPU because of the other running processes.

          Also, this is engineering hardware that is probably running at much lower speeds than the final product, and probably running slower RAM as well.

          Still, this is a lot slower than expected.

            • JumpingJack
            • 9 years ago

            This is also a mobile part; you can tell from the reference system that they’re using a mobile cooler.

            Higher-performing GPU/CPU portions should be expected on the desktop because they will clock higher; a large part of this 3x-to-4x shortfall is likely due simply to the thermal constraints of a mobile processor.

            My guess anyway.
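The back-of-the-envelope arithmetic running through the subthread above (peak shader throughput and memory bandwidth) can be sketched as follows. The shader count, clock speed, and memory speeds used here are illustrative assumptions, not confirmed Llano specs:

```python
# Rough peak-throughput estimates of the kind used in the thread above.

def peak_gflops(shaders, clock_ghz, flops_per_clock=2):
    """Peak single-precision GFLOPS: shaders * clock * 2 (one MAD = 2 FLOPs)."""
    return shaders * clock_ghz * flops_per_clock

def dram_bandwidth_gbs(bus_bits, data_rate_gtps):
    """Peak memory bandwidth in GB/s: bus width in bytes * effective transfer rate."""
    return bus_bits / 8 * data_rate_gtps

# Hypothetical figures, for illustration only:
# 400 shaders at ~500 MHz (a plausibly down-clocked engineering sample)
print(peak_gflops(400, 0.5))            # 400.0 GFLOPS peak vs. the demoed ~30
# Dual-channel DDR3-1333 (128-bit, 1.333 GT/s) vs. a 128-bit GDDR5 card at 4 GT/s
print(dram_bandwidth_gbs(128, 1.333))   # ~21.3 GB/s
print(dram_bandwidth_gbs(128, 4.0))     # 64.0 GB/s
```

On these assumed numbers, the demoed 30 GFLOPS would be under a tenth of peak, which is consistent with the memory-bandwidth-limited or down-clocked explanations offered in the thread.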

    • Ozenmacher
    • 9 years ago

    Homeboy holding the wafer there looks like he does daily jaw workouts.

      • not@home
      • 9 years ago

      lol, the crazy thing is, he looks exactly like my brother (only taller), who works out every day.

    • Hattig
    • 9 years ago

    Personally I couldn’t care less about ten percent difference in CPU performance, but having 5x the GPU performance available for gaming, even casual gaming (at 1680×1050 in a laptop), would be a major benefit.

    It depends on the pricing of course. Llano is what? 190 – 220 mm^2? That’s not a lot of money per die. It’s 20W as well in this demo – that’s really good for a quad-core processor with integrated graphics, even if it turns out the cores are running at low clock speeds to get this TDP.

    Also that motherboard has no visible memory. Is it on package?

      • Voldenuit
      • 9 years ago

      That 20W is for Zacate, which is the dual core Bobcat part.

      • NeelyCam
      • 9 years ago

      How did you come up with 20W?

      • srg86
      • 9 years ago

      I’d rather have the extra CPU performance personally, I don’t play games and my 3D needs are modest, but I like as much CPU power as possible for my uses.

    • Meadows
    • 9 years ago

    Someone said Dirk Meyer looked scary, but I rather think this guy is the greater evil.

      • Voldenuit
      • 9 years ago

      With that plaid suit and loud tie, he could be Stan the Used Boat salesman from Monkey Island!

        • ludi
        • 9 years ago

        Clearly, he climbed his way up from Marketing. Where he found a marketing division in AMD is perhaps a greater mystery, but the clothes do not lie.

    • ronch
    • 9 years ago

    Mid-2011? They’re really taking their time on releasing products, aren’t they? Or perhaps they want Intel to beat them to market first? Move, AMD.

      • EsotericLord
      • 9 years ago

      Intel? Their GPU technology is pathetic at best and sickening at worst. AMD is in a unique position thanks to their ATI buyout in that they make both high-end processors and GPUs. Plus, new tech isn’t something you want to rush.

      If this works, it will be huge.

        • ronch
        • 9 years ago

        I really don’t care what Intel comes up with, as I stick with AMD anyway, but AMD’s been hyping this product for years, even scrapping Swift. It’s about time they showed us some real product. Don’t you agree?

      • amirol
      • 9 years ago

      If Intel could beat them it would bite with their bee the lara (not the craft)!!
      The ATI buyout was the best policy for those days, when the green snake was biting bucks out of pockets with its milk cow, the 8xxx series. If they let the green snake rear its head, the glorious days come to an end. So they need to release, but we need real movements besides new releases.

    • d0g_p00p
    • 9 years ago

    APU = accelerated processing unit?

      • dpaus
      • 9 years ago

      or Armoured Personnel Unit, or manager of the Kwik-E-Mart… I think there’s already been a thread on this.

    • djgandy
    • 9 years ago

    APU is the new cloud in this thread. It’s just a CPU and a GPU.

    Also have Intel camped out in a hotel across the road?

      • Voldenuit
      • 9 years ago

      Well, there is a difference, since intel’s GPUs are not APUs (can’t do GPGPU/OpenCL/CUDA/DirectCompute).

      I’m betting intel deliberately left compute capabilities off Sandy Bridge to spike AMD’s technological lead in this area. Since intel has the compiler and developer advantage, they can close off the market by simply not competing. Meanwhile, by delaying market adoption, they can chip away at AMD’s lead in their development labs.

      Sucks for consumers and the industry, but hey, welcome to the marketplace.

        • sweatshopking
        • 9 years ago

        sounds about right.

      • indeego
      • 9 years ago

      "Also have Intel camped out in a hotel across the road?" A company like Intel, I imagine, owns the hotel, the road, and the microphones that have been recording everything since long before they got there. Not that I’m implying industrial spying or anything.

    • tejas84
    • 9 years ago

    Voldenuit is GOD!

    You are absolutely right. Software is what matters. Also I highly doubt that the majority of folks who buy laptops do so to play DX11 Aliens Vs Predator.

    Basic GPU performance and High CPU performance per watt with excellent battery life like Sandy Bridge is what customers want in the laptop space.

    AMD believe that they have won the PC market when in actual fact their leadership will only ever remain in the GPU market.

    Call me the day AMD’s commitment to software is the same as NVIDIA’s or Intel’s. For this reason alone I hope Larry Ellison buys AMD and kicks some ass.

    Their CPUs are and always will be a pile of steaming turd…

      • bwcbiz
      • 9 years ago

      If AMD is too stupid to provide a video driver that makes Llano look like a normal GPU to Windows they deserve to lose the market over this. Don’t get your shorts in a knot about software apps. The key will be the driver and the framerates Llano can provide under the driver.

        • Voldenuit
        • 9 years ago

        I don’t think anyone is even remotely suggesting that AMD will have any difficulty making a working GPU.

        It’s whether they can leverage the potential parallel computing capabilities of the platform in any meaningful way to make up for the vastly underpowered K10 CPU core that’s the big question.

          • ronch
          • 9 years ago

          And that’s exactly the point.

          ATI Stream technology has been around for a while now, but when I tried it with AMD’s AVIVO Video Converter, I was disappointed to find my HD 5670 1GB isn’t much faster than my Phenom II X4 at transcoding videos. The output quality isn’t better either (it was a bit worse, actually). After some investigation I found out that the GPU only ran at 400MHz while transcoding instead of the full 775MHz. It’s disappointing that video transcoding is the only real-world non-graphical application AMD has for its GPU, and that it’s not too good at it, either.

          But the kicker is, what makes anyone think combining the CPU and GPU in one piece of silicon will vastly improve performance? If AMD can pack 800 stream processors in its APU and pit it against a Phenom II + HD4870, then yes, it should be better. But if Llano only has 40 stream processors, no amount of integration will make it win over 800 ponies.

          So can we expect this new APU to be better at doing the job compared to a discrete graphics card with more ALUs? Can we expect more applications to benefit from GPGPU computing? Maybe not. To me, all this APU hoopla just sounds like moving the IGP from the north bridge onto the CPU die, nothing more.

          And yes, I am pro-AMD. Just my constructive criticism. Kudos to AMD marketing people for coining the term ‘APU’.

            • Anonymous Coward
            • 9 years ago

            I don’t know what you or the other posters are going on about. Llano is just a GPU on die with some CPUs, and it should be cost effective. That’s it.

            • ronch
            • 9 years ago

            From the AMD website:

            “APUS… enabling breakthroughs in visual computing, security, performance-per-watt and device form factor. Software developers, utilizing AMD drivers, libraries and either the ATI Stream SDK2 or the Microsoft DirectCompute API, can enhance the user experience and speed application performance by developing applications that fully utilize the unique compute power of the AMD Fusion™ Family of APUs.”

            They may just be right about the performance-per-watt and device form factor claims, since you would no longer need a discrete GPU for the same level of graphics performance. But with all the hype about Fusion since 2006, AMD is creating the impression that Llano is gonna be more than just a CPU and GPU glued together. Also, from what they’re saying above, it’s easy to see that those things can, in theory, already be done today with current CPUs and discrete GPUs; they’re just hampered because AMD’s GPGPU infrastructure isn’t quite there yet, with even AMD’s own AVIVO converter showing rough edges.

            This brings to mind the time when AMD placed the memory controller on-die and called it a breakthrough. From a performance standpoint it is, but it isn’t something that’s particularly difficult to do. Same thing with putting a CPU and GPU on a single piece of silicon. In fact, many Intel products which combine CPU and GPU either on-package (Core i3 and i5) or on-die (Atom N450) have beaten AMD to market. Intel’s GPU may suck, but they did it first. Tooting the Fusion horn since 2006 and being beaten by Intel to market isn’t a nice thing to see.

            I agree though, that AMD’s Fusion will be a more compelling choice for Notebooks and other portable form factors, and perhaps even desktops which put more importance on well-roundedness rather than sheer CPU performance. I hope laptops based on it will come to market soon enough, as my Acer Aspire is showing signs of aging.

            • sweatshopking
            • 9 years ago

            of course it’s not going to be faster than a 4870. that would be madness to suppose such a thing. the idea is to provide SOME level of performance, and reduce energy usage. I don’t know what all you guys are complaining about. I’ve used phenom 2’s and i7’s and for 99% of tasks, and 100% of layman uses, the phenom 2’s fast enough to provide seamless usage. This part will be a step up for 99% of people using computers, not down. Yes I agree that the nehalem is faster. Duh. but stars is fast enough, and with a good gpu, it will provide a more compelling experience. That’s not to say i believe this chip will take market share, cause I don’t think it will. But i think for most users, it will be a better product, just one that will be largely ignored.

            • Anonymous Coward
            • 9 years ago

            I imagine it’ll see good sales in OEM machines, beating any current AMD processor.

            • sweatshopking
            • 9 years ago

            lol. that’s not hard to do. I don’t imagine it will affect market share too much. It’ll possibly sell well enough to help them be slightly profitable, but nothing more.

            • ronch
            • 9 years ago

            Of course the Stars architecture is fast enough. I’m using a Phenom II X3 720 with one core unlocked, effectively making it an X4 925. No plan to upgrade for say, 2 more years. By that time Bulldozer should be affordable and widely available.

        • amirol
        • 9 years ago

        really they did and the Athlon days will come soon

    • amirol
    • 9 years ago

    Love you cool men.. Do better ….Loved best.

      • Palek
      • 9 years ago

      Somebody set up us the bomb.

        • Meadows
        • 9 years ago

        What you say

          • designerfx
          • 9 years ago

          you have no chance to survive make your time

            • amirol
            • 9 years ago

            Consider the market without AMD competing. You guys call Dirk evil; what a devilish, selfish thing to say. If this "evil," as you think of him, weren’t around, these comments would be about the Core 2 Quad 6600, not Sandy or Fusion. Support them both to support the customers.

    • Voldenuit
    • 9 years ago

    Software. Software. Software.

    Software support will make or break Fusion for AMD. There’s still distressingly little hoopla from developers to support APU-style computing, and without that, all those fancy transistors sitting in the GPU part of the die are just so much dead weight in mainstream apps.

    I really want APU/Fusion/GPGPU to work out, but AMD’s (and ATI’s) abysmal history of garnering and fostering developer support says otherwise. TruForm and 3DNow! gained pretty much zero support. This is not solely a function of market share or perceived dominance, as intel was able to champion SSE even during the Athlon 64’s ascendancy (although AMD did win on x86-64 by default). Rather, I think AMD is too reliant on the ‘build it and they will come’ philosophy, instead of actively engaging and attracting developers (as nvidia does with CUDA).

    Llano and Bobcat are not simply another CPU product. They are a stab at a new paradigm in computing, and unless AMD is as aggressive at promoting and developing software tools for their new babies as they are about architecture and fab process, they will fail. Even with all cylinders firing, though, the fact that Fusion products will only account for (at most) 20% of new sales and an infinitesimal percentage of the installed hardware base means that this will be an uphill battle for them. Godspeed and good luck.

      • OneArmedScissor
      • 9 years ago

      Lots of people play computer games, just not very demanding ones. Lots of people also use laptops and no desktop. There’s your “mainstream application.” The end.

        • Lans
        • 9 years ago

        I do agree the software ecosystem will largely determine how (un)successful Fusion will be. But this gives me a lot of hope:

        /[

        • Voldenuit
        • 9 years ago

        On the laptop front, Llano will be competing with Sandy Bridge. Which will kill it in performance/watt and thermals. Sure Llano will have a better IGP, but gaming will necessitate plugging it in to the wall, at which point you’re better off with a desktop in the first place.

        On the desktop, $95 (the going rate for a Radeon 4850) will demolish Llano in gaming benchmarks. A 4770 or 5750 will do the same at much lower thermals.

        Llano is essentially a K10 CPU with a decent IGP. I don’t think that by itself will be anywhere near enough to reverse the market share stranglehold that intel has.

        If they came out guns blazing with a dozen free APU-enabled apps that let the average user encode their HDcam videos at 5x the speed of Sandy Bridge, simultaneously record (and encode) and watch multiple TV tuner streams, run monte carlo simulations on particles in real-time (ok, joe average probably isn’t going to use this), then they’d have something that intel cannot match or offer.

        It’s frustrating that AMD has the technical ability to do this but not the software or developer support.

          • Hattig
          • 9 years ago

          Voldemort wrote “On the laptop front, Llano will be competing with Sandy Bridge. Which IN MY OPINION will kill it in performance/watt and thermals. Sure Llano will have a better IGP, but IN MY OPINION gaming will necessitate plugging it in to the wall, at which point IN MY OPINION you’re better off with a desktop in the first place.”

          There, fixed what you wrote.

          • OneArmedScissor
          • 9 years ago

          For making such a stink about AMD’s new CPUs not having “mainstream application,” you sure don’t seem to understand what that is.

          Sandy Bridge…seriously? Every mobile chip so far boosts to over 3 GHz. Remember those GPUs in the AMD chips? Well, Intel has them, too – running at a blistering 1.3 GHz.

          Welcome to the wonderful world of physics. High clock speeds kill batteries. That’s not Intel’s concern. This is their high end. Their concern is throwing big numbers at you, including the $300 price tag for the CPU by itself.

          Protip: people don’t like spending money they don’t need to.

          A 1 GHz-ish, low end CPU is not only enough for most people, it’s exactly the type of thing that makes the most sense for a laptop that’s only going to be used for the internets, word documents, and maybe WoW.

          The mental image of someone walking into Best Buy’s computer department, approaching an employee, and asking, “Where is your laptop with the best performance per watt?” is killing me.

            • Voldenuit
            • 9 years ago

            Eh? That’s a lot of vitriol in response to a post that just said (and I think many ppl agree with) that AMD needs to buck up their developer relations bigtime to take advantage of their strengths.

            I pretty much spelled out that if AMD can leverage APU support (in any space, not just mobile) on the software side, they will offer something intel can’t hope to match for another generation or two, if ever.

            I just don’t see that happening, is all.

            • ludi
            • 9 years ago

            Actually, the average Best Buy purchaser either wants the fastest thing their money will buy in a particular form-factor, or they want the one that looks lightweight and has lots of battery time — both of which are, in fact, related to performance/watt.

      • blastdoor
      • 9 years ago

      Well, there is a company out there that appears to value GPU performance so highly that it is willing to continue using C2D in many of its products just so that it can avoid crappy Intel integrated graphics and continue to use NV graphics. It’s not clear yet why Apple is doing that, but it might be more clear tomorrow. Of course, Apple doesn’t use AMD chips but that sort of thing can change.

        • sweatshopking
        • 9 years ago

        it can change. Just like misty’s heart changed, and she started to love ash. But what about brock? where does that leave him?

        • Voldenuit
        • 9 years ago

        Yeah. Every new Mac right now has OpenCL capabilities. Surely Apple must have /[

          • sweatshopking
          • 9 years ago

          but why bother? I never got the whole point of redoing all that video. When I want a movie, say for my ipod, i just go and download it. Why bother transcoding?

            • Voldenuit
            • 9 years ago

            I’m referring to home made video captured off my DSLR. My camera records in either MJPEG (which is horribly inefficient) or H.264 (but in a low compression format suitable for realtime capture). Either of which would benefit greatly from recompressing so they are more palatable for distribution/storage.

            And then (or before) there’s the editing to string together scenes and to cut out the fluff.

            • sweatshopking
            • 9 years ago

            gotcha. makes sense in that environment.

      • Anonymous Coward
      • 9 years ago

      The relative lack of software that uses GPUs for something other than graphics probably says a lot more about what GPUs are actually good for than it says about how motived developers are, or how motived AMD is. Meanwhile, AMD is able to put a GPU on die with CPUs and improve their price/performance situation relative to Intel, which keeps AMD in business. Nothing to worry about.

    • ssidbroadcast
    • 9 years ago

    In more important news: AMD has made /[

      • Metalianman
      • 9 years ago

      If you really believe this then I don’t know what to say. AMD was first to start building that technology; that was the reason for buying ATi!!! Sure, they were left behind by Intel, since financial troubles had them scratching their heads for ways to stay afloat, but be serious!!!! Not to mention that you will see almost nothing that resembles, even remotely, Intel’s Sandy Bridge.

      Now, back to the matter of this new chip. AMD will need to keep its power consumption really low, especially for the mobile platform parts. If they’re close, really close, to Intel on that factor, then most of the battle is already won.

      Yes, I don’t deny that software will play a huge part of Llano’s fate and that AMD has a bad history in that department, but you never know…

      I think it’s about time AMD stepped up; Intel has held the reins for far too long. They were close to Intel before the ATi buyout, and hopefully they’ll get that close again soon enough (in the next 3-5 years or so).

      This kind of technology will be either the making or the fall of AMD.

        • Palek
        • 9 years ago

        Whooosh!!!

      • dpaus
      • 9 years ago

      Yawn. Wake me when the Deborah Harry model is available.

        • ssidbroadcast
        • 9 years ago

        Sorry, will the Daryl Hannah model suffice?

          • dpaus
          • 9 years ago

          Well, duh!, the whole point of replicants is that I can have more than one! I’m thinking a nice 4-way SLIppery configuration….

    • bdwilcox
    • 9 years ago

    At first, I thought Chris Cloran was all gangsta’ and that wafer he’s holding was a huge clock he hung around his neck to show white man’s time is running out.

      • Meadows
      • 9 years ago

      You have some strange thinking.

        • sweatshopking
        • 9 years ago

        psssshh. Do you see that mans chin? his time is NEVER running out. any problems, he says “Yo! you see this Mofo’in chin? that’s what I thought. Now give me that cheeseburger at no charge!”
