Apple hires Trinity platform architect John Bruno

One of the brains behind AMD’s Trinity APU has a new job—and no, he wasn’t promoted within the chipmaker. According to SemiAccurate, Bruno was actually fired last November as part of AMD’s most recent workforce decimation. Since then, Bruno has taken up a new position… at Apple.

SemiAccurate says Bruno now holds the title of System Architect at the Mac maker. The folks at SlashGear tracked down his LinkedIn profile, which you can find here, and the information checks out. It looks like Bruno was an ASIC Design Manager at ATI before joining AMD as Senior ASIC Design Manager. His last title at AMD was “System Architect, Office Of The Chief Engineer.” Bruno also has a degree in engineering from the University of Toronto. SemiAccurate calls Bruno “the guy who brought you Trinity,” but his profile suggests he worked on the platform spec for the APU rather than architecting the chip itself.

This isn’t the first time an AMD executive has defected to Apple. Three years ago, we learned that Bob Drebin and Raja Koduri both joined Apple after holding the exact same job at AMD: Chief Technology Officer of the graphics product group. Koduri actually replaced Drebin as AMD’s graphics CTO when Drebin left in early 2008. Koduri then went on to follow in Drebin’s footsteps.

Apple, of course, has been hiring folks to work on its own A-series chips. Apple snatched up P.A. Semi in 2008 and, two years later, introduced its A4 chip, which powered the iPad, iPhone 4, fourth-gen iPod Touch, and second-gen Apple TV. Since then, Apple has released the A5, which can be found inside the iPad 2 and iPhone 4S, and the A5X, which powers the latest iPad. The chips are all designed in-house using third-party IP (namely ARM CPU cores and PowerVR graphics).

Comments closed
    • ronch
    • 7 years ago

    It seems we’re more interested in where AMD employees end up than in any other company’s employees who bail or get fired. AMD really gets a lot of attention, understandably, given how important its industry position and function are and how fragile its situation is.

    • Sam125
    • 7 years ago

    If I were on AMD’s board of directors, I’d want to fire all the system architects responsible for the disaster that is Bulldozer as well.

    Seems AMD fired the wrong architect, though. Fire your deadwood AMD people, not your ATI people, you idiots. lol

    Anyway, picking up talent that was fired is in no way disloyal. If AMD didn’t want this guy to defect to Apple, they shouldn’t have fired him.

      • shank15217
      • 7 years ago

      Oh please, just stop. You have no idea what Bulldozer is all about; all you give a crap about is how fast it encodes your pr0n. AMD wouldn’t stand a chance if they didn’t move off their nine-year-old design. Even if it didn’t meet performance expectations, it was priced to sell, and one year later Piledriver will bring performance up significantly on the same socket across the board.

        • Sam125
        • 7 years ago

        Well, the BD design team made some unfortunate design choices, such as sharing an FP unit between two cores (how was that ever considered a good idea?), sharing the L2 cache, and a few others that are probably too subtle for a layman like me to notice. I know nothing about ASIC design, and even I can tell you that the first two were terrible decisions. The reason the Athlon architecture was successful was its extremely robust FPU. Why the BD design team decided to abandon that lesson learned with the Athlon, essentially neuter BD’s FP resources, and tie so much of the CPU’s performance to the shared L2 cache and branch prediction accuracy is beyond me.

        Now that AMD has committed to BD, those unfortunate design choices are pretty much set in stone, which is why the company has been banking on its APUs being successful. However, fumbling the desktop release of Trinity makes it almost seem as if no one at AMD even wants to try anymore. It’s unfortunate, and although AMD will more than likely pull through as it always does, some better choices at the beginning would’ve made its offerings much more compelling today.

          • Jason181
          • 7 years ago

          While I agree those decisions were disastrous for desktop applications (gaming!), they did it because most server applications are heavily integer-based.

           If they were going to make those kinds of tradeoffs, though, it would have been smart to keep the Phenom line for desktops.

          The problem I see is that BD’s performance ranges from stellar to abysmal, depending on the application. They’ve sort of conceded the high-end desktop space to Intel, but Trinity looks pretty good for lower power applications.

            • Sam125
            • 7 years ago

            [quote<]...they did it because most server applications are heavily integer-based.[/quote<] Yeah, you're absolutely correct, but IIRC BD doesn't improve on integer-based applications clock-for-clock compared to the A64. Bulldozer is designed to scale performance with clock speed, which really only works if you have a process advantage, which AMD doesn't.

            • Jason181
            • 7 years ago

            They just threw more “cores” at the problem though; BD has twice as many integer units as A64 (since you didn’t specifically mention Thuban). I’m not under the impression that they increased the clock-for-clock performance. 🙂

            • Anonymous Coward
            • 7 years ago

            [quote<]While I agree those decisions were disastrous for desktop applications (gaming!), they did it because most server applications are heavily integer-based.[/quote<] We have hardly any data that can be used to draw such a conclusion. Once Trinity is available at desktop wattages, we can see how it compares in CPU tasks to the 32nm K10 cores in Llano. I don't think much can be concluded from BD. AMD routinely messes things up the first time around, and 32nm has been difficult. It would be hard to demonstrate that 32nm K10 can beat 45nm K10...

            • Jason181
            • 7 years ago

            Well, except BD is what we were talking about.

            • Anonymous Coward
            • 7 years ago

            Are you saying that my comments do not apply to BD?

          • Anonymous Coward
          • 7 years ago

          [quote<]Well, the BD design team made some unfortunate design choices, such as sharing a FP unit between two cores (how was that ever considered a good idea?), sharing the L2 cache and a few others that are probably too subtle for a layman like me to notice.[/quote<] Who the hell are you to criticize these choices? Come back in 10 years and we'll see which choices were found to have merit, and which were not. [quote<]However, fumbling on the desktop release of Trinity makes it almost seem as if no one at AMD even wants to try anymore.[/quote<] You should just stop sharing your thoughts.

            • NeelyCam
            • 7 years ago

            More AMD engineers getting insulted…?

            • Anonymous Coward
            • 7 years ago

            All we can say is that 32nm BD, the combination of design and manufacture, was not a competitive product.

            It's ridiculous to see someone claiming a shared FPU is [i<]obviously[/i<] a poor choice, and it's just a failure of logic when someone says a two-way shared L2 is to blame. It's not as if there hasn't been a highly successful, high-performing, mass-produced processor with a two-way shared L2. The shared FPU is a feature to watch over the coming years.

        • heinsj24
        • 7 years ago

        My 8120 may not take away any awards for being the fastest at running benchmarks, but it can certainly multi-task like a mother…

          • Jason181
          • 7 years ago

           If you can keep 8 integer cores busy, performance can be very good. If it’s single-threaded performance or heavily multi-threaded FPU performance, it’s really not good. It just depends on the applications you use. As a small business server, it would probably be fantastic; as a gaming computer, it’s uneven at best.
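[Editor’s note: the workload-dependent scaling described above can be sketched with a toy model. This is an illustrative sketch with assumed unit counts, not measured FX-8150 data, and `effective_units` is a hypothetical helper, not a real API.]

```python
# Toy model of Bulldozer-style module scaling. Each module pairs two
# integer cores with one shared FP unit, so integer-heavy workloads
# can occupy twice as many execution units as FP-heavy ones.
# Unit counts are illustrative assumptions, not measured chip data.

def effective_units(threads: int, modules: int = 4, workload: str = "int") -> int:
    """Return how many execution units `threads` threads can keep busy."""
    if workload == "int":
        # Two independent integer cores per module.
        return min(threads, 2 * modules)
    if workload == "fp":
        # One FPU per module, shared by both of its cores.
        return min(threads, modules)
    raise ValueError("workload must be 'int' or 'fp'")

if __name__ == "__main__":
    # An 8-thread integer workload saturates all 8 cores, but the same
    # thread count on FP work contends for only 4 shared FPUs.
    print(effective_units(8, workload="int"))  # 8
    print(effective_units(8, workload="fp"))   # 4
```

The “stellar to abysmal” range falls out of the model directly: integer throughput scales to eight threads, while shared-FPU contention caps FP scaling at half that.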

        • sschaem
        • 7 years ago

        A 32nm Phenom III X8 would deliver better performance per watt.

        An X8 with AVX, FMA, and all the other tweaks would have saved AMD, and it wouldn’t be any bigger than Llano. Most likely 3.8GHz turbo on two cores (great for gaming) and 3.1GHz with all eight cores. Running AVX code, this X8 would crush the four FPUs of the FX-8150.

        BTW, what is Bulldozer good at?

        • NeelyCam
        • 7 years ago

        AMD promises all sorts of things but hasn’t delivered for a long time.. why should it be different now?

        You sort of sound like an AMD engineer who is utterly insulted by muggles criticizing AMD’s perfect products

        • ronch
        • 7 years ago

        I plan to get a Piledriver-based Vishera CPU when it’s available for a good price. The current FX is not a bad CPU if you take Intel out of the equation, but the thing is, Intel will always be a part of the equation (actually much more so than AMD), and that’s when you realize you can do much better than an FX when it’s time to buy your next rig.

        AMD simply can’t pretend that they have no competition because Intel is always present wherever AMD products are sold (but not vice versa). And the reality is that no matter how much effort AMD exerts to convince people that their chips are ‘good enough’, those partial to neither company and those who aren’t willing to make sacrifices for AMD’s sake will always see the advantages Intel is offering in terms of price, performance and power efficiency.

    • AGerbilWithAFootInTheGrav
    • 7 years ago

    The writing has been on the wall for AMD for a while now. The only shame in all of this is that they bought ATI when they did, and they’re really going to destroy both themselves and the GPU competition in one go… the GPU guys might survive a bit longer on their own, but all those re-orgs can’t help them run the business as it could have been run with a solid organization around it.

      • forumics
      • 7 years ago

      amd buying ati was supposed to be a good thing; bulldozer was supposed to feature one large FPU instead of many small and incapable FPUs.
      if only the practical side of things matched the theoretical ideology

    • anotherengineer
    • 7 years ago

    “This isn’t the first time an AMD executive has defected to Apple.”

    “DEFECTED”

    Cyril, if he was fired/let go/laid off and then found a new job, how is that “defection”??

    Wouldn’t it be AMD who is the defector by terminating his employment?

    • WaltC
    • 7 years ago

    [quote<]This isn't the first time an AMD executive has defected to Apple.[/quote<] Well, according to your own article here--Bruno did not "defect." He was canned at AMD and only later hired by Apple. No "defection" apparent. I see nothing unusual about Apple picking up the crumbs that fall from AMD's table...;)

      • NeelyCam
      • 7 years ago

      The point is that you’re supposed to be loyal enough that even though you get laid off (so execs would make more money with their stock options), you should neverever go work for anyone that could even remotely be considered a ‘competitor’.

      Bruno the Traitor broke the unwritten rule.

        • tay
        • 7 years ago

        Hogwash. See Marissa Mayer and Yahoo.

          • NeelyCam
          • 7 years ago

          Mayer was intentional – she still works for Google, but draws her salary from Yahoo. And stock options aren’t affected by her maternity leave, so her compensation will continue nicely even though Yahoo crashes/burns.

          It’s sort of like what Elop is doing with Nokia.

            • derFunkenstein
            • 7 years ago

            at first I thought maybe you were kidding about the “being loyal” thing after getting shit canned, but now I’m not so sure..

            • NeelyCam
            • 7 years ago

            I’m tricky

        • Cuhulin
        • 7 years ago

        There is no such unwritten rule.

        Every former employee goes where his or her experience is most valued. Usually, that is to a competitor.

        If AMD doesn’t want its employees going to competitors, it shouldn’t fire them.

        The problem is with AMD senior management (if one can call them that), not with Bruno.

          • BobbinThreadbare
          • 7 years ago

          Your sarcasm meter needs some adjusting.

            • NeelyCam
            • 7 years ago

            Yeah.. sounds like chuckula broke everyone’s sarcdets… now everything passes through

          • sschaem
          • 7 years ago

          The question that matters: why did AMD feel compelled to fire J. Bruno, and why did Apple feel compelled to hire him as a “System Architect”?

          Not good enough for AMD but good enough for Apple?

          Most likely Bruno was pissed off every day he went to work, and AMD management let him go not because he wasn’t useful, but because he wasn’t a ‘team’ player.

          This is one of the big issues AMD is facing. Talented people at AMD (like their OEM partners) are fed up.

          AMD has been on life support for over three years… isn’t it time to pull the plug? Because whatever R. Read is doing, it’s not working.

          • stmok
          • 7 years ago

          [quote<]There is no such unwritten rule.[/quote<] That depends on where you are in the world. In America, Australia, etc., one would consider it the norm to work for someone who values your skills. No one sticks around for very long, as management has adopted the notion that firing people (under the banner of "restructuring to be competitive") is easier than actually being a proactive leader who solves the company's problems. You do that by looking at the processes of your production lines or development, and by regularly going down to see people on the shop floor. (Not staying in your ivory tower, oblivious to what's really going on!) In Japan, Germany, Korea, etc., one is loyal, as long-term employment with a company is the norm. You do your best for your boss; the boss looks after you. As an employee, you are part of the boss's extended eyes and ears, seeing whether the quality processes are functioning correctly and what could be done better. Firing people is a last resort. There is an expectation that the boss will re-invest in their people via training in new technologies, methods, etc. The whole notion they work under is constant improvement. Where did they get this notion from? From the USA. Not the current USA; the USA of WWII. They learned, studied, and applied the principles of "The Greatest Generation" of Americans, the principles that made America a global leader. Ironically (or unfortunately, depending on your view), current America has completely forgotten those lessons of the past. It has to re-learn them. (I know some American companies are re-learning them. Most are not.)

        • willyolio
        • 7 years ago

        in your little fanboy world, maybe. is that why you’re eternally loyal to intel no matter what or something?

        • internetsandman
        • 7 years ago

        I thought you’d be supporting a “traitor” to AMD

        In all seriousness, I’m fairly sure in any corporate world, the only loyalty that matters is to the company that pays your salary. If they’re not paying your salary anymore, then why do you care what they think or do? Go to someone else who WILL pay your salary.

    • derFunkenstein
    • 7 years ago

    Johnny Bruno sounds like a 1930s-style, booze-running gangster.

      • anotherengineer
      • 7 years ago

      Sounds more American than Canadian 😉

        • derFunkenstein
        • 7 years ago

        Indeed, and with my joke about prohibition-era America, that was kind of the point. #derp

          • NeelyCam
          • 7 years ago

          Don’t be mean. We’re all friends here

    • forumics
    • 7 years ago

    well i hope that one day apple won’t = amd, because although i love amd and have always supported their products, i hate apple to the core.
    i would be torn between choosing apple or intel, which is another company i try my best not to buy products from because i support amd

      • Deanjo
      • 7 years ago

      [quote<]well i hope that 1 day apple won't = amd[/quote<] I don't think Apple could bring themselves down to that level.

      • ludi
      • 7 years ago

      I used to feel that way about AMD and Apple, but then the crushing burden of adulthood set in, and suddenly it didn’t matter as much.

      • Jason181
      • 7 years ago

      I would buy an AMD computer if only the keyboard included a shift key.

    • jdaven
    • 7 years ago

    Apple also purchased Intrinsity supposedly for their high clock ARM tech.

    • Anarchist
    • 7 years ago

    looks like Apple is buying ATI from right under the nose of AMD …

      • ludi
      • 7 years ago

      I’ll bet they get it cheaper than AMD did.

        • sschaem
        • 7 years ago

        Almost for free, vs. almost $6 billion.

        Even so, Apple could acquire AMD without noticing it on their books, but they prefer to hand-pick the best parts of AMD and leave the rest to slowly rot away.

        Apple amazingly seems to make acquisitions at almost below book value. AMD, in contrast, overpaid 10x for SeaMicro… what a sign of desperation.

        AMD is managed by amateurs, wasting money and making decisions that are slowly but surely killing them. Dirk still holds the crown for the worst CEO AMD ever had (maybe less of a crook, but more destructive), and R. Read seems to be in way over his head.

        AMD’s reputation is also going down the gutter in the process, so when the company folds it will sell for peanuts.

          • anotherengineer
          • 7 years ago

          Is your real name Debbie Downer??

            • sschaem
            • 7 years ago

            Want more? 🙂

            AMD just confirmed that their revenue dropped 11%, 10% lower than the same time in 2011.

            But get this: they say it will drop by another 4% (that’s AMD’s CEO stating this) next quarter. AMD has been overly optimistic, so we might be looking at a 15% drop in sales.

            AMD also said they see even more trouble ahead in the next three months. “Strong headwinds”… usually this is a great time for PC makers, as it’s back-to-school laptop shopping season.

            edit: it’s also the time OEMs build inventory of Windows 8 laptops.

            But I feel you… AMD news is depressing.

            • NeelyCam
            • 7 years ago

            Yes, the results are in.

            I guess I should’ve noticed it 3mo ago, but the most shocking thing to me was the amount of money they paid GloFo to get that “Limited Exclusivity Waiver” in Q1… [i<][b<]700mil![/b<][/i<] I had no idea... That's worth, what, 5 years of AMD profits? Jesus, who the hell authorized that?!?

            • sschaem
            • 7 years ago

            Also, next month one of their debt payments is due (AMD borrows money like there’s no tomorrow), to the tune of 450 million.

            And they now spend 184 million a month on operations.

            BTW, three months ago AMD was giddily saying they would raise revenue 3%.

          • NeelyCam
          • 7 years ago

          Just out of curiosity… what would YOU do if you were the CEO of AMD…?

          Or… are you..?

          • blastdoor
          • 7 years ago

          YUP!

          Are there still people who think that buying ATI was a good idea? If so, I invite them to take a long, hard look at Apple. Apple managed to design a CPU+GPU SoC without having to waste billions on a takeover. AMD could have done the same thing and invested those billions in fab capacity, thereby better capitalizing on the CPU lead it once enjoyed.

          Oh well. Too late now.

            • BobbinThreadbare
            • 7 years ago

            The problem was timing. If they’d bought ATI one year later or one year sooner, it would probably have cost half what they paid. They bought literally at ATI’s high-water mark, right before the disastrous HD 2000 series came out.

        • HisDivineOrder
        • 7 years ago

        I’ll bet they do more with it, too.

        • ronch
        • 7 years ago

        AMD is unbeatable when it comes to paying too much for other companies.

    • chuckula
    • 7 years ago

    AMD fires the project lead for Trinity… Excuse me while I go write a palm-to-face routine. I can even multi-thread it for dual-palm systems….

      • dpaus
      • 7 years ago

      As any corporate legal department will tell you, you don’t fire someone like that without a damn good reason.

        • chuckula
        • 7 years ago

        How about firing John Fruehe instead? From his years-long Bulldozer disinformation campaign I can think of several “good reasons” why he could go. The Llano/Trinity APU line is the second-best product line that AMD produces (the best being the regular GPUs that ATI was good at long before the takeover). If I were going to fire the project manager of any product, it would be on the Bulldozer side of things, not Trinity.

          • NeelyCam
          • 7 years ago

          JF is a nice guy who actually made an attempt to communicate with the “fans”. It’s not his fault that the engineers lied to him

        • NeelyCam
        • 7 years ago

        Is “We’re bleeding money!!!!” a good enough reason..?

          • dpaus
          • 7 years ago

          Only if you’re responsible for making the money in the first place. That certainly doesn’t seem to be the case here.

            • NeelyCam
            • 7 years ago

            Well, AMD is bleeding money they “made” by 1) selling fabs and 2) whining about Intel abusing them

      • Damage
      • 7 years ago

      John wasn’t the project lead for Trinity. His LinkedIn profile tells the story. He was responsible for the reference platform spec, defining and coordinating the various bits that went into it. He is not a CPU architect.

      • sschaem
      • 7 years ago

      They lost WAY more people than it appears. Samsung and Apple have decimated AMD’s ranks in the past three years, and AMD, of its own making, ripped its heart out when it ‘gave’ its entire mobile division to Qualcomm (it lost so much talent and business that day).
      I even wonder if Dirk didn’t make the entire ex-ATI workforce despise AMD with this move alone, as ATI started building its mobile division in 2002 and it took a lot of effort to reach profitability.

      What’s left is a bunch of people tweaking what was done three years ago at a snail’s pace and missing deadlines.

      When AMD outsourced Llano to India, it was all tweaking an old (but solid) design from *2003*.
      They couldn’t even add any instructions, not even SSSE3 (the core of modern image/AV processing), which even Intel’s Atom has. The result: wasted 32nm potential for a chip that went EOL on release.

      AMD still spends over 200 million a month on ‘R&D’ (more like greasing VPs/management) for little to no innovation or breakthrough design.

      I wouldn’t be surprised if AMD announces another round of layoffs after losing 11% in revenue even though the PC market grew 3%… this paints a disastrous future for AMD.
