Raja Koduri joins Intel as head of new Core and Visual Computing Group

In a shocking announcement, Intel has confirmed that former Radeon Technologies Group head Raja Koduri will join the company as a chief architect and senior vice president of its newly-formed Core and Visual Computing Group. In that role, Intel says Koduri will bolster Intel's leadership in integrated graphics processors and—incredibly—complement those products with "high-end discrete graphics solutions for a broad range of computing segments."

With what is sure to be a wide-ranging mandate, Intel says Koduri will be responsible for delivering "differentiated IP across computing, graphics, media, imaging, and machine intelligence capabilities for the client and data center segments, artificial intelligence, and emerging opportunities like edge computing."

My jaw now sports a third bruise from hitting various flat, hard surfaces this week. Although we're likely a long way from seeing shipping products shaped by Koduri's team, Intel seems to be taking an incredibly pointed and aggressive step toward challenging Nvidia's white-hot momentum in machine intelligence and massively parallel computing applications.

In the near term, Koduri's influence could help to bring much-needed order to Intel's AI and visual-computing efforts, which have until now been scattered across a range of internal projects like the Xeon Phi accelerator and external acquisitions like Movidius and Nervana. Although those technologies are all exciting and potentially important on their own, the proceedings that have led to their home under Intel's big blue umbrella have all felt a bit disordered.

Koduri undoubtedly has a clear vision for the future of GPUs and massively parallel computing that's independent of any one company or technology, and Intel will be fortunate to have that vision on its bench of talent going forward.

Although machine learning and visual computing are perhaps the most important challenges he'll face, Koduri's hire also suggests that Intel is ready to fully embrace the excitement around PC gaming that's continued to strengthen even as the broader PC market has contracted quarter after quarter for multiple years in a row.

Many a budding gamer has likely used an Intel IGP to dip their toe into Dota 2 or League of Legends, but the company has never been able to hold onto that excitement as that same gamer sets their eyes on higher-resolution displays, higher refresh rates, higher-quality graphics, and the accompanying bottomless thirst for pixel-pushing power that those gaming experiences require. Assuming I'm reading Intel's press release correctly, that's all set to change.

Koduri's expertise leading teams that have produced those high-end graphics products, combined with Intel's massive resources, world-class process technologies, and an apparent newfound commitment to producing high-end discrete graphics processors, has the potential to produce an epochal shift in the balance of power among Intel, Nvidia, and AMD.

Presuming Intel puts its Carolina Reapers where its mouth is, we could see a remarkable reshuffling of the names on the chips that power high-performance PCs. The only question is how soon that new era will dawn, but one thing is for sure: the road to that point will be long, contentious, and above all, exciting. I can't wait to see what happens next.

Comments closed
    • utmode
    • 2 years ago

    Raja does not care for AMD. His vision is to be the CEO of Radeon.

    • DeadOfKnight
    • 2 years ago

    Intel: We’ll put you in charge of designing a Skull Canyon successor utilizing Radeon Graphics.
    Raja: I must go. My people need me.

    • Mat3
    • 2 years ago

    So Kyle at HardOCP was right all along.

      • Convert
      • 2 years ago

      People keep saying this but the only thing I could find that Kyle said was that he 100% didn’t know this was going to happen.

      Source?

        • cynan
        • 2 years ago

        See this forum post of Kyle's from last Feb: https://hardforum.com/threads/from-ati-to-amd-back-to-ati-a-journey-in-futility-h.1900681/page-72#post-1042797289

        The specific road map (will it be a Kaby Lake chip and not something more recent by the time the hybrid GPU/CPU launches?) and perhaps target market (low to midrange?) and any details pertaining to the specifics of the deal might be up in the air, but it was more or less on the money.

          • Convert
          • 2 years ago

          So people saying Kyle was on the money are accidentally commenting on this article instead of this one: https://techreport.com/news/32792/intel-brings-a-core-cpu-and-radeon-gpu-together-on-one-package

          I was really confused, as Kyle himself said he had no idea that Raja was lined up to join Intel, other than his theory that Raja was trying to join Intel through an acquisition of the RTG group as a whole. However, I know a few people speculated that an Intel/AMD collaboration was going on, so he was definitely right about that.

    • WaltC
    • 2 years ago

    If you think the timing of this with the announcement of the discrete mobile GPUs that Intel designed with AMD and is now buying from AMD is coincidence–I think not…;) It most certainly wasn’t something that took either company by surprise, in any event–this was planned much in advance by both companies. Recall that both companies tried to keep a lid on the fact that Intel was turning to AMD to design and build a discrete mobile GPU for them, and that AMD was actually going to sell same to Intel. Intel *cannot build* discrete mobile GPUs utilizing HBM/2. Can’t imagine why this would shock anyone–if they could build it they wouldn’t have to buy it from AMD, would they?

    Why would they wish to keep it a secret? Well, just look at all of the truly stupid commentary about this–"Raja got fired," etc. Neither company wanted to affect stock prices with a premature announcement. None of this was announced until both AMD and Intel were ready to announce it. Raja could not have made this move without AMD's permission, I guarantee it. Maybe in five years he could have, when everything he learned from his work with AMD was obsolete (such are work-related NDAs)…;) He has AMD's blessing here 100%.

    My personal opinion is that Raja is just about tapped out–at his limit. He’s looking for simpler pastures in which to graze. But we shall see–this particular Intel chip is of a very rudimentary design–except for the HBM/2 interface. Be interesting to see what’s ahead…!

      • nanoflower
      • 2 years ago

      If he was looking for simpler pastures, then he wouldn't have taken on so many different areas in his new job. Having to bring all of these different groups together and get them working under a combined vision is not going to be 'simple,' let alone the effort to build a GPU that's good enough to replace the AMD component in this new Kaby Lake product.

    • AMDisDEC
    • 2 years ago

    Raja’s next stop; Google.

      • NTMBK
      • 2 years ago

      AMDisDEC’s next stop; the bushes outside Lisa Su’s house.

        • AMDisDEC
        • 2 years ago

        LOL, Impotent Euronut.

    • psuedonymous
    • 2 years ago

    It'll be interesting to see what route Intel goes:

    – A Totally-Not-GCN-Derived GPU, picking Raja’s brains on the implementation and relying on cross-licensing to avoid lawsuits.
    – Resurrecting Larrabee (or rather, a graphics-dedicated Xeon Phi derivative) now that close-to-the-metal GPU programming is the in vogue thing to do: http://tomforsyth1000.github.io/blog.wiki.html#%5B%5BWhy%2520didn%2527t%2520Larrabee%2520fail%253F%5D%5D
    – Scaling up Gen. It managed to survive being pimped out into Iris Pro, but it'd need to scale much larger to be a viable discrete card.
    – A totally fresh ground-up design.

    And if they do go the new-arch route (or neo-Larrabee), whether they go the throughput-optimised route that is the current best practice for graphics and GPGPU today, or aim for latency-optimisation (or rather, just-in-time job completion) that would be preferable for future VR (and in a decade or so AR) workloads, and for real-time inference at the edge (rather than NN training).

      • tipoo
      • 2 years ago

      A fresh ground-up GPU would be the most interesting; an optimized Gen is probably the most likely. Gen, IIRC, had a front end that was adequate for the GPU as it was, but was limited enough that low-level APIs weren't much help to it, since it was already saturated. Addressing that would be one thing; then scale up the execution units, bandwidth, and wattage, and it would already be an interesting modern GPU.

      • AnotherReader
      • 2 years ago

      I would love to see what a scaled up Iris Pro could do. A discrete GPU with 512 Gen9 EUs and 128 ROPs should be pretty good.

    • dodozoid
    • 2 years ago

    I bet that right now, in some shady alley, a smallish figure with her face hidden under a dark shroud orders an assassination and hands over a suitcase filled with unmarked GPUs, with a promise of a few Threadrippers after the deed is done.

    • USAFTW
    • 2 years ago

    IS THIS REAL LIFE?!?!

      • chuckula
      • 2 years ago

      IS THIS JUST FANTASY?

        • Anovoca
        • 2 years ago

        CAUGHT IN A LAAAAANDSLIIIIIIDE

          • chuckula
          • 2 years ago

          NO ESCAPE FROM REALITY.

            • Lord.Blue
            • 2 years ago

            OPEN YOUR EYES, LOOK UP TO THE SKIES AND SEEEEEEEEE

            • NTMBK
            • 2 years ago

            I’M JUST A POOOOOR BOYYYYY, I NEED NO SYMPATHY

            • derFunkenstein
            • 2 years ago

            BUT IT’S EASY COME, EASY GO

            • ludi
            • 2 years ago

            LITTLE HIGH, LITTLE LOW

            • morphine
            • 2 years ago

            ANY WAY THE WIND BLOWS, DOESN’T REALLY MAAAATEEER TO MEEEEEEEEE…

            TOOOO MEEEEEE….

            • Jeff Kampman
            • 2 years ago

            combo breaker

            • AnotherReader
            • 2 years ago

            MAMA, JUST KILLED A MAN

            • chuckula
            • 2 years ago

            You mean: KAMPMAN, JUST KILLED THE COMBO

            • AnotherReader
            • 2 years ago

            That he did, oh PunKing. Only you can save it now.

            • Redocbew
            • 2 years ago

            BUT HE DIDN’T SHOOT NO DEPUTY

    • Anovoca
    • 2 years ago

    So instead of just team Red and team Green in the discrete market we might now have Team Blue?

    …Teams Red, Green and Blue
    ……Teams R,G,B
    ………..RGB
    …………….

    OH GOD NO!!!!!!!!!!!!!!

      • jihadjoe
      • 2 years ago

      So much for discreet GPUs. All these young cards these days and their flashy, blinky clothes…

      • ronch
      • 2 years ago

      Yeah. RGB.. that’s been noted before already.

      • derFunkenstein
      • 2 years ago

      Say it three times and Michael Keaton will appear.

      • spiketheaardvark
      • 2 years ago

      I would feel a little sad if the Intel + AMD chip wasn’t code named “Magenta”

        • derFunkenstein
        • 2 years ago

        Purple Rain

          • Mr Bill
          • 2 years ago

          And then out comes the Octarian Rimbow.

    • mtruchado
    • 2 years ago

    Intel's leadership in integrated graphics processors??? Intel IGPs are crap, with screen flickering, bugs here and there when using multi-screen configurations, not to mention their ridiculous performance. It seems like every company that makes something claims leadership in that something, regardless of reality.

    • fyo
    • 2 years ago

    To all those sceptical of Intel being able to produce discrete graphics cards… The #1 barrier to entry is drivers and it isn't even close. In some ways, making a fully-featured and stable DirectX driver has become simpler over the years, since Microsoft defines pretty much everything very tightly. And everything pretty much does work on Intel graphics today, which was far from the case last time they tried their hand at discrete graphics.

    In other words, there's little reason to believe that Intel wouldn't be able to ship a fully featured (and functioning) discrete graphics card, but to be competitive, constant driver optimization is critical. This requires a large driver team and good contacts with the large game developers. This has been a weak point for AMD over the years and Intel is even further behind. Assembling such a driver team isn't trivial, but Intel certainly has the money to do so.

    • ermo
    • 2 years ago

    Wasn’t it Raja who was responsible for recruiting our man ScottW?

      • chuckula
      • 2 years ago

      You are correct.

        • Mr Bill
        • 2 years ago

        Confirmed! ScottW is going to Intel!

          • CScottG
          • 2 years ago

          https://www.youtube.com/watch?v=6MHct4wtalg

            • derFunkenstein
            • 2 years ago

            In light of that day’s revelations, Scott did have a big announcement on Twitter:

            https://twitter.com/scottwasson/status/928458261744975872

    • NTMBK
    • 2 years ago

    Good on him! Here’s hoping he kicks Intel’s driver team into shape like he did at AMD. More competition is good for everyone!

    • RtFusion
    • 2 years ago

    EDIT: Sorry, it's from Fortune, not Forbes, and I forgot the link

    Interesting quote from an AMD spokesperson in a Fortune article covering this:

    http://fortune.com/2017/11/08/intel-gpu-raja-koduri-amd/

    AMD sounded like it might sue. "We have a very strong graphics team and will continue our focus on building great products," a spokesman said. "We also have industry-leading graphics IP and, if necessary, will vigorously defend it."

    Lawsuits incoming soon? The past 3 days in tech are the craziest I have seen in a while. Probably not since AMD sued Intel a few years back for making backroom deals with OEMs to keep AMD products out of their product lines for years with shady payments.

      • cynan
      • 2 years ago

      Nah. They were just attempting to spin it to make it sound like AMD has its own GPU vision going forward and isn't jumping recklessly into Intel's pocket out of short-term desperation. Who knows what the truth is…

    • brucethemoose
    • 2 years ago

    Isn't this the guy who recently got a lot of internet hate for AMD's recent "meh" releases?

    • Unknown-Error
    • 2 years ago

    Let's see what Mr. Raja Koduri (or is he a Dr.?) can do with a much, much larger R&D budget! Looks like Intel really has its crosshairs on Nvidia. I read at the AT forums that making their own dGPU will allow Intel to fill some of its under-used fabs? Does that make sense?

    • Kougar
    • 2 years ago

    And now Raja Koduri will have all the resources and fancy toys at his fingertips he could want. It will be interesting to see what he cooks up with them.

      • Beahmont
      • 2 years ago

      And that is the big question. Mr. Koduri now has the keys to the kingdom with one of the largest R&D budgets on Earth at his disposal as well as metric tons of IP. So what does he do with all of it? Does he actually achieve glory and fame? Does he get scatter brained in a target rich environment? Or does he get tunnel vision and fail to properly capitalize on all of the resources at his disposal?

      Find out next year on:

      As the Tech Report Turns!

      • ptsant
      • 2 years ago

      He won’t have the mountain of patents that AMD/nVidia have.

      I am aware that Intel has LICENSED AMD GPU patents, but this will of course be limited in duration and cannot be a very stable basis for long-term development.

      I have no idea how this is going to function, but I expect AMD to protect its IP and obtain something meaningful in exchange (% of sales?).

      • tipoo
      • 2 years ago

      Yeah… Nvidia, as far as I know, was outspending AMD's GPU R&D by more than triple. With Intel, hopefully that is not the case and they at least match Nvidia's R&D on this effort. What he can do with that should be interesting…

    • DoomGuy64
    • 2 years ago

    Bottom line: Intel is teaming up with AMD to take down Nvidia’s monopoly of compute and graphics. It seems just about everybody has had enough of dealing with Nvidia’s business practices, aside from cash flush toxic fanboys. Good. The PC market can now start equalizing out with better performance at lower cost. Nvidia won’t be able to get away with locking consumers into $800 TN panels for much longer, as well as all their other price gouging schemes.

      • f0d
      • 2 years ago

      you really have a deep down hatred for nvidia dont you?

        • DoomGuy64
        • 2 years ago

        No, just their business practices and price gouging, and so does everyone else: Microsoft, Sony, Apple, and now Intel. I have no respect for a business model that sells mid-range hardware as high end and locks adaptive sync into $800 TN panels. I have nothing against Nvidia's graphics hardware, but everything against their walled garden and price gouging. Only die-hard, cash-flush, stark-raving-mad Nvidia fanboys can tolerate Nvidia's price gouging, while the rest of the PC industry is moving away because they've had enough.

        Major OEMs don't want to integrate Nvidia into their mid-range products simply because Nvidia overcharges way too much and tries to lock in the market. There are effectively no good mid-range products at reasonable prices, say sub-$1000 for a laptop, and Nvidia is directly responsible for that. OEMs can't build mid-range prebuilts under $1000 because of the Nvidia tax, and so they're moving away, just like the consoles and Apple. Intel is getting in on this too because they're trying to cater to the market and break Nvidia's monopoly, including in compute.

        Here's some commentary on why Intel and AMD are likely teaming up to bypass Nvidia: https://youtu.be/mNY5e5CFlbc?t=610

        It's just math. You can't build affordable midrange systems with Nvidia because of the pricing. This isn't going to eliminate your fanboy deluxe 9000 Titan SLI gaming rigs that cost $5000+, just allow OEMs to finally sell decent products at reasonable prices. Also, once this happens, Nvidia will be forced to stop price gouging. Everybody wins, aside from Nvidia's massive profit margins.

          • NTMBK
          • 2 years ago

          It’s a basic fact of economics that an object is worth precisely as much as you are able to charge for it. If NVidia is able to command higher prices for their products- for whatever reason, be it power efficiency, driver reputation, features like G-Sync, game bundles, brand identity, whatever- then that is how much their product is worth.

            • DoomGuy64
            • 2 years ago

            Wrong. TN panels are not worth $800. The reason they are $800 is because Nvidia uses G-Sync to lock out competition as a MONOPOLY, and we've got these toxic PCMR elitist fanboys who, because they have the money, don't think this is detrimental to the rest of the community. BULL. Both the prices and the apologists are toxic and are ruining PC gaming.

            The only way your argument has merit, is if Nvidia was actually competing fairly, and they’re NOT.

            • NTMBK
            • 2 years ago

            If they can sell TN G-Sync monitors at $800, then they are worth $800. There’s no subjective definition of value.

            • ludi
            • 2 years ago

            And if the price is high enough to attract competitors into the market, the price will fall. Which is basically what OP was indicating.

            • DoomGuy64
            • 2 years ago

            NO, IT'S CALLED A MONOPOLY, AND PRICE FIXING. TN G-Sync panels cost $800 because there is no alternative, and you are locked into those prices. IF, and ONLY IF, you had the OPTION of buying a cheaper panel that wasn't locked in, THEN could you say those $800 TN panels are worth $800.

            Reality is, Nvidia has a MONOPOLY on the market, and the only people buying these products are toxic fanboys who have BURNABLE MONEY.

            ——————————
            CAR ANALOGY: Nvidia = GM/Corvette. They have so much money that they lobby the government to regulate out of existence all other cars on the road, and deliberately stop selling normal Chevys. The only car manufacturer left is Toyota, who can't compete with the Corvette and therefore only sells Priuses.

            Corvette owners LOVE their Corvettes. They constantly berate and harass Prius owners. They run Priuses off the road, play games where they box them in in traffic, burn out their tires at every stoplight, and generally are an extreme nuisance to the general public. Meanwhile, they don't see how they're being toxic, and tell Prius owners that this is fair game and to buy a Corvette if they don't like it. Meanwhile, ALL the remnants of the decimated car industry are sick of GM and decide to pull together to sell better Toyotas that aren't Priuses. Corvette owners cry foul. Basically what's happened with Nvidia.

            • chuckula
            • 2 years ago

            What monopoly are we talking about again?

            Because Nvidia is nowhere near a monopoly in practically anything, and whining about G-Sync being Nvidia’s solution for adaptive sync doesn’t make it a “monopoly”.

            • DoomGuy64
            • 2 years ago

            G-Sync is a walled garden. How you justify walled gardens as not being a monopolistic business practice is beyond me, because you can't slap a FreeSync monitor onto an Nvidia card and get working adaptive sync.

            • chuckula
            • 2 years ago

            Oh so you just changed your tune from “monopoly” to “walled garden”.

            Not the same thing by a long shot.

            As for "walled gardens," I was under the impression that DX12 and Vulkan were supposed to be exactly those things, since weren't they specifically designed to use features that only exist on AMD products (even to this day)? You know, the miracle of async that Nvidia could literally never duplicate?

            I guess “walled gardens” are only bad when they benefit the company you hate.

            Then again, you aren’t actually an AMD fanboy, you’re an Nvidia hateboy. There’s a difference.

            • DoomGuy64
            • 2 years ago

            DX12/Vulkan, LOL. The only reason Nvidia doesn't run as well is because they designed Maxwell for DX11 and slapped some basic updates onto Pascal. Pascal DOES "support" DX12/Vulkan, but since it isn't designed from the ground up to do so, it kludges the features into working. That's not a walled garden, because the APIs are open for everyone to implement. Nvidia can implement the features, and they did so with Pascal. Not an argument, unless you consider fallacies to be counterpoints. Protip: they're not, but you apparently think fallacies are legitimate counterpoints, because that's what you use 99.9% of the time to make a case.

            • travbrad
            • 2 years ago

            I don’t think monopoly means what you think it does… I generally don’t like walled gardens either but it takes a lot more than that to be a monopoly. By that logic most auto makers are monopolies too since you can’t just mix and match random parts.

            • chuckula
            • 2 years ago

            Obvious troll is obvious but I’ll bite.

            Here are all of Newegg's G-Sync monitors: https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100160979%20600559797&IsNodeId=1&bop=And&Order=PRICED&PageSize=36

            Are some of them overpriced? Sure. However, are they overpriced because of G-Sync or because of OMG curved monitor with weird resolution? I'd guess more of the latter than the former. I'm not seeing any plain TN-panel $800 monitors there.

    • madmanmarz
    • 2 years ago

    I never liked him anyway

      • mcarson09
      • 2 years ago

      I don't care for Su either. I don't like the way AMD handles their CPUs:
      http://www.agner.org/optimize/blog/read.php?i=838

      And I don't like how their hardware on the GPU side has been behind in performance. I loved my 7970, but the 290X was just a dated refresh. Fury X was not up to par. Vega just can't compete. I really hated having to wait a year for the drivers to take full advantage of their cards. I owned other ATI/AMD cards in the past, but right now Nvidia is just doing a better overall job.

        • ludi
        • 2 years ago

        Say what you want about Su — I certainly have, particularly when she was spitting out hot marketing blather at a rate exceeding Professor Farnsworth — but she seems to be herding the cats in the right direction, in a way that Rory Read apparently could not.

        • AnotherReader
        • 2 years ago

        How is the 290X dated? It is faring much better than the 780 Ti.

      • ronch
      • 2 years ago

      We don’t have to like the guys who work for big companies. There are all sorts of folks who work for them. And as long as they put out good products and they don’t screw their customers, things should be ok; the big market they serve will decide their fate. If it’s a small business though, it could be a different story, like if it’s a mom&pop store or shop and they’re a grumpy bunch that screws their clients… well..

    • chuckula
    • 2 years ago

    So 3 of the top 4 commented stories are directly or indirectly related to Raj, starting with Radeon GPUs in Intel chips at #1.

    Congratulations Raj, you are officially this week’s winner of the Internet.

      • Ninjitsu
      • 2 years ago

      Since I have nothing better to add: Raj is usually a short form of "Rajesh" and not "Raja", and is sometimes used by itself too (it's also a noun, see British Raj).

      Of course, it really doesn't matter here, but you might potentially confuse someone 😛

        • simbant
        • 2 years ago

        Not true. In Tamil Nadu, Raj itself is a name.

    • derFunkenstein
    • 2 years ago

    Lots of “smiling Raja” pictures on TR this week. It’s kind of unsettling to go to the site and have smiling Raja waiting for me. It’s like he’s waiting at my front door. Hi Raja. Care for some coffee?

      • Meadows
      • 2 years ago

      At least he has a beard on this one, some other sites kept using his “Indian face.”

    • DragonDaddyBear
    • 2 years ago

    How did AMD manage to not put a one-year non-compete clause in such a significant person's employment contract? I thought that was standard stuff for high-level people.

    Also, that was fast.

      • chuckula
      • 2 years ago

      Go ahead and slap in a non-compete.

      Then try enforcing it in California.

      That’s why.

        • DragonDaddyBear
        • 2 years ago

          I strongly feel they are very unethical and possibly illegal. However, this guy has trade secrets that could tank the company he once worked for. This, I think, is the one time it should be used.

          • jihadjoe
          • 2 years ago

          I don’t think it really matters much in this case, as Intel is essentially starting from scratch.

          Let’s say Raj gives them the secret sauce for Vega, or even Navi. If Intel starts work tomorrow, they’ll still be 3-5 years away from a working product, by which time AMD|RTG should be on their ‘Next Gen’ chip according to their roadmap.

          AMD probably knows this as well, which is why they let him go so easily.

      • Liron
      • 2 years ago

      He was on sabbatical for several months. That may have been the extent of the clause.

      I thought he said that he quit to focus on his family. I guess Intel must have much better working hours than AMD, then…

      • tipoo
      • 2 years ago

      He was on leave for months, I think all the timing isn’t coincidental and the leave was just enough for his non compete clause.

    • Tristan
    • 2 years ago

    No matter what GPU Raja designs, it will be watercooled

    • DeadOfKnight
    • 2 years ago

    Playstation 5: Powered by Intel

      • tipoo
      • 2 years ago

      The Rajachip with Intel cores and a Radeon would be super interesting for a console, since it has shared memory.

        • derFunkenstein
        • 2 years ago

        As long as it’s Intel Cores and not Atom cores.

          • tipoo
          • 2 years ago

          That's my big hope for the 9th gen. The .5 upgrades already spent a die shrink on more GPU, and one of them on more RAM, but where the 9th gen could really fly away is CPU performance. A modest dual-core Baby Cake i3 can do at 60fps a lot of what the consoles are locked to 30fps for, with six and a half available Jaguar cores.

            • derFunkenstein
            • 2 years ago

            Related: GOB Bluth designed the semi-custom chips used in the 8th gen consoles. When reached for comment, he gave this statement.

            https://78.media.tumblr.com/9fe02a1b30bd131cde2cbab52cec077d/tumblr_nahh7dl6mX1twe13wo1_500.gif

    • AMDisDEC
    • 2 years ago

    Every month there is more and more great news for AMD.
    The deal with China, then Elon Musk, and now Intel, with Dr. Su taking direct control of the graphics division, will ensure not only a nice Christmas but a great New Year too!
    Once the APU is ramped up and shipping, 2018 is going to be a fantastic year for AMD and AMD shareholders.

      • chuckula
      • 2 years ago

      Why did you need to edit that comment twice?

        • Redocbew
        • 2 years ago

        No amount of editing is going to help here, but that doesn't preclude a person from trying.

        • derFunkenstein
        • 2 years ago

        It was too coherent.

        • AMDisDEC
        • 2 years ago

        I can’t resist edit buttons.
        They’re irresistible.
        I like them, a lot!

      • derFunkenstein
      • 2 years ago

      No, No, No, No, No. Ted, you cannot do this to me. No. No, No, No.

      • AMDisDEC
      • 2 years ago

      Goosestep TR klan mentality proves Sean Parker’s evaluation conclusively.

        • AnotherReader
        • 2 years ago

        Goosestep and Klan: those are some pretty ballsy allegations. Care to provide any evidence?

          • AMDisDEC
          • 2 years ago

          I can’t help but recall the Dark Matter episode of the social media upvote fanatic female except here they are supposed to be males. Eunuchs?

            • AnotherReader
            • 2 years ago

            I loved that episode of Black Mirror, but calling the male readers of TechReport Eunuchs and implying that all the readers are male and that females are the only ones concerned with social media is a stellar sample of why you are so loved here.

            • AMDisDEC
            • 2 years ago

            Statistics!
            Call me Zatoichi, mercilessly slicing up the albino eunuchs and exposing their evil ways. Of course they don't love it.

            • alloyD
            • 2 years ago

            I think we’re being trolled by a chatbot.

            • derFunkenstein
            • 2 years ago

            How is it still allowed to post here? Normally I don’t like to backseat mod, but this thing is calling all of us Nazis and KKK members and it’s apparently not a joke.

            • AMDisDEC
            • 2 years ago

            LOL, I’m not making this stuff up.

            From J. Edgar Hoover to Reagan to Bush (x2) and now Trump, America's majority has always felt safe being ruled and represented by these super-competent Nazi eunuchs. You can censor the messenger for fear of hearing the truth, but the truth still remains constant.

    • mcarson09
    • 2 years ago

    This is a textbook example of failing upwards…

    Vega uses way too much power. This guy was a letdown at AMD and I don't see why Intel would want him. Well, if Intel is going to be a REAL GPU player, they had better blow the 1080 Ti out of the water by 2x with their first GPU. What's next? Is Nvidia going to sue Intel to be able to make x86 CPUs?

    This is Intel right now

    https://www.youtube.com/watch?v=MPoit_51ac4

      • ronch
      • 2 years ago

      Intel can still pick his brains for anything that might help Intel’s graphics division.

        • nanoflower
        • 2 years ago

        I think it’s too early to say he was a letdown. AMD has been struggling for a long time and what he did with Polaris was a good step in the right direction at a time when AMD was clearly putting most of their efforts into the CPU development (coming up with Zen). They’ve also done a lot of work on their drivers. If RTG had the sort of money Nvidia has and they still came out with a product like Vega I would consider it a failure, but with their limited funds it has to be considered a good effort. It will be interesting to see what resources Intel is going to put behind this new venture and what Raja and crew can accomplish over the next few years.

          • mcarson09
          • 2 years ago

          Four years is not too early. He's been there since late 2013, and he was also there before he left to go to Apple in 2009.

          https://www.anandtech.com/show/6907/the-king-is-back-raja-koduri-leaves-apple-returns-to-amd

            • mudcore
            • 2 years ago

            You haven't made a convincing argument against Koduri. He led AMD's graphics divisions through both strong and weak products, which, given the length of his stays, would make sense. Likewise, we can all acknowledge that his teams have dealt with significantly smaller budgets than their competition. And his time at Apple certainly didn't show any shortcomings. And now he's been quickly brought into Intel.

            I think this has very little to do with performance in the traditional sense. I’m not sure someone could have done better than Koduri given the circumstances. Instead I suspect his vision for the RTG was simply at odds with AMD leadership and it was better for all parties for him to move on.

            • mcarson09
            • 2 years ago

            He didn’t last at Apple. Four years just like his last stint at AMD. I do agree AMD has been giving R&D the shaft in the budget, but I’ve been saying that for years. Koduri’s history has given me nothing to be impressed by. He’s just a flash in the pan and I don’t expect him to last longer than four years at Intel. You haven’t convinced me otherwise yourself.

          • ronch
          • 2 years ago

          I don't think he's a letdown, although he maybe could've done a bit more. However, given how AMD cut their R&D budget from 2012 onwards, Koduri's team probably didn't really have much ammo to pursue what they really wanted. Maybe his team was severely downsized while more juice was pumped into the CPU division.

        • mcarson09
        • 2 years ago

        He’s a flash in the pan at best.

      • Hsldn
      • 2 years ago

      Maybe they know something?

      Vega is terrible at game benchmarks, but open AI and GPU-compute applications are totally different; there, Vega triumphs over any Nvidia GPU. Maybe that's what Intel needs?

        • JustAnEngineer
        • 2 years ago

        Now that Radeon RX Vega 56 and 64 prices have dropped back to MSRP, AMD and NVidia are neck-and-neck in most gaming benchmarks at a given price point. Radeon RX Vega 56 looks pretty good as a $390 gaming graphics card:
        https://techreport.com/review/32766/nvidia-geforce-gtx-1070-ti-graphics-card-reviewed

        If you valued the performance of a 2017 game using the id Tech 6 game engine more than that of a 2015 game using the Rockstar Advanced Game Engine, you would have an even more favorable view of AMD's game benchmark performance.

          • bhtooefr
          • 2 years ago

          Worth noting that MSRP isn’t a good measure of cost, though – die area would be a better measure.

          And, Vega 10's 484 mm² die area is comparable to GP102's 471 mm², and is much larger than the GP104-based designs that it's competing against. (However, if it's better at compute…)

            • NTMBK
            • 2 years ago

            Remember that Vega and GP102 are manufactured at different fabs- we don’t know how wafer costs compare between GloFo and TSMC.
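
            (NTMBK's caveat is the crux: die area only becomes a cost comparison once you pin down wafer price and yield, neither of which is public. Below is a minimal back-of-the-envelope sketch in Python using the standard dies-per-wafer approximation; the wafer costs and yield figure in it are invented placeholders, and only the two die areas come from the comments above.)

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard approximation for how many whole dies fit on a round wafer."""
    radius = wafer_diameter_mm / 2.0
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

def cost_per_good_die(die_area_mm2: float, wafer_cost_usd: float,
                      yield_fraction: float) -> float:
    """Spread the wafer's cost over the dies that actually work."""
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_fraction)

# Hypothetical wafer cost and yield, purely for illustration; only the die
# areas (Vega 10: 484 mm^2, GP102: 471 mm^2) come from the thread above.
print(f"Vega 10: ${cost_per_good_die(484, 6000, 0.6):.0f} per good die")
print(f"GP102:   ${cost_per_good_die(471, 6000, 0.6):.0f} per good die")
```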

      • Rza79
      • 2 years ago

      From [H] when they clocked Vega at the same clock as the Fury X:
      "When the AMD Radeon RX Vega 64 is clocked normally it can consume up to 476w of system power in our testing. When we downclocked Vega 64 to 1050MHz the power draw dropped all the way down to a peak of 308w maximum. In fact, at-wall power was mostly between 260-290w while gaming. That's a huge power saver right there, just downclocking the video card. The AMD Radeon R9 Fury X, for comparison, was around 350w-400w while gaming, and peaked at 406w. This means that since performance was the same on both video cards AMD Radeon RX Vega 64 really is much more power efficient than AMD Radeon R9 Fury X."

      Vega was pushed too hard by AMD. It's not that the chip can't be power efficient. It represents a 100W drop compared to Fury X at the same performance. Nothing to scoff at.

      RX Vega 64 is also more often than not faster than the GTX 1080 in newly released games:
      https://goo.gl/5dPVxk
      https://goo.gl/LF98JY

        • bhtooefr
        • 2 years ago

        That kind of thing is why I see AMD’s mobile parts as being the really interesting ones – the desktop parts push deep into the upper right corner of the frequency/voltage curve, but results like that are actually pretty promising for downclocked mobile parts.

        • Rza79
        • 2 years ago

        One more new game to add:
        https://goo.gl/6eDrz7

          • AnotherReader
          • 2 years ago

          Interestingly, DirectX 12 is slower for all cards until 4K.

        • Rza79
        • 2 years ago

        Wolfenstein 2 is now up to 22% faster with patch #2 on Vega:
        https://goo.gl/nPrptd

      • chuckula
      • 2 years ago

      Once upon a time AMD poached this guy named Jim Keller, who clearly was a failure since his entire company — DEC — plain went out of business.

      That clearly didn’t work out either.

    • Pancake
    • 2 years ago

    They're not hiring him just for his own skills but for the team he can build around him.

    Expect significant resignations from AMD in 3… 2… 1…

      • tipoo
      • 2 years ago

      [wrong reply]

    • Mr Bill
    • 2 years ago

    I'm not sure at the moment who bought or who sold a pig in a poke: https://en.wikipedia.org/wiki/Pig_in_a_poke

    • ronch
    • 2 years ago

    AMD to Intel : “We’ll let you use our graphics technology AND give you our main graphics guy to make sure everything goes well for you. All for $0.10 a chip. Deal?”

      • Pancake
      • 2 years ago

      Intel to AMD: You forgot to say please!

        • ronch
        • 2 years ago

        AMD : “Puh-leeeezzzeee!!!”

        Intel : “I think we’ll go with Nvidia.”

        AMD : “Ok, we’ll PAY you $20 for EVERY chip you sell with our graphics in it!”

        Intel : “Um, ok, but you’ll have to pay us $30 from here on for every chip you sell with MMX, ok?”

        AMD : “DEAL!!”

          • Beahmont
          • 2 years ago

          Well they have got to do something after they went long on a NAPIM (National Association of Printing Ink Manufacturers) index buy to cover all those years of Red Ink!

    • xpentor
    • 2 years ago

    Is it only me who has the feeling that Raja Koduri did not meet his KPIs as the leader of RTG?
    Compared to Ryzen:
    1. He missed high-end cards back at the RX 480 launch
    2. He introduced the RX 580 to feed Vega delays
    3. Vega's launch performance was not as hyped, not even close to the 1080 Ti
    4. Vega is a power hog, always out of stock, with price jacks everywhere

    Bottom line:
    AMD needs a better leader at RTG.

      • davidbowser
      • 2 years ago

      So by your reckoning, is Intel making a mistake hiring him? Or does Intel have deep enough pockets to make his bold plans a reality?

        • xpentor
        • 2 years ago

        This is simply a case of a better job offer and of knowledge needed at Intel. He gets to continue with his Vega toy for high-end Intel SoCs, and at the same time re-endure the pain of improving/revamping Intel HD graphics architecture for next-gen Intel CPUs.

      • Beahmont
      • 2 years ago

      Yeah, but the smart money says that all those things you’re dinging Raja for were caused by AMD making choices that left them in debt to Red Ink producers as well as everyone else.

      And while Intel has many problems right now, lack of money to bulldoze projects isn’t one of them.

      • Mr Bill
      • 2 years ago

      It's a pig-in-a-poke question. We'll just have to see what Intel lets out of the bag.

      • AMDisDEC
      • 2 years ago

      I agree, he proved himself expendable. He won’t like it at Intel.

    • NoOne ButMe
    • 2 years ago

    Past few days go from:
    “bad for Nvidia, okay for AMD, good for Intel”
    to:
    “Terrible for Nvidia, bad for AMD, really good for Intel”

    • Krogoth
    • 2 years ago

    This probably marks the beginning of the end for low-end and mid-range discrete GPU solutions.

      • derFunkenstein
      • 2 years ago

      Too late for that prediction. ludi already said PC gaming was dying in the forums.

        • DeadOfKnight
        • 2 years ago

        What a curious location for it to meet its demise.

          • derFunkenstein
          • 2 years ago

          At TR of all places, a site dedicated to high-performance PC hardware. It was ludi, in the forums, with the candlestick.

      • tipoo
      • 2 years ago

      I don’t see why. This is to make a dedicated card, and their current IGPs were already killing low end GPUs.

        • Redocbew
        • 2 years ago

        Krogoth is just being Krogoth. I’m not sure he’s ever been impressed by low to mid-range GPUs.

          • derFunkenstein
          • 2 years ago

          I dunno about him but the GTX 1050 Ti really just impresses the heck out of me. Plenty of 1080p grunt for most games in a power budget so small you can skip the PCIe 6-pin connectors.

            • Demetri
            • 2 years ago

            1050Ti looks pretty good now, thanks to the mining boom. Back when you could get an RX 470 for $130 AR… not so much.

            • derFunkenstein
            • 2 years ago

            Well, yeah, but AMD was taking a bath on those.

        • Krogoth
        • 2 years ago

        Nope, Intel doesn't care about the discrete GPU market. It served its purpose back with the i740 in driving adoption of the AGP bus. Intel is all about making iGPU and semi-integrated GPU solutions.

        This move is all about sizing up against AMD RTG's upcoming APU solutions. They know that the current iGPU on their CPUs doesn't cut it anymore. That's the reason why they headhunted Raja, the brains behind AMD's current APU roadmap.

        AMD RTG started a fire under the 800-pound gorilla. It is going to be an arms race in the iGPU world that eventually creates units that yield "good enough" performance for the masses and casual gamers (FYI, they make up the bulk of discrete GPU revenue). They will no longer have a reason or need to get a discrete GPU.

        Nvidia saw the writing on this wall a decade ago and has been trying to move away from discrete GPUs as its bread and butter.

          • tsk
          • 2 years ago

          What are you talking about? Intel has already confirmed they will be making discrete GPUs.

            • chuckula
            • 2 years ago

            In Krogoth’s defense (and I rarely do this) Intel might be building “discrete” GPUs that are still not the traditional Radeon/GeForce add-in graphics card that you expect from a “discrete” product.

            They might be referring to steroided versions of the new Kaby Lake-G where there technically is a discrete piece of silicon that is the GPU and that even has its own high-speed RAM, but the big difference is that instead of being in an add-in card it’s now tightly integrated with the CPU.

            • derFunkenstein
            • 2 years ago

            And it's either really coincidental or maybe even genuinely ironic that people have been begging AMD to do just this ever since Zen was announced, and it'll be an ex-AMD employee who spearheads the effort for Intel.

            • chuckula
            • 2 years ago

            People love to rag on Intel for having fabs and spending lots of money on researching how to produce these complex chips (I don’t know why but they do).

            However, if you want to build some exotic GPU/CPU/HBM/etc. compound product it sure doesn’t hurt to have control of a sophisticated manufacturing infrastructure instead of having to go begging to multiple silicon vendors and hoping that all of them pull through to give you a working product.

            Using AMD for the GPU in Kaby Lake-G is clearly a transitional idea (although I expect the “transition” to last a while, there will likely be successors that use future AMD GPUs) since Intel doesn’t have its own in-house GPU. Raj is there to fix that deficiency, although it’s not going to be an overnight fix.

            • derFunkenstein
            • 2 years ago

            Oh for sure. Intel is the only one that can produce the transistor density and low power leakage to maximize the performance of such a tightly-integrated part. Raja can poach whoever may still be loyal to him at AMD.

            • AnotherReader
            • 2 years ago

            Given its vast resources, Intel could have been serious about discrete GPUs at any time since 3Dfx revolutionized GPUs. I suspect that fears of anti-trust legislation kept them away from this market.

            • tsk
            • 2 years ago

            Possibly, but more likely we'll see a product similar to Nvidia's Tesla cards.

      • anubis44
      • 2 years ago

      As I predicted a couple of years ago, but almost everybody thought I was wrong.

    • 223 Fan
    • 2 years ago

    Intel lost out on the phone gravy train and is on the verge of missing out on the AI/neural-processing gravy train, which coincidentally requires making a GPU in order to ride it. Raja has a chance to make Intel a player in this space, something he never had at perennially plucky underdog AMD.

    • DrDominodog51
    • 2 years ago

    Intel will destroy Novideo with the mustache power!

      • chuckula
      • 2 years ago

      If Jen-Hsun grows his own porn stache then you know it’s war!

    • blastdoor
    • 2 years ago

    I'm confused – I thought the only way to enter the GPU market was to spend $5 billion to buy an existing player. No doubt that's what Imagination management imagined too.

      • derFunkenstein
      • 2 years ago

      Merhaps they lacked… Imagination

        • Mr Bill
        • 2 years ago

        I find their lack of imagination… disturbing.

      • nanoflower
      • 2 years ago

      Except Intel is already in the GPU marketplace. They are in practically every area except discrete GPUs for consumers (the gaming marketplace). It's a lot easier to build up an existing unit than to start from scratch.

        • Klimax
        • 2 years ago

        And they used to be there too… for a few years.

        • blastdoor
        • 2 years ago

        At some point, everybody was starting from scratch.

        Apple also managed to start from (more or less) scratch, on both CPU and GPU.

        I know this horse died long ago and that I'm just kicking bones around, but AMD's purchase of ATI was a monumental mistake that nearly destroyed the company. It's the kind of thing that should be told as a scary story over a campfire to every junior executive in every industry from now until the end of time.

          • nanoflower
          • 2 years ago

          Is it the purchase that was really the problem, or overpaying? As I recall, they paid a premium price for ATI at a time when the market was at a peak. The worst time to buy another company, unless you can get a great bargain.

          • chuckula
          • 2 years ago

          I equate AMD buying out ATi to an early 20th century car company that didn’t succeed because its owner was so pleased with himself for being a “visionary” about cars being the future that he didn’t bother to realize that simply knowing cars were the future wasn’t a guarantee that the company was actually going about making & selling the cars the right way.

            • MOSFET
            • 2 years ago

            So, Ford or Packard?

    • derFunkenstein
    • 2 years ago

    [quote<]"high-end discrete graphics solutions for a broad range of computing segments."[/quote<] This is what makes me check the date for April 1. It's gotta be more than just the recently-announced AMD thing, right?

      • MOSFET
      • 2 years ago

      Yes, it does. We just get to speculate, or wait for future announcements.

        • derFunkenstein
        • 2 years ago

        AMD spins RTG off and sells it to Intel confirmed!

          • nanoflower
          • 2 years ago

          Or Intel and AMD continue to work together on more projects. Certainly this seems likely over the short run (next couple of years). If AMD can’t stay profitable then I could see them spinning off RTG to someone like Intel. Otherwise it’s best for AMD to keep RTG around so they have the GPU tech to compete in the mobile marketplace.

            • derFunkenstein
            • 2 years ago

            That’s way too dull and realistic

          • mcarson09
          • 2 years ago

          For a loss no less…

            • derFunkenstein
            • 2 years ago

            Intel can’t over-pay for a division with delayed products, now can it?

    • tsk
    • 2 years ago

    Hot damn Raja, you sly old fox!

    • christos_thski
    • 2 years ago

    Could this “high-end discrete graphics solutions for a broad range of computing segments” be simply referring to the joint AMD-Intel project?

    Because, if not, it's GEEK PARTY TIME.

    How long has it been since we last had THREE high-end GPU manufacturers?

    Let’s party like it’s 2000!!!!!!!

      • tipoo
      • 2 years ago

      Discreet = probably not that. I mean, it's kind of a dedicated GPU since it has HBM2, but it's also kinda integrated-ey with the MCM. I think this means an actual card.

        • jihadjoe
        • 2 years ago

        "Discreet = probably not that."

        Bummer. I was really hoping for a GPU that silently does its job and doesn't call attention to itself. Most cards nowadays get so whiny and noisy when they start to feel they're working too hard.

    • chuckula
    • 2 years ago

    Here's the press release in all its PR glory: https://newsroom.intel.com/news-releases/raja-koduri-joins-intel/

    Note: Raj kept the beard for the photo.

    • Demetri
    • 2 years ago

    So the million dollar question is whether an Intel line of discrete gaming GPUs is part of that “broad range of computing segments” he’s going to be developing for.

      • tsk
      • 2 years ago

      I sure as hell hope so, that is gonna be a big win for consumers, and a big hurt for Nvidia.

        • Redocbew
        • 2 years ago

        Same here. Another discrete GPU maker who hopefully has drivers that don’t suck can only be a good thing.

    • tipoo
    • 2 years ago

    The current top comment is applicable here too.

    * checks headline *
    * checks date *
    * Not April 1 *

    * Checks headline again *

    What a silicon whirlwind the last few days.

      • chuckula
      • 2 years ago

      April 1 meets Groundhog Day

    • fyo
    • 2 years ago

    Gotta say I’m shocked he didn’t have a non-compete clause in his contract with AMD…

      • tipoo
      • 2 years ago

      Might have had to do with the timing of the leave of absence.

      • DrDominodog51
      • 2 years ago

      It’s a pain to enforce a non-compete clause in California even if there was one in his contract.

      • smilingcrow
      • 2 years ago

      He really needed a non-compute clause or even a sanity clause or failing that a Santa Clause.

        • Redocbew
        • 2 years ago

        I’m pretty sure this is Raja being his own Santa Clause.

      • CScottG
      • 2 years ago

      This ties in with the last Raja news.

      Non-compete clauses are void in California; you can't stop someone from working elsewhere. BUT, you can sue the sh!t out of them (and anyone else involved) if they are providing intellectual property from their prior employer to their new employer (and I can't see how that wouldn't be the case). And there was probably an employment agreement specifying what would constitute that.

    • JAKra85
    • 2 years ago

    “Some months ago, I got a phone call from Raja Koduri, who heads up the newly formed Radeon Technologies Group at AMD. Raja asked me if I’d be interested in coming to work at AMD…” – Scott Wasson

    I would really love to hear his opinion now, but he's probably not allowed to give it…

      • nanoflower
      • 2 years ago

      That happens. It's why it's a good idea to see not just whether you are a good fit with your boss, but also with the other people in charge at that company. Hopefully AMD is going to continue moving forward with the various graphics initiatives they have underway, like driver improvements and a focus on frame times. If not… there may be a position open at Intel in the next few months. 🙂

    • chuckula
    • 2 years ago

    “high-end discrete graphics solutions for a broad range of computing segments.”

    That could mean a lot of things but Nvidia and AMD can finally agree on something: they don’t like it.

      • tipoo
      • 2 years ago

      With everything else going crazy in the silicon world in the last few days: Next step, Geforce Radeon multichip module

    • tipoo
    • 2 years ago

    Intel dedicated GPUs huh. I’ve actually wanted to see this for years, the Gen graphics fellas aren’t bad. Scale up, optimize, add bandwidth and wattage, and they may have something.

      • fullbodydenim
      • 2 years ago

      Clearly you must not be old enough to remember the Intel Starfighter i740 8MB AGP video graphics card.

        • tipoo
        • 2 years ago

        Post-awfulness Intel dedicated GPU 😛

        • Klimax
        • 2 years ago

        😀

        Got two: Intel Xpress 3D and Biostar. IIRC, mostly driver problems and reliance on AGP texturing (the i740 was supposed to be an example for others) killed it.

          • AnotherReader
          • 2 years ago

          Also, the RIVA 128 was faster than it and the Voodoo 2 was much faster.

        • ColeLT1
        • 2 years ago

        Diamond Stealth G460 (i740) in the PII-400 I built back in 1998. I remember playing Interstate '76/'82 on it like a boss 😀

    • oldog
    • 2 years ago

    I guess this makes sense based on the recent AMD/Intel hookup. Or not.

      • Wirko
      • 2 years ago

      Hookup? What hookup? Ah, you mean the Rajachip?

        • HERETIC
        • 2 years ago

        That's got a nice ring to it; we could go with that.
        You heard it here first: "RAJACHIP"

    • kuraegomon
    • 2 years ago

    Kyle (HardOCP): “Muuuuaaahahahahahaha! … Victory lap.”

    You’ve got to admit, he called it like a G.

      • Convert
      • 2 years ago

      The comment I saw from Kyle said he 100% didn’t know this was going to happen.

      When did he say it would?

        • kuraegomon
        • 2 years ago

        See comment 29 above. And if you follow that link, prepare for a bunch of “You’ve _got_ to be kidding me” moments…

          • Convert
          • 2 years ago

          I read the article twice and see no mention of it.

          The closest thing is the comment that Raja threatened to move to Intel if he couldn’t head up RTG. But he did head up the RTG group.

          I must be reading the wrong article? Here is the one I thought you linked to: https://www.hardocp.com/article/2016/05/27/from_ati_to_amd_back_journey_in_futility

    • morphine
    • 2 years ago

    Is that a Radeon in your pocket or are you just happy to see us?

    (that's a joke, btw)

      • alexberti02
      • 2 years ago

      Do you know a better joke? (It's in Spanish, BTW)

      Se ha Raja-do (roughly, "he has chickened out")

        • morphine
        • 2 years ago

        lol that was good

      • ronch
      • 2 years ago

      I dunno, but it’s long, hard, hot, and it blows (air).

    • lilbuddhaman
    • 2 years ago

    First for "stolen tech" lawsuits in 6-18 months.

      • RtFusion
      • 2 years ago

      Wow, before news broke of him going to Intel, I was expecting Samsung Mobile to pick him up for their SoC development. Maybe to repeat the graphics-development execution from when Raja was at Apple a few years back.

      Larrabee (in some form) comes back from the dead, I guess…

        • nanoflower
        • 2 years ago

        Think about it. He's been working with Intel for months, at least, to bring that Intel/RTG (AMD) deal together. So that put him and Intel in a perfect position to see if they might be a good fit. Plus, he gets to direct the project that he helped to create, since Intel is specifying what they need in the way of custom GPUs.

          • MOSFET
          • 2 years ago

          I think nanoflower is right, except perhaps the timeline. My hunch is that Raja was eyeing Intel as either a future employer or a future buyer of RTG well before working with them on the recent deal.

            • kuraegomon
            • 2 years ago

            So, this link read like crazy scurrilous scuttlebutt at the time. Now it reads like prophecy:
            [url<]https://www.hardocp.com/article/2016/05/27/from_ati_to_amd_back_journey_in_futility[/url<] Seriously, that's from May [b<][i<]2016[/i<][/b<] Whiskey-Tango-Foxtrot.

            • derFunkenstein
            • 2 years ago

            Count me among those that thought Kyle was full of bitter herb sauce at the time

            • kuraegomon
            • 2 years ago

            I know, right? I thought he was channeling Charlie D! Wow, I’m going to re-read that whole thing. It’s going to either feel like taking the red pill, or going down the rabbit hole. Or both. But man, things must _really_ be messed up at RTG.

            • chuckula
            • 2 years ago

            If that’s true then there will be more employee defections following Raj. We just won’t hear about them.

            • kuraegomon
            • 2 years ago

            It seems impossible that there won’t be more defections now. And given the other (recent) history, I don’t feel right speculating any further along those lines here. This is all so bizarre.

      • TheRazorsEdge
      • 2 years ago

      Intel is not that stupid.

      They already have substantial cross-licensing agreements with AMD, and they are incorporating AMD GPUs into future CPU lines. They’re not about to piss into their beer.

      On the other end… AMD may be butthurt about losing an executive, but they can’t afford to burn bridges over it.

      NVIDIA is spanking them in the GP-compute and embedded learning markets, both of which are big money now and big growth in the near future.

      Our VDI admins just got some 3D hardware to support engineering apps that need it, such as SolidWorks. GRID, of course. So this makes three areas where NVIDIA is clearly ahead of the competition.

      Intel has an ecosystem and a huge bankroll, and AMD needs R&D money plus integration. While it is superficially unholy, this alliance does seem to be rather necessary.

      • chuckula
      • 2 years ago

      “Stolen tech” is overblown here. Raj didn’t walk out the door with the RTL files for Vega (or else he’d be in jail right now for industrial espionage).

      He’s got a lot of high-level knowledge but I frankly doubt he has personal knowledge that he could recite from memory about low-level inner workings of Vega beyond what Intel and Nvidia already know anyway. His job wasn’t to personally design these GPUs but to oversee the people who did the design/validation/production.
