AMD shakes up Radeon biz, gives Koduri full responsibility

Big changes are happening at AMD. For the first time since its acquisition of ATI nine years ago, the company has given responsibility for its graphics business to a single leader. The firm today announced that it has promoted Raja Koduri to senior vice president and chief architect of the Radeon Technologies Group.

The newly formed Radeon Technologies Group will be responsible for the graphics technology used in discrete GPUs, APUs, and semi-custom products like the chips used in the Xbox and PlayStation. In his new role, Koduri will oversee everything from hardware and software development to product management, marketing, and developer relations.

With Koduri at the helm, AMD's graphics business will be run by a graphics architect with 20 years of experience in the field. Koduri returned to AMD in 2013 after a stint at Apple, where he was involved in the transition to Retina displays. Prior to that, he served as CTO of the Graphics Products Group at AMD. Earlier still, at ATI, Koduri guided the development of the first CrossFire multi-GPU load-balancing mechanism, and at S3, he helped create the texture compression method that became known as DXTC.
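For the curious, DXTC (originally S3TC) compresses each 4x4 block of texels into just 64 bits: two 16-bit RGB565 endpoint colors plus sixteen 2-bit indices into a four-entry palette. Below is a rough, illustrative Python sketch of decoding one DXT1 block; the function names are ours, and real decoders live in GPU hardware, but the bit layout follows the published format.

```python
# Illustrative sketch of DXT1 (S3TC/DXTC) block decoding.
# A DXT1 block packs a 4x4 texel tile into 8 bytes: two RGB565 endpoint
# colors followed by sixteen 2-bit indices selecting one of four palette
# entries (the two endpoints plus interpolated colors between them).
import struct

def rgb565_to_rgb888(c):
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    # Expand to 8 bits per channel by replicating the high bits
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1 block into a flat list of 16 RGB tuples."""
    c0, c1, indices = struct.unpack("<HHI", block)
    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    if c0 > c1:
        # Opaque mode: two interpolated colors at 1/3 and 2/3
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:
        # Punch-through alpha mode: midpoint color plus transparent black
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]
    # Each texel's 2-bit index picks a palette entry, row-major order
    return [palette[(indices >> (2 * i)) & 0x3] for i in range(16)]
```

The 8:1 compression ratio (versus 32-bit RGBA) with fixed-size blocks and random access is what made the scheme so attractive for texturing hardware.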

Since returning to AMD, Koduri has quietly been hiring a number of smart graphics engineers, like Timothy Lottes, formerly of Epic Games and, before that, Nvidia, where he developed the FXAA and TXAA antialiasing algorithms. Koduri has also spearheaded AMD's LiquidVR initiative, preparing the firm to compete in the virtual and augmented reality markets.
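For reference, FXAA's appeal is that it is a cheap post-process filter: it estimates perceived brightness (luma) per pixel, finds pixels sitting on high-contrast edges, and blends them with their neighbors. The toy sketch below captures only that core idea; the function name and blending weights are ours, and the shipping shader is considerably more sophisticated about edge direction and search.

```python
# Toy sketch of the core FXAA idea (not the shipping algorithm):
# detect high-contrast pixels via a luma neighborhood test, then blend
# them toward their neighbors to soften jagged edges.

def luma(rgb):
    # Rec. 601-style weights: a cheap approximation of perceived brightness
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def fxaa_like_pass(img, threshold=0.125):
    """img: 2D list of (r, g, b) floats in [0, 1]. Returns a blended copy."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            n, s = luma(img[y - 1][x]), luma(img[y + 1][x])
            e, wl = luma(img[y][x + 1]), luma(img[y][x - 1])
            m = luma(img[y][x])
            contrast = max(n, s, e, wl, m) - min(n, s, e, wl, m)
            if contrast < threshold:
                continue  # flat region: leave the pixel untouched
            # Blend halfway toward the axis-aligned neighbor average
            avg = tuple((img[y - 1][x][c] + img[y + 1][x][c] +
                         img[y][x - 1][c] + img[y][x + 1][c]) / 4
                        for c in range(3))
            out[y][x] = tuple(0.5 * img[y][x][c] + 0.5 * avg[c]
                              for c in range(3))
    return out
```

Because it needs only the final color buffer, a filter like this slots into any engine as a single full-screen pass, which is why FXAA spread so quickly after its publication.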

CEO Lisa Su expressed a desire to make AMD's graphics business more "agile" and "vertically integrated" with this move. Dr. Su also cited a goal of winning back lost market share in graphics.

That last goal is especially important given that recent analyst reports have pegged AMD's share of the discrete GPU market at just under 20%, down from roughly 40% several years ago.

During that span, AMD has only released a few new discrete graphics chips; its chief competitor, Nvidia, has refreshed its lineup from top to bottom in the past 12 months.

The Radeon business also struggled due to a strange confluence of events over the past couple of years. A GPU shortage caused by an explosion of demand from cryptocurrency miners kept the Radeon R9 290 and 290X out of the hands of PC gamers for months after their introduction. Not long after the crypto-mining bubble burst and the smoke cleared, Nvidia introduced its GM204 chips, including the attractively priced GeForce GTX 970 at $329. Strangely, AMD waited nearly six months before officially cutting prices in response. By then, the market share shift was well underway.

No doubt some of today's changes were prompted by AMD's recent graphics market share struggles. The center of gravity for AMD's graphics business will shift to the company's headquarters in Sunnyvale, California. According to a source familiar with the matter, long-time ATI and AMD executive Matt Skynner will be leaving the company. Skynner had previously been the corporate vice president and GM of products for AMD's Client and Graphics Group. The responsibility for AMD's software and content development efforts will now fall under Koduri's authority.

Comments closed
    • AJSB
    • 5 years ago

    AMD will NOT be “allowed” to go bankrupt…at last resort, Intel and/or Nvidia will pay to it stay afloat.

    “WHAAAT ?!? AJSB’s just gone full insane !”

    Well…maybe…or maybe not…do you remember that was Microsoft that saved Apple from bankruptcy back in the days ?

    MS paid big bucks for Apple not go bankrupt because if not MS would be accused of unfair dominant position and would be split in smaller companies.

    Of course, when i said “AMD” i am referring to the CPU part of the equation…the worst thing that could happen to Intel is if AMD CPU division is gone in the wind…taking in account that AFAIK, its x86 is not transferable to other company…unless maybe if Intel agrees with it.

    This is why i also once “equationed” the chance of AMD, in case of bankruptcy, be split and graphics division be allowed to be bought by Intel and Intel would allowed the “non-transferable” x96 licence be acquired by…Nvidia.

    Nvidia would get a x86 license and so Intel could not be accused of dominant position.

    Intel would get a dGPU technology and so Nvidia could not be accused of dominant position in the GPU market.

    ….but all in all, i trully prefer that both CPU and its graphics division survive and progress…only speculating about what could happen in a worst case scenario.

    • steelcity_ballin
    • 5 years ago

    I can’t believe it’s been 9 years already since they acquired ATI. It seems like it was just a few years at best. Man. Time dilation.

    • ronch
    • 5 years ago

    Oops, wrong post. Apologies.

    • chasp_0
    • 5 years ago

    Hes rocking a quite dapper salt and pepper hair combination. I predict full grey after 1 year on the job

    • ronch
    • 5 years ago

    How about just changing ‘Radeon’ to ‘Raja’.

    Anyone here want an AMD Raja 490X?

      • Jigar
      • 5 years ago

      The Name Raja means – King.

        • Meadows
        • 5 years ago

        How is that a relevant reply?

        • ronch
        • 5 years ago

        I see AMD did something appropriate here, making him King of the graphics division.

    • bhappy
    • 5 years ago

    Its about time they created Rebrandeon Technologies Group but lets not fool ourselves companies that spend a fraction of what their competitors do on R & D and have been losing more money than they have made for years generally don’t turn things around, they usually get bought out or go bankrupt instead.

    • LoneWolf15
    • 5 years ago

    Time to let Roy Taylor know who holds the reins.

    • Generic
    • 5 years ago

    Damn. Another victim of blue/orange color cabal.

    Poor guy doesn’t look like he’s even aware of it.

    • HisDivineOrder
    • 5 years ago

    Why is it everytime I look up, AMD is juggling around who’s responsible for what? They replace their CEO. Replace him again. Replace their executives. Hire a few “star talent” names that seem to do little.

    Then do it again. And again. And again.

    All the while, they’re selling everything not nailed down and regurgitating the same cards twice, three times, hoping no one notices it’s the same product they just bought last year and the year before.

    And if they’re releasing something new, it’s overpriced and flawed in some depressingly obvious way that should have been dealt with pre-launch with proper pricing or a cooler adjustment (R9 290X launch).

    Why do I feel like I’m watching the band play musical chairs on the Titanic?

      • SoM
      • 5 years ago

      because it’s sinking

      and i’m not holding my breath for Zen either

    • SoM
    • 5 years ago

    the next architecture will look good made out of rotten bamboo

    they keep shuffling too much, they should fire the person who does the shuffling first.

    ^^i’ll leave that there so i won’t have to edit.

    so they had multiple leaders for it’s GFX dep. btw (i’m the leader of this clan) :p

    maybe now if they have one brain leader (we’ve got the brain) things might look a lil brighter for AMTDI.

    sry had a few

      • Nevermind
      • 5 years ago

      A few what though..

        • SoM
        • 5 years ago

        some St remy, well more then some

    • ronch
    • 5 years ago

    Without doing a product refresh from top to bottom and relying on an architecture that’s largely based on 4-year old technology, really, what could anyone expect?

    Market share can swing overnight if you put out a strong lineup. I’d have no problems with AMD if they had much better energy efficiency.

      • Nevermind
      • 5 years ago

      “that’s largely based on 4-year old technology” Said the skylake hype..

        • ronch
        • 5 years ago

        Difference is, Skylake has no competition and if you want the fastest x86 CPU cores available to mankind, you get Skylake.

        Meanwhile, AMD keeps holding on to practically the same GCN architecture that came out in the tail end of 2011 while Nvidia piles on more and more cards on top of them.

        Go on, press that downvote button, AMD fanbois.

          • Nevermind
          • 5 years ago

          You kind of jumped right over my point, for fanboyism..

          All I was saying, the “skylake” goodies are basically old also.
          Yes, they added spyware bells and whistles. Performance isn’t peaking.

          So if you WANT to pay 2x the price for 1x the performance, that’s totally fine.

          But I wasn’t really going for an AMD vs INTEL fanboy fight.

          I was just pointing out, Intel didn’t really innovate that much this time.
          Maybe they didn’t feel they needed to, but that’s besides my point.

            • ronch
            • 5 years ago

            Point is, we all know very well that AMD’s current lineup is practically made up of rebrands. The slight improvements they have made with GCN were used only in two or three GPU models. What the heck are they trying to do here? Deliberately try to lose and make people go with Nvidia?

            Compare that to Intel. Did Intel introduce Haswell or Skylake in just one or two models? No, they do entire lineup refreshes. Sure there’s Broadwell but that’s a slipup in an industry that’s making slipups left and right. Pushing IPC and process tech just isn’t how it used to be, not only with Intel but with EVERYONE else. Intel is pushing the boundaries here and lest we forget, AMD is partly to blame for Intel not pushing for higher performance, if you could call it that. And it’s not like AMD’s been raising the bar since Core 2 came out NINE years ago. I’m not an Intel fan (and if you’ve been reading my posts here on TR you’d see that I’m more of an AMD fan but I avoid drinking their Kool-Aid when I can) but I’m pushing facts here.

            Intel’s been pushing x86 performance far far ahead of what anyone else has managed. They’ve raised the bar far beyond AMD ever has. Yes AMD only has a small budget, but hey, if they’re now saying they’ve decided to play in the high end again, what does that mean? They just got lazy these past many years? Either that or they’re just bluffing. Bottom line is, you can’t find a better x86 supplier or someone in the industry that’s really pushing IPC, clock speed, and process tech better than Intel is or has. What? IBM? They no longer want to play in the foundry biz and they’re not really hands down better than Intel. In fact Intel does very well against them ([url=http://blog.xcelerit.com/benchmark-ibm-power8-vs-intel-haswell/]link[/url]).

            Maxwell has been out for a while now and Nvidia knew better than to rely on TSMC to give them a process shrink. AMD bet the farm on TSMC because they either had no budget or were too lazy to put out something that’s a lot more efficient even on the same process node.

            And 1x performance for 2x the price? Where do you get your prices? Are Skylake’s CPUs, say, the 6700K, selling for 2x as much as the 4790K? We’re comparing apples to apples, you see. And is the 6600K going for 2x the price of a 4690K? Maybe the perf improvement isn’t much, but Intel’s not asking $680 for the 6700K either. Heck, they’re asking practically the same price too, or even lower once you factor in inflation.

            • anubis44
            • 5 years ago

            Notice how litte Intel’s architecture has changed since Sandy Bridge? The same reason is behind why AMD’s GCN hasn’t dramatically changed in the same time frame. Why? Because the software is only now catching up to where the hardware was four years ago, both CPU and Radeon GPU hardware. AMD and Intel’s CPUs and AMD’s GPUs were already quite nicely tuned to run what we now are seeing in Windows 10/DX12 back then, so there wasn’t much to change until DX12 arrived and evolved. AMD and Intel had software prototypes and models four years ago and saw where Windows was moving, so they designed hardware for that, but it’s taken four years for Microsoft to release a Mantle-like DX12. Now, all the AMD/Intel CPUs and Radeon GPUs will run the way they were designed to run four years ago. All of the CPU cores will now be able to efficiently feed the GPU, but only if you have a Radeon GPU. nVidia, on the other hand, played a cynical (some would say shrewd) game of minor, incremental tweaks and fixes aimed square at DX11, an API that can only feed the GPU from one CPU core. The irony is, nVidia has made tons of money out of customers that keep paying $500-$1000 for tiny performance increases with each iteration of DX11-designed card, while the GCN Radeons have been ready with asynchronous shaders to run a Mantle-like DX12 since 2011. So Radeon owners have had little reason to upgrade their cards, and AMD has made less money than nVidia. Oh the irony.

          • Nevermind
          • 5 years ago

          And can we NOT make this a fanboy fight, as of now? I don’t have time.

      • anubis44
      • 5 years ago

      AMD hasn’t dramatically changed their architecture in four years, because it is designed for DX12, an API that is only now just being released. Why would they ‘improve’ their design when the software is 4 years behind schedule? What would that new design be aimed at? Fact is, AMD was ready for DX12 with GCN 1.1, but Microsoft dragged their feet, and probably wouldn’t even have released a Mantle-like DX12 if AMD hadn’t kicked them in the ass and released Mantle. AMD has been trying to push the gaming envelope while nVidia and Microsoft have been doing practically nothing for four years, and people have been giving the Green Goblin tons of cash for minor, incremental improvements all the while. Kind of makes me sick to watch.

        • ronch
        • 5 years ago

        I’m talking about energy efficiency. At any rate, Nvidia cards still generally outperform AMD and suck less power as well. That’s a winning combo and many folks gladly pay for that.

        And going by your logic in response to my post above, Intel shouldn’t have designed new cores after the Pentium MMX because MMX took many years to see widespread adoption. Heck, AMD should have waited for the entire industry to support AMD64 before designing something better than the K8.

        You see dude, it’s not just software support. Software support will come but it has always taken years. Would it be prudent for a chip company to sit on their laurels until software support comes? How about building something better, faster, and more energy efficient while software catches up so your competitor doesn’t catch you with your pants down sometime later? People buy products based on benchmarks they see at the time of purchase. How can AMD sell more GPUs and gain market share when they’ve been sitting still while Nvidia peddles something that is generally faster and more efficient? Remember the GTX 660 vs HD7870 matchup here at TR? That’s just one example. Even without Maxwell Nvidia was already putting out better stuff than AMD at a given market segment, and with generally better drivers! The numbers speak for the product! TR gave the nod to the 660. And when Maxwell came out… Bam! AMD’s aging architecture looks even worse. To makes things even worse, the incremental changes were deployed only in a few chips like the 285. So if I wasn’t in the market for a 285 or 960, what could compel me enough to go AMD?

        As for Intel, I’m not saying they’re right in sitting on the fundamental Sandy Bridge architecture since 2011. Heck, Sandy traces its roots from Core 2, which was released NINE years ago! But you can’t deny that Intel has the best CPUs that AMD can’t touch (because AMD also sat on the fundamental K7 architecture from 1999 to 2011 and Bulldozer, like Radeon, was also ‘a little ahead of its time’ and failed to wow customers with software that was available during its launch) and now AMD also sat on GCN for far too long and NV is killing them. If that sounds incredible, look at their market share figures. The market share numbers don’t lie: AMD’s piece of the pie has been shrinking for a while now despite aggressive pricing, generally better value for money and with free games to boot! It seems pricing and free games simply can’t make up for actual product shortcomings, as shown by AMD’s slipping market share. It’s no surprise. Why would anyone be surprised? People don’t buy cards because of a future DX version or free games, though having support for DX12 doesn’t hurt. They buy cards based on what they’re seeing NOW, based on the product’s quality and current advantages.

        As it is, I honestly do not understand your logic in defending AMD and their sitting on a largely unchanged architecture for too long. Intel isn’t right, but they do offer the best CPUs so they keep their market share. AMD, as it is, seems to deliberately let Nvidia eat their cake.

    • EndlessWaves
    • 5 years ago

    “During that span, AMD has only released a few new discrete graphics chips; its chief competitor, Nvidia, has refreshed its lineup from top to bottom in the past 12 months.”

    Uh, what?

    AMD have released Tonga, Fiji and possibly a GCN 1.2 mobile part.
    nVidia have released GM200, GM204 and GM206.

    So either three chips each or two against three if the mobile part is a myth.

    Also nVidia’s lowest chip of the last year is half-way through it’s range at current prices, certainly not at the bottom of it. The GTX 750ti is more than a year old, the cards below that even older.

    Not to mention that AMD have released an entirely new fabrication process in HBM while nVidia have not even finished rolling out an the Maxwell architecture that initially released over a year and a half ago now, back in February 2014.

    Has AMD done less than nVidia in the past year? Possibly, but I think phrasing it like that is unfair and incorrect.

      • Falcon216
      • 5 years ago

      AMD has released five new cards in the last two and a half years, two of which were just part bins of the same chip.

      290/X, 285, Fury/X

      These are all high end high price video cards (with the exception of the 285), of which roughly only 2-5% of the entire dGPU market ever buys.
      Where AMD failed is their refusal to dump surplus stock of older cards and release new mid-range and low end cards, which encapsulate a very large portion of all GPUs bought.

      The 285, Tonga (GCN1.2, essentially a cut back Fiji with GDDR5 bus instead of HBM) showed that they could put a 7970GE performance level chip in a package consuming 20% less power.
      If they did away with all the older chips and produced only new GCN1.2 chips they could be selling performance competitive mid/low-mid range GPUs with a power draw comparative to Nvidia’s respective price brackets.

      Where are the HD7870/GTX950 level AMD cards that draw 130w? Nobody wants to see the 7870 as the only other price option for three years. The ~$150 market is the highest selling in volume, many people picked a generally much worse performing 750Ti over the 270, 265 (7870 or 7850 if you want to look at it that way) because of the low power draw even though they cost roughly the same. Power draw is the current marketing trend, the current sales staple information for the buyer, and AMD’s top FOUR cards all draw similar power at vastly different performance levels. And this is why they failed in the market.

      If AMD’s 4xx series is not a complete low end to high end product release they will fail once more.
      I’m of half a mind to think that next year we will see a Hawaii released AGAIN as the 480/X, the Fury bumped down to the 490/X, and the “Fury 2” brought in. All with the same/similar power draw. It would be embarrassing and terrible.

        • EndlessWaves
        • 5 years ago

        If the 290 is a high priced card then so are the GTX 970 & up, so if you’re going to discount them then nVidia have released one chip (GM206) in two configurations (950, 960), exactly the same as AMD (Tonga in 285, 380).

        You’ve got the power consumption figures the wrong way round there. The older R9 370 is similar in power draw to the GTX 950, around 100W vs. 90W in Battlefield 4, but the newer R9 380 draws a lot more than the GTX 960.

        The R7 370 is slightly off the pace against the GTX 950, 5-10% less performance at 5-10% more power draw – but then it’s also 10-20% cheaper.

        Given that it’s better value for money in terms of performance I’d have said it was an equally good choice right now, had the 370 had HDMI2/HEVC/Adaptive Sync support then it would have been the obvious one. Although frankly I wouldn’t buy 2GB card for any more than GTX 750ti prices these days, so the neither that nor the 950 is a compelling buy right now.

        AMD definitely could have done with something better than the R7 360 though. The GTX 750’s ability to do silent/single slot/no external connector models are a big advantage, and of course the GM107 is a huge selling mobile chip that AMD have nothing to match.

    • Bensam123
    • 5 years ago

    Weird I remember it being a month before you could find a R9-290 for $240 after the 970 came out. That and the 290 was routinely priced down to $220-230 until then the Fury/3XX launch and the R9-290X down to $250-270. I was recommending them till that point.

    That aside, ATI wasn’t exactly a stellar company before it was adopted by AMD. I don’t know if you guys remember the pre-catalsyt drivers, that was and still is where AMD gets the stereotype for ‘bad drivers’. Their graphics division isn’t that bad currently, lets hope this doesn’t fux things up.

      • Chrispy_
      • 5 years ago

      Well, there was some stagnation immediately after the AMD acquisition but the first “AMD graphics” products, rather than ATI graphics products were the 4000-series.

      Quite frankly, the 4850s and 4670 cards utterly nailed the upper and mid market, excellent cards that shamed Nvidia right in the middle of their re-re-re-rebranding of the G92 and struggling with hot and hungry GT200 cards.

    • tipoo
    • 5 years ago

    So they put the King at the helm, makes sense.

    (geddit? Raja? No? Ok.)

      • Ninjitsu
      • 5 years ago

      Jigar and I got it, yes. 😀

        • tipoo
        • 5 years ago

        #justbrownthings

    • guardianl
    • 5 years ago

    “The center of gravity for AMD’s graphics business will shift to the company’s headquarters in Sunnyvale, California. According to a source familiar with the matter, long-time ATI and AMD executive Matt Skynner will be leaving the company.”

    Koduri goes out and hires people who are working on technologies that will take years to see high volume commercial success (VR etc.), while AMD’s entire software efforts are a mess. Rather than have the leader where most of their work happens (Markham ON) they let him work from SoCal.

    Yeah, this doesn’t end well.

      • chuckula
      • 5 years ago

      The rumors that came out after the merger whirled around the fact that AMD basically acted like the Visigoths sacking Rome when it came to ATi. It did not produce happy feelings in the former ATi camp, and there are not a whole lot of people left from the old ATi at this point.

    • Meadows
    • 5 years ago

    His mug doesn’t look promising, but his resume does.

      • maxxcool
      • 5 years ago

      I dunno, had Edward smith lived his resume would have been toilet paper ..

    • gamerk2
    • 5 years ago

    Combine this with an Investment firm (Silver Lake) purchasing 20% of the company, I think it’s clear this is a precursor to a spinoff. This makes sense: AMD’s GPU division still has some value, so spinning (or selling) it off is the fastest way to generate cash.

      • maxxcool
      • 5 years ago

      Yes, even a google style split works ..

      • w76
      • 5 years ago

      And by far the best move for shareholders if its true: ATi will have a hard time fighting nvidia and Intel to stay relevant (nvidia has tried getting in to low power processors so they aren’t made irrelevant by SOCs). But their odds are far better on their own than with the incredible AMD CPU business anchor wrapped around their neck, choking off oxygen and pulling them deeper and saddling them with a brand that’s becoming synonymous with “bargain bin” or just outright “failure” in peoples minds.

      Spin off, call yourself ATi and your products Radeon, and get the tech elite thinking back to the glorious Radeon 9700 Pro. Marketing win.

    • southrncomfortjm
    • 5 years ago

    From what I can see, AMD can produce really competent hardware, they always just lose on the driver front. Maybe that’s already shifting with Windows 10 and DX12, but AMD needs to keep it up.

    I would love to make a return to the Red Team for my next card. They have to earn my business though.

    • Chrispy_
    • 5 years ago

    Since December 2011 AMD graphics have done practically nothing. This is a much needed change of tack but I’m just hoping it’s not too little, too late. The fruits of today’s news will not materialise in a graphics product for at least 18 months!

    Pitcairn is still in the AMD graphics lineup and Hawaii and Tonga were just very minor tweaks of what is essentially the same architecture as they’ve had for four years, near enough. Fiji may use HBM but realistically the reason it’s quick is because it’s huge and they could have built Fiji on 2011’s GCN 1.0 and still seen roughly the same results I think.

    Nvidia are making massive architectural investments and showing progress for it. AMD are just building bigger, hotter, more expensive GPUs on a very old architecture. The fact they’ve lasted this long is amazing, incredible – probably a testament to how good GCN 1.0 was four years ago, but that shouldn’t be an excuse to do nothing but minor tweaks since then.

      • gamerk2
      • 5 years ago

      AMD doesn’t have the money to replace the architecture. They cut their R&D spending after the Bulldozer fiasco, and the GPU division is getting starved.

      Why do you think they tried to push Mantle, which was designed SPECIFICALLY AROUND GCN? Because AMD knew GCN wasn’t going anywhere, and needed an edge to improve performance, since it’s quite clear their DX11 driver is very suboptimal compared to NVIDIAs.

      • Kougar
      • 5 years ago

      The coin mining bubble certainly helped them string along for a few years on its own, but now it’s time for AMD to face the music. 18 months sounds overtly optimistic unless the traditional 3-4 year GPU development cycle has shrunk? Whatever is already in the pipeline can’t change much, but I sure hope it isn’t the current architecture thrown together with HBM 2 else they are going to keep circling the drain.

      Re-releasing rebranded products that themselves were rebranded products is a sign of a company on the ropes with few good choices remaining. It’s a move that trades market share for time, but it can only be done for so long before there isn’t enough market share left to sustain the business. This news is a promising sign that AMD is making an effort to right the ship, I just hope they can.

    • maxxcool
    • 5 years ago

    /tin hat on/

    just re-read this … holy $%^& does this sound kinda of like “pre-split” maneuvering. you cannot sell something that has not leadership or full-ORG.

    “”The newly formed Radeon Technologies Group will be responsible for the graphics technology used in discrete GPUs, APUs, and semi-custom products like the chips used in the Xbox and Playstation. In his new role, Koduri will oversee everything from hardware and software development to product management, marketing, and developer relations.””

      • flip-mode
      • 5 years ago

      Meh, the number of times people said the tea leave suggest AMD is going to split or sell is too high to count. And it really doesn’t matter unless you’re an investor anyway. I don’t know why people care.

        • Leader952
        • 5 years ago

        With the recent rumor of “Private equity to buy 20 per cent of AMD” a spinoff may well be in the offing.

        • gamerk2
        • 5 years ago

        The problem is, AMD is really out of cash this time, and it gets worse in the 2019-2020 timeframe, where over a Billion in debt comes due, which they currently have no way to pay off. AMD desperately needs cash at a time they’re loosing several hundred million per quarter.

        All the markings are there for a spin; the only one missing is a wave of restructuring, which I’m sure will happen sooner rather then later.

        AMD as constructed won’t last much past Zen. I’m 80% sure the old ATI unit is getting spun/sold within a year.

          • maxxcool
          • 5 years ago

          Some of the aforementioned restructuring JUST occurred.

      • Ninjitsu
      • 5 years ago

      My first thoughts too. If “ATI” manage to get a solid injection of cash then it’ll be great for us. At the same time, AMD should get some cash to fund Zen properly.

        • Kretschmer
        • 5 years ago

        “ATI” can start pumping resources into badass cards and competitive drivers.

        AMD can just die already. We all win.

    • maxxcool
    • 5 years ago

    *THIS* is why Nv has the advantage. Let others do the work and research while making older tech work better and better with simple driver updates and better compression techniques.

    I cannot find the article on some third party site but AMD VP’s themselves have admitted their lack of real compression\optimization (outside the oddball 285) has crippled their ability to operate effectively on DX11.1 titles and OCL.

    so now AMD is here. with new tech, that despite what should be overwhelming hardware advantages only performing marginally better than non-hbm tech because of lack of driver focus. They can beat the DX12 drum all they want, dx12 will elevate Intel, their #1 competitor as much as NV and as much as themselves.

    in the end it will be a wash and we will be in the EXACT same positions as before. intel #1, Nv#2… Amd chin deep DESPERATELY treading water.

    Jen-Hsun Huang may be a sneaky bat-$%^& guy .. but he called this one right ..

      • xeridea
      • 5 years ago

      Benchmarks show, NV does worse with DX12 than DX11, while AMD does better.

        • maxxcool
        • 5 years ago

        When TR does the review. I will believe that.

        • chuckula
        • 5 years ago

        You’re statement is objectively wrong.

        Here’s why.

        You used the word “benchmark[u][i]s[/i][/u]”.

          • maxxcool
          • 5 years ago

          lol .. true!

          • nanoflower
          • 5 years ago

          It’s also based on drivers that are clearly still being developed in advance of any games. AMD has had every reason to spend more effort on DX12 drivers since they clearly have decided DX11 is a lost cause for them. It will be interesting to see how AMD and Nvidia perform in DX12 in a few months. I think the two will end with very similar performance in games and benchmarks. At least those that don’t focus on a feature that one is significantly better than the other but ignore what games will do with such a feature.

            • xeridea
            • 5 years ago

            Just saying, Nvidia has been downplaying the new APIs since inception, touting all of DX11’s greatness. It is strange they do worse in a far more efficient API, that needs little to no optimization per game from a driver standpoint because it is made for how modern GPUs work, and it is far more predictable than DX11. They pointed fingers at their terrible performance to game bugs that didn’t exist. AMD with far less resources is able to make drivers just fine, the code for game has been out for a while.

            • chuckula
            • 5 years ago

            “Just saying, Nvidia has been downplaying the new APIs since inception, touting all of DX11’s greatness.”

            Oh Rlly? Here’s a talk about Vulkan from Nvidia: https://www.youtube.com/watch?v=NqensKmmRfE

            AMD has actually been the one talking about DX12 heavily while -- ironically -- giving almost nothing to Vulkan. That’s funny because Vulkan is purportedly *closer* to Mantle than DX12, but you wouldn’t know it listening to AMD.

            • maxxcool
            • 5 years ago

            When you bake the pie, you get to choose what works best in it. But as stated, I will agree with whatever TR comes up with when we have at least 5 **MAINSTREAM** fully accelerated DX12 **GAMES**

            … (no, P vs Z does not count :P)

            • maxxcool
            • 5 years ago

            -1 for waiting for fair reviews by TR? AMD trolls….

            • f0d
            • 5 years ago

            100% agree
            A benchmark of an alpha test of an unreleased game does NOT give a clear indicator of how DX12 will perform on AMD or NV.

            We will need at least 4 or 5 DX12 games on different game engines to make that decision.

            • K-L-Waster
            • 5 years ago

            It’ll be academic in a few months, same as it is now. Most major game studios are still wrestling with whether to cut their 2016 games over to DX12 or not — if there are more than 5 titles with DX12 by June of next year I will be surprised, to say the least.

          • Ninjitsu
          • 5 years ago

          Also because they perform the same. 😉

        • gamerk2
        • 5 years ago

        Performance was within an FPS or two, which points at suboptimal drivers. As for AMD, it shows, again, how bad their DX11 drivers have been.

        Oh, it’s worth noting that AMD DX12 was slower than NVIDIA DX11. Just saying.

        • ColeLT1
        • 5 years ago

        I don’t speak for everyone, but once I start playing DX12 games it won’t be on the hardware I have now.

        • Sabresiberian
        • 5 years ago

        The one test suite done on something that is not a game showed AMD’s cards gaining more performance than Nvidia cards using DX12. Unfortunately for AMD, that just brings their offerings up to around what Nvidia had already using DX11, maybe slightly better, overall.

        http://www.pcper.com/reviews/Graphics-Cards/DX12-GPU-and-CPU-Performance-Tested-Ashes-Singularity-Benchmark

      • flip-mode
      • 5 years ago

      So… TL;DR: drivers?

        • maxxcool
        • 5 years ago

        Seems to be.. but we will see in 6+ months

      • the
      • 5 years ago

      I disagree.

      If you need to leapfrog your competition (and on the CPU side, AMD certainly does), you have to invest in research to do it. Otherwise, applying the same formula of more cores, more clock speed, and more bandwidth will only let you scale so far.

      AMD’s reasoning for including so much compute functionality in their first Radeon HD 7000 series product wasn’t DirectX 12; that wasn’t even an idea at MS then. Rather, it was AMD implementing HSA, which they had planned long before Tahiti. It just makes sense to use a common architecture between discrete and integrated designs whenever possible.

      The real problem with AMD right now is that they misjudged TSMC and their 20 nm process. AMD bet that it would be good for GPUs, and it didn’t pay off. Hence Fiji is nearly 600 mm^2 on a 28 nm process and the rest of the 2015 GPU lineup are rebrands. AMD has admitted to scrapping several 20 nm products in their recent financial analyst calls. This clearly made 2015 an endurance year for AMD.

        • maxxcool
        • 5 years ago

        I agree that TSMC has yet again clipped a hopeful silicon design ..

        And yet, the driver focus they are only now showing is a bit late and has done as much damage as TSMC’s process stumbles. AMD is not faultless in this scenario.

      • maxxcool
      • 5 years ago

      The real winner is Intel in the end 🙁

        • Meadows
        • 5 years ago

        …Hey, as long as somebody wins!

    • chuckula
    • 5 years ago

    Meanwhile, back at the fab, the Fury X is in the middle of massive supply issues and cards are going for inflated prices on the second-hand market: http://www.amazon.com/s?ie=UTF8&field-keywords=R9%20Fury%20X&index=blended&link_code=qs&sourceid=Mozilla-search&tag=mozilla-20

    And it ain't because they've sold out due to overwhelming demand. If this sounds like an issue with production not doing its job, it is. And unlike early 2014, AMD can't even fall back on the excuse of Litecoin miners snapping up R9 290s and then dumping them 3 months later, since that ship has sailed.

      • nanoflower
      • 5 years ago

      Perhaps Nvidia was right to skip HBM in 2015 and let AMD be the first to market with an HBM product? That way AMD deals with the production issues, and hopefully Nvidia won’t have similar issues next year when both AMD and Nvidia move to HBM2.

        • maxxcool
        • 5 years ago

        DINGDINGDING!

        • chuckula
        • 5 years ago

        Yeah, Nvidia could certainly have supply issues with HBM2, no doubt.

        Waiting until next year improves the odds of a smooth rollout, but it doesn’t guarantee a problem free launch.

        They were *very* smart to not bet the house on HBM this year though, since we’ve seen it hasn’t really hurt them.

          • Chrispy_
          • 5 years ago

          And HBM is wasted on the Fury because it still has a massive ROP and Raster bottleneck.

          After reading all the reviews, it seems that HBM was used solely to shrink the memory controller enough to make a 4096-shader part, and that the smarter move, with hindsight, might have been a 3072-shader part with better ROP and Triangle throughput….

          edit,
          if you want a specific link, take a look at what [b<]hasn't[/b<] changed between Hawaii and Fiji in Scott's table on [url=https://techreport.com/review/28513/amd-radeon-r9-fury-x-graphics-card-reviewed<]page 1[/url<]. Hawaii was not bandwidth starved. OpenCL benchmarks and synthetics prove this, so effectively Fiji is Hawaii's front end with 40% more shaders, and it's looking like some of the time, even Hawaii has more shaders than it can use (like when a 1GHz 290 performs exactly as well as a 1GHz 290X despite the extra 256 shaders on the X.

            • maxxcool
            • 5 years ago

            Link? I do not doubt it but would love to read it!

            • Chrispy_
            • 5 years ago

            Link? Every Fury X review ever. Performance did not scale as expected, and almost every reviewer concluded that the zero improvement in the ROP and Raster/Triangle portion of the GPU was to blame. It certainly wasn’t a lack of shaders or bandwidth….

            • maxxcool
            • 5 years ago

            🙁 dang was hoping for a link with pie level analytical stuff ..

            • Chrispy_
            • 5 years ago

            Heh. Sorry.
            Try eating some pie if that’s what you’re hankering for. I recommend bacon flavour.

            • maxxcool
            • 5 years ago

            🙂 bacon apple pies!

        • the
        • 5 years ago

        History repeats itself. AMD was the first to GDDR5, which gave them competitive bandwidth despite a narrower memory bus, and at the time it paid off rather well.

        This time, not so much, considering the unbalanced nature of Fiji. AMD just increased the shader count, which is good for compute, but left the geometry processing and ROPs the same as the R9 290X. Ultimately it would have been a better product had AMD improved those aspects of the design alongside the shader count (say, with 3584 shaders instead of 4096).

        nVidia, on the other hand, learned the hard way never to do too much at once with a large flagship design: GF100 was on a relatively new process and well over 500 mm^2. The result was a product that was months late, ran hot, and had horrible yields. It wasn’t until GF110, nearly a year later, that everything was fixed.

        nVidia may be setting themselves up for another GF100 if they go for the large Pascal early next year. TSMC’s 16 nm FinFET process is very new, and HBM is another layer of complexity. I see nVidia launching the midrange GP104 chip in early 2016, but without HBM. The big Pascal chip will come later in 2016.
