AMD touts unified gaming strategy

Last week at GDC, AMD held an event for the press highlighting its commitment to gaming and graphics while dropping a series of newsy bits along the way. We’ve already reported on some of the news items, but the firm had a broader picture to paint that we’ve not yet conveyed.

One of the most noteworthy messages coming from AMD was simply the firm’s reiteration of its dedication to the gaming business. This is, after all, a company attempting to transform itself under the direction of a new management team, in the context of a shifting landscape, and it has some tough choices to make about where to commit its very finite resources. AMD has been consistent about emphasizing its commitment to gaming, even in the face of the news that the Radeon HD 7000 series won’t see the expected refresh until the end of 2013. That emphasis is noteworthy because we haven’t been hearing the same sort of noises related to, say, high-end desktop processors or even Opteron—not to this degree, at least.

One reason AMD is committed to gaming, of course, is because it’s a very big market. Matt Skynner, the firm’s Graphics GM, shared some projections based on data taken from analyst firms IDC and JPR, alongside AMD’s own internal numbers. Those projections peg the PC gaming hardware and software markets at about $40 billion per year and growing—an eye-opening number.

As you can see in the slide above, that makes the overall PC gaming market roughly the same size as the console gaming market—and both are several times the size of the various mobile gaming segments. We keep hearing that much of the growth in the PC gaming market is coming from non-traditional sources like free-to-play games, MMOs, and regions like APAC. Still, the sheer size of the overall market is more formidable than PC gaming’s profile in the U.S. media would seem to suggest.

From there, AMD’s ensuing presentation was built around an alliterative collection of four “Cs”: cloud, content, console, and client, all pillars of its “unified” gaming strategy.

The “cloud” portion of the picture is AMD’s response to the opportunity presented by services like OnLive. In this model, computationally intensive games run on a remote server, and the audiovisual info they create is streamed to a relatively thin client over a network. AMD’s rival Nvidia has taken an end-to-end approach to cloud gaming, developing a turn-key solution that includes everything from GPUs with hardware virtualization to a hypervisor software layer and a rack-mounted chassis. True to its usual form, AMD has taken a more partner-centric approach where it provides the hardware and lets other firms handle the integration work.

To enable those partners, AMD announced the Radeon Sky series of graphics cards. These are fanless graphics cards with big heatsinks intended for installation in high-density servers. The headliner of the series is the Sky 900, which packs two of the Tahiti GPUs that power the Radeon HD 7970.

AMD has lined up several partners for its cloud gaming efforts—probably not hard to do, since Nvidia has essentially chosen to compete against them. The most visible of those partners to date has been CiiNow, whose VP of Marketing and Publisher Relations, Chris Donahue, was on hand to talk about his firm’s Cumulus cloud gaming platform. His message focused on the number-one question about Internet-based cloud gaming services: input lag, or the gap in time between a button press and a visible reaction in the game world. Donahue claimed Cumulus can achieve about 17% lower end-to-end latency than some Xbox 360 games do locally. The keys to this success? One, CiiNow has optimized every stage of the process to reduce latency. Two, today’s consoles have input lag on the order of about 120 milliseconds, by his numbers. Yikes.
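The arithmetic behind a claim like Donahue’s can be sketched as a simple per-stage latency budget. The stage names and timings below are our own invention for illustration only—CiiNow has not published a per-stage breakdown:

```python
# Hypothetical end-to-end latency budget for a cloud gaming pipeline.
# Every stage timing here is an assumed, illustrative number.
STAGES_MS = {
    "controller poll + client send": 8,
    "server simulate + render frame": 17,  # one frame at ~60 fps
    "video encode": 12,
    "network uplink": 15,
    "network downlink": 15,
    "client decode + display scanout": 33,
}

def total_latency_ms(stages):
    """End-to-end input lag is simply the sum of the stage delays."""
    return sum(stages.values())

cloud_ms = total_latency_ms(STAGES_MS)
console_ms = 120  # Donahue's figure for some Xbox 360 games
improvement = 100 * (1 - cloud_ms / console_ms)
print(f"cloud: {cloud_ms} ms vs. console: {console_ms} ms "
      f"({improvement:.0f}% lower)")
```

With these made-up numbers, the budget lands at 100 ms—about 17% below the 120 ms console figure—which illustrates how thin the margin is: shaving every stage is the only way a round trip over the Internet can beat local hardware on paper.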

Neal Robison, AMD’s Senior Director of Software Alliances, took the happy job of highlighting AMD’s success in the console market. With the PlayStation 4 spec already announced and details on Microsoft’s next Xbox expected soon, we may see an unprecedented unification of underlying hardware architectures shortly. The two major consoles and roughly half of the PC market will be based on x86-compatible processors and GCN-derived GPUs. Robison argued this trend will be positive for all involved because it should let game developers focus on creating content, not on worrying about architectural limitations or differences.

Analyst Rob Enderle then took the stage to elaborate on that point. He claimed the gaming market has been negatively impacted by the current generation of consoles outstaying their welcome. However, with the move to x86 processors and Radeon graphics, Enderle believes the console makers could introduce new generations of consoles more often while maintaining compatibility with older software. He also thinks the use of a common graphics architecture across platforms will reduce the cost, and thus risk, of creating AAA games with rich visuals, mostly because a large chunk of development costs today goes to porting between platforms.

AMD’s new Chief Product Architect for Graphics, John Gustafson

Up next was John Gustafson, AMD’s new Chief Product Architect for Graphics. Gustafson’s background is in high-performance computing; he joined AMD last August. He briefly reiterated AMD’s GPU-related goals and offered a sketch of the short-term product roadmap, which includes more GCN-based notebook and desktop GPUs in the first half of 2013, followed by new GPUs and APUs in the second half of the year. Longer term, Gustafson expects AMD’s graphics chips to enable much better physics simulations and new levels of realism in games. As an indicator of the potential there, he claimed that today’s best GPUs could render films from a couple of years ago in the time it would take to watch the movie.

Both Gustafson and the next speaker, AMD Director of Developer Relations Ritchie Corpus, underscored AMD’s commitment to optimizing games for its hardware, but doing so in a way that keeps faith with industry standards. As examples, Corpus cited the use of DirectCompute to implement some value-added features it contributed to recent games, including the global illumination lighting path in DiRT 3 and the TressFX hair simulation used in Tomb Raider. Both features will run on any DirectCompute-capable GPU, a contrast to the proprietary approach Nvidia has taken with its PhysX and CUDA value adds, which only run on GeForce cards.

Corpus then introduced a succession of game developers who spoke about the ways they’ve partnered with AMD. Brian Horton, Senior Art Director for Crystal Dynamics, explained how AMD handed off its TressFX technology for integration into Tomb Raider. By video, Ken Levine and Chris Kline from Irrational Games talked about how they added contact-hardening shadows, HDAO, and multi-monitor support to BioShock Infinite with the assistance of AMD. And Nick Button-Brown from Crytek told the story of AMD engineers coming on-site to Crytek’s studios during the development of Crysis 3. Button-Brown also revealed that the next patch for Crysis 3, coming this week, will include “full Eyefinity and HD3D support,” which could be pretty amazing considering the astounding quality of the game’s visuals.

The session ended with a couple of sneak peeks. First up, folks from Crytek and Illfonic gave a quick preview of a project they’re working on for AMD: a new graphics demo featuring the “Ruby” character whose real-time demos used to headline each Radeon launch. The new Ruby demo will use CryEngine and TressFX, and it will be directed by Simon West, whose films include The Expendables 2. Ruby looks very different—she’s clearly been rebooted. Unfortunately, the demo was too short and quick for us to snag any decent pictures of this work in progress. Judging by the current state of things, we’d wager it’s being prepared for the launch of the new generation of GPUs due at the end of 2013.

Finally, AMD Product Manager Devon Nekechuk held up a video card that’s coming soon: the Radeon HD 7990. With dual Tahiti GPUs and three large fans onboard, Nekechuk claimed it will be not only “the world’s fastest graphics card” but also “whisper quiet.” There are dual-Tahiti cards on the market now from a couple of board makers in fairly low volumes, but this card will be AMD’s own reference design. AMD teased the possibility of the 7990 over a year ago, at the Radeon HD 7900 series launch, by showing a slide with the outline of New Zealand (the “southern island” code name for this dual-GPU part) alongside the words “Coming soon!” Now, it seems, the 7990 is being reanimated as a means of plugging the gap until the next-gen Radeons arrive at the end of the year.

Comments closed
    • ronch
    • 7 years ago

    So what AMD is saying, is that 20 years from now we’d all be doing nothing but play games.

      • brute
      • 7 years ago

to assume that AMD will be its own company in 20 years and not a subsidiary of the PlaySkool ToddlerComputer division is funny

    • XaiaX
    • 7 years ago

    “Two, today’s consoles have input lag on the order of about 120 milliseconds, by his numbers. Yikes.”

This is complete nonsense. F.3.A.R. *may* have 120 ms of input latency, but no one is holding that up as a paragon of responsiveness in gaming. A more reasonable comparison would be to some iteration of Call of Duty, which has much lower latency, because they specifically write the engine that way. (And also to run at 60 fps, which is the main driver of latency in any game, since they usually tie their input polling to their image refresh, making an inconsistent frame rate lead to inconsistent controller feel.)

    It’s just AMD cherry picking a sluggish game to make their numbers sound better. I imagine a direct comparison of input latency for the same game across systems would show negligible differences between any of the consoles or PCs.

      • El_MUERkO
      • 7 years ago

Interestingly, Sony’s PS4 lead designer made a host of changes to the controller to combat input lag, which was a problem with the PS3. Check out Digital Foundry on Eurogamer; they go into it in great detail. I’d love TR to work with DF, my two favourite sources of game-tech information & opinion.

    • Mat3
    • 7 years ago

    Hopefully the rebooted Ruby won’t have that mole.

    • Airmantharp
    • 7 years ago

    Thinking about this a little more, I agree that AMD will be able to get GCN in a lot of places, like tablets, but consoles just don’t count as ‘everywhere’. Consoles may be ubiquitous, but coding for either one will require a different set of tools, each being different than coding for a PC environment running Windows, OSX, or Linux.

    x86 is going to be everywhere only because Intel wills it so.

    • liquidsquid
    • 7 years ago

So for roughly 85% of the rest of the population of the US… how do you reduce input lag below 120 ms when they are on DSL at best? Cloud gaming won’t be “the next big thing” until the US starts investing in internet infrastructure to bring higher speeds and greater bandwidth to the masses.

      • l33t-g4m3r
      • 7 years ago

      Not to mention how insane you’d have to be to actually “buy” games over such a system. I’m sure there’s a few people here that would do it, but they aren’t people who care about owning what they buy.

    • sutyomatic
    • 7 years ago

    John Gustafson? I bet then Max Goldman is working at nVIDIA.

    • Ryu Connor
    • 7 years ago

[quote]Nekechuk claimed it (7990) will be not only "the world's fastest graphics card"[/quote] I'm sure no runt will be left behind.

      • JustAnEngineer
      • 7 years ago

      I posted in the comments to the FCAT thread. I expect that the full render is done in the frame buffer, but it is quickly obsoleted by a fresh frame from the other GPU before very many scan lines are output to the display. It’s not that work is being skipped, it’s that the two GPUs are too close to being in sync rather than being completely out of phase with their alternate frame rendering.

    • My Johnson
    • 7 years ago

I’ve got to say the TressFX was actually worthwhile. I ended up wishing more characters in the game used it.

      • HisDivineOrder
      • 7 years ago

      I like how the video flashback moments where you see what she recorded on the camcorder (in Tomb Raider) have the normal hair and then the game proper has her with her new hair.

      It’s like when she took a dip in the ocean, she washed her hair at the same time. It came out so light and fluffy and bouncy and shiny. Before, it was clumpy and seemed to have tangled together. After that swim in the ocean, it was so shiny. Only problem is she seemed to have split ends.

      She needs to get some conditioner, stat.

    • JustAnEngineer
    • 7 years ago

[quote]AMD's commitment to optimizing games for its hardware, but doing so in a way that keeps faith with industry standards. As examples, Corpus cited the use of DirectCompute to implement some value-added features it contributed to recent games, including the global illumination lighting path in DiRT 3 and the TressFX hair simulation used in Tomb Raider. Both features will run on any DirectCompute-capable GPU, a contrast to the proprietary approach Nvidia has taken with its PhysX and CUDA value adds, which only run on GeForce cards.[/quote] This is reason enough to cheer for the good guys.

      • HisDivineOrder
      • 7 years ago

      The very second you call any corporation “the good guys” you really should reevaluate your biases. No corporation is good. No corporation is bad. They’re like a hurricane or a shark. There aren’t good sharks and bad sharks. There aren’t good hurricanes and bad hurricanes.

They just are. If you assign “good guys” to AMD, then you are giving them attributes they do not have. They want to make money. That is all they want to do. If making money meant selling gaming out and making waffle irons, they’d do that. If making money means going up against an entrenched company that has the resources and money to develop an end-to-end solution for cloud gaming, then they’ll go the cheaper, easier route of collecting partners and promising big things while not really working out the particulars. If making money meant delaying CPU’s and GPU’s for six months to a year to fill their money coffers, then they’ll delay. If making money means having to take a card they already designed last year and release it now, then they’ll do that.

      Nothing AMD does is good. It just is. AMD doesn’t believe they could get away with proprietary gimmicks because they just don’t have the high end GPU’s to push it. nVidia doesn’t either, but they think they do.

      Corporations may not be good or bad, but they can be deluded as hell.

        • JustAnEngineer
        • 7 years ago

You’re probably right. AMD is neither good nor bad. They just [b]appear[/b] to be good in comparison to the black-hearted evil marketing geniuses at NVidia. ;p

        • l33t-g4m3r
        • 7 years ago

        I disagree. You may be right to a large degree with generalization, but there are plenty of corporations that do operate with a code of ethics greater than their competition, and people looking for ethics will buy their products over the competition for that reason. Sometimes you want to buy American over Chinese, or locally grown organic over mass produced in mexico. There’s a market for this, and AMD does seem to be trying to fit in this niche, and has been for years.

          • peartart
          • 7 years ago

          yeah, calling all corporations amoral ignores the fact that some of them do clearly immoral things, and doing immoral things is the usual definition of immorality. It’s easiest to see if you look back at history and see stuff such as violent union busting and other exploitative labor practices.

            • Airmantharp
            • 7 years ago

            Specific morality is undefinable and fluid; for a corporation, not doing everything they can get away with to make profit for their stakeholders could be considered immoral.

            • peartart
            • 7 years ago

            Sure, you can use a variety of definitions of morality (only mathematicians have a right to expect abstract concepts to be well-defined), but the situation you gave doesn’t create amoral corporations either.

    • spigzone
    • 7 years ago

    “One reason AMD is committed to gaming, of course, is because it’s a very big market”

    Reason 2 – only one major competitor.
    Reason 3 – that competitor is vulnerable.

    Reason 4 – a CEO with a winning record intent on laying Nvidia’s gaming guts on the floor.

      • HisDivineOrder
      • 7 years ago

      You forgot Reason 5 – It’s the one area where AMD is actually still vaguely competitive.

      • kilkennycat
      • 7 years ago

Nice to be top silicon dog in the console market, with razor-thin profit margins and continuous downward price pressure from the console manufacturers. nVidia got a very nasty taste of that from Microsoft on the original Xbox. And since AMD’s financial position is highly precarious, I’m sure that Sony (and Microsoft, if the next Xbox has AMD silicon) will have insisted on access to the complete design and fab information, should AMD go “belly up” or be acquired by a third party.

    • tbone8ty
    • 7 years ago

    7990 Malta has to be “sea islands” updated Tahiti chip since they are saying its “whisper quiet”

    i wanna see a Kabini based Razer Edge for around $499

      • HisDivineOrder
      • 7 years ago

      I wanna see a $100 iPad, a $500 Razer Blade, a $1 Oculus Rift, and a deal where they PAY me to own an Ouya.

      Alas, we don’t get what we wanna see. We get what they want for us instead.

    • spigzone
    • 7 years ago

AMD is going to [b]C-R-U-S-H[/b] Nvidia in the gaming space. It had the tools and now it has the leadership to make it happen. By this time in 2015 Nvidia will be a fringe player in consumer graphics sales.

      • NeelyCam
      • 7 years ago

      Good troll. +1

      • LoneWolf15
      • 7 years ago

      Speaking for AMD fans -we/they don’t need you.

      • My Johnson
      • 7 years ago

      I offer you a 😉 face just to be safe.

      • HisDivineOrder
      • 7 years ago

      April Fools!

      • spigzone
      • 7 years ago

      All your C-R-U-S-H are belong to me!!!

        • brute
        • 7 years ago

        camel crush is as gross as 60 year old hooker booty

    • spigzone
    • 7 years ago

    For anyone looking to play in the gaming space, AMD provides a set of synergistic integrated one stop cutting edge solutions nobody else can come close to matching. An AMD x86 gaming hegemony is looking like a real possibility.

Developers can only see Kaveri and its successors with $$$ in their eyes: super easy to program and optimize for, turning even entry-level PC machines into gaming machines capable of spectacular graphics and gameplay, and opening up a new demographic to sell their games to. AMD wins, the developers win … Intel and Nvidia lose.

    I’m thinking those rumors of JHH being considered for the job of the next Intel CEO are looking better all the time.

    • indeego
    • 7 years ago

I think it’s safe to say that whatever AMD’s strategy is, betting on the competition’s would be a pretty safe bet.

      • spigzone
      • 7 years ago

      That makes no sense. Rory Read has a winning record to date and is doing a bang-up job at AMD so far. The safe bet is AMD.

        • NeelyCam
        • 7 years ago

        He’s been doing pretty well all things considered. But AMD is still doing so badly that I would bet on NVidia.

        • HisDivineOrder
        • 7 years ago

        Are you kidding? He fired a lot of the engineers and R&D. Axed their PR team right before two big launches. Then as his big innovation in the one area where AMD is doing well (ie., GPU’s), he has them delay everything a year after what’s left of the driver/engineering teams just discovered they completely missed the boat and were giving up performance/smoothness for years.

        I can’t decide if he didn’t fire enough of them or if he fired so many it negatively impacted their work. Certainly, I haven’t seen anything Rory’s done to warrant a, “bang-up job.”

        Unless you were being sarcastic. In which case, you’re not doing a bang-up job on that. 😉

          • NeelyCam
          • 7 years ago

          Rory has been very effective in cutting cost. Pundits argue over if some parts should’ve been cut or not, but Rory’s trying to improve the company’s cash flow so they’d have a future, and then focus on building that future.

          AMD will never again be able to make good margins on consumer desktop products, and it’s unlikely they can do that on high-end mobile products either. Intel is too far ahead in cost/performance (cost != price). However, focusing on high-margin server business and low-end mobile/tablet (Jaguar) seem like good margin propositions.

Rory’s dispassionately looking at numbers, and cutting out whatever doesn’t make sense, instead of having emotional ties to this project and that project. This is exactly why I think non-engineers make the best managers/CEOs.

            • anotherengineer
            • 7 years ago

Non-engineers… hmmm, I recall seeing a post from you before saying JH Huang is a decent CEO.

[url]http://en.wikipedia.org/wiki/Jen-Hsun_Huang[/url] and he is an engineer.

            • Airmantharp
            • 7 years ago

            I think you’re right on all points- if AMD hadn’t cut when they did, they might’ve gone under. They’re focusing on what matters while their competitors enjoy the margins of less competition; sadly, that means Intel and Nvidia get to sit on their own products for the same amount of time.

            But focusing on servers, where Bulldozer was targeted, makes much more sense- here, they can be cost competitive with Intel, and even performance competitive. Tablets are another win, where people want real x86 with full-fat Windows and real graphics performance in something portable.

    • alwayssts
    • 7 years ago

    I think AMD is certainly going to market products scaled across resolutions and settings, with GPUs/APUs purposely overlapping the xbox/ps3 specs (granted in a different config). It should help them acquire a foothold in each segment while creating consistency of scalability of products across resolutions. It could potentially be a big deal when they are setting the goal-posts (especially for use of compute etc vs current games where extra shader resources are often not needed).

Also, I too remember Ruby accompanying each product-generation launch. Too bad y’all couldn’t snag a picture, but appreciate the details. I question if CryEngine is the right choice, but we’ll see.

    I guess we’ll have to wait for ‘the coming weeks’ (which in marketing-speak I take to mean less than 2-3 months, ie this quarter) for it to be unveiled…alongside either 7890 or 7990, I presume.

FWIW, if AMD doesn’t launch a new series but launches the 7790, 7890, and 7990 (the latter officially) somewhere over the course of 3 months, with the former being new chips and the latter being (if they fix frame-time issues) similar to a higher-end GPU, can it really be called a delay (for Bonaire/Hainan) compared to perhaps re-launching them 6 months later with slightly different specs (ie when faster/higher density ram is available and salvaged gpus are thoroughly collected/binned)?

I’m kind of curious on people’s thoughts on the last part.

    (Edited for grammar.)

      • HisDivineOrder
      • 7 years ago

      I bet they chose Crytek as the engine after they saw the faces in Crysis 3. That old woman…er… I mean, Psycho is very detailed.

    • Bensam123
    • 7 years ago

Huh… that graph still shows the console and handheld markets growing at the same rate… The mobile market should be consuming the handheld market, and the console market will soon be taking a downturn.

Of course that may incite some sort of controversy. It’s best not to actually make predictions when predicting and instead show last year’s trends; this way you can avoid being wrong, and even if you are you can say it’s based off old data. It’s win-win!

It is sort of interesting how AMD is working more closely with software developers on things they could’ve done on their own, like physics simulations for hair. I’d almost say they’re more interested in seeing these features than the game developers are. It does suck that AMD is doing this in a way that alienates Nvidia users, though, instead of pushing the entire industry forward with some sort of universal standard, such as TressFX for OpenCL or something.

    …I wonder if AMD considered making their own physics SDK that doesn’t have limitations, since Havok is owned by Intel and hasn’t seemingly done anything with it and all AMD has done with PhysX is turn it into a cheap two dolla crack whore.

      • Game_boy
      • 7 years ago

      So I’ve seen the data that says DS>3DS and Vita>PSP in software sales.

      But where is the data that the mobile industry revenue is increasing year-over-year? Where are the companies with big public profits from this growth? It’s just assumed that one leads to the other but I hypothesise it’s just the 3DS/Vita overshooting the DS/PSP audience with unwanted technology.

        • Rand
        • 7 years ago

        Vita has been a disaster obviously and the PSP wasn’t really a big success itself, but the 3DS appears to be selling very well after a bit of a rough start so I’m not sure there’s a big concern about the near term viability of dedicated gaming handhelds.

        • Bensam123
        • 7 years ago

        I don’t know why you’d see any handhelds continuing to be successful. Pretty much everything you can do on a handheld you can do on a smartphone or other mobile devices, they just need to port their games to them. Nintendo would make a killing doing this as all their games are pretty casual and laid back to begin with, they perfectly fit the mobile market. I guess they see more sales coming from the hardware devices?

        Either way, after using a smartphone and playing a couple games on it, it’s pretty easy to see this niche disappearing entirely.

    • Alexko
    • 7 years ago

    Did they specifically say that the 7990 would have Tahiti GPUs?

    • StashTheVampede
    • 7 years ago

The concept of a “network” rendering GPU is pretty slick, but can we use it at home? Imagine your “new” home server that has one (or a few) of these Sky-based cards while your current computer simply can’t run the latest game. Set up the client/server and now your server is doing all the rendering for your several-year-old card, and all you’re performing is the interactions.

AMD is definitely targeting this for the online gaming market, but there is a tiny niche of users that would love to stream this out at home.

      • Waco
      • 7 years ago

      I would personally love this. My HTPC would then house a “big” GPU with hopefully reasonable capabilities and then I’d simply build a tiny box to stream games with that would sit in my office.

      I highly doubt that’ll come to light…but a man can dream, right?

        • StashTheVampede
        • 7 years ago

        Imagine replacing one card in your house, but it powers 3-4 computers at the same time.

        • spigzone
        • 7 years ago

        Essentially what Steambox is meant to do. I know Valve mentioned Nvidia at CES, but am thinking AMD will win the contract as all the reasons it ended up with the console wins would apply to what Valve needs.

          • Deanjo
          • 7 years ago

Um, ya, no, for one simple reason: Steambox is Linux-based, and AMD’s Linux development team has been whittled down to next to nothing.

            • spigzone
            • 7 years ago

            Actually AMD’s Linux development teams were re-organized and re-focused. Bet you’ll never guess where that new focus is directed.

            • NeelyCam
            • 7 years ago

            Windows 8?

            • Deanjo
            • 7 years ago

            Part time janitorial duty.

            • Deanjo
            • 7 years ago

re-organized = shrunk; re-focused = reallocated to other non-Linux projects

            • ermo
            • 7 years ago

            Didn’t AMD try to hire a lot of Linux engineers around this time last year?

            I get that the new management needed to contain costs, but a bit of stability after this period of upheaval probably wouldn’t go amiss. I would tend to think that engineers need time and job security to perform at their best.

Putting on my slightly rose-tinted glasses, I would tend to say that the recent incremental improvements they are putting out, coupled with their design wins, their improved focus on mobile computing, the alliances they are making, and their embrace of heterogeneous computing (x86+ARM+GCN), make me wonder if AMD are actually slowly but steadily plotting a course out of the downward spiral by executing well despite limited resources.

            If one looks at the things AMD have cancelled or postponed, it appears to me that Rory & Co. are focusing less on innovation and new designs and more on better execution with existing tech and manufacturing processes, thus attempting to make the most of their talent and their investments. Here’s hoping that this newfound focus on execution in the short and medium term will pay dividends in the form of increased sales volume and profitability and in turn help fund their long term R&D.

            • Deanjo
            • 7 years ago

[quote]Didn't AMD try to hire a lot of Linux engineers around this time last year?[/quote] No, in fact they shut down their OS Research Center that was responsible for adding items like CPU feature support (such as ACPI, scheduling, etc.) to the kernel. Those are being maintained on a volunteer basis by the former engineers who were laid off. For how long... we don't know.

            • ermo
            • 7 years ago

            I misremembered — it was two years ago, not one (so March 2011):

[url]http://www.zdnet.com/blog/btl/amd-to-hire-1000-it-positions-globally/45944[/url]

            • Deanjo
            • 7 years ago

            Two years ago was about 4 restructuring plans ago in AMD land.

      • peartart
      • 7 years ago

      GPU virtualization should eventually come to consumer hardware like it did for CPUs, so you are more likely to be passing a virtual GPU from your main desktop to a laptop/HTPC than to have a server dedicated to this.

      • HisDivineOrder
      • 7 years ago

      What you describe is essentially nVidia’s Project Shield tech. I agree, I think this would be great. I wish nVidia would skip the handheld whatever and focus on making this tech work for regular laptops running nVidia or Intel integrated GPU’s served by nVidia Geforce powered desktops.

      However, the company that really should have built this into a standard for all computers is Microsoft. That is, if they even gave a damn about PC gaming anymore.

    • Sam125
    • 7 years ago

    Sounds like AMD has a coherent plan to include their entire portfolio of gaming products and services. Everything sounds good on paper but to anyone who isn’t a hack, AMD needs to execute well enough for their unified gaming strategy to matter.

      • spigzone
      • 7 years ago

      With the console wins their unified gaming strategy already matters.

        • HisDivineOrder
        • 7 years ago

How long do you figure their console wins are going to enable them to have the better, smoother ports they seem to be promising? One generation of GPUs? Or do you think Maxwell will so far outstrip their performance that they won’t even have that?

    • brute
    • 7 years ago

    “We’ll have bad drivers across the entire product line!”

      • NeelyCam
      • 7 years ago

      I would have upthumbed you, but you failed to mention first in a first post, so…

      • Deanjo
      • 7 years ago

      Consistency is key.

        • JustAnEngineer
        • 7 years ago

        Remember years ago when NVidia’s GPUs botched DXT1 texture compression and NVidia refused to fix it for several GPU generations? The result: The entire game development industry quit using that type of compression because they didn’t want to have to maintain two sets of code: one for everyone who did it right, and one for NVidia.

          • Ryu Connor
          • 7 years ago

          You mean a compression scheme that is patent encumbered and undesired in open source software? A permissive standard that allowed engineers to choose different implementations of compression?

          Yeah, I’m sure the situation is a red and green dichotomy.

            • JustAnEngineer
            • 7 years ago

I mean one where the ATI, Matrox and 3Dfx cards all matched the software result and NVidia’s didn’t.

            • Ryu Connor
            • 7 years ago

The standard did not have a uniform compression implementation. Saying it didn’t match the software is irrelevant.

            • Airmantharp
            • 7 years ago

            Wikipedia agrees with Ryu, though I see where both of you are coming from. Had to look it up, but it looks like Nvidia decided to cheapen out with a 16-bit decoder that caused color banding in every GeForce before the FX-leafblowers. Given that it’s on the decompression side that was specified on DXT1-DXT5 while the compression side was patented but left open for tuning, it’s easy to see how such a hardware configuration could be interpreted as a flaw or ‘broken’.

          • Deanjo
          • 7 years ago

Hey, I have a still-open bug report, spanning 5 generations of Radeons, that still isn’t addressed: resizing GDI+ windows at certain resolutions causes systems to hard-lock (non-GCN generations only).

            • willmore
            • 7 years ago

Sounds like the typical Ubuntu “Won’t fix” bug. 🙁

            • Deanjo
            • 7 years ago

            GDI+ is a windows bug.

            • willmore
            • 7 years ago

            Yes, hence the ‘sounds like’. Like meaning ‘similar to’.

          • brute
          • 7 years ago

          reality doesnt factor into what i post

      • 5150
      • 7 years ago

      indeego’s alt?
