AMD ramps up its Gaming Evolved program

On June 6, Nvidia celebrated the 10th anniversary of its The Way It’s Meant To Be Played program. The festivities were well-deserved. These days, it’s almost expected to see Nvidia’s logo among the intro splash screens of PC blockbusters. Anyone who’s picked up, say, Borderlands 2 or Batman: Arkham City can attest to that fact. Seeing the logo usually means Nvidia has collaborated closely with the game’s developers, and it often hints at support for proprietary Nvidia technologies like PhysX and 3D Vision. Nvidia has used the TWIMTBP program to push other, non-proprietary features, as well, like its FXAA antialiasing scheme.

Historically, AMD has had a much lower profile than its rival in that department. Titles that didn’t carry the TWIMTBP badge almost never had an AMD logo in its place, and more often than not, one could expect fresh releases to work more smoothly on GeForces than on Radeons. We experienced that disparity on a grand scale last year, when AMD bungled the release of its Catalyst drivers for Rage, and users had to wait a few days for a driver supporting both the new id Software title and EA DICE’s Battlefield 3 beta. Native support for Radeon-specific features like HD3D, AMD’s stereoscopic 3D implementation, was spotty, as well.

Lately, however, we’ve seen AMD’s developer relations program gain prominence. The program got its own brand back in March of 2010: Gaming Evolved. We saw it bear fruit in August 2011, when Deus Ex: Human Revolution debuted with full AMD branding and native support for both HD3D and Eyefinity, AMD’s multi-display scheme. Earlier this year, DiRT Showdown arrived with a DirectCompute-based global illumination lighting scheme that Codemasters developed in collaboration with AMD. Time and again, we saw AMD cards perform much better than the competition with that lighting mode enabled.

GeForce users had enjoyed similar perks in new games for some time. After years of comparative neglect, Radeon users were now getting exclusive goodies of their own, at long last.


Since then, Gaming Evolved has continued to gather steam. Two major blockbusters released this fall, Dishonored and Sleeping Dogs, are part of the program. So are a number of games due within the next several months: Medal of Honor Warfighter, Far Cry 3, BioShock Infinite, and the Tomb Raider reboot. These are exactly the kinds of high-profile releases that would have, in the past, featured Nvidia TWIMTBP branding. Now, they feature the other team's logo.

Last weekend, we traveled to AMD’s offices in Markham, Ontario and spoke to Gaming Evolved marketing chief Peter Ross, who gave us some idea of what’s been going on behind the scenes.

The program's growing profile is evidently no accident. According to Ross, AMD has increased the size of its developer relations team, on both the marketing and the engineering sides. Part of the recent push has involved giving the developer relations program a name and marketing it explicitly, just as Nvidia has been doing. That effort began a couple of years ago with the introduction of the Gaming Evolved label. AMD has endeavored to work more closely with both developers and publishers, as well.

Interestingly, Ross told us AMD’s recent executive changes have been beneficial to the program. He said the new executive team better appreciates the importance of gaming. Ross also pointed out with some exultation that Rory Read, AMD’s new CEO, has made public statements about the company’s commitment to gaming. Given AMD’s precarious financial situation as of late, it’s telling that the company has seen fit to increase funding for the Gaming Evolved program.

The future looks bright, too. The aforementioned changes all took place more than a year ago. Ross said we’re only just now seeing them produce results, and those changes represent a continued commitment on AMD’s part. This is “not something that just flamed up and will go away,” Ross stressed. Things “will only get better from here.”

What do game developers think about all this? Jurjen Katsman was also there in Markham, and he spoke to us about his company’s collaboration with AMD. Katsman is President of Nixxes, a Dutch firm that’s developed the PC versions of Deus Ex: Human Revolution, Kane and Lynch 2: Dog Days, and several Tomb Raider games. The firm is currently working on the PC versions of Hitman: Absolution and the upcoming Tomb Raider reboot—both Gaming Evolved titles, just as Human Revolution was.

Top: Hitman: Absolution. Bottom: the new Tomb Raider. Sources: Steam, Square Enix.

Katsman made it clear that his company has ongoing relationships with both AMD and Nvidia. The folks at Nixxes “always have a good time” working with both firms, he said, and with Human Revolution, Nixxes was “just as much in touch” with Nvidia as with AMD. Katsman pointed out that engaging both companies is necessary to ensure players get the best experience. Nobody wants their message boards flooded with bug reports and complaints, after all.

Nevertheless, Nixxes seems to favor Gaming Evolved over Nvidia's developer program. According to Katsman, what AMD brings to the table is simply more compelling, and he views the AMD team as more dedicated. While he didn't delve too deeply into specifics, he mentioned that AMD engineers helped Nixxes implement not just Radeon-specific functionality in their games, but also MSAA support and general DirectX 11 features. The two companies collaborate sometimes over Skype and sometimes in person, when AMD engineers visit the Nixxes offices in Utrecht, in the Netherlands.

I asked whether Katsman had seen a stark change in AMD's developer program in recent years, but he didn't seem to think so. The rapprochement between the two companies has happened "organically," in his view. While he conceded that AMD and Nixxes are working more closely on Hitman: Absolution than they did on Deus Ex: Human Revolution, he sees that as a natural evolution of the relationship, given the two firms' history.

Whether developers perceive a sudden improvement or a more gradual one, the effects of AMD's developer relations push are plain to see. Radeon owners should now see more high-profile games cater to their hardware—and, hopefully, work flawlessly on day one. Even folks shopping for a new GPU this holiday season may reap the rewards of AMD's effort. Yesterday, the company announced what might be its most compelling game bundle offer to date, as part of which some Radeons come with free copies of Far Cry 3, Hitman: Absolution, and Sleeping Dogs, plus 20%-off coupons for Medal of Honor Warfighter. I don't recall Nvidia ever offering anything quite so appealing.

Comments closed
    • spigzone
    • 7 years ago

    Ain't rocket science. AMD graphics in all three next-gen consoles. Likely a 2nd-gen GCN/HSA-capable Kaveri in Sony and MS. That's going to take a whole lotta AMD development collaboration with all the next-gen console (and PC) game developers and publishers. As Nvidia brings nothing to the console table, sparing the time and personnel to collaborate with Nvidia isn't going to be high on their priority list.

    The devs see the writing on the wall. In 2014 fully integrated HSA APUs are coming to market. The future of Nvidia OEM graphics don't need to be wearing no sunglasses, and their AIB graphics won't be all that far behind.

    It'd be logical to expect TWIMTBP to fade into obscurity over the next few years, along with developers' collaboration with Nvidia.

      • Silus
      • 7 years ago

      LOL good luck with that! Especially at a time when consoles as we know them won't last long. Tablets and smartphones will take that pie and AMD has nothing worthwhile in that market…and this is assuming that AMD will even survive their current problems, which raises many doubts. Meanwhile NVIDIA has no debt and has all markets secured, with profits and successful products.

        • spigzone
        • 7 years ago

        Wowsies, I didn’t realize that. Thanks for the heads up.

        • Essence
        • 7 years ago

        Nvidia is already slowly pulling out of the gaming market and concentrating on the HPC market; that's where they make most of their money. What Nvidia didn't expect was the compute in the AMD HD7XXX series. Nvidia got whiplash from AMD's release, and the AMD HD8XXX series is going to put Nvidia in a coma… Even Intel is giving Nvidia a run for its money in HPC. Let the games begin…

        I'm not a fanboi, but I refuse to support companies with corrupt company policies, e.g. Intel and Nvidia.

          • Silus
          • 7 years ago

          The compute in AMD 7000? Give me a break…the FirePro line isn't even a contender at this point. They may get some sort of market share, but that's assuming NVIDIA does nothing to counter them…
          Intel is definitely a much more dangerous contender, given its resources, but thus far NVIDIA’s solution is better and more efficient.

          As for NVIDIA leaving the gaming market, what are you smoking? Is reality such a harsh stepmother for AMD fanboys that they refuse to accept it? NVIDIA is NOT leaving the gaming market. What's happening is that the Tegra business is getting more successful and more profitable, and if that market continues to grow, Tegra may become the most profitable division at NVIDIA. But they are not leaving the gaming market, at least while it's profitable, and it has been quite a bit…unlike for AMD.

    • shaq_mobile
    • 7 years ago

    More importantly, I think Lara Croft got a breast reduction.

      • odizzido
      • 7 years ago

      I didn’t like tomb raider when I tried it like 12 years ago, but the new one looks pretty sweet.

        [url]http://www.youtube.com/watch?v=ol_-QGlwRqc[/url]

        • sweatshopking
        • 7 years ago

        [quote]More importantly, I think Lara Croft got a breast reduction.[/quote]
        [quote]the new one looks pretty sweet.[/quote]

        DOES NOT COMPUTE

    • Silus
    • 7 years ago

    Ahh the typical double standards!

    News: “NVIDIA strengthens relations with developers”
    Comments section: “NVIDIA bribes developers”, “They cripple games on other platforms to make them look better”, etc

    News: “AMD strengthens relations with developers”
    Comments section: “Great to hear!”, “Awesome news”, “My <insert Radeon here> will love this”, etc

      • willmore
      • 7 years ago

      If one company is the only input to game studios, games will be biased towards that company's products. If another company starts working with game studios, then the games will work better on *both* companies' products.

      So, the news is *not* “AMD strengthens relations with developers”, but “now both companies will be working with studios”. And, that is good news as there is now hope that the bias will lessen.

      If nVidia wasn't working with game studios and AMD made this announcement, then you would have a good argument, but you're ignoring the context.

        • Silus
        • 7 years ago

        LOL! So you're calling it bias that AMD ignored developer relations while NVIDIA didn't?
        It's like saying that a pro-active person looking for something and getting it means a bias against those that did nothing but still wanted the same thing. Such BS is astounding!

        No, this news is ONLY about AMD strengthening its relations with developers, because they have been historically very bad, and kudos to them. At least that should diminish the amount of drivel from AMD fanboys that NVIDIA's relations with developers means bribes, while poor AMD, which is a saint and doesn't bribe anyone, is left in the dust.

        The reality of course is that games that are part of any of these programs are not about money, but rather an exchange of resources to help optimize the game for the company's products. The only exchange of real money occurs when games are bundled with graphics cards. In that case, both AMD and NVIDIA pay the producers of those games for the copies of the games bundled with their graphics cards.

          • l33t-g4m3r
          • 7 years ago

          One shouldn't need developer relations if their hardware and drivers are up to snuff. Dev rel is just hand-holding for undereducated and spoiled game devs who should learn how to program and troubleshoot on their own, and any help should be appreciated, not taken for granted as some entitlement. Not to say that there isn't a mutual benefit, but it isn't any hardware company's job to write your shader code for you.

            • willmore
            • 7 years ago

            Have you read any of the Phoronix articles about Valve developers working with Intel Linux driver developers? There’s a bit more to it than hand holding poor devs. Don’t get me wrong, there’s always some of that, but there is *more* going on.

            • Silus
            • 7 years ago

            Ah Valve…the developer that was paid handsomely by ATI to have the Source Engine optimized around Radeons. Definitely a good example…

            • Washer
            • 7 years ago

            Ridiculous post. “One shouldn’t need developer relations” is never a correct statement. The process of bringing hardware, drivers, APIs, and the programs that use those resources “up to snuff” requires extensive cooperation among each of those areas.

            What's more, you try to toss out the mutual benefit, which is exactly why AMD should have been pushing these relationships ages ago. It helps both AMD and game developers to work out these problems. It allows AMD to identify issues and improve their drivers. It allows game developers to release less buggy and higher-performing games. Why shouldn't this happen again?

            EDIT: So confused why your 9AM and 12AM posts differ so significantly. Your 12AM post highlights why these relations can be good; you even provided an example! Yet this post is the opposite…

            • sweatshopking
            • 7 years ago

            it’s l33t.

            • l33t-g4m3r
            • 7 years ago

            "Toss out the mutual benefit"? No, I specifically didn't say that, and in fact specifically said the opposite:
            [quote]Not to say that there isn't a mutual benefit, but it isn't any hardware company's job to write your shader code for you.[/quote]

            All I'm saying is that it isn't AMD or Nvidia's job to write shader code for game developers. That shouldn't even be allowed. In fact, that was a large part of the Batman AA fiasco, because Nvidia wrote the AA and then put ID checks in the code to stop ATI users from using it, and nobody could officially get it removed because it was proprietary. Cooperation, yes. Nvidia sabotage, no.

            • Washer
            • 7 years ago

            Shouldn’t be allowed? Who in your crazy mind decides that this type of thing is “allowed” or not? That’s hilarious!

            Your statement specifically disregards the mutual benefit; you mention it and then toss it out as if it should not matter (I bet the person/thing that decides if it should be allowed decides if it matters).

            I can’t make sense of what you’re trying to say.

            • l33t-g4m3r
            • 7 years ago

            You can't? That's your problem, not mine. Yet again you say I'm specifically disregarding the "mutual benefit", when I've said both times I am not. I am not referring to a company helping a game developer or vice versa, I am referring to outright sabotage via a video card company directly writing code for a game that doesn't work properly for the competition. Wow, are you thick!

            Read up on some [url=http://www.brightsideofnews.com/news/2009/11/4/batmangate-amd-vs-nvidia-vs-eidos-fight-analyzed.aspx?pageid=1]history[/url] before looping your broken record again.

            This type of stuff shouldn't be allowed, and there is no "mutual" benefit. The devs were being lazy, and Nvidia was playing dirty tricks. However, if Nvidia was helping the devs to write their own code, and it worked for everyone, that would be different.

          • willmore
          • 7 years ago

          No, I didn’t.

          I said that the bias that exists in some games towards nVidia cards may lessen because of this, as AMD will now be working with devs to improve performance on their platform. I'm saying that bias will *decrease*. Or do you deny that some games are biased towards performing well on nVidia GPUs due to the effort nVidia has put into getting the developers to optimize their games for nVidia hardware?

          You’re making quite a leap to characterize the interaction that AMD will be having with game studios to be anything like what nVidia has done. Since nVidia has done evil crap in the past when they ‘worked with developers’ that means that AMD will be evil when they ‘work with developers’? That’s a bit of a stretch.

          I don't think cash has to change hands to demonstrate foul play when vendors work with developers. Look at nVidia's description of their TWIMTBP program on their own web site. They explicitly say that they give hardware to developers who participate in the program. The cash value of a stack of top-end video cards that you can't even find in stock in stores is pretty nice, but the value they represent to studios is even higher. Hell, they can even have access to *unreleased* hardware. Don't misunderstand me. I think that studios should have access to this hardware--but from *all* vendors of hardware. That way, the games will get optimized more generically rather than just tuned for a particular architecture. We don't want another Crysis 2 tessellation mess.

          Edit: Spelling, WTF.

            • Silus
            • 7 years ago

            You know, repeating that there was a bias doesn't make it true. But I guess that won't stop you…

            Then there’s this:

            “You’re making quite a leap to characterize the interaction that AMD will be having with game studios to be anything like what nVidia has done. Since nVidia has done evil crap in the past when they ‘worked with developers’ that means that AMD will be evil when they ‘work with developers’? That’s a bit of a stretch.”

            Here’s what I said:

            "No, this news is ONLY about AMD strengthening its relations with developers, because they have been historically very bad, and kudos to them. At least that should diminish the amount of drivel from AMD fanboys that NVIDIA's relations with developers means bribes, while poor AMD, which is a saint and doesn't bribe anyone, is left in the dust.

            The reality of course is that games that are part of any of these programs are not about money, but rather an exchange of resources to help optimize the game for the company's products. The only exchange of real money occurs when games are bundled with graphics cards. In that case, both AMD and NVIDIA pay the producers of those games for the copies of the games bundled with their graphics cards."

            So when I say that these programs are meant to optimize games for a company's products and that there's nothing wrong being done in them, somehow in your twisted brain what I said was "if NVIDIA was evil, AMD is going to be evil too". You know, that's why I often use the term fanboy around here, because only fanboys can read something like what I wrote and come up with something entirely different and twisted based on what they read. Actually I don't even think you read what I wrote, which makes it even worse…

            Also, speaking of NVIDIA's "evil crap", can you point me to one of those "evil crap" examples? And I'm not talking about speculation started by people like you…I'm talking about actual proof of said "evil crap". Oh, and here are a few that you'll certainly talk about:

            Batman AA – no “evil crap” was done. NVIDIA provided their algorithms for Anti-Aliasing, which will only work on GeForces. The game developer asked AMD for their AA algorithms and AMD never provided them. If anything the game should’ve shipped only when both provided their algorithms, but not shipping a game at the target date means more money spent on it and that’s not always easy to do, especially when it’s because of a third party entity.

            Assassin’s Creed DX10.1 – No “evil crap” from NVIDIA. Ubisoft sent out a statement letting people know that they had a bug in their 10.1 path and they had to remove it from the game until they fixed it. That’s what we know. Care to add more to that with actual proof of NVIDIA’s “evil crap” doing ?

            Then there's the Crysis 2 example, which isn't an example at all. Just because you don't like that tessellation is used at high levels doesn't mean it can't be used that way. And NVIDIA also doesn't have to limit their tessellation capabilities in GeForces just because AMD's Radeons suck at it, just like AMD doesn't have to limit their compute capabilities just because the newer GeForces aren't focused on compute.

            • l33t-g4m3r
            • 7 years ago

            Every single one of your game examples has been debunked and shown to be malicious.
            [url]http://www.brightsideofnews.com/news/2009/11/4/batmangate-amd-vs-nvidia-vs-eidos-fight-analyzed.aspx?pageid=1[/url]

            Batman AA was evil. Nvidia provided a GENERIC MSAA algorithm that DOES work on AMD hardware if you fake the vendor ID. Not only that, but Nvidia coded it so that the AA code normally runs on AMD cards without doing resolve.

            [quote]"'Amusingly', it turns out that the first step is done for all hardware (even ours) whether AA is enabled or not! So it turns out that NVidia's code for adding support for AA is running on our hardware all the time - even though we're not being allowed to run the resolve code! So… They've not just tied a very ordinary implementation of AA to their h/w, but they've done it in a way which ends up slowing our hardware down (because we're forced to write useless depth values to alpha most of the time...)!"[/quote]

            TR has also done articles on both Assassin's Creed and Crysis 2; you should read them. Claiming Nvidia is innocent will get you nowhere. The court of public opinion has already adjourned, years ago even.

            • Silus
            • 7 years ago

            Is this what you're talking about?

            [url]https://techreport.com/news/14707/ubisoft-comments-on-assassin-creed-dx10-1-controversy-updated[/url]

            Care to show me where any evil crap was done? Ubisoft quite clearly exposed the situation, even with TR's questions. And what about Crysis 2? What evil crap was done?

            As for Batman AA, so NVIDIA did do evil crap? Ok, that's one!

        • PixelArmy
        • 7 years ago

        The article makes it sound more like they are just forming their own camps which is the same thing… Unless games are supported by both TWIMTBP and Gaming Evolved simultaneously. Are they?

        Game A being supported by TWIMTBP and Gaming Evolved (which keeps everyone honest)
        [b]is not the same as[/b]
        Game A supported only by TWIMTBP and Game B supported only by Gaming Evolved (two separate evils*).

        * Though personally, so long as the game isn't unplayable by the other camp, I view it more as "value added" features and say bring them on.

      • Washer
      • 7 years ago

      Unless you work for AMD or Nvidia you are not part of their competition. It’s easier to not take sides than choose a favorite.

      Better relationships between game and hardware developers are only a good thing. I'd prefer it if Game X developers worked with both Nvidia and AMD to improve games and drivers. We are probably still a ways from that, since these relationships are still most strongly linked at the marketing level, but this is a step in that direction.

      • rrr
      • 7 years ago

      Yo dawg, I heard you like trolling, so we put a troll in your troll so you can troll while you troll.

    • Novuake
    • 7 years ago

    Loving this news! Proud owner of a Sapphire HD 7950 OC Edition. As soon as it is bundled with a nice game I will seriously consider getting another for XFire. Sapphire is usually very generous with the games it bundles in. AMD, we thank you…

      • Arclight
      • 7 years ago

      Does ur model have the dual bios switch? Have you tried flashing it to a 7970?

        • BestJinjo
        • 7 years ago

        You can't flash an HD7950 into a 7970 as it's laser-cut. Also, at 925MHz, the HD7950 is just 5-6% slower than a 7970. All you need to do is overclock the 7950 and save yourself $100+.

          • Arclight
          • 7 years ago

          The 7970 BIOS does give you increased voltage limitations and better memory timings. But of course if you don't mind OCing through a program that loads at start-up, there's no benefit in flashing. I think I just had that reaction that Louis CK described: you can do this now (use the dual BIOS switch), to which I respond "OMG! I'M GONNA DO IT".

            • Firestarter
            • 7 years ago

            Increased voltage limitations, better timings? Can you elaborate? AFAIK, the 7970 has a higher stock voltage (1.175V IIRC), but neither the BIOS nor the drivers allow voltage adjustments without a tool like MSI Afterburner. And by better memory timings, do you mean higher stock memory clocks or stricter/looser memory timings at the same clock rate? Higher clocks seem obvious, but I haven't heard about RAM timings being a factor.

            Anyhow, having a higher stock voltage would be enough reason to flash a 7970 BIOS. I have my 7950 clocked at 1100MHz with the voltage set at 1.187V (voltage measured under load is more like 1.13V), but using MSI Afterburner is a bit messy. If you accidentally close the program or it fails to set the voltage, you're in BSOD land.

            • Arclight
            • 7 years ago

            [quote="Firestarter"]Increased voltage limitations, better timings? Can you elaborate?[/quote]

            All the info I got was from the chatter on different forums after the cards were released. Of course the voltage will be bumped up to the 7970 spec, and the RAM speed and timings as well. By voltage limitations I meant the power configuration within the CCC. IIRC, since the introduction of the 6000 series there has been a PowerTune feature in Catalyst that lets you tweak the TDP of the card somewhat, or force it to throttle down if it exceeds a certain value. I presume the 7970 has a bigger threshold than the 7950. I don't know much more really; I have a GTX 560 Ti at the moment. Best way to find out is to try it, if your model has the BIOS switch.

    • Arclight
    • 7 years ago

    [quote]We experienced that disparity on a grand scale last year, when AMD bungled the release of its Catalyst drivers for Rage, and users had to [b]WAIT A FEW DAYS FOR A DRIVER SUPPORTING BOTH THE NEW id Software TITLE AND EA DICE's Battlefield 3 beta[/b].[/quote]

    1st World Problem.

    Me: Someone pick up the phone.
    You: Erhm what? Why?
    Me: Because I f*cking called it.

      • Cyril
      • 7 years ago

      [quote]1st World Problem.[/quote]

      As opposed to the genuine humanitarian crises that affect other aspects of PC gaming? 😉

        • Arclight
        • 7 years ago

        Purportedly.

      • Medallish
      • 7 years ago

      Wait... The Rage problem was solved almost instantly, first with a hotfix and then with a full-on driver. It wasn't just a driver issue, however; the game was buggy as hell. But yeah, I agree it's definitely a 1st world problem. Even as a gamer, a couple of days (less than a day, if I remember correctly) isn't a real problem.

        • l33t-g4m3r
        • 7 years ago

        No. AMD fubar’d everything. The hotfix was the wrong version, and the “fixed” version still had problems. It wasn’t really fixed for well over a month. AMD doesn’t prioritize OpenGL, so it took pretty long to actually fix. Also, Rage was not “buggy as hell”, that was fud from idiots and trolls. The issue was with the auto mode, which Carmack fixed faster than AMD fixed their drivers, and he even wrote a workaround for AMD’s bugs.

          • Arclight
          • 7 years ago

          I remember that time so vividly, i felt like the world was crumbling around me.

    • l33t-g4m3r
    • 7 years ago

    I have nothing against AMD and Nvidia working with game developers, and overall I think it helps move things forward, but both companies like to sabotage their competition by implementing some specific feature that doesn't run well on the other guy's hardware. Nvidia did this with tessellation; AMD did this with global illumination. This only hurts the consumer, who will resent both the game developer and the company that sponsored the performance-sapping hack. Deus Ex: Human Revolution is a perfect example of how things should be done, and I hope it sets a standard of cooperation instead of sabotage for future titles. That said, I actually think that this is how things are moving with AMD here, and the more of this the better, since we'll see fewer screw-ups like Rage.

      • Medallish
      • 7 years ago

      Tessellation normally wouldn't be a bad thing for gaming, but when nVidia has Crytek tessellating the living crap out of a square concrete block, you realize it's just there to slow down cards that can't do it as well, which is a shot in the dark because, as we see now, games that use tessellation tend to run better on AMD cards today.
      Global Illumination, however, isn't completely the same. Had the 600 series been good at compute, we wouldn't be thinking that the game was overly optimized for AMD. Global Illumination uses DirectCompute to calculate lighting; it's not an AMD-specific feature. The worst offender imo is PhysX, which is made to hurt the consumer, especially when nVidia pulls the move of artificially disabling support for an add-in PhysX card when a different card is used as the main card.

      I'm not saying AMD doesn't have the capacity to use shady tactics to hurt the competition with certain optimizations, but currently I don't see them forcing anything irrelevant into games just to hinder the competition.

        • Silus
        • 7 years ago

        Neither is tessellation an NVIDIA feature. Had AMD actually been good at tessellation, you wouldn't see games that use it run better only on NVIDIA cards. There are games other than Crysis 2 that used tessellation and ran better on NVIDIA cards than on AMD cards. Your whole argument is just an excuse and the typical double standard. When AMD fails at something, it's NVIDIA doing something shady. When AMD shines at something, it's not because AMD did anything shady, it's just that NVIDIA sucks at it…

          • Arclight
          • 7 years ago

          Tell us a non-TWIMTBP title that has tessellation where AMD falls drastically behind Nvidia in frames per second.

            • sweatshopking
            • 7 years ago

            Unigine? not so much anymore, but with the 5k series, it was a slaughter.

            • Arclight
            • 7 years ago

            ….Unigine, 5k series? Try harder.

            • sweatshopking
            • 7 years ago

            WELL, you did ask. That WAS the last time nvidia had a major advantage. I wasn't saying it mattered today, but it was the last major time.

            • Arclight
            • 7 years ago

            I did no such thing. Read my post again, there is no past tense. I was referring to the present card lineup and present games.

          • l33t-g4m3r
          • 7 years ago

          Look, Silus. Crysis 2 was a TWIMTBP title that was created while Nvidia still had superior tessellation performance. That's why square bricks were tessellated to hell. AMD could only tessellate 1 triangle per pass, while Nvidia could do four, and Nvidia paid the developer off to implement those shenanigans. The extreme tessellation of bricks added nothing to the quality of the game, and it was done solely to screw the competition. AMD had the last laugh though, since their GCN architecture now outperforms Nvidia. Tessellation wasn't invented to use on square bricks; it was invented to improve model quality, like in the Heaven benchmark.

          Another point: DX11 is a forced standard. Don't start with this exclusive "feature" bull that got so out of hand in DX8-9 that Microsoft had to forcibly unify feature parity starting with DX10. PhysX and CUDA are actually an end-run around standardization, since they are designed to only work on Nvidia hardware, and nobody should support lock-in, whether or not it currently works for Nvidia. Lock-in is a complete scam that screws everyone, including Nvidia users (look at how poorly PhysX still runs, making a second card a requirement).

          Also, it doesn’t matter whether AMD or Nvidia can “SHINE” at some niche scenario, neither of them should be exploiting it to the detriment of the gaming community. That’s pure evil.

        • Alexko
        • 7 years ago

        Exactly. There is no indication whatsoever that AMD’s implementation of global illumination does anything unreasonable just to hurt performance on NVIDIA hardware.

        Calling it even, in this case, is just not fair.

          • l33t-g4m3r
          • 7 years ago

          I'm not saying it does. What I am saying is that AMD knows that it's a weak spot for Nvidia's hardware, and immorally took advantage of that fact. Global illumination is not a widely used lighting method, for good reason, same as ray tracing. Thankfully, it isn't the only way to play the game.

          I agree that AMD hasn’t come anywhere close to nvidia when it comes to shenanigans, but that doesn’t excuse them when they do. I like to think they have higher standards.

          More importantly, both companies need to work on hitting a constant 120 fps for 3d, and that can’t happen with all this pointless sabotage, feature creep, bloat, or whatever else.

            • DaveBaumann
            • 7 years ago

            I've said this before elsewhere and I'll say it again here – when the work on the Leo demo occurred, or the engagement with devs on the techniques demonstrated by Leo (i.e. Forward+ and Global Illumination rendering via DirectCompute), there was no way for us to know that Kepler would be relatively weak specifically in this area. When Kepler became available and we saw what the performance disparity was like, we were somewhat shocked.

            From a graphics progress perspective there wasn't much reason to expect this to be the case either. While relatively easy-to-grasp features of DX11, like tessellation, grab headlines, it's DirectCompute that really provides some of the biggest benefits and is probably the area that allows the most developer creativity, as it gives rise to new techniques and algorithms. DC has been increasing in use in DX11 titles and will continue to do so as devs get further entrenched in it going forward. Global Illumination is also a technique that has long been sought after in realtime rendering, and DC now enables techniques to achieve it in an affordable manner – indeed, not only are we busy helping to get it implemented in current titles, but NVIDIA are evangelising for future titles as well: [url]http://blogs.nvidia.com/2012/06/its-unreal-glimpse-the-future-of-gaming-with-epics-ue4/[/url]

            • Alexko
            • 7 years ago

            Immorally? How on Earth is any of this immoral?

            I wouldn’t blame Nvidia for pushing tessellation, or whatever might work better on their GPUs, and I don’t blame AMD for this either. Global illumination adds realism, and is optional, no one loses anything here.

            The problem is that Nvidia doesn't stop at pushing tessellation; they do things like tessellating the hell out of [i]flat[/i] surfaces (jersey barriers in Crysis 2), or objects that eventually are not even rendered (the sea in Crysis 2). Or they add a vendor ID check that disables a completely standard implementation of MSAA when an AMD card is detected (in a Batman game, Arkham Asylum I believe). If I recall correctly they pulled similar crap in Hawx 2 with absurd levels of tessellation in distant objects. They also made Ubisoft remove DX10.1 support from Assassin's Creed with a post-release patch because AMD had DX10.1 support and they didn't. In Batman they [i]removed[/i] cloth banners that were animated with PhysX when PhysX was disabled. They didn't reduce the granularity of the cloth animation, or even remove all cloth animation, they just removed the cloth altogether.

            Pushing for features that work better on your hardware is fair game and common sense, especially if these features have real benefits, which they usually do. It helps the gaming industry move forward with new, better technologies, so gamers win, even if it makes one vendor look a bit worse. Pushing it to the point that it hurts performance for no benefit whatsoever crosses the line into sabotage territory, where gamers lose, one vendor looks much worse and the other much better—except to people who know about the sabotage, that is.

            Nvidia tends to cross that line far too often, and AMD has never crossed it in recent history, at least not to my knowledge. That's a huge difference.

      • WaltC
      • 7 years ago

      Meh…what matters with all of these so-called “features” that various products support is what game developers do with them. Most of the time a specific “bullet-point” feature in a game is way overdone–it’s either used far too often or else improperly supported or some combination thereof. There have been many games which have thrown in “tessellation” as a bullet-point and it’s obvious–the game slows way down without one iota of improved image quality when tessellation is turned on–that’s not “tessellation” that’s just poor game development. Same with “motion blur” and many of the other features–they are often so overdone, used at the wrong times and wrong places, so that you *know* the game developer has no clue as to what he’s doing.

      Simply put, does it matter if feature X can be turned on in a game, delivering higher frame rates relative to another gpu, [i]if you can't tell the difference in the scene whether the feature is turned off or on[/i]? No, it doesn't matter at all. If there's no attendant increase in image quality, then the fact that gpu A runs a scene faster than gpu B when the feature is turned on means a whole lot of nothing. I like it when I can tell the difference between gpus based on how they render in the majority of 3d games. I see that as an advantage instead of a disadvantage, since it provides me with a [i]choice[/i] of rendering capabilities and rendering styles. I do not expect all gpus to render equally, and were they to do so I would be disappointed. Where's the fun in that?

      Mini editorial:

      Edit: I want AMD to make it, and I'm a fan boy because I am a fan boy of competition. I still remember paying $500-$600 for Pentiums (honestly, I cannot remember how much I paid for 2/3/486's or MC680xx's except that it was way too much), and today's economies of scale ought to render a return to those days impossible. Still, I worry.

      I hope AMD succeeds in this and many other gambits, but one thing I don't want to hear out of AMD is, unfortunately, what I'm starting to hear from Rory Read--that "AMD is not shooting for the x86 performance crown anymore, but wants to live in Intel's 'value sector' shadow." As much as I hate to say it, I clearly recall [i]every single one of Intel's former x86 cpu competitors[/i] (Cyrix, etc.) saying *the same thing* shortly before they turned belly up, died, floated away, and started to stink (OK, I admit that's a bit much...;)) But it is true, all the same. Attitude and goals are invaluable inside a company. If you believe you are going to lose then you have already lost. AMD under Sanders was the only one out of all of the rest of them that never made remarks about "being willing to live in Intel's shadow" and feeling fortunate to soak up the crumbs falling from Intel's table. Thus, those Intel competitors died once-upon-a-time while AMD lived and thrived. Read needs to take heed here.

      AMD challenged Intel head-on...and won! Actually won, for at least a year, which is an aeon in this business. And AMD shaped the direction of 64-bit x86 desktop computing in such a way that Intel was *compelled* to follow (with Core2 & later) even when Intel clearly did not wish to do so. But now, Intel has soundly beaten AMD at its own cpu game. Come on, AMD--please get your head out of your posterior, get *original*, and start making change happen instead of sitting on the sidelines scratching your head and waiting on other companies to tell you what to do and where to go. You bested Intel once when all the prevailing wisdom at the time said you could not--that's why I believe you can do it again.

      It's all but certain, though, that if you believe Intel will simply leave you alone and let you live in its shadow--I think you need to think again. Such behavior is not in Intel's nature, and as a company AMD will not long survive with that philosophy. AMD cannot, of course, "dominate" Intel--that will never happen. Right now, Intel is too big for AMD's lasso. But AMD can and *must* beat Intel occasionally in the cpu arena--just every now and then--to stay in the game. AMD, you are already eating Intel's lunch on the gpu side of the fence and have been doing so for years--so Read, you have much to be proud of in AMD--trumpet your major victories when you have them! Don't piss them away with a sad attitude.

      Getting closer to game developers is a much-needed and logical step. Just don't apologize to Intel for being better at gpu building as if to placate Intel (800-lb gorillas in the room will not be placated.) Bad management has sunk far more than one ship in the past. Commodore upended and went straight to the bottom because of bad management, because of CEOs and other officers who didn't know what C= had going for it, and who could only see what the competition had that C= didn't. Don't emulate C=, don't emulate 3dfx, don't emulate--well, tech graveyards are full of the decaying superstructures of once high-flying tech companies that had the world by the tail but let it go through some combination of cowardice and bad management. I cringe at the thought of having to buy my first Intel system since 1999, when I built my last Intel box. Don't throw in the towel through a singular lack of ambition and imagination and put me in that position again, AMD. Please, asking as nicely as I know how.

        • squeeb
        • 7 years ago

        [quote]tech graveyards are full of the decaying superstructures of once high-flying tech companies that had the world by the tail but let it go through some combination of cowardice and bad management.[/quote]

        SGI comes to mind...

    • Bensam123
    • 7 years ago

    Hmmm… it should be interesting to see if AMD has had a natural handicap all these years because they simply didn’t work with game developers (as far as performance goes). That could potentially have a major impact on how the entire graphics card scene is perceived…

      • spigzone
      • 7 years ago

      And the natural handicap Nvidia has from having not the slightest piece of the next-gen consoles?

      What AMD did in the past doesn’t mean much when every console game developer will, by necessity, be working closely with them into the future.

      Working closely with Nvidia will be an option.

    • willmore
    • 7 years ago

    So, maybe it was a good time for me to buy that HD7850 to replace the old GF9800GTX+? Yay?

      • Arclight
      • 7 years ago

      Nay. It's high time you waited for the 8000 series; if you've waited this long, you can wait 2 or 3 months longer.

        • nanoflower
        • 7 years ago

        And then buy that HD7850 when it goes on sale to free up space for the 8850 replacement.

          • Arclight
          • 7 years ago

          If price/performance favors that investment, sure. It's a win either way imo.

        • willmore
        • 7 years ago

          I needed to upgrade. The GF9800GTX+ wasn't going to cut it in Borderlands 2.

          • Arclight
          • 7 years ago

          Suit yourself, just don’t let buyer’s guilt affect you in a few months.

            • willmore
            • 7 years ago

            I got a good card for a fair price. The new driver improved performance for free. Now I find that future games will be more optimized for my card. Nope, I’m good. 🙂 Plus, I’m playing BL2 a bunch and that’s fun.

            Edit: FWIW, I didn’t thumb you down, I think you made a valid point.

        • BestJinjo
        • 7 years ago

          The HD8000 series is rumored to launch between March and June, a far cry from 2-3 months from now. Not to mention there is no way the HD8850 will cost $160-185 like the HD7850 costs now.

          • Arclight
          • 7 years ago

          Oh snap, didn’t read about that new rumour. Thought it’d be in January or something.

      • rrr
      • 7 years ago

      More than twice as fast, though obviously it depends on the PSU and CPU you have.
