AMD clarifies discrepancies in Computex Radeon RX 480 demo

During AMD's presentation at Computex 2016, the company showed no fear as it compared the new Polaris-based Radeon RX 480 GPU to Nvidia's GeForce GTX 1080. The $700 Pascal-based powerhouse was tested in Ashes of the Singularity against a pair of the Polaris GPUs and came away a little embarrassed. AMD presented a slide claiming the GTX 1080 managed only 58.7 (we presume average) FPS, while the Radeons pushed out 62.5 FPS at the same settings.

Sharp-eyed viewers, including a few of you here on TR, noticed some oddities in the numbers and in the video stream. Units and world geometry weren't in quite the same places, and one of the renders appeared to be using different quality settings from the other. AMD also claimed 98.7% GPU utilization for the GTX 1080, while the RX 480 cards were supposedly only 51% loaded. What was going on?

Well, yesterday, Robert Hallock, AMD's Global Head of Technical Marketing, made a post in the official AMD subreddit attempting to explain what was causing the confusion. He stated that both tests were run on the same Core i7-5930K-equipped machine, differing only in GPU configuration. He also confirmed that the tests were run with the same settings: "Crazy" quality at 1080p with 8x MSAA and v-sync off.

For starters, Ashes of the Singularity uses a random seed to procedurally generate certain game assets, leading to slight variation in the way the benchmark looks. More interestingly, though, Robert says the GTX 1080 is "incorrectly executing" some of the shader code in the benchmark, which leads to less snow pile-up on the terrain. In his words, "the GTX 1080 is doing less work to render AOTS than it otherwise would if the shader were being run properly." According to Robert, the RX 480 is correctly executing the terrain shaders, leading to more procedurally generated snow—and worse performance.
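To illustrate the seeding point (a generic sketch of the technique, not Oxide's actual code): a benchmark that seeds its procedural generation from a fixed value lays out assets identically on every run, while one seeded from system entropy produces a slightly different scene each time.

    #include <cstdio>
    #include <random>

    // Hypothetical asset placement driven by a PRNG. With the same seed,
    // positions are identical on every run; with an entropy seed, they
    // vary run to run, much as the two video streams differed.
    void place_assets(std::mt19937 rng)
    {
        std::uniform_real_distribution<float> pos(0.0f, 1024.0f);
        for (int i = 0; i < 3; ++i)
            std::printf("asset %d at x = %.1f\n", i, pos(rng));
    }

    int main()
    {
        place_assets(std::mt19937{42});                      // fixed seed: repeatable
        place_assets(std::mt19937{std::random_device{}()});  // entropy seed: varies
        return 0;
    }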

Robert did not go to great lengths to explain the GPU usage metrics. He did, however, say that the RX 480 setup was CPU-limited in the "single" or "normal" batch test, and that this was the reason for the low 51% GPU usage number given in the presentation. GPU usage climbed to 71.9% in the "medium batch" test, and again to 92.3% in the "heavy batch" test. Robert stated that the final multi-GPU scaling in these tests was 183% versus a single GPU.
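A quick bit of arithmetic puts that scaling figure in perspective: 183% scaling means the pair delivers 1.83 times one card's frame rate, so working backward from the 62.5-FPS dual-card result implies roughly 34 FPS for a single RX 480 in this test. That single-card number is our inference, not anything AMD published.

    #include <cstdio>

    int main()
    {
        const double dual_fps = 62.5;  // AMD's dual-RX 480 result
        const double scaling  = 1.83;  // AMD's quoted multi-GPU scaling

        // Inferred single-card frame rate (not an AMD-published figure).
        const double single_fps = dual_fps / scaling;     // ~34.2 FPS

        // Per-card efficiency: 1.83x one card's work, split over two cards.
        const double efficiency = scaling / 2.0 * 100.0;  // 91.5%

        std::printf("single RX 480: ~%.1f FPS, per-card efficiency: %.1f%%\n",
                    single_fps, efficiency);
        return 0;
    }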

Ashes of the Singularity runs on DirectX 12 and uses Explicit Multi-Adapter mode for multi-GPU processing, so the GPUs were not running in Crossfire mode. As such, these results are not actually indicative of Crossfire scaling on the RX 480. Still, this performance level is impressive for a pair of $200 graphics cards. We can't wait to see what just one can do.
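For readers curious what "explicit multi-adapter" means in practice: under DirectX 12, the application, not the driver, enumerates each GPU and creates an independent device on it, then decides for itself how to split the frame. Here is a minimal sketch of the enumeration step (error handling omitted, and not drawn from Oxide's code):

    #include <d3d12.h>
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    // Create one D3D12 device per hardware adapter, as an explicit
    // multi-adapter renderer would. The driver does no automatic
    // CrossFire/SLI-style linking; workload splitting is up to the app.
    std::vector<ComPtr<ID3D12Device>> CreateDevicesForAllAdapters()
    {
        ComPtr<IDXGIFactory4> factory;
        CreateDXGIFactory1(IID_PPV_ARGS(&factory));

        std::vector<ComPtr<ID3D12Device>> devices;
        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0;
             factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                continue;  // skip WARP and other software adapters

            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                            D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device))))
                devices.push_back(device);
        }
        return devices;
    }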

Comments closed
    • sweatshopking
    • 3 years ago

    NEED ANYMORE FANBOYS IN THIS THREAD?!

      • biffzinker
      • 3 years ago

      nvidia/amd = butt-hurt fanboys?

    • barich
    • 3 years ago

    I’m really looking forward to the RX480. The VIA and nVidia chipsets for the Athlon 64 just aren’t stable enough. I hope ATI can do better.

    • Sonk
    • 3 years ago

    I really would love to see how the 480 compares to the GTX 1070 and 1080

      • biffzinker
      • 3 years ago

      Yup I’m a dumb shit alright even though it’s a 50/50 split between Nvidia/AMD for past GPU’s I’ve owned.

      • Spunjji
      • 3 years ago

      Go home, you’re drunk.

    • Ummagumma
    • 3 years ago

    I am glad that stuff like this doesn't bother me. I lost my desire to play video games long ago...

    • Ninjitsu
    • 3 years ago

    > He did however say that the RX 480 was CPU-limited in the "single" or "normal" batch test, and that that was the reason for the low 51% GPU usage number given in the presentation.

    Well that was obvious (https://techreport.com/news/30222/amd-polaris-powered-radeon-rx-480-will-ring-in-at-199?post=982327). But yeah, I don't fully understand the results or the implications of any of this.

    I'm seeing data that suggests that the 1080 in their testing gets a "score" of between 4700 and 5100 for "Crazy 1440p" (most results are 5100 and 5000), and the dual RX 480s get around 5000. FPS varies accordingly. "Extreme 1440p" ends up with the Radeons getting a much higher score (5800 to 6500) and a single RX 480 gets 3900 (can't find the 1080 for Extreme).

    My take from this is that the benchmark is extremely variable and perhaps shouldn't be used to draw too many conclusions from. I don't know what "CPU FPS" means, but if it's a measure of overhead then Nvidia has more room to breathe than AMD has (pointing towards drivers, I suppose). Finally, if the 1080 isn't rendering correctly or whatever, is it possible that it's over-utilizing things?

    Anyway, here are the links from AMD's test runs:

    GTX 1080 (good run) (Crazy 1440p): http://ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/3e58c785-b1bc-49ae-a6b9-e2faab515279
    RX 480 Crossfire (Crazy 1440p): http://ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/ac88258f-4541-408e-8234-f9e96febe303
    RX 480 Crossfire (good run) (Extreme 1440p): http://ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/0561a980-78ce-4e24-a7c3-2749a8e33aac
    RX 480 (Extreme 1440p): http://ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/8b748568-fc96-4e48-9fed-22666a7149f5

    BTW Crazy > Extreme; Crazy has 16 million terrain shading samples vs 8 million for Extreme.

    p.s. They seem to have just 1 result for RX 480 Crossfire with "Crazy 1440p".

      • chuckula
      • 3 years ago

      I find the “extreme 1440p” to be a more reasonable resolution and quality setting level (not dumbed-down low but not the rather dubious “crazy” setting level).

      Here’s a single GTX-1080 scoring well above the multi-GPU 480 in your link at “Extreme” 1440p:

      http://ashesofthesingularity.com/metaverse#/personas/812cec85-aeeb-4325-b42b-2cb79f76c78b/match-details/4bef6264-8419-45b0-b84a-e520627316ce

      Edit: And for those of you who think I'm being unfair to AMD in not using the "Crazy" preset, here you go. GTX-1080 at 1440p "Crazy" mode:

      http://ashesofthesingularity.com/metaverse#/personas/89760c37-0f1f-4a01-9f08-af807deaaa3e/match-details/599c8698-507a-4bbd-8936-a333cba7274e

      Note that the GTX-1080 wins both benchmarks by pretty noticeable margins at 1440p over the "multi GPU" 480 setup.

        • Ninjitsu
        • 3 years ago

        Your first link’s to SLI’d 1080s…the third and fourth results from the top are for a single card, which is 6300, and in the same range as the Radeons…

        …which basically confirms what is known from AMD’s results – they both perform roughly the same in AoS under some conditions and the utilization figure is at best meaningless, and at worst indicative of bad scaling (100% scaling would be around 7800 points for Extreme 1440p).

        Second link is interesting because it’s with a higher clocked Skylake as opposed to Haswell-E, so it seems that despite the “98% utilization”, the 1080 still had at least 13% headroom (comparing the scores)…

        Long story short, AMD was doing marketing stuff and it’s worth ignoring till an actual independent and reliable comparison between the RX480, 1070 and 1080 is made.

          • chuckula
          • 3 years ago

          Good point, I corrected the first link to a non-SLI result. A single GTX-1080 still wins but has a more pronounced dropoff going to heavier batches. Interestingly, the SLI setup didn’t really boost the lighter workloads but was more effective at maintaining the FPS at higher workload levels.

          • chuckula
          • 3 years ago

          One More Thing (TM):

          Here’s is your link to the *single* 480 in “Extreme” at 1440p: [url<]http://ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/8b748568-fc96-4e48-9fed-22666a7149f5[/url<] Here's another link to a *single* R9-390 at the exact same setting at 1440p: [url<]http://ashesofthesingularity.com/metaverse#/personas/c1aff257-b8b9-4326-bdc0-7deeb34eaa96/match-details/1c2e1c3b-d89f-47f9-bceb-f909145ec393[/url<] [Edit: For anybody thinking the R9-390 link above is really for an R9-390X, here's another test result for an "R9-390 class" GPU that's well ahead of either of those two scores. This is a true R9-390X, and I'm putting it in here to show that I wasn't misidentifying my references: [url<]http://ashesofthesingularity.com/metaverse#/personas/d3ae008b-7de1-458d-b450-8a7ff0a220b0/match-details/379a8165-6fd5-48be-9c44-5b8d484f5d2d[/url<] ] Notice something interesting? Like the fact that the results are basically tied but the R9-390 is actually winning by a very small average margin? Additionally, the CPU for the R9-390 is a rather pedestrian Skylake 6600 which is certainly lower end than the 5930K that AMD used for its official single-card 480 results.

            • Spunjji
            • 3 years ago

            Given margin of error I’d definitely call that a tie. Not sure CPU is relevant here as they’re both sufficient to make that scenario GPU limited in DX12.

            That puts performance pretty much where people have been expecting it. Still disappointing, though, as I was hoping there was more than just hot air to their per-processor optimisation claims.

            • chuckula
            • 3 years ago

            Despite the massive downthumb storm that’s been going on, you’ll see that I’ve been saying Polaris 10 is basically going to land between the R9-390 and the R9-390X for quite some time… mostly because that’s basically what AMD has told us.

            Once again, it’s not that a $200 card with that level of performance is “bad”, it’s when AMD’s marketing squad tries to pretend that the $200 card is really a massively better card than it actually is.

            • bjm
            • 3 years ago

            > it's when AMD's marketing squad tries to pretend that the $200 card is really a massively better card than it actually is.

            LOL, so you're mad that AMD's *marketing* department is trying to overhype a product? Maaan, you are really trying hard.

            • maxxcool
            • 3 years ago

            Every time I read marketing slides from AMD I need a shower. They're eerily designed like tobacco product ads, made to appeal to people who don't know any better.

            • bjm
            • 3 years ago

            Indeed, they are garbage, you won't see me defending that BS. But on the other hand, so is almost every other marketing department. After all, that is why we are all at sites like Tech Report. Since even before the Quake/Quack fiasco, we should all take every company's slides with a grain of salt until verified by multiple independent reviews.

            To fault AMD for it and act as if they’re the only devil in town, like chuckula does, is idiotic. And I say that not in defense of AMD (hell, I haven’t bought an AMD product since the 4800 series), but rather out of annoyance from chuckula’s holier-than-thou attitude in his crusade against AMD fanboys when he’s no better than one.

    • floodo1
    • 3 years ago

    Who uses 1080p still? (-8

    • puppetworx
    • 3 years ago

    I get the feeling this comparison setup was chosen either a) because it shows the RX 480 in the best possible scenario or b) because it makes the true average gaming performance of a single RX 480 very hard to extrapolate.

    Obfuscation like this makes me very interested in reading the reviews. Now let’s just hope AMD actually deliver one to TR for testing.

      • JustAnEngineer
      • 3 years ago

      Let’s hope that TR reviews all of the new graphics cards from both GPU suppliers as they appear in the market.

    • CScottG
    • 3 years ago

    Ashes of the Singularity + 1080P = nearly useless test of GPU performance.

    It says more about the drivers for the game than anything else.

    See testing at 4:35 here:

    https://www.youtube.com/watch?v=-ZG54Da2pMM

    It's only once you move higher in res. that the GPU starts to suffer performance drops, as seen at *4K*, 4:55 into the video.

      • PadawanTrainer
      • 3 years ago

      I have no idea why you got downvoted for this…

      • Pancake
      • 3 years ago

      Not only that, it’s a B-grade title that not many people will care about. I generally buy graphics cards to play particular mega titles e.g.

      GTX 970 to pay respect to GTA V
      6950 for Tomb Raider and that lovely Tress FX
      9600GT for Mass Effect

      and so on. Also only buy consoles when there’s a must-play title.

      AMD banging on about AotS is both embarrassing and pathetic.

        • DoomGuy64
        • 3 years ago

        > it's a B-grade title that not many people will care about.

        *cough* Project Cars *cough*

        > I generally buy graphics cards to play particular mega titles

        Very true. Big titles are definitely more relevant to the general public. But so is 1080p for that matter. Which all looks great for the 480, because it is a card tailored for the masses. It's perfect for this market.

        > AMD banging on about AotS is

        Relevant because it's one of the few DX12 titles available on the PC. Guess what? I can see Doom also becoming a popular benchmark after the Vulkan update is released. Supposedly it supports async, and as such would be a great representation of next-gen API performance. Of course, if AMD does really well because of that, the Nvidia fanboys will crawl out of the woodwork crying foul because their preferred brand doesn't handle async as well.

        Protip: Consoles support async. All future DX12 titles will support it. It's a relevant feature. Blame Nvidia for not supporting it. It's not GameWorks, so it's not crippleware. Async is a performance enhancer, IF it's supported. Nvidia just needs to support it, then they'll do fine. Until then, you're overpaying for a feature-lacking graphics card, then whining over your buyer's remorse, and that behavior is just sad.

          • Pancake
          • 3 years ago

          No, AotS is utterly irrelevant. It’s almost outrageous how AMD are setting themselves up to fail. It’s like watching someone slowly and methodically shooting themselves in the foot. Why are they doing it?

          As to DX12, async or whatever. It’s hardly going to matter. What matters is this – the majority of gamers are currently gaming with NVidia cards so if you want your game to sell you make sure it runs well on NVidia. As a game player, not game developer, that suits me just fine.

          We’ll all find out in a few weeks but I’m expecting the RX480 to be mediocre in comparison to the new hotness from NVidia. Then the argument from AMD fans will once again be about how cheap it is. Then there will be another quarterly loss announced by AMD (thanks cheapass fans) with a declining marketshare because they’ve been hammered by NVidia from all directions (I’m expecting RX480 to be somewhere between a 1060 and 1070 but consuming much more power with frame rates stuttering all over the place).

          It’s like Charlie Brown, Lucy and that damn football. Why expect something different this time?

          Anyway, my next card will be whatever supports the next-gen of VR well. I expect this will be driving 4K+ resolutions as the current screendooring and limited field of vision is not good enough. So none of the current generation cards will suffice.

            • DoomGuy64
            • 3 years ago

            tl;dr summary: ignorance is bliss.

            Whether or not AotS is *personally* relevant to you is irrelevant to its status as a proof of concept. What AotS shows is that AMD does well in DX12 with async enabled. Does that represent DX11 performance? No. Does it mean AMD can beat Nvidia in other games? No. The only thing AotS shows is that AMD does very well in AotS using DX12 and async, which CAN be relevant to other games that use DX12 and async. Case in point: http://wccftech.com/async-compute-praised-by-several-devs-was-key-to-hitting-performance-target-in-doom-on-consoles/

            > That prompted some other developers to chime in, such as id Software's Lead Renderer Programmer Tiago Sousa, who said that Async Compute is awesome and his team gained 3 to 5 ms in rendering time with DOOM. This, alongside AMD intrinsic functions, was key to hitting the performance target (60FPS) in DOOM on PlayStation 4 and Xbox One. Sousa also remarked that profiling tools on consoles helped a lot as well, while PC can improve in that regard.

            > Mickael Gilabert, 3D Technical Lead at Ubisoft Montreal, added that his team noticed gains of 2.5 to 3 ms in Far Cry: Primal and concluded with "Async for the win".

            Doom is going to get a Vulkan port sometime down the road. AMD is going to do well in it, and that is far more relevant than you want to admit, which is why you have ignored that point and keep ranting on AotS. Guess what? AotS is one game, but EVERY game that uses DX12 and async will receive similar performance boosts. You can ignore AotS all you want, but that behavior will only carry you so far until other games support the same features, and they will. Consoles use async compute. Console ports will use async compute. You can't ignore it by sticking your head in the sand forever.

            • Pancake
            • 3 years ago

            Nice story about console optimisation. OF COURSE if there’s a feature on a console that will help performance they will use it. Async could be a great developer feature for all I know. Fabulous. I’m not a games developer.

            No, we don’t know very much but lets try to analyse this fascinating choice of AotS as seemingly THE AMD benchmark. Why didn’t AMD go to id for a benchmark of Doom for PC with async enabled? Are their developer relations so bad? Have they sunk so far as having only a second/third tier developer to champion their product?

            The second point I’d like to make is about how you go banging on about a specific feature of the AMD graphics architecture. As I have been saying for years low-level APIs are a seriously bad idea. Although low-level optimisation might seem a good idea at a current point in time and garner you some nice immediate gains they force the graphics card manufacturers to keep iterating a very similar graphics architecture in order not to break compatibility or performance. This will stifle innovation. Software developers will have to, much more than currently, target their software to run optimally on an NVidia, AMD or Intel platform.

            • DoomGuy64
            • 3 years ago

            > Why didn't AMD go to id

            Because the patch isn't available to the public yet. Duh.

            > how you go banging on about a specific feature

            Me? I quote: "AotS is utterly irrelevant." AotS isn't relevant because "AotS". It's relevant because it uses DX12 and async. One of the FEW DX12 games *available to the public* today, and we're going to see more games use these features in the future. The whole point of testing AotS is the exact use of this feature, because it gives us a glimpse of how future games will perform when using these features. The benchmark never was just about AotS as a game. Never.

            > low-level APIs are a seriously bad idea

            Only for Nvidia, since AMD gets the console optimization bonus. Plus DX12 isn't exactly Glide. It's an open standard, and isn't anywhere near Glide levels of low-level.

            > Software developers will have to...

            Quite a few of them wanted low-level APIs. Microsoft made PC optimization too difficult with DX11, although that has improved with WDDM 2.0 and DX12 feature backports. This also makes console ports MUCH easier.

            > target their software to run optimally on an NVidia

            TWIMTBP and GameWorks have had their time. Now the shoe's on the other foot. Perhaps instead of stifling the competition, they should have been more cooperative? I don't have any sympathy for Nvidia here, especially when they're overcharging an extra $100 for reference "Founders Edition" cards. Hell no. Nvidia needs to be taken down a peg. They're too arrogant and have ruined the ecosystem for too long.

            Also, keep in mind async is DX11 tessellation 2.0. AMD had a monopoly on tessellation up until Fermi. Async isn't a black box like GameWorks, it's a DX12 feature. Nvidia will come back as soon as they properly support it. Until then, it's only something that gives AMD a temporary edge, and makes their current hardware more valuable. It's a proper bang-for-buck situation, where AMD has finally given gamers some incentive to go red. Nvidia still owns the high end, but mid-range gamers can now get an acceptable card for $200. This hasn't happened for a good while, so I applaud AMD for pulling it off.

            • Spunjji
            • 3 years ago

            Why so much focus on people being “cheap”? I don’t get it. It’s like you actually don’t like people who don’t have as much money to spend on gaming as you, which is… well, weird.

            • Pancake
            • 3 years ago

            I’m not focussing on people being cheap. What I’m saying is that will be the ONLY argument advanced by AMD fans as they always seem to do when their current champion card gets smashed by NVidia at benchmarks. The tune never bloody changes.

            • chuckula
            • 3 years ago

            I don’t recall seeing any posts from those exact same people claiming that the $650 R9-Fury X was meaningless when it launched. Maybe they thought the Fury X was so stratospherically overpriced that they just forgot to post at all?

            Or the $650 Nano, which frankly makes the Founder’s Edition look like a very reasonable price/performance card.

      • juzz86
      • 3 years ago

      You are on a roll for downvotes this week mate!

        • CScottG
        • 3 years ago

        LOL!

        I honestly think it’s funny. Sort of like a game – but I can’t for the life of me figure-out the damn rules! It seems to be tied to negative comments, but then again – some of the higher up-vote comments are negative. (..*scratches head*.) Even when asking for down-votes (as an Edit on another comment) I then proceeded to get up-votes.

        Is there a manual available?

          • juzz86
          • 3 years ago

          Haha, trust me mate I have less idea than you do.

          I must admit to being a bit surprised to see ‘where’s the 1080 review’ posts being upvoted – that’s usually very much a no-go here and the old guard will hop on the button quicker than you can blink.

            • CScottG
            • 3 years ago

            Yeah I spotted that as well. Maybe the added bit of wit was appreciated despite it being both negative and TR-related?

            I just shoved up-votes in all the replies to my comment above. Gotta do my part, right? (lol.)

          • yogibbear
          • 3 years ago

          Timezones. Different people from TR log in at different times. Also, TR is not a bandwagon/pile on +1/-1 place. I’d say there’d be some very interesting stats on the total +1’s and -1’s certain posts receive when discussing something controversial/on the fringes of general agreement. (sometimes being a lot higher than the balance). Whereas good humour will almost certainly always get +1’s, and certain tones/topics will always get -1’s.

          • tipoo
          • 3 years ago

          The votes are really entirely cosmetic here; you don't get buried like, say, on Reddit. Just high-rated posts show up on the side thingy. *Ideally* people would vote on content relevance rather than disagreement, but it is what it is and doesn't really matter.

    • travbrad
    • 3 years ago

    They are really pushing those Ashes of the Singularity numbers considering almost no one is playing it: http://steamcharts.com/app/228880

    Good marketing I suppose, but it's not a good indication of how games people are actually playing will run. That's why we need independent review sites. :)

    I play a 4-year-old FPS (Planetside 2) that was largely considered a commercial failure, and even that game has about 50x as many players, let alone comparing AoS to a game that is actually popular. Ashes of the Singularity barely makes it into the top ONE THOUSAND games on Steam.

      • LostCat
      • 3 years ago

      Somehow I don’t think we’ll go back to benchmarking DX9 games.

        • tipoo
        • 3 years ago

        DX12 is cool and all, but I wonder which company does the best on DX7 titles.

          • travbrad
          • 3 years ago

          Because there was nothing between DX7 and DX12 right? Like for example the countless DX11 games people are currently playing?

            • tipoo
            • 3 years ago

            Take a breath man, was just a joke and not meant to be at your expense.

            • travbrad
            • 3 years ago

            I take several thousand per day.

            • tipoo
            • 3 years ago

            A few short evidently.

        • travbrad
        • 3 years ago

        I’m not saying to benchmark DX9 games/Planetside 2 since it’s not popular either. I just think it’s a bit silly to show off how their new graphics card performs in the 900th most popular game because it suits their agenda. Both AMD and Nvidia do it and I don’t know why anyone thinks their PR slides are a meaningful indication of anything. Nvidia was touting their VR performance before the 1080/1070 came out even though almost no one has VR yet.

        The majority of games people end up playing on these cards will be DX11 games. I do think DX12 in general will help AMD (maybe not as much as in AoS though), but there needs to actually be a good selection of DX12 games available before that is an important metric.

          • JustAnEngineer
          • 3 years ago

          If you’re only interested in the most popular games, you’ll just want to look at Candy Crush or solitaire benchmarks, right?

          DirectX 12 is the future. If you’re dropping $700-800 on a new card, you’re wanting it to play the games that come out at the end of this year, too–not just the ones that came out two or three years ago.

            • travbrad
            • 3 years ago

            Nice strawman but it seems to me there are a ton of people playing DX11 games which could stress these cards and are actually relevant. But nope nothing between Candy Crush/DX7 and DX12.

            Most games being released at the end of this year are still DX11 too btw. It’ll probably be another couple years before a lot of games support DX12, and even longer still before it’s required. If you hold onto your graphics cards for a long time I guess it could be important, although I’d still be wary of basing all of your assumptions on one DX12 game (just as I’d be wary of basing your assumptions on one DX11 game)

            • Spunjji
            • 3 years ago

            It’s not a straw man, it’s a reductio ad absurdum.

            Obviously AMD are playing off their biggest strength. That’s what marketing do. Do you seriously expect them to stand up and say “yeah our performance is still not as good as it could be in DX11 and it probably never will be”?

            • travbrad
            • 3 years ago

            I understand why AMD does it I just don’t understand why so many people eat it up without questioning it. Ashes of the Singularity is suddenly the most important game in the world to the true believers despite only 0.004% of people on Steam playing it. The 480 will almost certainly be a good card for the money, but it has nothing to do with the AoS numbers. Just like the 1080 and 1070 look like good cards, but it has nothing to do with VR performance.

            Whether it’s DX12 or not it just seems to be a poorly optimized game in general when you look at the graphics quality and the performance, so I even have doubts about how representative it will be of DX12 games. If AoS is any indication DX12 will bring worse performance and worse graphics quality. ๐Ÿ˜‰

            Also “DX12” has various aspects to it and some games will make use of different parts of the API and use it in different ways than others. The implementation of Async compute in AoS reminds me a lot of HAWX 2 tessellating flat surfaces back when Nvidia had a big advantage in tessellation. In both cases it seems like the games don’t really benefit in graphical fidelity or performance from using these things, which isn’t something to be celebrated IMO. DX12 was supposed to help developers optimize their games better and provide better performance and graphics for gamers. AoS has mediocre performance and mediocre graphics.

            • DoomGuy64
            • 3 years ago

            It’s nowhere near the same as HAWX, and you should be ashamed for making that comparison. First off, it’s a performance enhancer, so it doesn’t actually hurt Nvidia, aside from them maybe poorly emulating the feature in software. Second, this isn’t a case of tessellating flat surfaces either, as overusing async will decrease performance on AMD. Developers have to load balance the feature, so it is not overused. The simple fact is, Nvidia doesn’t support it, and emulating the feature with pre-emption is slow. Either way, the AMD card is doing more work, but they’re being efficient at it which raises performance.

            The Best Case scenario for Nvidia is zero performance difference on Pascal, and worst is 10-15% on Maxwell. It’s not a cheat. AMD is actually doing work in the background that improves their performance, as they are multitasking, while Nvidia is context switching. Also, I think the only card that can see huge gains from async is the Fury, being how unbalanced it is in favor of shaders/compute.

            Anyways, the hypocrisy is ludicrous. Where was this outrage over Gameworks and Physx, which was real sabotage? lol.

            Simple solution here is Nvidia needs to do two things:
            1: Stop cheating.
            2: Support industry standards.
            If nvidia starts supporting standards they’ll even the odds, and ending the sabotage will equalize the playing field and improve their mind-share. Until then, I have no problem with AMD using async against them. It’s a legitimate function of dx12, and will obviously be ported over from all console games. Since Pascal doesn’t see a performance hit, all this means is AMD gets free performance. The only people who don’t like this are fanboys who have buyers remorse over their over priced nvidia cards. Solution? Sell your nvidia card and buy AMD. That or deal with it.
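            For reference, "async compute" at the D3D12 level just means feeding a second, compute-only queue that is allowed to overlap the graphics queue; whether the GPU actually executes the two concurrently (as GCN does) or interleaves them is up to the hardware and driver. A minimal sketch, assuming only a valid ID3D12Device pointer:

                #include <d3d12.h>
                #include <wrl/client.h>

                using Microsoft::WRL::ComPtr;

                // Sketch: a graphics queue plus a separate compute queue on the
                // same device. Work on the compute queue may overlap the graphics
                // queue; hardware without concurrent execution will interleave or
                // serialize it instead. Ordering between the queues is expressed
                // explicitly with ID3D12Fence Signal/Wait pairs.
                void CreateQueues(ID3D12Device* device,
                                  ComPtr<ID3D12CommandQueue>& gfxQueue,
                                  ComPtr<ID3D12CommandQueue>& computeQueue)
                {
                    D3D12_COMMAND_QUEUE_DESC gfx = {};
                    gfx.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;      // graphics + compute + copy
                    device->CreateCommandQueue(&gfx, IID_PPV_ARGS(&gfxQueue));

                    D3D12_COMMAND_QUEUE_DESC compute = {};
                    compute.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute + copy only
                    device->CreateCommandQueue(&compute, IID_PPV_ARGS(&computeQueue));
                }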

            • Spunjji
            • 3 years ago

            Why do you think anybody here is “eating it up without questioning it”? Apart from the troll posters, obviously.

            As for that last paragraph, given that AMD clearly DOES benefit significantly in performance terms from Async compute (and other developers including ID have found frame-latency savings by using it) your point is invalid.

            The fact that there are actual benefits attached means it’s not like the HAWX nonsense at all – it’s more like the DX9 situation when the 9800 Pro beat out the 5800 Ultra because NVIDIA used a conservative design at a time when the industry moved forward; except nowhere near as dramatic as that given that NVIDIA still easily have the performance lead by some way.

            • travbrad
            • 3 years ago

            > As for that last paragraph, given that AMD clearly DOES benefit significantly in performance terms from Async compute (and other developers including ID have found frame-latency savings by using it) your point is invalid.

            My point was the game has mediocre graphics and performance in the first place. The fact that async compute improved a game that seems to be poorly optimized is nice, I guess, but it would be better if it was just better optimized in the first place. A game should not look that bad and run that poorly in 2016, IMO. It's not a good showcase for the benefits of DX12. Not that it really matters anyway because, as I said, no one is actually playing it.

      • somanshu
      • 3 years ago

      Well my guess is these guys probably have to pay money to the developers of whatever game they show on stage (to avoid copyright issues). Since Ashes of the Singularity is associated with AMD, I guess it was cheaper for them to showcase Ashes as opposed to, say, Call of Duty, for which Activision would probably charge way too much, making the presentation needlessly expensive. Of course marketing is always biased, so better on-stage performance always helps :P

      • Ninjitsu
      • 3 years ago

      Yeah, I agree with your overall point; a lot of people tend to keep talking about AoS, except at this point it's effectively like getting excited about Unigine Heaven numbers. It's just another benchmark.

      That said, AMD marketing will push whatever advantage they can get, they’re paid to do that, after all.

    • ish718
    • 3 years ago

    > Ashes of the Singularity runs on DirectX 12 and uses Explicit Multi-Adapter mode for multi-GPU processing, so the GPUs were not running in Crossfire mode.

    The thing is that many developers probably won't bother using Explicit Multi-Adapter since it is a lot of extra work. In those cases you will be at the mercy of Crossfire driver support. It will be an excellent value when it does work, though.

      • chischis
      • 3 years ago

      Then shout loudly about it when EMA is not implemented in games that you care about. “Lazy” devs should be called out in these cases.

    • southrncomfortjm
    • 3 years ago

    Can we please get another DX12 game to talk about? It really just seems as if AOTS is the only game out there that AMD wants to talk about.

      • DPete27
      • 3 years ago

      Perhaps because AMD has worked very closely with the AOTS developers since the game’s inception. They have a lot invested in the game.

        • floodo1
        • 3 years ago

        This. AMD basically cherry picked benchmark results here because AOTS is one of the only games where Fury X actually competes with 980ti.

        This is also 1080p which even last gen cards can easily handle …. what does this pair of cards do at even 1440p?

      • nanoflower
      • 3 years ago

      There just aren’t any other DX12 games that do a good job of using the API (and in particular async compute.) They believe they have an advantage over Nvidia with asynch compute so they are going to keep pushing it and feature any game that uses the feature heavily. Unfortunately that’s only AOTS for now and the near future.

      • maxxcool
      • 3 years ago

      I believe Plants vs. Zombies is Mantle/DX12 compatible, is it not? ;)

      • chuckula
      • 3 years ago

      Tomb Raider at least claims to have a DX12 mode. Don't know if it counts as being "real" DX12 or not.

        • NoOne ButMe
        • 3 years ago

        I believe everyone qualifies it as so broken that it’s considered worthless in that mode.

          • LostCat
          • 3 years ago

          Hmm…nope, I don’t.

      • Mat3
      • 3 years ago

      They also like to talk about Hitman and Total War: Warhammer.

        • yogibbear
        • 3 years ago

        Hitman is a decent engine though and TW:W is a crappy Total War engine that usually cripples a system so it’s usually interesting to see what minimum hardware you need to actually play their games.

    • NoOne ButMe
    • 3 years ago

    If AMD’s claims about more efficiency are true this means at around 91% load the Rx480 uses under 96W of power. I’m dubious given the nature of the shit AMD pulled here. But it would be very nice to see AMD basically equal the performance per watt of Pascal. Instead of being 20-30% behind.

    [url<]https://forum.beyond3d.com/posts/1918985/[/url<]

    • torquer
    • 3 years ago

    Can’t wait to see it either.

    When we get the TR review.

    2 months after it launches.

    :p (meant as a loving jest)

      • ronch
      • 3 years ago

      Uh oh. I hope the powers that be are reading all these comments of discontent with how TR is being run these days. It may be ‘meant as a loving jest’ for now but if this goes on people will start to light their torches.

        • biffzinker
        • 3 years ago

        I’m sure TR isn’t going away any time soon hopefully

          • ronch
          • 3 years ago

          Neither did Tom’s and Anandtech.

            • maxxcool
            • 3 years ago

            And then.. boom. Dear John, and news aggregation with much less content.

            I am holding out, but every day that I see no real review and more offsite news, I grow ever more doubtful.

      • NoOne ButMe
      • 3 years ago

      Well, I cannot imagine TR having trouble getting any AMD cards from now on. I would hope…

        • ronch
        • 3 years ago

        What’s Scott working at AMD for??!?

          • NoOne ButMe
          • 3 years ago

          To quote the article he wrote as a farewell: ” to help implement the frame-time-based testing methods for game performance that I’ve championed here at TR.”

          But given he stated that Raja is the one who "poached" him, I think he probably ended up relatively high up the ladder in RTG. And while his favoring TR could be seen as damaging to TR's independence, and he knows this, I think he will at least attempt to stop them from being cut out of anything.

      • Anovoca
      • 3 years ago

      Can’t tell if this was an attempt at ironic sentence structure or a failed Haiku.

        • torquer
        • 3 years ago

        Ironic sentence structure :p

        • JustAnEngineer
        • 3 years ago

        New cards, gamers cheer
        Marketing claims unbounded…
        TR reviews when?

          • Anovoca
          • 3 years ago

          Pensive discord stirs
          The masses walk in darkness
          Frametimes bring hard truths

            • Spunjji
            • 3 years ago

            -actual applause-

    • maxxcool
    • 3 years ago

    “Randomly seeded assests”. if the #$%^ing test doesn’t generate static 1:1 content between runs it is UTTERLY useless as a benchmark.

    Until we can get a 1080, 1070, 480x on BF5 (or 1?), Cod, Fallout, Fable benched in dx 12 mode *with* and *without* async + Explicit Multi-Adapter mode this is all the same old slide deck BS from both AMD and NV.

    and since I’m ranting … Reviews!?!

      • Kretschmer
      • 3 years ago

      Fable was cancelled, no?

        • biffzinker
        • 3 years ago

        Yes, and the development team (Lionhead Studios) was shut down. Thanks again, Microsoft.

        • maxxcool
        • 3 years ago

        Well then... :P Maybe the next Unreal engine? CryEngine? We desperately need 7+ games to sample that will utilize the feature sets to get a proper idea. Even then, coding errors and coding bias can #$%& with results.

        Just soooo tired of the butthurt, fanbois, rage fests, techno-evangelizing and making $^#^ up when TR has yet to do a proper review to compare to Anand's or HardOCP's... jesus.

      • nanoflower
      • 3 years ago

      That’s not true. So long as you run the test multiple times and get the same results you can be confident in the results. In the case of this Ashes of the Singularity benchmark AMD said they ran the test 5 times and got consistent results. That’s the same thing that Scott did and Jeff does when testing GPUs (though in their case they will avg them together instead of picking a single representative result.)

        • maxxcool
        • 3 years ago

        Whilst true for TR testing, a dubious feeling creeps over me when reading a PUBLICIST from marketing replying to technical questions.

          • maxxcool
          • 3 years ago

          /snort/ A -1 already... too close to the truth? /CHUCK HAT ON/ No one can defend the level of BS AMD's marketing team employs. /CHUCK HAT OFF/

          • nanoflower
          • 3 years ago

          I wouldn’t say to buy a card based on the benchmarks reported by the company that makes the card, but it gives you some idea of what to expect from the card. Until the actual reviews come out there’s not much else to discuss other than what AMD reports and the few benchmark leaks (AOTS and 3DMark) that are out there so of course people will be discussing this. Especially given this card looks to be a great value based on the reported performance and the cost.

        • Ninjitsu
        • 3 years ago

        The funny thing is, according to the AoS benchmark database, they ran it on the dual GPU config only once (at the apparently compared preset, i.e. Crazy 1440p) and a bunch of times on the 1080 with varying results (400 point spread).

      • SoundFX09
      • 3 years ago

      This is just one benchmark. While it’s true that it’s very suspicious, there is no reason for you to go bat S*** crazy over just one benchmark.

      I’ll wait patiently for more Benchmarks for the card to come out Later. On Paper, this card looks like a 1080p killer for a very affordable price. Not only that, It makes sense for AMD to target the Budget Market. That’s usually where people will favor AMD cards over Nvidia’s (Yes, there are exceptions to this).

        • maxxcool
        • 3 years ago

          Yeah, true words, friend. I'll tell you what: I'll buy one at $299 if it's within 10% of a 1080 + 8GB of GDDR5X RAM. ;)

          • maxxcool
          • 3 years ago

            Hah, -2 for SUPPORTING AMD.. the fanbois seem a bit ADHD today ..

            • 0x800300AF
            • 3 years ago

            If I had to guess, more like -2 for either reading comprehension
            “On Paper, this card looks like a 1080p killer for a very affordable price”

            or unrealistic expectations
            ” I’ll buy one at 299$ if it’s within 10% of a 1080+8gb gddr5x ram. ;)”

            • maxxcool
            • 3 years ago

            Sad day when double sarcasm is missed …

            • JustAnEngineer
            • 3 years ago

            > It's a sad day when double sarcasm is missed.

            It's so beautiful! (https://www.youtube.com/watch?v=OQSNhk5ICTI)

            • maxxcool
            • 3 years ago

            Lol!! Damnit I need more upvotes…

    • DPete27
    • 3 years ago

    What's the difference between the "single/normal" and "heavy" batch tests?

    Sounds like AMD took favorable results from two separate testing scenarios and combined them into one slide to make themselves look awesome.

      • RAGEPRO
      • 3 years ago

      (This is me personally speaking, not as a TR newsie.)

      As far as I can tell, the AOTS benchmark runs in three phases: a "normal batch" test that's heavily CPU-bound, a "medium batch" test that's more balanced, and then a "heavy batch" test that's heavily GPU-compute-bound. From what I understood, it seems similar to testing an SSD using various queue depths. All three tests are part of one complete benchmark cycle.
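      To make the CPU-bound/GPU-bound distinction concrete, here's a toy model (ours, not Oxide's, with made-up per-draw costs): each draw call carries a fixed CPU submission cost, so with light batches the CPU is the bottleneck and the GPU sits partly idle, much like a shallow queue depth starving an SSD.

          #include <algorithm>
          #include <cstdio>

          // Frame time is bounded by whichever side is slower; GPU
          // utilization is the fraction of that frame the GPU spends busy.
          double gpu_utilization(int draws, double cpu_us_per_draw,
                                 double gpu_us_per_draw)
          {
              const double cpu_ms = draws * cpu_us_per_draw / 1000.0;
              const double gpu_ms = draws * gpu_us_per_draw / 1000.0;
              return 100.0 * gpu_ms / std::max(cpu_ms, gpu_ms);
          }

          int main()
          {
              // Light batches: CPU cost dominates, GPU waits (~50% busy).
              std::printf("normal: %.0f%%\n", gpu_utilization(10000, 2.0, 1.0));
              // Heavy batches: GPU work dominates, utilization saturates.
              std::printf("heavy:  %.0f%%\n", gpu_utilization(10000, 2.0, 4.0));
              return 0;
          }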

        • tay
        • 3 years ago

        So they’re running a heavy CPU bound test because they can do the async processing that nvidia is poor at, thus showing their card in the best light. Regardless, single GPU scores soon please. Never buying 2 GPUs of any stripe.

        • DPete27
        • 3 years ago

        So they never should have included the results of anything EXCEPT the heavy batch test, since they're comparing *GPUs*. Clearly some idiot saw the normal batch test GPU usage and thought "oooh look, the RX480 isn't even trying!! If we show this alongside a favorable framerate bar graph comparison, Nvidia will be embarrassed."

          • nanoflower
          • 3 years ago

          No, they did exactly what Nvidia did and reported the results that made their product look the best. They won at all three levels of the test so they weren’t lying about the CF’ed 480s being faster. It’s just that the usage level at the normal batch level looks much better than the GPU usage level at the heavy batch level. Though the fact is they probably would have been better off to have also reported the heavy batch level GPU usage as people were getting worked up over the 480s not hitting close to 100% GPU usage and were thinking a CF’ed 480 solution could not do better than 50% GPU utilization. That seemed to be the real issue people had, which turns out to not be an issue at all.

    • ronch
    • 3 years ago

    Did everyone get a nice corsage at Computex? Or was it only the AMDers?

    • wingless
    • 3 years ago

    They did well clarifying this. In DX12, I expect AMD to lead. Nvidia isn't looking as spectacular in DX12 as they are in DX11. Likewise, AMD's DX11 performance likely won't see any great breakthroughs. Nothing new to see here, folks.

    • chuckula
    • 3 years ago

    > In his words, "the GTX 1080 is doing less work to render AOTS than it otherwise would if the shader were being run properly." According to Robert, the RX 480 is correctly executing the terrain shaders, leading to more procedurally generated snow—and worse performance.

    Yeah, here's his LinkedIn page: https://www.linkedin.com/in/rthallock

    Notice anything in there that would indicate this guy is an engineer? Notice anything in there that would indicate this guy has the skillset to reverse-engineer a competitor's product that was ostensibly on sale only 4 days before the event, to provide us with a deep technical analysis of the GTX-1080 "cheating" in some benchmark of a program that is *purportedly* a game from a third-party developer and not just some dressed-up AMD tech demo?

    This has "overclocker's dream" written all over it. Maybe if a real engineer at AMD had given a highly detailed and independently verifiable post-mortem including trace dumps, I'd at least be inclined to consider evidence that can be verified outside of AMD's PR department. Offhand nonsense from a marketing drone? I don't believe a word of it.

    Here's the problem, AMD: You are so utterly lacking in confidence in your own products that you had to paint a target on Nvidia in a hastily prepared demo. You couldn't just come out and say that Polaris provides nice performance at a nice price. Instead, you had to try to tear down Nvidia instead of building yourself up. Well, congratulations: now you are going to get trashed in the media when these fairy-tale benchmarks don't translate into reality. If you had never poked the bear in the eye in the first place, nobody would be holding it against you when the 480 fails to deliver on the empty promises that you have made for it.

      • RAGEPRO
      • 3 years ago

      You mad, bro?

        • chuckula
        • 3 years ago

        Nowhere near as mad as some of the usual suspects are going to be when the 480 gets tested in the real world.

        I could literally write the “OMG HOW CAN YOU BE SO BIASED” complaint comments right now.

        Additionally, for all the downthumbs, I’ve noticed that the usual suspects don’t have a single answer for why we should believe a marketing drone from AMD’s comments about the inner workings of a GTX-1080. I sure as hell didn’t see Jen-Hsun on stage at the Pascal introduction nitpicking issues with the Fury-X, and believe me, there are some.

          • RAGEPRO
          • 3 years ago

          To be fair, if you look at the Reddit post, he even says it’s barely a few percentage points’ difference. And they won the benchmark anyway, legitimately. It’s pretty reasonable to suspect that an AMD or Oxide engineer could observe the shader code running on GTX 1080 and go “that’s not right.”

          I mean, I get it, it’s very clearly a best-case scenario (I actually meant to write that in the article in those words and forgot) that doesn’t really represent the cards’ performance, but I don’t know if you really needed to dig up the guy’s LinkedIn profile? Heh.

          He also didn’t actually imply at all that Nvidia was intentionally cheating, just that the shader wasn’t running correctly. It seems likely to be a mistake or bug, doesn’t it? Pascal is brand-new after all.

            • slowriot
            • 3 years ago

            LMAO! I can't help but read Chuck's stuff and then think, "You realize AMD could have just asked the game devs if this looked right, right?" But nah. He's too busy constructing some outlandish narrative that no one cares about but him. The dude literally argues with himself more than with other posters.

            • Action.de.Parsnip
            • 3 years ago

            The dude literally craps on every other news thread. I’m lurking the site every day and this attention seeking is farting on my user experience.

            • cegras
            • 3 years ago

            Actually, if you read through his posts he is essentially doing damage control for intel.

            • Action.de.Parsnip
            • 3 years ago

            I dont like to get involved with the mud slinging but jesus h christ endless endless sh1tposting.

            • Srsly_Bro
            • 3 years ago

            His mom didn’t love him.

            • nanoflower
            • 3 years ago

            Yeah, I understand people wondering why the two benchmarks looked different but when you choose a word like “Cheating” that sounds like you want a fight since it’s clearly a trigger word. All that I got from Robert’s post (read it shortly after he posted it and linked in the Polaris article comments on TR) is that Nvidia’s code is a bit buggy. That comes up all the time with new games or new drivers. Sometimes the bug might deliver bad graphics and sometimes it changes the performance. No need to assume there’s anything nefarious underway from Nvidia or AMD.

            As for the performance in real games, of course we have to wait for the cards to get into reviewer hands and get a chance to test against many games. That’s when we can see what the 480 compares to. I expect it will be like most GPUs and it will beat the 1070 and maybe even the 1080 in some games and they will win in others.

            • arbiter9605
            • 3 years ago

            I think they will lose to the 1070 in mostly all games. Reason why is, well, the game they used is known to massively favor AMD hardware since it was released in alpha. It isn't a shock AMD used that one for comparison, since they need anything to look good. But people with any sense will wonder how good the card really is in more neutral games.

          • ronch
          • 3 years ago

          Ever noticed that Intel and Nvidia don’t go so low as to put AMD down in words or pretty slides? They let their products do the talking. I can just imagine Jensen often smiling to himself these days.

          Then again that’s what Roy Taylor, Adam Kozak and the rest of the McDonald’s gang do for their paychecks. I kinda miss John Fruehe. He sure made me feel good about AMD’s new products.

            • slowriot
            • 3 years ago

            Dude, you are desperate to be offended by any AMD news.

            • ronch
            • 3 years ago

            Offended? Not at all. However, I reckon some people here are the ones who are desperate to get offended by my rants against AMD. :)

            • slowriot
            • 3 years ago

            Your issue here was that AMD labeled the charts instead of simply referring to the competitor like everyone else. That is beyond silly.

            • ronch
            • 3 years ago

            Not even close to what i really think. Nice try.

            • rechicero
            • 3 years ago

            When you are the top dog, you ignore the underdog; acknowledging them gives them credit and free mindshare. You treat them like they don't exist. But when you're not, you do other things. As simple as that. Nvidia did the same in the past; now they are dominant and do what they think is best for them.

            • Mat3
            • 3 years ago

            I once had a dream that Nvidia used Gameworks, physx and pointless tessellation to make themselves look better than they really are. But it was just a dream. They always play fair, speak the truth, and are humble when discussing their competition.

            • arbiter9605
            • 3 years ago

            Tessellation was a directx standard back in like DX10.

            • Spunjji
            • 3 years ago

            “pointless tessellation”

            As in, tessellation beyond that required to see any difference. As in, hairworks in The Witcher 3.

            • --k
            • 3 years ago

            The underdog always punches up. Marketing 101.

          • anotherengineer
          • 3 years ago

          “Nowhere near as mad as some of the usual suspects are going to be when the 480 gets tested in the real world.”

          So you are mad then.

          Never watched the pascal event or the Radeon events, don’t care. Will wait for reviews, then availability, then pricing, then some customer reviews then make a decision.

          No need to get all bothered by a trivial launch event for some material object that will be obsolete in 1-2 years time.

          This is a tech site; we all know how weak AMD's marketing and such is, no need for you to point it out to everyone. Also, we know you're a Linux man, which basically puts you in the Intel/Nvidia camp.

          So why care even at all about AMD??

          Have anything to disclose??

        • Sonk
        • 3 years ago

        You happy, sis?

      • PrincipalSkinner
      • 3 years ago

      Agreed 100%.
      To AMD: Bashing your competitor without solid evidence while being destroyed in the market seems very unprofessional.
      Make a decent reference cooler, fix your damn drivers, and get the bloody product out. Including press samples. And shut the hell up!

        • USAFTW
        • 3 years ago

        When it comes to drivers, the 360 drivers from NVidia haven’t exactly been a paragon of stability. Check out the GeForce forums and reddit.
        Some even report their cards going bust after opening up YouTube or Netflix.

          • PrincipalSkinner
          • 3 years ago

          Maybe. But as an owner of 380X, I don’t care about Nvidia’s drivers.
          I want Freesync to work in every game. I don’t want my screen to go black when RAM runs low, forcing me to reset.

            • Meadows
            • 3 years ago

            It does what?
            Jesus Christ.

            • sweatshopking
            • 3 years ago

            Weird. I have a 290, and never had a black screen. You sure that’s a driver issue, and not hardware?

      • USAFTW
      • 3 years ago

      This comment’s just out there in terms of bottled up and irrational hatred.
      https://blogs.nvidia.com/blog/2010/11/19/testing-nvidia-vs-amd-image-quality/

      And they were quick to point this out even with nobody raising it.

      • human_error
      • 3 years ago

      > Notice anything in there that would indicate this guy is an engineer? Notice anything in there that would indicate this guy has the skillset to reverse-engineer a competitor's product that was ostensibly on sale only 4 days before the event, to provide us with a deep technical analysis of the GTX-1080 "cheating" in some benchmark of a program that is purportedly a game from a third-party developer and not just some dressed-up AMD tech demo?

      If AMD works like any other large corporation (tip: they do) then this guy is responsible for PR, so it is expected that he will be the person who would publish info on this, as he should know the fine line between stating information and making libelous claims against a competitor (an engineer would not want to risk their job by saying the wrong thing when this guy can do it for them). It does not mean that he has made these findings himself. There will have been AMD engineers, who may have been speaking to the AoS developers, figuring out why there is a discrepancy, which was then posted by him.

      • Anovoca
      • 3 years ago

      He deals with the god d@mn customers so the engineers don’t have to. He has people skills; he is good at dealing with people! Can’t you understand that? WHAT THE HELL IS WRONG WITH YOU PEOPLE!!!!

      • AlphaEdge
      • 3 years ago

      Are you qualified to judge his credentials? Where are your Human Resources qualifications?

      Really? His personal qualifications, for someone who works for a company that has hundreds of engineers to advise him?

      Your whole post reads like “They lied, I will never buy a product from them!”.

      What about Nvidia's whole GeForce GTX 970 fiasco? I guess you're not buying from them either?

      I know, I don’t have the qualifications in my background to give the opinions above, so please disregard them.

        • chuckula
        • 3 years ago

        [quote<]Are you qualified to judge his credentials? [/quote<] Yes. Note that I never claimed to have the detailed information to diagnose exactly what happened in the demo. I never made a final analytical conclusion. That's not the point. The point is that this Hallock guy, who already has a shaky track record, sure as hell doesn't have the background to do it. [quote<]What about Nvidia's whole GeForce GTX 970 fiasco? I guess you're not buying from them either?[/quote<] Name for me when Jen-Hsun stood on stage and centered an entire GTX 970 presentation on the ROP configuration. There's a difference between a small technical error in tech documentation and the centerpiece of a presentation put on by a C-level executive of a company.

          • AlphaEdge
          • 3 years ago

          > “centered his entire presentation”

          Really? His entire presentation was centered on that???

          You do know that people/sites will be getting this card and reviewing it, and people can make up their minds then whether to get it or not.

          You're making a mountain out of a molehill.

            • chuckula
            • 3 years ago

            The factual performance information in his presentation was entirely centered on the one and only demo that was shown: that AotS benchmark. That was it.

            Sure, he threw out the 5.5 TFLOPS number, but that's not a demonstration of anything, and believe me, plenty of AMD fanboys don't like to think about the fact that 5.5 TFLOPS is a somewhat overclocked R9 390 and a step below the 390X.

            Sure, he talked about VR a lot. Didn't demonstrate a thing.

            • AlphaEdge
            • 3 years ago

            You’re hung up on nothing.

            When the reviews come out, no one will care about that presentation, except you.

            • SoundFX09
            • 3 years ago

            Besides, we're smart enough to know that we shouldn't take anything AMD or Nvidia says to the bank right now. Once the benchmarks come out, we'll see which card is the better buy and whether your argument was valid or not, Chuck.

        • ImSpartacus
        • 3 years ago

        The personal qualifications are being brought up because his actual analysis/write-up is next to nonexistent.

        Imagine if a new TR review for a product were only a couple of paragraphs totaling 350-ish words and claimed the product had a completely unexpected flaw that skewed comparisons with competing products.

        Everyone would be floored that TR would make such a claim with next to no evidence.

        Now imagine the same thing coming from the marketing department of a for-profit company, where the claim just so happens to make its biggest competitor look bad.

        The claim is definitely possible and perhaps even plausible. But until we either get a fantastic supporting write-up or an independent third party to substantiate it, it's just baseless.

      • Action.de.Parsnip
      • 3 years ago

      You have literally nothing better to do, do you?

      • DPete27
      • 3 years ago

      It’s likely that an engineer had to tell Robert their findings. Then he found a way to spin it in a positive light. The problem with most salesmen is that if you are somewhat educated on the subject matter, you can see right through their BS. They get paid to be the mouth, not the brain.

      • rechicero
      • 3 years ago

      I'm more of a lurker, but you're becoming boring so fast. If you have some issue with AMD, could you settle it with them without hijacking every single news item related to them? Or go looking for professional help, whichever you choose.

      Tip: You could go to SemiAccurate and argue with Charlie. You seem like kindred (and obsessed) spirits.

        • DoomGuy64
        • 3 years ago

        Yup. I’m just wondering how many downvotes it takes for chuck to grow up and stop trolling. -100? Every post? Obviously it’s gotten old and irritating enough that people no longer shrug it off as a joke, which is generally how his behavior was previously treated.

        I guess he just got away with it for so long without consequences that he thought he could step it up to full retard. Of course, that's obviously the one place you're never supposed to go.

      • bjm
      • 3 years ago

      Do AMD fanboys really irk you that much that you have to seemingly preemptively strike at them all the damn time? You persistently attack AMD with the justification that AMD fanboys do the same to the competition, yet your comments are never replies to anybody.

      The only thing more predictable than you jumping on AMD at every mistake is ronch prefixing his AMD-related posts with "I'm an AMD fanboy, but…".

      There are fanboys of every company, yet you only seem to recognize AMD fanboys. You’re like some sort of… AMD fanboy fanboy.

      • Srsly_Bro
      • 3 years ago

      I’ve never seen anyone white knight over something where the two didn’t have a consensual sexual relationship. I can’t rule out that possibility, however.

      P.S. When Jen-Hsun wakes up, tell him to speed up the release of the 1080 Ti.

      • Unknown-Error
      • 3 years ago

      Over 40 thumbs-downs? OK, Chuck, I am officially envious of you. I am upvoting you for that :p The most thumbs-downs I ever got was 20-something.

        • Anovoca
        • 3 years ago

        That’s weaksauce. Watch this:

        SaaS is the future.
        FPS locks are for your own good
        Torrenting is immoral
        APU or gtfo
        Keep Flash Player alive
        Windows KB 3035583 is your friend
        RGB LEDs for everyone
        David Kanter is a mouthbreather
        Who wants to talk about the Buddha?

      • albundy
      • 3 years ago

      you almost bored me to death. do go on.

      • YukaKun
      • 3 years ago

      When you attack or discredit the person making the comment/statement, you already lost the argument IMO.

      I don't see any ill in what that person stated, and Nvidia is within its rights to answer or just keep quiet. Instead of ranting about what he said, you should be asking for more testing around the subject to see if he is right or wrong.

      And please, don't grasp at straws with polarized examples 🙂

      Cheers!

        • ImSpartacus
        • 3 years ago

        I think the problem is that the AMD technical marketing rep that posted that claim didn’t actually support his claim.

        Ideally, you want to judge someone based on their work. It's possible to do good work without a technical background. Look at someone like Anandtech's Dustin Sklavos, who got picked up by Corsair's technical marketing team. He did fantastic work at Anandtech, so no one really cares that his background is in film studies.

        This situation is markedly different. This guy does not do good work. He posted barely 350 words (including methodology!) and had to make several clarifications because his work was sparse and generally pitiful.

        The guy's analysis leaves much to be desired, so where else can he get credibility? His background is the last resort for adding credibility to his work. And it, too, leaves much to be desired.

        Therefore, I'm pretty suspicious as well. He could be correct, but I can't tell unless I see a much more detailed analysis.

      • yogibbear
      • 3 years ago

        I'd give you more +1's if I could. 🙁

      AMD marketing plan:
      -pick obscure test settings of a single irrelevant game
      -obfuscate results with crossfire scaling
      -obfuscate results further with GPU utilization %
      -declare victory!

      Sounds like a legitimate example of expected real-world benchmarking/performance results.

      • Meadows
      • 3 years ago

      +1.
      You know what’s hilarious? Just recently they vowed to [url=http://arstechnica.co.uk/gadgets/2016/04/amd-focusing-on-vr-mid-range-polaris/<]stop "badmouthing" NVidia[/url<]. Although now I guess that was merely on the topic of VR.

      • ImSpartacus
      • 3 years ago

      And since he's not an engineer, why should we believe a man paid by AMD who wrote barely a couple of paragraphs of analysis?

      I criticize places like TR or Anandtech when they write thousands of words that aren’t quite right, so I think it’s beyond fair to criticize a corporate-funded write-up that’s only a few hundred words long. This guy might not be wrong, but his pitiful level of analysis leaves me wanting more.

      I hope that a respected independent source takes a look and provides some much needed clarity & credibility on this subject. It’s a fascinating subject that deserves a deep dive.

    • ebomb808
    • 3 years ago

    Excited for benches. Nvidia's Founders Edition nonsense was its own worst enemy in this comparison, as that $700 would have been $600 at any past Nvidia launch. I pre-ordered a G1 Gaming 1080 from Amazon this morning; it was $611 and comes with a ~10% factory OC. I assume you won't be paying MSRP for any AIB RX 480 cards either, so the price comparison narrows further still.

    I'd love a world, though, where all games supported Explicit Multi-Adapter mode and you could mix and match GPUs, including mixed AMD/Nvidia configurations; a rough sketch of the enumeration step this builds on is below. I'm sure both AMD and Nvidia will try to kill it, though, as it would reduce the incentive to buy new cards. Time will tell.
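
    For the curious, the reason vendor mixing is even possible is that DX12 hands adapter management to the game: the engine enumerates every GPU in the box and creates a device on each one it wants to use. Here's a minimal sketch of that first enumeration step; this is my own illustration, not anything from AotS, and it assumes Windows with the DXGI headers and the MSVC toolchain, with error handling trimmed.

    [code<]
    // Minimal sketch: listing every adapter a DX12 title could drive under
    // explicit multi-adapter. Illustrative only -- assumes Windows with the
    // DXGI headers and the MSVC toolchain; error handling trimmed.
    #include <dxgi.h>
    #include <cstdio>
    #include <cwchar>
    #pragma comment(lib, "dxgi.lib")

    int main() {
        IDXGIFactory1* factory = nullptr;
        if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
            return 1;

        IDXGIAdapter1* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC1 desc = {};
            adapter->GetDesc1(&desc);
            // Each adapter found here -- Radeon, GeForce, even an iGPU -- can
            // back its own independent D3D12 device; the engine, not the
            // driver, decides how the frame is split between them.
            wprintf(L"Adapter %u: %s (%zu MB dedicated VRAM)\n",
                    i, desc.Description,
                    (size_t)(desc.DedicatedVideoMemory >> 20));
            adapter->Release();
        }
        factory->Release();
        return 0;
    }
    [/code<]

    That per-device control is what lets AotS pool GPUs without Crossfire or SLI, and it's also why most engines won't bother: the scheduling burden moves from the driver to the game.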

      • rahulahl
      • 3 years ago

      I can't even find a place to pre-order on Amazon. All I see are those $850+ cards that are price-gouging customers.

        • ebomb808
        • 3 years ago

        The Gigabyte G1 Gaming was available for pre-order for about 4 hours this morning on Amazon but is now sold out. It appears that the Gigabyte card will be the first non-reference card to ship, on June 7, with other cards not coming until a week later. Gigabyte says the first 100 people who order from Newegg on June 7 will get a free gaming mouse, so if you want that model, be prepared to hit F5 at 7:30 AM Pacific on Newegg that day.

        I still have an EVGA FTW pre-ordered from their website on launch day, but it's looking like, given the voltage limitations, adding more voltage will mean little for overclocking and just add more heat. I will probably end up cancelling the FTW for these reasons.

      • NoOne ButMe
      • 3 years ago

      FE is terrible. But this method of showing a "benchmark" is far worse. Disgusting that any company would ever do this.

        • ebomb808
        • 3 years ago

        We'll find out real benchmarks before you can buy the RX 480, so I'm not sure this will be a lasting black mark. While the FE is terrible, at least it may keep some impatient people from ordering a subpar reference-cooler card at launch rather than waiting for better cooling solutions. When the reference cooler was cheap, people could use that as an excuse for buying a worse card.

        That being said, I do feel bad for SLI people or small-form-factor case people who want the vapor-chamber cooler. Nvidia seems to be saying this FE will be sold for the life of the card at a price premium.

          • NoOne ButMe
          • 3 years ago

          Great. Does the lack of a lasting black mark make it any less terrible, deceptive, and disgusting a way to present a product? They weren't even consistent within their own presentation: they used average FPS from the whole benchmark but GPU load from just one-third of it.

            • ebomb808
            • 3 years ago

            You seem pretty upset about this. If I were you, I probably wouldn’t buy an RX 480, as morally, I don’t think you would be able to live with yourself.

            • Spunjji
            • 3 years ago

            It's pretty lame, but so's… 90% of marketing. Seriously, people are up in arms over a presentation. Even wood screws don't deserve this much hand-wringing.

    • ronch
    • 3 years ago

    I'd love to see how the 480, 1070, and 1080 stack up against each other, with the scatter plot thrown in for the conclusion.

    I reckon the 480 will see more sales, though, despite AMD needing a halo product. $200 is simply irresistible for this many stream processors.

      • tipoo
      • 3 years ago

      Yeah, I wonder about the impact of lacking a crown product. For people who are into PCs but not to the level of, say, the first 100 responders to a typical TR article, it may be a matter of "I hear Nvidia has the best GPU out, so I'm going to get whatever 200-dollar thing they have," even if AMD wins the 200-dollar bracket.

      I feel like Nvidia has already pulled off a similar mindshare coup a few times.

        • NoOne ButMe
        • 3 years ago

        I think it depends. Take the 970, for example: if the Fury X had completely trounced the Titan X/980 Ti, I don't think its sales would have fallen off.

        The card originally delivered such a huge value improvement that even after the RAM issue popped up, it was still very well received overall. I think the RX 480 will fall into a similar position.

        • Welch
        • 3 years ago

        It's sad that you are probably right about that, because it suggests that those people are being ignorant.

        I'd like to think that the first 100 posters in your example would be analytical enough to know, "Hey, Nvidia has the hottest thing if money is no object, but this $200 AMD card is king of its bracket."

        That's what I hope, anyhow.

      • HERETIC
      • 3 years ago

      Here's my best guesstimate:
      PERFORMANCE
      100% - GTX 1080 - $600
      75% - GTX 1070 - $400
      73% - R9 Fury X
      73% - GTX 980 Ti
      60% - GTX 980
      57% - RX 480 - $200
      52% - GTX 970
      I don't think all is well with 14nm; rumor is AMD was hoping for higher clocks, and as the process matures we should see improvements. This is also rumored not to be a full die; let's hope the full dies aren't being saved for Apple as in the past.
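
      For perspective, here's what those guesstimates would mean in raw perf-per-dollar terms. This is a throwaway sketch using only the speculative numbers above, not real benchmark data:

      [code<]
      // Back-of-the-envelope performance per dollar, computed from the
      // speculative guesstimates in the comment above -- not real benchmarks.
      #include <cstdio>

      struct Card { const char* name; double perf; double price; };

      int main() {
          const Card cards[] = {
              {"GTX 1080", 100.0, 600.0},
              {"GTX 1070",  75.0, 400.0},
              {"RX 480",    57.0, 200.0},
          };
          for (const Card& c : cards)
              // Relative performance points bought per dollar spent.
              std::printf("%-9s %5.1f%% / $%.0f = %.3f %%/$\n",
                          c.name, c.perf, c.price, c.perf / c.price);
          return 0;
      }
      [/code<]

      By these numbers the RX 480 would return roughly 0.285 performance points per dollar versus 0.167 for the GTX 1080, which is the whole case for the $200 bracket.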
