Wanted for review: AMD’s Radeon R9 Nano

Although we've covered the Fiji GPU and the Radeon R9 Nano graphics card closely since it was first announced, I've just been informed that AMD has chosen not to provide TR with a product sample for review. The reasons behind this decision aren't clear to me, but whatever.

Here's the deal: we need an R9 Nano card to review. If you're in the industry and somehow have access to one, please contact me at damage@techreport.com. I only need to borrow it for a few days to have a look at it. After that, I'll return it to you intact, and the world will have a truly independent review of this intriguing little video card.

As you may be aware, consumers would likely benefit from an independent review of this product, since we suspect AMD's public performance numbers don't reflect the experience of real gamers. We'd very much like to try out our frame-time-based performance testing methods on the Nano.
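Our frame-time approach, in a nutshell: instead of averaging frames per second, we log how long each individual frame takes to render and then look at the slow outliers, via metrics like the 99th-percentile frame time and the total time spent beyond a threshold. Here's a minimal sketch of those metrics in Python (illustrative only, not our actual tooling; the sample data and the 25-ms threshold in the usage note are invented):

```python
# Illustrative sketch of frame-time metrics (not TR's actual tooling).
# All sample numbers are invented for demonstration.

def frame_time_metrics(frame_times_ms, threshold_ms=50.0):
    """Summarize a log of per-frame render times in milliseconds."""
    n = len(frame_times_ms)
    ordered = sorted(frame_times_ms)
    # Average FPS reflects only total elapsed time; it hides stutter.
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # 99th-percentile frame time: all but the slowest ~1% of frames
    # finished at least this fast.
    p99_ms = ordered[min(n - 1, int(0.99 * n))]
    # Time spent beyond the threshold: total milliseconds by which slow
    # frames overshot it -- a direct measure of visible hitching.
    beyond_ms = sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)
    return {"avg_fps": avg_fps, "p99_ms": p99_ms, "ms_beyond_threshold": beyond_ms}

# Two invented runs with identical average FPS but different smoothness:
smooth = [20.0] * 100                  # steady 20 ms per frame
stuttery = [10.0] * 50 + [30.0] * 50   # same total time, uneven pacing
```

Both invented runs average 50 FPS, but against a 25-ms threshold the uneven run racks up 250 ms of "time beyond" while the steady one logs zero. That difference, which averages hide completely, is what frame-time metrics are designed to expose.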

Thanks!

Comments closed
    • USAFTW
    • 6 years ago

    It’s starting to look like a white list instead of a black list:
    [url<]http://www.techpowerup.com/mobile/215776/amd-radeon-r9-nano-review-by-tpu-not.html[/url<]

    I wonder how bad the supply issue really is.

    Edit: Sorry, just saw entropy13's post.

      • Klimax
      • 6 years ago

      Fortunately your post was the first I saw, or there would now be three links to it…

    • entropy13
    • 6 years ago

    TPU is not reviewing it either because, apparently, there are “too few review samples”.

    [url<]http://www.techpowerup.com/215776/amd-radeon-r9-nano-review-by-tpu-not.html[/url<]

    • LoneWolf15
    • 6 years ago

    As someone whose last half a dozen cards prior to my GTX 980 were AMD, I’ll just say this:

    If you paper-launch a card and then won’t release it for review to my favorite sites, I’m more than happy to “paper-buy” the card, and not purchase without those reviews.

    AMD, when you draw a gun, try not to point it at your junk, mmm’kay? You’re not hurting us or our tech sites; you’re hurting your own sales.

    • Terra_Nocuus
    • 6 years ago

    A quick search on Newegg showed a pair of mITX-sized GTX 970s from Gigabyte and Asus for ~$340 and $360; otherwise, you’re (probably) looking at a 750/750 Ti-based card. While the Nano would crush a 750, its price-vs-performance will be a hard sell compared to the 970.

    • freebird
    • 6 years ago

    Now I know why I visit this site less & less…

      • Firestarter
      • 6 years ago

      Because journalistic integrity is icky? Maybe you just don’t like fair reviews? Or is it that you only want to hang with the popular kids who suck up to everyone? Please do tell; we’re hanging on your every word.

        • maxxcool
        • 6 years ago

        Because it’s a $650 card with ludicrous pricing that would practically fall off the price/performance chart?

      • K-L-Waster
      • 6 years ago

      Because your paymasters in Sunnyvale told you not to?

      • NeelyCam
      • 6 years ago

      Because trolls and shills get banhammered here more often than on other sites…?

      • Meadows
      • 6 years ago

      Now I’d like to know why you do.

    • puppetworx
    • 6 years ago

    It’s like you want to lose customers AMD.

    • Unknown-Error
    • 6 years ago

    Don’t feel bad, Dr. Wasson. It looks like HardOCP also hasn’t received a sample.

    [url<]http://www.hardocp.com/news/2015/09/03/amd_refuses_to_sample_hardocp_its_new_nano/#.VeiL5FWqpBc[/url<]

      • LoneWolf15
      • 6 years ago

      Difference is, HOCP gets snarky about it. I like how professionalism is maintained here.

      [i<]"Today is just a paper launch as is AMD's new habit"[/i<]

        • ImSpartacus
        • 6 years ago

        I think there are many levels of “snark”. HOCP is pretty far up there, but TR isn’t positively angelic. If we’re looking to define the opposite side of the spectrum, I’d have to say that Anandtech is up there, at least when Anand was at the helm. I generally think TR is somewhere in the middle. It takes all kinds.

      • anubis44
      • 6 years ago

      Wow, HARDOCP? That site hates AMD. It’s hardly comforting to be in the same company as them.

    • HisDivineOrder
    • 6 years ago

    The bigger question is which review sites were “worthy” of getting a review unit, because then we can compare and contrast those reviews with the ones from sites that had to find their review units in the wild.

    • Damage
    • 6 years ago

    Looks like I’ll have a card to test early next week. I may not hit the Thursday launch date with a review, but I shouldn’t be too far off of it. Thanks to everyone for their support.

      • southrncomfortjm
      • 6 years ago

      Glad to see that TR will get to review this product. Nice work Damage!

      • K-L-Waster
      • 6 years ago

      Did AMD come to their senses, or is this a case of the community coming through when called upon?

        • Damage
        • 6 years ago

        This is a friend helping us out.

          • chuckula
          • 6 years ago

          THANK YOU FOR BEING A FRIEND!

          • flip-mode
          • 6 years ago

          Since you don’t have to honor any NDA, you could publish a couple early benchmarks! Oh please, please do this.

            • the
            • 6 years ago

            I’ll second this!

      • chuckula
      • 6 years ago

      Mission Accomplished!

      Go forth and Benchmark!

      • Mr Bill
      • 6 years ago

      1 vote for the ‘Ashes of the Singularity’ beta to be in the benchmarks. I want to see it get the real TR treatment.

        • A_Pickle
        • 6 years ago

        I don’t think that’s an unfair request.

      • Mikael33
      • 6 years ago

      Feel free to dedicate a page to the silliness of AMD not giving you a card to review.

      • maxxcool
      • 6 years ago

      Don’t feel bad; HardOCP was left with its junk in the breeze as well… I wear a big tin hat most of the time, but this smells extra fishy given the high-as-hell price…

        • Silus
        • 6 years ago

        TPU also isn’t getting one.

        AMD is just showing how low they can get…And they’re pretty much on the floor already…

      • A_Pickle
      • 6 years ago

      Any chance you might be able to include numbers using [url=https://techreport.com/review/28912/tiny-radeon-r9-nano-to-pack-a-wallop-at-650#perf<]AMD's graphics settings[/url<]? Or at 4K, with no anti-aliasing, but [i<]with[/i<] 16x anisotropic filtering? That might be interesting to see, and to get an idea of where the card's strengths are -- which might be helpful for consumers who are buying the card. I love the idea of the Fury Nano, it was the [i<]one[/i<] card from this AMD series launch that I was excited about. It hugely disappointed me that they weren't supplying you with a review sample, but there might be a simple, non-malicious intent behind their decision (like "it's a $650 card that we know isn't going to perform superbly well, so we'll take our chances on the market without [i<]more[/i<] negative press").

      • Mr Bill
      • 6 years ago

      I’d be curious if you can find out whether AMD will at some future point be participating in the new Exaflop GPU Supercomputer initiative that came out this July. I see that NVIDIA is in the program using their NVLink technology.
      [url<]http://blogs.nvidia.com/blog/2015/07/30/exaflop-supercomputer/[/url<]

      Is this part of what is motivating AMD to develop GPU virtualization?

      Edit: I found this on AMD's page...
      [url<]http://www.amd.com/en-us/press-releases/Pages/extreme-scale-hpc-2014nov14.aspx[/url<]

      Edit2: and this...
      [url<]http://www.hpcwire.com/2015/07/29/amds-exascale-strategy-hinges-on-heterogeneity/[/url<]

      • Wonders
      • 5 years ago

      Any further word on Nano review prospects?

    • Klimax
    • 6 years ago

    Question: Have Intel or NVidia ever done such a thing? I don’t remember, but that doesn’t say much…

      • Nevermind
      • 6 years ago

      What, declined to give someone a free card, exactly when they expected it?

      • rxc6
      • 6 years ago

      NVidia definitely did it before. When the 5800 FX came out, it was hopeless and couldn’t compete with the 9700 Pro. Damage bought one himself. I consider that worse because it was a flagship.

        • Klimax
        • 6 years ago

        Could have linked it…
        [url<]https://techreport.com/review/4966/nvidia-geforce-fx-5800-ultra-gpu[/url<]

        7th paragraph.

          • K-L-Waster
          • 6 years ago

          Oh Lordie, do those heat sink shrouds look primitive…

          • chΒ΅ck
          • 6 years ago

          wowee that article makes me feel old

      • chuckula
      • 6 years ago

      Intel was pretty late with the 5775C sample, although it was more a matter of Intel being late with the 5775C in general than Intel making a political decision to deny TR a sample because it didn’t like TR’s reporting.

        • Ninjitsu
        • 6 years ago

        They did prioritize AT and THG over TR, though.

          • chuckula
          • 6 years ago

          That’s true. I think AT was actually at the event where Intel did the launch and they got one of the very rare samples early. THG is owned by the same outfit that owns AT, so they probably used the exact same sample.

      • Unknown-Error
      • 6 years ago

      nVidia and Semiaccurate have a very hostile relationship.

        • Leader952
        • 6 years ago

        Charlie Demerjian has a hostile agenda against Nvidia going all the way back to when he was with “The Inquirer”. He started Semiaccurate to continue his agenda when “The Inquirer” let him go.

        Nvidia couldn’t care less about Charlie or Semiaccurate.

        A Charlie Nvidia classic :

        [quote<]5-11-2010
        [url<]http://www.semiaccurate.com/forums/showpost.php?p=48497&postcount=10[/url<]

        I would ask the question in a more general sense. Will GPUs exist in 5 years. The answer there would be no. The low end dies this year, or at least starts to do a PeeWee Herman at the end of Buffy the Vampire Slayer (the movie, not the show). There goes the volume. 2012 sees the same happening for the high end. The middle isn't enough to sustain NV. They have 2 years to make compute and widgets profitable. Good luck there guys.

        -Charlie[/quote<]

        Charlie's five-year prediction ended this past May, and discrete GPUs are not only still around but a great strength for Nvidia, which has an 81.9% market share.

      • raddude9
      • 6 years ago

      Most tech companies release their fastest/flagship products first and quietly bring out slower products afterwards, hoping that the lesser products will be able to bask under the same halo as the flagship in the eyes of the consumer.

      Most people fall for this. For example, I read on some website the other day that the new Intel Core i3-6320 was proclaimed the “Gamer’s New Best Friend,” while I can’t even find a single proper review of the chip anywhere! Generally speaking, this “flagship worship” is not a bad rule of thumb, but I like to see the numbers for these “lesser” products before I part with my hard-earned cash.

    • MrJP
    • 6 years ago

    So now I dislike nVidia [b<]and[/b<] I dislike AMD. What do I do now when I want a new graphics card?

      • Klimax
      • 6 years ago

      Get Intel? πŸ˜€

      Or better yet Matrox or S3…

        • EndlessWaves
        • 6 years ago

        Except that, despite being mentioned on the launch slides for Skylake-H, Intel doesn’t seem to have even bothered to release its top GT4e graphics.

        • One Sick Puppy
        • 6 years ago

        3DFX is eternal.

        • Beelzebubba9
        • 6 years ago

        Funny thing is that between the relative maturity of their Gen9 core and the packaging/MCDRAM work they’ve done for Knights Landing at 14nm, Intel already has a lot of the tech in place to make a high-end discrete GPU.

        There’s no real chance of this happening, but it’d be cool to have a 3rd viable option regardless.

      • Mark_GB
      • 6 years ago

      Intel cometh… Sometime in the 22nd century…

      • K-L-Waster
      • 6 years ago

      Get a used card – sure, it was built by the two companies you don’t like, but they don’t get the revenue πŸ™‚

      • HisDivineOrder
      • 6 years ago

      Seems like this offers you the freedom to just buy whatever card is the best instead of worrying about which company you like. That really should have been your mindset from the beginning, but now, hey, better late than never.

        • K-L-Waster
        • 6 years ago

        ^^This^^

        Really should be making purchasing decisions based on how the product meets your needs and budget, NOT on whose logo is on the disposable packaging.

        • MrJP
        • 6 years ago

        I knew I should have included a “joking” tag…

      • the
      • 6 years ago

      Get a Caustic card from Imaginationtech.

    • anotherengineer
    • 6 years ago

    Keep checking Techpowerup for a Nano BIOS, and flash it to a Fury X, retest, done and done πŸ˜‰

    Edit – can you test this nano also??
    [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16814202151[/url<]

      • DPete27
      • 6 years ago

      Correct me if I’m wrong, but isn’t the nano just a downclocked Fury X?

      This whole situation just stinks of corporate manipulation, and it’s not the first time AMD has done it (what APU was it that they delayed the NDA on CPU benchmarks?).

        • brucethemoose
        • 6 years ago

        The Nano has a different PCB, and it supposedly uses binned Fiji chips that are more efficient at lower clockspeeds.

        • NoOne ButMe
        • 6 years ago

        It’s a full Fiji like the Fury X, except air-cooled. It runs at up to 1000MHz as long as board power stays under 175W.

        In some rare cases it will be as fast as the Fury X; in most, my guess is it will be clocked at 800(?) to 900(?)MHz.

        • BobbinThreadbare
        • 6 years ago

        It’s designed to be power-limited instead of clock-speed-limited. It will reduce clock speeds to make sure it never draws more than 175 watts.
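The power-capping behavior described in this thread can be sketched as a toy control loop. Only the 175W cap and the 1000MHz ceiling come from the comments above; the quadratic power model, the hypothetical 220W full-load draw at max clock, and the 25MHz step are invented purely for illustration. This is not AMD's actual algorithm:

```python
# Toy model of a power-limited GPU governor: run as fast as 1000 MHz,
# but shed clock speed whenever estimated power would exceed 175 W.
# The power model and step size are invented purely for illustration.

POWER_CAP_W = 175.0     # cap reported in the comments above
MAX_CLOCK_MHZ = 1000.0  # peak clock reported in the comments above
STEP_MHZ = 25.0         # hypothetical clock step

def estimated_power(clock_mhz, load):
    """Fake model: power scales with load and roughly with clock squared."""
    return 220.0 * load * (clock_mhz / MAX_CLOCK_MHZ) ** 2

def governor_clock(load):
    """Highest clock (in STEP_MHZ steps) that stays under the power cap."""
    clock = MAX_CLOCK_MHZ
    while clock > 0 and estimated_power(clock, load) > POWER_CAP_W:
        clock -= STEP_MHZ
    return clock
```

Under this made-up model, a light load runs at the full 1000MHz while a heavy load settles at whatever lower clock keeps estimated power under the cap, which matches the behavior the comments describe: the same chip, just lower sustained clocks when the work gets heavy.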

    • mtruchado
    • 6 years ago

    This is a silly situation. Sooner or later you will get a card, and they know it.

      • Klimax
      • 6 years ago

      Stalling, pure and simple. The longer honest reviews take to surface, the longer they can evade the truth and keep selling the card to the ignorant.

        • Nevermind
        • 6 years ago

        “The truth” lol. You don’t know that yet. You can speculate, sure.

        What if the card isn’t quite ready? What if they’re working on a last-minute issue?
        What if? You don’t know.

        All you know is TR didn’t get a card when they thought they would.

          • mtruchado
          • 6 years ago

          This is not the first time I’ve seen the TR guys do a review on a prototype. They MENTION it, and they describe what it looks like AS A FIRST IMPRESSION, always noting that the product in their hands isn’t production-ready yet. Another possibility is to give them the final product in time. But no answer at all is a no-go.

          • Klimax
          • 6 years ago

          Sorry, but the benefit of the doubt is long gone, be it because of stupidity like the Mantle or Fury X launches or the continued bullshit about competitors’ technologies.

    • beck2448
    • 6 years ago

    TR is one of the most objective sites out there. They go the extra mile to give consumers the most accurate data possible. If this is upsetting for any company, that would seem to be a red flag.

    • Chrispy_
    • 6 years ago

    You haven’t been provided a review sample because you dared to take issue with AMD’s choice of unfiltered, aliased, previous-millennium graphics settings in a 2015 product launch.

    HOW DARE YOU TEST GAMES WITH GRAPHICS QUALITY PRESETS, OR SETTINGS THAT PEOPLE ACTUALLY USE! THAT IS UNACCEPTABLE, SCOTT πŸ™

    • Ryhadar
    • 6 years ago

    Just curious: How many people were seriously considering buying one of these at $650?

      • llisandro
      • 6 years ago

      Wait, we’re not all gaming at 4K exclusively on a single beta DX12 title, in a mITX case?

    • kilkennycat
    • 6 years ago

    Further bad news for Fury owners.

    I ran the Shadow of Mordor benchmark yesterday on a GTX 780 Ti with ALL graphics maxed (including textures) and at the 4K setting (3840×2160). While the benchmark was running, I checked GPU memory usage via the on-screen display in the latest iteration of eVGA Precision. GPU memory consumption periodically exceeded 4GB (around 4.3GB max).

    Really a bit sad about that 4GB limit on the current iteration of the Fury products. A very stupid decision by AMD to limit the memory to 4GB on a product claiming to be a superb 4K UHD gaming performer; probably because the packaging complexity of adding another 2GB of memory would have made the product financially non-competitive. Wait for an upgraded-memory version of Fury to properly compete with the GTX 780 Ti.

    BTW, running the benchmark as set above with a single GTX 780 Ti did not result in particularly stellar frame rates: min 25, max 50, average 40. However, with G-Sync turned on, the motion was perfectly smooth; no tearing, no judder.

    As for the AMD vs. nVidia DX12 issues, there isn’t enough data yet; we need more DX12 benchmarks on the major game engines. By the time DX12 becomes important for PC gamers, at least a year will have passed and a new generation of video cards will be with us anyway. Remember that DX11 (and DX9) will continue to be supported in NEW games for a very long time due to the massive volume of legacy GPU hardware.

      • NTMBK
      • 6 years ago

      If only Scott had already done an in-depth article explaining why that isn’t a good measure of VRAM requirements…
      [url<]https://techreport.com/blog/28800/how-much-video-memory-is-enough[/url<]

      EDIT: Sorry, that came off more sarcastic than intended. Basically, the engine/OS caches things in VRAM even when they aren't strictly needed, so just watching memory occupancy isn't a good method. You need to really get hands-on and see how it performs.

      Looking forward to HBM2, when this problem goes away.

      • Ninjitsu
      • 6 years ago

      4K you say? Not a problem for most, then. πŸ™‚

      (And Scott’s article showed that it doesn’t make a difference anyway).

    • tsk
    • 6 years ago

    Linus to the rescue!

    • Unknown-Error
    • 6 years ago

    Is this limited to TechReport, or has AMD failed to provide samples to other sites as well? Either AMD is really, really broke, or the Nano is really, really $h!++y, or both.

      • kmm
      • 6 years ago

      As noted earlier, HardOCP isn’t getting one either.

      And announced Friday, TechPowerUp isn’t in on the party either. TPU has very straightforward reviews with relatively little analysis and architecture explanations, letting the press decks do most of the explaining there. (They test a lot of games albeit just for fps, and have the best setup for testing board power consumption and acoustics, so they let the numbers do the talking.)

      In other words, I don’t think they’re targeted for exclusion here based on negative review content or anything along those lines.

      I saw at least one other site post pictures of their card already in for testing, so it’s out there. Just in limited quantities to certain reviewers based on… some criteria.

    • xand
    • 6 years ago

    I’m so glad I didn’t buy dual Fury X cards now. AMD should DIAF. (I’m keeping it classy).

    • Jigar
    • 6 years ago

    Genuine query: why has TR never posted any news related to AoS? Every single website on the internet covered it, but TR never did. What happened, guys?

      • anubis44
      • 6 years ago

      Yes, exactly. What is going on with this website? The bombshell revelation about Maxwell and Kepler not having any hardware support for asynchronous shaders is everywhere… except here. nVidia has just been hit with a major broadside in DX12, and you guys are still sniggering about AMD’s latest products like they are sub-standard. Maxwell looks very much like it has just been rendered obsolete, and you’re still recommending it without reservation?

      Read this guy’s posts. He’s a former AMD GPU engineer, and he explains in excellent detail what this means for DX12:

      [url<]http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/450#post_24323663[/url<]

        • f0d
        • 6 years ago

        amd themselves have said no gpu out today supports all of dx12’s features; async shaders/compute is one aspect that nvidia doesn’t support very well (only 31 queues vs. 64), but there are also aspects of dx12 that amd doesn’t have any hardware support for either

        [url<]http://www.guru3d.com/news-story/amd-there-is-no-such-thing-as-full-support-for-dx12-today.html[/url<]

        amd fanboys, always so dramatic

          • anubis44
          • 6 years ago

          Asynchronous shaders are by FAR and AWAY more important than the other features of DX12. They are the equivalent of hyper-threading for the GPU. If you want to argue this, then read this first before you make yourself look silly:

          [url<]http://www.technobuffalo.com/2015/04/04/async-shaders-will-allow-gpus-to-live-up-to-their-full-potential-says-amd/[/url<]

            • Redocbew
            • 6 years ago

            You are by FAR and AWAY the most important fanboy here. Interesting.

            • f0d
            • 6 years ago

            “says amd”

        • K-L-Waster
        • 6 years ago

        Soooo Project Cars proves Nvidia are evil manipulative SoBs, but AoS proves AMD are the awesome protectors of open standards gaming?

        Please look up the term “double standard.”

      • Damage
      • 6 years ago

      We weren’t on the list to receive an advance copy of the benchmark. I have since purchased the game and have done some testing, but I’m not yet finished and ready to publish results.

        • Jigar
        • 6 years ago

        Thanks, I am looking forward to your findings.

      • AmazighQ
      • 6 years ago

      Some people are smart enough not to step into the cesspool that is AoS and ACE.
      You will get a lot of uninformed, biased, and crappy comments, with fanboyism radiation going off the charts.
      And at the end of the day, you will get an Nvidia card to review ;).

        • Redocbew
        • 6 years ago

        I think it’s funny that asynchronous shading is getting compared to SMT. It’s not without reason since there are some similarities, but we’ve all seen how variable the benefit gained from SMT can be depending on the application, and yet a lot of people are still going bonkers about it.

    • ronch
    • 6 years ago

    AMD is getting more and more shady and suspicious by the minute.

    Selecting weird graphics settings to make their cards look better? Selecting tech sites that will give them favorable reviews?

    I like AMD as much as the next guy and I love rooting for the underdog. But if that dog turns into a weasel I don’t think it’s right to keep rooting for them. Yes, I know lots of other companies do this but I won’t root for them either.

    Come on, AMD. Fire your entire PR and marketing department. They are a cancer eating your company from within.

      • AmazighQ
      • 6 years ago

      They always say: YOU LEARN FROM THE BEST. And as I recall, the best is NVIDIA at the moment.
      If being the “nice guy” does not cut it, I suppose using the same tactics as that guy will.

    • derFunkenstein
    • 6 years ago

    I know that availability has nothing to do with what people are interested in, but since you can barely buy any Fiji chips, it could just be that AMD is sending 100% of the Fury Nano cards it can produce to sites it knows are sympathetic.

      • LostCat
      • 6 years ago

      It could also just be that the product is too niche for anyone outside of website reviewers to actually care. heh.

        • derFunkenstein
        • 6 years ago

        Yeah, maybe. See, I care, because I like seeing the very best out of hardware vendors, but I don’t care enough to spend that kind of money on my own system. If that makes any sense.

      • sleepygeepy
      • 6 years ago

      I believe this is the real reason. They probably only had a small number of cards available, and they had to carefully choose which sites would give the best possible outcome for them.

      Maybe they are allotting most of their good Fiji chips to the R9 Fury X instead of the R9 Nano, as the latter is a niche product?

      Edit: If a site receives an R9 NANO sample for review… what kind of instructions does AMD specify in reviewing the product?

      • Convert
      • 6 years ago

      This is exactly what I thought when I first read this.

      Seems like it’s being blown a bit out of proportion. My bet is that quantities are so low that only a few sites even got one, and who knows why they were chosen. One would really have to know how many were sent out, and to whom, to draw any conclusions.

    • DrDillyBar
    • 6 years ago

    AMD’s Radeon. For some reason, my brain still ‘thinks’ ATI.
    Maybe that’s why I was surprised this happened.
    *blank look*

    • Krogoth
    • 6 years ago

    Methinks the Nano is the new “Radeon X800 XT Phantom Edition.”

    • Laykun
    • 6 years ago

    This doesn’t sound good, no matter which way you interpret it.

    • snook
    • 6 years ago

    first the DX12 AoS benchmark and now the nano.

    seems you may have called AMD’s mom ugly and its children sub-70-I.Q.ers.

    chin up, sport. you still are my favorite tech site.

    • ultima_trev
    • 6 years ago

    I wonder if anyone else is going to be excluded from receiving an R9 Nano as well. My assumption at this point is that they are extremely supply-limited and therefore only sending samples to the AMD shill sites for the best possible reviews. That said, I wouldn’t be surprised to see [H]ardOCP denied a sample either, as they tore the Fury X a new one.

    It would be better if AMD could just compete on price rather than chase halo-product status… Especially as the Fury X is nowhere close to the Titan X when it comes to raw game performance at 1080p and under, the only resolutions that are relevant at this point. 1440p and 4K are not going to be mainstream for years to come.

      • anubis44
      • 6 years ago

      “Especially as the Fury X is nowhere close to the Titan X when it comes to raw game performance at 1080p and under, the only resolutions that are relevant at this point.”

      In DX11 or DX12? Please specify.

        • pranav0091
        • 6 years ago

        How many DX12 games have been reviewed by any independent authority so far?

        Running a couple of feature tests from a single game and “confirming” performance is silly, particularly when it’s so early in the timeline of DX12 that the drivers are far from where they should be. If anything, you should be amazed at how good Nvidia’s DX11 drivers are (so much so that they are sometimes faster than the current DX12 numbers), but it’s just too early for DX12 numbers to be taken seriously.

        Ever stacked up the spec numbers of Hawaii and the GM204? Or the Fury X and the 980 Ti? How is it that the numerically “lighter-spec” Nvidia card beats the “heavier-spec” AMD card every time?

        If you have a car, adding 4 more wheels to it isn’t necessarily going to make it faster. Balance and clever architecture are the key to performance, far more so than the sheer number of flops you’ve got. Sometimes you need a more powerful engine, sometimes more tires, and sometimes just a better [i<]driver[/i<]. <I work at Nvidia, but my opinions are only personal>

    • chuckula
    • 6 years ago

    In addition to outrage at AMD (see other post) I’m also confused. If anything, TR’s testing methodology went out of its way to be [i<][b<]favorable[/b<][/i<] to the Furry X, since the large majority of tests were run at 4K, where the Furry X has its smallest performance delta to the GTX 980 Ti. Hell, there were numerous complaints about the fact that too many benchmarks focused on 4K, where the Furry X is at its best*.

    I haven't done a deeper analysis, but if the usual conspiracy whackos who think that TR's entire existence is dedicated to attacking AMD were the same ones screaming about too many 4K benchmarks, then I'd get an even bigger laugh at their expense.

    On top of that, TR bent over backwards to kick Project Cars out of its benchmark results just to placate the AMD fanbase†. And this is how TR gets "rewarded" by AMD?

    * The 980 Ti still won most of the time, but the delta was smaller at 4K.

    † Interesting how Far Cry 4, which was pretty favorable to AMD, wasn't the subject of a huge conspiracy rant by hundreds of pro-Nvidia posters. Interesting how there wasn't a giant conspiracy campaign to remove Far Cry 4 from the benchmark results. Funny how "fairness" is a word that has come to mean "AMD better be made to win or else NOT FAIR."

      • snook
      • 6 years ago

      lacks logic, but sure bro.

        • Klimax
        • 6 years ago

          Says more about you than him…

          • snook
          • 6 years ago

          not likely

            • Klimax
            • 6 years ago

            For one, you have definitely not demonstrated where the lack of logic is… not that you can.

            • A_Pickle
            • 6 years ago

            AMD fanboys think they’re doing AMD a favor by failing to recognize that their brand has a problem…

            • Klimax
            • 6 years ago

            Evidence of thinking?

      • f0d
      • 6 years ago

      what i find funny is that most people on the AMD reddit think TR is biased AGAINST AMD, despite TR doing a fair bit to show AMD in a positive light.

      example: [url<]https://www.reddit.com/r/AdvancedMicroDevices/comments/3cxdgg/regarding_the_techreport_review_of_fury_and_4k/[/url<]

      EVERYONE IS AGAINST AMD - IT’S A CONSPIRACY

        • Jigar
        • 6 years ago

        Something doesn’t add up. Every website showed AMD’s Fury losing against the 980 Ti, so why did TR get the blame for being biased???

      • NoOne ButMe
      • 6 years ago

      it wasn’t a conspiracy because AMD doesn’t have anti-competitive practices that put technology in games that hurts both sides’ experience but hurts AMD’s more.

      this very own site did a great example of it…
      [url<]https://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing[/url<]

      Look at how HairWorks was implemented, pretty much every Batman game involving GameWorks, etc. On AMD's side, the last case I recall was TressFX in Tomb Raider, in which AMD open-sourced the code and handed it to Nvidia. Nvidia then patched it, and in the last comparison I saw (http://www.hardocp.com/article/2013/03/20/tomb_raider_video_card_performance_iq_review/6#.VefCmflVhBc) Nvidia loses LESS performance from TressFX.

      NVidia plays dirty, so people get mad when it plays dirty. While no big corporation is “good” or “clean” AMD is leagues ahead of NVidia and Intel when it comes to influencing games/compilers/etc. I know this was done on purpose when Ruiz was there. Unsure if they are trying now, and just failing.

      [b<]now, Project Cars was (sadly) one of the "best" implementations of an Nvidia technology in a game recently[/b<] due to it not taking a large chunk of performance off of Nvidia's own cards, like a lot of the things they add tend to do.

        • Waco
        • 6 years ago

        Don’t mistake laziness for evil.

        • f0d
        • 6 years ago

        project cars is one case where it is amd’s own driver incompetence that’s hurting performance

        the project cars devs have said that its a amd driver issue and that the physx implementation is “light” and isnt accelerated by the GPU (and i tested this myself by disabling gpu acceleration and had zero impact on performance)

        [url<]http://www.pcgamer.com/project-cars-studio-denies-it-intentionally-crippled-performance-on-amd-cards/[/url<] [url<]https://steamcommunity.com/app/234630/discussions/0/613957600528900678/[/url<] [quote<] We've provided AMD with 20 keys for game testing as they work on the driver side. But you only have to look at the lesser hardware in the consoles to see how optimised we are on AMD based chips. What can I say but he should take better stock of what's happening in his company. We're reaching out to AMD with all of our efforts. We've provided them 20 keys as I say. They were invited to work with us for years. Looking through company mails the last I can see they (AMD) talked to us was October of last year. Categorically, Nvidia have not paid us a penny. They have though been very forthcoming with support and co-marketing work at their instigation. We've had emails back and forth with them yesterday also. I reiterate that this is mainly a driver issue but we'll obviously do anything we can from our side. Some great gains we saw from an earlier driver they released have been lost in a later driver they released. So I'd say driver is where we start. Again, if there's anything we can do we will. [/quote<] its funny how its a "conspiracy" when amd's performance is down for a certain game but when it comes to amd having a performance advantage its because "amd are better" like in AoS benchmark

          • NoOne ButMe
          • 6 years ago

          What negative impact has AMD had on Nvidia in that game? As far as anyone can tell, none.
          Who has control over the biggest CPU "hog" in pCars? NVidia, via PhysX.

          [b<]Here AMD clearly does share a sizable portion of the blame. At the same time, *some* (not conclusive) people who played from the very start said they were performing fine on AMD at one point and it just tanked. Also, saying "has not paid us a penny" instead of saying NVidia hasn't given monetary compensation of any kind makes it a bit pointless.[/b<] Like any corporate doublespeak, it only means exactly what it says: NVidia did not give Project Cars any money. [u<]I've got over a dozen years working around very successful advertising firms, and that's a very obvious wording distinction.[/u<]

          They couldn't make any statement like the AoS developer's: [i<]"We only have two requirements for implementing vendor optimizations: We require that it not be a loss for other hardware implementations, and we require that it doesn't move the engine architecture backward (that is, we are not jeopardizing the future for the present)."[/i<] Hm? They've got no claims. We'll see how their DirectX 12 implementation is. I'm already guessing (and hope it's not what I think it is).

          AMD has tended to bring features such as TressFX (originally bad for NVidia, quickly open-sourced), Mantle (no negative impact on NVidia), and async compute (no negative impact on NVidia). Nvidia has mostly brought tessellation, to degrees that hurt both vendors but hurt AMD more. And they've done it a lot. The only time it's gotten fixed is when the developer of the game stepped in, or when AMD was forced to write driver code to restrict tessellation limits.

          Why should I expect anything from NVidia except code designed to hurt everyone? That's what they've done time and time again: Crysis 2, Batman: Arkham Origins, The Witcher 3, and probably Arkham Knight. I'm sure there are quite a few more; I spent about 3 minutes making that list. The over-tessellation in these games in most cases DID NOT give better quality. What it did was drive down performance by X on NVidia cards and by greater than X on AMD cards.

        • Klimax
        • 6 years ago

        You are wrong and ignorant on all counts from here to Pluto, and are just reposting nonsense, AMD PR, or plain fanboy idiocies.

        Like that stuff about NVidia's code. It is just wrong and never was correct.
        See and educate yourself out of ignorance:
        [url<]http://www.hardocp.com/article/2015/08/25/witcher_3_wild_hunt_gameplay_performance_review/8[/url<]

        And stuff all that nonsense where the sun doesn't shine! Be it about Project CARS, HairWorks or any other GameWorks technology...

      • namae nanka
      • 6 years ago

      Yes, 4K did play into AMD's 'strengths,' but it didn't matter much considering how the Fury cards still lost. You can see my 'conspiracy theory' reddit thread in one of the replies to your comment.

      Project Cars was being compared to the removal of Dirt Showdown, a game that favored AMD massively.

      As for Far Cry 4, it's quite mundane really.

      [url<]http://www.geforce.com/whats-new/articles/far-cry-4-nvidia-gameworks-trailer[/url<]

      It wasn't really that favorable to AMD. And the Fury X usually wins at 4K; at Tom's, the 980 Ti only pulls ahead in 2 games. Of course, the 980 Ti is still the better card to get unless async compute really shakes things up.

      • HERETIC
      • 6 years ago

      These days, sites that want to continue to receive products are expected to
      "do an HONEST review, but with a POSITIVE SPIN on the product."
      It is not unusual to see other sites changing their test procedures to make products
      look good and keep their advertisers happy...

      • Ninjitsu
      • 6 years ago

      My complaint with 4K was the general irrelevance of the resolution to over 99% of Steam users and a similarly large percentage of TR readers. It had nothing to do with AMD-Nvidia performance deltas being smaller.

      • jihadjoe
      • 6 years ago

      lol you should see what /r/amd has to say about TR and Damage.

    • PrincipalSkinner
    • 6 years ago

    Oh AMD. I loved you once. But now you’re just stupid.

    • NoOne ButMe
    • 6 years ago

    As edited by someone else for clarity (I agree with his edit); additional comments in brackets.
    TR is quite reasonably balanced. If I were to go only off the opinions of "pro-AMD" sites, TR would be the most anti-AMD and biased.

    Those "pro-AMD" site users seem to forget great unbiased articles like the 680 review, or that TR's work helped AMD find, fix, and boost CrossFire performance.

    I think including Project Cars is definitely something to be dropped. But on the whole, the benchmarks [and general points of view] are significantly more balanced than those of comparable tech sites.

    [s<]EDIT[/s<][original post]: (Rewrote my [s<]opening sentence[/s<] [post]. The original versions were as follows.)

    First edit: While I feel this site is fair in the vast majority of cases, when visiting places (forums, etc.) that are "pro-AMD," this site appears to be considered one of the worst in the "cr*p" it spreads. I guess they forget things like, well... basically every site seemed to say that the 680 "destroyed" the 7970 (which it didn't), and this was one of the few sites which didn't say that. Or the fact that its reviews have caused very good things, like AMD's great crossfire lag reduction. There have been some things I consider misses (*REVIEWING WITH PROJECT CARS*, to name one of them), but there is far less "bad" or "mistake" stuff here than on most similar tech-focused sites I visit.

    Original post: Well, I feel this site is fair in the vast majority of cases, but when I visit places that are far more pro-AMD in nature, well... this site seems to rank as one of the worst of the worst. To my amazement.

      • flip-mode
      • 6 years ago

      Can you do a quick edit to make that intelligible?

        • NoOne ButMe
        • 6 years ago

        A bit better? =]

          • DancinJack
          • 6 years ago

          nope

          • Klimax
          • 6 years ago

          Worse, it made your bias and fanboyism much more blatant...

          • auxy
          • 6 years ago

          Your post is pretty incoherent. I realize English is not your native language, but what were you even trying to say?

          At first and at the end it seems you’re being negative about TechReport but in the middle two paragraphs you’re praising their impartiality. So which one is it?

          I feel like you might have been trying to be facetious and failing. (Β΄Π”βŠ‚γƒ½

            • NoOne ButMe
            • 6 years ago

            No, I am not. I suffer from multiple executive-functioning issues which make it very hard to word my statements in ways that are clear to understand. This is something I work on, but it can be quite hard to write around, as I can only try to read my own text as if I didn't already know what is in my head.

            While the post is not necessarily cleanly written and has drawn-out statements/thoughts, it still says the same thing.

            The first sentence is explaining "why" AMD did not give them the card, using the only evidence I have: that people on forums I visit that are more pro-AMD or anti-NVidia (choose) seem to think that this site is one of the worst with regard to how it treats AMD.

            Honestly, I'm trying to understand how you read the first part differently, and I cannot see how. I very clearly state [b<]my opinion[/b<] and also state the opinion of the "forums", with "I" visiting them and "this site appears to be considered one of the worst" implied when visiting those forums. If you can explain how you read it, that would help me. Thank you.

            • auxy
            • 6 years ago

            Oh, I see. It’s clearer now with green’s revisions. Basically, you were saying that while some people or some sites may regard TR as being “anti-AMD”, they lack context and forget that TR is one of the more unbiased news sites out there.

            Yeah, from the middle of your post I really didn’t get a negative vibe. Shame you got downvoted early on, because you’re right. (‘Ο‰’)

      • green
      • 6 years ago

      I've given a thumbs up to counteract the downs.
      I think the thumbs down are due to a misinterpretation of what is being written.

      As best as I can tell, it's supposed to read along the lines of:
      [quote<]TR is quite reasonably balanced. If I was to only go off the opinions of "pro-AMD" sites TR would be the most anti-AMD and biased.

      Those "pro-AMD" site users seem to forget great unbiased articles like the 680 review. Or that one of TRs helped AMD find, fix and boost crossfire performance.

      I think including Project Cars is definitely something to be dropped. But on the whole benchmarks are significantly more balanced than comparable tech site.

      EDIT: (Rewrote my opening sentence. The original version was as follows) TR is reasonably balanced when I visit compared to pro-AMD sites. TR seems to rank as the one of worst. It amazes me.[/quote<]

      Overall, showing appreciation for the comparative balance TR presents. The post is just not very well written is all.

    • flip-mode
    • 6 years ago

    Well, AMD just lost any last shred of good will I had for them.

      • anubis44
      • 6 years ago

      You mean because they’ve included better DX12 support baked into their GCN designs since 2011? Yeah, that’s pretty bad-faith dealing there by AMD, giving their products longevity and supporting open standards. What a bunch of goons. Now, if only they’d try to lock in their customers with proprietary standards like G-Sync and PhysX, or market a card as having 4 gigabytes, but in reality…

      nVidia lost my good will when:

      1) They sold me a GPU in a laptop that got fried during the bumpgate scandal, and gave me diddly-squat as compensation because I live in Canada.
      2) The GTX 670 I actually bought, despite my misgivings, couldn't drive a three-monitor setup without a 19-step procedure that involved manually fixing the default horizontal sync rate and plugging in one monitor at a time, and the whole process had to be repeated every single time I updated the driver. After about a month, I finally had it with nVidia's vaunted 'superior drivers,' sold the 2GB 670, and picked up a 3GB 7950 for $100 less, which brought up the three monitors without a hitch using supposedly 'inferior' drivers.
      3) They continue to do everything in their power to lock people into their ecosystem (PhysX, G-Sync), and nVidia people don't really seem to give any thought to just why they might be doing this. Now that there's some suspicion that their beloved nVidia cards just MIGHT be inferior to GCN Radeons in DX12:

      [url<]http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/450#post_24323663[/url<]

      I'll bet that the $200 extra spent on a G-Sync monitor, which nVidia won't allow to work with an AMD card, just might be starting to feel a little bit like a rip-off.

      4) They seeded every tech site with GTX 970s, drove demand like crazy, and then waited 4 months to let anyone know about the 'special' memory on it: [url<]https://www.youtube.com/watch?t=111&v=tNGi06cq_pQ[/url<]
      5) They rushed out Maxwell and sold everybody a shed-load of cards as soon as they realized Mantle had forced Microsoft's hand with DX12, and saw that they had a next-generation design lacking major features that DX12 was going to be able to use.

      nVidia is a company that has shown far more than mere competitive zeal. They’ve demonstrated an utterly contemptible disregard for their own customers time and time again. If me observing this makes me a ‘fan boy’ in somebody’s eyes, then maybe that person should ask themselves how badly nVidia would need to screw them over before they’d acknowledge the possibility that the company is unscrupulous.

        • chuckula
        • 6 years ago

        Yeah, so AMD is so amazingly perfect and their products are superawesome perfect amazing.

        Thanks for posting the copy-n-pasted wall o’ text.

        Now answer one simple question without going into a conspiracy theory about Nvidia being so bad: If the Nano is so superawesome perfect amazing, then why isn’t AMD eager to actually get it reviewed by professionals?

          • anubis44
          • 6 years ago

          “Yeah, so AMD is so amazingly perfect and their products are superawesome perfect amazing.”

          No, AMD’s not perfect, they’re just not a bunch of deceptive twats who are trying to rip people off by straight up deceiving them about their products:

          [url<]https://www.youtube.com/watch?v=tNGi06cq_pQ[/url<] (Yes, they should include some Preparation-H with every GTX 970.)

          ...or locking them into proprietary standards, like G-Sync. If you want to bend over and pay an extra $200 for a vertical-sync monitor, please be my guest. I'm just not going to do it. Nor am I going to overlook the fact that AMD has had to fight to push gaming forward, despite the foot-dragging of Microsoft and nVidia.

          The fact that nVidia's most recent design doesn't even have hardware asynchronous shaders, when a 2011-era 7950 does, shows me how surprised nVidia was that AMD forced Microsoft's hand with a Mantle-like DX12. nVidia was determined to sell obsolete hardware to people for as long as possible, and nVidia apologists like you are helping them do it by being in denial about this situation. Just admit it: nVidia was caught with their pants down, and they've sold a whole bunch of people cards that can't run DX12 as well as AMD Radeons. It's a simple thing to say.

        • HisDivineOrder
        • 6 years ago

        Nowhere did the OP say a single word about nVidia. Why are you talking about nVidia? He’s talking about his problem with AMD and there you are, off in the wilds of your imagined sleight, not addressing his point.

        If you want to discuss why you hate nVidia, I suggest you find an nVidia article to respond to or at least someone that mentioned nVidia in their post.

          • chuckula
          • 6 years ago

          Something tells me he’s incapable of living in a world without Nvidia to hate. In fact, I don’t think he even cares about AMD beyond the fact that AMD isn’t Nvidia.

          Obligatory Star Trek reference: [url<]https://www.youtube.com/watch?v=o7MQrL_ABE0&oref=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3Do7MQrL_ABE0&has_verified=1[/url<]

        • maxxcool
        • 6 years ago

        Havok = proprietary AMD

        • maxxcool
        • 6 years ago

        Canada has zero to do with NV; it's everything to do with the US legal system.

        • maxxcool
        • 6 years ago

        As for unscrupulous: fake 8-core CPUs, for 4 years. False advertising everywhere.

        The rest is mostly spot-on, though, except for the three monitors. Bumpgate should have netted a new CEO for sure...

      • wimpishsundew
      • 6 years ago

      “AMD just lost any last shred of good will I had for them.”

      Relative to who? NVDA?

        • chuckula
        • 6 years ago

        It’s entirely possible to have no respect for either company.

        Get out of your Hollywood good guize vs. bad guize mindset and remember that it’s possible for BOTH sides to be the bad guize.

          • K-L-Waster
          • 6 years ago

          … or for BOTH to be profit-seeking entities with no shortage of regular ol’ incompetent bozos in their employ….

          • DrDominodog51
          • 6 years ago

          You are right. However in this case one guy is [i<]slightly[/i<] better than the other. Only slightly.

            • flip-mode
            • 6 years ago

            Who? Intel? For the last few years, Intel has been the one company that seems not to be sleazy. Ever since Bulldozer, AMD has been playing games with benchmarks, blacklisting tech press, and such nonsense, and Nvidia has its own list of sleazy maneuvers. The best part is that, not being a gamer, I can drop both AMD and Nvidia from my systems.

            • A_Pickle
            • 6 years ago

            It’s so sad to see where AMD is. Like, I can get it to a certain point… they’re not stupid, they exist in the same universe that the rest of us do. They know exactly how their products perform before reviewers get ahold of them, so… if their product performs worse than their competitor’s, they know damn well that it’s going to look like that on bar graphs and frametime charts all over the internet. They’re understandably not thrilled with plastering “HEY EVERYONE, WE ARE WORSE THAN EVERYONE ELSE” all over the internet.

            The only AMD products that are remotely interesting are their APU parts, and infuriatingly, they didn't bring the 8-core Jaguar featured in the PS4 and the Xbone to the PC. Their CPUs are low performers (especially in gaming!) and devour power, while their GPUs aren't bad... they just don't offer sufficiently attractive savings to make me willing to eat the performance hit and the absurd power consumption. HBM was a terrible idea, IMO.

            I don’t know if AMD will last another generation, and if they do, I can’t imagine they’ll have anything particularly groundbreaking to bring to the table. They needed another HD 4000-series, but they just launched an HD 2000 series…

          • egon
          • 6 years ago

          Not that it matters – they want our money, not our respect.

    • tipoo
    • 6 years ago

    Wait. They thought not giving Tech-tell-the-people-all-the-things-Report a review sample was a good idea? Something is awfully fishy about that…Jinkies.

    • TheSeekingOne
    • 6 years ago

    I’m not surprised.

    • brucethemoose
    • 6 years ago

    Well, you could underclock/undervolt the Fury X to Nano levels and get similar performance/power-consumption figures. I doubt the bigger VRMs or the pump + Gentle Typhoon eat up significantly more power.

    I know TR is too classy for that… Still, I wouldn’t mind an “unofficial” Nano review :).
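    As a rough sanity check on the undervolt idea, the usual first-order CMOS dynamic-power relation (power scaling with frequency times voltage squared) gives a feel for how far downclocking and undervolting could pull a Fury X toward Nano-level board power. All of the clock, voltage, and wattage figures below are illustrative assumptions, not measured Fiji values:

```python
# First-order dynamic-power model: P_dyn ~ f * V^2, with the switched-
# capacitance term folded into the baseline figure.
# All numbers here are illustrative assumptions, not measured Fiji specs.

def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale a baseline dynamic-power figure to a new clock and voltage."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Hypothetical baseline: 250 W of dynamic power at 1050 MHz and 1.20 V.
# Dropping to 900 MHz at 1.05 V:
p = scaled_power(250.0, 1050.0, 1.20, 900.0, 1.05)
print(round(p, 1))  # 164.1 W under these assumed figures
```

    Note the model ignores static leakage, which is part of the thread's point: a leaky Fury X die burns watts a binned Nano chip wouldn't, so the simulated figure would still read high.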

      • f0d
      • 6 years ago

      I'm pretty sure the Nano has different PowerTune settings than the Fury X.

      I guess you could always flash the Fury X with the Nano BIOS, though; that should work.

      Edit: then again, the Fury X has a different thermal solution than the Nano, so that would mess with the results too, so I can't see how it could be tested properly without actually having one.

        • brucethemoose
        • 6 years ago

        The PCB is different, so it would almost certainly mess something up.

        However, you could edit the X’s BIOS and copy the Nano’s powertune settings.

          • f0d
          • 6 years ago

          That still doesn't solve the thermal difference (which, as we all know, can make a huge difference to the performance of the card; remember the R9 290 stock-cooler controversy?)

            • brucethemoose
            • 6 years ago

            True. The Nano's cooler does look pretty skimpy, but I bet it won't throttle the card like Hawaii's blower did.

            The real problem is the quality of the chip: a leaky Fury X GPU is gonna eat more power than a binned Nano chip.

            • the
            • 6 years ago

            I'm scratching my head over where all the binned chips are going. While binning for power is one dimension, functional units is another. The Nano and Fury X are both fully enabled chips 596 mm^2 in size, which is kinda crazy. The vanilla Fury is trimmed down, but some people are actually having success unlocking units. AMD still has yet to launch the dual-Fury card or any FirePros based upon Fiji. I'm starting to wonder what yields on these chips actually are. I see three possibilities: 1) they have exceptionally good yields, meaning that there are very few flawed Fiji chips; 2) AMD spec'd too high and is throwing away lots of partially functional dies; 3) AMD has another project planned for the Fiji dies. Number 3 is the most likely, but there doesn't appear to be any hole in their lineup right now.

            • wof
            • 6 years ago

            It’s #3! I bet all those dies are going into the Nintendo Wii Fury U !

      • Firestarter
      • 6 years ago

      The power figures are the crux of the review; you can't simulate that! The question is how well the Nano performs [b<]despite[/b<] the power profile, and to test that you have to have one in your paws. I for one would love to see how well it does (overclocked) with the cooler swapped out for something beefy, given that the silicon has apparently been cherry-picked by AMD to give good-enough performance at 100 W less. I don't expect them to do such a review, as it is a bit more involved than a normal one, but it's definitely something I'm curious about.

        • brucethemoose
        • 6 years ago

        Leaky, power hungry chips are actually good overclockers because they can take more voltage. The “binned”, low leakage Nano chips might actually be worse under water, where voltage is a bigger problem than heat.

        The Nano’s VRMs aren’t very good either.

          • Firestarter
          • 6 years ago

          I've seen this point made, but I'd love to see it tested. Scott isn't going to take that borrowed card apart, that much is obvious, but maybe we'll see someone else do it.

      • NoOne ButMe
      • 6 years ago

      The difficulty is that the Nano will have varying clock speeds.

      The "way to do it" would be to see what clock it hits in most reviews and set the card to run there.

      Or just do a review with it at both 800 and 900 MHz. =]
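      If someone did pin a Fury X to a fixed clock to mimic the Nano, picking the target frequency from published sustained clocks is simple enough; a hypothetical sketch (the sample clocks below are made up, not actual Nano measurements):

```python
# Hypothetical sustained-clock samples (MHz) gathered from third-party
# reviews -- made-up values for illustration, not real Nano data.
samples = [830, 865, 845, 900, 810, 875]

# Use the mean sustained clock as the fixed test frequency.
target_mhz = round(sum(samples) / len(samples))
print(target_mhz)  # 854 with these made-up samples
```

      Testing once at 800 MHz and once at 900 MHz, as suggested above, brackets the same range without having to guess a single number.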

    • Deanjo
    • 6 years ago

    The start of being cut off from AMD hardware for reviews, like they did to Phoronix?

    Poor Michael over there has to buy every single AMD card he reviews nowadays.

      • chuckula
      • 6 years ago

      And BOY was he pissed after dropping all that money on an R9 Furry and having it underperform, too.

      Phoronix tends to be sensationalistic, but I'll actually forgive him for the saw photo, given the state of support for the Furry under Linux right now.

      [url<]http://www.phoronix.com/scan.php?page=article&item=amd-r9-fury&num=10[/url<]

        • ChangWang
        • 6 years ago

        “Sensationalistic” is like the understatement of the year

    • Kougar
    • 6 years ago

    I bet AMD would change their minds if it was a DX12 only review. πŸ˜‰

      • anubis44
      • 6 years ago

      I think they'd settle for any DX12 performance metrics. This site only seems to acknowledge the existence of DX11, and seems to be verging on smugly proud of not bothering with what virtually all the other tech sites are dealing with: nVidia's serious design flaws in DX12.

      That's why AMD isn't happy with it. TechReport is starting to look very nVidia-biased because of this.

        • Silus
        • 6 years ago

        Can you explain why DX12 matters, when 100% of games use DX11?

        I know the answer, of course. You and other AMD fanboys have to clinge to something, since AMD is not doing so well, so if they apparently have better DX12 performance in benchmarks then that's all that should be covered in any review, right?

          • raddude9
          • 6 years ago

          [quote<]Can you explain why DX12 matters[/quote<]

          Duh, because 12 > 11.

          Seriously though, people (well, me at least) are replacing their GPUs less frequently these days, and they would like an indication of how the card they buy now will perform in a couple of years' time. You can cling (but not clinge) to your anti-fanboi argument if you like, but I'd like to see results from more than one DirectX 12 game to help guide my purchases.

            • travbrad
            • 6 years ago

            [quote<]Seriously though, people (well, me at least) are replacing their GPU's less frequently these days and they would like an indication of how the card they buy now will perform in a couple of years time. [/quote<]

            A fair point. I'm just not convinced an unoptimized beta of a single game (AoS) is an indicator of how all DX12 games will perform. There just isn't enough information out there yet to know how DX12 will perform across a wide variety of games. Imagine how different the DX11 picture would look if you only looked at benchmarks for a single game: Project CARS vs. Far Cry 4, for example.

            • Silus
            • 6 years ago

            So, you want to see results from more than one directx 12 game to help you guide your purchase…
            That’s great…fantastic even…but if there are NO DX12 games, you’ll have a hard time doing that.

            Also, stop living in AMD’s RDF. You think you’ll get an indication of how a card will perform in a couple of years time ? You really do live under a rock, because there’s no such thing. In a couple of years, every single card out now is absolutely outdated and outclassed by any mid to low range card of the time. No ONE buys a graphics card now, thinking of performance in the future…well, maybe crazy people do…

            I’ll “clinge” to my anti-fanboy argument as the (your) fanboy rhetoric is tiresome and ridiculous

            • raddude9
            • 6 years ago

            [quote<]but if there are NO DX12 games[/quote<] So that wikipedia page about directX games coming out in 2015 and 2016 is just made up: [url<]https://en.wikipedia.org/wiki/List_of_games_with_DirectX_12_support[/url<] Better get on to those wikipedia guys... I think what you meant to say is that no DirectX games have been released yet. [quote<]You think you'll get an indication of how a card will perform in a couple of years time[/quote<] Yup, that's how reviews work. People read them to try and get an indication of how the card will work with games that haven't been released yet. If reviews had a few directX12 games they would do this better [quote<] In a couple of years, every single card out now is absolutely outdated[/quote<] You seem to have completely missed the point of including DirectX12 games in reviews so I'll make it simple. If a current card is bad at DirectX12 then it will be "absolutely outdated" in a few years. However if a card is great at directX12 then it will be a lot less "absolutely outdated" in the future. [quote<]No ONE buys a graphics card now, thinking of performance in the future...[/quote<] LOL. Actually everyone does. When people buy a new graphics card they don't go back and play all of their old games from scratch again (well, maybe crazy people do). People buy new graphics cards for all of the new games that they are going to buy.

            • Silus
            • 6 years ago

            “Yup, that’s how reviews work. People read them to try and get an indication of how the card will work with games that haven’t been released yet. If reviews had a few directX12 games they would do this better”

            You're kidding, right? Or do you really believe this?
            You really believe that reviewing a card in current games (whatever DX or OGL version they use) will give you an indication of performance in games that haven't been released yet?...

            Wow, I've seen people say/write ridiculous things, but I think that one takes the cake. You just show you have no idea what reviews actually do, plus you lack any understanding of graphics architectures and even of how game engines are built and/or updated.

            Reviews show you how graphics cards handle the games tested in that review in multiple resolutions and settings. Nothing more. Even games that use the same core graphics engine will often perform differently on the same hardware.

            • raddude9
            • 6 years ago

            First:
            [quote<] You just show you have no idea what reviews actually do[/quote<]

            and then:

            [quote<]Reviews show you how graphics cards handle the games tested in that review in multiple resolutions and settings. Nothing more.[/quote<]

            That's funny, you must be trolling, but I'll bite :) as you've just shown that you have no idea what reviews actually do! Sure, a review will tell you how a card performs at certain resolutions and settings, but there is indeed more to it than that. The most important part of a review is the comparison of multiple cards. How one card performs relative to another is the key takeaway of any good review. Tech websites (usually) go to great lengths to keep all other review variables the same in order for this comparison to be fair and meaningful. This, then, is what informs a user's purchase: not the absolute performance, but the relative performance of a card next to its counterparts. Because, with a number of important caveats, a card's relative performance in the present will be a strong indicator of its relative performance in the future.

            However, as I mentioned, there are important caveats to this future-performance projection: things like memory size can be important, as can API changes. So if reviews can address some of these forward-facing issues, it will prove useful to readers.

            • Silus
            • 6 years ago

            I'm baffled that not only do you believe that reviews tell you future performance, but that someone is thumbing you up... I guess ridiculousness likes company.

            There are NO indicators of future performance in ANY review.

            Saying there are is gibberish. The relative performance of a card next to its counterparts gives you nothing more than the relative performance of that card next to its counterparts. You cannot take from it any performance indicators for that card in future games.

            And it's hilarious that you call me a troll when you're the one talking about "future-telling reviews"... Sigh.

            • raddude9
            • 6 years ago

            [quote<]I'm baffled that, not only you believe that reviews tell you future performance[/quote<]

            I'm not surprised you are baffled; I did not say that. I said you can get an [b<]indication[/b<] of how a card will perform in the future based on a (good) review. This is not the same as telling you the future performance. For example, I don't know if a certain card will achieve an average of 37 fps at a certain resolution in Grand Theft Auto 6 or Battlefield 5. But, based on the reviews I've read, I can tell you that a GTX 980 will in all probability at least equal, and most likely outperform, a GTX 970 in those games, and that pattern will repeat itself for the vast majority of games. How did I make such a bold prediction? Is it witchcraft? No, it's easy, and if you have the patience I can teach you how to do this for yourself. But first we have to break things down into bite-sized chunks.

            Do you admit that you were wrong when you said:

            [quote<]Reviews show you how graphics cards handle the games tested in that review in multiple resolutions and settings. Nothing more.[/quote<]

            i.e., that reviews actually tell you how a card performs relative to other cards at the tested settings and resolutions?

            Next, do you realise that reviews go to great lengths to test a spread of different resolutions and detail settings, all while comparing the cards on the same base hardware?

            Finally, do you admit that people do not generally buy graphics cards to play only the old games that have been tested in some review of that graphics card? They in fact buy graphics cards to play games that have not been used in reviews of that card, and they generally keep that card for a few years and play games that had not been released at the time of its purchase.

          • the
          • 6 years ago

          It matters not because of today but for tomorrow. While it's true that 100% of games are currently DX11, Windows 10's fast adoption rate and the benefits of the new API will push rapid DX12 growth. It'll still be several years before we see a few DX12-only titles, but there will be several high-profile games in 2016 with rendering paths for both.

            • Terra_Nocuus
            • 6 years ago

            And in 2016 there will be a new batch of GPUs to test.

            I have a 980. I bought one when they launched, but I did so knowing that it wouldn’t be as great at DX12 stuff as advertised. It’s happened before (DX8 > 9 transition, IIRC) where GPUs that were released right before the new API were shown to not handle it well at all.

            When I made the purchase, it was the best available option. I don’t envy the people that want to upgrade right now.

            edit: grammar 9_9; ugh

            • EndlessWaves
            • 6 years ago

            Yeah, the FX series were abysmal.

            Personally I really don’t care how a 650USD card performs right now. Unless it’s a major disaster then they’ve all got an excess of power. How they’ll stack up in 2-3 years time when they’re more mainstream is far more interesting.

            • Silus
            • 6 years ago

            It’s funny how it’s always AMD fans that come up with the “future-proof” argument. They said the same with R600 and how it would be so awesome with DX10…of course that never came true, because no card gets better performance with time. That’s just stupid.

            No one will push rapid DX12 growth. Whoever says that has absolutely no idea of what development cycles are or what deadlines are.
            Developers have their hands full developing games to use the current technology and to deliver them at a certain date. It’s called planning and that planning happened 2-3 years ago for most AAA games (a bit less for “smaller” games). DX12 will be supported in some games of course, but the focus will still be on DX11 for a couple of years. That happened with every DX release (and DX11 had a really long stay) and that won’t change with DX12.

          • raddude9
          • 6 years ago

          [quote<]100% of the games use DX11 ?[/quote<]

          You seem to be forgetting about DirectX 10, DirectX 9, DirectX 8, DirectX 7, DirectX 6, DirectX 5, DirectX 4, DirectX 3, DirectX 2 and DirectX 1!

        • Klimax
        • 6 years ago

        That’s why they pushed Mantle so much, until the latest AMD cards got negative performance from it.
        (Because low-level APIs are atrocious for hardware the developer didn’t target.)

        • Ninjitsu
        • 6 years ago

        THERE ARE NO DX12 GAMES AVAILABLE. ASHES IS STILL IN BETA. I DON’T EVEN THINK MOST PEOPLE ASKING FOR AOS REVIEWS HAVE THE GAME OR WANT IT.

          • the
          • 6 years ago

          Let the SSK flow through you.

            • raddude9
            • 6 years ago

            use the CAPS Luke.

            • willmore
            • 6 years ago

            Come to the CAPS side of the farce.

        • BobbinThreadbare
        • 6 years ago

        Except for the fact that TR uses mantle whenever it’s available.

        There actually has to be a dx12 game to test dx12 you know.

        • bfar
        • 6 years ago

        Until we see real post beta DX12 benchmarks, or even (don’t hold your breath) actual game releases, all claims are disingenuous. Yes, GCN is better optimized for shader performance and parallelism, but Maxwell will still win at fill rate and tessellation. In other words, performance is still going to depend on the game engine.

    • Milo Burke
    • 6 years ago

    Wanted for keeps: AMD’s Radeon R9 Nano

    • Silus
    • 6 years ago

    Come on Scott…of course you know why…:)

    Had you said that the Fury X was the best thing since sliced bread, you’d have 10 Nanos at your door step right now!

    It’s AMD being AMD and at the time when they really don’t need the bad press.

      • ImSpartacus
      • 6 years ago

      Yeah, this is pretty upsetting. I’m glad that TR made a post about this because this kind of behavior is unacceptable.

      TR does all sorts of things that piss me off, but I think they deserve a shot at reviewing the Nano just like everyone else.

        • killadark
        • 6 years ago

        Oh really? What does TR do that pisses you off in real life? All they do is reviews. Don’t like ’em, leave ’em 😛

      • Klimax
      • 6 years ago

      They don’t need bad press, then proceed to do the worst thing they could: generate bad press before the reviews are even done.

      • namae nanka
      • 6 years ago

      Wouldn’t matter much though, because he’d also have a very angry Silus at his doorstep after the review. 😉

    • ludi
    • 6 years ago

    “The Crap in my Hat” by Dr. Su

    Also not to be missed:

    “One Fish, Two Fish, Reddish Bloopers”
    “How the Green-ch Stole Christmas”
    “And to Think That I Lost It On One AMD Place”
    “G. P. U. Fiji, Will You Please Go Now”

    • K-L-Waster
    • 6 years ago

    Maybe you need to change the site’s CSS to a suitable shade of red?

    • chuckula
    • 6 years ago

    [quote<]I've just been informed that AMD has chosen not to provide TR with a product sample for review. The reasons behind this decision aren't clear to me, but whatever.[/quote<]

    See, Damage is just a little bit too nice and too classy to call them out. However, I am neither nice nor classy, so let me put on my Doc Holliday accent and let AMD know that I'm your Huckleberry:

    AMD isn't sending these cards out because they know darn well that they are overpriced and underperforming. All this nonsense about miracle chips that suddenly have such amazing performance with barely the need for any cooling has been just that: complete and utter nonsense.

    Oh, and that statement that Lisa Su made on the record about how the Furry Nano will beat the R9-290X in performance? It just became a hell of a lot more suspect. If there is one, I mean *one* real benchmark where the 290X can beat the Nano, she should seriously consider hiring a lawyer, because there's going to be a lawsuit, especially when that little whopper came during the same official press conference where they cooked the numbers for the Furry X and told that whopper about the "overclocker's dream".

    For all the people who scream about Nvidia at the top of their lungs all day long, there's one thing Nvidia doesn't mind doing: sending test units out for reviews.

      • K-L-Waster
      • 6 years ago

      Ok, let’s be charitable. They know just as well as the rest of us that Damage is overworked, and are trying to save him the trouble of all that testing to come up with independent numbers. They’ve already provided the numbers, so TR can just reprint those and save a boatload of time and effort.

      Sounds legit….

      • ImSpartacus
      • 6 years ago

      I’d say that the existence of this article is effectively calling out AMD. Granted, that doesn’t mean that AMD should get a pass on this, but let’s call a spade a spade.

      I think there is a right way to call out a company and a [url=http://www.kitguru.net/site-news/announcements/zardon/amd-withdraw-kitguru-fury-x-sample-over-negative-content/<]wrong way[/url<] to do it, but it definitely has to be done, and we shouldn't sugar-coat it as something else.

        • slowriot
        • 6 years ago

        I fail to see how Scott’s post above is anything more than sugar coating, quite frankly. And I fail to see your point about KitGuru; if anything, neither site went far enough.

          • JustAnEngineer
          • 6 years ago

          The speculation in the subtitle of [url=https://techreport.com/review/28912/tiny-radeon-r9-nano-to-pack-a-wallop-at-650<]the previous article[/url<] wasn't flattering.

            • K-L-Waster
            • 6 years ago

            Seeing as the skepticism was based on the, how shall we put it, “unorthodox” graphics settings used for the published numbers, I for one don’t think the subhead was out of line.

            If AMD wants to publish results with settings that are skewed in their favour, they shouldn’t be too surprised when someone points out that end users’ mileage may vary.

      • the
      • 6 years ago

      There is a bit of professionalism here, as being too negative tends to burn bridges. In the end, the performance numbers will speak for themselves.

      Also, it isn’t just AMD acting like this. Intel’s new Skylake chips, like the i3-6320 launched earlier this week, aren’t going to be sampled for review either. (FWIW, the i7-6700K can be configured to the i3-6320’s spec and tested to get an excellent approximation.)

    • ish718
    • 6 years ago

    [quote<] I've just been informed that AMD has chosen not to provide TR with a product sample for review. The reasons behind this decision aren't clear to me, but whatever.[/quote<] AMD doesn't want Nano on Tech Report's price/performance charts

      • Damage
      • 6 years ago

      Aww. I was gonna make ’em 3D and add board length as the Z axis.

        • ImSpartacus
        • 6 years ago

        Honestly, that would be pretty fucking incredible. Fuck the Nano, do it anyway.

        • jihadjoe
        • 6 years ago

        Reminds me of the Bitchin’ Fast 3D 2000, where Bungholiomarks seem to scale proportionally with the length of the card.

          • KeillRandor
          • 6 years ago

          [url<]http://vgamuseum.ru/gpu/noname/videoloca-bitchin-fast-3d-2000/[/url<]

            • GrimDanfango
            • 6 years ago

            Hahaha, great stuff, but the best bit of that joke is the fact that back then, they considered “256MB of curiously high-bandwidth LMNOPRAM” to be an outrageous notion 🙂

            I think the Bitchin’ Fast 3D 2000 is due a 2015 model 😛

            • Beelzebubba9
            • 6 years ago

            Naw, that’s from Boot magazine from back in the 90s. I think I have that paper issue somewhere. 🙂

      • Topinio
      • 6 years ago

      Gave you an upvote, because that’s probably part of it, but it wouldn’t exactly be fair to have it on that chart as it’s not competing there.

      A separate chart for cards in that form factor, that would fit mini-ITX builds, would be fair and would still look horrible for the R9 Nano.

      Ofc, same for the Titans: it’s a silly money product for the absolute best in a market segment.

        • tbone8ty
        • 6 years ago

        performance/watt will look amazing though 😉

        • Waco
        • 6 years ago

        I don’t know of many putting together mini-ITX builds that don’t choose a case that can hold a “normal” size GPU…

          • chuckula
          • 6 years ago

          Yeah, while I’d pick a smaller card, it’s even possible to make a “mini-ITX” build using a GTX-980 if you are crazy enough: [url<]https://www.youtube.com/watch?v=8JfOGegjwsM[/url<]

      • jessterman21
      • 6 years ago

      Or the frame-time metrics… Expose all those ugly spikedies.

      • YellaChicken
      • 6 years ago

      I’d say do it anyway, set it at $650 and performance at 0 as the card hasn’t successfully rendered any frames until it arrives.

      • the
      • 6 years ago

      I think the starting expectation is that it’d be rather poor, due to it being targeted at the SFF crowd. Compared to other SFF cards, it’ll fare a bit better, as they tend to carry a premium. How much of a premium AMD can charge depends on demonstrably being the fastest SFF video card.

    • Mikael33
    • 6 years ago

    AMD’s top notch PR strikes again

      • PrincipalSkinner
      • 6 years ago

      This.

      • Redocbew
      • 6 years ago

      I personally would rather see the R9 Nano get tested and miss the published performance numbers than see AMD get all spooky about it, deny access to independent reviewers, and then watch it miss the published numbers.

      Furthermore, I think AMD has more to lose from pulling a stunt like this. They need all the help they can get right now, and this is not the way to get it.

        • Prestige Worldwide
        • 6 years ago

        They know they are going to get destroyed in perf/$, so they don’t want this obvious fact to surface in reviews, I guess.

          • Redocbew
          • 6 years ago

          30 years ago that might have worked, but not so much anymore.

    • chuckula
    • 6 years ago

    [quote<]I only need to borrow it for a few days in order to have a look at it. [/quote<] Something about that phrase reminded me of this famous scene: [url<]https://youtu.be/9V7zbWNznbs?t=1m6s[/url<]
