In the lab: Nvidia’s GeForce GTX 1080 graphics card

Morning, folks. As you may have already seen elsewhere on the web, Nvidia's GeForce GTX 1080 review embargo lifted this morning. To get the bad news out of the way first, we won't be publishing our usual in-depth review today. With that said, we do have a GTX 1080 in our labs now, and we'll be working around the clock to get the card tested and dissected as soon as we can. Given the results we're seeing from other sites, the numbers should be well worth the wait. In the meantime, here's the GTX 1080 Founders Edition card in the flesh:

Under that spiffy cooler is a GP104 Pascal GPU. That chip has 2560 stream processors running at a cool 1607MHz base clock and a 1733MHz boost speed. The card has a 256-bit path to 8GB of GDDR5X memory running at 10GT/s. GP104's 7.2 billion transistors are laid down using TSMC's 16-nm FinFET process. We're really excited to put this next-gen graphics chip through its paces, to put it mildly. Stay tuned, and thanks for your patience as we get our review ready.
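
As a quick sanity check on those memory specs, a 256-bit bus at 10 GT/s works out to 320 GB/s of peak bandwidth. Here's that arithmetic as a trivial sketch (illustrative only):

```python
# Peak memory bandwidth implied by the specs above: bus width times data rate.
bus_width_bits = 256              # GDDR5X memory interface width
data_rate_gtps = 10               # 10 GT/s per pin
bytes_per_transfer = bus_width_bits / 8

peak_bandwidth_gbps = data_rate_gtps * bytes_per_transfer
print(f"{peak_bandwidth_gbps:.0f} GB/s")   # 320 GB/s
```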

Comments closed
    • kamikaziechameleon
    • 3 years ago

    How long have you had this darn GPU??? No review yet is amazing.

    • anotherengineer
    • 3 years ago

    The silver matches my beard highlights 😉

    https://pbs.twimg.com/profile_images/728371857129353216/alsa049Y.jpg

    Sorry Damage 😉 Couldn't help it.

    • Tech_Geek
    • 3 years ago

    Hey all, I follow this YouTube channel and I trust this guy with his hardware news and evaluations. I believe he's informative and unbiased. He just made a video about performance and some potentially important information/opinion on Nvidia's new flagship GPU, the GTX 1080. I hope you find it useful. Enjoy the video: https://youtu.be/myDYnofz_JE

      • chuckula
      • 3 years ago

      He broke the rule.

      It was Scottish… but IT’S CRAP.

      • tipoo
      • 3 years ago

      Nice try, guy who made that video

    • hasseb64
    • 3 years ago

    Performance @ 4K is a disappointment, especially considering the price.
    They promised a "GPU revolution" before release; all I see is a "milk cow".

      • tipoo
      • 3 years ago

      Nvidia trying to get all the milk before the competition arrives isn’t new. For either company, really. Kind of what companies do. If Polaris delivers, expect the lower bound of the price, if Pascal dominates, expect the “founders edition” pricing to be regular for partners. I think that’s why they included such a spread.

      • Krogoth
      • 3 years ago

      Not really, 4K gaming is just extremely taxing.

      1080 is close to achieving “smooth” 4K gaming but it just doesn’t have the bandwidth or shading prowess to obtain it. That’s going to be Big Pascal’s job.

        • --k
        • 3 years ago

        I was mildly impressed, but the next revision should be able to run 4K at 60fps. At the rate Moore's law has been translating into GPU performance, this should have been a 100% performance boost, not 30-50%. Blame the lack of competition for the state of affairs.

    • tootercomputer
    • 3 years ago

    Help me clarify something: some are saying this card would be wasted on a 1080p monitor, and that there would be no added benefit compared to, say, a 970 or 980?

    • techguy
    • 3 years ago

    A suggestion for your review, if I may. Please test this card *under-clocked* vs. 980 Ti OC’d. Same/similar clocks if possible. At this point in time it is not entirely certain whether Pascal brings any significant architectural enhancements for the majority of non-DX12 graphics workloads vs. Maxwell. Another review site (don’t remember which) tested against an Asus 980 Ti Strix (which is factory-clocked at 1300MHz+) and found the 1080 to only be 10-15% faster. That’s troublesome, to me.

    • Fonbu
    • 3 years ago

    All that can be said has basically been said about this card. If you can afford it, the upgrade itch is already nagging. Some will say it's impressive, others will be meh. Some people have been taken in by the hype, others have been rational. But more to come…

      • BIF
      • 3 years ago

      So are you going to buy one or not?

        • Fonbu
        • 3 years ago

        Waiting for non-reference reviews and AMD's newest to compare… same old same old 🙂

    • sophisticles
    • 3 years ago

    Time for an unpopular comment that I'm sure will get a ton of down thumbs: this card, like every single high-end card released since at least the GeForce 2 Ultra days, from any vendor, is a complete waste of money. Of course, I would argue that any high-end computer hardware purchased to play games is a total waste of money.

    I would like an honest answer from anyone interested in buying a card like this: is there any game, either currently available or soon to be released, that you would consider paying $600+ to play? If not, then what is the point of buying hardware like this?

    Maybe I'm just jaded from all the crappy PC games that have been released in the last decade and a half.

      • jts888
      • 3 years ago

      Sorry, my Radeon 9700 Pro (circa 2002) disagrees.
      • Sub-$400
      • Completely surpassed the previous generation (not by some measly 30% margin)
      • Even eclipsed the competition's product for a whole generation

      Not every new top-end GPU has been a $700 investment with marginally improved performance.

        • Rikki-Tikki-Tavi
        • 3 years ago

        So, that’s not what he’s talking about. High end is a waste of money. Everybody with half a brain waits at least for the 1070, if not the 1060.

          • rxc6
          • 3 years ago

          The 9700 pro WAS as high end as it gets. It was the absolute top of the line card. It was also the first time ever that ATI had completely outdone NVidia instead of trailing. I still remember (ridiculous) reviews claiming that you shouldn’t buy the card because it was “too overkill” and “dx9 games are not available.”

            • DancinJack
            • 3 years ago

            Haha I do too. And my 9700pro had a lovely home in my old Athlon XP machine for quite a few years.

            edit: for nostalgia's sake: https://techreport.com/review/4104/ati-radeon-9700-pro-graphics-card

          • sophisticles
          • 3 years ago

          THANK YOU!

        • sophisticles
        • 3 years ago

        Yep, I bought a 9700 Pro for about $350 shortly after it was released, but let's be honest with ourselves: was there any game during that time, or at any time for that matter, that was worth spending $300+ to play? Any game at all? I also had a Ti4600; the truth is the 9600 Pro that came later offered visuals just as good as its more expensive brethren at half the price.

        I currently own a GTX 960 and an R7 265. Honestly, what game or application justifies the expense of a GTX 1080? The only thing that comes to mind is some app that uses GPU compute capabilities, like Folding@home or a high-end video editing app that uses GPU acceleration; then maybe you can justify the expense. But to play some lame game?

        Grow up.

          • Kretschmer
          • 3 years ago

          Is there any TV show worth a $100/month cable TV subscription? Entertainment value is entirely subjective.

        • Pancake
        • 3 years ago

        And lollipops were ten to the penny. You're exaggerating. If you include inflation of about 35% over that period, that makes the current price of your then top-end GPU $540. The non-Founders Edition 1080 is $599. Given the huge performance leap over the 980, it's not a bad deal.

        P.S. I hope you're not still using the 9700 Pro.

        • jihadjoe
        • 3 years ago

        8800GT baby!

        $250 and 90% of the performance of Nvidia’s own $600+ 8800GTX, absolutely pwns AMD’s $400 2900XT

        So good they had to rebrand it, and rebrand it, and rebrand it again.

      • DancinJack
      • 3 years ago

      You’re jaded, and uninformed I would guess. There are plenty of reasons to spend this kind of money on a graphics card provided you have the budget for it.

      Maintaining a stable, high framerate at pretty much any resolution >=1080p.
      Incoming VR titles.

      I’m only going to name those two, but rest assured there are dozens more for those that want it. Just like anything else – this might not be the product best suited for you, but that doesn’t mean there isn’t a market of people that can utilize it.

        • sophisticles
        • 3 years ago

        You do realize that the maximum displayed frame rate is limited by the refresh rate of your monitor? Any frames rendered but not displayed are stored in the back buffer until either a) they are ready to be displayed, at which point they are moved to the front buffer, or b) they are never used and the back buffer is flushed so that newly rendered frames can be stored.

        That's the thing: because most monitors can't let these cards reach their full potential, they end up spending most of their time wasting cycles and electricity rendering frames that will never be displayed, which makes them a waste of money.

          • jihadjoe
          • 3 years ago

          4K monitors completely change the game, though.

          Out of the 16 games TPU tested (https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/8.html), the GTX 1080 only managed 60fps in two: the ageing Battlefield 3 and the even older World of Warcraft. In everything else it averaged between 40 and 45fps.

      • JustAnEngineer
      • 3 years ago

      sophisticles: "Every single high end card released since at least back in the Geforce 2 Ultra days, from any vendor, is a complete waste..."

      https://www.youtube.com/watch?v=ldiYYJNnQUk
      https://www.youtube.com/watch?v=ooLO2xeyJZA

      • yogibbear
      • 3 years ago

      Wow. There’s like a million amazing PC games that have been released over the last decade. Yes there’s a lot of crap games. But there’s definitely at least 300+ decent PC games.

      Dark Souls, Dark Souls 2, Dark Souls 3, Half-Life 2: Episode 2, TF2, Portal, Portal 2, DOTA 2, The Witcher 2, The Witcher 3, Deus Ex: Human Revolution, Dishonored, Fallout 4, Fallout: New Vegas, Crysis, Crysis: Warhead, Crysis 3, Assassin's Creed II, Assassin's Creed Brotherhood, Assassin's Creed 4, Call of Duty 4: Modern Warfare, Minecraft, Civ V, XCOM, XCOM 2, Grand Theft Auto IV, Grand Theft Auto V, Bioshock, Bioshock 2, Bioshock Infinite, TES V: Skyrim, Dragon Age: Origins, Mass Effect, Mass Effect II, Mass Effect III, Far Cry 2, Far Cry 3, Far Cry 4, Diablo 3, Pillars of Eternity, Path of Exile, Mirror's Edge, Left 4 Dead 2, Hotline Miami, Dead Space, Dead Space 2, Terraria, SPAZ, Stellaris, Battlefield: Bad Company 2, Saints Row IV, Elite Dangerous, Max Payne 3, Braid, Papers Please, Rocket League, Batman: Arkham Asylum, Arma 3, STALKER: SoC, STALKER: Clear Sky, Starcraft II, FTL: Faster Than Light, Divinity: Original Sin, Alien Isolation, Crusader Kings II, Torchlight II, Guild Wars 2, Company of Heroes, The Witness, The Division, Grim Dawn, Dragon's Dogma: Dark Arisen, Dirt Rally, Just Cause 2, Just Cause 3, Starbound, Bastion, Metal Gear Solid V, Wolfenstein: The New Order, Amnesia: The Dark Descent, Metro: Last Light, Tomb Raider, Axiom Verge, The Binding of Isaac, Borderlands, Borderlands 2, Darkest Dungeon, SOMA, South Park: The Stick of Truth, Super Meat Boy, Spelunky, Undertale, The Evil Within, Middle-earth: Shadow of Mordor, Legend of Grimrock, Legend of Grimrock II, RAGE… and on and on and on. Do you want me to keep going? Or are you just super jaded?

        • NTMBK
        • 3 years ago

        And the majority of those will run just fine on a 7870… What’s your point?

          • yogibbear
          • 3 years ago

          "Maybe I'm just jaded from all the crappy PC games that have been released in the last decade and a half."

      • Johnny Rotten
      • 3 years ago

      "like every single high end card released since at least back in the Geforce 2 Ultra days"

      lol

      • BIF
      • 3 years ago

      Not for gaming. Want it for graphic rendering and F@H. So there.

        • caconym
        • 3 years ago

        Yeah, this. Rendering on consumer/prosumer GPUs is now pretty feasible thanks to bigger VRAM, and cheaper than building a secondary Xeon machine just to (CPU) render on.

        Top-end consumer cards are marketed to gamers, but artists turn to them when they can’t justify claiming a Quadro on their taxes.

        Plus you can play games on ’em.

          • BIF
          • 3 years ago

          Yep, it’s ALWAYS cheaper to add or replace GPUs than it is to build, house, and operate an additional machine. Although I may be doing that sooner rather than later, since this workstation just seems to have developed a new aversion to posting. It’s up now and it’s even on Windows 10, but now I’ll never be really sure that it’s fully reliable (not related to Windows 10). So come on, Broadwell-E, come to daddy!

          Okay, I can’t believe I just typed that.

          Anyway, with GPUs, I just want to know what to expect for my money and current render and F@H test results don’t REALLY tell me what I need to know.

      • hkuspc40
      • 3 years ago

      For most people with a Maxwell card I kind of doubt a 1080 is absolutely necessary. I’m running a 970 and won’t upgrade until there’s better performance in 4k or I’m forced by the recommended requirements (gaming).

      • havanu
      • 3 years ago

      4K or in my case 3440×1440 gameplay at 60fps.
      My 980ti came close in most titles, but not all of them.

      • delsydsoftware
      • 3 years ago

      You’re assuming that your previous high-end card is disappearing into the vapor after the upgrade. I tend to either shift my old cards into other computers (like HTPCs, etc) or sell them, which offsets the price by 40-50% or more. So, all of the sudden, that $600 card might cost $250-300 after selling your older card, which ups the value proposition quite a bit. And, some lucky person on eBay is getting the last cycle’s flagship card for a good deal. Or, you end up upgrading 2 computers at once and then still have an older card left over to sell.

      If you think of it that way, you’re only buying an expensive card at full price once.

      • jihadjoe
      • 3 years ago

      Racing games! If you factor in the cost of building a cockpit setup, the peripherals alone completely dwarf the $699 you'd spend on a GTX 1080 FE.

      Added bonus: modern driving games have gotten really good. Starting with GTR, the physics have been incredibly realistic, and today's Assetto Corsa and Project Cars add slick graphics on top of that. There's also a huge community that's dedicated to modelling cars and tracks from all over the world.

      • cygnus1
      • 3 years ago

      I plan to purchase a high end monitor sometime in the not too distant future. Most likely a 21:9 1440p type or possibly just a decent sized 4K, I’m still up in the air. But I know my dual 760’s won’t cut it for that and I’ll be planning on purchasing a 1080 or maybe a Polaris based card if they’re competitive. Either way, I will need a very high end GPU to be able to get close to maintaining 60+ FPS at those resolutions with all/most of the eye candy turned on.

      Is it absolutely a luxury… Yep. Do I need it to play most games… Nope. But I want it. And I’m very blessed, so it’s a luxury I can afford.

      • Kretschmer
      • 3 years ago

      Is there any TV show worth a $100/month cable TV subscription? Entertainment value is entirely subjective.

      • Andrew Lauritzen
      • 3 years ago

      I agree with your argument, but realize that for hardware like this there is a chunk of the audience that is just not that price sensitive. Frankly, computer hardware – while not insignificant – is not the most expensive hobby compared to many other things you can do with your money. E.g., for the price people spend to get a slightly fancier car (with zero practical value), you could have the latest GPU for 20+ years.

      It’s completely practical for many people to continually upgrade to the newest stuff if it’s a priority, and if you’re spending a decent enough chunk of your time gaming or similar, why not?

      Perf/$ will always favor stuff further down the stack (and older), but Perf/$ peaks at free devices that have non-zero performance 🙂 More importantly it's just not the most important metric for a lot of people.

      To answer your question: I'm personally considering one of these for a few reasons. 1) VR, which can use all the performance it can get and 2) from benchmarks this will get me significantly closer to hitting 144Hz (monitor refresh) than my current 980 setup. High refresh rate stuff is so much nicer that I'm definitely willing to pay for it 🙂

    • yogibbear
    • 3 years ago

    So… it looks like the GTX 1080 is gonna be the 1440p/max settings/60+ fps solution, but not the 1440p/144Hz solution, nor the 4K/60+ fps solution. So… do I hang onto my GTX 770 till the 1080 Ti, or do I just get a GTX 1070 and rock out at 60+ fps/1440p?

    • EndlessWaves
    • 3 years ago

    Could you include some thoughts on the power consumption of the card in the review?

    It strikes me as very odd that nVidia pushed performance to such an extent that they’ve even increased power consumption. Before the announcement I was taking it for granted that at the very least they’d split the difference and provide more performance at lower power consumption as that’s the general trend of the computing industry. Especially after moves like the GTX 980 notebook.

    Are nVidia thinking this will be a one off hike for VR & 4k? Do they feel threatened by Intel’s rapidly improving integrated graphics and want to keep desktop graphics at a high power level? Would the chips be too open to massive overclocks eating into their profits if they reduced it?

      • Ninjitsu
      • 3 years ago

      I think perf/w is still up so it’s acceptable.

      And honestly the mid range and low end may consume less power, but I’d gladly take a 30w hit on load for 70% more performance.

    • TardOnPC
    • 3 years ago

    Thanks Jeff,
    I look forward to reading an honest TR review, complete with the usual graphics settings used and unfiltered results. The price chart should be interesting. 🙂

    I noticed some sites exclude the 980 SLI from their benchmark results except for when the 1080 was faster; what kind of shady sh!t is that?

    • End User
    • 3 years ago

    From what I have seen so far, performance at 2560×1440 makes me very happy. 🙂

    • anotherengineer
    • 3 years ago

    Well, I've read a few reviews: impressive, yet a bit Krogoth at the same time.

    My reasoning: the 1080 on average is about 70% faster than the 980, FPS-wise anyway.

    980 – 28nm, 4GB of RAM @ 1750MHz (7,000MT/s effective), 5.2 billion transistors, default core clock 1126MHz, 2048 shaders, 64 ROPs, original opening market price $549

    1080 – 16nm, 8GB of RAM @ 1250MHz (10,000MT/s effective), 7.2 billion transistors, default core clock 1607MHz, 2560 shaders, 64 ROPs, original opening market price $699 (yes, Founders Edition)

    I just remember way back when I bought a 3850 for $210, then a bit later a 4850 for $210 with double the performance on the same 55nm node. Then a 6850 for $135 on 40nm for double the performance again (brought about the same way: more transistors, etc.). So a 100% increase in performance for the same price, then for a lower price.

    Compare that to the 980-to-1080 jump: a 70% increase in performance but for more money, though a small premium would be expected for the extra 4GB.

    So we're basically just getting back to the kind of leaps video cards were making back in 2007. Hence I'm just a bit Krogoth at the hype, I guess; performance is about where it should be expected to be with the extra shaders, features, and clocks.
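
    Putting that comparison into numbers (a rough sketch; the ~70% figure and launch prices are just the ones quoted above):

```python
# Rough value math for the generational jump described above.
gtx_980 = {"price": 549, "relative_perf": 1.00}
gtx_1080 = {"price": 699, "relative_perf": 1.70}   # Founders Edition price

perf_gain = gtx_1080["relative_perf"] / gtx_980["relative_perf"] - 1
price_increase = gtx_1080["price"] / gtx_980["price"] - 1
perf_per_dollar_gain = (gtx_1080["relative_perf"] / gtx_1080["price"]) \
                     / (gtx_980["relative_perf"] / gtx_980["price"]) - 1

print(f"performance: +{perf_gain:.0%}, price: +{price_increase:.0%}, "
      f"perf per dollar: +{perf_per_dollar_gain:.0%}")
# performance: +70%, price: +27%, perf per dollar: +34% -- a far cry from the
# double-the-performance-for-the-same-price jumps of the 3850/4850/6850 era.
```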

    So yay for 16nm!

    I think I'm more curious about the silicon of TSMC and their 16nm process vs. the silicon of GF and Samsung's 14nm process. I guess soon enough we will see whether AMD should have used TSMC or whether GF is equivalent or better.

    Now just have to wait patiently for TR’s review.

      • Srsly_Bro
      • 3 years ago

      I think it’s best to find a 1070 and upgrade to a 1080Ti or wait until Vega. That’s my plan, btw. Go ahead and copy, if you like.

        • anotherengineer
        • 3 years ago

        Glad I’m an occasional gamer that is still happy playing counter strike, dungeon defenders and others. Heck me and a buddy are still shooting our way through borderlands Knoxx DLC. With everything maxed (except bloom) on 1080p it hits the 60fps frame cap.

        So my old AMD 955 and Radeon HD 6850 isn’t obsolete to me yet πŸ˜‰ Also my budget is typically $180-$250 for a vid card.

        I will wait for a 14/16nm card in that price range whatever it might be πŸ™‚

      • muxr
      • 3 years ago

      You just described heroin addiction in GPU terms.

    • tootercomputer
    • 3 years ago

    Ars Technica has a review. Pretty impressive.

    http://arstechnica.com/gadgets/2016/05/nvidia-gtx-1080-review/

      • Dashak
      • 3 years ago

      "8GB of GDDR5X memory brings PC graphics up to parity with consoles"

      Yeah, think I'll wait for TR's review.

        • MathMan
        • 3 years ago

        Their comments about color compression are pretty dubious as well.

        > Where Maxwell featured 2:1 compression – that is, where the GPU calculates the colour difference between a range of pixels and halves the data required if the delta between them is small enough – Pascal can do 4:1 or even 8:1 colour compression with small enough deltas.
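
        For what it's worth, the idea being quoted is easy to illustrate with a toy sketch of delta colour compression (purely illustrative; this is not Nvidia's actual scheme):

```python
# Toy delta colour compression: store one anchor value per block plus small
# per-pixel deltas. If every delta fits in the small budget, the block
# compresses; otherwise it is stored raw. (Not Nvidia's real algorithm.)

def compressed_size_bits(pixels, delta_bits=2):
    """pixels: 8-bit channel values for one block of neighbouring pixels."""
    anchor = pixels[0]
    deltas = [p - anchor for p in pixels]
    limit = 1 << (delta_bits - 1)              # 2-bit signed deltas: -2..+1
    if all(-limit <= d < limit for d in deltas):
        return 8 + delta_bits * (len(pixels) - 1)   # anchor + tiny deltas
    return 8 * len(pixels)                          # incompressible: raw

block = [118, 119, 118, 117, 118, 118, 119, 118]    # smooth region of a frame
print(compressed_size_bits(block), "bits vs", 8 * len(block), "bits raw")
# 22 bits vs 64 bits raw, roughly a 3:1 ratio for this block.
```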

          • cygnus1
          • 3 years ago

          Yeah, there’s a reason Ars doesn’t really do hardware reviews much anymore…

          • DPete27
          • 3 years ago

          I believe that’s a true statement. PCPer reported that as well and showed a visual of compression from Maxwell to Pascal. Not sure what your issue is with it.

        • Visigoth
        • 3 years ago

        I like Ars Technica in general, however that review has a really negative undertone to it. I’m not used to such low-quality reviews from that site. They should know better (and fire that reviewer). Those idiotic statements do not belong anywhere near a tech forum.

        • tipoo
        • 3 years ago

        They also say it doesn't have async compute (they call it async shaders). Pascal does have asynchronous compute. Maxwell is the one that's in question: they said it did, but the hardware does not seem to support it; instead they seem to be planning on having the driver make the CPU schedule things for the GPU and make the GPU context switch (far slower).

        http://videocardz.com/59962/nvidia-geforce-gtx-1080-final-specifications-and-launch-presentation

        I like Ars, but they can get way out of their technical depth sometimes.

          • Pitabred
          • 3 years ago

          I don’t think it was ever that they didn’t support async compute, it’s that it wasn’t dedicated hardware doing it. It’s part of their “preempting” technology, which while good, isn’t actually asynchronous: [url<]http://www.eteknix.com/pascal-gtx-1080-async-compute-explored/[/url<]

        • ish718
        • 3 years ago

        LOL, PCs can finally compete with consoles!

        • Firestarter
        • 3 years ago

        they changed it: ‘8GB of GDDR5X memory is totally dope’

        weak

        • muxr
        • 3 years ago

        Man, I remember when Ars Technica used to be awesome. How far they've fallen. Unbelievable.

        • USAFTW
        • 3 years ago

        It sure does. WTFFFFFF!!!

        • maxxcool
        • 3 years ago

        deleted…

        • maxxcool
        • 3 years ago

        I just searched for that comment out of curiosity and did not see it in the review. Where is that comment, so I can read the full context myself?

        The only time I saw the word "console" was on page 4: "A locked 30FPS is no problem, handily beating out the console experience. But after years of promises, it's surprising to find that single-card 4K at 60FPS is still out of reach."

          • RAGEPRO
          • 3 years ago

          They removed it.

            • maxxcool
            • 3 years ago

            *BAD ARS!* BAD! /rolled up newspaper/

          • Firestarter
          • 3 years ago

          They changed it. It was in the "good/bad/ugly" section at the end of the review, listed under "good"; it now says "8GB of GDDR5X memory is totally dope." This is what it said before: http://i.imgur.com/UP4RjdT.png

          Instead of only announcing their questionable technical competence, they've opted to show their increasing lack of journalistic integrity as well. That is, unless somebody can tell me when and where they redacted their previous statement and offered an explanation.

            • maxxcool
            • 3 years ago

            Nice… good screengrab.

            • chuckula
            • 3 years ago

            Well it could have been a misspelling of "parody" that turned into "parity" 😛

      • Rza79
      • 3 years ago

      Comprehensive review here too:
      http://www.computerbase.de/2016-05/geforce-gtx-1080-test/7/

        • Firestarter
        • 3 years ago

        Worth a read if you can into German: they tested a lot of games at multiple resolutions, and they've also tested the 6700K vs. the 2500K in several games.

          • tipoo
          • 3 years ago

          The nice thing about German is that the sentence structure is so close that Google Translate gets you a perfectly readable review 🙂

          I've actually read sites it auto-translated that I only later noticed weren't native English. There are some oddities, but really not the worst grammar you'd see on the internet, lol.

            • Firestarter
            • 3 years ago

            link: http://translate.google.com/translate?js=n&sl=de&tl=en&u=http://www.computerbase.de/2016-05/geforce-gtx-1080-test/

            or go straight to ze German graphs: http://translate.google.com/translate?js=n&sl=de&tl=en&u=http://www.computerbase.de/2016-05/geforce-gtx-1080-test/7/

            • Ifalna
            • 3 years ago

            *chuckles* As a German, I can't say that it works this well the other way around, though.
            Especially automatic YouTube translation produces rather hilarious results. 😀

      • NeoForever
      • 3 years ago

      Wait… Freaking Ars got a card before TR?
      What the actual @#$%!

        • chuckula
        • 3 years ago

        Uh yeah…. what other review outlet can analyze the impact of Pascal’s architecture on the struggles of transgendered penguins in Southeast Antarctica?

      • beck2448
      • 3 years ago

      Here’s a review from Hexus. A brutal beatdown of FuryX reaching 100% at times.
      [url<]http://hexus.net/tech/reviews/graphics/92846-nvidia-geforce-gtx-1080-founders-edition-16nm-pascal/?page=8[/url<]

    • kamikaziechameleon
    • 3 years ago

    How much for this monster???

      • SoM
      • 3 years ago

      2 testicles and an arm :p

      it’s $599 USD

        • Srsly_Bro
        • 3 years ago

        No, Founder’s Editions are $699 per Nvidia.

          • SoM
          • 3 years ago

          aye, typo

            • Srsly_Bro
            • 3 years ago

            with both arms it comes out to $699.

          • anotherengineer
          • 3 years ago

          So about $1,030 CAD after shipping and taxes.

          Have to pass on that.

    • toki
    • 3 years ago

    I know everyone seems mad, but I think it’s great that Nvidia stepped to the plate to help rectify the problem. That is a very nice card btw.

      • Ninjitsu
      • 3 years ago

      Well, they basically had "NVIDIA SUCKS" plastered in the top comments for a week on the home page, doubt they were pleased about that. 😀

    • USAFTW
    • 3 years ago

    Hey Nvidia, nice GPU launch. Be a shame if anything were to happen to it. Now go home and get your ****ing shine box!

    • Firestarter
    • 3 years ago

    From what I’ve seen so far, it’s a great card that’s probably going to drop in price quickly over the next 2 years as AMD and Nvidia release successors with larger dies and way higher power ratings. Unless you’re fond of power sipping systems, I see little point in shelling out this much money for a 180W card when you know there’ll be a 300W+ variant on the horizon that could very well be twice as fast. Plus it’s somewhat limited by memory bandwidth from what I’ve read, and there’s a lot that can be done on that front

    That said, I still want it. All 28nm GPUs effectively just got EOLed. Which raises the question: how long will 16nm FinFET last? Another 4.5 years?

      • bittermann
      • 3 years ago

      "From what I've seen so far, it's a great card that's probably going to drop in price quickly over the next 2 years"

      Wow... talking out of both sides of your mouth, are we... lol

        • Firestarter
        • 3 years ago

        2 years used to be a long time in GPU land, but here I am with a 4-year-old GPU, wondering whether this is actually the new card I want to upgrade to or whether my current one can do a few more months.

          • yogibbear
          • 3 years ago

          My 770 GTX is 3 yrs and 1 wk old. 🙁

    • bfar
    • 3 years ago

    Nvidia are coming in for a bit of criticism on pricing in some of the reviews I’ve read. I’m glad to see that. Effectively setting two prices is an extraordinarily devious strategy – it capitalizes on early adopters and it sets the price against either scenario that the Polaris launch throws up.

    The only good thing I can see from this is that it implies that Nvidia aren’t 100% sure what AMD have up their sleeve.

      • cynan
      • 3 years ago

      To play Devil’s advocate, OEMs have been charging premiums for higher clocking/binned chips for years. The only difference is that they have much lamer marketing (SUPER TOXIC XXX OC EDITION).

      • jensend
      • 3 years ago

      I strongly disagree. nV’s founders edition is a good idea that makes both them and early adopters better off.

      “Capitalizing on early adopters” is exactly what low supply (very early in the manufacturing run) and high demand (early adopters’ willingness to pay) dictates in a free market. Such capitalizing is one of the main strengths of capitalism.

      The alternative to high prices at launch is shortages and cards being sold on the gray market at high markup. People who value the cards the most, and who would be willing to pay even more than $699 for one, wouldn’t be able to find one. People who found one on the gray market would be willing to reward nV for their R&D but those rewards would be going to middlemen instead. In economics 101 terms, both consumer and producer surplus would be reduced. On the whole society would be worse off.

      nV has found a way to do ‘high initial prices at launch’ (which makes good economic sense) in a way that makes good marketing sense.

      The only problem with the way nV has done this is that the regular, non-Founders Edition 1080 is getting great reviews but is only a paper launch. The Founders Edition, while available reasonably soon, doesn’t shift the price-performance curve for single GPUs all that much, it just extends it. The regular edition is getting high marks for price-performance right now, but by the time it’s actually hit the streets, prices for other high end cards will have dropped and Polaris could be on the market too.

    • Juba
    • 3 years ago

    Please include GTX 770 and 1080p in your tests.

      • chuckula
      • 3 years ago

      A GTX-960 will do fine as a proxy for the GTX-770.
      As for 1080p… maybe in some of the more extreme benchmarks, but the GTX-1080 is really designed for at least 2560×1440.

        • Meadows
        • 3 years ago

        I’ve said it before and I’ll say it again, it makes sense to test the resolution 70% of all people (and gamers) use. I know full well that the results will be hilarious, but it only goes to better highlight the improvement people should expect if they do upgrade.

          • Gyromancer
          • 3 years ago

          When we test the cards, we try to push them to their limits and see how other cards perform at such extreme levels. It is hard to test the cards at 1080p when everything is running at a steady 60+ fps with no frame drops. If the cards perform well at 4K, you can be confident that they will perform well at 1080p. Also, if you’re still using a 1080p monitor, you probably shouldn’t be looking at the latest cards that are meant to push the envelope of PC gaming.

            • EndlessWaves
            • 3 years ago

            Plenty of 144Hz 1920×1080 monitors around.

            • travbrad
            • 3 years ago

            Yep, I specifically bought a 1080p 144Hz monitor instead of a 1440p one because I wanted to actually get near 144FPS in some games. The GPUs I buy in the $200-400 range just weren't capable of getting anywhere near 144FPS @ 1440p in most games, which kind of defeats the purpose of getting a 144Hz monitor. Heck, they couldn't even stay above 60FPS in some games at 1440p.

            I know some people prefer more pixels, but I generally prefer smoothness/responsiveness (maybe because I mostly play FPSes and other fast-paced games).

            • Andrew Lauritzen
            • 3 years ago

            If your argument is that this card is overkill for 1080p because there are zero frame drops at 144Hz w/ max settings, fine… but I don’t think many people actually agree with that based on the discussion last time in the comments. I know from experience that my 980 Ti can’t keep a solid 144Hz (or even 60 in some games) all the time, so I’m very much interested in how much closer I can get with a 1080 for instance.

            I’m almost completely uninterested in whether it can get 30 or 35fps in 4k or some other equally silly resolution.

            Meadows is right here – the Steam data is extremely clear: almost no one has monitors above 1080p so it’s still the most important test point for now.

            • Ninjitsu
            • 3 years ago

            According to the TR hardware survey last year, hardly anyone here had 4K either.

            • JustAnEngineer
            • 3 years ago

            Hardware surveys are trailing indicators. They show what old hardware people are still using at some later date. They don’t show what hardware people are (or should be) buying in the future.

            However, here’s an inexpensive 1080p monitor that might be used for testing at that resolution.
            http://www.newegg.com/Product/Product.aspx?Item=N82E16824267006

            • Liron
            • 3 years ago

            It’s a 144Hz monitor for $230 and they don’t advertise it either in the title or in the bullet points? You have to open the specifications to see it? Why wouldn’t they advertise such an obvious differentiator from other monitors in that price range?

            • ImSpartacus
            • 3 years ago

            And the Steam Hardware Survey shows that just over 95% of Steam users game at no more than 1080p.

            It really puts things in perspective.

            • sweatshopking
            • 3 years ago

            misunderstood your comment

            • travbrad
            • 3 years ago

            Yep, getting 35FPS at 4K isn't really "pushing the envelope of gaming" IMO. It's trying to drop the PC master race down to the same framerates as consoles. No thanks.

            • jensend
            • 3 years ago

            The Steam data is extremely clear: almost no one has a GTX 1080, so the Intel HD Graphics 4000 is still the most important test point for now.

            Or, back to reality rather than such silliness: what is relevant for this review is not the resolutions used by Steam users in general, but the resolutions used by people who are spending $500-$1000 on a graphics card.

            • Ninjitsu
            • 3 years ago

            Considering that the GTX 980 and 970 just about hold 60 fps at 1080p, I shall respectfully disagree.

            The 1070 will probably be the first GPU of this class that can hold 60 fps at 1080p under all conditions, I think – even then there will be an exception or two.

            EVIDENCE: http://www.computerbase.de/2016-05/geforce-gtx-1080-test/7/#diagramm-xcom-2-1920-1080

            • Meadows
            • 3 years ago

            Since when is 60 fps the holy grail? I thought we'd moved past that years ago. Also, there should be several titles that put on an acceptably good show with 8x antialiasing or similar maths sinks.

            • Meadows
            • 3 years ago

            More to the point, 1080p @ 180 Hz should be the new measuring stick in these cases, because it means the card will provide a perfect experience in VR (90 Hz to each eye).

            I don’t doubt the prowess of these new cards but I’d be surprised if they could uphold that framerate under all conditions.

            • cygnus1
            • 3 years ago

            While I agree with you that they should be testing for VR playability at some point, technically it doesn't need to render 180 FPS. With the Simultaneous Multi-Projection feature, being able to sustain anything over 90 should be good enough for VR, as the 2nd view for the other eye should basically be free.

            • homerdog
            • 3 years ago

            I downvoted you but then realized it would be more constructive to correct you. SMP doesn't make the 2nd view free; everything still has to be shaded again. And now I can't take my downvote back 🙁

            • chuckula
            • 3 years ago

            I took it back for you.

            • cygnus1
            • 3 years ago

            No worries on the downvote, especially since chuckula fixed that 🙂

            I'm honestly curious though. The Nvidia release event with their crazy high numbers (I believe they showed it doubling the Titan X in VR) made it seem as though it was likely rendering the additional view for the 2nd eye almost completely for free. So when they say Simultaneous Multi-Projection (I don't want to call it SMP since that acronym already exists) allows rendering multiple viewports in a single pass, how much of the rendering process does it "do for free" for the additional viewports?

            • Meadows
            • 3 years ago

            Okay, “180 fps” then, in funny quotes. As in, one normal frame and one kind-of-free-but-not-really frame.

            • cygnus1
            • 3 years ago

            This is kind of a spoiled, condescending attitude to take. Who knows, maybe you’ll uncover some issue with the cards outputting high frame rates or something. I know I’d like to see the cards tested at 1080p to see exactly how well they scale.

            Hopefully the real answer is that you just don’t have the time or don’t have the equipment to test the way that’s been asked.

          • chuckula
          • 3 years ago

          Well in that case, Jeff can spare himself the effort because I could just tell you the results of the test:

          1080p resolutions: So fast that it’s immaterial because the refresh rate of your monitor can’t keep up.

            • sweatshopking
            • 3 years ago

            My monitor can keep up with the 30fps these puppies get on The Witcher 3, without even max settings (granted it's 30fps at 1440p, but it's not going to hit a solid 60fps from a drop to 1080p, AND IT'S NOT EVEN MAX SETTINGS).

            • Andrew Lauritzen
            • 3 years ago

            Tons of people disagreed with that last time, citing many reviews of frames being dropped even by 980 Tis with 60Hz monitors… and I know from experience that it definitely isn't true for 144Hz monitors 🙂

            It's all well and good to test the raw throughput @ high resolutions and so on, but that doesn't change the fact that that is a *different*, and frankly easier, workload than when serial bottlenecks and arch-specific stuff come up while trying to push up refresh rates for VR or 144Hz displays. We need to understand that at least as much as understanding stupidly high resolutions 🙂

            • chuckula
            • 3 years ago

            OK LOOK MAN. I KNOW THAT YOU ARE ALL ABOUT THE ~~BASS~~ IGP, NO DISCRETE CARDS, BUT SOME OF US HAVE RECENTLY UPGRADED BEYOND 1080p!

            • sweatshopking
            • 3 years ago

            Imitation is flattery. Thanks, chuck

            • Andrew Lauritzen
            • 3 years ago

            😀

            I actually don’t even remember the last time I had a 1080p monitor… it would be >10 years ago at this point! That said, the stats don’t lie… I’m the minority :S

            After my recent upgrade to a 144Hz monitor as well, I’m a total convert to high refresh rate. Need moar games running >120!

            • JustAnEngineer
            • 3 years ago

            I’ve [b<]never[/b<] purchased a 1080p PC monitor for my personal gaming use. I've had my 2560x1600 UltraSharp 3007WFP for a decade. My second monitor sitting beside it is a 1200x1600 UltraSharp 2001FP that I've had for 15 years. My ultrabook, however, does have a 1080p display, as does my smartphone.

          • USAFTW
          • 3 years ago

          Why on earth would you test a 1080 at 1080… Oh.

            • jihadjoe
            • 3 years ago

            lol!

            Srsly though, 1080P will still be relevant if you target high frame rates. Looking at TPU's review, there are still a couple of games where even the 1080 isn't able to hold a constant 120/144Hz at 1080P.

          • Ninjitsu
          • 3 years ago

          Yup, I agree with this completely. If there are no (or exceedingly few) frames over 16.7ms at max settings and at least 4x AA, these cards can do 1080p @ 60fps – otherwise they can't.

          (Accommodating for benchmark quirks, of course)

            • travbrad
            • 3 years ago

            Why is 16.7ms/60FPS the limit where we have deemed everything is fine, though? Just because that's what most LCD displays used for years? It's very easy to see framerates/refresh rates way above 60FPS/Hz. A lot of people were gaming at higher refresh rates on their CRTs…

            • Ninjitsu
            • 3 years ago

            Well, I suppose for me it’s because then I can enable vsync/adaptive vsync and not go below 60 fps, resulting in smooth animation and no tearing.

            Anyway, I’m totally cool with keeping 1080p around for higher frame rates, although I won’t be able to argue that most people who own 1080p monitors have >60Hz monitors.

        • sweatshopking
        • 3 years ago

        I’m with meadows. I don’t think the 980 is enough for 1080p gaming, and i’m sceptical the 1080 will be able to handle 60fps on max settings on many games @ 1080p. I’d definitely want to see it.

          • DPete27
          • 3 years ago

          I know/hope you’re joking. If not, [url=http://www.pcper.com/reviews/Graphics-Cards/GeForce-GTX-1080-8GB-Founders-Edition-Review-GP104-Brings-Pascal-Gamers/Dirt-<]read a review[/url<]

            • DancinJack
            • 3 years ago

            And here are some 1080p results in case PCPer's crappy graphs and 1440p aren't enough evidence for them: http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1080_review,13.html

            • sweatshopking
            • 3 years ago

            Which results? The Witcher, which dips down to 30FPS without even max settings ON THE GEFORCE 1080??!
            GTA V, which is at 43, again without max settings?!

            GREAT READING, GUYS.
            Seriously, though, you're proving my point. These things are going to struggle under TW: Rome 2 on high fer sure. Warhammer TW will destroy these cards. Not fast enough.

            If you're gonna minus me, fine. BUT LET'S BE CLEAR. THESE CARDS ARE NOT GOING TO RUN EVERYTHING ON MAX AT 60FPS @ 1080P.

            • DancinJack
            • 3 years ago

            I think you may be reading it wrong… Where do you see the 1080 dropping to 30FPS @ 1080p in The Witcher?

            • sweatshopking
            • 3 years ago

            Slight hyperbole, but: http://www.pcper.com/image/view/69341?return=node%2F65365 and http://www.pcper.com/files/fixed/gta5-screen1.jpg from the same review, and neither of these is max settings. Regardless, TW: Warhammer is going to cream these things.

            • DancinJack
            • 3 years ago

            It’s not Nvidias (or AMD’s) fault the TW games are optimized super awfully. Notice that most everything else, if not all, runs just fine.

            • sweatshopking
            • 3 years ago

            Not just TW. GTA V is at 43 fps. I would want to game with these at a higher-than-1080p resolution. Just me. I like smooth vsync and max settings. The 1080 is close, but for $700 USD, it's barely making the cut. I'll wait for next gen.

            • Froz
            • 3 years ago

            Regarding that first link, here’s a quote from that review, just below the graphic you linked:

            “Our manual test run in The Witcher 3 has a brief transition between cut scene and game play at about the 25 second mark, so you can go ahead and ignore the spikes in frame times and dives in frame rates there”

            And by the way, that is not 1080p.

            The second link is most likely from the 4K test, but the review doesn't really say.

            I agree about TW games, though. It's not just about their optimization, which usually happens some time after a game is published. I think it's more that they have an atypical number of detailed objects on screen at the same time, plus they also stress the CPU more than your usual FPS.

            • sweatshopking
            • 3 years ago

            I did say it wasn’t 1080. Good to know about the witches. It still isn’t max settings though. Looks like it is getting close, but not there quite.

            Tw games sure do make a system work. They’re also what I play. I don’t fps. Just strategy games, amd since this baby will probably cry under Warhammer 1080p, I don’t think it is far to say they’re sooooo overkill for 1080p. Tw isn’t an obscure game. Neither is gtav and this thing will struggle. Optimization? Maybe. I don’t really care though. I want 60fps min, and max settings. I’m not buying a new gpu until I can accomplish that.

            • Froz
            • 3 years ago

            "I did say it wasn't 1080."

            No, you didn't, at least not here. It is also a question of what you consider "max". Technically, max would be using that tech (can't remember the name) that renders everything at quadruple the real resolution. If that's what you mean, then yeah, we are far, far away from stable 60 FPS for 1080p gaming. And could you stop implying GTA V is struggling at 1080p, or prove it? The links mentioned here do not show that at all; in the pcper.com article it never dips below 60 FPS at 1440p. As I already said, that 43 FPS screenshot must be from the 4K test, as that is the only one that has that kind of FPS numbers.

            • sweatshopking
            • 3 years ago

            I said it was hyperbolic and was referring to the resolution. Could have been clearer, but that's what I meant.
            No, what I mean is all graphics options that are open to all cards being set to high. In the case of The Witcher, HairWorks, which functions on AMD and Nvidia, is off.

            Might be 4K. That'd be pleasant.

            • Andrew Lauritzen
            • 3 years ago

            For a 60Hz monitor, you might be fine… but it still doesn’t hit 144Hz in *any* of the titles. Doesn’t even hit 120Hz in a lot of them even at 1080p.

            And most importantly, these are just averages. I want to see that it *never* drops a frame at 120Hz or similar. That is frankly way more important than 4k.

            Honestly 2560×1440 120Hz+ displays are probably the sweet spot of testing, but as Steam indicates 1080p is what people have. Thus it needs to be tested in addition to whatever else.

            • beck2448
            • 3 years ago

            http://www.hardocp.com/article/2016/05/17/nvidia_geforce_gtx_1080_founders_edition_review/14#.Vzvd9Z8xXqA

            A gold award from one of the few sites, along with Tech Report, that exposed the fact that fps wasn't telling the real story.

        • the
        • 3 years ago

        I do prefer to have the GTX 770 included. There are a few areas where there is a clear distinction between the two cards.

        A GTX 680 on the other hand is a good proxy for the GTX 770 since they’re literally the same chip.

        • Flapdrol
        • 3 years ago

        A 960 isn’t a good proxy, in doom for instance the 960 outperforms the 780.

      • jensend
      • 3 years ago

      The idea that anyone should even consider purchasing such a card for 1080p is reliant on falsehoods about high framerates.

      For real-world video, credible studies have consistently shown that somewhere in the 20-25 ms range humans can no longer distinguish faster rates in double-blind tests. There are also good biological reasons (understanding the reactions going on in the rods and cones) to think this is pretty much the limit.

      But wait, you’ll say, I’m sure I can tell a difference when I’m getting better than 50fps. What gives?

      1. Inconsistent frame times. The main reason people associate a 100+ fps counter with happiness is that it comes with fewer painfully slow >40ms frames. Getting to the root of this is the “inside the second” revolution.

      2. Monitor refresh rate timing interactions. If 15ms of extra latency is added to your 18.33 ms frame because it missed the 60Hz refresh interval, you can notice that. With wide-range variable refresh this problem goes away.

      3. Input lag problems esp. on poorly coded games.

      4. Temporal aliasing. Normal rendering is like having a video camera with an infinitely fast shutter; natural video has the advantage of integrating information throughout the shutter duration. So the perception threshold for no-motion-blur CGI would be faster than that for natural video. But we’re not talking about 2x as fast here.

      If you are getting consistent 13.33 ms frametimes (75fps), you have variable refresh, and the game’s input loop is decent, you simply would not notice any difference switching to a different card that gets consistent 8.33 ms frametimes (120fps) in a double-blind test. TR is interested in testing for real perceptible differences rather than snake oil.

      (Remember, the jump from 50 fps to 60 fps is the same size as the jump from 60 to 75, 75-100, 100-150, or 150-300; it’s just 3.3ms in each instance. Using the inverted measure – fps or Hz – is misleading here; we should use ms instead.)
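
      A quick way to see the arithmetic in that parenthetical (a trivial sketch, nothing more):

```python
# Frame-time deltas for the fps jumps mentioned above: every pair differs by
# the same ~3.33 ms even though the fps gaps look wildly different.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

for low, high in [(50, 60), (60, 75), (75, 100), (100, 150), (150, 300)]:
    delta = frametime_ms(low) - frametime_ms(high)
    print(f"{low:>3} -> {high:>3} fps: {delta:.2f} ms shaved off each frame")
# Every line prints 3.33 ms, which is why the inverted measure (fps) misleads.
```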

      In conclusion: if you have the cash to even think about shelling out for a $700 video card, you shouldn’t be using a $200 1080p fixed-but-high-refresh monitor. Getting yourself a higher-resolution monitor with variable refresh will make a much bigger perceptual difference than e.g. the 5ms difference between 100fps and 200 fps.

    • DPete27
    • 3 years ago

    Looking like no FreeSync support?

      • Ninjitsu
      • 3 years ago

      Well there’s “Fast Sync” instead (which is some new vsync thing apparently).

        • brucethemoose
        • 3 years ago

        From what I understand, it adds quite a bit of latency if you aren’t already getting really high frame rates.

        In other words, it looks like fancy triple buffering to me.

          • Andrew Lauritzen
          • 3 years ago

          Not even “fancy” triple buffering… it looks exactly like triple buffering. They aren’t really clear on why it exists or whether it really needs hardware vs. just OS/drivers. You can clearly already do what they are describing in the application with FLIP chains in Windows 8+, so I’m not really sure what secret sauce they are proposing or whether it’s just a driver override.

            • Ninjitsu
            • 3 years ago

            Yeah, "fancy triple buffering" is what I thought of too as I was reading PCPer's article just now. Came here to see if anyone was talking about it 😀

            Nvidia *claim* the latency is lower and stuff, so not sure what's up. IIRC the triple buffering option used to be recommended with vsync on (in the Nvidia control panel).

            • cygnus1
            • 3 years ago

            The way I read it is that Fast Sync will use more than 2 back buffers (not sure how many frames will actually be buffered), and Nvidia has some secret-sauce algorithm that will pick frames based on how much in the scene has changed, all with the aim of keeping animation smooth. I can't see it being that helpful unless you are rendering an absurd number of FPS and trying to output at 50Hz or 60Hz or something like that.
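
            To make the "render fast, display only the newest frame" idea concrete, here's a minimal last-frame-wins sketch of classic triple buffering, which is what Fast Sync is being compared to in this thread (purely illustrative; not Nvidia's actual implementation):

```python
# Toy last-frame-wins buffering: the engine renders as fast as it can, but at
# each display refresh only the newest completed frame is scanned out; older
# back-buffer frames are simply dropped.
RENDER_FPS = 200          # hypothetical engine frame rate
REFRESH_HZ = 60           # hypothetical display refresh rate

def simulate(seconds: int = 1) -> int:
    render_times = [i / RENDER_FPS for i in range(seconds * RENDER_FPS)]
    frames_shown = 0
    last_shown = None
    for r in range(seconds * REFRESH_HZ):
        scanout = r / REFRESH_HZ
        ready = [t for t in render_times if t <= scanout]   # completed frames
        if ready and ready[-1] != last_shown:
            last_shown = ready[-1]                           # newest frame wins
            frames_shown += 1
    return frames_shown

shown = simulate()
print(f"rendered {RENDER_FPS} fps, displayed {shown} fps, "
      f"dropped {RENDER_FPS - shown} frames per second")
```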

            • jts888
            • 3 years ago

            If the goal is smoothness when rendered frame rate exceeds display refresh rate, wouldn’t a simple renderer frame rate cap be better all around?

            • cygnus1
            • 3 years ago

            I agree. I can only assume this is intended for some game engines that lock the simulation to the frame rate and users would want it running as fast as possible but to still maintain smooth animation. I have no idea which games might fit that description, but I know I’ve heard games described that way.

            • Andrew Lauritzen
            • 3 years ago

            Yeah that’s all I got from it as well… let game engine run faster so it can run simulation faster.

            But with the comments on the PCPer stream about it actually *negatively* affecting animation smoothness if the engine rate is “slightly above vsync” or similar, it sounds pretty much exactly like bog standard triple buffering, so not sure what’s new here.

            • cygnus1
            • 3 years ago

            I’m not shocked it’s jerky. I don’t know how you can expect to basically do frame decimation and not end up with jerky animation.

            • Andrew Lauritzen
            • 3 years ago

            Right. It would be interesting if they were doing some sort of timewarp-style interpolation on the back end, similar to what The Force Unleashed II did to "upsample" from 30 to 60. But that involved some game-engine trickery and actually *adds* a bit of latency, so it's not clear that it's useful for the cases they are targeting here.

            Honestly the way adaptive sync works is fine…
            1) Get a 120Hz+ monitor
            2) VRR below monitor refresh
            3) Lock to monitor refresh if you ever manage to hit it

            That’s pretty much the best case anyways. To get absolute smoothness for stuff like VR, stay comfortably below the point where you’d ever miss a frame.

            • Flapdrol
            • 3 years ago

            http://www.neogaf.com/forum/showpost.php?p=203905992&postcount=430

            If I understand correctly, this Fast Sync thing can let the CPU spit out as many frames as possible and the GPU choose which ones to render, skipping them if there is a more up-to-date frame available.

            • Andrew Lauritzen
            • 3 years ago

            Yeah that’s my understanding as well… but that’s exactly how triple buffering works today πŸ™‚ Same with “iFlip” in the new Win10 swap chains, etc.

            • Flapdrol
            • 3 years ago

            triple buffering works today?

        • _ppi
        • 3 years ago

        According to what was said on PCPer, this "Fast Sync" works like (pre-patch?) UWP games from the Windows Store with VSync off (effectively VSync is on, but the game tries to draw one or more extra frames before the refresh and the old ones are discarded).

        Everyone scolded it. No reason why we should like it just because it's by Nvidia this time.

      • f0d
      • 3 years ago

      FreeSync (or G-Sync too, I'm guessing) isn't as good as the hype.
      I have a FreeSync monitor and an R9 290, and unless I'm under 50fps I can't tell whether it's on or off in the games I play.

      And there's no way I'd want to play any games at 50fps or lower, FreeSync or not.

        • Airmantharp
        • 3 years ago

        You do realize that the point of VRR is to smooth out lower framerates, right?

        Also, your experience will vary with the FreeSync monitor in question.

          • f0d
          • 3 years ago

          Why even play at the low framerates that it enhances?
          It still feels like a low framerate and looks like a low framerate, so you’re better off dropping a few settings and getting back to 70+ fps.

            • Firestarter
            • 3 years ago

            What if the game doesn’t need high framerates? In such a game, if you don’t look at the framerate for once, can you even tell that the framerate is low if you have FreeSync on? If not, then FreeSync just achieved its goal.

            If all you do is stare at the FPS counter, then it’s never going to work.

            Besides, some people rather enjoy eye candy and prefer not to turn down those settings unless necessary. It’s a trade-off after all.

            • f0d
            • 3 years ago

            [quote<]can you even tell that the framerate is low if you have FreeSync on? If not, then FreeSync just achieved its goal.[/quote<]
            Easily.
            I’m guessing you are one of those “the human eye can’t see any more than 60fps/Hz” type of people?

            • Firestarter
            • 3 years ago

            I have a 120hz screen but I’ll be the first to concede that when the view is stationary I cannot tell whether said stationary view is being refreshed 120 times per second or not. Some games just don’t require that many FPS because of what they are; other games are competitive first person shooters where anything other than the fastest available strobed backlight display is a competitive disadvantage. Variable refresh displays are objectively better than conventional displays for a good portion of the spectrum that lies between those extremes, and when better is free, as is the case with FreeSync, who doesn’t want that?

            Nvidia doesn’t want that, that’s who

            • f0d
            • 3 years ago

            I have nothing against Nvidia supporting FreeSync.
            I just don’t think VRR is as useful as the hype after using one, although admittedly I don’t have any games where I stare at a wall for minutes at a time that could make it useful.

            High refresh is WAY more of an improvement, yet VRR gets all the attention. Why don’t people make as much fuss about more monitors supporting high refresh?

            • anotherengineer
            • 3 years ago

            Money.

            If someone can get a cheap card and a cheap monitor and avoid tearing and jerkiness of typical non-freesync systems then that’s a win for them.

            I bought a 120Hz display about 6-7 years ago for about $330, it’s old tech, same thing now should be $120, but it’s not.

            I’m all for high refresh rates, but honestly the cost for me, with a family, for a luxury like gaming is too prohibitive. I would need to buy another 120Hz monitor since I ditched mine, a video card, and basically a whole new system to push those frames.

            If I can get a FreeSync card for $175 and a nice 2560×1440 IPS FreeSync display for $300 (which I probably can’t, due to possible collusion or something), I could get at least something that would be playable without the tearing and jerkiness of a non-FreeSync setup.

            Also when you have a nice 120hz capable set up and then FPS capped games get pushed out, it’s a kick in the nads.

            So I hear ya, but it is what it is. To make matters worse, I have seen Best Buy slapping “Gamer” on crap 22″ TN panels, claiming 1ms response time, and selling them for almost $200. The whole “Gamer” moniker is getting used to mark prices way up, and it’s having a detrimental effect on getting 120Hz screens into the mainstream and prices down.

            • BurntMyBacon
            • 3 years ago

            [quote="f0d"<]High refresh is WAY more of an improvement, yet VRR gets all the attention. Why don’t people make as much fuss about more monitors supporting high refresh?[/quote<]
            [quote="anotherengineer"<]Also when you have a nice 120hz capable set up and then FPS capped games get pushed out, it's a kick in the nads.[/quote<]
            This right here is enough for me to consider Sync monitors. Though I haven’t yet pulled the trigger on one.

            • Ninjitsu
            • 3 years ago

            Money, exactly.

            • travbrad
            • 3 years ago

            You are missing out f0d. Wall simulator 2016 is the best yet in the series.

            • Firestarter
            • 3 years ago

            Because adaptive sync actually solves a problem, while high refresh rates only mitigate it. Fixed refresh rates have been a bane of PC gaming for [i<]decades[/i<], and people have been trying to fix the resulting annoying tearing for about as long with ham-fisted non-solutions like vertical sync and triple buffering. That was the case back when people were playing Quake 3 competitively on 100hz CRTs, just as it is now on any normal LCD, regardless of whether it's 60hz or 144hz.

            Yes, a display at 144hz is fast enough that tearing is a non-issue for many people in many games, because it happens less often and is less visible, but it's still there. An adaptive sync display can show that game at the same frame rate and finally eliminate that tearing without introducing lag.

            Every day that Nvidia refuses to adopt the VESA standard that finally does away with this problem in a cross-vendor way that isn't burdened with copyright or licensing BS prolongs the existence of a problem we should have pronounced good and dead years ago. The fact that they and their partners are charging money to solve it in the meantime with their proprietary solution only adds insult to injury.

            • BurntMyBacon
            • 3 years ago

            [quote<]Yes, a display at 144hz is fast enough that tearing is a non-issue for many people in many games ...[/quote<]
            I don't have any trouble finding tearing at 144Hz. In my experience, it actually happens more often than at 60Hz, though it's on the screen less than half the time per offense. From a technical standpoint, there is no reason to believe it should tear less at higher refresh rates, but the tears should have less horizontal shifting and less screen time. Given that the graphics card immediately moves on to the next frame after a frame render is complete (and doesn't in fact sit back to enjoy some well-deserved donuts), the chances that you will only be partway done with a frame when it comes time to display are the same regardless of how short or long the cycle is.

            Vsync does solve the frame tearing problem, but it introduces a variable lag problem. Also, if the framerate moves above and below the monitor's refresh rate, you will experience a discontinuity between the game simulation and the displayed scene.

            Frankly, I'm a little surprised that someone capable of perceiving the improvements that higher refresh rate monitors bring would miss the tearing and/or lag problem. These drive me nuts. I'd rather have continuous motion that more accurately reflects simulation time at the expense of a slower screen refresh (to a certain extent) than a faster screen refresh that either has frame tearing or increases input lag and has motion discontinuity. I'm still waiting for a decent price on a non-TN film 120Hz or better "Sync" monitor before I pull the trigger, though.

            • Firestarter
            • 3 years ago

            What I mean is that when switching from 144hz to 60hz tearing becomes immediately noticeable and distracting, along with the extra lag and judder. It’s still there at 144hz (or 120hz in my case) but when you see the comparatively huge magnitude of the problem at 60hz it’s clear to me that many people who are annoyed by tearing at 60hz might find it almost invisible at 120hz. It’s still there and I can see it, but it’s so much reduced that my brain has an easier time filtering it out. That said, after playing for a while at 60hz I find myself adjusting to it and becoming less bothered by it, even though it’s still visible plain as day.

            What I don’t know but what I can guess is that people with an adaptive sync display have a similar reaction when switching from adaptive sync to fixed 144hz. I bet the suddenly re-introduced tearing annoys them, but after a while it becomes less visible as the brain adjusts to it

            • BurntMyBacon
            • 3 years ago

            [quote<]What I don't know but what I can guess is that people with an adaptive sync display have a similar reaction when switching from adaptive sync to fixed 144hz. I bet the suddenly re-introduced tearing annoys them, but after a while it becomes less visible as the brain adjusts to it[/quote<]
            Pretty much correct in my experience. For me, the downsides of a TN-film panel on a 27" monitor outweighed the benefits of "Sync". So I'm back to a 120Hz fixed refresh rate IPS until such time as a suitable "Sync" replacement comes down to a suitable price.

            [quote<]What I mean is that when switching from 144hz to 60hz tearing becomes immediately noticeable and distracting, along with the extra lag and judder. It's still there at 144hz (or 120hz in my case) but when you see the comparatively huge magnitude of the problem at 60hz it's clear to me that many people who are annoyed by tearing at 60hz might find it almost invisible at 120hz.[/quote<]
            I definitely agree here. My statement about people not noticing the lag and judder was a general one and not targeted at you. 🙂

            • Chrispy_
            • 3 years ago

            You’re missing the other components of adaptive sync, ones that (as a competitive gamer) you *should* care about.

            Adaptive sync reduces input lag by getting the frame to your monitor faster than a fixed rate display otherwise would.

            Adaptive sync improves animation smoothness and reduces tearing, providing a cleaner, easier-to-track field of view over which to hunt for moving targets.

            Yes, in an ideal world you’re getting 144Hz and 144FPS at a perfect, lag-free 1:1 ratio but that never happens. You either get the visual noise and motion-tracking disruption of tearing without vsync, or you get input lag and animation disruption with vsync. Adaptive sync allows you to have your cake *and* eat it, providing the visual noise improvements of vsync whilst giving you the animation smoothness and latency reduction you normally only get without.

            • Andrew Lauritzen
            • 3 years ago

            > Adaptive sync reduces input lag by getting the frame to your monitor faster than a fixed rate display otherwise would.

            Double buffered vsync-locked with no frames dropped is still the gold standard. Gsync is better than vsync when you fall below refresh rate, but it’s not better than not falling below refresh rate in the first place 🙂

            It’s clearly a better solution than classic vsync when rendering below refresh rate, but f0d is correct in that simply hitting vsync all the time is the ideal. That said, it’s a much better parachute and there’s no reason not to want both high refresh rate *and* variable refresh. Hitting vsync consistently on a 144Hz monitor is… uhh… “challenging” 🙂

            • BurntMyBacon
            • 3 years ago

            [quote<]Double buffered vsync-locked with no frames dropped is still the gold standard. Gsync is better than vsync when you fall below refresh rate, but it's not better than not falling below refresh rate in the first place :)[/quote<]
            I'm not sure I'd really consider that the gold standard anymore. I wrote a long description of why, but it got too ... long ... and it seems like you have a good understanding of the subject anyway, so I'll summarize it as follows:

            If your frame rates are consistently and invariably above the monitor refresh rate, double buffered vsync and variable refresh effectively achieve the same thing (if slightly differently). The monitor always transitions to the new frame that has been waiting some delay less than a refresh cycle at the beginning of every screen refresh. This delay depends on when the video card completes the frame and doesn't really vary based on sync technique. In this respect, I call them even. However, below the maximum refresh rate of the monitor, variable refresh techniques fare far better than vsync (as you stated). With no downside, that makes variable refresh the new gold standard in my mind. The bigger question is whether it is worth the price (I haven't pulled the trigger yet).

            I completely agree with your conclusion, so I'm going to borrow it here.
            [quote<]That said, it's a much better parachute and there's no reason not to want both high refresh rate *and* variable refresh. Hitting vsync consistently on a 144Hz monitor is... uhh... "challenging" :)[/quote<]
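
            For what it's worth, a back-of-envelope model of when a finished frame actually reaches the screen shows the same thing: identical behavior above the panel's refresh rate, VRR pulling ahead below it. The 60Hz-max panel and the frame completion times are assumptions, and it ignores scan-out time and low-framerate compensation:

            [code<]
            // Back-of-envelope model: when does a finished frame reach the screen under fixed
            // 60Hz vsync vs. variable refresh on a 60Hz-max panel? Frame completion times are
            // made up; scan-out time and low-framerate compensation are ignored.
            #include <algorithm>
            #include <cmath>
            #include <cstdio>

            int main() {
                const double T = 1000.0 / 60.0;               // refresh period = minimum VRR interval
                const double done[] = { 10.0, 40.0, 75.0 };   // ms at which frames finish (assumed)

                double last_vrr = 0.0;                        // time of the previous VRR scan-out
                for (double t : done) {
                    double vsync = std::ceil(t / T) * T;      // fixed refresh: wait for the next vblank
                    double vrr   = std::max(t, last_vrr + T); // VRR: refresh as soon as the panel allows
                    last_vrr = vrr;
                    printf("frame ready %5.1f ms -> vsync shows it at %5.1f ms, VRR at %5.1f ms\n",
                           t, vsync, vrr);
                }
                return 0;
            }
            [/code<]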

            • BurntMyBacon
            • 3 years ago

            @f0d

            Serious question:
            On a 144Hz refresh rate monitor, can you tell the difference when you drop into the high frame rate (120Hz – 143Hz) VRR range? I didn’t think to check this during the time I had a high refresh rate (unfortunately TN-film) G-Sync monitor before I decided to return it.

            • travbrad
            • 3 years ago

            [quote<]what if the game doesn't need high framerates? In such a game, if you don't look at the framerate for once, can you even tell that the framerate is low if you have freesync on? If not, then freesync just achieved its goal [/quote<] I can't comment on freesync, but with gsync yes it feels noticeably choppier any time the FPS drops below 60ish or so, and even 60 feels pretty choppy compared to say 70-80+. IMO every game benefits from higher framerates. Maybe some to a lesser degree than others in competitive terms, but it's always noticeable visually.

            • Milo Burke
            • 3 years ago

            If you display new frames when they’re ready instead of according to the display’s strict schedule, you remove some of the jankiness inherent at your given frame rate. This doesn’t improve your frame rate, it just makes lower frame rates more tolerable.

            Scott said something to the effect that “with variable refresh, 40 fps is the new 60 fps.” One could extrapolate that to mean that 80 fps with variable refresh feels as smooth as 120 fps without.

            It’s up to the user to decide, at this point, whether to trade that smoothness advantage for higher settings, a more cost effective GPU, or keep everything the same and appreciate the increased smoothness.

            • travbrad
            • 3 years ago

            [quote<]Scott said something to the effect that "with variable refresh, 40 fps is the new 60 fps."[/quote<] I have great respect for Scott and all the great testing/writing/reviews he has done over the years, but in this case I just completely disagree. 40FPS with VRR still doesn't look good to me, nowhere near as good as 60FPS without VRR. Maybe it looks as good as like 45-50FPS without VRR at best.

            • Milo Burke
            • 3 years ago

            Thanks for your opinion.

            I look forward to making my own opinions on this topic.

            Edit: What’s your setup?

            • travbrad
            • 3 years ago

            2500K @ 4.5GHz, 16GB DDR3-1600, and a GTX 970 would be the main specs, plus a 1080p 144Hz G-Sync monitor from Philips. I actually don’t regret getting a G-Sync monitor, since it was only about a 20% difference in price compared to non-G-Sync monitors of the same size/resolution/refresh rate. G-Sync does make a small difference, and since I will likely be using it for years to come, I feel it was worth the extra money. It’s just not the earth-shattering “game changer” that many reviews would seem to indicate, IMO. The higher refresh rate makes a MUCH bigger difference than G-Sync for me, especially if you can actually achieve those higher framerates.

            P.S. Don’t poke me with pitchforks too hard for going for 1080p/higher framerates instead of the master race 1440p. 😉

            • Milo Burke
            • 3 years ago

            You’re safe. I loaned my pitchfork to my neighborino Homer and haven’t diddly gotten it back yet.

            • Chrispy_
            • 3 years ago

            It depends on how sensitive your eyes/brain are to motion. I find 40 to be juddery on VRR displays, but 45 is pretty good.

            The same problem has existed for ages: some people found that 60Hz CRTs didn’t flicker; others like me thought even 75Hz was a flickery mess. I find 30fps console gaming and 24fps movies painful in certain scenes; other people say that 24fps is fine. It’s all a matter of opinion and everyone’s opinion will vary.

            I forget what the study was called but the last time a large group of people were tested on their perception of framerate smoothness, the golden number (median result) was 41.5fps, which I think tallies pretty well with how I feel, and explains why I dislike 30fps and 24fps content, whilst loving 85Hz with vsync (because even if it drops a frame, half of that is still quicker than 41.5fps)

            • Ninjitsu
            • 3 years ago

            Oh god anything below 85Hz on CRTs was headache inducing…the bad memories…

            • _ppi
            • 3 years ago

            Perhaps it depends on past experience. I played the original Unreal in the 20-30 fps range, sometimes dropping into the 15s (now that was really ugly). I just could not afford a new computer back then, but I wanted to play it through. Since then, I prefer image quality over fps, as long as the framerate does not drop below 30. But then, I mostly play RPGs and strategy games, where high fps is not that necessary.

            The simple fact is that once a game is too demanding for your computer, you run into a combination of:
            (i) Inability to improve performance much at all. Going from 30 to 35 fps will not save the day; and
            (ii) Too big an image quality sacrifice to enjoy the game. E.g., in a game like Skyrim, I just have to have a long viewing distance.

            Therefore, adaptive sync technologies are a saving grace for me. Now where is that 4K 120Hz HDR monitor … And it pisses me off that nVidia does not support the open standard, though I bet they have it ready with Pascal and could now just turn the knob.

        • Billstevens
        • 3 years ago

        It doesn’t need hype; it is just the most functional, nearly free option to eliminate screen tearing without the drawbacks of vsync.

        So Nvidia, just to be different and to keep G-Sync relevant, ignores it and introduces another half measure for those not willing to shell out $300 extra for their monitor.

        But it is no surprise they aren’t supporting freesync. With their market share there is no reason to assume it’s a necessary feature.

        • slowriot
        • 3 years ago

        In CS:GO, yes, I want max FPS to take advantage of my 144Hz max refresh rate. The Witcher 3? I want max visuals with a good enough frame rate to get a smooth experience, and FreeSync/G-Sync help a lot toward that goal.

      • Flapdrol
      • 3 years ago

      Would be disappointing.

      I figured if Nvidia was serious about G-Sync long term they’d have made an ASIC version by now, so I guessed adaptive sync support would be in, maybe in some driver update.

    • DancinJack
    • 3 years ago

    Why does this happen to TR so much? Being truthful and objective doesn’t get you preferential treatment from hardware PR depts?

      • Wildchild
      • 3 years ago

      Nvidia not providing TR a 1080 and the whole Founders Edition launching first at $699 all comes off as really sketchy to me.

        • DancinJack
        • 3 years ago

        I don’t think you should look at it as a one-off. There are other sites that haven’t had the card long enough to evaluate the way they like, but these situations happen to TR more often than most other major sites.

      • Jeff Kampman
      • 3 years ago

      There’s a lot going on behind the scenes that I can’t talk about, but Nvidia helped us get this card, so it’s not fair to demonize the company for the delay. We’d have liked to have a launch day review, but that just wasn’t to be. Hopefully we’ll return to that sort of coverage in the near future.

        • chuckula
        • 3 years ago

        Thanks for the info Jeff.

        Reviews have to be fair & all 😉

          • BurntMyBacon
          • 3 years ago

          [quote<]Reviews have to be fair & all[/quote<] Seems I recall hearing something like that not too long ago?

        • Redocbew
        • 3 years ago

        Personally, I don’t mind waiting if it takes a little while longer for TR to do a proper deep dive here.

          • DancinJack
          • 3 years ago

          That’s always the sentiment, and I am in the same boat. That doesn’t change the facts though. It happens a lot, and I was just wondering if there was an answer to that question.

        • derFunkenstein
        • 3 years ago

        It’s not like you can go out and buy this card today – and I doubt you’ll be able to go out and buy one the day you publish a review. Further, I’m pretty sure you won’t be able to buy one on May 27, considering all the hype.

        • Anovoca
        • 3 years ago

        let me guess, they mailed it to Scott by mistake.

          • sparkman
          • 3 years ago

          And he is playing Civilization on it all day.

        • Neutronbeam
        • 3 years ago

        You may not have the first review but you will have the BEST review! All hail Jeff!

        • sparkman
        • 3 years ago

        I’d take a wild guess that nVidia is conflicted about how much to support TR given the staff connection to AMD.

        That doesn’t make nVidia evil. You don’t normally send free product samples to your competitors. In this case I believe TR is independent and will produce unbiased reviews, but you can’t automatically expect some random marketing employee at nVidia to know that.

        • Jigar
        • 3 years ago

        Please do use the factory OCed GTX 980Ti like you use in other reviews.

          • torquer
          • 3 years ago

          Yeah the one person on earth using stock clocks wants to see how it performs against the nearly un-overclockable competition

    • tipoo
    • 3 years ago

    Request for the full review: please look into async compute. I know that was the big bugbear with Maxwell, with Nvidia promising it but apparently handling the scheduling on the CPU, not on the GPU like GCN. So while GCN could handle the full 128 (!) compute sources without longer execution times, doing 10x (!!) the compute work without slowing down, Maxwell so far could not. Nvidia still says they're implementing it in drivers.
    To Nvidia's credit, a low amount of compute work ended up being faster on Maxwell than on GCN.

    [url<]http://www.extremetech.com/extreme/213519-asynchronous-shading-amd-nvidia-and-dx12-what-we-know-so-far[/url<]
    [url<]https://forum.beyond3d.com/threads/dx12-performance-discussion-and-analysis-thread.57188/page-11[/url<]

    The second link is particularly interesting: someone wrote a script to keep increasing the number of compute sources, and you can see a stair stepping on Maxwell (time increases - it can't do the work asynchronously) while there's a flat time on GCN even up to 128 sources (it's doing the work asynchronously - no increase in time, just using idle resources). If TR could muster something like that it would be cool. Polaris vs Pascal async compute will be very interesting.
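
    I can't write the D3D12 version here, but the shape of that measurement is easy to show with a CPU analogy: submit an increasing number of independent work items and see whether total time stays flat (the extra work is absorbed by idle execution resources) or climbs in steps (it gets serialized). This only illustrates the methodology, not an actual GPU test; the 10ms work items and the use of std::async are arbitrary choices:

    [code<]
    // CPU analogy of the Beyond3D-style probe: time an increasing number of independent
    // work items. Flat timings mean the extra items ran concurrently on idle execution
    // resources; a jump every time the count exceeds capacity is the "stair step" pattern.
    // This illustrates the measurement shape only; it is not a D3D12/GPU test.
    #include <chrono>
    #include <cstdio>
    #include <future>
    #include <vector>

    void busy_work_ms(int ms) {
        auto end = std::chrono::steady_clock::now() + std::chrono::milliseconds(ms);
        while (std::chrono::steady_clock::now() < end) { /* spin */ }
    }

    int main() {
        for (int n = 1; n <= 16; ++n) {
            auto start = std::chrono::steady_clock::now();
            std::vector<std::future<void>> jobs;
            for (int i = 0; i < n; ++i)
                jobs.push_back(std::async(std::launch::async, busy_work_ms, 10));
            for (auto& j : jobs) j.wait();
            auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                          std::chrono::steady_clock::now() - start).count();
            printf("%2d concurrent items: %3lld ms\n", n, static_cast<long long>(ms));
        }
        return 0;
    }
    [/code<]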

      • Ninjitsu
      • 3 years ago

      Nvidia apparently did claim support in the presentation. (possibly already covered by your links, didn’t check, lazy 😛 )

      [url<]http://videocardz.com/59962/nvidia-geforce-gtx-1080-final-specifications-and-launch-presentation[/url<] (scroll to the end). EDIT: even the GP100 whitepaper suggested some sort of async compute.

        • tipoo
        • 3 years ago

        They did in Pascal, yeah. Which makes their claim of it in Maxwell even more suspect, haha. But still curious to see how the compute queues compare to GCN and Polaris. And how much better than Maxwell it is.

          • Ninjitsu
          • 3 years ago

          There are more benchmarks here, but everyone’s favourite AoS…

          [url<]https://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1080-review,15.html[/url<]

        • jts888
        • 3 years ago

        For GP104, they’re talking about “async time warp” for VR, which is not the same as actual async compute as done in GCN.

        Maxwell could switch between graphics and compute processing through context switches at predefined or at least very coarse grained points in the draw request queues, and Pascal is being billed as being able to do context switches at any arbitrary instruction/rasterized pixel block.

        However, this is confusing and disingenuous marketing, since the context switches still take close to 0.1 ms and nobody should really care whether an interrupt can occur on a cycle boundary if the switch still needs many tens of microseconds to complete.

        GCN is able to smoothly interleave ALU instructions from both graphics and compute instruction queues on a cycle-by-cycle basis per compute block. Though, like other simultaneous multithreading implementations such as Hyper-Threading, this is no doubt tricky to get right without causing undue register/cache pressure, this is what people mostly consider the “proper” implementation of async compute shaders.
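
        Some quick arithmetic shows why that ~0.1 ms figure matters: even a handful of graphics-to-compute switches per frame eats a visible slice of the frame budget at high refresh rates. The per-frame switch counts below are hypothetical:

        [code<]
        // Quick arithmetic behind the ~0.1 ms context switch complaint: how much of the frame
        // budget disappears if the GPU flips between graphics and compute contexts a few times
        // per frame. The per-frame switch counts are hypothetical.
        #include <cstdio>

        int main() {
            const double switch_ms = 0.1;              // claimed cost per graphics/compute switch
            const int fps_list[]   = { 60, 90, 144 };
            const int switches[]   = { 2, 6, 10 };     // hypothetical switches per frame

            for (int fps : fps_list) {
                double budget = 1000.0 / fps;
                for (int n : switches) {
                    double overhead = n * switch_ms;
                    printf("%3d fps (%5.2f ms budget): %2d switches -> %.1f ms (%4.1f%% of the frame)\n",
                           fps, budget, n, overhead, 100.0 * overhead / budget);
                }
            }
            return 0;
        }
        [/code<]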

          • tipoo
          • 3 years ago

          That’s right. Currently the 1080 is able to brute force past the Fury due to how much more execution hardware 16nm lets it pack in and the higher clocks, but the real question will be Pascal vs Polaris, where AMD can use the same benefit, multiplied by its superior async compute implementation.

      • beck2448
      • 3 years ago

      [url<]http://www.forbes.com/sites/jasonevangelho/2016/05/17/nvidia-geforce-gtx-1080-review-hail-to-the-king/#3dc5cee95119[/url<]

      According to Forbes, it crushed the competition in both DX12 and DX11.

      • USAFTW
      • 3 years ago

      I’d like to second that. There’s been a lot of brouhaha about Async for Pascal in outlets like WCCF or videocardz. Would be great if TR looked into it. Also, what I’m looking forward to is bringing on D. Kanter to discuss the architecture and the manufacturing process.

      • Prestige Worldwide
      • 3 years ago

      GTX 1080 is way ahead of Fury / Fury X in AOTS, which has been the go-to game for Async comparisons as of late:

      [url<]http://www.hardocp.com/images/articles/1463427458xepmrLV68z_6_2.gif[/url<] [url<]http://www.guru3d.com/index.php?ct=articles&action=file&id=21949[/url<]

        • tipoo
        • 3 years ago

        I see, thanks. I’d like to see the Beyond3D thread guys’ script run on Pascal though, where it starts with 1 and keeps increasing the compute sources, to see if there’s any “stepping” pattern with increases in compute time, or if it’s all flat like GCN.

        [url<]http://s2.postimg.org/e7mes6ut5/kepler_vs_maxwell.png[/url<] [url<]http://s3.postimg.org/rwqwzfx37/fury_x_vs_390x_vs_tahiti.png[/url<]

        • Tirk
        • 3 years ago

        Way ahead? I’d look at more than those 2 sites…. For one, Guru3D’s 1440p results do not seem to match any other site I’ve seen; in fact they are very far off from even the HardOCP review you linked, which makes them more suspect.

        Also, your HardOCP link still shows the 1080 losing performance in DX12 vs. DX11, albeit not by much. I wouldn’t call an 8 fps difference over the Fury X in DX12 “way ahead,” but it is definitely faster.

        Tipoo was specifically asking about async compute, and your HardOCP link seems to indicate Nvidia still has some work ahead in implementing it correctly.

        • Tirk
        • 3 years ago

        Take a look at:

        [url<]http://www.anandtech.com/show/10326/the-nvidia-geforce-gtx-1080-preview/2[/url<]

        They do a DX11/DX12 comparison of Hitman and show a drop in performance on the 1080, with a neck-and-neck 2 fps difference versus the Fury X in DX12.

        • beck2448
        • 3 years ago

        Exactly!

        • tipoo
        • 3 years ago

        Currently the 1080 is able to brute force past the Fury due to how much more execution hardware 16nm lets it pack in and the higher clocks, but the real question will be Pascal vs Polaris, where AMD can use the same benefit, multiplied by its superior async compute implementation.

        As discussed above, it appears Pascal is still using context switches which take 0.1ms each, vs GCN which can seamlessly interleave per clock.

          • chuckula
          • 3 years ago

          Eh.

          HardOCP summed it up nicely here:
          [quote<]AMD Radeon R9 Fury X improves upon performance moving to DX12. Under DX12 it is now 7% faster than it was under DX11 which is a nice little improvement. However, that improvement is not enough to offset the shear performance the GeForce GTX 1080 Founders Edition is capable of. The GTX 1080 FE is 18% faster than the AMD Radeon R9 Fury X. The GTX 1080 FE is 32% faster than the GeForce GTX 980 Ti.[/quote<] Link: [url<]http://hardocp.com/article/2016/05/17/nvidia_geforce_gtx_1080_founders_edition_review/6#.Vzx21bqlxhE[/url<] "Brute force" or not, the flagship R9-Fury X gets a whopping 7% boost going from DX11 to DX12. That's it. Assume that ALL of that 7% is literally just "Async Compute"... it's still only 7%. And ask yourself, how much of that 7% is AMD just not doing a good job with DX11 as opposed to the magical Async Compute being that great. Meanwhile, the GTX-1080 is basically statistically tied between both DX11 and DX12, so the purported "DX12 penalty" doesn't seem to exist for Pascal.

            • tipoo
            • 3 years ago

            Don’t get me wrong; it’s no magic bullet. I’m just keen to see the more efficient async implementation paired with all the extra execution hardware 14nm will allow and the new uArchs. That’ll be the really interesting comparison.

      • Rza79
      • 3 years ago

      [url<]http://www.computerbase.de/2016-05/geforce-gtx-1080-test/7/#diagramm-ashes-of-the-singularity-2560-1440[/url<]

      In Ashes of the Singularity, it’s 8% ahead at 1440p and 18% ahead at 4K. Much less than the usual 30% in the other games.

        • tipoo
        • 3 years ago

        I think enough people have pointed me to the AoS benchmarks of it by now, but thanks 🙂

        What I’m interested in now, though, is something like my second link: start with 1 compute source, keep adding them, and see if there’s a stair-stepping pattern like Maxwell (so new compute commands take more time), or a flat line like GCN (done asynchronously, no increase in time as it’s just using idle hardware).

    • NTMBK
    • 3 years ago

    NVidia’s PR department have some explaining to do. This had better not be “revenge” for Scott going to AMD.

    Looking forward to your review, your frametime analysis is still the gold standard.

      • Jeff Kampman
      • 3 years ago

      Nvidia helped us get this card, so.

        • NTMBK
        • 3 years ago

        Well, I’m a little less annoyed. But… why was it so late?

          • Helmore
          • 3 years ago

          AnandTech also hasn’t had the time to produce a proper review, so The Tech Report isn’t the only one that got its card only recently.

            • Firestarter
            • 3 years ago

            I think most sites that put up a review today did so prematurely, unless their quality standards for a review are quite low

            • DPete27
            • 3 years ago

            [url=http://www.pcper.com/reviews/Graphics-Cards/GeForce-GTX-1080-8GB-Founders-Edition-Review-GP104-Brings-Pascal-Gamers<]PCPerspective's review[/url<] was quite thorough.

            • Leader952
            • 3 years ago

            Tom’s Hardware’s review is top notch.

            [url<]http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572.html[/url<]

            • _ppi
            • 3 years ago

            Wow, they upped their game.

            • DancinJack
            • 3 years ago

            Yeah, I don’t think that’s fair.

          • Leader952
          • 3 years ago

          TR has been late with reviews before.

          And that is the truth (negative Nancys); go look at previous reviews.

            • rxc6
            • 3 years ago

            You just failed completely at understanding the situation. Nvidia surely screwed up when YouTube reviewers got cards but TR had to wait way longer to get one. Add that to the fact that TR provides frame time analysis that so many other sites don’t even look at, and you can see why many people here are annoyed at Nvidia.

            • DoomGuy64
            • 3 years ago

            Not really, this is Nvidia tactics at their finest.

            Step1: Claim 2x perf of 980Ti (hype-hype-hype) (only in VR)
            Step2: Take youtube reviewers out to a luxurious ranch vacation in Texas. (wow the bribery)
            Step3: Hype train has left the station.
            Step4: Profit from huge markup on “Founders Editions”

            Amazing that nobody calls them on it. Probably has a lot to do with that free vacation.

            • Meadows
            • 3 years ago

            They have, but this time they hadn’t even received a card by the time most everyone else had one.

          • Leader952
          • 3 years ago

          So Nvidia gives TR a free card and you still are annoyed.

          Seems like being annoyed is in your DNA.

        • Neutronbeam
        • 3 years ago

        Well, Nvidia didn’t give ME a card, so consider me to be in high dudgeon–and I don’t even know what that means. So take THAT Nvidia!

        Also, I miss Scott.

          • nanoflower
          • 3 years ago

          You don’t do YouTube reviews. I’ve seen all sorts of people putting out reviews on YT. You just needed a few thousand subscribers, and you too could have had a GTX 1080 and put out your own review. At least we can see that either The Tech Report has fewer than 10,000 readers or Nvidia values YTers more than text review sites. 🙁

          We all miss Scott. Though… Now I’m wondering just what video card he is using since he shipped off his Fury (X) this week. Could he be running on Polaris 10? Too bad he can’t do a review for us for old times’ sake.

          • MOSFET
          • 3 years ago

          TR is my site and Jeff is my captain.

        • Klimax
        • 3 years ago

        Do you know why you were in the “second wave”, and can you tell us? From what I have read so far (not reviews yet, just a post prior to the embargo lift by Sky at Hardware Canucks and your hints in this article), there is no bloody reason to exclude you from the initial batch.

      • cjcerny
      • 3 years ago

      If all the facts and figures I’ve seen so far turn out to be true, Scott should be worried about AMD’s solvency–or lack thereof.

      • Mr Bill
      • 3 years ago

      Solution: TR should have a YouTube review channel. We will be at the top of the charts in no time and get all sorts of cool loot to, err, review.

        • nanoflower
        • 3 years ago

        Sadly that looks to be true. Can’t believe a guy with only just over 10,000 subscribers got invited to the PR event and got a 1080 while TR didn’t get one until some time later.

    • chuckula
    • 3 years ago

    [quote<]With that said, we do have a GTX 1080 in our labs now,[/quote<] Still pissed that Nvidia didn't get you the card along with everyone else, but happy to see that there will be a review prior to actual launch. Take your time, Jeff; nobody is buying one for a while anyway.

      • maxxcool
      • 3 years ago

      WUT? I BOUGHT 42 OF THEM!

      • DPete27
      • 3 years ago

      Being “late” to post a hot product review means you need to bring something new and innovative to the table. Something that none of the other reviewers out there have thought of. That’s something that has made TR so great. Take your time Jeff. Most of us will have already read the “plain Jane” stuff elsewhere by the time the TR review goes up. Time to prove TR should still be considered a top tier review site. (no pressure)

        • blahsaysblah
        • 3 years ago

        Can you please keep track of watts used by the different cards that run through your hands?

        In case any brand uses better voltage regulators to keep idle/peak wattage lower and produce less heat.

        Wish there were an 80 Plus-type certification for graphics cards.

        Really, I’m only concerned about stinkers, like those 750 Ti cards that had power connectors, or 950 cards with 8-pin connectors…

          • DPete27
          • 3 years ago

          I’m pretty sure power consumption is on the list of “required” tests for reviewers. TR and most/all other review sites have done power consumption comparisons when they get [url=https://techreport.com/review/28685/geforce-gtx-980-ti-cards-compared/5<]cards from multiple manufacturers.[/url<]

          Linking power consumption and heat to voltage regulators would be difficult/impossible, since those metrics are more prominently controlled by what clock speeds each manufacturer chooses, whether they spin down their fans at idle (and at what point along the way the fans actually start spinning), cooler contact with various components (processor only, or processor, RAM, VRMs, etc.), and the size/type/design/number of fans on the cooler.

      • Leader952
      • 3 years ago

      [quote<]Still pissed[/quote<] Take a chill pill you will feel better.

      • Deanjo
      • 3 years ago

      Have a poop, it will make you feel better.

        • anotherengineer
        • 3 years ago

        It does………….it always does………..
