Nvidia unveils the GeForce GTX 1080 Ti at GDC for $699

Well, the cat's officially out of the bag. Surprising nobody who was paying attention to the "TIme" teaser at GeForce.com, Nvidia has launched the GeForce GTX 1080 Ti this evening at GDC.

Folks like us who like to keep an eye on the comings and goings of graphics cards were expecting a card faster than the GTX 1080 but a bit slower than the eye-wateringly-expensive Pascal Titan X. In a change of the usual order of things, the GTX 1080 Ti is actually a bit quicker in some ways than the Titan X, thanks to its higher 1600 MHz boost clock.

Summing it up in a few words: there's a lot of every flavor of the good stuff, and it all works out to an insanely high-performance card. Oddly enough, the GTX 1080 Ti has 11GB of 11 GT/s GDDR5X RAM on a 352-bit bus, two figures likely to send OCD buyers into a tizzy. Here are the full specs—and try to keep your credit cards in your wallets, yes?

                                    GTX 1080 Ti    Titan X (Pascal)   GTX 1080
  GPU                               GP102          GP102              GP104
  Boost clock (MHz)                 1600           1531               1733
  ROP pixels/clock                  88             96                 64
  Texels filtered/clock (int/fp16)  224/224 (??)   224/224            160/160
  Shader processors                 3584           3584               2560
  Rasterized triangles/clock        6?             6                  4
  Memory interface width (bits)     352            384                256
  Est. transistor count (billions)  12             12                 7.2
  RAM (GB)                          11             12                 8
  RAM bandwidth (GB/s)              492            480                320
  TFLOPS (FP32)                     11.5           11.0               8.9
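The FP32 row is easy to sanity-check: it's just shader count times two FLOPs per clock (one fused multiply-add per shader) times boost clock. A quick Python sketch using the table's own figures:

```python
# FP32 throughput = shader processors x 2 FLOPs/clock (one FMA) x boost clock
cards = {
    "GTX 1080 Ti":      (3584, 1.600),  # (shader processors, boost clock in GHz)
    "Titan X (Pascal)": (3584, 1.531),
    "GTX 1080":         (2560, 1.733),
}
for name, (shaders, ghz) in cards.items():
    print(f"{name}: {shaders * 2 * ghz / 1000:.1f} TFLOPS")
# GTX 1080 Ti: 11.5, Titan X (Pascal): 11.0, GTX 1080: 8.9
```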

The GTX 1080 Ti uses a slightly refreshed version of the vapor-chamber blower cooler that's been appearing on Nvidia reference cards for some time now. The card requires eight- and six-pin PCIe power connectors to support its 250W TDP, and both Founders Edition and custom cards will apparently be available soon. The Founders Edition card will launch next week for $699.

Along with the launch of its highest-performance consumer graphics card yet, Nvidia is dropping the price of the GTX 1080, its former consumer performance champion, to $499.

Developing…

Comments closed
    • DeadOfKnight
    • 3 years ago

    I like how the event started off with a Titan XP giveaway. “Here is the fastest GPU money can buy…until now.” Anyway, it seems to me the best thing about the new card is the reference design and price. Otherwise the cards look about the same in terms of performance. I guess they had to use up the weaker GP102s with faulty memory controllers somehow.

    • mikepers
    • 3 years ago

    Don’t know if anyone mentioned this yet but the $499 price for a 1080 is starting to show up now:

    https://www.newegg.com/Product/Product.aspx?Item=N82E16814137084

    • MustSeeMelons
    • 3 years ago

    Exactly twice the price I would be willing to pay for a GPU. When will there be something worth the upgrade from a 970 in that price bracket?

      • NarwhaleAu
      • 3 years ago

      You are too close generationally to see a massive increase in performance vs. the 970. You could step up to the 1070, which is faster (~20%) and will likely come down in price about $50, but that isn’t some sort of giant leap forward. You need to give it a generation and target the 1170. If that comes in faster than the 1080, then you will have closer to a 50% increase in performance for a little more than the price of the 970.

      TLDR: 970 is a great card.

        • Airmantharp
        • 3 years ago

        I have two 970s, and it looks like this new 1080Ti (or the Titan X it was birthed from) would be necessary to see a real boost in performance from a single card. Also, this card isn’t much less than the total I paid for said 970s…

    • Laykun
    • 3 years ago

    OK, we get it 1080 Ti, you’ve got a vapor-chamber.

    • Kougar
    • 3 years ago

    I’m awful at numbers, but shouldn’t the RAM bandwidth be 484GB/s?
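For what it's worth, the usual back-of-the-envelope arithmetic agrees with the commenter: peak bandwidth is transfer rate per pin times bus width, divided by eight bits per byte.

```python
# Peak memory bandwidth = transfer rate (GT/s) x bus width (bits) / 8 bits per byte
print(11 * 352 / 8)  # 484.0 GB/s
```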

    • Bauxite
    • 3 years ago

    Literally every rumor was off target. Some were pretty good educated guesses but still technically wrong. Nice info control, Nvidia.

      • Voldenuit
      • 3 years ago

      How off were they really though? Pretty much everyone assumed it would be a GP102 part with disabled CUs, just that no one thought they would disable 32 units and end up with 11 GB of VRAM.

    • terranup16
    • 3 years ago

    As for the 1080Ti, I’m not sure how to take this for Vega. NV clearly bumped their existing lineup in anticipation of some form of credible competition, even if just on price. But they also very obviously left themselves room to outdo the 1080Ti with Pascal (a 12GB fat GP102 with the faster GDDR5X).

    Generally hard to read into pricing as well. Cuts could be for Vega, but with that being relatively far out, it’s equally likely nV is simply making its pricing more attractive for anyone who hasn’t bought in yet. From a 980Ti or 780Ti setup, the 1080Ti hits an attractive price now that may drive upgrades. Likewise, 980 -> 1080, 970 -> 1070, and 960 -> 1060 seem more attractive too. And if you’re using a Fury card but now have a VR rig, a $500 1080 might be a reasonable consideration.

    The majority of nV’s card pricing goes toward recouping R&D expenditures and padding its fat wallet, so with the 10XX series having been in the market for 6+ months now and doing so well, R&D has probably been paid for and then some. At this point it’s probably more advantageous for nV to drive higher sales volume than to maintain the higher per-card profit margin.

    • beck2448
    • 3 years ago

    Looking forward to custom-cooled, overclocked cards.
    Maxing them out should be interesting.

    • SomeOtherGeek
    • 3 years ago

    So, basically everyone who bought a 1080 got ripped off? You get this for 100 bucks more. Oh well, they still have a nice card.

    EDIT: I understand where everyone is coming from. I was more pointing my finger at NVidia. They know how to milk their customers. I know the 1080 is an awesome card. I just feel like NVidia could have lowered the price on the first day. But that is business, you want something, you pay for it.

      • deruberhanyok
      • 3 years ago

      “ripped off” is a giant misunderstanding of how this all works. New products come out that are better than old ones. Old products may continue to be available at decreased price. It has been this way for literally thousands of years, since the very first GeForce SDR brought fire to the world after its cooling fan failed.

      I would think everyone who bought a GTX 1080 at launch was very happy with it. I imagine anyone who bought a GTX 1080 on, say, Monday, wasn’t very happy. But then, a lot of stores have return policies, or will do a price guarantee within a certain timeframe, so it may all be moot anyways.

      • Devils41
      • 3 years ago

      I do not feel ripped off. I bought my 1080 at launch and got it in the middle of June last year. That’s roughly 259 days with it. I paid $675 for it, and let’s say it goes for $499 now (AIB over Founders): that’s a depreciation of $0.68/day. It’s technology; nothing keeps its value. But when I decided to get my 1080 it was the best card offered at the time for a high-refresh-rate 1440p monitor, and it still is.

      Although, I agree that if I had purchased the card within the past 2 months I would feel a little buyer’s remorse.
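The per-day figure above checks out, for anyone following along:

```python
# Launch price, current street price, and days of ownership from the comment.
paid, street, days = 675, 499, 259
print(round((paid - street) / days, 2))  # 0.68 dollars per day
```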

        • NarwhaleAu
        • 3 years ago

        That’s the best way to do it – buy it at launch and enjoy it for a few years.

        I bought the 960 to tide me over until something like this came out – looks like a really good price vs. performance spot that should last for years to come.

      • chuckula
      • 3 years ago

      Don’t seem to recall this sentiment when AMD went fire sale with the Furry X that used to be $650 once upon a time.

        • K-L-Waster
        • 3 years ago

        Funny how when AMD drops prices they are “aggressively going after the competition” whereas when the competition does it they are either “desperate” or “up to their usual dirty tricks.”

    • DPete27
    • 3 years ago

    So they’re listing a 1.6GHz “Boost” clock and a 2GHz+ “OC” clock. What’s that supposed to mean?

      • nico1982
      • 3 years ago

      I would guess that Boost is guaranteed for all chips under reference designs, while OC frequency depends on the silicon lottery and OEM board/cooling designs.
      Basically, as if Intel touted a 4.2 GHz Boost and a 5 GHz OC frequency for an i7-6700K.

      • Jeff Kampman
      • 3 years ago

      It means that GPU Boost 3.0 is going to do whatever it likes within the constraints of a given case, cooler, and ambient temperature.

      • Freon
      • 3 years ago

      There are base and boost clocks, but Pascal cards can still clock themselves dynamically higher.

      E.g., my 1070 regularly maintains 2+ GHz in games despite a quoted boost clock of more like 1800 MHz. Even in Furmark it will start off around 2050 and slowly move down to around 1950 after several minutes.

      Ironically, the overclocking setting on the GPU core doesn’t seem to do much. Power target seems to matter, however. A higher power target allows higher clocks to be maintained, assuming cooling also keeps up. Actual clocks seem to just be dynamically set regardless of the clock setting. *shrugs*

      They demoed the same behavior for the initial 1080 launch.

    • chuckula
    • 3 years ago

    What we “know” about Vega in descending order of certainty:

    1. AMD has shown off a real physical package with a Vega die and 2 stacks of HBM 2.0 memory. The die size is large (on the order of 500+ mm^2). There is an assumption (likely a good one) that at least one version of Vega will physically look like this product that AMD has shown off. If there are multiple Vega models (see below) you would assume that the part AMD has shown off is the “big” version.

    2. That there are two different Vega parts (Vega 10 & 11), based on rumors. However, all the innuendo from Raja yesterday at the new Capsaicin event indicates that there is only one Vega, due to the supposedly intentional lack of model numbers. So… is there only one Vega? Who other than AMD knows?

    3. So if there are two Vega models, then it makes sense that the smaller one would target the same bracket as the GTX-1080 and the larger one would target the same bracket as the GTX-1080Ti. However… if there’s only one Vega, where does it fit exactly?

      • Anovoca
      • 3 years ago

      You’re giving me such a clue right now.

        • chuckula
        • 3 years ago

        Here’s another one: Colonel Mustard.

      • ImSpartacus
      • 3 years ago

      The popular rumor is that Vega 10 is simply a “big” GPU because AMD needs one, while Vega 11 is a strategically “smaller” GPU to be used as a test bed for their multi-chip-module efforts.

      The general consensus is that 2x Vega 11 should beat one Vega 10 (otherwise, why bother?) and they could technically offer a gigantic 2x Vega 10 too by some time in 2018.

      I don’t think Vega 11 is supposed to come out for a couple of months, but it would be roughly in the ballpark of Polaris 10’s TDP. This sounds ridiculous, but remember that Polaris 10 shits the bed when it comes to laptops. If Vega 11 had a single stack of energy-efficient HBM2 and a more advanced Vega-based architecture, then it could probably be more competitive in mobile, where Polaris 10 has severely lacked (especially in the face of the skull-crushing GP104).

      So imagine a lineup in the high-ish end that starts with a cut-down 8GB Vega 10 Pro that beats a 1080, then a fully enabled 8GB Vega 10 XT that competes with the 1080 Ti, then an 8GB Vega 11 X2 (on the same package, not CrossFire) that rules the world, and then a 16GB Vega 10 X2 that’s absurdly expensive for prosumers.

      I think that’s the pinnacle of what amd would like to achieve with Vega on the desktop gaming front.

      Then Navi goes crazy and starts stacking memory on top of the GPU so you can cram like four small-ish GPUs on the same module. AMD designs one GPU and just doubles/triples/quadruples it as needed.

        • Beahmont
        • 3 years ago

        Okay, so that’s AMD’s GPU-related pipe dream. (Big note: “AMD’s GPU-related pipe dream,” not ImSpartacus’s. Not going after him at all.)

        Now let’s have the much more realistic scenario, where Nvidia isn’t staggeringly incompetent and AMD isn’t ‘God’s Gift to Gamers’ with superior brains and design skills compared to mere mortal men. Because that all sounds nice, but like you said, ImSpartacus, it is the “popular rumor,” which means it comes almost completely from AMD fanbois who have been predicting AMD’s domination of all markets any day now for the past two decades.

          • ImSpartacus
          • 3 years ago

          Yeah, it is kinda sad that Nvidia just continues to destroy AMD by making brutally efficient architectures while AMD just lies to their fans to drive hype to unreasonable levels (remember how Polaris 11 was supposed to be twice as efficient as a 950? https://youtu.be/oNA6fll2DDQ ...except it turned out to be neck and neck with the last-gen 28nm 950: https://techreport.com/review/30488/amd-radeon-rx-460-graphics-card-reviewed/12).

          We’re gearing up for Vega and it’s actually not looking good. Nvidia just announced its “contender” based on a cut-down 471mm² chip that’s been in production for nearly a year and uses cheap-ish GDDR5X. Meanwhile, AMD’s “contender” will probably be a fully enabled ~500+mm² chip that has no production history and demands a special interposer so it can sport expensive HBM2.

          Nvidia isn’t stupid. I kinda trust them to configure the 1080 Ti to be performance-competitive (corporate espionage is rampant). They did that with the 980 Ti weeks before the Fury X release, so I wouldn’t be surprised if they pull it off again this year.

          So yeah, I have a good handle on what AMD is trying to do and it’s really cool, but I can’t help but be suspicious.

    • AnotherReader
    • 3 years ago

    9 Gbps GDDR5 is also available for the GTX 1060 (http://www.anandtech.com/show/11173/nvidia-partners-to-begin-selling-geforce-gtx-1080s-gtx-1060s-with-faster-memory). Hopefully, there will be RX 480s with the faster memory.

      • ImSpartacus
      • 3 years ago

      Yeah, this is easily the most interesting piece of news in my opinion.

      Both the 1060 and 480 could greatly benefit from this memory. I wouldn’t be surprised if we get a “580” that features this.

      But I haven’t heard a peep about 9 Gbps GDDR5.

      Comparatively, the 11-12 Gbps GDDR5X showed up a month ago in Micron’s official catalog, so it wasn’t a surprise at all.

    • Chrispy_
    • 3 years ago

    “Time for something faster than TITAN X”

    Holds up a product that’s just 91% of a fully-enabled Titan X yet clocked only 4% higher to compensate.

    By my rule-of-thumb reckoning, that’s going to be about 5% slower than a Titan X.
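The rule-of-thumb arithmetic, spelled out. It assumes performance scales purely with ROP throughput (an assumption the replies below push back on), using the exact figures from the spec table rather than the rounded ones:

```python
# Naive scaling estimate: relative performance ~ ROP count x boost clock.
rop_ratio   = 88 / 96      # 1080 Ti's 88 ROPs vs. the Titan X's 96
clock_ratio = 1600 / 1531  # boost clocks from the spec table above
print(f"{rop_ratio * clock_ratio:.3f}")  # ~0.958, i.e. roughly 4-5% slower
```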

      • Krogoth
      • 3 years ago

      It depends on whether the application and settings are shader-limited, ROP-limited, or bandwidth-limited.

      • Srsly_Bro
      • 3 years ago

      Your numbers don’t check out.

        • Chrispy_
        • 3 years ago

        Aren’t most games ROP-limited at the resolutions/framerates that people would buy a 1080Ti for?
        91% of the ROPs and 104% of the clockspeed are the headline figures here.

        It’s speculation, which is why I said “rule of thumb”.

      • derFunkenstein
      • 3 years ago

      It’s not that simple, of course. All the CUs are still there, and the memory clock is more than 4% higher. The biggest difference is ROP count, and the 1080Ti is probably a little slower than Titan XP in that regard. But is Titan XP really fillrate-limited? (like for real, I have no idea, but I’m guessing Nvidia doesn’t think so).

        • psuedonymous
        • 3 years ago

        “But is Titan XP really fillrate-limited? (like for real, I have no idea, but I’m guessing Nvidia doesn’t think so).”

        I can’t think of any recent GPU that is fillrate limited. You could slap on a trio of UHD monitors and you’d run out of geometry or shader grunt before you’d be limited by raw pixel-pushing.

          • derFunkenstein
          • 3 years ago

          that’s what I had figured, but wasn’t 100% sure on.

            • Chrispy_
            • 3 years ago

            Well, ROP rate, texel rate and memory bandwidth all contribute to the fillrate limitations that affect even the Titan XP at higher resolutions. 4K framerates require 4x the ROP, fillrate, and uncompressed bandwidth of 1080p. That’s just physics!

            If it wasn’t “fillrate limited” then it would get the same FPS at 1080p as it does at 4K, because the limitation would be elsewhere – such as geometry or shader throughput, or perhaps even the CPU feeding it in the first place.

            With unified shaders, the divide between “fillrate” and “geometry” is far more blurred. Shaders are used for everything, and the balance between geometry, fill, and postprocessing is completely random depending on the game/application these days.

            My gut feeling with such a massive and fast arsenal of shaders is that the reduced ROPs are going to matter. If not in all situations, at least enough that the 4% clock boost isn’t going to be enough.

            • derFunkenstein
            • 3 years ago

            Gut feelings aren’t data. We’ll just have to see at review time, but my guess is that Nvidia wouldn’t say it was faster than the card that costs 2x as much if it wasn’t. They’d be throwing away sales otherwise.

            • Chrispy_
            • 3 years ago

            My guess is that the drivers are doing the magic somewhere, rather than the hardware. Based on the hardware stats alone it’s going to be within single-digit percentages anyway. -9% change here, +4 there, 0 change elsewhere, it’s basically a handwaving way of saying “no real change”.

            Titan cards have always been the best “cheap” way to get very high-end DP compute performance since Fermi. Kepler and successive iterations of the architecture in the GX104 and higher models have been culling DP compute functionality in favour of FP16 and FP32 performance for gaming.

      • Bauxite
      • 3 years ago

      I suspect the XP will remain on top for 4k but the Ti will claim the crown for 1440/1080p.

      This is at out-of-the-box clocks; otherwise it will be more about the luck of the draw and which randomly selected card overclocks better. Both of them will need the power-limit workaround if they are good bins, though.

      • DeadOfKnight
      • 3 years ago

      Titan X is not fully enabled.

    • Solean
    • 3 years ago

    I’m conflicted.

    I plan to buy Ryzen 1800X.
    But also this card…it really really tickles my impulsive buyer complex.

    Don’t know what to do.

      • Kretschmer
      • 3 years ago

      Kill your inner fanboy. No conflicts; more objectivity.

      • Neutronbeam
      • 3 years ago

      Buy ME the card instead. As my twin Inspector Clouseau would say, “Prob lem sol ved.” 🙂

    • ptsant
    • 3 years ago

    Looking at the respective AMD and nVidia presentations, I am getting the impression that AMD is trying to counter nVidia’s launch and not the opposite, as we assumed.

    Don’t get me wrong, Vega sure looks like a very decent card, but at best it will slot between the 1080 and the Ti, when it becomes available. Then again, there is no bad card, only a bad price. Vega at $400 with 1080 performance would certainly make me happy…

      • AnotherReader
      • 3 years ago

      Vega is too large to be sold for $400. They would probably stick to the same price as the Fury X.

        • Krogoth
        • 3 years ago

        Not really; the actual cost of silicon production in volume is relatively trivial. Most of the cost comes from R&D.

          • AnotherReader
          • 3 years ago

          They could, but for the margins needed to recoup that R&D, turn a profit, and invest in Navi, they couldn’t sell it for $400.

            • mesyn191
            • 3 years ago

            Supposedly, interposer costs for the Fury X have gotten down to $10, and the cost of the large Fury X die is also supposedly something like $30-40.

            Now, the gotcha there is that the Fury X’s die is still on 28nm, which is mature as all get out, while Vega would be 14nm, which is still pretty new. But it’s quite possible they can still turn a decent profit at $400. The interposer for Vega is supposed to be smaller, and they’re using fewer stacks (2 instead of 4) of HBM, so that will help on costs too.

            Quite frankly, though, if they get in between 1080Ti and 1080 performance, I don’t see why they couldn’t sell for $500 or so.

        • Airmantharp
        • 3 years ago

        To add: Nvidia once sold Gx100 GPUs (a tier currently only available as Quadro and Tesla parts) for under US$300. Even after inflation, that’s well under your US$400 mark. These GPUs are also about the same size as the GP102 that ships in the current Titan X and 1080 Ti.

          • AnotherReader
          • 3 years ago

          Yes, we have seen an increase in the price of big GPUs that is greater than the inflation rate. The GTX 285, almost the same size as GP102 and fully enabled to boot, debuted at $400, but was available on launch day at $380 on Newegg (http://www.anandtech.com/show/2711).

      • Chrispy_
      • 3 years ago

      We also don’t know what the architectural differences are between the RX480’s GCN and Vega’s NCU shader arrays.

      If they’re 1:1, then a 1500MHz, 4096 shader Vega part is going to be 110% faster than a 1266MHz 2304 shader RX480, and that puts it pretty squarely in GTX 1080 territory.

      One would hope that NCU is better than 1:1 compared to GCN, though…
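A sketch of that 1:1 scaling assumption. The 4096-shader, 1500 MHz Vega configuration is the comment's guess, not a confirmed spec:

```python
# Assume performance scales linearly with shader count x clock (1:1 IPC vs. GCN).
vega_guess = 4096 * 1500  # rumored shader count at an assumed 1500 MHz
rx_480     = 2304 * 1266  # reference RX 480
print(f"{vega_guess / rx_480:.2f}x")  # ~2.11x, i.e. ~110% faster
```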

        • AnotherReader
        • 3 years ago

        The reference 480 didn’t average 1266 in quite a few games. 2.1 times a 480 would be closer to the 1080 Ti than a 1080 (http://tpucdn.com/reviews/NVIDIA/Titan_X_Pascal/images/perfrel_2560_1440.png). However, we have no idea what the final clock speed of consumer Vega will be. Usually, the consumer variants are clocked higher than the FirePros, and the 12.5 FP32 TFLOPS figure applies to the MI25.

          • Chrispy_
          • 3 years ago

          That’s a separate issue really – 7/7 of the reference RX480s I have played with have been needlessly overvolted. All of them run at 1266+ and never drop clocks even if you reduce the voltage by as much as 100 mV. I’m actually running my personal ref. RX480 at -175 mV and 1275 MHz. It’s insane how power-efficient and quiet it is at those speeds.

          Yeah, 1080 or 1080Ti – it’s hard to call with no benchmarks of the 1080Ti yet.
          2.1x RX480 performance seems to split the 1080 and Titan X(P) pretty evenly at higher resolutions. At lower resolutions such as those the Vive and Rift use, the lesser cards perform relatively better.

            • AnotherReader
            • 3 years ago

            I agree with you about AMD’s long history of overvolting cards. I recall that Computerbase showed that just reducing the voltage by 75 mV increased average performance by 6% as the clock was a stable 1266 MHz.

            Nevertheless, that graph that I linked to is using a reference 480. In the time since then, the 1060 and 480 have become equivalent so you could multiply the 1060’s numbers by 2.1. That would get you even closer to the 1080 Ti.

            Edit: 175 mV is an insane undervolt. AMD should do a better job of binning its GPUs.

            • Chrispy_
            • 3 years ago

            0.925V, and the only reason I can’t go lower is because the GDDR5 needs 0.925V to run at 8GHz and the core is tied to that in higher states.

            • AnotherReader
            • 3 years ago

            Out of curiosity, what clock speed can this golden sample reach when it is overvolted to 1.15 V?

            • DPete27
            • 3 years ago

            I started a forum thread about undervolting GPUs: https://techreport.com/forums/viewtopic.php?f=4&t=119219

            I’ve effectively undervolted my RX480 by about 150mV or more across all the useful load states (5-7). At its stock 1305MHz, it wanted 1.15V on Auto, but I’m running my card at 1305MHz and 1.030V. My intent wasn’t to see how high I could push clocks, since it hardly breaks a sweat on my 1080p/60Hz monitor.

            Edit: It’s actually disappointing that we can squeeze that much out manually, considering all the hubbub about AVFS in Polaris (https://techreport.com/review/30328/amd-radeon-rx-480-graphics-card-reviewed/2). But I do like that AMD seems to be looking more into downclocking to hit performance targets in lighter games and saving power/heat, as opposed to Nvidia chips just OC-ing the piss out of themselves at all times (correct me if I’m wrong).

            • AnotherReader
            • 3 years ago

            Reading that thread makes me want to buy a RX 480 just to undervolt it. It would be a sidegrade from my 290X though.

            • DPete27
            • 3 years ago

            I think Chrispy undervolted his 290X also…

            • Chrispy_
            • 3 years ago

            Yea, verily (https://techreport.com/forums/viewtopic.php?f=3&t=109936)

            • Chrispy_
            • 3 years ago

            I haven’t tried overclocking properly but I think it was this card that locked up at 1325.

            I’m no expert but I think that means this chip is non-leaky and leaky chips clock higher (with adequate cooling).

            It’s worth pointing out that this is no “golden sample”. I’ve had a trio of RX480s at home: an Asus and a Gigabyte reference (I think it’s the “Gigabyte” I’m running at 925 mV, but they’re all produced from the same batch and just have different labels depending on the vendor) as well as an XFX BE at 1328 MHz. All of them run 1250+ at under 1V.

            The few I’ve played with at work seem stable at 1V too – there are four in the OpenCL crunch stack, but other than trying to reduce the load on the AC unit by downvolting them slightly, I’ve not pushed those to see what their limits are.

            • terranup16
            • 3 years ago

            “That’s a separate issue really – 7/7 of the reference RX480’s I have played with have been needlessly overvolted. All of them run at 1266+ and never drop clocks even if you reduce the voltage by as much as 100mV. I’m actually running my personal ref. RX480 at -175mV and 1275 MHz. It’s insane how power-efficient and quiet it is at those speeds.”

            This succinctly explains why I haven’t yet brought myself to click my left mouse button while hovering over “Add Preorder to Cart” on Newegg for the R7 1700X until tomorrow… AMD of the past decade seems to find some way to mindlessly screw something up out of the box. On the plus side, if Ryzen takes after Radeons at all, then there’s probably another ~5% performance left on the table atm that drivers and firmware will unlock within 18 months, so if leaked benchmarks are to be believed, those +5% down the road will be quite nice.

      • Billstevens
      • 3 years ago

      I’d pay $500 if it consistently beat or matched a 1080.

        • Chrispy_
        • 3 years ago

        I’d pay $500 if it consistently beat or matched a 1070, because then I wouldn’t need a G-Sync monitor and its silly $200 stupid-tax.

        I’m also hoping for good things in the perf/watt department – it’s supposed to be significantly better than Polaris according to AMD’s internal slides:

        Vague and handwaving AMD roadmap with no zero or scales on the perf/watt axis! (http://cdn.wccftech.com/wp-content/uploads/2016/07/AMD-GPU-Architecture-Roadmap-Vega-10.jpg)

    • ronch
    • 3 years ago

    (Looks at Ryzen pricing..)

    (@_@;)

      • Klimax
      • 3 years ago

      And? GPUs and CPUs have different market and price dynamics.

    • Demetri
    • 3 years ago

    Will Nvidia be sending thank you cards to everyone who bought this for $1,200?

      • DancinJack
      • 3 years ago

      No, because if those people had a brain in their heads they’d have waited for the 1080 ti, just like every generation past…

      • Klimax
      • 3 years ago

      Why? We knew this would get released someday. Yet we had the card for quite some time before this got released, and got a lot of use out of it…

      • ptsant
      • 3 years ago

      Almost all the rumors pointed to a $700-800 Titan rebranded as a 1080Ti later in the year. Everyone who cared knew this.

      It comes down to the value of having the best for 6 or more months. Think of it as “rent”. If you don’t mind paying $60/mo for the “best card” out there, then you got a good use out of it.

      Even though my hardware budget averages out to $100 per month, I wouldn’t allocate $60/mo just for the pleasure of having the best GPU. I’d rather invest that money differently: storage, monitor, nice keyboard, new router, etc.
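In round numbers, the "rent" framing works out like this: the Titan X (Pascal) arrived in August 2016 at $1,200, about seven months before this $699 card:

```python
# Early-access premium, amortized over the months of exclusivity.
titan_price, ti_price = 1200, 699
months = 7  # Titan X (Pascal): Aug 2016; GTX 1080 Ti: Mar 2017
print(round((titan_price - ti_price) / months))  # ~72 dollars per month
```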

        • ImSpartacus
        • 3 years ago

        $100/mo? Damn, son, what are you spending that on?

        • bfar
        • 3 years ago

        They do it every year. The Titan brand is a great way to milk the obsessive types who absolutely crave the latest and greatest. Nvidia know their market.

        • Bauxite
        • 3 years ago

        August 2016, March 2017, what year are we in again?

      • Billstevens
      • 3 years ago

      People who paid $1200 have had a godly video card for months. They got what they paid for: better-than-the-best performance. The Titan X is probably still better for non-gaming stuff. Slightly better.

      • chuckula
      • 3 years ago

      If they get thank-you notes for that purchase, does that mean AMD sends fruit baskets to each of the victims, er, customers who bought the Furry Pro Duo for $1500?

        • MOSFET
        • 3 years ago

        No, it was obvious from the beginning that it was intended for no one. Therefore it had no customers.

          • chuckula
          • 3 years ago

          MISSION ACCOMPLISHED!

      • Freon
      • 3 years ago

      Early-adopter tax on a very luxury product.

        • Voldenuit
        • 3 years ago

        Plus, if you were buying a Titan X in 2016 for professional purposes, it should have more than paid for itself by now.

        If you were buying it to game on, then you knew what level of performance you wanted, what you were willing to pay for it, and that there would always be something faster and/or cheaper in the future.

    • Forge
    • 3 years ago

    I came here to make a comment about those just-barely-off numbers making my latent OCD act up, and you beat me to it. Congrats on a well-turned phrase.

    Now I’m wondering if this is why eVGA hasn’t refit the 1080 Classified with their new cooler. Maybe they’ll be going straight to 1080 Ti for the top model instead.

    • Ifalna
    • 3 years ago

    Wouldn’t get too excited about the price tag.
    Anyone remember the supposed “$350” price tag of the 1070?

      • PixelArmy
      • 3 years ago

      No, considering it was $380.

        • ptsant
        • 3 years ago

        I rarely saw it below $450 where I live (EU). Only much later in the year could I hope to get the 1070 at $400. The RX480 I got was at $280, which is also a $40-50 premium over US prices.

          • EndlessWaves
          • 3 years ago

          Don’t forget that US sources quote prices without taxes, so much of that will be VAT.

            • Ifalna
            • 3 years ago

            Importing from a non-EU country, they add 20%. That would be 430€ (up from 359€).
            More realistic prices were around 530ish for the OC’d models.

            Got my Strix for around 485€, which was an awesome deal back in Nov ’16.

          • Freon
          • 3 years ago

          There was a shortage of supply, so prices took time to fall even for the more basic cards. But the 1080 was way worse; they were simply out of stock for months regardless of pricing.

          I think there was a lot of pent up demand for new high end cards and new 14nm parts after years of 28nm, plus a new die shrink itself may have kept production numbers lower than 28nm which had been refined for several years.

          There’s always a bit of a supply issue early on as everyone wants the new thing; it’s not exactly unheard of. That’s just how things go. For GPUs it’s a bit different than the release of something like a game console, because manufacture and final design are still up to the third-party manufacturers; NV isn’t truly in the production business, it’s still ultimately outsourced. There’s only one real Xbox when released. Sometimes those also have short supply…

          Such is life. This is still a luxury item in the grand scheme of things; I wouldn’t complain too much.

        • Ifalna
        • 3 years ago

        Which would be ~359€ 😛

    • juzz86
    • 3 years ago

    GeForce. Now turned up to 11.

    • DeadOfKnight
    • 3 years ago

    Damn, I just bought a 1080 about a month ago assuming that this card would be priced about $200 more. Oh well, it’s not like I’m struggling at 1440p. I’ll just wait for Volta.

      • EzioAs
      • 3 years ago

      The real thing you missed out on is the price drop on the 1080s (assuming you bought your 1080 at the former MSRP). Win some, lose some.

    • hansmuff
    • 3 years ago

    It’s really nice to see competition again. Thanks AMD!

      • Leader952
      • 3 years ago

      What?

      AMD only competes in the $200 or lower segment.

        • ptsant
        • 3 years ago

        Well, it can be argued that the announcement of the 1080Ti and the price drops are in anticipation of Vega. Or the opposite, the Vega event is an effort to do some damage to 1080 sales.

        Anyway, nVidia did not drop prices out of sheer generosity.

          • ColeLT1
          • 3 years ago

          Every time there is a new high end card, the old high end card drops in price.

          • Billstevens
          • 3 years ago

          It’s the perfect time to steal more high end market share. May as well get ahead of the curve since Vega would likely force a price drop anyway. Also it’s possible manufacturing costs have lined up to make a 1080 price drop not eat into margin.

          Every 1080 buyer between now and Vega’s release is another locked-in customer who won’t buy Vega. I think they will get a lot of bites.

          • TEAMSWITCHER
          • 3 years ago

          When you ship products on time and on budget, you can price your products strategically after they have been on the market some time. This makes it difficult for your competition that has a tendency to ship products late…or very very late in this case.

    • Ochadd
    • 3 years ago

    I’ll be a buyer at $699 supposing there’s enough supply at launch. 1700x + 1080 ti and I’m ready for Mass Effect.

    • deruberhanyok
    • 3 years ago

    And today AMD showed us… nothing. :/

    So we get NVIDIA dropping price on 1080, NVIDIA showing off 1080ti and AMD talking about ice cream and VR.

    This kills me. I was hoping for Vega parts that positioned a little above 1070 and 1080 in performance but below them in price. Sure the 1080ti would still be King, but it would mean that AMD was competitive up to the $500 price range, instead of $200.

    Polaris seems to have done well in that respect, but it feels like all of that talk when it launched of how AMD was just ceding the high-end market to NVIDIA “for the time being” is really turning into “for the foreseeable future”.

      • Krogoth
      • 3 years ago

      Nvidia is just proactively avoiding a price disruption from AMD like what happened with RX 480/470 launch.

        • Voldenuit
        • 3 years ago

        What price disruption? I can find 8 GB RX480s for under $199 these days, but 6 GB 1060s are still $239+, despite performance being essentially identical.

          • Krogoth
          • 3 years ago

          The RX 480/470 debut completely changed the mid-range segment. It used to be 660-tier performance for ~$199 USD for almost two years (the 960 was a disappointment). The RX 480/470 turned that ~$199 USD segment into 970/980-tier performance. Nvidia was forced to throw the 6GiB 1060 in below their original target of $299 USD and quickly come up with a 3GiB 1060 to fight off the RX 470.

          Nvidia wants to avoid the same thing happening with the Vega launch, which is why they are preemptively slashing prices on the 1070 and 1080 to steal “potential” Vega buyers before its debut.

      • derFunkenstein
      • 3 years ago

      $500 for a 1080 is super interesting. Wonder what happens with the rest of the lineup… Probably nothing.

        • EzioAs
        • 3 years ago

          The 1070 is reduced to $349: https://www.techpowerup.com/231125/nvidia-cuts-price-of-the-geforce-gtx-1070-down-to-usd-349

          • USAFTW
          • 3 years ago

            That’s really interesting, since partner cards mostly carried above-Founders-Edition prices.
            Would be nice to see reference and non-reference 1070s at around $350.

          • derFunkenstein
          • 3 years ago

          heh, TPU took the post down. I’m guessing that it’s going to be $349, which isn’t too shabby either.

      • cegras
      • 3 years ago

      If Vega isn’t launching for many months to come, why were you hoping for them to preemptively release their pricing? That implies where it would be positioned, and would give nvidia ample time to prepare a response.

      • Pancake
      • 3 years ago

      hold yer breath and count to 150,000

      • USAFTW
      • 3 years ago

      I really thought they would at least reveal SKUs and prices but no, we got bupkis from AMD. The RX 480 along with its derivatives are right now the only sorta viable cards they’re offering.
      Sad!

      • DragonDaddyBear
      • 3 years ago

      I’m not a developer, but I imagine the circus that comes with a product launch would be very annoying if I were there to, you know, learn some development stuff. It also gives AMD time to check their price and performance expectations so they can launch Vega and MAYBE be competitive.

      • Billstevens
      • 3 years ago

      I would be surprised if Vega matched up well against a 1080 Ti. Odds are, as others have noted, they will compete at $500, trying to edge out a 1080.

      Once GPUs are a year from release, I wonder how much headroom they have to target higher performance.

      It’s possible they are chasing Titan X performance, but nothing we have seen indicates that. Settling for second best always sucks, but on the bright side their card will only cost $500 if that’s the case.

      And a bit better than 1080 performance is good at that price point.

    • jts888
    • 3 years ago

    This is basically a slightly boosted TXP with a 40% price cut.

    Great for consumers, but Nvidia wouldn’t be suddenly “generous” without good reason.

    Panic reaction about Vega to any extent?

      • derFunkenstein
      • 3 years ago

      Hardly. More like they’ve been waiting for Vega.

      • chuckula
      • 3 years ago

      Nvidia looks about as panicked about Vega as they were about Fury when they launched the gtx-980ti.

      • Freon
      • 3 years ago

      They made a similar move with the 980 Ti; I think it’s just to take the wind out of AMD’s sails.

      NV beat AMD to market with a superior product for less money than AMD probably intended to retail their exotic HBM part for.

      I think they have metric tons of margin in all their parts at this point with how efficient they are. Significantly less bandwidth and less power. They can afford to stick it to AMD in the short term to retain a near monopoly.

      • CScottG
      • 3 years ago

      Hmm, maybe *a* reaction:

      https://youtu.be/6PQ6335puOc?t=25

      • Kretschmer
      • 3 years ago

      I think they just want to continue to dominate the market.

      They’ve been gobbling up the high end with an unmatched 1070 and 1080; why not continue to crush at everything north of $350?

      If this is as insane as it looks I’ll be very tempted to sell my month-old 1070 and pick up the first 1080Ti to fall below $650.

      • Krogoth
      • 3 years ago

      Not exactly; it is a binned GP102 chip with a slight clock boost and faster GDDR5X chips to help compensate for the disabled blocks and memory controller.

      1080Ti most likely has little overclocking headroom but it doesn’t really need it either.

      • NovusBogus
      • 3 years ago

      Haven’t they always done it this way? Certainly the last couple of generations. First comes the lolprice GPGPU, then the x80 Ti for half as much.

      Cutting the 1080 might be a reaction to Vega, but it was mad overpriced to start with so this was probably necessary to still sell the thing. I actually do think Vega will be legit, but this feels more like internal product re-positioning.

      • Redocbew
      • 3 years ago

      Given that it’s not just a price cut I’m not sure how much Nvidia could “panic” even if they thought it necessary.

      • renz496
      • 3 years ago

      some people said nvidia is scared of Vega, hence the lower-than-expected pricing on the 1080ti and the price cuts on the 1080. but then again, nvidia’s x80ti has always been priced in that range:

      780ti – $699
      980ti – $649
      1080ti – $699

      unless the card bears the “titan” name, nvidia still hasn’t crossed the $699 mark since 2013. but people never really consider that nvidia did this to “torture” AMD financially. i mean, look at what nvidia did with the 900 series: the 970 at $330 pretty much forced AMD to drop the 290X and 290 they had been selling for $400-$450 down to the $300 mark, while nvidia leisurely sold the 980 at $550. then nvidia ninja’d the Fury X with the 980ti at $649. the fact that AMD did not undercut nvidia’s pricing and then decided to price the Nano at $649 shows that AMD probably preferred to sell them at a much higher price. i dare bet nvidia is most likely trying something similar here.

    • GrimDanfango
    • 3 years ago

    Huh, I was wondering where the hell they got 11GB from; now I see.
    A 352-bit memory interface has to be the weirdest GPU configuration yet.

    That price seems impressively only-slightly-extortionate… considering I reckoned $799 was going to be the best case scenario and $899 seemed more likely.
    Of course, that could just be clever marketing having pre-seeded my brainwashed brain with higher figures… but well, it’s a pleasant surprise considering I’d already decided to buy one regardless.

    Top stuff!

    Edit: I hope AMD manage to steal back some of the thunder with Vega. Best of luck to ’em!
    Competition is good!

      • JustAnEngineer
      • 3 years ago

      11 GiB on a 352-bit bus for the GeForce GTX1080Ti is 11/12 of the full GP102 GPU in the Titan X (Pascal, 2016). This is similar to the crippling that they did on the GeForce GTX970, which had only 3½ GiB of usable memory – 7/8 of the full GM204 GPU in the GeForce GTX980.

        • ImSpartacus
        • 3 years ago

        No, that was very different.

        The 970 had 3.5+0.5GB of vram. It had all 8 controllers.

        The difference was that it lost some ROPs (and therefore L2), so there weren’t enough resources to feed that last controller at the same rate as the others.

        http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation

        With the 1080 Ti, the understanding is that this is 11GB of vram, not some 11+1GB setup.
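The asymmetry described in that AnandTech piece, as rough arithmetic (7 Gbps GDDR5 across 32-bit controllers; this sketch ignores the fact that the two pools can't be read in parallel):

```python
# GTX 970: all eight 32-bit controllers are present, but the disabled L2/ROP
# partition leaves the eighth controller on a slower, shared path.
gbps, ctrl_bits = 7, 32
fast_pool = 7 * ctrl_bits * gbps / 8  # 196.0 GB/s across the 3.5 GB segment
slow_pool = 1 * ctrl_bits * gbps / 8  # 28.0 GB/s for the last 0.5 GB
print(fast_pool, slow_pool)
```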

          • JustAnEngineer
          • 3 years ago

          Once NVidia updated the drivers to use only the 3½ GiB of memory on the GeForce GTX970 that runs at normal speed, that other ½ GiB might as well have not been there for anything other than marketing purposes.

    • rudimentary_lathe
    • 3 years ago

    I demand my RAM allotment on higher end cards come in multiples of two. Hard pass.

      • GrimDanfango
      • 3 years ago

      I plan to get one, but it’ll certainly grate on my OCD… I’m bad enough about 12GB not being a nice round power of two.

      • Freon
      • 3 years ago

      Most graphics cards only go to 8. This one goes to 11.

        • Voldenuit
        • 3 years ago

        But why don’t they just make 10 one louder?

          • UberGerbil
          • 3 years ago

          https://xkcd.com/670/

            • psuedonymous
            • 3 years ago

            “And for $2,000 I’ll build you one that goes to 12”

            Wow, the Titan X is a steal at only $1,200!

    • chuckula
    • 3 years ago

    Weird RAM configuration… Smells like gtx-970 all over again!

    Overall, it’s strong competition for a future Vega launch.

      • Krogoth
      • 3 years ago

      It is more likely that Nvidia just binned GP102 yields by disabling shader, ROP, and memory-controller blocks. They threw in faster GDDR5X chips to compensate for the loss of the memory controller.

      • SoM
      • 3 years ago

      that +1GB is compensation pay for the 970 fiasco

      • ImSpartacus
      • 3 years ago

      No, they just disabled an entire controller.

      A 384-bit bus is 12 32-bit controllers, each paired with a memory package.

      Since they had access to Micron’s updated GDDR5X (we knew about this weeks ago when Micron’s catalog updated), they can disable an entire controller and get roughly the same bandwidth using only 11 controllers and 11 memory packages.

      The 970 was unique in that it had all of its controllers, but lacked the ROPs/L2 to properly feed them. So it had two “tiers” of memory in a 3.5+0.5GB setup. All 4GB were technically there.

      http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation

      With the 1080 Ti, there’s no 11+1GB setup. It’s very simple. They chopped off 1/12 of everything.
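The same controller arithmetic, applied to the 1080 Ti's cut-down GP102: everything simply scales by 11/12.

```python
# Twelve 32-bit controllers on full GP102, each with a 1 GB package; disable one.
controllers, ctrl_bits, gb_per_package, gbps = 11, 32, 1, 11
print(controllers * ctrl_bits)             # 352-bit bus
print(controllers * gb_per_package)        # 11 GB of VRAM
print(controllers * ctrl_bits * gbps / 8)  # 484.0 GB/s peak bandwidth
```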

    • synthtel2
    • 3 years ago

    Why be OCD when you could instead think of how this one goes to eleven?

      • ptsant
      • 3 years ago

      32-bit sub-channel × 11 → 352-bit interface, each attached to 1GB. There you have it.

      • Neutronbeam
      • 3 years ago

      It’s NOT OCD…it’s CDO, in the correct alphabetical order that god intended.
