Rumor: More GTX 1050 Ti and 1080 Ti details pop up

We've been hearing word through the grapevine that Nvidia is preparing more Pascal processors for upcoming graphics cards. A deleted post on ChipHell had a GPU-Z screenshot seemingly confirming the existence—and some of the specifications—of a GP107 chip that rumors say is bound for the GeForce GTX 1050 Ti. Meanwhile, at the other end of the market, rumors abound regarding a GP102-based GeForce GTX 1080 Ti.

First, the "little" guy. As the rumors go, there will actually be two GPUs based on the GP107 processor. The fully-enabled GP107 chip is rumored to have 768 shader processors. That would imply that it comprises six Pascal SMs, which in turn gives it 48 texture units. The screenshot from ChipHell lists the GP107 chip as having 64 TMUs, however. That may mean that the chip is structured differently from other Pascal parts, that GPU-Z is misreading the pre-release card, or simply that the screenshot is fake.
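
The math behind that discrepancy is easy to check. Here's a quick sketch in Python, assuming the usual Pascal building blocks of 128 shader processors and eight texture units per SM (those are the GP104 ratios; GP107's internals remain unconfirmed):

    # Sanity-check the rumored GP107 configuration against known Pascal ratios.
    SHADERS_PER_SM = 128   # shader processors per Pascal SM (as on GP104)
    TMUS_PER_SM = 8        # texture units per Pascal SM (as on GP104)

    rumored_shaders = 768
    sms = rumored_shaders // SHADERS_PER_SM   # 768 / 128 = 6 SMs
    tmus = sms * TMUS_PER_SM                  # 6 * 8 = 48 TMUs, not GPU-Z's 64

    print(f"{sms} SMs -> {tmus} TMUs")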

In any case, rumors agree that in its fully-enabled form, the GP107 chip will have a 128-bit path to 4GB of GDDR5 memory. That configuration will purportedly be called the GeForce GTX 1050 Ti. A cut-down version featuring 640 shaders and 2GB of memory will apparently be known as the GeForce GTX 1050. These cards are very similar in both name and configuration to the first-generation Maxwell GTX 750 and 750 Ti parts. Thai tech site Zolkorn claims the new cards will be launching the last week of October, and predicts a price of $149 for the 1050 Ti.
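
For reference, that 128-bit bus puts a hard ceiling on memory bandwidth. Here's a rough estimate, assuming the 7 GT/s GDDR5 typical of current cards (the rumors don't specify a memory speed):

    # Peak memory bandwidth = bus width in bytes * per-pin transfer rate.
    bus_width_bits = 128
    transfer_rate_gtps = 7.0   # assumed; the leak doesn't give a memory speed

    bandwidth_gbps = bus_width_bits / 8 * transfer_rate_gtps
    print(f"~{bandwidth_gbps:.0f} GB/s peak")   # ~112 GB/s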

The rumored GTX 1080 Ti is a far different beast. Whispers about this card have existed at least since the launch of the newer Titan X. Just as the GTX 980 Ti brought the lofty GeForce Titan X's performance into a price range attainable by mere mortals, so too should the 1080 Ti be a more affordable way to get most of the 2016 Titan X's performance. HWBattle has an image showing data that was allegedly snatched from Nvidia's site before being quickly removed, and it lists the supposed GeForce GTX 1080 Ti as having 3328 shader processors and a boost clock of 1623 MHz.
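
If those figures hold up, they imply a peak single-precision rate of roughly 10.8 TFLOPS. Here's the back-of-the-envelope version, counting the customary two FLOPS per shader per clock:

    # Peak FP32 throughput = shaders * 2 FLOPS per clock * boost clock.
    shaders = 3328
    boost_clock_ghz = 1.623

    tflops = shaders * 2 * boost_clock_ghz / 1000
    print(f"~{tflops:.1f} TFLOPS")   # ~10.8 TFLOPS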

Chinese tech site Zol.com.cn leaked a few more details about the card a few days ago. Contrary to the image linked above, the site said that the board could use 12GB of GDDR5X memory at 10 GT/s, just like on the Nvidia Titan X. The site also claimed that the new card will launch at CES 2017 next January, and that it will be priced somewhere between the GTX 1080 and Titan X. That would put the GTX 1080 Ti's release date around the same time as AMD's upcoming Vega GPUs. Since both green and red teams try to one-up each other with simultaneous releases, the rumored date seems at least plausible.
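
If the card were to inherit the Titan X's 384-bit memory bus (an assumption on our part; the leaks mention only capacity and transfer rate), 10 GT/s GDDR5X would deliver the same 480 GB/s of peak bandwidth as the Titan X:

    # Peak memory bandwidth for the rumored GDDR5X configuration.
    bus_width_bits = 384        # assumed; matches the Titan X
    transfer_rate_gtps = 10.0   # from the Zol.com.cn report

    bandwidth_gbps = bus_width_bits / 8 * transfer_rate_gtps
    print(f"{bandwidth_gbps:.0f} GB/s peak")   # 480 GB/s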

Comments closed
    • HERETIC
    • 3 years ago

    Latest rumor to add to list-
    Could be fabbed on Samsung 14nm
    [url<]https://www.techpowerup.com/226380/nvidia-gp107-gpu-built-on-samsung-14-nm-node[/url<]

      • ImSpartacus
      • 3 years ago

      That might explain the clocks, lol.

    • NovusBogus
    • 3 years ago

    Interesting, but…I want my x60 Ti, dammit!

      • ImSpartacus
      • 3 years ago

      Nvidia has no incentive to cut down gp104 that much if the 1070 is selling well at $400ish.

    • Gastec
    • 3 years ago

    You will be able to buy the GTX 1080 Ti (availability: one every week) from Amazon for $1000-1200 in the US, and for €1300-1500 or more elsewhere, because greed is good.

    • strangerguy
    • 3 years ago

    Blah, blah, always the same old “evil Nvidia price gouging” complaint about what are, by definition, luxury goods. Those who aren’t happy with the prices should whine about the lack of useful AMD competition, because this is just the free market doing its supposed job. Nvidia doesn’t owe you shit.

      • Srsly_Bro
      • 3 years ago

      agreed

      #entitlementgeneration

      • BurntMyBacon
      • 3 years ago

      [quote<]Blah, blah always the same old "evil Nvidia price gouging" ...[/quote<]
      It may be old, but the situation is still relevant. When nVidia stops price gouging, we (at least the reasonable ones) will stop complaining about it. Also note that if nVidia doesn't see some form of negative feedback from its customers, it will have no incentive to pull back prices. I tend to think that not purchasing their product when they are price gouging is a powerful form of negative feedback, but it would seem that plenty of others would rather buy it and later complain about it on forums that nVidia doesn't monitor.

      [quote<]... on what is by definition luxury goods.[/quote<]
      Exactly. Don't buy if you can't afford it or don't like the price.

      [quote<]Those aren't happy with prices should whine about the lack of useful AMD competition[/quote<]
      I do. Doesn't mean I can't also complain about nVidia price gouging. I'm both a proponent of free speech and an equal opportunity hater.

      [quote<]Nvidia doesn't owe you ****[/quote<]
      They owe me what I pay for and nothing more. Whether I want to pay the apparent early adopter fee, wait for prices to come down to reasonable levels, purchase from a competitor, or (and this one is huge) [b<]forgo a new video card[/b<] is at my discretion. I don't owe nVidia **** either.

    • jessterman21
    • 3 years ago

    Those are almost exactly the numbers GPU-Z shows for my GTX 960

    • USAFTW
    • 3 years ago

    So it seems we won’t be seeing a fully-enabled GP102, at least in the foreseeable future. It amazes me that I now have to pay $1,200 (or a lot more in the rest of the world) to get a card that is quite fast, but not the fastest iteration, with a die roughly the size of Hawaii (471 mm² vs. 434 mm²) and further cut down.
    There’s no doubt anymore in my mind that it’s solely down to AMD’s uncompetitiveness. The Titan X Pascal is what used to be, six years ago, a GTX 570 at $369.
    I hope I’m wrong and the 1080 Ti or maybe a 1080 TiTi has a fully enabled ASIC inside, but boy it bugs me.

      • tsk
      • 3 years ago

      The Quadro P6000 uses the full GP102 chip.

      • Voldenuit
      • 3 years ago

      You’re right that pricing is a function of competition in the market, and to a lesser extent, yields and margin.

      Die harvesting/disabling is also a function of the above three, as well as power budget. We’ve seen this in the desktop and mobile versions of the 1070, where the mobile version has more shader processors but runs at lower clocks (and presumably with lower TDP and temperature targets).

      But as long as AMD remains uncompetitive in mindshare, market share and performance, nvidia will not have the same pressure to provide as much performance and value to the end user as they might have.

      • EndlessWaves
      • 3 years ago

      Solely down to that? Hardly.

      You just have to look at the rest of the gaming hardware market. There’s no shortage of competition in keyboards for example, yet nobody would deny that there are lots of expensive models that aren’t everything that they could be and have a nice wide profit margin.

        • ImSpartacus
        • 3 years ago

        Cmon, we all remember the generation where r600 caused g92 to sell for way way lower than previous gpus had sold for. Nvidia didn’t sell its flagship for like $300 because it wanted to be a nice guy. That wasn’t that long ago.

        Competition matters.

          • jihadjoe
          • 3 years ago

          lol R600 was so bad it couldn’t compete with G80 in terms of price:performance despite a full G80 being like $600. When G92 came out Nvidia realized it could beat R600 with a chip that cost half as much to produce, so they did and totally clobbered ATI’s marketshare.

          In case anyone needs a flashback:
          [url<]https://techreport.com/review/12458/amd-radeon-hd-2900-xt-graphics-processor/[/url<]
          [url<]http://www.anandtech.com/show/2231[/url<]

          Edit: Back then marketshare made a huge difference, which is why Nvidia was willing to cut prices in order to gain marketshare. These days Nvidia's marketshare is so big that going on a price war with AMD would hurt rather than help NV's bottom line, so instead they use their 'preferred' status to go for bigger margins by raising prices.

            • ImSpartacus
            • 3 years ago

            I’m sorry, I meant rv670, effectively the 3870 and 3850. I forgot that r600 was the infamous 2900 xt. But I think you know what I mean.

      • ImSpartacus
      • 3 years ago

      There’s time. Volta isn’t due for consumers until 2018 (probably mid-to-late 2018). Nvidia does one Titan every year. We’ve got the Titan X for 2016, but 2017’s Titan can’t use Volta, so it’ll be Pascal-based.

      The 2017 Titan will probably be a fully enabled GP102 with higher clocks and 24GB of GDDR5X (clamshelled).

      Think about what happened with the original titan (cut down, 6gb), then the 780 ti (fully enabled, 6gb), then finally the titan black (fully enabled, 12gb, clamshelled), all based on gk110. It happened over two years, which is the time frame that we’re looking at.

    • Chrispy_
    • 3 years ago

    GP107 matters a lot for laptops. It’s likely to be a sub-60W part in its 768-shader/6-SM form and a sub-50W part in its 640-shader/5-SM form.

    To put that in perspective, look at the cheap, thin, light laptops with GTX 940m and GTX 950m in them at the moment, and now imagine them playing your favourite games at 1080p60 without sounding like a hairdryer.

    Yes, it will be nice not to need a big heavy smelly hot noisy gaming laptop to game on.

      • JustAnEngineer
      • 3 years ago

      [quote="Chrispy_"<]It will be nice not to need a big heavy smelly hot noisy gaming laptop to game on.[/quote<]
      Smelly?

        • Noinoi
        • 3 years ago

        Maybe it’s the smell of… hot-running chip-warmed air out of exhausts. (Apparently, whatever it has, it does smell a bit if you’re in front of a warm exhaust.)

        Or maybe it’s just a fun mistake 🙂

        • Chrispy_
        • 3 years ago

        I’m being facetious, obviously. Laptops only smell if they burn and die.

          • bandannaman
          • 3 years ago

          You, sir, have clearly never power-gobbled an XL bag of Cheetos while binge-watching Game of Thrones on your laptop.

            • Chrispy_
            • 3 years ago

            Good grief no, not even an XL bag would last more than one episode.

          • UberGerbil
          • 3 years ago

          Have you ever dealt with “fixing” a laptop that is repeatedly and arbitrarily shutting down because the fan vents are clogged? Believe me, they can run hot enough to burn the hair (whether from Persian cats or teenage girls) that’s wrapped around the fan, and that definitely smells.

      • HERETIC
      • 3 years ago

      It’ll be interesting to see what happens here with lappys.
      I’m seeing these as approximately a 960 and a 950, performance-wise.

      On desktop I’m expecting similar power usage, as cut-down dies tend
      to have the same power usage as full ones.
      Laptop makers might reduce the performance of the 1050 to fit a lower TDP.

      • Firestarter
      • 3 years ago

      that’d be nice yes, but then new games come out and your small, light, cool and quiet gaming laptop that smells like roses is suddenly insufficient again

      as long as desktop GPUs routinely burn 200W+ to make pretty graphics, gaming laptops will have to compromise

        • chµck
        • 3 years ago

        that’s true, but laptop GPUs are usually binned dies that can run at lower volts.

          • Firestarter
          • 3 years ago

          so they’re slightly more power efficient. That won’t make up for a 100W difference in power that you can never have in a laptop chassis

        • VincentHanna
        • 3 years ago

        I disagree. The thing is, the average laptop has less than a 1200×800 display. The average GAMING laptop has a 1080p display. Not 2K, not QHD, and certainly not 4K. Because of that, something very close to a 1050 (not saying a 1050, but a 1070 certainly, and we are in the same ballpark here) will not be “insufficient.” Not now, not 5-8 years from now. You’ll be able to play 1080p on ultra until the cows come home because the GPU will still, no matter what, only be drawing for Full HD.

          • Firestarter
          • 3 years ago

          that’s not even true anymore for my OC’ed AMD HD7950. Yeah, those mobile chips will probably be faster than my GPU, but not fast enough to be doing ultra at 1080p in 5 years when they’re barely good enough to do it [i<]today[/i<].

          I don’t disagree that they’ll probably be OK for a long time, but the only way they’ll stay relevant in 5 years is if you adjust the detail levels to be somewhat equivalent to what they are today, which I damn well hope won’t be “ultra” by then anymore but rather “medium”. If desktop GPUs can’t manage significantly more eye-candy at 1440p in 5 years, then I guess I should just stop reading Tech Report because there’ll barely be anything worth reporting on.

            • I.S.T.
            • 3 years ago

            It’s a lot worse if you had a midrange card that happened to be around the speed of the consoles, but you’ve got DX11 rather than the lower-level APIs the consoles get to work with. I can’t even play Evolve at lowest settings (other than resolution; in Evolve, blurring up your screen can make it hard to play, and my LCD and my card have crappy scalers) and hold a consistent 30 FPS. I’ve got a GTX 660, which has roughly the same horsepower as the PS4, give or take about 10%.

            I don’t think the 7870 fares too well either, but that arch had driver issues if you recall. It took like 8 months before the high-end cards were sorted out and started getting the performance they actually had the potential for. Not sure how long it took for the low-end cards, or whether they needed somewhat different balancing due to the cutting-down of resources that produces the lower-tier SKUs. They might be faster than a GTX 660 now, like the 7970 GHz wound up faster than the GTX 680 once the driver issues were sorted out. A 100% brand-new arch means the driver team isn’t fully grokking how to get the best out of it. It’d happen to anyone; I don’t blame AMD for that.

            • HERETIC
            • 3 years ago

            Yup, the goalposts are moving at a reasonable rate.
            Four years ago the 7870 was considered the best-value 1080p card one could get,
            and that continued with the 270.
            When the GTX 970 came out, it was considered the best-value 1440p card.
            Today the 1060 3GB, which is slightly faster than the 970, is considered a 1080p card.

            BUT
            Laptops are all about compromise. Everything is tweaked/restricted to reduce
            power and heat, and for casual gaming on a lappy that’s not a mobile desktop,
            the 1050 looks like the perfect fit.

    • mczak
    • 3 years ago

    “five Pascal SMs”.

    Errm, not quite. 768/128 != 5 – not even if you calculate it on an old Pentium :-).

    6 SMs. Which means 48 TMUs, and just forget whatever number GPU-Z spits out.
    (FWIW 6 SM per GPC would be a new record, if the assumption holds that there’s only a single GPC – wouldn’t be unlike GM107 which also had 5 SMM per GPC whereas other maxwell chips had a maximum of 4 per GPC.)

      • RAGEPRO
      • 3 years ago

      Man, I did that math like seven times. Bruno and I even talked about it.

      You’re right, of course. I dunno what happened. Thanks for the heads-up.

        • sweatshopking
        • 3 years ago

        why the crap don’t you have gold yet?

          • RAGEPRO
          • 3 years ago

          That’s a little personal, but I’m not in the best financial shape, man.

            • sweatshopking
            • 3 years ago

            I assumed you’d get it because you were an employee, as the other employees generally had it. My apologies, I wasn’t trying to imply anything financial.

            • SomeOtherGeek
            • 3 years ago

            Hang in there! The way you are writing, things will brighten up.

    • tsk
    • 3 years ago

    The 1080ti will be right on time for Vega, poor AMD.

      • Vhalidictes
      • 3 years ago

      I’m not sure how much that will matter, given the current NVIDIA prices.

        • brucethemoose
        • 3 years ago

        The pricing will depend on Vega’s relative performance and pricing.

        • ImSpartacus
        • 3 years ago

        Maybe, but look at what happened to Fiji. Two weeks before its release, Nvidia releases a cheaper non-Titan GM200 that basically matches its performance blow for blow, except it’s priced at $650, not $1000. So Fiji gets to enjoy exactly none of those high prices that had previously been in effect, AND the product is deemed a failure in the eyes of the public because it was literally a couple of weeks too late.

        Nvidia has lost a little market share. I’m wondering how threatened they feel. I’m fascinated to learn more.

          • VincentHanna
          • 3 years ago

          At the moment, I don’t think Nvidia feels threatened at all. MSFT has been moving to make gaming PCs more enticing with PlayAnywhere(tm), and gamers are brawling over Nvidia’s most high-end offerings because of the massive gains brought on by the new node. People who normally buy every other year, or every 3 years, have flooded the market, and it’s creating a bit of a bubble.

          As a result they can enjoy their higher margins and ignore AMD’s marketshare, since we are in the free-flowing milk and honey stage.

            • ImSpartacus
            • 3 years ago

            Yeah, that’s probably what’s going on.

            Even towards the end of 28nm, Nvidia waited what felt like ages before cutting the price on the 970. They simply didn’t have to – I mean, it’s not like it’s the single most popular GPU in the Steam hardware survey for nothing.

            I’m wondering if Nvidia will stay strong and resist the urge to drop prices on stuff like the 1080 when vega releases. I mean, Nvidia could land a vega-beating 1080 ti at like $800 so the $600ish 1080 still doesn’t have to budge (unless a cut down Vega is competitive). Compared to a $1200 titan x, a 1080 ti delivering 90-95% of titan performance for 2/3 the cost almost seems like a good deal. I’m very interested in the next 6 mo.
