Rumor: the GeForce GTX 1050 will arrive in mid-October

Benchlife.info is reporting that Nvidia will be answering AMD's Radeon RX 460 with a Pascal-powered GeForce GTX 1050. If that site's sources are correct, the new GP107 chip on this card includes 768 stream processors across six Pascal SMs. A 128-bit path to 4GB of GDDR5 memory runs at 7GT/sec, giving the card as much as 112GB/sec of memory bandwidth. If Benchlife's GPU-Z screenshot is correct, the card will have 32 ROPs and 64 texturing units. The card may run at 1316MHz base and 1380MHz boost clocks.
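The quoted bandwidth figure follows directly from the rumored bus width and data rate: a 128-bit bus moves 16 bytes per transfer, and at 7 GT/s that works out to 112GB/s. A quick sanity check of the leaked numbers (these are the rumored figures, not confirmed specs):

```python
# Sanity check of the rumored GTX 1050 memory specs (leaked, unconfirmed).
bus_width_bits = 128      # rumored memory interface width
transfer_rate_gt_s = 7    # GDDR5 data rate in gigatransfers/sec

bytes_per_transfer = bus_width_bits // 8                    # 16 bytes per transfer
bandwidth_gb_s = bytes_per_transfer * transfer_rate_gt_s    # 16 B x 7 GT/s

print(f"{bandwidth_gb_s} GB/s")  # 112 GB/s, matching the reported figure
```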

Source: Benchlife.info

If Benchlife's information is correct, the GTX 1050 will have the same number of stream processors as the cut-down GM206 chip in the GTX 950. It'll just run much faster. It's also getting 25% more ROPs and texturing units than the Maxwell part before it. While this rumored GTX 1050 might take over the GTX 750 Ti's place in the product stack, it's quite a bit more capable than that mini-Maxwell. We'd expect this part to deliver performance somewhere between that of a GTX 950 and a GTX 960.

The rumored GTX 1050's 75W board power puts it in direct competition with the Radeon RX 460 that we reviewed not long ago. If these rumors are correct, the GTX 1050 will have twice as many ROPs as the Radeon and slightly more texturing resources. Beyond that, the two cards are pretty closely matched on paper. Expect a contentious battle in the budget graphics space if (or when) the GTX 1050 actually breaks cover.

Comments closed
    • DragonDaddyBear
    • 3 years ago

    If the rumors/leaks are true, then this does not bode well for the inevitable comparison between the GTX 1050 and the RX460. The RX460 got its clock cleaned by hotter-clocked GTX 950s. But was the RX460 not fully enabled? I hope AMD has something in the works, because the RX460 doesn’t look good if the Green Team decides to price the GTX 1050 competitively.

      • DoomGuy64
      • 3 years ago

      Keyword being priced competitively. Doubt we’ll see that, because since when.

      The only good thing here is that it finally seems like a decent low end nvidia card that is capable of playing games @ 1080p. The catch is that if it sells for $200, the 4gb 470 would be a better buy. I don’t think this card is competing with the 460, nor will it be priced against the 460.

      Nvidia certainly has some weird mid-range products this gen. What I mean by that is their products are selling at points that don’t make sense for the performance. The 1060, for example, is priced as a 1440p card, which it’s not, and if you’re gaming @ 1080p, the 1050 seems like a smarter choice.

      I think Nvidia really needs a 1065, priced as the existing 1060, because the performance gap between the 1060 and 1070 is too large. The existing 1060 is currently just a poor value 1080p card. AMD has hit the mid-range price points much better this gen.

    • chµck
    • 3 years ago

    So both the 1050 and the 460 will fit easily into the <75W TDP of a thin gaming notebook.
    Even if the 1050 is a little faster, I’d still go with the 460 for freesync over gsync.

      • K-L-Waster
      • 3 years ago

      Really not seeing why you’re getting downvotes (this coming from a guy who usually ends up buying from team Green….)

        • DPete27
        • 3 years ago

        I don’t think Nvidia is using GSync modules in notebooks. They’re using the VESA adaptive sync in the mobile arena.

          • DrDominodog51
          • 3 years ago

          I think the two sentences were completely unrelated.

            • DPete27
            • 3 years ago

            Ah, makes more sense if you look at it that way. Thanks!

            • chµck
            • 3 years ago

            I forget that those of us who exclusively use notebooks as desktop replacements are in the minority.
            I would prefer the 460 in my next notebook so i can pay less for external freesync monitors.

      • Voldenuit
      • 3 years ago

      “So both the 1050 and the 460 will fit easily into the <75W TDP of a thin gaming notebook. Even if the 1050 is a little faster, I’d still go with the 460 for freesync over gsync.”

      For running an external display, or for gaming on the laptop screen? Because both AMD’s and Nvidia’s adaptive-sync implementations on the laptop should be transparent to the user. It’s not like you can easily swap out screens on your laptop, after all.

    • Aranarth
    • 3 years ago

    BUT! What happens if these rumors are NOT correct?! 😀

      • Redocbew
      • 3 years ago

      Shock and dismay! What ever will we do?

    • tipoo
    • 3 years ago

    Apart from the top end, which AMD wasn’t going to address soon anyway, Nvidia’s game this round seems to have been to release an answering card only once AMD has shown its cards (literal and figurative), so the price and final configuration can match.

    And with the more efficient architecture and fab, it’s a wise game: at any given wattage they can outmuscle AMD by a fair bit without spending more die area. Meanwhile, AMD can’t afford to play the waiting game, so their hand is forced.

      • Voldenuit
      • 3 years ago

      Pricing maybe, but configuration?

      It takes months to years to architect, design, fab, test, tweak, and then commit to production runs. There’s simply not enough time to do any major tweaking outside of clock speeds, voltages, die harvesting, etc., to respond to a competitor in a meaningful manner.

      We can see that AMD tried to turn the voltage+clockspeed dial at the last minute with the 480, and it didn’t gain them anything other than bad press. Nvidia’s had the final specs for the 1050 for months, and their OEM partners have been in the loop with pricing, RAM configuration and TDP for a while as well, so they can plan their product lines.

      Nvidia simply hasn’t had to respond to AMD in any meaningful measure at many, many price brackets, so they were free to launch and announce their parts to suit their schedule and their board partners’.

        • tipoo
        • 3 years ago

        >die harvest,

        Primarily what I meant, plus final pricing. Many of the lower-end parts are die-harvested higher-end parts; maybe they left some flexibility in choosing how high to go and how much yield to trade off based on what AMD would do. I.e., maybe they planned two options per tier or something. It takes years to architect a new GPU architecture, but as AMD’s example shows (even if it went poorly for them), some changes can come in the final months. Just a thought.

      • DPete27
      • 3 years ago

      I think the game is all about what people remember. It’s about overshadowing your competitor:

      1) Nvidia was first out of the gate with the highest-performing chips (likely for this year), so they win that crown.
      2) Then they answer each of AMD’s product launches with a slightly better-performing card that consumes less power in the same price bracket.
      3) AMD shoots themselves in the foot with the RX480 power issue, piss-poor reference cooler, and dismal supply, while Nvidia outperforms AMD in each price bracket, consumes less power, and has enough supply that customers can actually buy their products.

        • tipoo
        • 3 years ago

        That’s true. I think the importance of the flagships is understated; sure, the $200 480 is the biggest-selling price point, but I imagine the population of people going “Who makes the best GPUs right now? Oh, Nvidia? I’ll just buy their $200 card” isn’t insubstantial.

          • ImSpartacus
          • 3 years ago

          I think you’re probably right. The halo effect is real.

          • DPete27
          • 3 years ago

          Nvidia has been playing the “steal AMD’s thunder” card pretty heavily this generation. Remember, right before the RX480 launch (for example), Nvidia leaked rumors of the GTX1060 to draw attention away from AMD.

          Hype your product just before the competition launches theirs. Then launch yours shortly after, so your product is freshest in people’s minds.

            • stefem
            • 3 years ago

            Companies have pulled these kinds of tricks forever; it’s not as if AMD was silent before launching the RX480. I remember one of the heads at Sapphire going so far as to say the GTX 1060 was a rushed product (oh really? C’mon, you’re selling the reference-design RX 480! Stop kidding!!), and it was also AMD that organized press meetings parallel to Intel’s IDF (and sometimes in the building directly opposite, too).

          • blahsaysblah
          • 3 years ago

          “Who makes the second-best XXX? I’ll get the cheaper version of that.”

    • appaws
    • 3 years ago

    When are we going to get to the point where low-end discrete GPUs go away? I mean, the point where graphics on the CPU becomes good enough to replace them?

    Like the sound cards of yore being phased out for the average user by pretty decent audio stuff on motherboards.

      • Jeff Kampman
      • 3 years ago

      Even an RX 460 or a GTX 950/1050 is a major upgrade from the best integrated graphics processors out there from AMD or Intel. We’re a long way away from the point where high-quality gaming experiences at 1080p are something you just get for “free” when you buy a CPU.

        • ImSpartacus
        • 3 years ago

        And if Intel has anything to say about it, you’ll never get anything “for free” regarding great integrated graphics. Quite sad.

        • Chrispy_
        • 3 years ago

        Yeah, I was looking at the Skull Canyon NUC IGP results, and the minute you move up from low-res HD-Ready gaming to full 1080p (and many gerbils would consider even 1080p “low-res”), many games become unplayable. The very best that Intel has to offer results in “unplayable” at the most common gaming resolution.

        • ptsant
        • 3 years ago

        They kind of have to be major upgrades, otherwise what’s the point?

        Anyway, the newly announced 9800 APU from AMD will have 512 shaders at 1100 MHz, connected to DDR4. Admittedly, this is the best iGPU out there, but it is quite close to the 460 or 950. A certain gap must be maintained to justify the expense…

      • Airmantharp
      • 3 years ago

      They already are: the true low-end discrete GPUs are gone. Today’s entry-level cards are faster (relative to their own generation!) than the ones released before Intel and AMD got serious about integrated graphics.

      • K-L-Waster
      • 3 years ago

      You could make a case for saying it’s already happening. Nvidia used to sell “20”- and “30”-level cards as the entry level. Now the lineup starts at the “50” level. That used to be lower-mid-range, but it has become entry level because anything below it is effectively obsolete due to IGPs.

      In one or two CPU generations it would not surprise me if the “50” GPUs disappear and discrete cards start at the “60” level.

        • Leader952
        • 3 years ago

        Very valid points.

        The term low-end is a moving target.

        As long as low-end discrete is more powerful (in speed and features) than integrated graphics, it will always exist.

          • K-L-Waster
          • 3 years ago

          I don’t think it’s that cut and dried — I do think that dGPU is going to diminish over time.

          dGPUs will always be capable of more than an IGP. However, as IGPs get more capable we are going to find more and more that IGP is “good enough” for an ever growing slice of the market. That in turn will mean that dGPU vendors will need to make their money off a declining customer base, which eventually will make the market for affordable dGPUs non-viable. At that point, only extremely expensive professional cards will remain.

          As a parallel, consider the camera market. Professionals have and will continue to use expensive cameras (how much does a DSLR cost? Well, depends, how many lenses do you need..?) However, the consumer market for dedicated cameras has shrunk to near invisibility: point and shoots have almost entirely been eaten up by smart phones because smart phone cameras are “good enough” for the people who used to buy point and shoots.

          • JustAnEngineer
          • 3 years ago

          Don’t buy any crappy graphics card less capable than a Radeon RX-460 or a GeForce GTX950. Anything less than that isn’t enough better than integrated graphics to bother with and it won’t provide an acceptable gaming experience in most games.

            • CaptTomato
            • 3 years ago

            I consider the 460 overpriced for its terrible performance.
            Not sure why there’s such a huge gulf between the 460 and the fairly powerful 470, but the 460 makes no sense given its price/perf.

      • tipoo
      • 3 years ago

      When Intel spreads more of the yummy eDRAM love

        • Leader952
        • 3 years ago

        Intel’s CPUs with eDRAM are very expensive parts to make and buy.

        If that is what it takes to compete with low end GPUs then low end GPUs will always exist.

          • tipoo
          • 3 years ago

          Do we actually know it’s as expensive (or more so) to make than an equivalent dedicated card, or do we just know it’s priced (nearly) as high?

          They use an N-1 process for the eDRAM, i.e. their left-over 22nm capacity. I’m not saying it’s cheap, but eDRAM probably has a fair bit higher yield tolerance than GPUs (you can disable bad memory cells and still get to 128MB, vs. path-critical GPU components not working).

      • Bauxite
      • 3 years ago

      That Intel GPU isn’t “free”; hell, on some configurations it’s like two-thirds of the die. What’s sad is how many systems it sits unused in, while you pay Intel’s market dominance (read: fully operational monopoly) premium for it. Imagine the extra cores, or how much cheaper the chip could be; hell, I’d settle for a variant with a big fat L3 cache (not the gravy add-on eDRAM either).

      I’d much rather buy my GPU transistors from AMD or Nvidia: far better (bang+features+support)/$.

        • Airmantharp
        • 3 years ago

        You’re right, which is probably why you’re being downvoted.

        However, I will say that I do use the integrated video for secondary screens on my desktop, and the hardware video codec and transcoding support should be of use to some.

        • tipoo
        • 3 years ago

        There used to be Xeons that were pretty much their Core i brethren without the IGPs and socket compatible, but Intel put a stop to that.

      • Leader952
      • 3 years ago

      “When are we going to get to the point where low end discrete GPUs are going away”

      How about never.

      The idea that integrating GPUs (first into chipsets, then into CPUs) would kill off low-end GPUs has been stated year after year for well over 15 years, and each year it turns out wrong again.

      An integrated CPU/GPU has a set power budget, whereas a discrete CPU plus a discrete GPU always has a higher one. Stating the obvious: that will always leave the discrete solution with higher performance.

      • swaaye
      • 3 years ago

      Most people are using IGPs and that’s how it’s been for a long time now.

      I imagine most of these discrete cards go into OEM machines that need some kind of discrete card: machines marketed to underinformed gamers on a budget, machines that need more outputs, whatever.

      Some probably sell in the consumer add-in-board market to people who, for some reason, just can’t save up a bit more for a more useful gaming card.

        • Airmantharp
        • 3 years ago

        Workstations.

        Xeon CPUs (otherwise transistor-equivalent to desktop i7s, including the 6+ core i7s based on Xeons) do not have graphics cores, and thus usually ship with some extremely stripped-down Quadro or FirePro that’s likely no faster than the better Intel solutions for most purposes.

        If/When Intel decides to start pushing pro graphics drivers (CAD certification and the like), we may even see this trend go away, at least for those workstation parts that are the same silicon as consumer i7s.

          • DrDominodog51
          • 3 years ago

          Some Xeon E3s do have integrated graphics. The problem is the drivers as you said. Intel’s drivers on Windows make AMD’s open source Linux drivers look good.

            • Airmantharp
            • 3 years ago

            It clearly depends on your intended app, but for most desktop stuff and light gaming (LoL) I find Intel’s support satisfactory.

            Getting their drivers certified for CAD (and whatever Nvidia/AMD do) for workstation use would drive a near-final nail into the low-end discrete market.

            Stuff doesn’t have to be fast, it just has to be stable and compatible.

      • crabjokeman
      • 3 years ago

      Nvidia has already abandoned low-end discrete GPUs (unless you count rebadged Fermi/Kepler chips).

    • DPete27
    • 3 years ago

    Gawh! The 6-pin-less GTX950 was shorter-lived than I expected….

    BTW: if this generation’s history is anything to go by, having a 75W board power doesn’t put it in performance contention with the 72W RX460, unless Nvidia’s performance-per-watt advantage somehow fizzles out at the bottom end.

      • derFunkenstein
      • 3 years ago

      It’ll definitely be interesting to see, but I agree with you that its performance will beat the 460’s. I also think it’ll be priced higher; we’re looking at something that will set a new high-water mark in the 1080p / $150 range but not challenge the 460 on price.

      edit: totally rewrote this when I realized it’ll probably cost more than a 4GB 460.

        • DPete27
        • 3 years ago

        Yeah, for sure the GTX1050 will be priced according to the market performance/price curve. I don’t see them undercutting themselves just to beat AMD’s RX460 when the GTX1050 will be the fastest 6-pin-less card available. That pretty much sells itself.
