Rumor: Nvidia to launch GTX 1080 and GTX 1070 at Computex

Nvidia's Tesla P100 is certainly an impressive bit of kit, but we'd wager more than a few gerbils were disappointed that the company didn't announce consumer graphics cards based on the Pascal architecture. If a report by Digitimes is correct, however, green-team gamers everywhere could soon have reason to rejoice.

According to "sources from graphics card players," Nvidia will unveil a consumer Pascal chip to the public at Computex 2016, in the form of its GeForce GTX 1080 and 1070 cards. Digitimes says that card makers will fire up mass production of Pascal-based GeForces during July. Asus, Gigabyte, and MSI are among the players expected to show cards at Computex.

The site also says Nvidia is already clearing out its inventory of current-generation GPUs in order to prepare for the impending launch. As far as the red team is concerned, Digitimes says that AMD is likely to reveal its Polaris GPUs later in the year.

The same industry sources also point out that the graphics card market remains in weak demand, and that most manufacturers are bracing for a 10% drop in shipments in the second quarter of 2016. It's not all doom and gloom, though—the site thinks that with the advent of VR, high-end graphics cards ought to sell well, offsetting the overall drop in shipments.

Comments closed
    • Chrispy_
    • 4 years ago

    [quote<]graphics card market remains in weak demand[/quote<]

    1) All AAA games are pre-order traps consisting of a buggy release that barely functions, whilst the interesting indie games still run fine on a 9800GT.

    2) [i<]Somebody[/i<] divided the gaming monitor market in two and thus stifled adoption of the ONLY THING that could increase demand for new GPUs: higher-refresh, higher-resolution gaming.

    I'm not sure how much damage Gameworks has caused to the AAA games industry and G-Sync has caused to the monitor market, [b<]but I'm confident it's non-trivial.[/b<]

    • anubis44
    • 4 years ago

    It’s funny that this article says nVidia will be first to market with consumer cards, when this guy says just the opposite:

    [url<]https://www.youtube.com/watch?v=_U8c1_yWS10&feature=em-subs_digest[/url<]

    According to him, GDDR5X is still only sampling, not shipping to anybody, so cards based on GDDR5X aren't likely to be released this summer.

    • AJSB
    • 4 years ago

    …and talking about VULKAN and NVIDIA…NVIDIA just launched a NEW driver for BOTH LINUX *and* WINDOWS with lots of VULKAN improvements, and I quote:

    “This updated NVIDIA Vulkan Linux driver brings compliance with Vulkan v[b<]1.0.8[/b<], improved pipeline creation performance and [b<]multi-threaded scaling[/b<], the maximum [b<]bound descriptor[/b<] sets have [b<]doubled[/b<], there is now support for an [b<]asynchronous transfer queue[/b<], improved VK_EXT_debug_report messages, improved Vulkan support on Optimus systems, and a variety of bug-fixes"

    [url<]http://www.phoronix.com/scan.php?page=news_item&px=NVIDIA-364.16-Linux-Vulkan[/url<]

    NOTICE: These drivers are NEW and were NOT the ones used in the test that I posted about previously. In the LINUX test, driver 364.12 was used; the new driver is 364.16. In the Windows 10 test, the driver used was 364.72; the new driver is 364.91.
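    For anyone wondering what the "asynchronous transfer queue" part means in practice: Vulkan exposes it as a queue family whose flags include transfer but not graphics or compute. Below is a minimal C sketch against the stock Vulkan 1.0 API (my own illustration, not something from the driver notes, and not tested against any particular driver) that lists queue families and flags any dedicated transfer family. Build with something like gcc vkqueues.c -lvulkan.

    [code<]
    #include <stdio.h>
    #include <vulkan/vulkan.h>

    int main(void)
    {
        /* Bare-bones instance: no layers, no extensions. */
        VkApplicationInfo app = { 0 };
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.apiVersion = VK_API_VERSION_1_0;

        VkInstanceCreateInfo ici = { 0 };
        ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        ici.pApplicationInfo = &app;

        VkInstance inst;
        if (vkCreateInstance(&ici, NULL, &inst) != VK_SUCCESS)
            return 1;

        uint32_t ngpu = 0;
        vkEnumeratePhysicalDevices(inst, &ngpu, NULL);
        VkPhysicalDevice gpus[8];
        if (ngpu > 8) ngpu = 8;
        vkEnumeratePhysicalDevices(inst, &ngpu, gpus);

        for (uint32_t g = 0; g < ngpu; ++g) {
            uint32_t nq = 0;
            vkGetPhysicalDeviceQueueFamilyProperties(gpus[g], &nq, NULL);
            VkQueueFamilyProperties q[16];
            if (nq > 16) nq = 16;
            vkGetPhysicalDeviceQueueFamilyProperties(gpus[g], &nq, q);

            for (uint32_t i = 0; i < nq; ++i) {
                VkQueueFlags f = q[i].queueFlags;
                /* A transfer-capable family with no graphics/compute bits is
                   the "async transfer queue" the release notes refer to. */
                int xfer_only = (f & VK_QUEUE_TRANSFER_BIT) &&
                                !(f & (VK_QUEUE_GRAPHICS_BIT | VK_QUEUE_COMPUTE_BIT));
                printf("GPU %u, family %u: flags 0x%x%s\n", g, i, (unsigned)f,
                       xfer_only ? "  <- dedicated transfer queue" : "");
            }
        }

        vkDestroyInstance(inst, NULL);
        return 0;
    }
    [/code<]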

      • chuckula
      • 4 years ago

      Yes. The clear win over DX11 in Windows shows that even Nvidia, which has been accused of not being ready for newer APIs, gets a nice performance boost when software is written properly.

        • Deanjo
        • 4 years ago

        [quote<]which has been accused of not being ready for newer APIs[/quote<]

        Who's the dumba** making that claim? If anything, it is usually the exact opposite given their history.

          • chuckula
          • 4 years ago

          [quote<]Who's the dumba** making that claim?[/quote<]

          Oh, they are around, believe me.

            • AJSB
            • 4 years ago

            Those accusations, so far, are all about NVIDIA and its Async Compute issue in Dx12.

      • Deanjo
      • 4 years ago

      And as usual, AMD’s linux support is in the crapper.

      [url<]http://www.phoronix.com/scan.php?page=news_item&px=Radeon-VLK-Windows-Linux[/url<]

      And that has absolutely nothing to do with MS crippling support under windows. Just AMD's typical crap non-Windows support.

        • AJSB
        • 4 years ago

        If you “seriously” play games in LINUX, you have to use NVIDIA dGPUs.
        This is not new; it's been known for YEARS and YEARS.

        In my post about crippled performance, I was talking about OpenGL in Windows (not in LINUX), which affects BOTH AMD *and* NVIDIA.

          • Deanjo
          • 4 years ago

          And again, the OpenGL benchmarks that I have run in the past, from Windows XP through to Windows 10 vs Linux using nVidia's drivers, have consistently shown within a 2-3% difference between Windows and Linux. Sometimes Linux is slightly higher, sometimes Windows is. [b<]It really depends on the individual application.[/b<] Nothing has changed with that between Windows versions. I have been benching the two consistently throughout the years, going back to when there were just a handful of Loki games. Your MS and OpenGL conspiracy does not exist.

        • anotherengineer
        • 4 years ago

        Well fix them then 😉

        [url<]http://www.amd.com/en-us/who-we-are/careers/professionals[/url<]

          • Deanjo
          • 4 years ago

          Can’t fix something they keep cutting resources on. Besides, the summer interns need something to do.

        • kc77
        • 4 years ago

        I wouldn't let that test deceive you. In fact, it was noted in the comments. What you are looking at is AMD's new open source backend vs completely closed source performance in that test. For people who have been following AMD's performance on Linux, that test is actually quite good.

        I’ve used nothing but nvidia when it came to Linux for years (even now). But a couple of run-ins I’ve had with AMD’s open source drivers have been quite positive. If using open source drivers is your thing AMD is actually doing MUCH better than Nvidia here.

        [url<]http://www.phoronix.com/scan.php?page=article&item=nouveau-linux46-kepler&num=2[/url<]

        This is what the performance delta is for AMD:

        [url<]http://www.phoronix.com/scan.php?page=article&item=cat-rad-amdgpu&num=2[/url<]

        Granted, Nvidia's closed-source driver still performs better in Linux than AMD's (which is why I'll probably give AMD another generation before I switch... I might not, though, considering the games I play), but considering that its open source driver barely supports Maxwell at all, I think it's a little unfair to say that AMD hasn't been showing support for Linux.

          • Deanjo
          • 4 years ago

          The open source drivers look great only compared to AMD's own crap Catalyst driver. Compared to the nvidia blob, however, it is just an embarrassment to see 2-3 gen old midrange nvidia cards running the blob laying a smackdown on AMD flagships.

            • kc77
            • 4 years ago

            If that's what you think those numbers show. As I stated before, AMD's work hasn't been on its proprietary blob; all of its effort has gone toward open source, which is why their open source driver supports their latest cards while the Nvidia one is still working on Maxwell support.

            • Deanjo
            • 4 years ago

            Nvidia isn't even working on Nouveau; that is an entirely different team. Nvidia would rather concentrate on delivering a full-featured, stable, and performant driver than divide resources developing an open source one that will not be able to match their proprietary one in features or performance for various reasons. Because of going that route, Nvidia has been able to deliver a Linux driver in a timely manner that is equal to its Windows counterpart. Linux isn't given a “second tier” effort from Nvidia. AMD, on the other hand, has a very small (and ever changing) crew working on their drivers and can't benefit the same way Nvidia can by using the bulk of the codebase cross-platform.

            Let’s put it this way, when I spend several hundred dollars on a new card, I don’t want to wait 2-3 years for it to catch up (if it ever does) to the performance of the windows drivers. This is AMD’s third crack at an open driver since acquiring ATI and promising far better linux support. Plus I also want to have drivers for more than just linux, I want proper BSD support as well.

            • kc77
            • 4 years ago

            Didn't say they were, which is the point. Most of the success of that driver was the result of community reverse engineering, not because Nvidia was trying to be helpful. Let's not even talk about Wayland, which Nvidia's driver doesn't even support. If you want Wayland support then you are stuck with Nouveau, which means you'll get performance similar to what I linked above and you won't have support for Maxwell.

            This is the problem I'm highlighting. It's problematic no matter how you slice it. It's great that Nvidia's proprietary driver works great… just as long as you work within Nvidia's parameters. If you don't, you're SOL, which is the problem. Not supporting Wayland is not being helpful.

            • Deanjo
            • 4 years ago

            Wayland is still pretty much a curiosity at this point, which is why nVidia has not gone gung-ho on it (and the leading distro is going another route with its own solution). Xorg, however, is supported by everyone. That being said, nVidia has started supporting Mir and Wayland with the 364.12 driver release.

            • kc77
            • 4 years ago

            That driver release won't work. It only works on Wayland EGL, which hasn't been merged mainline (as in, support only exists through Nvidia patches).

            It's much more than a curiosity, since Nouveau works, as does AMDGPU, with no patches required. Fedora is going to lead with Wayland support in F25. So it's much closer than you might think, at least for Nouveau and AMDGPU users.

            As for Mir, that's mainly because of Mint in terms of popularity. I don't think there's an official word on whether they will go with Mir or Wayland. If they go with Wayland then there will be far more users of it than Mir.

            • Deanjo
            • 4 years ago

            Fedora is planning on using Wayland for the [u<]login manager[/u<] by default if the drivers support it; the desktop is still X.org, as there is still a ton of work to be done, such as getting all the more popular toolkits to support Wayland. It's not only the drivers that are holding back mainstream adoption of Wayland. (BTW, Ubuntu and Mint have said they are going Mir in the past, not Wayland.)

            • kc77
            • 4 years ago

            Fedora already uses Wayland for GDM. GTK supports it. I don't know about Qt. They have been working on it, but they are nowhere near where the GNOME developers are.

    • southrncomfortjm
    • 4 years ago

    “Nvidia is ready to announce its Maxwell-based Pascal graphics cards at Computex 2016 from May 31-June 4,”

    Maxwell-based Pascal? What? Are they just confused or are we just getting a node shrink in the 1080 and 1070 with the same basic architecture?

      • Sabresiberian
      • 4 years ago

      Other sites report the 1000-series GTX cards will be based on GP104, so I think Digitimes is confused. Of course, it's all rumor at this point and I wouldn't be surprised by anything.

        • nanoflower
        • 4 years ago

        Or they are following the other rumors that Pascal is essentially an enhanced Maxwell chip. Supposedly Pascal was originally planned for 20 nm, but when that failed to pan out they took out the compute features and came out with Maxwell. So what we are getting now is the original Pascal plan, or an enhanced version of Maxwell, or at least that is the rumor.

    • AJSB
    • 4 years ago

    Meanwhile, after receiving latest patches, here is a comparison between Dx11, OpenGL and VULKAN, in Windows 10 and LINUX:

    [url<]http://www.phoronix.com/scan.php?page=news_item&px=Win10-Linux-Vulkan-Early[/url<]

    In conclusion, this is the performance ranking:

    1) VULKAN under LINUX
    2) OPENGL under LINUX
    3) DirectX 11 in WINDOWS
    4) OPENGL in WINDOWS

    Notice that when this game started to support VULKAN in LINUX, using VULKAN was slower than OpenGL, just like the game was slower using VULKAN in comparison to Dx11 under Windows 10. HOWEVER, like I said, the latest patches for this game, patches for the video drivers from NVIDIA for both OSes, and a total of 9 (NINE!) Vulkan "updates" to the specification (they are NOT true updates, but actually DOCUMENTATION clarifications) are making VULKAN now CRUSH all the others.

    Notice also that this game is kind of a One_Man_Show and its VULKAN implementation was initially rushed; when using VULKAN, the game is not FULLY using VULKAN but, AFAIK, still using some OpenGL stuff.

    Notice also that the very poor performance of OpenGL under Windows 10 is normal... OpenGL was sabotaged by Microshaft starting from VISTA inclusive to make it look bad and promote Dx. In XP, OpenGL performance was better than Dx, at least in the games I tried.

      • Klimax
      • 4 years ago

      OpenGL in Windows is implemented predominantly by GPU vendors, not Microsoft. (Also, use of malformed words doesn't give you much credibility either.)

      No DirectX 12. Strange. But then, I wouldn’t be surprised if there were other anomalies in tests to try to get certain results…

      And frankly, there is not much reason to care about low-level APIs. Any performance boosts there are temporary in nature and will be lost (best case; worst case is negative gains).

        • AJSB
        • 4 years ago

        WRONG!
        MS with VISTA changed the graphics paradigm in such a way that, besides other objectives, it completely screwed OpenGL performance in VISTA and beyond!

        I had/have dGPUs from AMD/ATI and NVIDIA, and the same OpenGL game that had excellent performance in XP loses performance in anything after it, starting with VISTA all the way up to W10… are you trying to make me believe that in ALL THESE YEARS, AMD/ATI/NVIDIA didn't manage to make a decent OpenGL driver when previously they could? BULLSHIT!

        Even in modern cards, like GTX 950 or 960 that have drivers for all OS including XP, try to run ANY OpenGL game (compatible with XP) in XP or LINUX (if also compatible with it) and then try to run it in VISTA/W7/W8.x/W10…

        In the test there's no Dx12 comparison because that game only supports OpenGL and Dx11, with VULKAN now added… it's a very small dev without enough resources to implement VULKAN *and* Dx12, which anyway could only run in Windows, as opposed to VULKAN, which works in LINUX and Windows.
        So please spare us the claim that it is a distorted test, because it isn't.
        Like I said, VULKAN was also tested at day one after launch and was performing worse than Dx11 at that time, and NO ONE TRIED TO HIDE IT… but with the new improvements that I mentioned, NOW VULKAN is kicking arse, and more importantly, LINUX performance (when using NVIDIA dGPUs) is actually quite good compared to Windows.

        Your comment about there being no need for low-level APIs is also so hilarious that I won't even bother to answer, except to say that you sound just like someone desperate to play down the test :))

          • Deanjo
          • 4 years ago

          It used to be that if you used the downloaded drivers from MS, you would use their implementation of OpenGL. If you used the downloaded drivers from the vendor, it would use the vendor's OpenGL implementation.
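          If anyone wants to check which implementation a given install is actually using, here's a minimal sketch (assumes GLFW 3.2+ and a desktop GL driver are installed; link with -lglfw -lGL on Linux or glfw3/opengl32 on Windows, and treat it as an illustration rather than a polished tool) that prints the vendor strings of the current context. Microsoft's software fallback reports "Microsoft Corporation" / "GDI Generic", while a vendor ICD reports NVIDIA or AMD strings:

          [code<]
          #include <stdio.h>
          #include <GLFW/glfw3.h>   /* pulls in the platform's OpenGL header */

          int main(void)
          {
              if (!glfwInit())
                  return 1;

              /* A hidden window is enough to get a real context from whichever ICD loads. */
              glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
              GLFWwindow *win = glfwCreateWindow(64, 64, "glcheck", NULL, NULL);
              if (!win) { glfwTerminate(); return 1; }
              glfwMakeContextCurrent(win);

              printf("GL_VENDOR  : %s\n", (const char *)glGetString(GL_VENDOR));
              printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
              printf("GL_VERSION : %s\n", (const char *)glGetString(GL_VERSION));

              glfwDestroyWindow(win);
              glfwTerminate();
              return 0;
          }
          [/code<]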

            • AJSB
            • 4 years ago

            No, actually it's the way I say:

            I NEVER download video drivers from MS for Windows. I ALWAYS download directly from AMD/ATI and/or NVIDIA, I have a good enough collection of OpenGL games to test my claims, and I have no problem formatting the drive and making a fresh install of any OS to check them… I made so many tests that sometimes a single rig had its drive reformatted twice per day for 1-2 weeks.

            I have done precisely that for YEARS and the results are consistent… just like it doesn't matter if the dGPU is from AMD/ATI or from NVIDIA (or an APU from AMD), it also doesn't matter if the CPU is Intel or AMD; again, the results are consistent.

            • Deanjo
            • 4 years ago

            Every OpenGL benchmark that I have run in the past, from Windows XP through to Windows 10 vs Linux using nVidia's drivers, has consistently shown within a 2-3% difference between Windows and Linux. Sometimes Linux is slightly higher, sometimes Windows is. [b<] It really depends on the individual application[/b<]. Nothing has changed with that between Windows versions.

          • Klimax
          • 4 years ago

          Wronger and wronger. Also without any evidence whatsoever. Not even Valve managed to show it. And they tried hard to get there.

          ETA: Mantle already proved that. Battlefield 4 versus Tonga…

    • jb100r
    • 4 years ago

    What is the difference between a 1080 and a 1070?

      • Prestige Worldwide
      • 4 years ago

      Specs are mostly speculation atm. I imagine the 1080 is the full GP104 chip and the 1070 is cut down. Not sure if there will be any other difference, hopefully we don’t see another “3.5gb” debacle with the 1070.

      • ronch
      • 4 years ago

      One is good for 1080p gaming and the other is good for 1070p gaming.

      • yogibbear
      • 4 years ago

      The number 10? I don’t think it’s actually confirmed anywhere that these are the names of the cards.

      • Chrispy_
      • 4 years ago

      A 1070 is when you don’t quite nail the landing of a 1080 cleanly.

        • jihadjoe
        • 4 years ago

        Tony Hawk Pro Skater!

    • albundy
    • 4 years ago

    que in all the false hits on google for 1080p.

      • jihadjoe
      • 4 years ago

      If you did try searching for “GTX 1080” you’ll see there isn’t even a single false hit.

      Although I do recall the time when short/common words were excluded, resulting in that hilarious error when searching for “The Who”, but since then Google has gotten really good with its algorithms. Nothing but awe and respect to the logic and math wizards behind it all.

        • ronch
        • 4 years ago

        Give it time.

      • Anomymous Gerbil
      • 4 years ago

      Que?

        • southrncomfortjm
        • 4 years ago

        No entiendo.

    • anotherengineer
    • 4 years ago

    “The site also says Nvidia is already clearing out its inventory of current-generation GPUs in order to prepare for the impending launch. ”

    Hmmmm *checks gtx 970* (cnd) and

    $556 hmmmmmmmmm is that a clearance price on a 970??

    [url<]http://www.newegg.ca/Product/Product.aspx?Item=N82E16814121912&cm_re=gtx_970_4gb-_-14-121-912-_-Product[/url<]

    ($480 + $12 shipping + 13% tax)

    Edit: oooo, found a refurbished one for about $460 after tax

    [url<]http://www.newegg.ca/Product/Product.aspx?Item=N82E16814487207&cm_re=gtx_970_4gb-_-14-487-207-_-Product[/url<]

      • evilpaul
      • 4 years ago

      I got an open box 970 at Microcenter for ~$300 including tax about a year ago. It doesn’t suffer from the infamous “coil whine” problem, either.

        • nanoflower
        • 4 years ago

        I’ve seen 970s over on jet.com going for about $270 if you take advantage of their first time customer savings code.

      • nanoflower
      • 4 years ago

      What they mean is that supposedly Nvidia has no more orders placed for the chips that go into the 970/980/980TI so the current inventory will be shrinking over time. However that should take at least a couple of months to happen.

    • Nike
    • 4 years ago

    “Nvidia’s Tesla P100 is certainly an impressive bit of kit”

    In many ways, yes it is, but in one way it’s not. Nvidia has now stated the memory bw to peak at 720 GB/s, but originally they claimed Pascal would be hitting 1 TB/s in the mem bandwidth department and would come with 32GB mem onboard.*

    *Source: [url<]https://techreport.com/news/29591/jedec-updates-hbm-standard-with-bigger-stacks-and-faster-speeds[/url<]

      • Leader952
      • 4 years ago

      Only 4GB HBM2 stacks are available NOW.

      When 8GB HBM2 stacks get released and yields improve on the GP100, there will probably be a refresh, and then they will hit those marks.

      • the
      • 4 years ago

      Capacity at launch and the capacity a product eventually ships with are two different things. For example, AMD's Hawaii-based FirePro cards initially shipped with 8 GB of memory, but now you can get them with a whopping 32 GB on board.

      I see memory increases as a common mid-product-cycle (aka rebrand) event, coming a year or so after the new GPUs launch this year.

    • nanoflower
    • 4 years ago

    LOL. Can we add a bit more to the rumor mill? After all, AMD showed working silicon for Polaris 10 and 11 back at the first of the year, while Nvidia hasn't shown anything yet in the way of consumer versions of Pascal (have they even shown working versions of their HPC Pascal?), yet we have the rumor mill saying Pascal will show up this summer and Polaris will show up later in the fall? Maybe it will play out that way, but my bet is that either both companies release products this summer or AMD beats Nvidia to market with new consumer GPUs.

    It’s funny that just a few weeks ago this same site (not TR) posted that Polaris was coming in June. Now they say Pascal is coming in June and Polaris won’t show up till the Fall. Something feels a bit off in the reporting.

    • Tirk
    • 4 years ago

    Well, if they release Pascal in June, that is well before the GDDR5X ramp-up any company has announced so far.

    I hope you all like good old GDDR5 in your “miracle” GPU.

      • chuckula
      • 4 years ago

      It'll be just as good as the GDDR5 that AMD is slapping into every model of Polaris.

        • Tirk
        • 4 years ago

        I absolutely agree that Polaris will use GDDR5, AMD even confirmed that, so that’s not really anything to be surprised about.

        There is absolutely nothing wrong with using GDDR5, but the hype train was so strong with Pascal that the claimed “miracle” GPU from Nvidia wouldn't dare use GDDR5, according to the fanboys.

        I’d love the world to turn on end and all technology advances to spew forth all at once, I just like technology to be based a little more on science than faith.

          • ImSpartacus
          • 4 years ago

          Do you have the confirmation that Polaris is gddr5-only?

          I was sorta hoping that Polaris 10 would get gddr5x.

            • Tirk
            • 4 years ago

            Later models might, but if it's releasing in June it is not. The earliest ramp-up of GDDR5X production is this summer; if anything comes out with GDDR5X this year, it'll be at the end. No company can produce a product that has a component that hasn't been produced yet.

            But hey if they prove me wrong I’ll be glad 🙂

      • bfar
      • 4 years ago

      That's a fair point, but if it means prices won't go through the roof, I won't get overexcited about it.

      • tviceman
      • 4 years ago

      Nvidia may opt for risk production, lower speed GDDR5X (10gbps) with “first gen” Pascal parts. That may line up just right for a July/August launch. Micron has been sampling GDDR5X and announced months ago they were hitting 10gbps, on pace for 14 by the fall.

        • Tirk
        • 4 years ago

        That doesn't sound like it'd be worth it, but if they convinced enough people to buy it at an exorbitant price, I'm sure loyalists would fall for it hook, line, and sinker.

        Maybe Pascal is massively memory constrained and it's worth the risk, but if it is, wouldn't you want to wait for it to be paired with a more robust memory?

      • cygnus1
      • 4 years ago

      You don’t think we’ll see it launched with HBM2 like on the Tesla P100 version?

        • Tirk
        • 4 years ago

        That would be a nice surprise but I doubt it, although they did say Pascal was made from “miracles” so who knows 😉

        • nanoflower
        • 4 years ago

        The Titan versions of Pascal will probably get HBM2, but I'm not sure any other version will. It's likely that we won't see the Pascal Titan versions until the end of the year (along with AMD's Vega competition). Remember, it seems that HBM2 production is going to be tied up with HPC and Deep Learning products till after summer, so any products based on HBM2 are delayed till the fall. Also, it's unclear if HBM2 would be needed for anything other than the flagship products.

      • Klimax
      • 4 years ago

      It seems that GDDR5X has actually been in full production for some time. (It's not a big change from the original GDDR5, so it shouldn't be too hard to ramp up.)

        • NTMBK
        • 4 years ago

        Interesting, I hadn’t seen that. Any links?

          • Klimax
          • 4 years ago

          Guru3D:
          [url<]http://www.guru3d.com/news-story/rumor-amd-radeon-490-and-490x-polaris-graphics-cards-launch-end-of-june.html[/url<]

          But as far as I can say, nobody really knows outside of those who work with Micron directly. Given the changes described in the Anandtech article, I would think it shouldn't be too hard to ramp up.

            • Tirk
            • 4 years ago

            That seems completely counter to what Micron itself says about its GDDR5X ramp up time table:

            [url<]https://www.micron.com/about/blogs/2016/february/gddr5x-has-arrived[/url<]

            The link specifically states mass production is to begin in the summer. The only things they say are being produced now are components that are working and close to full performance, which is far from full production. Now, AMD and Nvidia could use these early chips, but that would seem to be a highly risky proposition with very little return rather than waiting for the fully performing chips.

            • Tirk
            • 4 years ago

            And here’s another Micron link confirming GDDR5X is not in full production yet:

            [url<]https://www.micron.com/products/dram/gddr/8Gb#/productname%5B%5D=GDDR5X[/url<]

            Note it mentions sampling as availability. Compare this to GDDR5, where all parts list production as availability:

            [url<]https://www.micron.com/products/dram/gddr/8Gb#/productname%5B%5D=GDDR5[/url<]

            • Ninjitsu
            • 4 years ago

            Well it seems they’ll hit production this summer, so if Pascal is announced in June then it’ll probably ship in July/August – should give AIB partners enough time to add in GDDR5X.

            [quote<] This past week Micron has quietly added its GDDR5X memory chips to its product catalogue and revealed that the DRAM devices are currently sampling to partners. The company also disclosed specifications of the chips they currently ship to allies and which [b<]potentially will be mass-produced later this summer.[/b<] As it appears, the first samples, though running at much higher data rates than GDDR5, will not be reaching the maximum data rates initially laid out in the GDDR5X specification. [/quote<]

            [url<]http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory[/url<]

            • Tirk
            • 4 years ago

            That's some fast turnaround based on a lot of ifs. GDDR5X production could ramp up in August and still technically meet Micron's summer production mark. That would push Pascal's release towards the end of the year, which could happen……… and I'd rather trust Micron's statements on the matter vs. Anandtech's. Who knows, maybe they'll dupe consumers and use the non-full-speed GDDR5X at 10Gbps. I mean, it'd be silly to just use GDDR5 at 8Gbps that's already in production, right?

            Still sounds like the better bet is that cards coming out this year will have GDDR5.

      • jihadjoe
      • 4 years ago

      Well now we know exactly what’s going to come with the refresh.

      • bfar
      • 4 years ago

      I know there are a few guys here interested in compute, and they have particular needs. But given that it’s the gaming orientated GP104 we’re looking at here, the rest of us will probably be happy with a good power efficient GPU that performs favorably against last gen and Polaris and isn’t going to be obscenely priced. If good old GDDR5 allows for that, then so be it.

    • yogibbear
    • 4 years ago

    [quote<]The same industry sources also point out that the graphics card market remains in weak demand, and that most manufacturers are bracing for a 10% drop in shipments in the second quarter of 2016[/quote<]

    Isn't this just because everyone is waiting on Pascal to drop? It's already overdue by 1-2 months based on previous release cycles, and we don't even have a date yet. Why would anyone buy a graphics card at the present moment when everyone knows these new cards (or at least a date) are coming soon? Either heavy discounts on existing cards will come into play, or you wait.

      • Ninjitsu
      • 4 years ago

      [quote<]so either heavy discounts on existing cards will come in to play[/quote<]

      Well the article does mention inventory clearing, so hopefully the prices will drop. I wouldn't mind a 960 on the cheap, for example.

    • Deanjo
    • 4 years ago

    /me waits for pascal Titan

      • bfar
      • 4 years ago

      If you’re already running anything based on Fiji or GM200, I’d actually agree. If you’re running anything older, you’ve either been waiting a long time already, or you probably don’t care for expensive GPUs.

        • cygnus1
        • 4 years ago

        Yeah, I'm in that camp of waiting a long time. My dual GTX 760's (not intentionally dual, got a 2nd one for free due to a shipping error) are still holding up ok. But I do want to snag one of the 34″ 21:9 VRR monitors, and I'm thinking a newer GPU would be needed to do it justice. I'm also really hoping that nVidia embraces FreeSync and makes it a 2nd option alongside G-Sync. Considering they've already supported the eDP standard in mobile, I bet the hardware will support it; they just need to enable it in drivers.

        • Deanjo
        • 4 years ago

        Your hypothesis is actually quite wrong. I do not game; I use them for computing performance, and when it comes to FP64, nothing has been released yet that can replace my current original Titans.

          • anotherengineer
          • 4 years ago

          Pfffff Deanjo, don’t like doing fast Fourier transforms longhand?? 😉

            • Deanjo
            • 4 years ago

            They just take too damn long on the slide rule.

            • anotherengineer
            • 4 years ago

            lol

            True

            Need a wiki link for the young’ins who have no clue what you’re talking about 😉

            [url<]https://en.wikipedia.org/wiki/Slide_rule[/url<]

            • Deanjo
            • 4 years ago

            I’m not kidding!!!! 😛

            [url<]http://s23.postimg.org/quafdzui3/IMG_2749.jpg[/url<]

            (I seriously do still use a slide rule. I keep a few around (in the shop, the toolbox, the combine, and the tractor); the batteries never run out on them.) The other item I still use on a regular basis is the Pocket Ref.

            [url<]https://www.amazon.ca/Pocket-Reference-Edition-Thomas-Glover/dp/1885071620[/url<]

            • anotherengineer
            • 4 years ago

            lol nice

            Going on 40 and I was not subjected to the slide rule in school, dad had a few and kinda showed me how to use it when I was a kid but cheap calculators were out by then so……….

            Pocket ref is nice, slapping together an amazon order now, might throw one in the cart.
            Thanks

            • Deanjo
            • 4 years ago

            Calculators were common when I was young as well (I'm only 43, but the calculators took a crapload of batteries). My father used to use one when I was knee high and I just picked it up from him. In fact, I remember when the school got its first Apple II: one of the educational programs it had was how to use a slide rule and read callipers.

            I think you’ll like the pocket ref. Always found it handy (especially when you are out of internet range).

            • anotherengineer
            • 4 years ago

            Calipers, learned how to read those from a young age, dad was a machinist. I remember he used to have a Starrett measuring tape graduated in 64ths whole way (only about a 10ft tape) and I had to help him with carpentry and he used that tape. God it was a happy day when it died lol.

            • Deanjo
            • 4 years ago

            Ya I was pretty young too learning that stuff. Dad was a construction/IA teacher and farmer. During the downtime we were always either building some fine cabinetry for someone or putting together a house/garage. His father was also a farmer but skilled in carpentry. The cupboards that he built on the old farm house are amazing. Perfectly straight cuts with a handsaw.

            • JustAnEngineer
            • 4 years ago

            I’ve still got an HP calculator in my desk, but these days, there’s an app for that, and my phone is always in my pocket.
            [url<]https://play.google.com/store/apps/details?id=uk.co.nickfines.RealCalcPlus[/url<]

          • bfar
          • 4 years ago

          That’s a very fair point, and I’ve been looking at it through the lens of a gamer. Those of you using your GPUs for compute obviously have particular needs. We can assume GP104 isn’t of much interest to you guys.

        • Klimax
        • 4 years ago

        Don't know about Deanjo, but I am still on the original Titan. The primary reason for going so long without an upgrade was a 17" monitor; now I have a 4K Dell.

          • Deanjo
          • 4 years ago

          Still using the Titan with tri-Asus PB278Q’s. It will be a while until I upgrade to tri-4K displays.

      • the
      • 4 years ago

      I wouldn’t expect that before 2017 and it may not even arrive until 2018.

      • Klimax
      • 4 years ago

      Same here. Maybe I'll pick up a 980 Ti from eBay in the meantime. (After Pascal, their prices should get quite a nice cut…)

    • bfar
    • 4 years ago

    So GP104 and Polaris won’t be monster chips, they’re releasing within a couple of months of each other, and GPU vendors need to deal with a fall in shipments?

    Call me optimistic, but does it sound like this summer will be a good time to buy a decent GPU?

      • DPete27
      • 4 years ago

      [url=http://www.fudzilla.com/news/graphics/40412-polaris-in-june<]Polaris 10 is rumored to launch in June also.[/url<]

        • ronch
        • 4 years ago

        I don’t think both being released so close to each other is purely coincidental. Both companies surely keep their ears to the ground to listen to what the other company is doing. Better yet, both probably employ spies that ‘work’ for the other company.

          • bfar
          • 4 years ago

          No coincidence at all!

          With a major die shrink in the pipe, neither vendor could ever afford to let the other out of the traps first. And the silence from both camps is encouraging. Holding the cards (no pun intended) so close to the chest usually suggests that neither is confident of an outright victory. I really hope these two chips are close contenders, because it could mean the kind of competitive prices that we haven’t seen in a couple of generations.

        • Leader952
        • 4 years ago

        [quote<]Why did AMD slip Polaris and Vega? AMD is slipping their GPU roadmap by a few quarters[/quote<]

        [url<]http://semiaccurate.com/2016/03/22/why-did-amd-slip-polaris-and-vega[/url<]

        • _ppi
        • 4 years ago

        Wasn’t it pretty much announced by AMD already?

      • ImSpartacus
      • 4 years ago

      It might be an ok time to buy an old gpu.

      But there really isn’t ever a good time to buy a brand new gpu.

      I consider the post-mining Hawaii gpus to be a good deal. There are other good deals on retired gpus.

      Nvidia and AMD are smart. They don't let you have your cake and eat it too. This year is just a glorified shrink of what we had last year. I know that there are interviews and presentations saying that Polaris is an architectural redesign and Pascal is a big deal.

      However, Polaris needs to leave some in the tank so vega can provide a fair jump in perf/watt in early 2017 (hbm won’t be enough to get the increase that their roadmaps claim). Pascal is coming on the heels of Maxwell, which was literally just an architectural redesign to wring every last ounce of efficiency out. And considering how Maxwell is pretty fantastic, I doubt there’s a lot of low hanging fruit left for 2016’s Pascal.

      We’ve all forgotten that moving to a new process isn’t a cake walk. We can’t get our hopes up.

        • bfar
        • 4 years ago

        Some of the current AMD cards might be worth considering, as their driver optimizations have carried over to the older GCN parts.

        I’d think a bit harder about buying a current Nvidia card though. Maxwell is highly driver dependent. Based on my own experience with Kepler, I wouldn’t be surprised if Maxwell quickly becomes a second class citizen in the driver optimization stakes.

        • JustAnEngineer
        • 4 years ago

        [quote<] But there really isn't ever a good time to buy a brand new gpu. [/quote<]

        GeForce 256
        Radeon 9700Pro
        GeForce 8800GT
        Radeon R9-290

        New architecture designs (and new fabrication process nodes) provide new GPUs that can hold their value for a long time.

      • bwcbiz
      • 4 years ago

      Maybe, maybe not. Part of the reason for the drop is that somebody's been profiteering ever since the holiday season. Over the holiday shopping season there were several offers like GTX 970s for $290 with no rebate, whereas they haven't dropped below $300-310 since. And this is despite the fact that falling fuel prices and a strong US dollar would tend to lower costs.

      Plus there’s a good chance that the new cards are going to bring VR capability down into much lower price ranges. If you have an interest in that, you might do better with the new generation than the old, at least after Red and Green both have their cards on the table.

        • bfar
        • 4 years ago

        I'm probably wrong, but I perceived 28nm to be an expensive node from the perspective of flagships, and the great bitcoin mining craze that preceded it didn't exactly help with prices. It drove up the cost of AMD cards, which had previously kept prices pretty competitive.

        I hold out a sad hope that we’ll get a generation of chips that are a little easier on the wallet. The fact that AMD is releasing simultaneously with Nvidia’s Pascal gives me a smidgen of forlorn hope….

    • tanker27
    • 4 years ago

    Dear Big Boss Jen-Hsun Huang,

    While I love my 970, please for the love of God do NOT give us a wonky memory setup with Pascal as seen in the 970.

    Sincerely,

    Concerned

      • Kretschmer
      • 4 years ago

      Why? How many 10s of people globally ran into issues with the 970’s memory setup?

        • Devils41
        • 4 years ago

        I have a 970 and play games at 1440p. I have yet to hit that 3.5GB wall.

          • ImSpartacus
          • 4 years ago

          That's because 4GB is enough to go all the way up to 4K. 3.5GB is probably plenty for most 1440p gaming.

            • Devils41
            • 4 years ago

            Which the 970 was never touted as a 4K card. Which is why I didn't throw a fit over the .5GB of slower VRAM. I agree it is enough, but I wouldn't say plenty; VRAM mileage will vary depending on the game, AA, and graphics settings. I'm using around 3.3GB in most games, like The Division and GTA V.

          • Jigar
          • 4 years ago

          That's because your driver will not let you hit that wall. Once Pascal comes, let me know how things go. Nvidia has a very bad habit of leaving their previous products in limbo.

        • Voldenuit
        • 4 years ago

        [quote<]Why? How many 10s of people globally ran into issues with the 970's memory setup?[/quote<]

        7.5 people, actually :p.

        On a serious note, XCOM 2 with high res textures and a couple of fanmade texture mods loads my GTX 970 to 3.6 GB in a few areas at 1440p, although I'm led to believe that nvidia likes to keep the driver overhead in the upper memory range to deal with the weird memory split of the 970, so the game might not actually be storing much, if anything, in the upper 512 MB. I haven't seen drastic frame rate spikes when memory use goes past 3.5 GB, anyway.
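        For what it's worth, the 3.5 GB crossover is easy to watch from outside the game. Here's a rough Linux-side sketch using NVML (the monitoring library that ships with the NVIDIA driver; link with -lnvidia-ml). It's just an illustration of the idea, not a polished tool:

        [code<]
        #include <stdio.h>
        #include <unistd.h>
        #include <nvml.h>

        int main(void)
        {
            if (nvmlInit() != NVML_SUCCESS)
                return 1;

            nvmlDevice_t dev;
            if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) {
                nvmlShutdown();
                return 1;
            }

            /* Sample total VRAM use once a second; flag samples past the 970's
               fast 3.5 GB partition. */
            for (int i = 0; i < 300; ++i) {
                nvmlMemory_t mem;
                if (nvmlDeviceGetMemoryInfo(dev, &mem) == NVML_SUCCESS) {
                    double used_gib = mem.used / (1024.0 * 1024.0 * 1024.0);
                    printf("VRAM used: %.2f GiB%s\n", used_gib,
                           used_gib > 3.5 ? "  <- past the fast partition" : "");
                }
                sleep(1);
            }

            nvmlShutdown();
            return 0;
        }
        [/code<]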

      • sparkman
      • 4 years ago

      Your interest in internal architectural details is commendable, but the memory architecture you describe has little impact on the users.

        • _ppi
        • 4 years ago

        Users perhaps, but clearly nVidia's driver devs have to, and for the foreseeable future DX12 game devs will have to as well.

        When will they stop caring? The moment devs forget about the 3.5+0.5 split, 970 users will suffer, a lot.

        I bought 970 in December, btw.

      • torquer
      • 4 years ago

      Tempest in a teacup. Worry more about how bad their drivers have gotten lately.

        • tanker27
        • 4 years ago

          Curious, why do you say their drivers have gotten bad?

          • DPete27
          • 4 years ago

            There have been quite a few Nvidia driver revisions as of late that have gotten reports of bricking/overheating/killing GPUs. The most recent was only a month ago IIRC.

            • tanker27
            • 4 years ago

            LOL. I never upgrade right away. I learned this lesson decades ago with the 3dfx Banshee.

            /facepalm

      • gamerk2
      • 4 years ago

      So…you’d rather they decreased the RAM onboard to just 3GB?

    • tipoo
    • 4 years ago

    I’d be surprised if they don’t rethink the naming, it makes it sound like a 1080p targeting card to anyone who doesn’t follow the market.

      • DrCR
      • 4 years ago

      1080 4K Edition

      • VinnyC
      • 4 years ago

      Last time they got up to 9800GTX they circled back with the next wave and came out with GTX280. We’re at the tipping point again. I’m thinking it’ll be a G2T8X0

    • ronch
    • 4 years ago

    Whether Polaris or Pascal comes out first, it would be prudent to wait for the other card to be released before buying. Myself, I’d probably get Polaris if it offers much better performance and perf/watt than my HD7770. Perhaps a Polaris variant that’s 2x-3x faster with no aux power? Yes, please!

      • DPete27
      • 4 years ago

      Hey, good news: those new GTX950s w/o 6-pin power are about 2.3x faster than your HD7770.

      References:

      [url=https://techreport.com/review/24562/nvidia-geforce-gtx-650-ti-boost-graphics-card-reviewed/10<]GTX750Ti/7790 about 47% faster than 7770[/url<]
      [url=https://techreport.com/review/29061/nvidia-geforce-gtx-950-graphics-card-reviewed/11<]GTX950 about 58% faster than GTX750Ti[/url<]

      I don't expect the 6-pinless GTX950s to be replaced anytime soon. AMD/Nvidia are likely to reserve their 16nm yields for the higher-margin SKUs, at least through the remainder of 2016. That's my bet.

        • ImSpartacus
        • 4 years ago

        Yeah, I can’t imagine Nvidia doing this kind of revision to gm206 and then trashing it three months later.

        • NovusBogus
        • 4 years ago

        That chart makes my 650 Ti a very sad panda. Not sure I’m going to last until Pascal, lolz.

          • nanoflower
          • 4 years ago

          I’m in the same boat. I have a 650 TI and would like to upgrade but with Polaris/Pascal so close I’m holding off till it becomes clear if they will be released this summer or held off till the fall.

            • NovusBogus
            • 4 years ago

            My original plan was to stick it out since the vast majority of games I play are fine on a basic GPU, unfortunately the list of interesting games I can’t run just went from one to three. Probably going to get a $150 stopgap and upgrade to a performance card sometime next year when both platforms are more mature, though I’m on the fence about whether 950/960/380/380X makes the most sense.

            • nanoflower
            • 4 years ago

            Depends on what you are trying to play. The 950 is going to be the cheapest, but it will have problems running a number of AAA games at 60fps unless you turn down a number of the graphics options. The 380X is where I've been leaning, as it plays most games fine at 60FPS/1080P with the graphics options enabled. My problem is that the main game I want to play is The Witcher 3, and even the 380X will suffer slowdowns unless you turn down the graphics options, so I'm waiting to see what Nvidia/AMD will offer this summer in the $200-300 range.

        • ronch
        • 4 years ago

        I think AMD recently gave a demo of Polaris 11 (1024 stream processors) which showed it outperforming/matching a GTX 950 at less than half the power consumption. I take it with a grain of salt, though, because I read about it at WCCF. Still, if true, I'll get it instead of the 950 if Nvidia doesn't come up with a Pascal-based product along these lines and if I feel like tossing my graphics card.

          • nanoflower
          • 4 years ago

          Oh it's true, as many people were there and saw it. However, we don't know what graphics options were enabled/disabled and we don't know where that particular model sits. Was that the best performance the card could deliver, or did they turn the performance down to lower power usage? Is that the high end of the Polaris 11 range or the low end of the discrete Polaris 11 cards?

          I feel certain that both Nvidia and AMD will be competing across the performance range this year but it’s unclear if they will both have something in the low to mid range discrete GPUs this summer or if we will have to wait till later in the year.

    • crabjokeman
    • 4 years ago

    Nvidia, please give us a low-end/low-priced, passively-cooled Pascal card for use in HTPC and light gaming applications. No more of this rebadged Kepler crap…

    Thanks

    • chuckula
    • 4 years ago

    I doubt it.
    Not because Nvidia won’t at least paper-launch some sort of consumer-grade Pascal, but because I’m guessing it’s time for a new naming scheme.

      • The Egg
      • 4 years ago

      [quote=”chuckula”<]I'm guessing it's time for a new naming scheme[/quote<]

      Um....no. It's time to grab everyone involved in marketing for GPUs [i<]ever[/i<] and give them all swirlies.

        • chuckula
        • 4 years ago

        DID YOU JUST SAY WE SHOULD GIVE WASSON A SWIRLIE?!???!?!?

          • The Egg
          • 4 years ago

          Is he in marketing? I’ll give him a pass this time because he’s new….but let him watch, and if the 7950 gets rebranded again all bets are off. 😉

      • willmore
      • 4 years ago

      They will keep the naming scheme and release low end cards. It's April, that's a perfect time to release the 1040. It would be very EZ.

        • ImSpartacus
        • 4 years ago

        I ducking died. You win this thread. Bravo.

          • willmore
          • 4 years ago

          I thought NTMBK’s “Srsly no” was awesome.

            • Srsly_Bro
            • 4 years ago

            I did too. Perhaps he’s a marketer. And you know what we do with marketers???

        • BurntMyBacon
        • 4 years ago

        Nice.

        Does anyone else get the feeling that this will be a lot like the 600 series rollout? The high end consumer part (GTX 680) was a GK104 part. They didn't release a "Big" Kepler part until the 700 series. The GP104 may very well be used for their x80 part this time around. I just hope they reserve the Ti suffix for "Big" parts (i.e. GP100/GP110) on the x80 cards.

      • superjawes
      • 4 years ago

      Hey, I’m sure a GTX 1080 would be buttery smooth at 1080p!

        • BurntMyBacon
        • 4 years ago

        They should release a GTX 1080 Platinum. We could all refer to it as the 1080P. ; ‘ )

    • Visigoth
    • 4 years ago

    Trust me, DOOM will take care of the graphics sales business.

      • tanker27
      • 4 years ago

      Don’t know why the down votes but I remember a time when Id offerings did influence graphics card sales. Not so much now though. 😉

        • bfar
        • 4 years ago

        When they put the words DLC and DOOM in the same sentence, I feel like a part of our youth has just been dumped on.

          • PixelArmy
          • 4 years ago

          DOOM 1 had Ultimate DOOM and DOOM 2 had Master Levels…

            • jihadjoe
            • 4 years ago

            Other Doom engine games were fun too. I had lots of fun with Heretic and Hexen which were basically Medieval Doom.

            • Firestarter
            • 4 years ago

            the Tome of Power was a lot of fun

            • Srsly_Bro
            • 4 years ago

            -ultimate master levels

            • Klimax
            • 4 years ago

            Ultimate Doom was free (through several patches to original version)

            • PixelArmy
            • 4 years ago

            DLC can be free. You got the patches via download (DL). And a 4th episode is certainly more [u<]C[/u<]ontent.

            • bthylafh
            • 4 years ago

            Ultimate Doom was a free download for people that already owned Doom, IIRC.

        • albundy
        • 4 years ago

        cuz doom3 sucked. good thing i tried before i buyed. playing a game in pitch black is not fun, it's annoying.

      • ronch
      • 4 years ago

      And if it doesn’t, it’s DOOM and GLOOM!!!!

        • derFunkenstein
        • 4 years ago

        Oh, man. GLDOOM is awesome.

      • Firestarter
      • 4 years ago

      I played some and I disagree. The game is not [i<]that[/i<] good and my current GPU manages it fairly well

      • Jigar
      • 4 years ago

      I remember building an Athlon 64-based system just to play Doom 3.

      • Krogoth
      • 4 years ago

      [url<]https://youtu.be/V3E-NGejxE4[/url<]

    • Srsly_Bro
    • 4 years ago

    Anyone care to hear about my 7950?

      • ronch
      • 4 years ago

      Yes, it’s time to replace it given what you’ve told us about it.

      • NTMBK
      • 4 years ago

      Srsly no

        • Srsly_Bro
        • 4 years ago

        Lol

      • Firestarter
      • 4 years ago

      you messed it up and now every 7950 is trash, I get it
