GeForce 378.78 drivers supercharge DirectX 12 and Vulkan

The GeForce GTX 1080 Ti is out, and that means you need a driver that knows what the heck it is. Fortunately, Nvidia just released GeForce driver version 378.78, and it supports the "Ultimate GeForce." It's also Game Ready for Ghost Recon: Wildlands, something that could help you keep performance up with all those fancy GameWorks effects the game can use. Neither of those items is the most interesting part of the new driver, though. Nvidia says the 378.78 driver has the potential to improve performance by as much as 33% in DirectX 12 and 50% in Vulkan.

The emergence of these two low-level graphics APIs has been fraught with contentious discussion. Rival AMD's Radeon GPUs saw much greater improvements than Nvidia's GeForces when running games that use the new APIs. To some folks, this wasn't all that surprising, since Nvidia already had a tightly-optimized DirectX 11 driver, and many believed that further optimization would yield little. As it turns out, Nvidia's software engineers have shown that there are still performance gains on tap.

The new raft of optimizations nets small performance uplifts in DirectX 12 mode for Ashes of the Singularity, The Division, and Gears of War 4. Meanwhile, gamers playing Hitman and Rise of the Tomb Raider should see massive gains. As for Vulkan, it's not exactly clear what game or graphics card Nvidia tested, but the company claims that from June 2016 to now, it has improved Vulkan performance by roughly 50%—making it about 33% faster than OpenGL. We didn't quite see such an uplift in Doom, but it's nice to see the green team paying attention to open standards.
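
Those two percentages compound in a way that's easy to misread, so here's the arithmetic spelled out with an arbitrary baseline (the 100-fps figure is ours, purely for illustration, not anything Nvidia published):

```python
# Illustrative arithmetic only: pick an arbitrary OpenGL baseline and see
# what Nvidia's claimed percentages imply about Vulkan then and now.
opengl_fps = 100.0                    # hypothetical OpenGL baseline
vulkan_now = opengl_fps * 1.33        # "about 33% faster than OpenGL"
vulkan_june_2016 = vulkan_now / 1.50  # undo the "roughly 50%" improvement

print(f"Vulkan now:       {vulkan_now:.1f} fps")
print(f"Vulkan June 2016: {vulkan_june_2016:.1f} fps")
# Taken together, the claims imply Vulkan was slower than OpenGL back then.
```

In other words, taken at face value, the two figures are consistent only if Nvidia's Vulkan path started out roughly 11% behind OpenGL back in mid-2016.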

Besides the performance improvements and new product support, the 378.78 driver also fixes some bugs. GeForce Experience's FPS counter should no longer appear in Discord. Notebooks with GTX 1050 Ti cards should stop blue-screening so much. The long-standing "Optimized for Compute Performance" setting that was forcing users to sacrifice game performance if they wanted to run PhysX in hardware should now be fixed, at least on GTX 980 Ti cards. Steam should be able to encode streams using NVENC again, too.

Not every bug has been squashed, though. Don't open Ansel in Ghost Recon: Wildlands if you have FXAA enabled, or the game will crash. For Honor still crashes if you skip the intro cutscene with Shadowplay enabled. Less severe, but Civilization VI might suffer video corruption if you skip its intro cutscene. Quantum Break may crash in windowed mode on GM204-based cards. Finally, make sure you do a clean install of these drivers; Nvidia notes that installing them over a previous version may fail.

Altogether, we're looking forward to seeing the fruits of Nvidia's driver team's labors. GeForce Experience users probably already have the new driver. The rest of us can do things the old-fashioned way at Nvidia's download page.

Comments closed
    • tay
    • 3 years ago

    Did doomguy64 find a link over from wccftech.com or something? That comment section is absolute fanboy garbage

      • DoomGuy64
      • 3 years ago

      No, Anandtech which is 100% true. Read the article. They even proved it with a synthetic benchmark. Not a 48 ROP card in practice, only on paper.

      The comment section is the way it is, because TR is completely infested with koolaid drinking Nvidia fanboys. You can’t say anything against the precious Nvidia here without activating the wrath of the hive.

      I figured I’d just mention the 1060, since the fanboys really hate it when you bring something up that hits home, and anyone who isn’t one of them will realize how bad the infestation is here after I stirred the pot.

      This is why I asked them why they are defending Nvidia with such obvious semantic troll arguments. It doesn’t help their cause in the slightest to deny basic facts in such a hysterical manner. I may not change their minds, but everyone else is going to see how fanatical they are.

        • tay
        • 3 years ago

        OK, thanks for the link and reply. But according to AT, from [url<]http://www.anandtech.com/show/10540/the-geforce-gtx-1060-founders-edition-asus-strix-review[/url<] the ROP count being higher than the pixel rasterization count has more to do with the fact that they went with a 192-bit interface and so dragged the entire memory subsystem (L2, ROP, controller) along. These come in 32-bit-wide chunks, they needed six, and ended up with 48 ROPs. It doesn't negatively affect anything, so why even bother mentioning it? The card is more efficient than an RX 480 by a good amount. (I own a cheap RX 470.)
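
The memory-subsystem arithmetic in that comment can be sketched in a few lines; the ROPs-per-partition figure below is an assumption about Pascal's layout, not something stated in the comment:

```python
# Sketch of the memory-subsystem arithmetic described in the comment above.
# The ROPs-per-partition figure is an assumption about Pascal's layout.
bus_width_bits = 192          # GTX 1060's memory interface
controller_width_bits = 32    # each memory controller is 32 bits wide
rops_per_partition = 8        # assumed ROPs tied to each controller

partitions = bus_width_bits // controller_width_bits
total_rops = partitions * rops_per_partition
print(partitions, total_rops)  # 6 partitions, 48 ROPs
```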

          • DoomGuy64
          • 3 years ago

          The point is false advertising, and lies by omission. The card is not “better” than a 480 either, as it trades blows depending on the game.

          Both the 480 and 1060 rasterize 32 pixels per clock. Those extra ROPs don’t do a damn thing for performance, and instead serve to hoodwink the masses, being a fake spec.

          Sure, making a 1:1 ROP ratio doesn’t work for the 1060. That’s not the point. The point is stop pretending those extra ROPs mean anything, because they damn well don’t. You’re getting the performance of a 32 ROP card, not a 48 ROP card.

          This is especially egregious for Nvidia, because they have made REAL 48 ROP cards in the past. Not owning up to this deficiency is class action lawsuit false advertisement material. Nvidia already got sued for the 970, so the precedent is there. The 1060 is even worse than the 970, since unlike the 970, the 1060 doesn’t get “Slow access” to the extra ROPs. You get 32 pixels/clock, same as the 480.

          Fake spec is Fake. Looking forward to the next lawsuit.

    • NTMBK
    • 3 years ago

    Yay for free performance boosts 🙂

    • Ninjitsu
    • 3 years ago

    Would be cool if you could post a convenient link to the driver release notes too, in driver update articles!

    (I know it’s a short google search away, but you know, for the lazy ones 😛 )

    • wingless
    • 3 years ago

    But do they cripple KEPLER?!

      • stefem
      • 3 years ago

      Nice tongue-twister, I’m trying to repeat “cripple KEPLER” 10 times in a row and haven’t yet succeeded 🙂

    • DPete27
    • 3 years ago

    “GeForce 378.79 drivers introduce FreeSync support”

      • K-L-Waster
      • 3 years ago

      Why would they? Not supporting it doesn’t seem to be affecting NVidia’s bottom line any based on their most recent results.

      I know lots of gerbils *want* them to, but until there’s a financial reason for them to do so I don’t see it happening.

        • cynan
        • 3 years ago

        I’d wager it’s only a matter of time before there is a financial incentive to support FreeSync. Heck, there could even be now (fueling RX 480 over gtx 1060 sales). Now that we are seeing some half decent FreeSync monitors (and not only gimped displays compared to G-Sync counterparts), and if Vega is a compelling product giving Nvidia competition in the higher end segment (where most people who care about frame syncing reside) it’s going to be getting increasingly disadvantageous for Nvidia to cling to their G-Sync-only stance.

          • LostCat
          • 3 years ago

          I’m personally imagining the upcoming HDMI VRR stuff taking over from both G-Sync and Freesync.

          • VincentHanna
          • 3 years ago

          I’ll take that wager.

          As long as Nvidia is capable of putting out low latency, high FPS signals, G-sync/Freesync is a rather irrelevant checkbox. The longer it takes for anyone to turn the screws to them, the harder it will be. Assuming it takes just 2 generations, and assuming that the 1080ti is approx as powerful as the 1180 and the 1270 after that, odds are most people won’t have any use for freesync.

          As for VEGA, if AMD had anything there, they would have presented it at their launch party. They missed that window, and if they wait much longer, they aren’t even going to be competing against pascal anymore, they’ll be competing against Volta later this year, and who knows where Vega falls on that scale.

    • Neutronbeam
    • 3 years ago

    It’s disappointing to see that some of these optimizations are still sub-optimal.

    • MagariNegi
    • 3 years ago

    Is it true Nvidia uses their drivers to collect your system information for marketing?

      • Goty
      • 3 years ago

      Probably? Maybe? Read your EULA.

        • MagariNegi
        • 3 years ago

        I don’t have a Nvidia card but it was from a comment on an Ars Technica article. Just wondering if anyone here knows about it and what they think. Somebody on that article copy and pasted the supposed EULA language and I was surprised to see the marketing info on a graphics driver EULA. If the commenter was accurate or not I do not know.

        • krazyredboy
        • 3 years ago

        I wish to become part of the human CentNvidia… I agree to the terms.

      • Kretschmer
      • 3 years ago

      They are totally going to open up a checking account with your HDD size and MAC address.

        • derFunkenstein
        • 3 years ago

        As long as they make deposits, too, I’m down.

      • I.S.T.
      • 3 years ago

      They collect system data for driver development purposes. It’s legitimate. There are ways to block it if you don’t want this; just web search it.

        • MagariNegi
        • 3 years ago

        I just thought it was strange the EULA would say they will use your hardware info for marketing purposes with 3rd parties. I understand for driver development they would want your hardware and software info, that makes sense. If it is true anyway.

          • thedosbox
          • 3 years ago

          I believe they aggregate the data – i.e., how many computers have which type of card, are running what OS, use which CPU, etc. They don’t pass on individual user data.

          Having said that, you will be asked for an email address in order to use geforce experience, though that is fortunately an optional install.

            • stefem
            • 3 years ago

            They share that aggregate data only with developers; all the marketing and advertising language came from the website EULA which, of course, covers only the website.

          • I.S.T.
          • 3 years ago

          Honestly? A lot of stuff in EULAs is to cover their asses [i<]in case[/i<] they do something. Often they do not do these things.

            • MagariNegi
            • 3 years ago

            You are probably right. I imagine they throw in all the CYA language in everything these days.

            • egon
            • 3 years ago

            Which is to say we’re expected to agree to the things we hope they won’t do, in order to protect their interests. The notion of privacy policies protecting the customer has been turned on its head in recent years.

            For example, under Western Digital’s old policy, information provided to them for an RMA could not be used for marketing purposes. That’s no longer true – they have a far more vague, basically all-permissive policy.

            • I.S.T.
            • 3 years ago

            I did not say I was a fan of this. Just that it was a thing and that for now you’d be fine.

          • stefem
          • 3 years ago

          They do not. Some really “smart” editor put NVIDIA’s website EULA in one of their “articles” to prove GFE was spying on users…
          They look at system specs and may share aggregate information (e.g., how widespread a certain GPU, CPU, OS, or amount of RAM is) with developers. It is clearly described in the GFE license, provided you are smart enough to click on License Agreement inside GFE and not on the website.

      • MagariNegi
      • 3 years ago

      I’m not trying to create any anti Nvidia statements, I figured Nvidia owners with good knowledge would be reading this article and would know more about this and set the record straight. I have an AMD card and it is possible that AMD does the same thing. I’ve got Windows 10 so I’m not a conspiracy theorist and know marketing data is most likely being pulled from me. Just thought it would be weird to have that language in a driver EULA, if it is indeed accurate.

    • I.S.T.
    • 3 years ago

    Two comments.

    1. What the hell kind of settings are Nvidia using for these tests?

    2. And more importantly, what are they gonna **** up this time? They’ve been regularly screwing up since September or October for some reason, probably management (the source of most screw-ups in large-as-hell code bases). I have been a steadfast Nvidia user since the 8xxx era (got a nearly four-year-old GTX 660), and I can’t trust ’em right now.

    It’s one thing to have one driver here and there that somehow screws things up big time. It happens. But, five to six months of problems? This is AMD levels of bad here… And AMD stepped up their game last year and stopped doing that so much.

      • USAFTW
      • 3 years ago

      Best to wait and have a look around reddit or geforce forums to see if there are any major issues with the drivers.

        • I.S.T.
        • 3 years ago

        That’s the idea. I haven’t upgraded in months because of this(At this point given my card is old enough to where games built around the PS4/Xbone don’t work well…I upgrade mostly for potential bug fixes).

      • slate0
      • 3 years ago

      I figured it had to do with pulling in new tech or DX12. Drivers had been stable and games had had console level graphics for a long time, but then…

      • Chrispy_
      • 3 years ago

      Aye. I’ve had [i<]at least[/i<] one Nvidia GPU in my personal machines since the Geforce 2 days and their drivers have been worse in the last 6 months than any other period in their history (and there have been some pretty severe black marks in their history too).

    • DoomGuy64
    • 3 years ago

    Good, about time. Otherwise their price gouging is inexcusable to anyone who isn’t a koolaid drinking fanboy. Not to say it’s justifiable now, or ever was, especially with all the vendor lock features and hardware spec lies (*cough* 32 ROP 1060 *cough*), but at least you’re finally getting a return on that “investment.”

      • MathMan
      • 3 years ago

      Are you claiming that the 1060 doesn’t have 48 ROPs?

        • chuckula
        • 3 years ago

        Yes. Yes he is. In his mind, the world will never move past the GTX-970 “scandal”.

          • K-L-Waster
          • 3 years ago

          It’s funny how the vast majority of people who were enraged by the 970 memory “issue” were people who wouldn’t have bought an NV card if you put a gun to their heads. Actual owners weren’t all that concerned.

            • anubis44
            • 3 years ago

            You put the word issue in quotes. Are you somehow implying there was no memory issue with the 970? I mean, Jen Hsun himself personally apologized. Did he do it for no reason? It was in fact a boldfaced lie to simply put ‘4GB’ on the box, implying it was all usable at full speed when the reality is, it’s not. And there was a lawsuit that forced nVidia to pay reparations to buyers. Are you insinuating that never happened? That there was never any ‘issue’ with the 970? Because if you are, you’re being dishonest. It doesn’t help anybody to deny/sweep under the rug/minimize any cheating or misfeasance on the part of any company who seeks to profit by deceiving their own customers, so cut it out.

            • evilpaul
            • 3 years ago

            As a 970 owner who upgraded to a 1080 I have to say I rather liked the 970 memory “issue.” I got my 970 for the price of a 960 after it being returned as an open box part and the ~$32 class action settlement.

            • K-L-Waster
            • 3 years ago

            No… I’m pointing out that most people who own 970s (including me) *don’t care*. It’s only the AMD gotcha brigade who think it’s an issue.

        • DoomGuy64
        • 3 years ago

        Yes. It has 48 ROPs like the 970 had 4GB of ram, and can only rasterize 32 pixels per clock.

        TR did not expose this, but AnandTech did, and they did the tests to prove it.

        [url<]http://www.anandtech.com/show/10540/the-geforce-gtx-1060-founders-edition-asus-strix-review[/url<]
        [url<]http://www.anandtech.com/show/10540/the-geforce-gtx-1060-founders-edition-asus-strix-review/15[/url<]

        [quote<]At 54.8 GPixels/second, GTX 1060 trails GTX 980 significantly. The card not only has fewer ROPs, but it has half of the rasterizer throughput (32 pixels/clock) as GTX 980.[/quote<]

        Tech Report clearly missed the boat on this issue, and instead reported statistics based on (incorrect) paper numbers.

        [url<]https://techreport.com/review/30812/nvidia-geforce-gtx-1060-graphics-card-reviewed[/url<]
        [quote<](Gpixels/s) 82[/quote<]

        Anandtech proved its point by doing a synthetic benchmark that showed the 1060 is only capable of 54.8 GPixels/second. This is why I read multiple sites. Numbers are meaningless when they're made up. Benchmarks can be just as bad if they're tuned for a specific use case and that information is omitted.

          • chuckula
          • 3 years ago

          Oh I see you like Anandtech. Here’s an actually relevant conclusion from that same article you are so obsessed with:

          [quote<]The RX 480 is a very capable card, but the launch of the GTX 1060 puts an end to AMD’s short-lived exclusive to that market. And to NVIDIA’s credit (and AMD’s chagrin), the GTX 1060 is 12% faster while consuming less power at the same time, more than justifying the $10 price difference and giving NVIDIA a legitimate claim as the superior GPU.[/quote<]

          [url<]http://www.anandtech.com/show/10540/the-geforce-gtx-1060-founders-edition-asus-strix-review/18[/url<]

          So, since apparently the GTX-1060 doesn't have any ROPs or something... what's your excuse for why the RX 480 sucks so hard given that it does have all the ROPs?

            • DoomGuy64
            • 3 years ago

            So let’s say the 1060 gets 58 fps, and the 480 gets 54 fps. Either case will cause screen tearing, or frame drops with vsync.

            The 1060 isn’t just more expensive as a card, it’s more expensive as a system, since you have to pay the gsync tax if you want VRR. Otherwise, you’re sticking with your old 1080p 60hz monitor.

            The 480 doesn’t “suck” either. It’s clearly capable of playing games just as good as the 1060 within a few percent, and that gap goes both ways depending on what game and API you’re using. There is no definitive performance winner, but there is a clear value winner.

          • MathMan
          • 3 years ago

          There is nothing to expose.

          Nvidia has always had their ROPs linked directly to the MC.

          So it most certainly has 48 ROP units.

          However, the GPCs that feed the ROPs can only supply them with 32 pixels per clock, which means that, on average, the ROPs will idle 33% of the time.

          But look at the alternative, where Nvidia would have used 24 instead of 48 ROPs. In that case, they could be running at 100%, yay, but it’d still be 33% slower than having 48 ROPs.

          You’d have a point if you complained that the ROPs can’t be fully loaded; that’s true. It’s as true as complaining that hyper-threading in a CPU core can’t recover all the wasted resources of a non-HT core.

          But it’s still a major benefit, definitely much better than having only 24 ROPs.

      • Kretschmer
      • 3 years ago

      I’ve used recent cards from both AMD and Nvidia in the past few months, and you’re absolutely insane if you don’t see any benefits from playing games on the marketshare leader. AMD isn’t the driver hell that it was several years back, but you do get better developer coverage (and better card options, obviously) for the premium.

        • DancinJack
        • 3 years ago

        He’s a constant troll on these boards. Unless the story is pro-AMD, he has something terrible to say about any and all. Just leave him be.

          • chuckula
          • 3 years ago

          In his defense, he’s obviously grieving over today’s news.
          He couldn’t even muster the snark to claim that Ngreedia gimped the GTX-1080 to make the 1080Ti look better. I can’t imagine the pain he’s feeling right now.

        • DoomGuy64
        • 3 years ago

        No, I clearly see the benefits of Nvidia’s products. It’s just that they have stopped competing in the midrange, and gsync only furthered that gap. I don’t like spending insane amounts of money to get an acceptable gaming experience, and Nvidia currently requires spending more than necessary to get that experience. It wasn’t like that with Fermi, but it is now for sure.

        Nvidia basically just needs to support freesync and offer a decent mid-range card to get me to switch. Otherwise, I have no incentive to care, especially when I already have a freesync monitor that isn’t going anywhere.

        If you wanna talk about sanity, you’d have to be insane to spend $1K+ just to get a good experience with Nvidia, and even worse to justify sponsoring such nonsense. Nvidia has completely relegated its products to high spenders with disposable income, and in chuckula’s case a terrible snobbish attitude. Annoying PCMR elitists, basically.

        Nvidia and their fanboys are like Hillary Clinton and her base calling the rest of us, “Deplorables”. It’s a despicable and polarizing attitude to take with people only wanting acceptable mid-range products.

          • thedosbox
          • 3 years ago

          [quote<] Nvidia and their fanboys are like Hillary Clinton and her base calling the rest of us, "Deplorables". It's a despicable and polarizing attitude to take with people only wanting acceptable mid-range products. [/quote<] You're equating people who want a video card bargain with racist homophobes? Really?

            • DoomGuy64
            • 3 years ago

            Have you ever followed chuckula? Don’t ask such a stupid question.

            PS. Racism exists as much as a green frog meme from 4chan is related to neonazis. What actually existed in the Clinton/Trump campaign was Marxist classism vs American exceptionalism. The whole racist argument is fairy tale make believe from the likes of Anita Sarkesian who says, “everything is sexist, so you have to point it all out.” I don’t know what you call it exactly, Critical theory or something, but it’s basically Marxist brainwashing that has been thoroughly debunked and exposed to be false.

            • chuckula
            • 3 years ago

            Have you ever followed me?

            When was the last time I called AMD fanboys racists for no apparent reason?

            When was the last time I invented a fact-free conspiracy theory about AMD gimping its own products?

            • DoomGuy64
            • 3 years ago

            I’ve followed you enough to know every other post from you uses some form of logical fallacy, and is generally a troll. You also perpetuate plenty of conspiracies when it comes to AMD hardware and benchmarks. Ryzen being the latest example.

            Making a cut-down mid-range product is one thing, but misrepresenting its capabilities is another. Nvidia doesn’t need to constantly lie about the actual capabilities of its cards. That’s not “fact-free” either, unlike the vast majority of your trolling. Anandtech ran the tests and proved the 1060 is not a true 48 ROP card.

            • MathMan
            • 3 years ago

            Does the 1060 have 48 ROP units or not?

            The answer is “yes, it does”.

            • DoomGuy64
            • 3 years ago

            Does the 970 have 4GB of ram?
            The answer is “yes, it does”.

            The point is that this is misrepresentation of actual capability, not that the card physically has a spec that it cannot utilize.

            Playing semantics is a troll’s game that is easily recognizable for the sham it is. So what is the point? Why are you so adamant to defend this? It’s not a credible position to take.

            • I.S.T.
            • 3 years ago

            I…

            It can use that RAM. Not very well, yes. But it can actually use the RAM. You’re acting like it’s there on the card but cannot be accessed at all. If you want to argue it’s mostly useless then fine, but you’re not doing that. You’re implying that arguing that it’s there but most things can’t use it effectively due to lack of speed is semantics. It is not. It is being factually correct in a situation where this is needed.

            Being uber factually correct can be semantics, yes, but not in this case. Especially given your freaking out over this stuff.

            • DoomGuy64
            • 3 years ago

            The 970 is better off than the 1060 in that it can somewhat ineffectually use that ram.

            On the other hand, the 1060 cannot effectively use those 48 ROPs. It rasterizes 32 pixels per clock with those 48 ROPs. The only positive factual thing you can state about its ROP capability is that the ROPs aren’t the bottleneck, but it certainly isn’t a legitimate 48 ROP card.

            • Waco
            • 3 years ago

            Playing semantics…which is exactly what you’re doing along with a complete misrepresentation of reality.

            Is it fun posting on a site like this where you get called out for your BS? Do you enjoy having the most consistently downvoted posts (and no, it’s not because we’re all fanboys)?

            Return to reality and stop being a zealot, please.

            • K-L-Waster
            • 3 years ago

            Uh, could we leave the P & R in the P & R forum where it belongs and stick to video card discussions here?

            • bhtooefr
            • 3 years ago

            There were huge problems with Hillary Clinton both as a candidate and with her campaign, and I didn’t vote for her. (I didn’t vote for your choice of candidate either.)

            However, she didn’t say that all Trump supporters were deplorable, only half of them. (And then she walked even that back.)

            • I.S.T.
            • 3 years ago

            OH GOD NO PLEASE NO JUST FOR THE LOVE OF EVERYTHING ANYONE ON THIS SITE BELIEVES IN ****ING NO

          • I.S.T.
          • 3 years ago

          I…

          Dude, get some perspective.

      • Ninjitsu
      • 3 years ago

      And GTS 450 should be enough for everyone
