ForceWare 180 ‘Big Bang II’ drivers hit the web

We gave you the skinny on Nvidia’s ForceWare 180 (a.k.a. Big Bang II) driver release last month. This morning, the drivers finally became available on Nvidia’s website. You can grab the WHQL-certified ForceWare 180.48 release for Windows Vista x86, Windows Vista x64, Windows XP x86, and Windows XP x64.

As we wrote in October, these new drivers bring four major enhancements: across-the-board performance increases, multi-display support for SLI multi-GPU configurations, the option to dedicate a graphics card to PhysX computations, and SLI support for some Core i7 motherboards with Intel X58 chipsets.

Here’s what you can expect on the performance side of things, according to Nvidia:

  • Up to 10% performance increase in 3DMark Vantage (performance preset)
  • Up to 13% performance increase in Assassin’s Creed
  • Up to 13% performance increase in BioShock
  • Up to 15% performance increase in Company of Heroes: Opposing Fronts
  • Up to 10% performance increase in Crysis Warhead
  • Up to 25% performance increase in Devil May Cry 4
  • Up to 38% performance increase in Far Cry 2
  • Up to 18% performance increase in Race Driver: GRID
  • Up to 80% performance increase in Lost Planet: Colonies
  • Up to 18% performance increase in World in Conflict

There are some caveats, however. For one, the 180.48 release only supports GeForce 200-series, GeForce 9, and GeForce 8800-series graphics cards. Also, SLI multi-display support apparently doesn’t work in Windows XP for now. Finally, Nvidia says X58 mobos will only work in SLI mode with GeForce 9800 GTX, 9800 GTX+, 9800 GX2, and 200-series GPUs.
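To put those “up to” figures in perspective, here’s a quick sketch applying them to hypothetical baseline frame rates. The baselines below are invented for illustration, not benchmark data, and “up to” means best case, not average:

```python
# Apply Nvidia's claimed "up to" gains to made-up baseline frame rates.
claimed_gains = {
    "Far Cry 2": 0.38,
    "Lost Planet: Colonies": 0.80,
    "Crysis Warhead": 0.10,
}
# Hypothetical baselines, purely for illustration.
baseline_fps = {
    "Far Cry 2": 40.0,
    "Lost Planet: Colonies": 50.0,
    "Crysis Warhead": 25.0,
}

for game, gain in claimed_gains.items():
    best_case = baseline_fps[game] * (1 + gain)
    print(f"{game}: {baseline_fps[game]:.0f} -> {best_case:.1f} FPS (up to +{gain:.0%})")
```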

    Comments closed
      • Cannyone
      • 11 years ago

      These drivers work great for me, with two 8800GTs in SLI on a 780i chipset board. The irony is that I no longer run multiple displays, a practice I abandoned when I chose to try SLI. Maybe now I’ll try connecting to my new 42″ TV (via HDMI) as an alternate display. 🙂 I’d just need to rearrange my furniture, as the TV is on the opposite side of the room from my computer. Or maybe I’ll just get a second 24″ display…

      • Murso24
      • 11 years ago

      Despite all these complaints about the new drivers, you’d expect they came out with new drivers for a reason. And with my single 8800GTX I haven’t had any problems in gaming, programming, or CAD. I’ve actually played with CUDA. It’s great.

      I’ve had no problems with any games whatsoever, only gains. In games like Crysis, CoD5, and Supreme Commander, the PhysX engine actually helps with the physics. But I don’t need it because I have a QX6800 Core 2 Quad, 4GB of RAM, and Vista! No problems here, fellas.

      • d0g_p00p
      • 11 years ago

      Anyone know how to change fan speed with this release? I was using the 175.16 drivers with CoolBits, and I was able to set the fan speed from the default of 36% to 50%, which helped cool the GPU big time. Now with this release (180.48) I am unable to change the default fan speed; the option is just grayed out.

      My worry is that my idle temps are now what my load temps were prior to the upgrade, and I think my temps are way too high under load. I did CoolBits again and reinstalled nTune.
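For what it’s worth, the XP-era CoolBits unlock was just a registry flag. Here’s a sketch of the .reg file people circulated back then; the key path and value name are from memory, so treat them as assumptions and verify against your driver version before importing:

```reg
Windows Registry Editor Version 5.00

; Remembered CoolBits unlock for older ForceWare releases (verify before use).
; A value of 3 enabled the hidden clock/fan controls in the classic control panel.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```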

        • d0g_p00p
        • 11 years ago

        Never mind. It seems the “Big Bang 2” drivers broke all my games (crazy lag and random pausing), so I removed them and reinstalled 178.24, and all is good. I even tried the 180.48 release on my non-gaming comp, and it too seemed to lose all ability to play games. Also, thank you MS for making System Restore completely useless and non-functional.

        I knew I should not have tried to upgrade. Such is life…

      • Mystic-G
      • 11 years ago

      Installed the driver, started up World in Conflict. Was in-game for 45 seconds and for the first time ever, my monitor turned completely gray. Had to do a hard reboot.

      It works fine with CoD5 though. I’m afraid to play WiC again since my comp doesn’t like to agree with hard reboots during start-up anymore. So much for a performance boost eh?

      -8800GTS 320MB
      -Win XP 32bit

      Guess I’ll revert later.

        • Saber Cherry
        • 11 years ago

        Why would you install something called “Big Bang” and not expect to get screwed?

          • Mystic-G
          • 11 years ago

          LOL… for a second there I thought your comment was gonna be serious. It is kind of a big issue for Nvidia, I think. Gray screen? Really? Oh well.

            • Saber Cherry
            • 11 years ago

            Not really. WHQL means “Windows Hardware Quality Lab”, so WHQL drivers are only certified to work as well as, say…
            1) Windows ME, which is Windows Quality
            or maybe
            2) an XBOX 360, with a failure rate of what seems like over 25%. That is Windows Hardware.
            3) Lab is short for Labrador Retriever. The entire WHQL division of Microsoft may actually just be a dog – you know, the friendly Windows Hardware Quality Lab, “Searchy”. He is very cute and helpful and will perform amusing antics when you try to run ‘search’ on a brand-new Windows install. If he sniffed a new hardware device and peed on it, that meant it passed. Unfortunately, Microsoft learned that was a stupid policy when they discovered dog urine (DNA-matched to some sort of animated Labrador puppy) was the leading cause of X360 red-rings-of-death. Now they pass hardware that he DOESN’T pee on.

            In other words, Windows Hardware Quality means exactly what it sounds like, and what you might expect from a company that considers Ballmer to be its most qualified employee.

      • PoohPall
      • 11 years ago

      What about the major letdown drivers ?

        • indeego
        • 11 years ago

        Je ne parle pas se habla Sprechen ze nothing.

      • happyxix
      • 11 years ago

      I still get a BSOD when installing this driver just as with the beta driver. >=[

      • Tumbleweed
      • 11 years ago

      No love for my fanless 7600GT. *sigh*

        • kmansj
        • 11 years ago

        You seem to be a fan of it. So it’s not entirely fanless.

      • ReductiMat
      • 11 years ago

      I installed the beta 180.43 on my P6T with 2 8800GTX’s yesterday and it says it recognized them as SLI…

      Any chance this is a typo? I’m hesitant to try these out tonight and lose my current (apparent?) SLI…

      • Knuckler
      • 11 years ago

      Where’s OpenGL 3.0?

      • ssidbroadcast
      • 11 years ago

      Strange that Capcom games seem to get the most gains.

        • sdack
        • 11 years ago

        Why do you think this is strange?

        Perhaps they use the same 3D engine throughout their games or perhaps it is the same noobs who wrote the engines. Perhaps Nvidia engineers had the most sex with these noobs or perhaps these noobs paid a lot of money to Nvidia. Or … it is a coincidence.

          • TREE
          • 11 years ago

          I love how you call people who have probably had a much better education than yourself “noobs”.

            • sdack
            • 11 years ago

            Why so sensitive?? Maybe I do have a better education (…), but it does not take a genius to understand that noobs can be found everywhere. Besides, at my age I am long out of education.

              • Meadows
              • 11 years ago

              At the rate you’re calling everyone “noobs”, I don’t think so.

              • no51
              • 11 years ago

              He just needs a hug.

              • sdack
              • 11 years ago

              What you need is a spanking.

              • sdack
              • 11 years ago

              Well if you say so, Mr. Misconception, then it must be true.

        • ImSpartacus
        • 11 years ago

        Maybe they were unusually low to begin with. Taking 10 FPS to 20 FPS is a 100% increase. Taking 15 FPS to 20 FPS is only a 33% increase.
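The arithmetic behind that point is easy to check; a minimal sketch, with the same made-up frame rates:

```python
def percent_gain(before_fps: float, after_fps: float) -> float:
    """Relative frame-rate change, expressed as a percentage."""
    return (after_fps - before_fps) / before_fps * 100

# The same absolute gain looks much bigger from a low baseline:
print(percent_gain(10, 20))  # 100.0
print(percent_gain(15, 20))  # ~33.3
```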

          • ssidbroadcast
          • 11 years ago

          Yes this response makes the most sense.

            • eitje
            • 11 years ago

            Spartacus is on point!

              • sdack
              • 11 years ago

              There is some truth in it, but it is also not that simple. You do not add 10 FPS as if you had bought them in a supermarket. You optimize code, you change the memory management, and perhaps leave out some level of detail for some 3D objects in these games where no difference can be seen. This then affects not the entire benchmark but a few scenes, and allows Nvidia to claim a gain.

              • Meadows
              • 11 years ago

              Then again, ATI’s numbers are just as useless, but they’ve been handy enough at giving gamers illusions and quantifiable placebo that nVidia copied them.

              • sdack
              • 11 years ago

              They are not illusions. I am sure these gains are real. And reading about gains is what makes us happy, even when they are not our own. God knows why.

              • Meadows
              • 11 years ago

              Better ask God then. By the way, are you implying that ATI’s numbers are real and nVidia makes up half of them?

              • sdack
              • 11 years ago

              I have already spoken to God. We are on good terms. Btw, why do you keep bringing ATI into this? Did you buy the wrong brand??

              • Meadows
              • 11 years ago

              No, but because it was on topic.

      • CasbahBoy
      • 11 years ago

      I just got a bunch of new hardware in today, this is just in time. It feels good to have driver updates apply to me again.

      Edit: A 3.2GHz Northwood Pentium4 and 7800GS AGP to a Core 2 E8500 and 216SP GTX260…the difference is going to be like night and day 🙂

        • ecalmosthuman
        • 11 years ago

        Man, I just went from a 2.6GHz dual-core Opteron and an 8800GTS 640 to the same setup as your new one and was blown away. You are in for life-changing experiences.

        • Gerbil Jedidiah
        • 11 years ago

        You mean like night on Earth and day on Neptune, right? =) Enjoy the new toys!

        • BoBzeBuilder
        • 11 years ago

        Congratulations for waiting so long. Other gerbils could learn from you.

          • CasbahBoy
          • 11 years ago

          I sincerely appreciate the compliment. I don’t want to mislead anyone, though; I would have upgraded sooner had I not been more concerned with making rent/car/food payments in that long intervening period.

          • MadManOriginal
          • 11 years ago

          Yea, there are pluses and minuses to upgrading frequently. The minus is having to mess around with things more often, and sometimes feeling like you’re losing money on each resale because you see the loss in a short time. The pluses are that you can resell hardware for a decent price to recoup some cost (5-year-old hardware is worth almost nothing) and stay up to date. If you do it right, the total cost over time can be close to the same, although lately great hardware is so cheap that might not apply so much anymore.

          • kmansj
          • 11 years ago

          Actually I think I can top that, for upgrades 🙂 My last was from a $69 Tigerdirect mobo combo (AthlonXP 2900, an undocumented flavor seemingly made only for HP, and sold for peanuts through Tiger) & GF5900 to a Core2 E6600 & GF8800 GTS640. That was like night and …. well, there isn’t a word for it.

        • kmansj
        • 11 years ago

        You finally did your dream update and……..chose the 260?

          • CasbahBoy
          • 11 years ago

          I did! I boot into Windows every couple days to play games, and all other times I’m using Gentoo Linux. I wouldn’t be surprised if ATI’s proprietary kernel drivers are at the same level of quality and stability as Nvidia’s by now, but old habits die hard I guess.

      • DrDillyBar
      • 11 years ago

      http://www.youtube.com/watch?v=NqEzyvvc6o8

        • Kunikos
        • 11 years ago

        Yes, because every company loves to work indefinitely on products that no longer make them money.

          • MadManOriginal
          • 11 years ago

          Yea, but 9800GTX and up is kind of absurd, since it shows there’s no technical limitation on porting it to all G9x-based cards at least, and probably all G8x cards too, since they are very similar. The drivers themselves work down to the 8800 series, and really should go down to all true unified-shader 8-series cards; it’s pretty lame to take away SLI on X58 from things like the 8800s, 9600GT, and the rebadged 9-series cards.

            • Farting Bob
            • 11 years ago

            How many people will be using an X58/i7 system and two sub-8800-series Nvidia cards in SLI? I’m guessing pretty much nobody.

              • MadManOriginal
              • 11 years ago

              Yea, that’s true and I’d thought of that, but people do sometimes use hardware they already have, you know. It’s probably not a technical limitation but a marketing one, which is what makes it so vexing.

              • JustAnEngineer
              • 11 years ago

              NVidia’s evil marketing geniuses have more influence on what the consumers actually get than their engineering staff do.

              • A_Pickle
              • 11 years ago

              Yeah, but it doesn’t work with 8000-series cards. That’s pretty stupid/evil.

          • ludi
          • 11 years ago

          True. And yet not applicable to the 8800 or 9600 that are both supported by this release but not receiving some of the nicer updates.

          • sdack
          • 11 years ago

          Maybe, or maybe they had to make it to a deadline and support for the other cards is still coming.

      It makes little sense to have support for everything in a driver for the 8-, 9-, and 200-series, and then cut it again for one chipset from Intel.

      • Scorpiuscat
      • 11 years ago

      Finally!!!! Multi-Display support for SLI!!

      Man, I can’t wait to get home and install these drivers and then enable SLI and dual monitors… WOOHOO!

        • Meadows
        • 11 years ago

        It’s been available for over a month, via the v180.43 beta.

          • Jon
          • 11 years ago

          Lol @ your troll comment.

            • Meadows
            • 11 years ago

            I’m reminded of pots, kettles and the colour of grief.

              • Jon
              • 11 years ago

              <Insert incredibly witty and sarcastic reply in response to Black Kettles, respond with the sound of mating Toads in C Major>

              • eitje
              • 11 years ago

              It is well to remember that the entire universe, with one trifling exception, is composed of others.
              — John Andrew Holmes

          • Byte Storm
          • 11 years ago

          Though you are indeed correct, one thing I am not chancing is my computer on beta drivers. So I fully agree with the above exclamation.

      • Meadows
      • 11 years ago

      While you can technically modify the setup’s .inf file in 3 seconds with a simple copy-paste, I’m still not sure if it’ll help the other cards. Theoretically, nothing should prevent this driver set from working for 8600-owners or integrated owners.
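To make that concrete, the usual hack is to copy an existing device line in the package’s nv_disp.inf and swap in your card’s PCI device ID. The section names, install section, and device ID below are illustrative (from memory, not taken from the actual 180.48 package), so check them against the real file before editing:

```ini
; Hypothetical excerpt — real section names differ per driver and OS build.
[NVIDIA_Devices]
; Copied entry, pointed at an otherwise unsupported card's device ID.
; DEV_0402 is commonly listed for the GeForce 8600 GT.
%NVIDIA_DEV.0402% = nv_Install, PCI\VEN_10DE&DEV_0402

[Strings]
NVIDIA_DEV.0402 = "NVIDIA GeForce 8600 GT"
```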

        • SomeOtherGeek
        • 11 years ago

        What about the 6800 cards for PhysX? Will it apply? The card is PCI-e and just collecting dust and would like to use it somewhere!

          • Scrotos
          • 11 years ago

          Does it support CUDA? If not, then I’d doubt it.

          • Meadows
          • 11 years ago

          PhysX isn’t a set of graphics instructions; you need a C programming layer for that, which is CUDA. To even get there, you need a unified shader architecture and compatible drivers, which means the GeForce 8 series and everything since.

            • SomeOtherGeek
            • 11 years ago

            Cool! Thanks for the info… I guess I’ll just burn the card then!
