Nvidia spills the beans on ‘Big Bang II’ ForceWare release

If you were around last August, you might’ve seen the rumor mill break the news about Nvidia’s ForceWare 180 drivers. Nvidia has now given us the official skinny on what the upcoming graphics drivers have in store. Code-named “Big Bang II,” the release will bring four major changes: performance increases in games, support for multiple displays in SLI multi-GPU mode, the ability to dedicate a GPU to PhysX computations, and SLI support for upcoming X58 Intel motherboards.

Nvidia pretty much glossed over the performance aspect in its presentation, but it did show a chart where the new drivers improved frame rates by 5% to 35% on a GeForce 9800 GTX+ system. The biggest gains supposedly occurred in Half-Life 2: Episode Two, Far Cry 2, and GRID at resolutions between 1680×1050 and 2560×1600 with antialiasing cranked up. We might have to run our own tests to get a better feel for the new optimizations, though.

The biggest change in the Big Bang II release comes on the multi-monitor front, where Nvidia has finally enabled dual-head support for SLI rigs. Users should no longer have to go into the control panel and manually disable SLI to stretch their desktop across two monitors. Nvidia will offer three distinct modes: in full-screen 3D mode, only the “SLI focus display” will render the game, and the secondary display will go dark. In windowed mode, you’ll be able to play a game in a window with both monitors up and running, although Nvidia says you can expect “slightly less SLI acceleration” in that case.

In the third mode, Nvidia will support full-screen SLI gaming across two monitors in a handful of titles—World in Conflict, Supreme Commander Forged Alliance, and Flight Simulator X. The company says you won’t be able to span most other games across two displays, but it thinks that’s not a big downside. After all, nobody wants to be staring at the space between their monitors when playing a first-person shooter.

Nvidia mentioned that the ForceWare 180 drivers can now handle up to six monitors on an SLI system, as well, but you’ll need a third GPU that’s not hooked up via SLI to drive more than two displays. Either way, these improvements should finally bring Nvidia somewhat up to speed with AMD, which has had seamless multi-display support in CrossFire configs for a long time.
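
How SLI pairs with displays is ultimately the driver's business rather than the application's, but developers who want to see what the driver is juggling can enumerate GPUs and attached displays through Nvidia's public NVAPI. Below is a minimal sketch along those lines; the calls come from the NVAPI headers of this period, and error handling is trimmed for brevity.

```cpp
// Minimal sketch, assuming Nvidia's public NVAPI: list the physical
// GPUs and count the displays the Nvidia driver currently drives.
#include <cstdio>
#include "nvapi.h"

int main()
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return 1;  // no Nvidia driver present

    // Enumerate physical GPUs (SLI partners show up individually).
    NvPhysicalGpuHandle gpus[NVAPI_MAX_PHYSICAL_GPUS];
    NvU32 gpuCount = 0;
    NvAPI_EnumPhysicalGPUs(gpus, &gpuCount);

    for (NvU32 i = 0; i < gpuCount; ++i)
    {
        NvAPI_ShortString name;
        NvAPI_GPU_GetFullName(gpus[i], name);
        printf("GPU %u: %s\n", (unsigned)i, name);
    }

    // Count the displays attached to the Nvidia driver.
    NvDisplayHandle disp;
    NvU32 dispCount = 0;
    while (NvAPI_EnumNvidiaDisplayHandle(dispCount, &disp) == NVAPI_OK)
        ++dispCount;
    printf("Attached Nvidia displays: %u\n", (unsigned)dispCount);

    return 0;
}
```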

Moving on, the ForceWare 180 release will also tread new ground on the PhysX front. Current drivers already support using a single GPU for both physics and graphics computations, and Nvidia can similarly spread calculations across an SLI setup. With the new drivers, the company will let users dedicate a graphics card to PhysX processing. You won’t have to buy a second GeForce GTX 260, either—a low-end card (like an $85 GeForce 9600 GT) will reportedly make a fine sidekick to a faster GeForce for that purpose. That’s a neat option, although considering the current dearth of high-profile PhysX-enabled titles, we probably wouldn’t recommend buying a GPU for that purpose yet.
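
On the developer side, none of this should require new game code: which GeForce handles PhysX is set in the ForceWare control panel, while the application simply asks the PhysX runtime for hardware simulation and lets the driver place the work. Here's a minimal sketch of that request, based on the 2.8-era PhysX SDK headers, with error handling trimmed.

```cpp
// Minimal sketch, assuming the 2.8-era PhysX SDK: create the SDK
// object, then request hardware simulation with a software fallback.
// Which device actually runs the hardware scene is the driver's call.
#include "NxPhysics.h"

NxPhysicsSDK* gPhysicsSDK = NULL;
NxScene*      gScene      = NULL;

bool initPhysX()
{
    gPhysicsSDK = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!gPhysicsSDK)
        return false;  // PhysX runtime missing or version mismatch

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);

    // Prefer hardware simulation when an accelerator is exposed;
    // otherwise fall back to the CPU solver.
    sceneDesc.simType = (gPhysicsSDK->getHWVersion() != NX_HW_VERSION_NONE)
                            ? NX_SIMULATION_HW
                            : NX_SIMULATION_SW;

    gScene = gPhysicsSDK->createScene(sceneDesc);
    return gScene != NULL;
}
```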

Wait a minute, though—didn’t Nvidia release ForceWare 180 beta drivers for Far Cry 2 yesterday? Indeed, and although you may see the “skeleton” of some of these features in the beta driver, Nvidia hasn’t finished implementing everything just yet. You’ll have to wait until next month for the feature-complete release.

Comments closed
    • coldpower27
    • 11 years ago

    Say I have a 9800 GX2. Could I use the 9800 GX2 for graphics and an 8800 GT for PhysX? I have an Intel P35 motherboard, however; would it work?

      • Meadows
      • 11 years ago

      Probably yes.

    • sdack
    • 11 years ago

    Don’t get me wrong; I’m not being an Nvidia fanboy at all. Still, you could be using an Nvidia card for PhysX acceleration and an ATI card for the graphics. 😉

      • Flying Fox
      • 11 years ago

      Not with Vista and its single video driver model.

        • MadManOriginal
        • 11 years ago

        It would be possible if NV released a standalone PhysX driver. The likelihood of that is about zero, unfortunately.

    • DancingWind
    • 11 years ago

    I have a question 😀 Can I dedicate a GPU to physics and… then go do the rendering on a Radeon?

      • lycium
      • 11 years ago

      hmm better ask the nvidia finance guys

    • Damage
    • 11 years ago

    robspierre6 has been banned for repeated personal attacks and rudeness to other users. Read the rules and follow them please, folks.

    • Krogoth
    • 11 years ago

    Funny that TR’s front page pictures show Forged Alliance, a game that is heavily CPU-bound. SLI and CF are worse for it because they require significantly more CPU overhead than single-card output.

      • Byte Storm
      • 11 years ago

      It is CPU-bound because all the physics calculations are done on the CPU. Now if the game could be made to offload that to a “physics processor”…

    • SecretMaster
    • 11 years ago

    replying failure!

    • ssidbroadcast
    • 11 years ago

    That multi-but-not-SLI option provides a nice upgrade path, imo. Got that lousy 8600 GTS? Buy a newer card and relegate the 8600 to PhysX. Nice.

      • Forge
      • 11 years ago

      That still leaves you to find a game that supports PhysX. There aren’t very many at all; Nvidia’s big list of 50-100 titles is chock-full of errors and “PhysX via patch later” promises that haven’t happened on 2+ year-old titles.

        • ssidbroadcast
        • 11 years ago

        Well, Forge, as Scott says in the podcast, that’s the part where software catches up with hardware.

        I know with the current state of things the cupboard is fairly bare, but Nvidia seems to be pushing its CUDA and PhysX APIs fairly hard. With luck, developers will have access to CUDA-aware tools soon, if not already.

        So it’s a future-proofing thing. Buy a 9600 GT now, and retire it to PhysX duty a year or two from now.

    • robspierre6
    • 11 years ago

    Big Bang 2 is another big fart from Nvidia. It’s all about supporting multiple displays in SLI, which AMD has offered for a long time now. And the whole PhysX-on-GPU thing is the most stupid idea Nvidia has come up with. It’s not gonna be adopted any time soon. We already have GPUs that are struggling in games with primitive texturing and geometry designs. Adding PhysX to that will make things much worse.

      • Meadows
      • 11 years ago

      Today’s video cards often have power to spare. PhysX acceleration only kicks in when the video card is idle; it will not accelerate PhysX if the video card is too busy (like at 2560×1600 with 4x antialiasing).

      Therefore, the acceleration could give you anywhere from “no gains” to “good gains.” It never takes away performance, whatever some of the more reading-impaired TR members might tell you.

        • ssidbroadcast
        • 11 years ago

        Er, not that I’m close to the situation, but I’m not sure a game engine can add or subtract PhysX object threads on the fly as the number of triangles on screen goes up. I think it’s a “settings will not take effect until you restart” sort of thing.

        • robspierre6
        • 11 years ago

        PHYSX ON GPU DOESN’T IMPROVE PERFORMANCE, IDIOT. It decreases performance. Why are you such a thick-headed idiot, Medusa?
        Recheck the reviews of Nvidia’s PhysX on GPU. The performance hits range from 40% to 65% in UT3, and surpass 80% in Ghost Recon. Check the reviews from Tech Report and Softpedia, and stop posting your BS about PhysX acceleration on the GPU improving performance.
        Now, processing the PhysX on the CPU doesn’t actually cause a noticeable decrease in performance, which makes the CPU a much better choice for PhysX, since the whole game-processing effort falls on the GPU.

        Secondly, in case you haven’t read about it: Nvidia’s PhysX is very heavy on the CPU. According to Fudzilla and Softpedia, a test was done comparing the Ageia card to a 9800 GTX in processing PhysX. CPU utilization while using the 9800 GTX hovered around 85%-92%, compared to 25%-40% when using the Ageia card. So the software actually optimizes the CPU very well to handle PhysX.
        The whole Nvidia PhysX-on-GPU thing is a fraud. It does utilize the GPU, but only partly…

        What we need now, in my opinion, is a Havok driver. And let’s leave the GPU to process the graphics.

          • Sargent Duck
          • 11 years ago

          Easy there, big fulla. Calm down with the insults.

          • A_Pickle
          • 11 years ago

          /bubblemouth

          • PRIME1
          • 11 years ago

          Bye………………………

          • ssidbroadcast
          • 11 years ago

          Medusa? Who started that?

          • MadManOriginal
          • 11 years ago

          What’s needed for PhysX is a cheap older-generation GPU as a dedicated PhysX card, with or without video outputs. At under $50, with wide game support (which I’m sure NV could get out of devs over time) and significant advantages over CPU or single-GPU physics, it would sell. Maybe also some improvement in the PhysX programming for the differences between GPU and Ageia-card physics calculations.

          • Meadows
          • 11 years ago

          See this: http://img518.imageshack.us/img518/3638/temphm5.jpg

          I drew this graph for you because you can’t be bothered to read the TR article you keep referencing “against me”.

            • DancingWind
            • 11 years ago

            I think the emphasis is on MORE effects.
            TR has done some testing of it, and as I understand it, those enhanced maps with MORE effects are UNplayable with CPU-powered PhysX libraries, and…

            • Meadows
            • 11 years ago

            The point is, these numbers and facts were taken from the TR article itself, something some people try to use against me without reading it.

    • ColdMist
    • 11 years ago


      • yogibbear
      • 11 years ago

      maybe look into triple head solutions?

      • Forge
      • 11 years ago

      No, Crossfire does not have these bizarre limitations. I guess it shows that ATI came out with Crossfire 1.0 (compositor on a “master” card, funky dongles), which was bad; then Nvidia came out with SLI, which was better; but then ATI brought Crossfire 2.0 (connectors between cards, any card works, seamless multi-mon, etc.).

      Now Nvidia needs SLI 2.0, with all the things ATI already gave us, just to get back to parity!

      I really have fewer and fewer reasons to buy Nvidia GPUs anymore. ATI is overtaking them very quickly. Linux support is about even, and the open-source stuff is going to let ATI pull further and further ahead. Multi-GPU is sitting pretty solidly on ATI’s side. The only things NV really has left are their “used-to-be-OMFG-but-rapidly-deteriorating” driver quality and the faster/bigger/hotter single-GPU top end. ATI is just overcoming them from all sides.

      It’s like the 9700/9800 Pro days all over again. NV just has no really compelling products!

      (Disclaimer: Typed up on a machine with a 9800GT.)

        • ColdMist
        • 11 years ago

        About the only reason to get Nvidia now is CUDA support for Windows app acceleration.

        • SecretMaster
        • 11 years ago

        I agree with many of the points you make, but Nvidia has that clutch “TWIMTBP!” logo, which always seems to get them favored over ATI/AMD.

        • BobbinThreadbare
        • 11 years ago

        SLI came out before Crossfire.

          • Silus
          • 11 years ago

          I’m sure he’s also part of the group that says the 7950 GX2 came out after the X1950 XTX and that it was a panic attack by NVIDIA 🙂

      • Dagwood
      • 11 years ago

      The way I read the post, mode two is what you have been looking for. I’m still not clear whether this means it performs better or worse than AMD’s drivers.

      ” In windowed mode, you’ll be able to play a game in a window with both monitors up and running, although Nvidia says you can expect “slightly less SLI acceleration” in that case. ”

      So does AMD provide multi monitor for all games?

      ” In the third mode, Nvidia will support full-screen SLI gaming across two monitors in a handful of titles—World in Conflict, Supreme Commander Forged Alliance, and Flight Simulator X.”

      I think what you really want is to play the game with three monitors and two cards. You would have thought this would have been done a long time ago.

    • Knuckler
    • 11 years ago

    What happened to OpenGL 3 support?

      • Scrotos
      • 11 years ago

      Heelllooooo Big Bang III!

        • ReAp3r-G
        • 11 years ago

        i lol’d 🙂

    • Vasilyfav
    • 11 years ago

    Let’s say you have a powerful triple-SLI combo, like 3x GTX 260. Is there really a point in dedicating a whole GTX 260 just to PhysX processing, instead of splitting its load over physics and graphics?

    It just seems like any one of the cards that support triple-SLI in the first place would be way too powerful just for physics processing.

      • OneArmedScissor
      • 11 years ago

      Obviously, it’s for people who play online, so that they can shoot bullets with M0AR FIZZICKZ at you than you can at them.

      • BobTheBacterium
      • 11 years ago

      I believe the point is that you CAN evenly distribute the load over each of the three cards in SLI, or you can have one more powerful card like the GTX 260 and have a budget card like the 9600 GT handle PhysX, for those who couldn’t afford three GTX 260s but still want good PhysX performance.
