Nvidia updates GPU drivers ahead of Battlefield 3 beta

As you may know, the hotly anticipated Battlefield 3 beta opens this week, and the folks at Nvidia are all over it. They’ve just released a new set of beta drivers, version 285.38, that should make a nice matched set with the BF3 beta. As usual, tailored drivers look to be the best way to go. Nvidia claims these betas "increase performance in the Battlefield 3 beta by up to 38% compared to the 285.27 drivers." They also include an SLI profile for BF3. You can download the drivers right here.

AMD is apparently preparing a BF3-oriented graphics driver update, as well, although we don’t have a clear sense of the time frame. We’ll keep an eye out for its release.

Comments closed
    • squeeb
    • 8 years ago

    Sweet…hope I can get in, hearing the servers are congested. Another 5.5 hours at work 🙁

    • kamikaziechameleon
    • 8 years ago

    hope my GTX 460 can run the game.

      • Silus
      • 8 years ago

      Sure it can, it just won’t be the smoothest experience ever, especially when cranking up the details. I’m assuming that besides the GTX 460 you have a decent dual-core or quad-core CPU and 4 GB of RAM.

        • l33t-g4m3r
        • 8 years ago

        While that statement is pretty obvious now, a lot of reviewers tricked people into thinking the 460 was some amazing beast by only using factory overclocked models in the benchmarks. Even then, the 460’s architecture was too gimped to be future proof. It was actually obsolete before it even hit the market. ATI’s 6850 was pretty similar in that respect, except reviewers didn’t fudge the numbers, so people had an honest idea of what they were buying.

          • kamikaziechameleon
          • 8 years ago

          The 460 has been good to me; I got the MSI one with their cool-and-quiet heatsink. The build quality is so good I’m officially a fanboy and looking pretty exclusively at MSI GPUs in the future. I know what you mean about it not being a competitive design to begin with, but between the stagnation in game graphics and its performance, it’s done me fine. I’m gonna sell it when the time comes.

          • travbrad
          • 8 years ago

          [quote<]It was actually obsolete before it even hit the market[/quote<]
          How is it obsolete? There isn't a single game it can't run at decent framerates. Sure, it won't be able to max out settings in all games, but when have you ever been able to do that with a mid-range card (especially one that is more than a year old)?

          [quote<]a lot of reviewers tricked people into thinking the 460 was some amazing beast by only using factory overclocked models[/quote<]
          Tricked? REALLY? All the reviews I've seen clearly indicate which cards have higher clocks. It's not really "overclocking" anyway in any real sense. If a card is clocked that way by default (and supported with a warranty), how is that overclocking? They just didn't rename it as a whole new card like they usually would (and people complain about that too).

        • kamikaziechameleon
        • 8 years ago

        Yeah, I have a 1090T X6 @ 3.8 GHz and 16 GB of dual-channel DDR3 @ 1600 MHz. It’s my home workstation machine, otherwise I’d put a Radeon in it. I’m thinking I might just get the game bundled with a GPU for an upgrade post-launch. Not sure what I’m really going to do, though.

      • travbrad
      • 8 years ago

      The CPU seems to be more of a limiting factor, at least from what I’ve seen. My E8400 @ 4 GHz chokes on the outdoor scenes pretty hard (drops to 20 FPS or even less). I lowered my settings from 1080p HIGH to 1024×768 LOW, and the framerates didn’t change AT ALL.

    • JohnC
    • 8 years ago

    …and here are ATI/AMD drivers:
    [url<]http://support.amd.com/us/Pages/AMDCatalyst1110Previewdriver.aspx[/url<]

      • lilbuddhaman
      • 8 years ago

      Thanks, now somehow magically save me the 3.2 GB of bandwidth I’ll need to get the beta client in 2 days.

    • Mystic-G
    • 8 years ago

    A beta running a beta. This should be interesting.

      • crazybus
      • 8 years ago

      There’s a Xzibit meme in there somewhere.

      • Dashak
      • 8 years ago

      Nvidia’s beta drivers worked flawlessly for Dragon Age 2.

    • RAMBO
    • 8 years ago

    I think I’ll wait until the official non-beta release of these. The last ones crashed and recovered like they had a mind of their own.

    • Farting Bob
    • 8 years ago

    Releasing game-specific drivers for a beta??? What the hell?? You should only be using the beta if you are a beta tester, in which case being able to turn up the settings an extra notch isn’t really important.

      • lilbuddhaman
      • 8 years ago

      New engine, new bugs, new tweaks. I’m happy to see GFX manufacturers preparing instead of waiting till weeks after release.

      • JohnC
      • 8 years ago

      This will be an “open beta” for a highly anticipated FPS game, so of course GPU developers will release the drivers specifically for that, to prevent 1000’s of impatient people thinking “OMG, the beta runs so bad on my card, I’m throwing my card out and buying a card with competitor’s GPU instead !!!!111”.

    • lilbuddhaman
    • 8 years ago

    (Awaits ATI updates and hopes that crossfire users aren’t stuck with performance and stability issues the first 3 months like every other release)

      • bcronce
      • 8 years ago

      It would be ironic if an “ATI badged” game runs better on nVidia.

      Edit: Seems I may have been under the false assumption that DICE had a deal with ATI (like Dirt 2). Thank you, Silus.

        • sschaem
        • 8 years ago

        Can’t wait to see how well AMD tweaks the Radeon drivers for the FX series…
        I have a feeling Radeons will perform very, very well on FX processors vs. Intel.

        • Silus
        • 8 years ago

        Where did you get the idea that BF3 is an “ATI badged” game?

        If it is, I really hope they don’t butcher it like they did with Dirt 2, with car skins and everything…ugh

          • BestJinjo
          • 8 years ago

          Every time DICE presented the game, it ran on either GTX580 or GTX580 SLI. They also publicly stated that you’d need GTX580 SLI to run this game on Ultra settings.

          You’ll need a GTX560Ti or HD6950 to run this game on High details. But no single GPU will let you max it out.

          Also, not sure how Dirt 2 was butchered? It’s one of the best racing games on the PC, with great graphics. Secondly, AMD does not tell Codemasters how the cars should look, are you crazy? They work with the developer on higher-level things like code/performance optimization.

          Not sure why you don’t like the reflections and details on the cars in Dirt 2.

            • Silus
            • 8 years ago

            Well, in regards to this being an “ATI badged” game, you pretty much said what I already knew, hence why I asked where the other poster saw that this was an ATI game, when everything else says otherwise. NVIDIA is or will be selling graphics cards with BF3 bundled for free, so it’s doubtful this is an ATI badged game.

            As for Dirt 2, I have no problem with its graphics. I specifically said that, it being an AMD game, they went as far as including skins for the cars with the ATI logo on them…I don’t mind the company logos when we start the game, but car skins? That’s pushing the “help the developer” concept (which is called bribing when NVIDIA does it around here).

            • l33t-g4m3r
            • 8 years ago

            In-game logos don’t bother me, whether it’s nvidia or ATI. What bothers me and most other people is when the engine is physically sabotaged to remove features and performance from a particular brand. Nvidia does that, ATI doesn’t. Nvidia may have finally gotten a hint from the community to stop doing that, as I haven’t seen as many sabotaged games lately. Doesn’t mean they’ve quit doing it altogether. I’m sure we’ll eventually see another triple-A blockbuster come out with all nvidia-specific effects, while ATI gets the DX9 treatment. Maybe the new Batman, that’s my prediction.

            • sweatshopking
            • 8 years ago

            batman is likely similar to its predecessor. if you had issues with the physx and the nvidia implementation, you likely will again. i have an nvidia 8800gt in my pc that i just use for physx and cuda, whilst i do my graphics on my 4890. seems to work well.

            • l33t-g4m3r
            • 8 years ago

            I had a slightly similar setup, but now I’m just using a 470. I most likely won’t have any problems, but it’s the idea that nvidia is sabotaging games that irks me.

            edit: seems like we got a cowardly vote troll problem here, ssk.

            Now a double troll, and they upvoted you to hide the first troll. Nice of them to do that, but they didn’t restore my original +1, and gave me a second -1 here. Isn’t this fun guys? You contribute nothing but anonymous hate without stating what your problem is. (The problem is that you exist, but I digress.) I guess this is kind of like a sport for trolls: anonymous downvoting. I agree with ludi’s thread earlier that we should do away with it, because idiots are constantly abusing the feature, and who knows if TR checks for 1 IP per vote (multiple accounts). Downvoting should require a response, and dare I say voting shouldn’t be anonymous. If we knew who you were, perhaps you wouldn’t be doing it, now would you?

            • Silus
            • 8 years ago

            “NVIDIA does it and ATI doesn’t”. Such idiocy, but not unusual for an AMD fanatic. NVIDIA doesn’t sabotage a thing. Both companies, through their developer relationships, just provide resources to help developers prepare the game for their own hardware.
            AMD/ATI fanboys love to talk about NVIDIA, yet forget that ATI essentially bought Valve games, among others of course. It’s the usual double standard…

            • lilbuddhaman
            • 8 years ago

            Both Nvidia and ATI have done pretty sneaky things in the past, but those days are more or less gone, as tech sites like this one quickly caught them and made them account for their actions.

            • l33t-g4m3r
            • 8 years ago

            What?! That’s a bunch of nonsense and outright lies if I’ve ever heard it. ATI doesn’t have a TWIMTBP sabotage program. Every game that ATI sponsored, which were few and far between, had effects that completely matched the DX standard. There were no ATI-specific hacks used, or code that hurt nvidia users. ATI did not buy out Valve either, another lie. The FX architecture just sucked, and that was solely nvidia’s fault. The next generation of nvidia cards could play games fine.

            On the other hand, Nvidia has not used their sponsorship program benevolently. They are constantly creating nvidia-specific hacks that they force into games, which don’t run on ATI cards. A lot of good DX10/11 standardization has done, eh? Nvidia still finds ways around it, creating specific AA implementations and graphics effects using CUDA. Nvidia has no standards, so don’t talk about double standards, since nvidia doesn’t even have one. ATI always sticks to the spec, while nvidia does whatever. That’s the way it’s always been, so much so that it’s become a stereotype. Even when nvidia starts playing by the rules, you kind of keep looking over your shoulder for the next attack, because they’ve done it for so long that people expect it to happen. I have no problem with nvidia if they stop playing dirty, but it’s not like I’m going to forget everything they’ve done either. They’ve earned their disrespect.

            • Silus
            • 8 years ago

            Of course ATI doesn’t have that one. ATI has the “Get in the Game” program and now, with AMD, the “Gaming Evolved” one, and they sabotage as much as NVIDIA does.
            The “sabotage” you keep referring to is the Batman AA debacle, which fits right in the “created by AMD problem” category that AMD fanatics turn into something else. There is no other story to back the sabotage theory, and not even this one works. NVIDIA made their anti-aliasing routines specific to their cards and handed them over to the Batman AA developers. AMD should’ve done the same, but they didn’t, and expected that NVIDIA’s specific AA would work with their cards too (which is quite stupid, since NVIDIA’s algorithms are different from their own, not to mention different architectures altogether that receive data differently). Since those routines obviously didn’t work, AMD whined…

            As for Valve, you know very well what I’m talking about. The Source engine was tailored for ATI cards and it’s still the same to this day. Obviously you don’t care, because as long as AMD does it (and not someone else), it’s fine by you. But that is called “double standard” if you didn’t know it.

            As for CUDA, you’ve got to be kidding…So NVIDIA is able to do more than just graphics on their GPUs (namely physics calculations) and they shouldn’t show them off ? You really are desperate in creating arguments here…it’s just like the arguments against Fermi by AMD zealots, that “NVIDIA wasn’t focusing on graphics and that is bad”, yet now that AMD is going the “Fermi route” with GCN, the AMD zealots are crying “Way to go AMD, Computing is the future”. Really, really sad…

            • l33t-g4m3r
            • 8 years ago

            Wrong. Batman AA was possibly the most notorious game nvidia ever rigged, but it wasn’t the first. Speaking of which, the AA was just standard AA that an ATI owner could turn on if they told the game they had an nvidia card. There was no nvidia-specific code other than:
            if game detect nvidia then
            turn on AA
            else
            No AA and we laugh at you.
            end if
            If you fooled the game into thinking you had an nvidia card, the AA worked fine.
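
            For what it’s worth, here’s a rough, purely hypothetical C++/DXGI sketch of what that kind of vendor gate amounts to (not actual game code, just an illustration of the pattern being described; the function name is made up, and only the PCI vendor IDs, 0x10DE for nvidia and 0x1002 for ATI/AMD, are real):

            #include <dxgi.h>
            #pragma comment(lib, "dxgi.lib")

            // Hypothetical illustration: enable MSAA only if the primary adapter reports NVIDIA.
            bool AllowInGameMsaa()
            {
                IDXGIFactory* factory = nullptr;
                if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
                    return false;

                bool allow = false;
                IDXGIAdapter* adapter = nullptr;
                if (SUCCEEDED(factory->EnumAdapters(0, &adapter)))
                {
                    DXGI_ADAPTER_DESC desc = {};
                    adapter->GetDesc(&desc);
                    // Gate on vendor ID rather than on actual hardware capability.
                    allow = (desc.VendorId == 0x10DE);   // NVIDIA gets AA; ATI (0x1002) does not
                    adapter->Release();
                }
                factory->Release();
                return allow;
            }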

            You don’t get it about CUDA either, apparently, as there are several games out there that specifically use CUDA to do graphical effects, like Just Cause 2. This is the type of shenanigans nvidia constantly pulls. ATI doesn’t do that. Bullspit about innovation doesn’t fly here either, since ATI supports OpenCL. Keyword being: OPEN.

            Another recent game nvidia has stuck their dirty hands into is Crysis 2, in which they didn’t add any noticeable improvements with tessellation. None. Square boxes, wood planks, concrete slabs, all tessellated to infinity with no visual difference. Too bad ATI owners now have a tessellation limiter.

            No, I don’t care about the Source engine being “tailored for ATI cards”, since that’s bunk. Valve optimized for the DX standard, nothing more, and actually even added some nvidia-specific hacks because the FX series was too damn slow to run straight DX9.
            It’s plain historical fact that the FX series sucked. John Carmack even mentioned having problems with the FX series, and that aside from using nvidia-specific hacks, the cards were too slow and couldn’t perform well with normal shader code.

            Oh and I’m such an ATI fanboy that I’m using a 470. Yeah, I’m an ATI fanboy alright.
            The truth is that I just don’t like nvidia screwing us all over with their TWIMTBP program. You’re the fanboy.

            • Silus
            • 8 years ago

            You’re ridiculous. First, NVIDIA supports OpenCL too. Second, that’s not how AA is done. There are specific algorithms developed by both companies. It was not generic AA; just goes to show how much you know about this…
            Thirdly, and specifically about Crysis 2, you are hilarious. If NVIDIA hadn’t partnered with Crytek on Crysis 2, you wouldn’t even have DX11 in it (you know, the thing that people like you complained about when it shipped without it), much less tessellation. And no noticeable improvements? It’s the ONLY game that uses tessellation quite heavily. Plus, DX11 itself brings almost NO visual improvements over previous DX versions, except for tessellation, and again, Crysis 2 uses tessellation heavily.

            It’s because of programs like TWIMTBP and “Gaming Evolved” that PC games get any goodies at all in this age of console gaming. When you get that into your thick skull, maybe you’ll realize the amount of BS you’ve been babbling about.

            As for not caring about the Source engine, that’s yet another example of how biased you are. You don’t care about specific implementations when they give ATI an advantage, but are outraged when the advantage is NVIDIA’s. As I said, the typical fanboy double standard.

            I too have an AMD CPU, so that must mean I’m not a fanboy, right? Those “I have X product from Y company that I keep criticizing, which makes me a non-fanboy” arguments are getting old.

            • l33t-g4m3r
            • 8 years ago

            No spit, Sherlock. Nvidia does support OpenCL, but doesn’t promote its use vs. CUDA, because CUDA is a closed standard. Nvidia wants to have its cake and eat it too. Regardless, OpenCL support has nothing to do with nvidia sponsoring visual effects in CUDA, which is vendor lock-in, e.g. Just Cause 2.

            Here is a link to how to cheat Batman into working on an ATI card, and it works:
            [url<]http://www.tomshardware.com/forum/293944-33-enable-msaa-batman-arkham-asylum-cheating-nvidia[/url<]
            [url<]http://techgage.com/news/amd_vs_nvidia_anti-aliasing_in_batman_arkham_asylum/[/url<]

            "What got AMD seriously aggravated was the fact that the first step of this code is done on all AMD hardware: "'Amusingly', it turns out that the first step is done for all hardware (even ours) whether AA is enabled or not! So it turns out that NVidia's code for adding support for AA is running on our hardware all the time - even though we're not being allowed to run the resolve code!"

            Defending Crysis 2? You just lost all credibility with anyone who has two brain cells to rub together. TR did an expose on it; you should read it sometime. "Yep, this flat, hard-edged slab is one of the most highly tessellated objects in the scene. The polygons closest to the camera are comprised of just a few pixels each. Further from the camera, the wireframe mesh becomes a solid block of red; there, we're probably looking at more than one polygon per pixel. Those are some very small polygons indeed."

            Don't forget HAWX either. TWIMTBP helping gaming? Bull. They hindered both DX10.1 and DX11 from being put into games because nvidia didn't have Fermi ready. Nvidia even went so far as to have DX10.1 removed from Assassin's Creed, after it was already in the game.

            More Source engine lies? Boy, you don't know when to quit. The FX series was the only nvidia card that had trouble with the Source engine, and that was solely because it was the worst card ever made, period. That had nothing to do with ATI or Valve. Valve isn't composed of idiots; they didn't make their engine run slow on nvidia cards. You are a liar, and have not offered one iota of proof to the contrary.

            Specific implementations for ATI? You just stuck your fungus-infected foot in your toothless mouth. Name some! ATI sticks to the industry standard; nvidia is the only one using vendor lock-in effects. Perhaps ATI/AMD should have locked in DX11 and given you people a taste of your own medicine, but they didn't do that, because that's not how AMD conducts business.

            Edit: Your manager at the nvidia shill center should probably tell you to stop, because you're hurting "the cause" and exposing yourself as someone beyond rationality or reason, proving that you're either paid for your posts, or just plain bananas.

            • Fighterpilot
            • 8 years ago

            BF3 is a “Gaming Evolved” AMD title.

            • Silus
            • 8 years ago

            Link?

            • Silus
            • 8 years ago

            After some digging, DICE worked with both companies, so if anything it’s both a Gaming Evolved and a TWIMTBP game.

      • lilbuddhaman
      • 8 years ago

      Follow-up:
      [url<]http://twitter.com/#!/CatalystCreator/status/117343281641295872[/url<]
      [b<]Cat 11.9 and special BF3 drivers next week.[/b<]
      Thanks to a thread on Rage3D for pointing it out.

    • kamikaziechameleon
    • 8 years ago

    so when do I get in there???
