GeForce Experience now in open beta

Last month, we told you about Nvidia’s GeForce Experience, a new software package designed to configure in-game graphics settings automatically based on your system’s hardware. The software was in closed beta when we wrote about it, but Nvidia has now opened the doors to the general public. You can download the GeForce Experience beta right here.

Now, I know what you’re thinking: “Why use separate software when most games have auto-configure routines built in?” Because in-game optimizers don’t always recognize newer graphics cards. They also tend to be relatively conservative when it comes to turning on eye candy, which means manual tweaking is usually required to get the best experience. Fiddling with graphics settings might be second nature to seasoned enthusiasts, but the list of graphics options presented by most games can be a little daunting for the uninitiated.

GeForce Experience is fueled by a combination of automated and manual testing conducted in Nvidia’s labs, and it weighs both the visual impact and performance hit when deciding how to modify each setting. The nature of the game is taken into account, too; twitchy shooters have higher FPS targets than slower-paced strategy games. You’ll need a Fermi- or Kepler-based GeForce to take advantage of the optimization mojo. The number of supported games is still somewhat limited, but it’s grown since December and includes recent releases like Far Cry 3, Medal of Honor: Warfighter, and Hitman Absolution.
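
Nvidia hasn’t published how its optimizer actually scores settings, but the idea described above — weighing each option’s visual payoff against its frame-rate cost under a genre-dependent FPS target — can be sketched roughly like this. All of the setting names, scores, and targets below are made-up illustrations, not Nvidia’s data:

```python
# Rough sketch of target-driven settings optimization.
# All measurements here are invented for illustration.

# Hypothetical per-setting lab data: visual benefit vs. FPS cost.
SETTINGS = [
    # (name, level, visual_benefit, fps_cost)
    ("textures",          "high", 8, 6),
    ("shadows",           "high", 5, 9),
    ("antialiasing",      "4x",   6, 7),
    ("ambient_occlusion", "on",   4, 8),
]

# Twitchy shooters get a higher frame-rate target than strategy games.
FPS_TARGETS = {"shooter": 60, "strategy": 30}

def optimize(baseline_fps, genre):
    """Greedily enable the settings with the best visual-benefit-per-FPS
    ratio while keeping the estimated frame rate above the genre target."""
    budget = baseline_fps - FPS_TARGETS[genre]  # FPS headroom we can spend
    chosen = []
    for name, level, benefit, cost in sorted(
            SETTINGS, key=lambda s: s[2] / s[3], reverse=True):
        if cost <= budget:
            chosen.append((name, level))
            budget -= cost
    return chosen

# The same card gets fewer goodies in a shooter than in a strategy game:
print(optimize(80, "shooter"))
print(optimize(80, "strategy"))
```

The greedy benefit-per-cost ordering is just one plausible heuristic; the real service presumably bakes its per-game lab measurements into precomputed profiles rather than solving anything at runtime.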

Comments closed
    • TaBoVilla
    • 7 years ago

    consolitis has hit =(

    • Bensam123
    • 7 years ago

    Valve where are you on this one?!?!

    • DeadOfKnight
    • 7 years ago

    Well I found one thing it is good for: making sure you have all your settings maxed out. This brought to my attention that I had object fading checked on for Skyrim. It can also give a good indication which settings can be the most taxing on your system, even if you don’t choose their “optimized” settings. Another thing, it turned on AA for Starcraft 2 and that wasn’t even an option in the game.

    • vargis14
    • 7 years ago

    Sure, I’ll admit it: I benchmark my 1025MHz-core, 2400MHz-memory SLI GTX 560 Tis with my 2600K at 4.9-5GHz just to see how big my e-peenie is in 3DMark 11. Say what you will, but 3DMark 11 has such a large testing pool that it really tells you whether your setup is performing up to par, since you can compare against a thousand-plus other 560 Ti SLI setups. You can even select which CPU you want to compare with.

    I’m happy with 10,500+ points, which puts me in the top 50 when measuring my e-peenie at max overclock.

    My 24/7 clocks of 975MHz core, 2400MHz memory, and a 4700MHz 2600K keep me within the top 25%, so I know everything is working up to par, since there is such a large reference base to fall back on.

    3DMark 11 is good for checking whether everything is working up to snuff, regardless of what video card or CPU you are using.

      • travbrad
      • 7 years ago

      [quote<]3dmark 11 is good to see if everything is working up to snuff regardless of what videocard / CPU you are using.[/quote<]

      You know what else works great for that? Games. They have the added bonus of being fun and enjoyable too.

    • thesmileman
    • 7 years ago

    This concept is neat and I have been using it for a while, but there are two problems:
    1. It doesn’t let you state preferences like “I want setting X at Y; you decide the rest,” or define what “optimized” means to you: 60fps, 30fps, the best graphics at 30+, never dipping below 60, or antialiasing even if other settings have to be scaled down. Just some basic choices would be nice; otherwise you have no idea what they mean by “optimize.”
    2. It just doesn’t work consistently. Sometimes its settings give me 100fps; in the next game, barely 30, even within the same type of game. That’s on my GTX 680, but on another system with a GTX 480 it sets everything really, really low. For Borderlands 2, it set textures to low and turned off anti-aliasing. A 480 is a pretty common video card, so they should have enough data points to know that’s way too low for that card.

    A third thing: I wish you could optimize and then change just one setting, but it’s an all-or-nothing sort of thing. For example, if you want to turn off vsync, you can’t; you have to revert to your old settings and give up on the system entirely.

    • Majiir Paktu
    • 7 years ago

    My algorithm for determining video game settings:

    1. Buy half-decent graphics card. (In the case of my desktop, it’s a very decent card, but this seems to work with mid-range cards, too.)
    2. Turn everything up as high as possible.
    3. Disable stuff like LOD switching that would compromise image quality.
    4. Enjoy smooth visuals, courtesy of high-end graphics hardware and the console-inspired stagnation of games.

    • Shark
    • 7 years ago

    This sounds like the wrong solution to the problem 🙁

      • bfar
      • 7 years ago

      I’ve been using it in closed beta. It actually does what it does very well. You can still see and adjust all of the game’s advanced settings within the app to your heart’s content. Think of this as an improved version of the per-title 3D settings in the driver UI, except it gives recommendations.

      My only criticism: the list of supported games is a little short so far. I’d like to see some titles from the last two or three years included.

    • tanker27
    • 7 years ago

    I missed the first article on this, but I went back and read it.

    It seems interesting. I am willing to try it out and see how it works out.

    • Stargazer
    • 7 years ago

    I like this initiative. I’m convinced that a large number of players are running with decidedly sub-optimal (default) settings, and there’s plenty of room for improvement there.

    Heck, many games default to some ridiculously low resolution, and there are huge gains to be made just by changing the resolution to match your monitor. If this initiative works out, lots of users could end up seeing a significant improvement in image quality.

    • Firestarter
    • 7 years ago

    [quote<][b<]Q: What is the target frame rate for determining optimal settings?[/b<] A: GeForce Experience targets 40 FPS (average) for its optimal settings. Note that graphics intensity can vary from scene to scene and we try to select intensive scenes for optimizing settings, so you may see averages above 40 FPS.[/quote<]

    Well, I was about to comment on how 40 FPS sounds like a pretty low average, but if they actually succeed in finding the particularly slow scenes in games, 40 FPS during those scenes probably means 50 FPS and up during normal play. I think this is a good initiative from Nvidia, especially for the gamers among us who care more for gaming than benchmarking 🙂

      • James296
      • 7 years ago

      Where FPS is concerned, it’s more a matter of personal taste. For me, I usually target between 30-50, with the exception being twitch shooters (like Battlefield, COD, etc.), which get bumped up to between 50-70.

      Note: I’ll humbly admit that I’m not above e-pe- I mean bench-marking, especially when I get a new, higher spec, graphics card (but who doesn’t :P)

      • ChangWang
      • 7 years ago

      It would be nice if they had a toggle where you could change that 40 FPS target to 60, even if it means lowering image quality by a substantial amount.

        • Firestarter
        • 7 years ago

        That would double the required amount of testing and feedback though. I think the 40 FPS compromise is a good one, provided that they actually manage to find the most taxing parts of the games.

    • Great_Big_Abyss
    • 7 years ago

    Is it funny that the games listed in this Nvidia-focused article are all Gaming Evolved titles?

      • HisDivineOrder
      • 7 years ago

      One would hope the titles they’ve personally helped optimize wouldn’t need as much help as those optimized for AMD’s cards, right?
