How to enable FreeSync on the GeForce 417.71 drivers

I probably don't need to tell you that today's the day for the GeForce RTX 2060's release to retail. New graphics card means new drivers, and to that end, Nvidia just unleashed GeForce driver version 417.71 fresh off the compiler. Predictably, this driver adds support for the new upper-midrange Turing cards, but perhaps the more exciting news is that this is the first driver to enable any sort of support for FreeSync and VESA Adaptive Sync monitors.

We knew this was coming—arguably even before Nvidia announced it just ahead of CES—but now that it's here, lots of GeForce users can finally get around to seeing what their monitors can really do. Nvidia refers to said support as "G-Sync Compatible" mode, and if you're lucky enough to own one of the 12 monitors that Nvidia has marked as "compatible," then the feature should enable automatically after you install the driver. Otherwise, you'll have to enable it manually. Don't be confused by the 3D settings panel noting that the driver detected your monitor as G-Sync Compatible; you'll probably still have to turn it on yourself.

Make sure this setting is toggled on.

To do so, first make sure you have FreeSync or Adaptive Sync enabled in your monitor's OSD if it has such an option. Then, open up the Nvidia Control Panel and go to the "Set Up G-Sync" tab. Once there, make sure you have the correct display selected—only one display can use Adaptive Sync at any given time—and then tick the "Enable G-Sync, G-Sync Compatible" box. Nvidia notes that if the box isn't available, you may need to go to "Manage 3D Settings," scroll down to "Monitor Technology," and select "G-Sync Compatible" in the drop-down. You may also need to have your display set to a high refresh rate mode before it presents the FreeSync option to the graphics card.

The nominal refresh rate of Bruno's Acer XF270HU is actually 144 Hz.

Variable refresh rates aside, a long-standing issue where some DisplayPort monitors would remain black when resuming from sleep should be resolved in this driver. GeForce GTX 1080 cards should stop dropping to their idle clock rate when you hook up three monitors. BenQ XL2730 monitors should work at 144 Hz again. Shadow of the Tomb Raider should stop crashing in DirectX 12 mode, and Gu Jian Qi Tan 3 should work on GeForce GTX 1060-equipped notebooks.

Persistent driver niggles include random flickering on G-Sync screens when a non-G-Sync monitor is connected over HDMI, brief corruption when hovering over links in Firefox, and possible blue-screen crashes in ARK: Survival Evolved. There are also a couple of HDR growing pains. In Ni no Kuni 2, enabling HDR will cause the application to crash on launch. Meanwhile, Shadow of the Tomb Raider may suffer flickering on systems with SLI, HDR, and G-Sync all enabled.

Folks who use GeForce Experience are probably already downloading the new driver whether they realize it or not. Everyone else can trek on over to the Geforce.com download site to grab the latest version. The PDF release notes are here, and for your convenience, here's the Windows 10 64-bit edition.

Comments closed
    • Chrispy_
    • 9 months ago

    So when is Nvidia going to support Freesync over HDMI?

    As pleased as I was about Nvidia supporting the VESA VRR standard, they’ve still managed to ruin it by supporting Displayport only.

    I can count the number of televisions with Displayport and Freesync on the fingers of no hands.

      • Ryu Connor
      • 9 months ago

      NVIDIA can’t, HDMI FreeSync is a proprietary AMD technology.

      At some point NVIDIA might support HDMI 2.1 VRR, which is not FreeSync.

    • SoM
    • 9 months ago

    Acer XG270HU here, already running at 143 Hz. Do I need this? EVGA 1070 FTW.

    Installed anyway; it was enabled in G-Sync mode but at 120 Hz, and I had to manually change back to 144 Hz.

    The only game I can test is Path of Exile, and I know it's not an FPS, but I got a big fps boost in a game that has fps issues at times. I did notice that scrolling a page up/down seems a little jerky, but I also had a Win10 update coincide with the graphics update.

    Also, there is no option in the Acer OSD to enable it, other than making sure the DP format in your monitor's settings is set to "1.2". FreeSync will then automatically be enabled if set up correctly on your PC.

    • DoomGuy64
    • 9 months ago

    I’d just like to point out that Nvidia is claiming on their site that Variable Overdrive is not supported with FreeSync, which is blatantly false. The monitors directly support Variable Overdrive; only ONE monitor that I know of did not, and even that was questionable, since it was a firmware issue that got updated. That was the first FreeSync panel to ever exist, and all the following panels supported it. Classic Nvidia, still at it.

    Also, LFC on AMD is supported in software. From what I’ve heard reported about Nvidia’s certification process, they want the monitor to handle LFC, which is why so many monitors are not certified. I’m actually surprised about that, because I didn’t think that was a thing. So these certified monitors either directly support LFC now, or their range is so wide that they don’t need it. AMD also does HDR processing in software, which, if Nvidia doesn’t support it, also means HDR panels will have to handle everything on board. Neither feature in hardware is as versatile as handling it in software, so maybe if monitors start supporting everything on board, they will also allow USB firmware flashing for updates. Either way, FreeSync and G-Sync Compatible are going to perform differently on the same panel. Lesser FreeSync monitors will require V-sync to stop tearing and glitches outside of the VRR range.
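
The LFC behavior described in this comment can be sketched in a few lines. This is an illustrative model only, not AMD's or Nvidia's actual algorithm: when frames arrive more slowly than the panel's minimum refresh rate, the source repeats each frame enough times that every scanout lands back inside the VRR window.

```python
def lfc_refresh_interval(frame_ms: float, vrr_min_hz: float, vrr_max_hz: float) -> float:
    """Interval (ms) at which the panel is actually driven for one frame.

    Inside the VRR window, scan out once per frame. Below the window's
    minimum rate, repeat the frame n times so each scanout fits the
    window (the essence of LFC). Real drivers also smooth the
    transition to hide rate jumps; this sketch ignores that.
    """
    min_interval = 1000.0 / vrr_max_hz   # fastest the panel can scan out
    max_interval = 1000.0 / vrr_min_hz   # slowest before it must refresh anyway

    if min_interval <= frame_ms <= max_interval:
        return frame_ms                  # in range: one scanout per frame
    if frame_ms < min_interval:
        return min_interval              # frame finished too fast: wait for the panel

    # Too slow: find the smallest repeat count that fits the window.
    n = 2
    while frame_ms / n > max_interval:
        n += 1
    if frame_ms / n < min_interval:
        return max_interval              # window too narrow for LFC; no help possible
    return frame_ms / n
```

This also shows why the range matters: frame repetition only works when the window's maximum is at least roughly twice its minimum (48-144 Hz qualifies, 48-75 Hz does not), which matches the comment's guess that certified monitors either handle LFC or have a range wide enough not to need it.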

      • psuedonymous
      • 9 months ago

      Variable Overdrive is a panel controller feature (i.e. the physical device commanding the row and column drivers). The Freesync ‘implementation’ is a frame-level hack that sends modified pixel values to the monitor and tries to compensate for the hardware’s lack of variable driving voltage*. It works to an extent, but as well as needing to be tuned for each monitor (i.e. it is not something the ‘monitor’ supports, but something the GPU driver needs to implement and the driver vendor needs to tune) it only provides frame-level granularity to pixel driving, and suffers from clamping at the extremes of range (full-black and full-white).

      * e.g. Frame 1 had a pixel value of 100, Frame 2 arriving 10ms later (instantaneous framerate 100FPS) should have a value of 200. For the G-sync implementation, the GPU sends the next frame with a pixel value of 200, and the panel controller, having buffered the last frame and using a frame interval timer and overdrive LUT, drives the pixel to hit 200. For Freesync, the GPU instead knows the fixed overdrive level in the monitor (intended for 60Hz, or a 16.6ms frame interval) would only hit 175 if it sent a value of 200, so instead sends a value of 255 knowing from calibration that the actual output value would then hit above 190 as the monitor attempts (though fails) to drive at 255.
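
The footnote's arithmetic translates directly into code. Everything here is invented for illustration (the 70%-of-swing panel model, the specific numbers); the point is just the inverse mapping: the GPU picks the value to *send* whose calibrated landing point is closest to the value it actually wants.

```python
def precompensate(prev: int, target: int, measured_response) -> int:
    """Pick the 8-bit value to send so the pixel lands near `target`.

    `measured_response(prev, sent)` stands in for per-monitor calibration
    data: the level the panel actually reaches one short frame after
    `sent` is requested, given its fixed (60 Hz-tuned) overdrive. A real
    driver would interpolate a LUT; an exhaustive search keeps it tiny.
    """
    best_sent, best_err = target, float("inf")
    for sent in range(256):
        err = abs(measured_response(prev, sent) - target)
        if err < best_err:
            best_sent, best_err = sent, err
    return best_sent

# Toy panel: in one short frame the pixel covers only 70% of the
# requested swing (undershoot), whatever value is requested.
panel = lambda prev, sent: prev + 0.7 * (sent - prev)
```

For the footnote's 100 → 200 transition, sending 200 would only reach 170, so the search lands on 243 instead; for a full-white target the search tops out at 255 and simply can't get there in time, which is the clamping at the extremes of the range described above.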

        • DoomGuy64
        • 9 months ago

        lol, so that’s the game they’re playing then. They’re playing semantics by calling FreeSync overdrive a “hack” and the G-Sync implementation “variable overdrive”, confusing everyone into thinking FreeSync doesn’t support *any* form of overdrive.

        That said, it sounds like Freesync overdrive requires some level of driver support to work, which if true, also means Nvidia users are potentially not getting proper overdrive. Lovely.

        Yet another reason why only 12 panels are “certified” by Nvidia. The only positive of what they are doing is that this may force manufacturers to improve adaptive functionality to work without driver support. Maybe.

          • psuedonymous
          • 9 months ago

          No, [b<]I[/b<] called it a hack, because it is, in the traditional usage of the word hack.

            • DoomGuy64
            • 9 months ago

            “hack” or not, doesn’t matter because it works. Neither you nor Nvidia have any business claiming that the feature doesn’t exist. Nvidia just wants to not put work into their driver, and spread FUD about AMD.

            Of course, none of these conspiracies were even being debated until after Nvidia started supporting adaptive. Which is finally spreading light on all the lies that have gone unchecked for years.

            Hell, this isn’t something that’s going to be easily cleared up either. What it’s going to take is TIME, as more people buy adaptive panels and say, “Nvidia lied about this” from their personal experience. None of these conspiracies are going to hold an ounce of weight in a year’s time, if that, and at that time everyone still trying to spread FUD will be recognized for the shills that they are.

    • cynan
    • 9 months ago

    Sigh… Now how am I supposed to rationalize to myself that I can’t afford to upgrade from my static 60Hz monitor now that the G-SYNC tax is optional?

      • JustAnEngineer
      • 9 months ago

      Because NVidia is demanding $1400 for a graphics card suitable for use with that lovely 3840×2160 120Hz monitor that you’d like to buy?

        • psuedonymous
        • 9 months ago

        It’s an interesting value proposition from AMD: simply do not produce a competing high-end card in the first place, and buyers save [b<]100%[/b<] of the RRP!

    • ALiLPinkMonster
    • 9 months ago

    CRAP. I was so excited to see if I can finally utilize my Freesync monitor again, only to realize my DP cable is MIA. Probably got lost during the move last year, and I’m not able to get one right now. :'(

      • JustAnEngineer
      • 9 months ago

      3-foot DisplayPort 1.4 cable: [url=https://www.monoprice.com/product?c_id=102&cp_id=10246&cs_id=1024601&p_id=31180&seq=1&format=2<]$16½[/url<]

      6-foot DisplayPort 1.4 cable: [url=https://www.amazon.com/dp/B079736Z53/<]$13[/url<]

        • ALiLPinkMonster
        • 9 months ago

        I know how much they are. I’m flat broke right now.

        • ALiLPinkMonster
        • 9 months ago

        Okay I finally had some money deposited today so I went and got one. Viewsonic VX2757-mhd confirmed to work with my GTX 1050 Ti. Gaming is a LOT smoother.

    • ronch
    • 9 months ago

    Just like Intel, Nvidia is too stubborn to call any AMD-created technology by its name. Remember EM64T instead of AMD64?

      • derFunkenstein
      • 9 months ago

      Two things:

      * Who cares?
      * FreeSync is one of AMD’s [url=https://www.amd.com/en/corporate/trademarks<]trademarks[/url<] so of course Nvidia isn't using it.

        • f0d
        • 9 months ago

        holy crap, i looked at that list and AMD has trademarked “freedom”!

        • ronch
        • 9 months ago

        Who cares? Their stubbornness isn’t only about adopting names from other companies, but also about supporting those standards. Take FreeSync here, for example: how long did users have to wait for the feature this article talks about? And Intel? Yes, they adopted AMD64, but ONLY BECAUSE Microsoft forced their hand, and even then they made slight changes to it just to justify the name change. Let’s not forget their outright refusal to support any other instruction set extension created by AMD. I’m also reminded (although this isn’t quite the same topic… I’m just being reminded of it, to be clear) of how they flipped the cards with FMA, announcing support for FMA4 then abruptly going for FMA3, which kinda took AMD off guard.

        See, AMD is about open standards. Intel and Nvidia want to keep things for themselves whenever they can.

          • derFunkenstein
          • 9 months ago

          You want Nvidia to infringe on AMD’s trademarks? Or pay money to license them? You can’t be serious.

            • ronch
            • 9 months ago

            What about AMD using the MMX trademark, for example? Or putting the Windows logo on their CPUs many years ago? At least they weren’t too stubborn like Intel and Nvidia are. Like I said, it’s not just using tradenames, it’s supporting open standards.

            Double standards much?

            And if you wanna wait for Intel and Nvidia to support other standards not created by them like this article talks about, sure go ahead.

            • f0d
            • 9 months ago

            The thing is, “FreeSync” isn’t the standard – it’s the name AMD gave to VESA’s Adaptive Sync.

            • K-L-Waster
            • 9 months ago

            Or, to be more precise, the name AMD gave to the tech they worked out and then submitted to VESA as a standard, which was subsequently adopted…

            But the point still stands. The VESA standard name is Adaptive Sync. Freesync is AMD’s marketing name.

            • f0d
            • 9 months ago

            iirc eDP variable sync was vesa’s work and amd worked with them to bring it to displayport for adaptive sync so they could use it for freesync

            edit: found this
            [url<]https://techreport.com/news/26451/adaptive-sync-added-to-displayport-spec[/url<]

            • K-L-Waster
            • 9 months ago

            Which in turn was based on this (link from the article you linked):

            [url<]https://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech[/url<]

            • ludi
            • 9 months ago

            Those aren’t examples of open standards, those are a mix of patents and trademarks that AMD had to license in order to use and/or co-brand.

          • K-L-Waster
          • 9 months ago

          Someone’s flying their fanboy flag today…

            • DancinJack
            • 9 months ago

            When doesn’t he?

            I generally don’t mind because ronch isn’t a jackwagon, but ronch’s fanboi-ness is on display on the reg.

            • ronch
            • 9 months ago

            Fanboi or not, I’m talking about companies supporting each other’s standards instead of making their own proprietary standards and locking out those that aren’t theirs. Don’t you agree it’s good for us? As for being a fanboi, this has nothing to do with me being pro-AMD. I mouth off against AMD if they do something stupid, so don’t count me as one of those fanbois who blindly follow wherever AMD goes.

            • K-L-Waster
            • 9 months ago

            [quote<]I'm talking about companies supporting each other's standards instead of making their own proprietary standards and locking out those that aren't theirs.[/quote<] Except you're *not* talking about that: you're talking about which brand name is used in the spiel. Nvidia is now semi-supporting the VESA standard, and you're complaining that they're dissing AMD's name for the tech. Sorry, it just comes across as sour grapes. At least DoomGuy64 is sticking his attacks to how well / poorly they've done the technical implementation.

            • ronch
            • 9 months ago

            Like I said, this isn’t just about FreeSync. AMD adopts technologies from other companies (like MMX, for example), while companies like Intel just diss whatever AMD puts out and even rename what they adopt from AMD to make it look like they created it. Take EM64T, for example. This isn’t about being a fanboi; this is about what companies are actually doing to create open standards that benefit everyone. Intel even attempted to lock out AMD when they created Itanium, because they knew that by ditching x86 they could also ditch AMD and the agreement that lets them make x86 CPUs. If they had it their way, I bet it wouldn’t be too rosy for us.

            But then, hey, if you guys wanna support and defend companies like Intel and Nvidia, companies that want to ‘proprietize’ everything, well.

            • Redocbew
            • 9 months ago

            Strong in the RDF, this one is.

          • YellaChicken
          • 9 months ago

          Nvidia now supports FreeSync displays. It’s something gamers have wanted for a long time, and a lot of people have been begging for it.

          Who cares what it’s called? Nvidia bowed to the pressure, it’s a good thing, celebrate.

            • derFunkenstein
            • 9 months ago

            That would require not moving the goalposts every time something happens.

      • Krogoth
      • 9 months ago

      It should just be called Adaptive Sync. They are both using VESA’s Adaptive Sync spec; the difference is implementation and testing. Nvidia is going to have to play catch-up, but it shouldn’t take them long.

        • Voldenuit
        • 9 months ago

        Nvidia is probably charging monitor makers a *ahem* ‘certification fee’ to be able to advertise their monitors as ‘GSync Compatible’, since GSync is a trademark.

          • K-L-Waster
          • 9 months ago

          If that was the limiting factor you would expect all of the Freesync monitors from the same manufacturer to be certified. (i.e. why would they certify some Acer monitors but not others if it’s solely a financial deal?)

          • jihadjoe
          • 9 months ago

          THX yo!

        • psuedonymous
        • 9 months ago

        They used “A-Sync” during the unveil presentation for monitors that are outside the G-sync Compatible range, though that doesn’t appear to have propagated elsewhere.

    • Forge
    • 9 months ago

    I just played several hours of Destiny 2 on my Acer KG240 and 1080 Ti. Before, I would need to drop the settings a few presets to guarantee 144Hz+, or run full pretty and deal with some dips as low as 50fps, with massive judder and hung frames. With mock-G-Sync/FreeSync-redux, that’s completely and totally gone; I dropped as low as 70fps and felt NOTHING. Perfectly smooth throughout. Shame on you, Nvidia, for keeping this option locked away for political/ideological/marketing reasons!

      • Krogoth
      • 9 months ago

      ^^^^^^ This

      Adaptive syncing is arguably one of the best things for gaming graphics since pixel/vertex shading.

      • cynan
      • 9 months ago

      Shame on you Nvidia! Shame on you and THANK YOU!!

    • JosiahBradley
    • 9 months ago

    [s<]Does not work on Asus MG279Q. Locks monitor to 30Hz.[/s<] MASSIVE EDIT: I got it working! Had to use CRU to add a freesync range to the EDID of 60-144 and edit the master range to 60-144 as well.

      • DoomGuy64
      • 9 months ago

      I have one of those, and even AMD had problems on their Fury cards. Apparently there is a weird firmware bug that works fine with first-gen cards, and even fine with Vega, but not at all well with Fury. You either RMA the monitor or use a compatible video card. No surprise that Nvidia has issues with it. It might be fixable in the driver, but that would require Nvidia writing a patch for it, which is unlikely, especially since AMD never fixed it on Fury afaik.

        • JosiahBradley
        • 9 months ago

        It worked perfectly on my 290Xs though so what’s the deal?

          • DoomGuy64
          • 9 months ago

          Firmware. Multiple issues; the early one was fixed, and the later one requires a firmware update or a card unaffected by the bug.

          [url<]https://www.overclock3d.net/news/gpu_displays/asus_mg279q_firmware_issues_and_delays/1[/url<]
          [url<]https://community.amd.com/thread/204848[/url<]
          [url<]https://community.amd.com/thread/205764[/url<]
          [url<]https://rog.asus.com/forum/showthread.php?87817-Strange-line-down-the-middle-of-my-MG279Q-alot-of-R9-Fury-users-have-this-problem&p=609355#post609355[/url<]
          [url<]https://rog.asus.com/forum/showthread.php?87380-Firmware-Update-for-ROG-SWIFT-PG279Q[/url<]

          I had a 390, upgraded to Fury, experienced the problem, and sold the Fury. Upgraded to Vega, no problem. I don't know if Asus will honor an RMA at this point, but it's worth a try? That or buy a Vega; it's the only way to get it working. The "vertical line bug" also affected refresh rate, as CRU didn't work right, and 144 Hz was also problematic. Default settings would force a "safe" refresh rate, which was usually the 90 Hz maximum. Most likely Nvidia is experiencing the same bug and forcing a "safe" refresh rate, simply because the controller doesn't know how to handle the buggy firmware. Hawaii works best with this monitor, but Polaris and Vega also work.

      • Krogoth
      • 9 months ago

      It is a software bug. Adaptive sync didn’t work at first for my MG278Q and Vega 64 with the earlier builds of Windows 10. The problem was the default “Generic PnP Monitor” driver profile. I had to use a third-party “MG278Q monitor” driver profile to get adaptive sync working. It appears that it was fixed with the Spring 2018 update.

      I suspect it is a similar issue where either the OS and/or Nvidia drivers aren’t playing nice.

      • ptsant
      • 9 months ago

      I also have the MG279Q and am very interested.

      Do you mean you actually unlocked the FreeSync range to 60-144? I was under the impression that the scaler could not handle rates over 90Hz reliably. Are there any visual artefacts? Maybe point to a resource that explains how to do this?

      Thanks

        • DoomGuy64
        • 9 months ago

        Yes, that’s how FreeSync works, since it is mostly software-controlled. You can use the utility [url=https://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU<]CRU[/url<] to edit your FreeSync range to what you want. The MG279 supports a limited window, but you can raise it from 35-90 to 60-144. Editing the range larger than the monitor supports will not work, so 35-144 is not possible. There are no visual artifacts unless you are running a video card like Fury that has issues with older firmware. Apparently most newer video cards are not affected by the bug, and the newer panels should be fine, so it is unlikely. That said, it is quite likely Nvidia does not support adaptive LFC in software, so you would have to edit your ranges to something that doesn’t need LFC. Also, Josiah mentioned something about a “master range”, which sounds like the values also have to be edited in Nvidia’s control panel.
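
For reference, the range CRU edits lives in the EDID's Display Range Limits descriptor (tag 0xFD), which is where a FreeSync-over-DisplayPort source picks up its VRR window. A minimal sketch of pulling it out of a 128-byte base EDID block; vendor extension blocks, which can carry their own range data, are ignored here.

```python
def vrr_range_from_edid(edid: bytes):
    """Return (min_hz, max_hz) vertical rates from the Display Range
    Limits descriptor, or None if the base block doesn't carry one."""
    # Four 18-byte descriptors sit at fixed offsets in the base block.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # A range-limits descriptor starts 00 00 00 FD.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            min_v, max_v = d[5], d[6]
            # Byte 4 flag bits add a +255 Hz offset for fast panels.
            if d[4] & 0x02:
                max_v += 255
                if d[4] & 0x01:
                    min_v += 255
            return (min_v, max_v)
    return None
```

This is exactly the field a CRU range edit rewrites, which is why changing it to 60-144 changes what the GPU is willing to drive.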

        • JosiahBradley
        • 9 months ago

        [url<]https://nils.schimmelmann.us/post/133778060542/modded-asus-mg279q-drivers-with-60-144-hz-freesync[/url<]

        Look at the CRU sections. This works, and I was getting artifacting in the lower ranges, not the high-end ones.

      • chuckula
      • 9 months ago

      Can we take back our upthumbs now that it works?

        • derFunkenstein
        • 9 months ago

        I retroactively declare that my upthumbs mean “glad you got it working”

    • dklingen
    • 9 months ago

    Just an FYI that ASUS XG32VQ auto enabled the G-Sync Compatible box after driver install – so cool. Let’s see if I can tell a difference…

      • RAGEPRO
      • 9 months ago

      Your monitor isn’t on the list, so you may want to check the settings just to be 100% sure. Bruno thought his auto-enabled too, but it actually wasn’t—the driver detected his monitor as “G-Sync Compatible” but didn’t actually enable it.

    • Hsldn
    • 9 months ago

    With a G-Sync Compatible monitor, how is the experience compared to running it with FreeSync on an AMD GPU?

      • JosiahBradley
      • 9 months ago

      It was better on AMD as my monitor actually had adaptive sync, now it doesn’t.

      • RAGEPRO
      • 9 months ago

      It really just depends. For some displays it will be worse (see: JosiahBradley’s MG279Q which apparently doesn’t work), for other displays it will be better (see: my ROG XG27VQ that is very unreliable on my RX 580, but works fine on another machine with a GTX 1070).

      I suspect for most displays it will be functionally equivalent.

    • albundy
    • 9 months ago

    anyone try this on a 65″ or larger tv with freesync?

      • JosiahBradley
      • 9 months ago

      This only works over DP, so no TVs unless they have DP.

        • albundy
        • 9 months ago

        so…hdmi to DP active adapter wont do it?

    • adampk17
    • 9 months ago

    I have both a true GSYNC display and a Freesync/Gsync Compatible display connected to my 1080ti.

    It seems to me that I don’t have the G-Sync Compatible option in the Monitor Technology setting in the Nvidia control panel.

    Some might ask why would you need to and, honestly, I probably don’t need to. I just want to know if I’m running in to a limitation of the software or doing something wrong.

      • Waco
      • 9 months ago

      From the release notes:

      Only single displays are currently supported; multiple monitors can be connected
      but no more than one display should have G-SYNC enabled.

      • Voldenuit
      • 9 months ago

      Can you let us know if Freesync works if you disable GSync on your main monitor or disconnect it? Also, did you turn on Freesync in the OSD?

      I’m curious to know if nvidia’s implementation:
      a. only lets you have 1 VRR standard active at a time
      and/or
      b. only lets you have 1 Freesync-enabled monitor at a time

    • oldog
    • 9 months ago

    Is this a DP only thing, or does it support HDMI 2.1?

      • DancinJack
      • 9 months ago

      2.1 natively supports VRR, so that’s kinda moot? It (this Gsync compatible or whatever program) won’t cover Freesync over HDMI (2.0x and below).

    • Litzner
    • 9 months ago

    I know they limited G-Sync compatibility to 10XX+ cards, but I am curious whether this can also be pushed to 9XX cards, similar to the way you could force FastSync on older cards. Were the 9XX-series cards on DisplayPort 1.2a+?

      • Voldenuit
      • 9 months ago

      Most of nvidia’s 900 cards shipped with DP1.2/1.2a. However, the first generation Maxwell cards had issues with DP1.3 and 1.4 monitors, and there was an updated BIOS nvidia released which was supposed to address this.

      Some AIBs released Geforce 970 and 980 cards with DP1.2 (at least according to their tech specs).

      It sounds like 9xx cards have a potential alphabet soup of DisplayPort specs and various vBIOS versions, and it was probably safer for nvidia not to officially support FreeSync on cards that might not have the physical connector or appropriate BIOS. Perhaps a registry hack might let users play around at their own risk.

    • firewired
    • 9 months ago

    It appears to be working so far with my AOC AGON AG271QX and EVGA GTX 1080 Ti SC2.

    After driver installation the NVIDIA control panel indicated the panel as Not Validated as G-SYNC Compatible. Once I enabled it manually the panel did a quick reset and the driver enabled it with no further action needed.

    None of the symptoms NVIDIA reports as problems have presented themselves yet. I have not done any advanced testing; that will take some time. So far I have not detected any differences in the in-game experience before and after enabling it.

    My suspicion is that those with 1080/1070/1060-class cards running on slower CPUs might benefit more from this than I would. With my 1080 Ti being driven by an 8700K @ 4.8GHz all-core overclock, my frame rates are often running beyond my monitor’s 144Hz refresh rate.

    I expect any Adaptive Sync benefits will only reveal themselves when I run into more widely varying frame time delivery situations.

    Regardless, I am pleased NVIDIA has come to their senses by finally supporting this. It is long overdue.

      • morphine
      • 9 months ago

      I tested AssCreed Origins and Destiny 2 to provide some info for this article. That was my first hands-on experience with VRR, only some 10 minutes, and from that little experiment, I can describe two things:

      1) A lower framerate with adaptive-sync on is as good or better than a higher one with standard v-sync. I’d eyeball it as 80-90 FPS with adaptive-sync being as nice as 120 to 144 Hz, though I’d need more time to assess this better. Point is, with adaptive sync on, motion at lower-than-ideal framerates looks waaaay better than you’d think.

      2) Adaptive sync helps at any frame rate, but predictably, it has its best effect when the frame rate shifts. You completely avoid the visual “judder” that you get when the frame rate drops and you get v-sync quantization (or tearing if v-sync is off entirely). Animation continues to look smooth even though it visually doesn’t have as many frames, and it’s way easier to track objects with your eyes, particularly but not only in fast-paced games.
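
The judder being described falls out of simple arithmetic. A toy model, not a measurement of any real driver: with fixed-refresh v-sync a finished frame waits for the next refresh tick, so on-screen intervals are quantized to multiples of the refresh period, while VRR scans out when the frame is ready.

```python
import math

def display_interval_ms(render_ms: float, refresh_hz: float, vrr: bool) -> float:
    """Interval at which a finished frame actually appears on screen."""
    period = 1000.0 / refresh_hz
    if vrr:
        # Panel scans out when the frame is ready (can't beat its max rate).
        return max(render_ms, period)
    # V-sync: wait for the next refresh boundary.
    return max(math.ceil(render_ms / period), 1) * period

# An 8 ms frame on a 144 Hz panel: v-sync snaps it to two refresh
# periods (~13.9 ms, an effective 72 FPS), while VRR shows it at 8 ms.
```

That snap from one quantization step to the next as frame times drift is the judder that adaptive sync removes.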

        • firewired
        • 9 months ago

        Agreed. Until today I had zero experience with VRR as well, but in testing a few games that tend to play very differently, the VRR experience is best when the frame rate fluctuates by significant amounts.

        Most (not all) of the games I play are older single player games with more constant frame rates and my 1080 Ti runs them beyond the 144Hz refresh rate most of the time, so Adaptive Sync felt no different enabled or disabled in those cases.

        But I do play a lot of Mechwarrior Online (older version of CryEngine), in which a typical Quick Play match has 12-vs-12 teams pitted against each other. When combat gets close the frames really drop in this game. Frame rates fluctuate between lows of around 60 FPS and highs above my monitor’s 144Hz refresh rate. While playing this game with Adaptive Sync disabled it plays with a lot of stuttering in the low-end of the frame rate range. With Adaptive Sync enabled, in the few matches I played today, I did not encounter any stuttering when the frame rates dipped into the 60’s. Need to do more testing to fully confirm this though, and probably some game captures as well.

        So yeah. It works, and it is a much better gaming experience in situations where frame rates fluctuate: mildly, for those running FPS up to their monitor’s refresh rate, or significantly, for those with wider ranges of FPS fluctuation.

        <edits for typos and grammatical clarity>

        • BigTed
        • 9 months ago

        I have the same display (Acer XF270HU) paired with a 1080ti and have noticed a couple of issues so far… wondering if you have seen this too?

        1) After logging in to Windows the monitor will go black for 1/2 a second and then come back on. Also does this after waking from sleep. Not a deal breaker, but not working 100% correctly either.

        2) Game specific, but in Ghost Recon Wildlands there is some flickering in a 1″ horizontal band at the very top of the display. Shadow of the Tomb Raider is perfect.

          • morphine
          • 9 months ago

          1) Yep, I get that about every other time when the monitor’s coming back from sleep.

          2) I have no idea about Wildlands since I don’t play it, but all the games I tried so far work fine.

    • thedosbox
    • 9 months ago

    There’s a Google spreadsheet being used to track people’s experiences with this:

    [url<]https://docs.google.com/spreadsheets/d/1YI0RQcymJSY0-LkbjSRGswWpJzVRuK_4zMvphRbh19k/htmlview?usp=sharing&sle=true#[/url<]

    The downside is that not everyone seems to be running the same tests or reporting their results in a consistent manner, but it may be useful for anyone in the market for a new monitor.

    • psuedonymous
    • 9 months ago

    [quote<]You may also need to have your display set to a high refresh rate mode before it presents the FreeSync option to the graphics card.[/quote<]In addition to raising the refresh rate, many Freesync monitors must have VRR enabled manually from an OSD option that could be buried in a random submenu, rather than it being exposed by default.

      • morphine
      • 9 months ago

      If there is such an option, yes. It’s automagic the case of my XF270HU.

        • sweatshopking
        • 9 months ago

        Such grammar

    • DPete27
    • 9 months ago

    When will Nvidia move out of Windows XP – era UI for their software?

      • DancinJack
      • 9 months ago

      honestly I couldn’t care less so long as they actually fix stuff and add things like FreeSync support.

      • RAGEPRO
      • 9 months ago

      Hopefully never. It’s clean, it’s intuitive, it works.

      Have you tried to use Radeon Settings? That’s why I don’t want Nvidia to change [i<]anything[/i<].

        • morphine
        • 9 months ago

        First, do no harm…

          • RAGEPRO
          • 9 months ago

          This reply won’t get the upvotes it deserves.

        • cegras
        • 9 months ago

        I use Radeon Settings, I don’t really have a problem with it. There’s a difference between getting used to a new UI and actually bad UI design.

          • RAGEPRO
          • 9 months ago

          You are absolutely, 100% correct. There is a difference between “unfamiliar” and “bad.”

          Unfortunately, I am all too familiar with Radeon Settings.

        • sweatshopking
        • 9 months ago

        I disagree. I think their control panel sucks. So does AMD’s.
        A nontech person couldn’t figure out what most of those settings do.

          • DancinJack
          • 9 months ago

          What “nontech person” is messing with either AMD or Nvidia’s control panel? Guessing that population is pretty small.

            • sweatshopking
            • 9 months ago

            I agree, but it shouldn’t need to be.

            • godforsaken
            • 9 months ago

            possibly because steep learning curves (rock faces?) like these exist?

      • meerkt
      • 9 months ago

      The problems I see are the awkward layout, and slowness to open and to apply some settings (at least on my setup), not the classic UI style.

      Would you really prefer some Win10-styled one-option-per-screen theme?

        • Prestige Worldwide
        • 9 months ago

        This. The slowness to open has been happening for YEARS and I wish they would fix it. It’s been an annoyance for far too long.
