Updated: GeForce cards mysteriously appear to play nice with TR’s FreeSync monitors

Update 9/30/18 3:22 AM: After further research and the collection of more high-speed camera footage from our G-Sync displays, I'm confident the tear-free gameplay we're experiencing on our FreeSync displays in combination with GeForces is a consequence of Windows 10's Desktop Window Manager adding its own form of Vsync to the proceedings when games are in borderless windowed mode, rather than any form of VESA Adaptive-Sync being engaged with our GeForce cards. Pending a response from Nvidia as to just what we're experiencing, I'd warn against drawing any conclusions from our observations at this time, and I sincerely apologize for the misleading statements we presented in our original article. The original piece continues below for posterity.

It all started with a red light. You see, the primary FreeSync display in the TR labs, an Eizo Foris FS2735, has a handy multi-color power LED that flips over to red when a FreeSync-compatible graphics card is connected. I was setting up a test rig today for reasons unrelated to graphics-card testing, and in the process, I grabbed our GeForce RTX 2080 Ti Founders Edition without a second thought, dropped it into a PCIe slot, and hooked it up to that monitor.

The red light came on.

Some things are just not supposed to happen in life, like the sun circling the earth, people calling espresso "expresso," and FreeSync monitors working in concert with Nvidia graphics cards. I've used GeForce cards with that Eizo display in the past as the occasion demanded, but I can't recall ever seeing the monitor showing anything other than its white default indicator with the green team's cards pushing pixels.

At that point, I got real curious. I fired up Rise of the Tomb Raider and found myself walking through the game's Geothermal Valley level with nary a tear to be seen. After I recovered from my shock at that sight, I started poking and prodding at the game's settings menu to see whether anything in there had any effect on what I was seeing.

Somewhere along the way, I discovered that toggling the game between exclusive fullscreen and non-exclusive fullscreen modes (or borderless window mode, as some games call it) occasionally caused the display to fall back into its non-variable-refresh-rate (VRR) default state, as indicated by the LED's transition from red to white. That color change didn't always happen, but I always noticed tearing with exclusive fullscreen mode enabled in the games I tried, while non-exclusive fullscreen mode seemed to reliably enable whatever VRR mojo I thought I had uncovered.

Our Eizo FS2735 failing to do the variable-refresh-rate dance in exclusive fullscreen mode

 

Our Eizo FS2735 delivers tear-free gaming, courtesy of double buffering and not VRR, with our RTX 2080 Ti in non-exclusive fullscreen mode

Next, I pulled up my iPhone's 240-FPS slow-mo mode and grabbed some footage of Deus Ex: Mankind Divided running on the RTX 2080 Ti while it was connected to the Eizo monitor. You can sort of see from the borderless-windowed-mode video that frames are arriving at different times but that motion advances an entire frame at a time, while the exclusive-fullscreen video shows the tearing and uneven advancement we expect from a game running with any form of Vsync off.

Now that we seemed to have a little bit of control over the behavior of our Nvidia cards with our Eizo display, I set about trying to figure out just what variable or variables were apparently allowing us to break through the walls of Nvidia's VRR garden beyond our choice of fullscreen modes.

Our LG 27MU67-B failing to sync up with the RTX 2080 Ti in exclusive fullscreen mode

 

Our LG 27MU67-B exhibits regular Vsync—not VRR—with the RTX 2080 Ti in non-exclusive fullscreen mode

Was it our choice of monitor? I have an LG 27MU67-B in the TR labs for 4K testing, and that monitor supports FreeSync, as well. Shockingly enough, so long as I was able to keep the RTX 2080 Ti within its 40-Hz-to-60-Hz FreeSync range, the LG display seemed—emphasis: seemed—to do the VRR dance just as well as the Eizo. You can see what I took as evidence in the slow-motion videos above, much more clearly than with the Eizo display. While those videos only capture a portion of the screen, they accurately convey the frame-delivery experience I saw. I carefully confirmed that there wasn't a visible tear line elsewhere on the screen, too.

Was it a Turing-specific oversight? The same trick seemed to work with the RTX 2080, too, so it wasn't just an RTX 2080 Ti thing. I pulled out one of our GTX 1080 Ti Founders Editions and hooked it up to the Eizo display. The red light flipped on, and I was able to enjoy the same tear-free experience I had been surprised to see from our Turing cards. Another seemingly jaw-dropping revelation on its own, but one that didn't get me any closer to understanding what was happening.

Was it a matter of Founders Editions versus partner cards? I have a Gigabyte RTX 2080 Gaming OC 8G in the labs for testing, and I hooked it up to the Eizo display. On came the red light.

Was it something about our test motherboard? I pulled our RTX 2080 Ti out of the first motherboard I chose and put it to work on the Z370 test rig we just finished using for our Turing reviews. The card happily fed frames to the Eizo display as they percolated through the pipeline. Another strike.

Was Windows forcing Vsync on thanks to our choice of non-exclusive fullscreen mode? (Yes, as it turns out, but we'll get to why I think so in a moment.) I pulled out my frame-time-gathering tools and collected some data with DXMD running free and in its double- and triple-buffered modes to find out. If Windows was somehow forcing the game into Vsync, I would have seen frame times cluster around the 16.7-ms and 33.3-ms marks, rather than falling wherever the rendering workload happened to put them.
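If you want to run a similar sanity check against your own frame-time logs, here's a minimal sketch of what "clustering around the Vsync marks" means in practice. This is not the tooling we actually use, and the 60-Hz refresh rate, 1-ms tolerance, and 90% threshold are illustrative assumptions only:

```python
import numpy as np

def looks_vsync_quantized(frame_times_ms, refresh_hz=60.0, tolerance_ms=1.0, threshold=0.9):
    """Rough check for Vsync quantization in a list of frame times (ms).

    If something in the chain is forcing Vsync, frame times should pile up
    near integer multiples of the refresh interval (16.7 ms, 33.3 ms, ...
    at 60 Hz). We call the log "quantized" if at least `threshold` of the
    samples land within `tolerance_ms` of such a multiple.
    """
    period_ms = 1000.0 / refresh_hz
    times = np.asarray(frame_times_ms, dtype=float)
    # Distance of each sample from the nearest multiple of the refresh period
    distance = np.abs(times - np.round(times / period_ms) * period_ms)
    return float(np.mean(distance <= tolerance_ms)) >= threshold

# Hypothetical logs: an uncapped run versus a run locked to the 16.7/33.3-ms marks
uncapped = [11.2, 13.8, 12.5, 19.4, 14.1, 12.9, 12.7]
vsynced  = [16.6, 16.7, 33.3, 16.8, 16.7, 16.6, 33.4]
print(looks_vsync_quantized(uncapped))  # False
print(looks_vsync_quantized(vsynced))   # True
```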

Our graphs tell the opposite tale, though. Frame delivery was apparently happening normally while Vsync was off, and our Vsync graphs show the expected groupings of frame times around the 16.7-ms and 33.3-ms marks (along with a few more troublesome outliers). Didn't seem like forced Vsync was the reason for the tear-free frame delivery we were seeing.

Update: Some reasoning about what we're seeing underscores why the above line of thought was incorrect. If the Desktop Window Manager itself is performing a form of Vsync, as Microsoft says it does, we probably wouldn't see the results of that quantization in our application-specific frame-time graphs for games running in borderless windowed mode. The DWM compositor itself would be the place to look, and we don't generally set up our tools to catch that data (although it can be logged). The application can presumably render as fast as it wants behind the scenes (hence why frame rates don't appear to be capped in borderless windowed mode, another source of confusion as we were putting together this article), while the compositor would presumably do the job of selecting which frames are displayed and when.
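To make that reasoning concrete, here's a toy sketch of a compositor doing its own Vsync. It's an assumed model of compositors in general, not a description of DWM's actual internals: the game hands over frames whenever it finishes them, but only the most recently completed frame gets scanned out at each fixed refresh tick, so a per-application frame-time graph stays unquantized even though what reaches the screen advances in whole refresh-interval steps.

```python
import itertools

REFRESH_MS = 1000.0 / 60.0  # the compositor flips on a fixed 60-Hz cadence

def composited_frames(app_frame_times_ms, duration_ms=200.0):
    """Toy model: which application frame is on screen at each refresh tick.

    The application's frame times (its render cadence) are left untouched;
    the compositor simply picks the latest finished frame at every tick,
    duplicating or skipping frames as needed, which looks like plain Vsync
    on the display side.
    """
    # Timestamps at which each application frame finishes rendering
    completion = list(itertools.accumulate(app_frame_times_ms))
    shown = []
    tick = 0.0
    while tick < duration_ms:
        ready = [i for i, done in enumerate(completion) if done <= tick]
        shown.append(ready[-1] if ready else None)  # latest frame ready at this tick
        tick += REFRESH_MS
    return shown

# The game renders with frame times "falling wherever"...
app_times = [11.2, 13.8, 22.5, 9.4, 14.1, 25.9, 12.7, 18.3, 10.5, 16.0]
print(composited_frames(app_times))
# ...yet the on-screen image only ever changes one refresh interval at a time.
```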

We didn't try to isolate drivers in our excitement at this apparent discovery, but our test systems were using the latest 411.70 release direct from Nvidia's website. We did install GeForce Experience and leave all other settings at their defaults, including those for Nvidia's in-game overlay, which was enabled. The other constants in our setup were DisplayPort cables and the use of exclusive versus non-exclusive (or borderless windowed) modes in-game. Our test systems' versions of Windows 10 were fully updated as of this afternoon, too.

Conclusions (updated 10/1/18)

So what ultimately happened here? Well, part of the problem is that I got real excited by that FreeSync light and the tear-free gaming experience that our systems were providing with the settings we chose, and I got tunnel vision and jumped the gun. There was one thing I neglected to do, though, and that was to double-check the output of our setups against a genuine variable-refresh-rate display. Had I done that, I probably would have concluded a lot sooner that Windows was performing Vsync of its own. Here's some slow-motion footage of the G-Sync-compatible Asus PG279Q we have in the TR labs, running our DXMD test sequence:

You can see—much like in our original high-speed footage of G-Sync displays—that the real VRR experience is subtly different from regular Vsync. Motion is proceeding smoothly rather than in clear, fixed steps, something we would have seen had our GeForces actually been providing VRR output to our FreeSync displays. The FreeSync light and tear-free gaming experience I was seeing made me hope against hope that some form of VRR operation was taking place, but ultimately, it was just a form of good old Vsync, and I should have seen it for what it was.

Even without genuine VRR gaming taking place, it's bizarre that hooking up a GeForce graphics card would cause a FreeSync monitor to think that it was receiving a compatible signal, even some of the time. Whatever the case may be, the red light on my Eizo display should not have illuminated without a FreeSync-compatible graphics card serving as the source. We've asked Nvidia for comment on this story and we'll update it if we hear back.

Comments closed
    • harryk100
    • 9 months ago

    Hello to anyone who can test a Freesync monitor/Nvidia card/Intel processor.
    My Freesync monitor seems to be recognizing Freesync since the Win10 October update and the most recent Nvidia update. I have the Samsung LC27HG70QQNXZA monitor, a GTX 1080 by MSI, and an Intel i7-7700. The monitor's setting options now have Freesync options that were definitely not there for the last 9 months. The computer also reacts when I change to Freesync on the monitor. Can someone confirm that this is the case with their Nvidia/Freesync monitor combo? The monitor reports "Freesync" in both windowed and fullscreen modes in games.

    • DeadOfKnight
    • 9 months ago

    Aww, come on Nvidia, I thought we were friends.

    • DoomGuy64
    • 10 months ago

    FWIW, my Asus MG279 monitor has an OSD that reports the *exact* framerate being shown when using freesync. I can't imagine it's the only model to do so, and it would be pretty easy to verify whether or not freesync works on Nvidia cards. Way more accurate than using an LED. Also, it's pretty hilarious that this is all done through DWM, which coincidentally is kinda how the freesync APU hack works too. Which makes me think Microsoft has had enough of Gsync and is putting pressure on Nvidia to standardize.

    • K-L-Waster
    • 10 months ago

    I think the whole “G-Sync must die in a fire!!1!” position some posters take is way overblown (haven’t seen much of it in this thread to be fair, but it’s been prominent in previous ones).

    And I also think the “NVidia is doomed if they don’t support FreeSync” position is, well, wishful thinking on the part of fanboys. If you look at NV’s last quarterly results it’s pretty clear they aren’t struggling to sell GPUs.

    Having said all that, I would have no objection to NVidia deciding to add support for Freesync / VESA Adaptive Sync / Whatever the Heck You Wanna Call It Sync. It would be rather petty to say “I don’t care if the tech works, only people who bought the right brand should be allowed to have VRR.” If your Freesync monitor suddenly works with your GeForce card, enjoy.

    It's just the "FreeSync must be the only VRR that exists" position that bothers the eff out of me.

      • Krogoth
      • 10 months ago

      G-Sync 1.0 is obsolete proprietary nonsense that really needs to go away. It was a stop-gap solution to work on Kepler so Nvidia could get a jump start on the VRR craze before VESA finalized its VRR spec with Displayport 1.2a. It has outlived its usefulness.

      Nvidia makes the bulk of their current revenue from GPU sales, not "G-Sync" monitor sales. They don't even do the R&D to design their own monitors. They have to deal with monitor vendors, the majority of whom don't want to deal with G-Sync, as seen in the lack of exposure and limited selection of G-Sync monitor SKUs versus Freesync-brand monitors, despite the fact that Nvidia GPUs have been outselling AMD RTG GPUs by a massive margin for the past several years.

      It is very vexing to me why some senior members of Nvidia insist on pursuing a walled-garden strategy with G-Sync (copying from Apple) when they don't even have a vertical monopoly.

        • Usacomp2k3
        • 10 months ago

        Gsync is branding, not technology. It is a guaranteed good experience; something Freesync can't provide with monitors that have a stupid 40-60 Hz adaptive range.

          • DoomGuy64
          • 10 months ago

          What a dead horse strawman. Freesync gaming monitors are the exact same panels as their gsync equivalents, with the only difference being a single letter in the model name.

          The point you are bringing up has *nothing* to do with gaming monitors, as VESA adaptive sync is the new display standard, which means all monitors will use it going forward. That means non-gaming monitors will support it. You are cherry-picking those non-gaming monitors and falsely comparing them to Gsync panels, while deliberately ignoring the fact that there are freesync gaming panels with exactly the same specs as the Gsync panels. This is a credibility problem for anyone making this argument. The technology has been around for years now, the cat's out of the bag, and that old tired strawman no longer holds the persuasiveness that it did originally. I wish people would grow up and stop using it, but since that is apparently not possible, it only serves as an example of how people lie on the internet.

          • DPete27
          • 10 months ago

          I can’t down-thumb this enough. Ugh.

            • Voldenuit
            • 10 months ago

            He's more or less accurate about G-Sync being more about 'branding' than technical implementation (since we have desktop monitors using FPGA buffers and laptop monitors using essentially VESA Adaptive Sync).

            The kerfufle w shoddy FreeSync ranges seems a bit harsh, though; there are really a lot of people who share the blame for all those monitors out there with 48-75 Hz and 40-60 Hz ranges, from AMD being too hands-off, to VESA not caring about sync ranges, to monitor makers trying to save a buck. AMD tightened the requirements for Freesync 2, but that horse has left the stable, and to this day, every time a Freesync monitor is brought up, the first question is usually 'What's the FreeSync range?', typically followed about half the time by 'I don't know, the manufacturer doesn't list it in the specs'.

            • Redocbew
            • 10 months ago

            I think we’re proving K-L-Waster’s point here.

            Since I'm just a pleb stuck with a 60 Hz display, I've often wondered if there are any objective differences between the various approaches we've seen to building a variable-refresh-rate display. There probably is some difference, but I have a very difficult time believing the differences are big enough to cause the kind of fuss that often shows up around these things.

            • DoomGuy64
            • 10 months ago

            K-L-Waster’s point would be fine if both monitors natively worked on Nvidia cards. The problem is that they don’t, and Nvidia has spent a lot of effort keeping it that way. Now that workarounds are appearing, it’s going to be hard to crack down on it without perception worsening even further.

            If anything, Gsync is the sole chink in Nvidia's armor when it comes to consumer perception. It's nothing more than a walled garden that costs more to implement, because the performance is the same.

            The only difference is price, because the gsync modules cost more. If you don't mind paying more for a module, go ahead. However, there are people who recognize that this is a pointless waste of money, and want Nvidia to drop it for that reason. Nvidia could just as easily add DRM to freesync and save $200+ if they wanted to keep their walled garden. This makes it all the more irritating when Nvidia makes no attempt to innovate, and instead raises prices on the modules for 4k. There's no way this is sustainable long term, and it will eventually cause a backlash the second AMD sells competitive products. At least for the segment of people who care about this issue.

            Now that these workarounds exist, the best course of action would be to just enable both, and let consumers decide what they want to use. We’ll see if that happens though.

            • Voldenuit
            • 10 months ago

            "Since I'm just a pleb stuck with a 60 Hz display, I've often wondered if there are any objective differences between the various approaches we've seen to building a variable-refresh-rate display."

            I have 3 "G-sync" monitors at home - 2 desktop (120 Hz IPS Alienware AW3418DW and 144 Hz TN Asus ROG PG278Q), and a laptop with a 120 Hz IPS G-sync panel. I'd be hard pressed to find any noticeable difference in the VRR performance of any of the 3 monitors, despite the laptop display using a system similar to VESA Adaptive Sync.

            G-Sync may or may not have technical merits, but nvidia most likely regards it as a walled garden they can use to lock in users and lock out competitors, so expect them to have to be dragged kicking and screaming to the VESA table, if ever.

            • Redocbew
            • 10 months ago

            I’m sure they do regard it as a walled garden, and I also wouldn’t expect them to give it up willingly.

            • Gastec
            • 9 months ago

            What does "kerfufle w shoddy" mean, can you translate to English or Spanish?

            • DoomGuy64
            • 10 months ago

            This. We have a problem with people substituting propaganda for reality.

            https://techreport.com/news/34040/acer-xv273k-brings-4k-144-hz-and-freesync-together
            https://techreport.com/news/34039/acer-xb273k-makes-4k-144-hz-and-g-sync-a-bit-more-accessible

            Here is the reality of freesync. Not some imaginary scenario where 144hz freesync panels only support 40-60 hz because some kool-aid drinker has an agenda. Gaming monitors are not budget monitors. If you want a gaming monitor, you don't buy some low resolution Walmart special. Those Walmart specials are not equivalent to Gsync panels, and anyone comparing the two is full of it. Wanna do a comparison? Compare the two that are actually equivalent. "xv" vs "xb", and price. That's the difference. Not some strawman that pits a budget special vs a gaming monitor.

            • Redocbew
            • 10 months ago

            Crappy hardware is crappy, but you are aware that there’s never been such a thing as a “gsync panel”, correct? When gsync was still something new you could buy the kit and do the integration yourself with the appropriate hardware. Given that, I’m not sure what you think the point is of comparing panels.

            • DoomGuy64
            • 10 months ago

            There is none, and there never will be. I’m only mentioning it because some people are obsessed with doing so. Freesync and gsync panels are exactly the same. Only the connecting display method/software is different.

            This whole "controversy" about 40-60 Hz panels is about freesync enabling low-end panels to support adaptive sync, and fanboys taking those panels as "evidence" that freesync is lesser. There is no such thing. It's just that budget manufacturers are never going to put a $200+ gsync module in a budget monitor, and freesync is free. Of course those monitors are going to exist. You just have to not be stupid when buying a monitor, and buy a panel that is appropriate for gaming. Woo, so problematic.

            It's ridiculous that people are still bringing up this non-argument in 2018. It speaks more to people's tunnel-visioned fanboyism than to any actual real problem. 40-60 Hz gaming monitors? Show me. Where is this imaginary gaming panel being sold? What model number? What brand? You can't, because it only exists inside fantasy land. If you're buying a monitor on newegg, you simply pick between the xv273k and the xb273k, and it's the same for any other monitor. If you pick a lesser monitor, that's on you for not paying attention. It's not the fault of freesync existing; it's on the consumer to make an educated purchase. The whole argument is based on people being too stupid to make that choice, and IMO it speaks more about the arguer than the argument. Caveat emptor.

            • K-L-Waster
            • 10 months ago

            This focus on the panels is diverging from the point a bit. The panel itself has nothing to do with either VRR technology. The difference is in what device feeds the panels the instructions on when to display a frame.

            * In G-Sync, there is a dedicated chip that manages this (it used to be an FPGA; I'm not sure if they are still using FPGAs or if they have a purpose-built chip). This adds cost to the monitor, but it means you have dedicated hardware managing the timing, and that hardware can be tuned to the specific characteristics of the panel it is feeding, giving it the potential to perform better (if the internal firmware is tuned properly to the specific panel, of course).

            * In FreeSync, the timing is managed by the GPU. This keeps costs down, but does mean the GPU has an extra job to do (more overhead) and the GPU driver has to know how to match frame times to any panel it happens to be attached to.

            I completely agree there is no technological reason NVidia could not support both. It’s a business decision on their part, not a technical decision.

            • DoomGuy64
            • 10 months ago

            It's still using an FPGA on the desktop, and it is technically inferior to freesync. The GPU sends non-vsync frames to a buffer on the FPGA, which then handles synchronizing the framerate to the monitor. 4k modules cost more because you have to increase the processing power and buffer memory. Which also means increased power use.

            There is no possible way for this to work better than a direct hardware approach like freesync. Freesync eliminates redundant hardware, works directly with the panel, and more features can be managed in software instead of firmware.

            Considering how the panel itself is the bottleneck, gsync doesn't perform worse than freesync, but there are increased points of hardware failure via the FPGA chip, and increased power use. No panels need "tuning" because that is already handled by the native LCD controller. The sole reason for this argument stems from one of the first "beta" freesync panels not supporting overdrive due to rushed firmware, but every panel since then has. Overdrive and timing issues are all handled by the LCD controller; freesync merely has to tell the panel it is using adaptive sync and send the frames. The software aspect of freesync handles things like LFC, not panel-specific issues. The only reason why Gsync handles everything is that the FPGA replaces the LCD controller, which would normally do those functions, as it currently does under freesync.

            Arguing about Gsync doing panel timings is like saying panels don't already do that. If that were the case, those panels wouldn't be capable of functioning, which is an insane insinuation. No, they both function properly. Gsync just uses an FPGA to replace the normal display controller, and as such needs to replicate the display controller's functionality, albeit in a much less efficient manner.

        • psuedonymous
        • 9 months ago

        "It was a stop-gap solution to work on Kepler so Nvidia could get a jump start on the VRR craze before VESA finalized its VRR spec with Displayport 1.2a."

        Pure bovine ejecta. If G-Sync had not launched, you can bet your bottom dollar that eDP's Panel Self Refresh would have remained a mobile power-saving function and would never have received the rebranding to Adaptive Sync. The entire reason for the G-Sync FPGA module in the first place was that no desktop panel display controller had the capability to accept non-synchronous updates. That is why there was a year-and-a-half delay between G-Sync being released and Freesync-compatible monitors being released: panel controllers had to be modified to add that capability, go through the normal tapeout->fab->test cycle, and then be integrated into monitors by manufacturers.

        This is also the reason you will not see non-OLED 'Freesync 2 HDR' monitors achieving adequate contrast ratios for at least a year: no panel controllers exist that can handle asynchronous refresh with a FALD backlight, so they are relegated to the old 'dynamic contrast' BS, or at best a handful of edge-lit zones (and even then with some terrible update rates, slow enough that you can see the zones lighting and dimming *after* a fast bright object has passed in test scenes). The current crop, even in synchronous HDR mode, are achieving contrast ratios in the 1500:1/2000:1 range, which is embarrassingly poor for HDR (where you want to be looking for 20,000:1 as an absolute minimum).

          • Krogoth
          • 7 months ago

          Nvidia was involved in the development of VESA's VRR spec. They saw the potential of it. They wanted to capture marketshare and mindshare by being the "first one" with VRR support. Kepler was already taped out by the time the Displayport 1.2 spec was in the process of being ratified. Maxwell wasn't ready yet. Nvidia didn't want to wait for the VESA VRR spec to be finalized. Instead they just copied most of VESA's VRR spec and threw in proprietary middleware so it could be adapted to their Kepler silicon. The "walled garden" model was just icing on the cake.

    • Wonders
    • 10 months ago

    "It all started with a red light."

    And it will end with a red light. Despite the mitigating statements appended after the fact, the reality is that this will remain a capital-M caliber Mystery until the Freesync indicator light is fully explained. And I mean just that: fully explained, not brushed aside. Thank you Jeff for reporting on this, er, illuminating situation.

      • ET3D
      • 10 months ago

      It all started at the red light district. I was there for a night of free syncing, visiting on one of my favourite monitors, of particular flexibility and known for being highly adaptive when it came to frames.

      But that night, my passion overwhelmed my sense. I felt a need for power, for speed, something that I knew the AMD cards, lovely as they were, would not be able to supply me. I admit, I was weak, and the call of the RTX was strong, and I could already imagine in my mind the assault on my senses that the FPS would bring, and the thrust of the rays as they traced their paths.

      As my excitement mounted, I was almost ready to forsake my beloved Eizo for the young and feisty RTX, but then a thought crossed my mind: what if I could do them both, together? It didn't matter if they weren't compatible. I wanted, no, needed them both!

      To be continued… (or maybe not)

    • Chrispy_
    • 10 months ago

    With enough articles like this prompting investigation into VRR on Nvidia cards, perhaps Nvidia will pull their thumbs out of their asses and actually support more than just G-Sync….

    • DPete27
    • 10 months ago

    Jeff.
    Were you hooked up via HDMI or DP?
    (Did I miss that stated somewhere?)

      • Ryu Connor
      • 10 months ago

      DisplayPort, he does mention it.

      As discussed before, HDMI FreeSync is proprietary to AMD. Only HDMI 2.1 VRR is available to NVIDIA. Edit: Turing has HDMI 2.0b, so no love there either.

        • pogsnet1
        • 10 months ago

        Freesync is proprietary free, so in theory Nvidia can use it too.

          • Ryu Connor
          • 10 months ago

          Adaptive Sync aka FreeSync 1 from the DisplayPort standard is a standard without a royalty.

          HDMI FreeSync is not built by HDMI and is not part of the HDMI standard. HDMI FreeSync was built by AMD using the vendor specific signaling options. NVIDIA cannot use HDMI FreeSync. The vendor specific extensions AMD created are proprietary to AMD.

          NVIDIA can use HDMI 2.1 VRR, as they already pay HDMI a royalty per port for the standard. That said, Turing does not have an HDMI 2.1 port.

    • sconesy
    • 10 months ago

    Best thing I’ve read online all week. Thought-provoking and detailed.

    • obarthelemy
    • 10 months ago

    “posterity” as in “I think I made a huge mistake.”

    ;-p

      • exilon
      • 10 months ago

      So did Nvidia… their latest driver is telling monitors that the GPU is capable of sending variable refresh rate DisplayPort streams.

    • Zaryab
    • 10 months ago

    What is the hack though?! I have a GTX 1080 Ti and a FreeSync display; does that mean it's just working? You didn't really explain how you did it, and from what I could read it just happened for you, meaning it's not a hack… So I don't get it.

    • Jeff Kampman
    • 10 months ago

    Folks, after further research and the collection of more high-speed camera footage from our G-Sync displays, I believe the tear-free gameplay we're experiencing on our FreeSync monitors with GeForce cards is a consequence of Windows 10's Desktop Window Manager adding some form of Vsync of its own to the proceedings when games are in borderless windowed mode rather than any form of VESA Adaptive-Sync being engaged, a fact that may have been made easy to overlook by the refresh rate ranges of our 2560×1440 displays. Pending an official response from Nvidia as to just what we're experiencing, I'd caution against drawing any conclusions from our observations at this time. I apologize for any misleading conclusions we've put forward.

      • USAFTW
      • 10 months ago

      If so, I find the fact that the Freesync indicator LED in your setup turns on a bit strange. Other users on reddit have also reported Freesync being shown on their display, even with Maxwell generation cards.
      https://www.reddit.com/r/nvidia/comments/9k446z/my_monitor_reports_freesync_working_with_a_gtx980/

      I want to believe that it's happening, though.

        • exilon
        • 10 months ago

        Depending on how Freesync is implemented, it’s possible that the GPU is just sending frame start/end packets at regular intervals, treating the monitor as a regular monitor, while the monitor is primed to receive them at varying intervals.

      • YellaChicken
      • 10 months ago

      Jeff, according to the manual for that particular monitor, for the freesync light to come on the card must support freesync AND have it enabled. Would it be worth trying the same thing with a Radeon card with freesync disabled in the driver settings and see if the monitor is still fooled into detecting freesync? Might rule out windows vsync if it doesn’t show red in that situation.

      Edit: or rule in vsync if it does

        • Redocbew
        • 10 months ago

        If windows has its own VRR scheme in place, then I would think someone from the driver team at AMD or Nvidia must know about it. That seems like a great way to cause problems otherwise.

        Edit: Why not also try that test with an Intel IGP? If Windows is the culprit here I wonder how much the hardware really matters at all.

          • faramir
          • 10 months ago

          +1, please re-run the tests with AMD (with FreeSync explicitly disabled) and Intel graphics in non-exclusive mode to see whether it is the Windows DWM causing this.

          • ET3D
          • 10 months ago

          I’m sure someone would know, but I also think it’s not strictly necessary for the driver team to have implemented adaptive sync. If the Windows device driver model has been updated to include control of low level signalling, it’s possible that Windows itself is driving this.

      • Redocbew
      • 10 months ago

      As a side note, don’t sweat it if you got hoodwinked by the system here. Discovery is messy, and if you’re not afraid to get some of that on you, then you’re probably not going to come away with much at the end of it.

      • Jeff Kampman
      • 10 months ago

      While we still haven’t heard from NV on what might be happening, I’ve updated the article to more cohesively reflect our findings, along with footage of an actual G-Sync display. Sorry again for the confusion.

        • YellaChicken
        • 10 months ago

        Meh, no worries Jeff. I don't see any need for you to apologise and I doubt many others here would either. It was an interesting find and you followed the trail of evidence to a sensible conclusion.

        Many other sites would have written something a lot more click-baity and not bothered looking past "OMG Nvidia cards stealth support Freesync, no response from green team!"

        You piqued our interest and gave us a very well investigated article. And if any gerbils come across something similar in future we’ll know it’s not necessarily what it looks like.

    • christos_thski
    • 10 months ago

    intel is going to support freesync/vrr too, and assuming that their discrete GPU is competitive, this will mean that it becomes the de facto standard.

    it will be … awkward if nvidia pulls freesync support now (even if it happened by accident) only to sheepishly reintroduce it when people are buying freesync intel and AMD discrete GPUs in a couple of years…

    give it up nvidia… keep gsync support but finally support the industry standard as well

    I purchased an RX580 instead of a 1060 just because of freesync, and I’m not the only one. lack of freesync support may become an important factor for avoiding nvidia gpus when amd/intel catch up.

    • Wonders
    • 10 months ago

    This is a delightful turn of events. What happens next, now that the cat’s out of the bag?

      • Krogoth
      • 10 months ago

      Hopefully, Nvidia finally realizes that G-Sync 1.0 is at a dead end and that they would make far more revenue from the influx of GPU sales from people who were on the fence about the whole G-Sync/Freesync nonsense. They don't realize that Freesync is the only reason AMD RTG is even viable in the performance gaming market.

      Nvidia's G-Sync 1.0 "walled garden" strategy will not work in the long run simply because they do not have a vertical monopoly (they don't produce their own monitors or dominate that market).

    • Redocbew
    • 10 months ago

    What?!? Freesync working with Nvidia hardware?

    This ruins EVERYTHING!

    /swipes desk clean

    • Forge
    • 10 months ago

    Awesome. I've been hoping for something like this for a long time. The $0 price premium has led to me owning a few Freesync monitors despite running Nvidia GPUs, and I figured I'd have to wait for HDMI 2.1 to backdoor Freesync support in. I hope Nvidia has had a change of heart and quietly enabled Freesync on all their cards. We will see, I guess.

    • Krogoth
    • 10 months ago

    This isn’t a surprise at all. Desktop Turing SKUs are likely quietly supporting Displayport 1.4a spec which means that they should work with VESA VRR spec. Freesync 1 is simply AMD’s implementation of VESA’s VRR spec.

    You can do the same thing on Nvidia's mobile GPUs, which have been supporting VESA's VRR spec under the G-Sync Mobile moniker. If you hook up a "Freesync" monitor to a laptop equipped with an Nvidia GPU, it'll run "G-Sync" on the monitor while the monitor thinks it is using "Freesync" mode.

      • Waco
      • 10 months ago

      Even my crappy ASUS Zenbook supports “G-sync” via the Displayport spec. It’s wonderful but annoying since my desktop can’t do the same.

        • NoOne ButMe
        • 10 months ago

        do you know if any of those notebooks can support Freesync displays from their displayport outputs?

        • JustAnEngineer
        • 10 months ago

        "Even my ~~crappy~~ *thin & light* ASUS Zenbook supports "G-sync" via the Displayport spec."

        FTFY. 😉

      • RAGEPRO
      • 10 months ago

      Jeff notes that it also works on a Pascal GeForce GTX 1080 Ti.

      • DoomGuy64
      • 10 months ago

      Nvidia admits they are. The tech specs have an almost hidden disclaimer that says "1.4a ready". It is posted in the tech-specs popup at the very bottom, in black text against a black background. Now if only they sold the 2070 for $400.

        • Topinio
        • 10 months ago

        Hear hear, but $500 for the partner cards will be tempting if it works with my XL2730Z.

        • danazar
        • 10 months ago

        VESA Adaptive Sync (the Freesync 1-based open standard) became part of the DP spec with DP 1.2a… but even at 1.4a it’s still optional. Saying a device is “1.4a ready” doesn’t mean it supports Adaptive Sync. You can be 1.4a compliant without enabling Adaptive Sync.

    • Waco
    • 10 months ago

    I think the short version of this is that Nvidia cards run VRR without issue... as long as the driver *thinks* you aren't playing a game.

    Nvidia... how the mighty have fallen. The entire industry is pissed off at you for various reasons and you still persist with the idiocy.

      • RAGEPRO
      • 10 months ago

      I actually think this is some mobile GPU code that got turned on for desktop GPUs.

      Basically, mobile GPUs should use Mobile G-Sync at all times when possible, because updating the screen as little as possible saves power. I think what’s happening is that when you enter Exclusive Fullscreen mode the driver and/or the application are aggressively forcing a specific vsync mode, while on the desktop the driver is defaulting to some other behavior. That’s why playing in borderless window mode coincidentally is freesync’d.

    • NoOne ButMe
    • 10 months ago

    can only hope it is actually intentional and is Nvidia stealthily rolling out Freesnyc support….

    Hell, who am I kidding.

      • NovusBogus
      • 10 months ago

      Well, I’m cynical enough to assume that NV has probably had moderately tested FreeSync support on standby for years, just waiting to be included or uncommented when market forces (read: analysts calculate a substantial number of lost sales) finally compel them to act. It’s what I would do.

        • NoOne ButMe
        • 10 months ago

        of course they have, it is how mobile Gsync works.

        • Krogoth
        • 10 months ago

        Nvidia has been supporting VESA’s VRR spec on their post-Kepler mobile GPUs under “G-Sync Mobile” brand-name.

        There's nothing in the silicon that prevents the VESA VRR spec from working. It is purely firmware + software locks that aren't much different from the locks that prevent Geforce SKUs from turning into a "Quadro/Tesla" on the cheap (Nvidia started doing this with G8x).

        • Redocbew
        • 10 months ago

        That’s kind of what I figured also. Jeff’s probably on the right track that there’s a bug somewhere which is allowing for functionality to be switched on that wouldn’t be otherwise. It reminds me of that bug which for a few hours killed everyone on Facebook a while ago.

        • d0x360
        • 10 months ago

        There is no need to test. Freesync is part of the HDMI 2.0+ standard so nvidia has to support it or they can’t get certified for HDMI and wouldn’t be allowed to use HDMI ports on their cards.

        They actively prevent freesync from working because of gsync. Gsync earns them money, freesync doesn’t.

        That’s literally the ONLY reason freesync isn’t an option in the control panel for Nvidia users and quite honestly it’s ridiculous. I’m not spending extra for a gsync display when they should be supporting freesync.

        There must be a way to enable it via bios hack. Hopefully someone figures it out. It would work for the last 3 generations of Nvidia cards.

          • Ryu Connor
          • 10 months ago

          FreeSync is not part of the HDMI standard.

          HDMI 2.1 VRR is also not FreeSync.

          • danazar
          • 10 months ago

          VRR is only part of HDMI 2.1, and as of 2.1 it’s an optional feature. A device can be “HDMI 2.1 compliant” without supporting VRR.

          If you think about it for even half a second, this should be obvious—HDMI is a broad standard covering a range of devices. Most TVs (and definitely most output devices) don’t need VRR. Outside of gaming, practically every HDMI video source you would use to feed a TV (HDTV broadcast receiver, cable/satellite, Roku / streaming device, video camera, Blu-Ray player) will work fine outputting at a fixed rate. A lot of devices (and a lot of consumers) don’t need VRR at all. Making VRR mandatory would just slow down the transition to HDMI 2.1.

          They’ll probably make it mandatory eventually, on the display side at least, but making it optional for now allows it to be a high-end feature that will trickle down eventually as it gets cheaper.

      • pogsnet1
      • 10 months ago

      Freesync is proprietary free, so in theory Nvidia can use it too.

      • Wirko
      • 10 months ago

      “Freesnyc”, that’s right. Half-compatible.
