Updated: GeForce cards mysteriously appear to play nice with TR's FreeSync monitors


Update 9/30/18 3:22 AM: After further research and the collection of more high-speed camera footage from our G-Sync displays, I'm confident that the tear-free gameplay we're experiencing on our FreeSync displays in combination with GeForces is a consequence of Windows 10's Desktop Window Manager adding its own form of Vsync to the proceedings when games run in borderless windowed mode, rather than any form of VESA Adaptive-Sync being engaged with our GeForce cards. Pending a response from Nvidia as to just what we're experiencing, I'd warn against drawing any conclusions from our observations at this time, and I sincerely apologize for the misleading statements we've presented in our original article. The original piece continues below for posterity.

It all started with a red light. You see, the primary FreeSync display in the TR labs, an Eizo Foris FS2735, has a handy multi-color power LED that flips over to red when a FreeSync-compatible graphics card is connected. I was setting up a test rig today for reasons unrelated to graphics-card testing, and in the process, I grabbed our GeForce RTX 2080 Ti Founders Edition without a second thought, dropped it into a PCIe slot, and hooked it up to that monitor.

The red light came on.

Some things are just not supposed to happen in life, like the sun circling the earth, people calling espresso "expresso," and FreeSync monitors working in concert with Nvidia graphics cards. I've used GeForce cards with that Eizo display in the past as the occasion demanded, but I can't recall ever seeing the monitor showing anything other than its white default indicator with the green team's cards pushing pixels.

At that point, I got real curious. I fired up Rise of the Tomb Raider and found myself walking through the game's Geothermal Valley level with nary a tear to be seen. After I recovered from my shock at that sight, I started poking and prodding at the game's settings menu to see whether anything in there had any effect on what I was seeing.

Somewhere along the way, I discovered that toggling the game between exclusive fullscreen and non-exclusive fullscreen modes (or borderless window mode, as some games call it) occasionally caused the display to fall back into its non-variable-refresh-rate (VRR) default state, as indicated by the LED's transition from red to white. That color change didn't always happen, but I always noticed tearing with exclusive fullscreen mode enabled in the games I tried, while non-exclusive fullscreen mode seemed to reliably enable whatever VRR mojo I thought I had uncovered.


Our Eizo FS2735 failing to do the variable-refresh-rate dance in exclusive fullscreen mode

Our Eizo FS2735 delivers tear-free gaming, courtesy of double buffering and not VRR, with our RTX 2080 Ti in non-exclusive fullscreen mode

Next, I pulled up my iPhone's 240-FPS slow-mo mode and grabbed some footage of Deus Ex: Mankind Divided running on the RTX 2080 Ti while it was connected to the Eizo monitor. You can sort of see from the borderless-windowed-mode video that frames are arriving at different times but that motion advances an entire frame at a time, while the exclusive-fullscreen video shows the tearing and uneven advancement we expect from a game running with any kind of Vsync off.

Now that we seemed to have a little bit of control over the behavior of our Nvidia cards with our Eizo display, I set about trying to figure out just what variable or variables were apparently allowing us to break through the walls of Nvidia's VRR garden beyond our choice of fullscreen modes.


Our LG 27MU67-B failing to sync up with the RTX 2080 Ti in exclusive fullscreen mode

Our LG 27MU67-B exhibits regular Vsync—not VRR—with the RTX 2080 Ti in non-exclusive fullscreen mode

Was it our choice of monitor? I have an LG 27MU67-B in the TR labs for 4K testing, and that monitor supports FreeSync as well. Shockingly enough, so long as I was able to keep the RTX 2080 Ti within its 40-Hz-to-60-Hz FreeSync range, the LG display seemed—emphasis: seemed—to do the VRR dance just as well as the Eizo. The slow-motion videos above show what I took as evidence, much more clearly than the Eizo footage does. While those videos only capture a portion of the screen, they accurately convey the frame-delivery experience I saw. I carefully confirmed that there wasn't a visible tear line elsewhere on the screen, too.

Was it a Turing-specific oversight? The same trick seemed to work with the RTX 2080, too, so it wasn't just an RTX 2080 Ti thing. I pulled out one of our GTX 1080 Ti Founders Editions and hooked it up to the Eizo display. The red light flipped on, and I was able to enjoy the same tear-free experience I had been surprised to see from our Turing cards. Another seemingly jaw-dropping revelation on its own, but one that didn't get me any closer to understanding what was happening.

Was it a matter of Founders Editions versus partner cards? I have a Gigabyte RTX 2080 Gaming OC 8G in the labs for testing, and I hooked it up to the Eizo display. On came the red light.

Was it something about our test motherboard? I pulled our RTX 2080 Ti out of the first motherboard I chose and put it to work on the Z370 test rig we just finished using for our Turing reviews. The card happily fed frames to the Eizo display as they percolated through the pipeline. Another strike.

Was Windows forcing Vsync on thanks to our choice of non-exclusive fullscreen mode? (Yes, as it turns out, but we'll get to why I think so in a moment.) I pulled out my frame-time-gathering tools and collected some data with DXMD running with Vsync off and in its double- and triple-buffered Vsync modes to find out. If Windows was somehow forcing the game into Vsync, I would have expected frame times to cluster around the 16.7-ms and 33.3-ms marks rather than falling wherever they pleased.

Our graphs tell the opposite tale, though. Frame delivery was apparently happening normally while Vsync was off, and our Vsync graphs show the expected groupings of frame times around the 16.7-ms and 33.3-ms marks (along with a few more troublesome outliers). Didn't seem like forced Vsync was the reason for the tear-free frame delivery we were seeing.
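Since this whole line of reasoning hinges on spotting that clustering, here's a minimal sketch of the kind of check involved, assuming a CSV of per-frame render times in milliseconds. The file name, column name, and tolerance below are our own illustrative choices, not the output of any particular capture tool:

```python
# Rough check for Vsync quantization in a frame-time log. Assumes a CSV
# with a "frame_time_ms" column; the file name, column name, and 1.5-ms
# tolerance are illustrative choices, not tied to any particular tool.
import csv

REFRESH_MS = 1000 / 60   # 16.7 ms per refresh on a 60-Hz display
TOLERANCE_MS = 1.5       # how close a frame must land to a refresh multiple

def vsync_fraction(path):
    """Return the share of frames landing near a multiple of the refresh interval."""
    near = total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ft = float(row["frame_time_ms"])
            total += 1
            # Distance to the nearest multiple of 16.7 ms (16.7, 33.3, 50.0, ...)
            remainder = ft % REFRESH_MS
            if ft > TOLERANCE_MS and min(remainder, REFRESH_MS - remainder) <= TOLERANCE_MS:
                near += 1
    return near / total if total else 0.0

# A run with Vsync forced on should push this fraction toward 1.0; a
# free-running (or genuinely VRR) run should leave frame times scattered.
print(f"{vsync_fraction('dxmd_frametimes.csv'):.1%} of frames near a 16.7-ms multiple")
```

The catch, as the update below explains, is that this test looks at the wrong layer of the stack.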

Update: A little more reasoning about what we're seeing explains why the above line of thought was incorrect. If the Desktop Window Manager itself is performing a form of Vsync, as Microsoft says it does, we probably wouldn't see the results of that quantization in our application-specific frame-time graphs for games running in borderless windowed mode. The DWM compositor itself would be the place to look, and we don't generally set up our tools to catch that data (although it can be logged). The application can presumably render as fast as it wants behind the scenes (which is why frame rates don't appear to be capped in borderless windowed mode, another source of confusion as we were putting together this article), while the compositor would presumably do the job of selecting which frames are displayed and when.
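For those who want to look at that layer themselves, one tool that can log presentation data at the compositor boundary is Intel's open-source PresentMon. We didn't run it for this piece, but a sketch along the following lines could tell composed presents apart from direct flips; the capture file name is hypothetical, and we're assuming PresentMon's standard CSV columns:

```python
# Sketch: tally presentation modes in a PresentMon capture to see whether
# frames reach the screen through DWM composition or a direct flip.
# "presentmon_capture.csv" is a hypothetical file name; we're assuming
# PresentMon's standard "PresentMode" and "MsBetweenDisplayChange" columns.
import csv
from collections import Counter

def summarize_presents(path):
    modes = Counter()
    display_deltas = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            modes[row["PresentMode"]] += 1
            display_deltas.append(float(row["MsBetweenDisplayChange"]))
    return modes, display_deltas

modes, deltas = summarize_presents("presentmon_capture.csv")
for mode, count in modes.most_common():
    # "Composed: Flip" points at DWM composition (borderless window), while
    # "Hardware: Legacy Flip" indicates an exclusive-fullscreen swap chain.
    print(f"{mode}: {count} presents")

# If DWM is doing the syncing, the on-screen intervals should quantize to
# refresh multiples even while the application renders as fast as it likes.
print(f"mean on-screen interval: {sum(deltas) / len(deltas):.1f} ms")
```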

We didn't try to isolate drivers in our excitement at this apparent discovery, but our test systems were using the latest 411.70 release direct from Nvidia's website. We did install GeForce Experience and leave all other settings at their defaults, including those for Nvidia's in-game overlay, which was enabled. The other constants in our setup were DisplayPort cables and the use of exclusive versus non-exclusive (or borderless windowed) modes in-game. Our test systems' versions of Windows 10 were fully updated as of this afternoon, too.

Conclusions (updated 10/1/18)

So what ultimately happened here? Well, part of the problem is that I got real excited by that FreeSync light and the tear-free gaming experience that our systems were providing with the settings we chose, and I got tunnel vision and jumped the gun. There was one thing I neglected to do, though, and that was to double-check the output of our setups against a genuine variable-refresh-rate display. Had I done that, I probably would have concluded a lot sooner that Windows was performing Vsync of its own. Here's some slow-motion footage of the G-Sync-compatible Asus PG279Q we have in the TR labs, running our DXMD test sequence:

You can see—much like in our original high-speed footage of G-Sync displays—that the real VRR experience is subtly different from regular Vsync. Motion is proceeding smoothly rather than in clear, fixed steps, something we would have seen had our GeForces actually been providing VRR output to our FreeSync displays. The FreeSync light and tear-free gaming experience I was seeing made me hope against hope that some form of VRR operation was taking place, but ultimately, it was just a form of good old Vsync, and I should have seen it for what it was.
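To make that distinction concrete, here's a toy model (with made-up frame times) of why the two cases look different under a high-speed camera. A VRR panel refreshes whenever a frame is ready, while a Vsync'd fixed-refresh pipeline holds each frame until the next 16.7-ms boundary of a 60-Hz scanout:

```python
# Toy model of why VRR and Vsync look different in slow-motion footage.
# The frame times below are made up purely for illustration.
import math

REFRESH_MS = 1000 / 60                         # 16.7-ms scanout on a 60-Hz panel
frame_times = [12.0, 19.0, 15.0, 22.0, 14.0]   # hypothetical render times (ms)

# VRR: the panel refreshes when each frame is ready, so on-screen
# intervals track render times directly (within the panel's VRR range).
vrr_intervals = frame_times

# Vsync on a fixed-refresh display (or under DWM composition): each frame
# waits for the next refresh boundary, so intervals snap to 16.7-ms multiples.
# (This ignores buffering depth, which only delays frames, not the snapping.)
vsync_intervals = [math.ceil(ft / REFRESH_MS) * REFRESH_MS for ft in frame_times]

print("VRR  :", [f"{i:.1f}" for i in vrr_intervals])    # 12.0, 19.0, 15.0, ...
print("Vsync:", [f"{i:.1f}" for i in vsync_intervals])  # 16.7, 33.3, 16.7, ...
```

Those snapped-to-the-grid intervals are exactly the fixed steps visible in our slow-motion footage of the FreeSync displays.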

Even without genuine VRR gaming taking place, it's bizarre that hooking up a GeForce graphics card would cause a FreeSync monitor to think it was receiving a compatible signal, even some of the time. Whatever the case may be, the red light on my Eizo display should not have illuminated without a FreeSync-compatible graphics card serving as the source. We've asked Nvidia for comment on this story, and we'll update it if we hear back.
