Trouble brewing? What happens at the edges?
One intriguing question about FreeSync displays is what they do when they reach the edges of their refresh rate ranges. As I've noted, the XL2730Z can vary from 40Hz to 144Hz. To be a little more precise, it can tolerate frame-to-frame intervals between 6.94 and 25 milliseconds. What happens when frames come in from the GPU at shorter or longer intervals?
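The arithmetic behind those numbers is just the reciprocal of the refresh rate. A quick sketch, with a function name of my own invention:

```python
# Convert a refresh rate in Hz to a frame-to-frame interval in milliseconds.
def interval_ms(rate_hz):
    return 1000.0 / rate_hz

# The XL2730Z's 40-144Hz range maps to these interval bounds:
print(interval_ms(40))    # 25.0 ms: the longest the panel can wait
print(interval_ms(144))   # ~6.94 ms: the fastest it can refresh
```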
AMD has built some flexibility into FreeSync's operation: the user can choose whether to enable or disable vsync for frames that exceed the display's timing tolerance. Consider what happens if frames are coming in from the GPU too quickly for the display to keep up. With vsync enabled, the display will wait a full 6.94 ms before updating the screen, possibly discarding excess frames. (G-Sync always behaves in this manner.) With vsync disabled, the display will go ahead and update the screen mid-refresh, getting the freshest information to the user's eyes while potentially introducing a tearing artifact. Since variable refresh is active, the screen will only tear when the frame rate goes above or below the display's refresh range.
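That choice can be summed up in a few lines of logic. Here's a rough sketch under my own simplified model; the function name and return strings are illustrative, not anything from AMD's actual implementation:

```python
MIN_INTERVAL_MS = 1000.0 / 144  # fastest refresh the panel supports (~6.94 ms)

def handle_fast_frame(time_since_refresh_ms, vsync_enabled):
    """Decide what to do with a frame that arrives from the GPU
    before the panel is ready for another refresh."""
    if time_since_refresh_ms >= MIN_INTERVAL_MS:
        return "refresh now"          # within the variable-refresh range
    if vsync_enabled:
        return "hold until ready"     # wait out the minimum interval; may drop frames
    return "swap mid-refresh"         # update immediately, at the cost of tearing
```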
Giving users the option of enabling vsync in this situation is a smart move, one that I fully expect Nvidia to copy in future versions of G-Sync.
The trickier issue is what happens when the GPU's frame rate drops below the display's minimum refresh rate. I've seen some confusion and incorrect information at other publications about exactly how FreeSync handles this situation, so I took some time to look into it.
As you may know, LCD panels must be refreshed every so often in order for the pixels to maintain their state. Wait too long, and the pixel will lose its charge and drift back to its original color—usually white or gray, I believe. Variable-refresh schemes must cope with this limitation; they can't wait forever for the next frame from the GPU before painting the screen again.
Some reports have suggested that when the frame-to-frame interval on a FreeSync display grows too long, the display responds by "locking" into a 40Hz refresh rate, essentially quantizing updates at multiples of 25 ms. Doing so would be pretty poor behavior, because quantization at 25 ms steps would mean horribly janky animation. You'd be making the worst of an already bad situation where the attached PC was running up against its own performance limitations. However, such talk is kind of nonsense on the face of it, since we're dealing with a variable-refresh display working in concert with a GPU that's producing frames at an irregular rate. What happens in such cases differs between FreeSync and G-Sync, but neither solution's behavior is terribly problematic.
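For what it's worth, it's easy to see why 25 ms quantization would be so janky. If a display really snapped every update to a 40Hz grid, frame times hovering near 25 ms would bounce between 25 and 50 ms on screen. A hypothetical illustration:

```python
import math

# Hypothetical GPU frame times (ms) hovering around the 25 ms boundary.
frame_times = [23.0, 31.0, 24.0, 33.0, 26.0]

# If updates could only land on 25 ms boundaries, nearly identical frame
# times would snap to wildly different on-screen intervals:
quantized = [25.0 * math.ceil(t / 25.0) for t in frame_times]
print(quantized)   # [25.0, 50.0, 25.0, 50.0, 50.0]
```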
Let's start with how G-Sync handles it. I talked with Nvidia's Tom Petersen about this question, since he's made some public comments on this matter that I wanted to understand.
Petersen explained that sorting out the timing of a variable-refresh scheme can be daunting when the wait for a new frame from the graphics card exceeds the display's maximum wait time. The obvious thing to do is to refresh the display again with a copy of the last frame. Trouble is, the very act of painting the screen takes some time, and it's quite possible the GPU will have a new frame ready while the refresh is taking place. If that happens, you have a collision, with two frames contending for the same resource.
Nvidia has built some logic into its G-Sync control module that attempts to avoid such collisions. This logic uses a moving average of the past couple of GPU frame times in order to estimate what the current GPU frame-to-frame interval is likely to be. If the estimated interval is expected to exceed the display's max refresh time, the G-Sync module will preemptively refresh the display partway through the wait, rather than letting the LCD reach the point where it must be refreshed immediately.
This preemptive refresh "recharges" the LCD panel and extends its ability to wait for the next GPU frame. If the next frame arrives in about the same time as the last one, then this "early" refresh should pay off by preventing a collision between a new frame and a gotta-have-it-now refresh.
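Here's a loose sketch of that estimation step, assuming a simple moving average. The constants and names are mine; Nvidia hasn't published the G-Sync module's actual algorithm:

```python
MAX_WAIT_MS = 25.0   # longest the panel can hold a frame (the 40Hz floor)

def should_preempt(recent_frame_times_ms):
    """Estimate the next frame-to-frame interval from a moving average of
    recent GPU frame times. If the next frame looks likely to arrive after
    the panel's deadline, refresh early so the repaint finishes before
    the new frame lands."""
    estimate = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    return estimate > MAX_WAIT_MS

# A game chugging along at ~30 FPS (33 ms frames) would trigger a
# preemptive mid-wait refresh; a steady 60 FPS would not:
print(should_preempt([33.0, 34.0, 32.5]))   # True
print(should_preempt([16.7, 16.9, 16.5]))   # False
```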
I asked AMD's David Glen, one of the engineers behind FreeSync, about how AMD's variable-refresh scheme handles this same sort of low-FPS scenario. The basic behavior is similar to G-Sync's. If the wait for a new frame exceeds the display's tolerance, Glen said, "we show the frame again, and show it at the max rate the monitor supports." Once the screen has been painted, which presumably takes less than 6.94 ms on a 144Hz display, the monitor should be ready to accept a new frame at any time.
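In Python, the behavior Glen describes might look something like this. It's a simplified model of my own, which ignores the repaint time itself and isn't based on AMD's code:

```python
MAX_WAIT_MS = 25.0   # 40Hz floor: longest FreeSync can wait for a new frame

def freesync_wait(frame_ready_in_ms):
    """Sketch of the low-FPS path: if the next GPU frame isn't ready before
    the panel's deadline, repeat the last frame at the monitor's max rate,
    then go right back to waiting for the new frame."""
    events = []
    waited = 0.0
    while frame_ready_in_ms - waited > MAX_WAIT_MS:
        events.append("repeat last frame")   # quick ~6.94 ms repaint, then ready again
        waited += MAX_WAIT_MS
    events.append("show new frame")
    return events

print(freesync_wait(60.0))   # two repeats, then the new frame
```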
What FreeSync apparently lacks is G-Sync's added timing logic to avoid collisions. However, FreeSync is capable of operating with vsync disabled outside of the display's refresh range. In the event of a collision with a required refresh, Glen pointed out, FreeSync can optionally swap to a new frame in the middle of that refresh. So FreeSync is not without its own unique means of dealing with collisions. Then again, the penalty for a collision with vsync enabled should be pretty minor. (My sense is that FreeSync should just paint the screen again with the new frame as soon as the current refresh ends.)
Everything I've just explained may seem terribly complicated, but the bottom line is straightforward. FreeSync's logic for handling low-FPS situations isn't anywhere near as bad as some folks have suggested, and it isn't all that different from G-Sync's. Nvidia's method of avoiding collisions seems like it might be superior in some ways, but we're talking about small differences.
You can see a difference between FreeSync and G-Sync in a contrived scenario involving a fixed frame rate below 40Hz. To record the video above, I ran Nvidia's "Pendulum" demo side by side on the XL2730Z and a G-Sync display, with the demo locked to 30 FPS on both systems. In this case, G-Sync's collision avoidance logic looks to be pretty effective, granting a marginal improvement in animation smoothness over the BenQ FreeSync monitor. (In most browsers, you can play the video at 60 FPS via YouTube's quality settings. Doing so will give you a more accurate sense of the motion happening here.)
The video above was shot with vsync enabled on the FreeSync display. If you turn off vsync, you'll see lots of tearing—an indication there are quite a few collisions happening in this scenario.
When playing a real game, though, the frame times are more likely to look something like the plot above most of the time—not a perfectly spaced sequence of frames, but a varied progression that makes collisions less likely.
Testing the practical impact of these differences in real games is tough. Nothing good is happening when your average frame rate is below 40 FPS, with bottlenecks other than the display's behavior coming into play. Sorting out what's a GPU slowdown and what's a display collision or quantization isn't always easy.
Still, I made an attempt in several games intensive enough to push our R9 290X below 40 FPS. Far Cry 4 was just a stutter-fest, with obvious system-based bottlenecks, when I cranked up the image quality. Crysis 3, on the other hand, was reasonably playable at around 35 FPS.
In fact, playing it was generally a good experience on the XL2730Z. I've seen low-refresh quantization effects before (by playing games on one of those 30Hz-only 4K monitors), and there was simply no sign of it here. I also had no sense of a transition happening when the frame rate momentarily ranged above 40Hz and then dipped back below it. The experience was seamless and reasonably fluid, even with vsync enabled for "out of bounds" frame intervals, which is how I prefer to play. My sense is that, both in theory and in practice, FreeSync handles real-world gaming situations at lower refresh rates in perfectly acceptable fashion. In fact, my satisfaction with this experience is what led me to push harder to understand everything I've explained above.
Remember, also, that we're talking about what happens when frame rates get too low. If you tune your image quality settings right, the vast majority of PC gaming should happen between 40 and 144Hz, not below the 40Hz threshold.