We got our first look at G-Sync a couple of months ago at an Nvidia press event in Montreal, and we came away impressed with the handful of demos being shown there. Now, we’ve had the chance to spend some quality time with early G-Sync hardware within the comfy confines of Damage Labs, and we have much more to say about the technology. Read on to see what we think.
So what is G-Sync?
In order to understand G-Sync, you have to understand a little bit about how current display technology works. If you’ve been hanging around TR for any length of time, you probably have a sense of these things. Today’s display tech is based on some fundamental assumptions borrowed from ye olde CRT monitors—as if an electron gun were still scanning rows of phosphors inside of today’s LCDs. Among the most basic of those assumptions is the refresh cycle, where updates are painted on the display at rapid but fixed intervals. Most monitors are refreshed at a rate of 60 times per second, or 60Hz. Going a little deeper, most LCDs still paint the screen much like a CRT: updating rows of pixels from left to right, starting at the top of the screen and scanning down to the bottom.
Updating the screen at fixed intervals can be a fine way to create the illusion of motion. Movies and television do it that way, and so do video games, by and large. However, most motion picture technologies capture images at a fixed rate and then play them back at that same rate, so everything works out nicely. The rich visuals produced by graphics processors in today’s video games are different. Graphics chips produce those images in real time by doing lots of math very quickly, crunching through many billions of floating-point operations each second. Even with all of that power on tap, the computational workloads vary widely as the camera moves through a dynamic, changing game world. Frame rendering times tend to fluctuate as a result. This reality is what has driven our move to frame-time-based performance testing, and we can draw an example from one of our recent GPU reviews to illustrate how frame rendering times vary. Here’s a look at how one of today’s faster graphics cards produces frames in Battlefield 4.
The plot above shows individual rendering times for a handful of frames. There’s really not tons of variance from frame to frame in this example, but rendering times still range from about 16 to 23 milliseconds. Zoom out a bit to look at a longer gameplay sequence, and the range of frame times grows.
The crazy thing is that the stem-winding plot you see above illustrates what we’d consider to be very decent performance. No single frame takes longer than 50 milliseconds to produce, and most of them are rendered much quicker than that. In the world of real-time graphics, that’s a nice looking frame time distribution. As you can imagine, though, matching up this squiggly plot with the regular cadence of a fixed refresh rate would be pretty much impossible.
Here’s what’s crazy: that impossibility sits at the heart of the interaction between GPUs and displays, and it plays out with every frame that’s produced. GPU rendering times vary, and display refresh rates do not. At its lowest level, the timing of in-game animation is kind of a mess.
For years, we’ve dealt with this problem by choosing between two different coping mechanisms, neither of them particularly good. The usual default method is a technology called vsync, or vertical refresh synchronization. Vsync involves storing completed frames in a buffer and only exposing a fresh, buffered frame when the time comes to paint the screen. This technique can work reasonably well when everything else in the system cooperates—when frames are coming out of the GPU at short, regular intervals.
Frame rendering times tend to vary, though, as we’ve noted. As a result, even with some buffering, the system may not have a frame ready at the start of each new refresh cycle. If there’s no new frame to be displayed when it’s time to paint the screen, the fallback option is to show the preceding frame once again and to wait for the next refresh cycle before flipping to a new one.
This wait for the next refresh interval drops the effective frame rate. The usual refresh interval for a 60Hz display is 16.7 milliseconds. Turn in a frame at every interval, and you’re gaming at a steady 60 FPS. If a frame takes 16.9 milliseconds to render—and is just 0.2 ms late to the party—it will have to wait the remaining 16.5 ms of the current interval before being displayed. The total wait time for a new frame, then, will be 33.3 ms—the equivalent of 30 FPS.
So the consequences for missing a single refresh interval are dire: half the performance and presumably half the perceived smoothness. Things get worse from there. Missing two intervals, with a frame that requires just over 33.3 ms to produce, delays the display update to 50 ms in length (equal to 20 FPS). Missing three intervals takes you to 66.7 ms, or 15 FPS. Those are your choices: 60 FPS, 30 FPS, 20 FPS, 15 FPS, and so on.
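If you’d like to see that staircase expressed in code, here’s a minimal sketch (my own illustration in Python, not anything from Nvidia) of how vsync rounds each frame’s effective display interval up to a whole number of refresh cycles:

```python
import math

REFRESH_MS = 1000.0 / 60.0  # 16.7 ms per refresh cycle at 60Hz

def displayed_interval(render_ms):
    """With vsync, a finished frame waits for the next refresh boundary, so its
    effective interval is the render time rounded up to whole refresh cycles."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (16.0, 16.9, 25.0, 33.4, 50.1):
    shown_ms = displayed_interval(render_ms)
    print(f"{render_ms:5.1f} ms to render -> {shown_ms:5.1f} ms on screen "
          f"({1000.0 / shown_ms:.0f} FPS)")
```

This glosses over how double- and triple-buffering let the GPU keep working in the meantime, but it captures the staircase: 60, 30, 20, or 15 FPS, and nothing in between.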
Now imagine what happens in action, as vsync works to map a wavy, up-and-down series of rendered frames to this stair-step series of effective animation rates. Hint: it ain’t exactly ideal. Here are a couple of examples Nvidia has mocked up to illustrate. They’re better than the examples I failed to mock up because I’m lazy.
This stair-step effect is known as quantization, and it’s the same effect that, in digital audio, can cause problems when mapping an analog waveform to a fixed sampling rate. Heck, I’m pretty sure we’re hearing the effects of intentionally exaggerated quantization in today’s autotune algorithms.
Quantization is not a friend to smooth animation. The second scenario plotted above, where frame rendering times range above and below the 16.7-ms threshold, is fairly common. The oscillation between update rates can lead to a halting, uneven sense of motion.
That’s true not just because of the quantized update rates alone, but because of the side effects of delaying frames. When buffered frames waiting in the queue are finally displayed at the next refresh interval, their contents will be temporally out of sync with their display time. After all, as frames are generated, the game engine has no knowledge about when they’ll be displayed. Also, buffering and delaying frames adds latency to the input-response feedback loop, reducing the immediacy of the experience. You’ll wait longer after clicking the mouse or pressing a key before you begin to see the corresponding action taking place onscreen.
Nvidia calls this quantization effect stuttering, and I suppose in a sense it is. However, I don’t think that’s a helpful term to use in this context. Display refresh quantization is a specific and well-understood problem, and its effects are distinct from the longer, more intermittent slowdowns that we usually describe as stuttering.
The downsides of vsync are bad enough that many gamers have opted to disable it instead. Turning off vsync is faster and more immediate, but it means the GPU will flip to a new frame while the display is being drawn. Thus, fragments of multiple rendered frames will occupy portions of the screen simultaneously. The seams between the frames are sometimes easy to see, and they create an artifact called tearing. If you’ve played a 3D game without vsync, you’ve probably seen tearing. Here’s a quick example from Borderlands 2:
Tearing is a huge penalty to pay in terms of visual fidelity. Without any synchronization between GPU render times and frame display times, tearing is likely to be happening somewhere onscreen almost all of the time—perhaps multiple times per refresh cycle, if the GPU is pumping out frames often enough. As with quantization, the type of game and the nature of the motion happening in the game world will influence how readily one perceives a problem.
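To put a rough number on how often tearing can occur (this is back-of-envelope math on my part, not anything from Nvidia), the expected number of seams per screen paint is roughly the GPU’s frame rate divided by the display’s refresh rate:

```python
def seams_per_refresh(gpu_fps, refresh_hz):
    """With vsync off, each buffer flip during scanout leaves a seam, so a
    refresh contains about gpu_fps / refresh_hz of them on average."""
    return gpu_fps / refresh_hz

print(seams_per_refresh(200, 60))   # ~3.3 seams somewhere onscreen each refresh
print(seams_per_refresh(85, 144))   # ~0.6: a seam on many, but not all, refreshes
```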
Like I said, neither of these coping methods is particularly good. G-Sync is intended to be a better solution. G-Sync’s goal is to refresh the display when the GPU has a frame ready, rather than on a fixed schedule. One could say that G-Sync offers a variable refresh rate, but it’s more about refresh times than rates, since it operates on a per-frame basis.
On a fast display with a 144Hz peak refresh rate, G-Sync can vary the refresh interval between 6.9 and 33.3 ms. That first number, 6.9 milliseconds, is the refresh interval at 144Hz. The second is equivalent to 30Hz or 30 FPS. If a new frame isn’t ready after 33.3 ms, G-Sync will paint the screen again with the prior frame. So the refresh interval isn’t infinitely variable, but it does offer pretty wide leeway.
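In rough terms, the refresh decision behaves something like the sketch below. This is my simplification of the behavior described above, not Nvidia’s actual firmware logic: the display paints a new frame as soon as one arrives, subject to the panel’s ~6.9-ms minimum and the ~33.3-ms timeout.

```python
MIN_INTERVAL_MS = 1000.0 / 144.0   # ~6.9 ms: the panel's fastest refresh
MAX_INTERVAL_MS = 1000.0 / 30.0    # ~33.3 ms: wait no longer than this

def gsync_refresh_interval(render_ms):
    """Return the time until the next screen paint and whether it shows a new
    frame, for a single G-Sync-style refresh decision."""
    if render_ms <= MAX_INTERVAL_MS:
        # The new frame arrived in time; show it as soon as the panel allows.
        return max(render_ms, MIN_INTERVAL_MS), True
    # The GPU is still busy; repaint the previous frame at the timeout.
    return MAX_INTERVAL_MS, False

for render_ms in (5.0, 12.5, 21.0, 40.0):
    interval, fresh = gsync_refresh_interval(render_ms)
    kind = "new frame" if fresh else "repeat of the prior frame"
    print(f"{render_ms:5.1f} ms render -> paint after {interval:4.1f} ms ({kind})")
```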
In theory and in practice, then, G-Sync is easily superior to the alternatives. There’s no tearing, so the visual integrity of displayed frames isn’t compromised, and display updates happen almost immediately once a frame is ready. Even though GPU frame rendering times vary, G-Sync’s output looks smoother than the quantized output from traditional vsync. That’s true in part because each frame’s contents correspond more closely to its display time. G-Sync also reduces the wait time imposed by the display refresh cycle, cutting input lag.
G-Sync isn’t a perfect solution by any means. It doesn’t eliminate the left-to-right, top-to-bottom motion by which displays are updated, for instance. The 33-ms frame time cap is a little less than ideal, too. Still, this is a far sight better than the antiquated approaches we’ve been using for years.
If you’ve managed to wade through these explanations so far, you might be wondering how hard it is to make something like G-Sync work. After all, the GPU acts as the timing source for the display. Couldn’t a variable refresh scheme be implemented in software, perhaps via a GPU driver update?
Turns out it’s not that simple. Doing refreshes at varying intervals creates all sorts of havoc for LCDs, including color/gamma shifting and the like. I had no idea such problems were part of the picture, but happily, Nvidia seems to have worked them out. You’d never know about any potential for trouble when seeing the early G-Sync solutions in action.
Still, making G-Sync work requires much more than a GPU driver update. Nvidia has developed a new control module for LCD monitors. Displays must be equipped with Nvidia’s custom electronics in order to support the new refresh scheme.
The first-gen G-Sync module is pictured on the right. The biggest chip on the module is an FPGA, or field programmable gate array, which can be made to perform a range of custom tasks. In this case, the FPGA is serving as the development and early deployment vehicle for G-Sync. The FPGA is accompanied by a trio of DDR3 memory chips, each 256MB in capacity. I doubt the module requires all 768MB of memory to do its thing, but it likely needs the bandwidth provided by three separate memory channels. Nvidia tells me this first G-Sync module can handle 4K resolutions at refresh rates up to 60Hz.
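As a rough illustration of the bandwidth involved (my back-of-envelope math, not an Nvidia spec), merely writing incoming frames into that buffer at the module’s claimed 4K/60Hz ceiling moves about 2 GB/s, before anything is read back out for scanout or processing:

```python
# Back-of-envelope pixel bandwidth for buffering a 4K stream at 60Hz,
# assuming 4 bytes per pixel. Scanning the buffer back out at least
# doubles this figure, before any protocol or processing overhead.
width, height, bytes_per_pixel, hz = 3840, 2160, 4, 60
write_gb_per_s = width * height * bytes_per_pixel * hz / 1e9
print(f"~{write_gb_per_s:.1f} GB/s just to store incoming 4K60 frames")
```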
We don’t yet have a lot of details on G-Sync monitors and how they’ll be priced. I expect to learn a lot more at CES next week. However, the early info seems to indicate that the G-Sync module will add about $100 to the price of a display. That’s a considerable premium, but I expect Nvidia could cut costs considerably by moving from an FPGA to a custom chip. In fact, since a G-Sync control chip would replace a traditional scaler ASIC, it shouldn’t add much cost at all to the total solution—eventually.
Nvidia seems to be intent on keeping G-Sync proprietary for the time being. As a result, if G-Sync succeeds in winning wide adoption, then Nvidia will have established itself as a provider of display electronics. Also, folks who want to take advantage of G-Sync tech will need to own a GPU based on the Kepler generation of technology—essentially any GeForce GT or GTX 600- or 700-series graphics card.
Although having an open standard for display technology of this sort would be ideal, there’s reason to be pleased about Nvidia’s involvement in display chips going forward. There’s plenty of room for further innovation beyond the first-gen G-Sync tech. For instance, at the event in Montreal, John Carmack mentioned a low-persistence mode that, similar to Nvidia’s LightBoost tech for 3D glasses, strobes the display backlight after each frame has been painted in order to reduce motion blur. I don’t think that mode is ready for prime time yet, but that sort of technology could further mitigate the weaknesses of current displays.
We got the chance to take an early look at G-Sync in action using the monitor above. That’s an Asus VG248QE that’s been retrofitted with a pre-production G-Sync module. This display is marketed to gamers, and it’s based on a 24″ TN panel, with a 1920×1080 resolution and all of the attendant goodness and badness that comes with that sort of display. It’s blazing fast, with a 144Hz refresh rate and a claimed one-millisecond gray-to-gray response time, which makes it ideal for this sort of mission. Beyond that, to my eyes spoiled by constant exposure to glorious IPS panels, there’s little to recommend it. The color reproduction is poor, with obvious banding on simple gradients, and the panel has more light bleed than my neighbor’s Christmas display. The resolution, pixel density, panel size, and viewing angles are all relatively poor.
(Told you my eyes were spoiled. You try sitting in front of an Asus 4K panel for several weeks and then switching to something else.)
Once you have the right hardware and Nvidia drivers installed, using G-Sync is fairly straightforward. Just set the refresh rate to 144Hz, pop over to the “Manage 3D settings” portion of the Nvidia control panel, and choose “G-Sync” as the vertical sync method. After that, G-Sync should be working.
There are a few limitations to navigate, though. In order to use G-Sync, games and other applications must be running in full-screen mode, not in a window. Also, many games have their own vsync settings and don’t entirely comprehend what’s happening when G-Sync is enabled. Nvidia recommends setting vsync to “disabled” in any in-game menus. Even then, some games will limit frame rates to 60 FPS, so you may need to tweak the “Preferred refresh rate” option shown above in the Nvidia control panel to “highest available.” Doing so allowed me to work around problems with a couple of games, including Call of Duty: Ghosts, without any obvious negative side effects. However, really high frame rates in Skyrim can make that game’s physics engine go bonkers, so handle with care.
So how well does it work?
Once I had the G-Sync setup working and ready to roll, I was hit by a realization that prompted a bit of soul searching. You see, as I’ve mentioned, I spend most of my computer time looking at IPS displays with amazing color reproduction. Those same displays, though, are largely limited to 60Hz refresh rates. That’s fine as far as it goes, but I’m very much aware that higher refresh rates can make for smoother gaming.
If you haven’t played with a 120 or 144Hz monitor, and if you’ve heard the oft-repeated myth about the human eye not being able to detect anything faster than 60 FPS, you might think that high-refresh displays aren’t any big deal. You’d be wrong. Just dragging an Explorer window across the desktop at 60Hz and then at 120Hz in a side-by-side setup is enough to illustrate how much smoother motion can look at twice the frame rate. There’s no question that a fun and fluid gaming experience is possible at 60Hz, but taking things up to 120 or 144Hz? More fluid and more fun, especially for quick-twitch games like first-person shooters.
I realized I needed to be careful to distinguish the benefits of G-Sync from the goodness provided by a high-refresh panel alone.
For its part, Nvidia doesn’t seem to see any contradiction here. After all, part of its pitch is about the benefits of the faster displays that will be equipped with G-Sync. Also, interestingly enough, Nvidia claims G-Sync’s most obvious benefits will come at lower performance, when frame rates range between 40 and 60 FPS. I think that’s true, but the benefits of fast displays at high frame rates aren’t exactly trivial, either. They’re just not entirely exclusive to G-Sync.
Once I cranked up the G-Sync setup and began to play games, though, it didn’t take long to squelch my worries. My concerns about where the goodness came from, my snobby complaints about light bleed and color banding—these things all melted away as I was sucked into the reality unfolding with freakish fluidity in front of my eyes.
I’ll tell you more about my impressions shortly, but this is TR, where we’re slaves to empirical testing when possible. Let’s have a look at some slow-motion video captures.
Some slow-mo video examples
I wasn’t sure I could capture the benefits of G-Sync with my cheap slow-motion video camera, but it wasn’t terribly hard to come up with a meaningful example. All I had to do was find a scenario where the differences between vsync, no vsync, and G-Sync were apparent to the naked eye.
In this particular case, I wanted to compare the benefits of G-Sync to regular vsync on our fast 144Hz display. The scenario I came up with is in Skyrim, at 1080p resolution, on a GeForce GTX 660 graphics card. Frame rates fluctuate from about 60 to 85 FPS, and in this case, the camera pans around my character in a 360° arc, creating the sort of motion where both tearing and vsync quantization are easily spotted.
I recorded the results at 240 FPS, substantially faster than the display can refresh itself. Please pardon the low resolution and especially the different exposure levels. The camera auto-adjusted to make the scene look darker at lower refresh rates, and there wasn’t anything I could do to remedy that. That said, I still think these little videos, which came out of the camera unretouched and went straight onto YouTube, do a decent enough job of illustrating the relative smoothness and motion-related artifacts of the various display modes. Let me walk you through them.
At 60Hz without vsync, you can see how motion steps forward with each display refresh in very noticeable increments. Also, tearing is readily apparent. Oftentimes, multiple “seams” between rendered frames are onscreen at once. The tearing becomes easiest to notice starting at about 20 seconds into the video, as the city walls and the cathedral lurch by unevenly.
In this case, since frame rendering rates are generally above 60 FPS, switching on vsync looks like a clear win. The tearing artifacts are gone, and the animation advances fairly evenly. However, objects near or far from the camera, in the foreground and background, still advance in relatively large steps from one frame to the next.
Ah, see? At more than twice the refresh rate of the previous example, the animation looks much more fluid. However, without vsync, tearing is still prominent. The worst of it begins at about 16 seconds into the video, as the advancing hillside and walls appear to “waggle” from top to bottom as new and old frames intermix onscreen. Kinda ugly.
Here’s the near-perfect experience offered by G-Sync. The tearing is gone, the waggle is banished, and the animation is vastly more fluid. At full speed, the result is glassy smooth perceived motion. Compare this video to the one at 60Hz without vsync, which is how most hard-core gamers tend to play, and the contrast is stark. Good grief. What have we been doing all these years?
Once you’ve seen the G-Sync video, you can appreciate how even this mode, with vsync enabled at 144Hz, isn’t ideal. Watch as the cathedral swings by in the background with the halting, uneven motion caused by vsync quantization. Almost looks as if the frame update intervals are short-short-long, short-short-long, and so on. The animation looks a darn sight better than vsync at 60Hz, but even at 144Hz, G-Sync’s output is simply more correct and desirable.
As the videos demonstrate, G-Sync has tangible benefits over a fixed-rate 144Hz monitor, with or without conventional vsync. Nvidia says the improvements will be most easily perceptible at lower frame rates, from 40-60 FPS, and that makes quite a bit of sense. After all, that’s where vsync quantization is the worst. I tried several things to put this theory to the test.
First, I cranked up the image quality in games like Crysis 3 and Arkham Origins to get frame rates to drop into the range in question. I then compared G-Sync at 144Hz (max) to conventional vsync at 60Hz and 144Hz. G-Sync’s benefits were obvious over regular vsync at 60Hz in these cases, even more so than in our Skyrim example with higher frame rates. The difference was subtler at 144Hz, but G-Sync was still an incremental improvement over conventional vsync. Again, the degree to which you’d notice the difference was dictated by the type of movement happening in the moment.
Out of curiosity, I also wanted to see what G-Sync could do for a slower display, like an IPS panel with a 60Hz peak refresh rate, so I used the Nvidia control panel to cap the monitor at 60Hz. At these frame rates and with vsync enabled at 60Hz, frame output in Arkham Origins was often quantized to 30Hz or less. As a result, the game felt sluggish, with somewhat jerky animation. Turning on G-Sync with a 60Hz upper limit was a major improvement. It wasn’t quite as nice as G-Sync at 144Hz, of course, but playability was clearly improved. There’s a case to be made for outfitting a 60Hz panel with G-Sync control logic, no question.
However, I have to admit that the majority of my G-Sync testing time was spent at much higher frame rates, in Borderlands 2, a glassy-smooth Unreal Engine-based shooter with a frenzied, kinetic feel. This is my favorite game, and it’s my favorite way to use a brand-new gaming-focused display tech. I played on the GeForce GTX 660 at frame rates from 60-85 FPS. I played on a GTX 760 with frame rates ranging to 100 FPS and beyond. And I played on a GeForce GTX 780 Ti, where frame rates went as high as 120-144 FPS. Holy crap, it was awesome in every case, and more so with the faster graphics cards.
I tend to have a wee bit of an addictive personality, and playing BL2 with G-Sync’s creamy smoothness fed that tendency in wonderful and dangerous ways. That’s probably why this G-Sync write-up is conspicuously late, and it’s definitely why my Christmas shopping was a perilously last-minute affair. I played through the rest of the Tiny Tina DLC, burned through the Headhunter packs, and found myself pondering another play-through when I finally had to stop myself and focus on Yuletide cheer rather than looting another bandit. Borderlands 2 offers very immediate feedback; it doesn’t use double- or triple-buffering, which is why its FCAT and Fraps results closely correlate. The speed and fluidity of that game on this G-Sync display is like a combo meth/crack IV drip. I could probably make “faces of G-Sync” a thing, if others are affected by it like I am.
I should note that not even a wicked-fast graphics card, a Sandy-Bridge-E CPU, and a G-Sync display can deliver perfectly smooth gameplay all of the time. You will notice the occasional hitch where a frame or two has taken too long to draw. We measure those all of the time in our performance tests, and the really fast, low-latency nature of G-Sync makes those hiccups keenly felt. G-Sync doesn’t solve every problem, even though it’s a vast improvement in display synchronization.
So I’m a fan of G-Sync as a technology. The questions now are about how the tech gets implemented. At present, you can only get G-Sync upgrade kits from select vendors, for installation in this Asus monitor—or you can pay extra to have the module installed for you. I suspect Asus will start selling a version of the monitor with the G-Sync module already integrated before long.
That will be a good start, but there’s still plenty of room for improvement. The Asus VG248QE monitor’s LCD panel just isn’t very good, even by TN standards. Some TN panels are pretty decent, believe it or not, and if we can’t get IPS displays in the first wave, I’m hoping we’ll see a few higher-quality TN panels built into new G-Sync monitors unveiled at CES. Nvidia’s module is capable of driving everything up to 4K displays, so here’s hoping we have a variety of choices as the year progresses.
I can tell you what I’d like to see happen. I’d like to see somebody produce a G-Sync-compatible monitor based on a 27″ IPS panel with a 2560×1440 pixel grid at a reasonable price. We already know that folks have had some success overclocking their cheap 27″ Korean IPS monitors to 85Hz, 100Hz, or better. An affordable 27″ IPS monitor with a peak refresh of 85 or 100Hz would be a very nice thing indeed. In my view, a panel of that sort with G-Sync and a low refresh interval would be a far better choice for gaming than even one of the fancy 60Hz 4K panels here in Damage Labs. You’re better off spreading your pixels per second across successive frames than jamming them all into really tight pixel pitches, honestly.
Looking further ahead, one would hope that something like G-Sync could become a broadly supported standard for computer displays at some point in the not-too-distant future, so the entire industry can embrace this sort of functionality. Because it’s very much the right thing to do.
Twitter implements a 140-character low-pass filter on my thoughts.