
The hardware
If you've managed to wade through these explanations so far, you might be wondering how hard it is to make something like G-Sync work. After all, the GPU acts as the timing source for the display. Couldn't a variable refresh scheme be implemented in software, perhaps via a GPU driver update?

Turns out it's not that simple. Doing refreshes at varying intervals creates all sorts of havoc for LCDs, including color/gamma shifting and the like. I had no idea such problems were part of the picture, but happily, Nvidia seems to have worked them out. You'd never know about any potential for trouble when seeing the early G-Sync solutions in action.

Still, making G-Sync work requires much more than a GPU driver update. Nvidia has developed a new control module for LCD monitors. Displays must be equipped with Nvidia's custom electronics in order to support the new refresh scheme.

The first-gen G-Sync module is pictured on the right. The biggest chip on the module is an FPGA, or field programmable gate array, which can be made to perform a range of custom tasks. In this case, the FPGA is serving as the development and early deployment vehicle for G-Sync. The FPGA is accompanied by a trio of DDR3 memory chips, each 256MB in capacity. I doubt the module requires all 768MB of memory to do its thing, but it likely needs the bandwidth provided by three separate memory channels. Nvidia tells me this first G-Sync module can handle 4K resolutions at refresh rates up to 60Hz.
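For the curious, a quick back-of-the-envelope calculation shows why bandwidth matters more here than raw capacity. The 24-bit color assumption and the simple streaming model below are mine, not Nvidia's; the module's actual internal pixel format and buffering scheme aren't public.

```python
# Back-of-the-envelope pixel bandwidth for a 4K stream at 60Hz,
# assuming 24-bit color and ignoring blanking intervals and whatever
# buffering the module does internally. Illustrative only.
width, height = 3840, 2160
refresh_hz = 60
bytes_per_pixel = 3  # assumed 24-bit RGB

bytes_per_frame = width * height * bytes_per_pixel
bandwidth = bytes_per_frame * refresh_hz  # bytes per second

print(f"One frame:  {bytes_per_frame / 2**20:.1f} MB")
print(f"Sustained:  {bandwidth / 2**30:.2f} GB/s")
# One frame is only ~23.7 MB, so 768MB of capacity is overkill.
# But streaming frames in and out at 60Hz means roughly 1.4 GB/s
# each way, which is where three DDR3 channels start to earn their keep.
```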

We don't yet have a lot of details on G-Sync monitors and how they'll be priced. I expect to learn a lot more at CES next week. However, the early info seems to indicate that the G-Sync module will add about $100 to the price of a display. That's a considerable premium, but I expect Nvidia could cut costs substantially by moving from an FPGA to a custom chip. In fact, since a G-Sync control chip would replace a traditional scaler ASIC, it shouldn't add much cost at all to the total solution, at least eventually.

Nvidia seems to be intent on keeping G-Sync proprietary for the time being. As a result, if G-Sync succeeds in winning wide adoption, then Nvidia will have established itself as a provider of display electronics. Also, folks who want to take advantage of G-Sync tech will need to own a GPU based on the Kepler architecture, which essentially means a GeForce GTX 600- or 700-series graphics card.

Although having an open standard for display technology of this sort would be ideal, there's reason to be pleased about Nvidia's involvement in display chips going forward. There's plenty of room for further innovation beyond the first-gen G-Sync tech. For instance, at the event in Montreal, John Carmack mentioned a low-persistence mode that, similar to Nvidia's LightBoost tech for 3D glasses, strobes the display backlight after each frame has been painted in order to reduce motion blur. I don't think that mode is ready for prime time yet, but that sort of technology could further mitigate the weaknesses of current displays.
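To get a feel for why low persistence matters, consider a rough model: on a conventional sample-and-hold LCD, each frame stays lit for the full refresh interval, and your eye smears it across whatever distance it tracks in that time. The sketch below uses invented figures for tracking speed and strobe length; I don't know the actual timing Nvidia has in mind.

```python
# Rough motion-blur model: sample-and-hold LCD vs. a strobed backlight.
# Perceived smear ~ (eye tracking speed) x (time each frame stays lit).
# All numbers here are illustrative, not measured.
track_speed = 1000  # pixels/second the eye tracks across the panel

def smear_px(persistence_s):
    return track_speed * persistence_s

hold_120hz = 1 / 120   # full-persistence frame at 120Hz
strobe = 0.001         # hypothetical 1ms backlight flash per frame

print(f"Sample-and-hold @120Hz: {smear_px(hold_120hz):.1f} px of smear")
print(f"1ms strobe:             {smear_px(strobe):.1f} px of smear")
# ~8.3 px vs. ~1.0 px: shorter persistence, less perceived blur.
```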

We got the chance to take an early look at G-Sync in action using the monitor above. That's an Asus VG248QE that's been retrofitted with a pre-production G-Sync module. This display is marketed to gamers, and it's based on a 24" TN panel, with a 1920x1080 resolution and all of the attendant goodness and badness that come with that sort of display. It's blazing fast, with a 144Hz refresh rate and a claimed one-millisecond gray-to-gray response time, which makes it ideal for this sort of mission. Beyond that, to my eyes spoiled by constant exposure to glorious IPS panels, there's little to recommend it. The color reproduction is poor, with obvious banding on simple gradients, and the panel has more light bleed than my neighbor's Christmas display. The resolution, pixel density, panel size, and viewing angles are all relatively poor. (Told you my eyes were spoiled. You try sitting in front of an Asus 4K panel for several weeks and then switching to something else.)

Once you have the right hardware and Nvidia drivers installed, using G-Sync is fairly straightforward. Just set the refresh rate to 144Hz, pop over to the "Manage 3D settings" portion of the Nvidia control panel, and choose "G-Sync" as the vertical sync method. After that, G-Sync should be working.

There are a few limitations to navigate, though. In order to use G-Sync, games and other applications must be running in full-screen mode, not in a window. Also, many games have their own vsync settings and don't entirely comprehend what's happening when G-Sync is enabled. Nvidia recommends setting vsync to "disabled" in any in-game menus. Even then, some games will limit frame rates to 60Hz, so you may need to tweak the "Preferred refresh rate" option shown above in the Nvidia control panel to "highest available." Doing so allowed me to work around problems with a couple of games, including Call of Duty: Ghosts, without any obvious negative side effects. However, really high frame rates in Skyrim can make that game's physics engine go bonkers, so handle with care.
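That Skyrim quirk is a classic symptom of a physics engine stepped once per rendered frame with a baked-in assumption about how long a frame takes. Here's a toy sketch of the failure mode; the structure and numbers are illustrative, not Skyrim's actual code.

```python
# Why frame-rate-coupled physics misbehaves at high refresh rates.
# A toy integrator that (incorrectly) assumes each rendered frame
# represents a fixed 1/60s of simulation time, stepped once per frame.
ASSUMED_DT = 1 / 60  # what the engine was tuned for

def simulate(fps, seconds=1.0, velocity=5.0):
    """Advance a position once per rendered frame for `seconds`."""
    position = 0.0
    for _ in range(int(fps * seconds)):
        position += velocity * ASSUMED_DT  # dt never adapts to real fps
    return position

print(simulate(fps=60))   # 5.0  -- the intended result
print(simulate(fps=144))  # 12.0 -- the world runs 2.4x too fast
```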

So how well does it work?
Once I had the G-Sync setup working and ready to roll, I was hit by a realization that prompted a bit of soul searching. You see, as I've mentioned, I spend most of my computer time looking at IPS displays with amazing color reproduction. Those same displays, though, are largely limited to 60Hz refresh rates. That's fine as far as it goes, but I'm very much aware that higher refresh rates can make for smoother gaming.

If you haven't played with a 120 or 144Hz monitor, and if you've heard the oft-repeated myth about the human eye not being able to detect anything faster than 60 FPS, you might think that high-refresh displays aren't any big deal. You'd be wrong. Just dragging an Explorer window across the desktop at 60Hz and then at 120Hz in a side-by-side setup is enough to illustrate how much smoother motion can look at twice the frame rate. There's no question that a fun and fluid gaming experience is possible at 60Hz, but taking things up to 120 or 144Hz? More fluid and more fun, especially for quick-twitch games like first-person shooters.
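The arithmetic behind that window-dragging demo is straightforward: at a fixed drag speed, doubling the refresh rate halves the distance the window jumps between successive frames, and smaller jumps read as smoother motion. The drag speed below is an arbitrary illustrative figure.

```python
# How far a dragged window jumps between successive frames at a
# given pointer speed. The speed is an invented example value.
drag_speed = 1200  # pixels/second

for refresh_hz in (60, 120, 144):
    step = drag_speed / refresh_hz
    print(f"{refresh_hz:>3}Hz: {step:5.1f} px per frame")
# 60Hz: 20.0 px, 120Hz: 10.0 px, 144Hz: ~8.3 px per frame
```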

I realized I needed to be careful about understanding the benefits of G-Sync above and beyond the goodness provided by a high-refresh panel.

For its part, Nvidia doesn't seem to see any contradiction here. After all, part of its pitch is about the benefits of the faster displays that will be equipped with G-Sync. Also, interestingly enough, Nvidia claims G-Sync's most obvious benefits will come at lower performance levels, when frame rates range between 40 and 60 FPS. I think that's true, but the benefits of fast displays at high frame rates aren't exactly trivial, either. They're just not entirely exclusive to G-Sync.
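It's easy to see why that 40-60 FPS range is the sweet spot. With vsync on a fixed 60Hz panel, every frame must occupy a whole number of 16.7-ms refresh intervals, so a frame that just misses a refresh lingers on screen twice as long as it should. A simplified sketch of that quantization (assuming no triple buffering or driver queuing):

```python
# Frame display times with vsync on a fixed 60Hz panel vs. a
# variable-refresh display, for a steady 45 FPS render rate.
# Deliberately simplified: no triple buffering, no driver queuing.
import math

REFRESH = 1 / 60          # 16.7ms refresh interval
render_time = 1 / 45      # 22.2ms to render each frame

# Vsync: round each frame up to the next whole refresh interval.
vsync_display = math.ceil(render_time / REFRESH) * REFRESH
print(f"vsync @60Hz: {vsync_display * 1000:.1f} ms on screen (30 FPS)")

# Variable refresh: show the frame for exactly as long as it took.
print(f"G-Sync:      {render_time * 1000:.1f} ms on screen (45 FPS)")
```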

Once I cranked up the G-Sync setup and began to play games, though, it didn't take long to squelch my worries. My concerns about where the goodness came from, my snobby complaints about light bleed and color banding—these things all melted away as I was sucked into the reality unfolding with freakish fluidity in front of my eyes.

I'll tell you more about my impressions shortly, but this is TR, where we're slaves to empirical testing when possible. Let's have a look at some slow-motion video captures.