We've been excited about Nvidia's G-Sync technology for nearly a year, since the firm first unveiled the concept to the world at a press event last fall. The basic idea—letting the graphics card tell the display to update itself when the next frame of animation is ready—is simple yet revolutionary. Getting rid of the fixed display refresh interval had an immediate, dramatic impact in those initial demos at Nvidia's announcement.
We liked G-Sync just as well once we got to spend time with it in pre-production form early this year. Heck, I kind of fell down the rabbit hole while testing it and wound up spending way too much time just, you know, playing games.
That said, the first G-Sync monitor we tested was by no means ready for prime time. The variable refresh rates worked, sure, but other basic display functions, like on-screen menus and color dithering to prevent banding, weren't implemented yet. Getting that stuff together—and refining the operation of G-Sync to be as widely compatible as possible—has taken the better part of 2014.
Happily, the first production G-Sync monitor has finally arrived in Damage Labs, and it was easily worth the wait. The monitor comes from Asus, the ROG Swift PG278Q. Actually, I believe the full and official name is ASUS ROG SWIFT PG278Q, if you want to get technical, but WHY ARE WE SHOUTING?
The basic specs and stuff
I dunno, maybe a little shouting is warranted, because this is a heck of a nice place to start with G-Sync displays. The PG278Q's display area measures 27" corner to corner and has a resolution of 2560x1440 pixels. Here are the rest of its vitals.
| Spec | Value |
| --- | --- |
| Panel size | 27" diagonal |
| Peak refresh rate | 144Hz; variable via G-Sync |
| Display colors | 16.7 million |
| Max brightness | 350 cd/m² |
| Peak contrast ratio | 1000:1 |
| Optimal viewing angles | 170° horizontal, 160° vertical |
| Response time (gray to gray) | 1 ms |
| Display surface | Matte anti-glare |
| Inputs | 1x DisplayPort 1.2, 1x USB 3.0 |
| Outputs | 2x USB 3.0 |
| Peak power draw | 90W |
| Wall mount support | VESA 100 x 100 mm |
| Weight | 15.4 lbs (7 kg) |
The LCD panel in this monitor is of the twisted nematic (TN) variety. I'm sure that choice will prove controversial in some circles, since TN panels are not known for stellar color fidelity at broad viewing angles. As we've noted, though, not all TN panels are created equal. The PG278Q is more capable than most. It can display eight bits per color channel, which means it can produce up to 16.7 million colors, all told.
Interestingly, the color story here goes a little deeper than that. Most monitors incorporate a display logic chip from some quasi-anonymous third party. That chip provides things like scaling to non-native resolutions, brightness and contrast control, and support for various input types. In the PG278Q, that work is done by Nvidia's G-Sync module. Nvidia tells us the G-Sync module does its internal processing at 10 bits per color channel. The module then uses a form of temporal dithering called FRC to approximate higher-precision images on the PG278Q's eight-bit panel. FRC is pretty widely used, including in the affordable 4K Asus PB287Q that we reviewed a while back, but it seems like a high-zoot feature for the first G-Sync monitor.
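To make the FRC idea concrete, here's a minimal conceptual sketch — not Nvidia's actual implementation, just an illustration of the principle — of how alternating between the two nearest 8-bit values over successive frames can approximate a 10-bit level:

```python
def frc_sequence(target_10bit, frames=4):
    # FRC (frame rate control) approximates a 10-bit level on an 8-bit
    # panel by flickering between the two nearest 8-bit values, so the
    # average over several frames lands on the in-between intensity.
    base = target_10bit // 4       # nearest lower 8-bit level (10-bit has 4x the steps)
    remainder = target_10bit % 4   # fractional part to synthesize temporally
    return [base + (1 if i < remainder else 0) for i in range(frames)]

seq = frc_sequence(514)
print(seq)                 # [129, 129, 128, 128]
print(sum(seq) / len(seq)) # 128.5, i.e. 514 on the 10-bit scale
```

Because the alternation happens at the panel's refresh rate, the eye integrates the flicker into a single intermediate shade.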
Anyhow, TN panels do have a clear upside: they're fast. The PG278Q can update itself at a peak rate of 144Hz, which works out to a gap of less than seven milliseconds between successive frames. The gray-to-gray response times for switching individual pixels are even shorter, rated at just one millisecond. If you're after smooth gaming, that kind of quickness is easy to appreciate. No doubt that's why Asus chose this panel for its first G-Sync display.
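The arithmetic behind that "less than seven milliseconds" claim is simple enough to sketch:

```python
def refresh_interval_ms(rate_hz):
    # Time between successive refreshes at a fixed rate:
    # interval (ms) = 1000 / rate (Hz).
    return 1000.0 / rate_hz

print(round(refresh_interval_ms(60), 2))   # 16.67 ms at a conventional 60Hz
print(round(refresh_interval_ms(144), 2))  # 6.94 ms at the PG278Q's peak 144Hz
```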
The PG278Q's pixel density is a decidedly non-weird 109 pixels per inch, the same as those 27" Korean IPS monitors that everybody went gaga over a couple of years ago. That means you won't run into any of the weird image or font sizing issues that you might with one of those super-dense new 4K monitors. It also means you won't get the razor-sharp outlines of a high-PPI display, either.
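If you want to check that density figure yourself, pixel density is just the diagonal pixel count divided by the diagonal size:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_in):
    # Diagonal resolution in pixels via the Pythagorean theorem,
    # divided by the panel's diagonal measurement in inches.
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

print(round(pixels_per_inch(2560, 1440, 27)))  # 109
```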
That fact wouldn't really bug me too much if it weren't for this next bit. You see, the ROG Swift PG278Q will set you back quite a few bones. The list price is a very healthy $799.99. So you'll be paying more than the $649 that Asus is asking for its 28" 4K 60Hz monitor without G-Sync. That may be a hard pill to swallow.
Then again, we're going through a time of tremendous innovation in display technologies. There aren't many easy choices right now—but there sure are a lot of good ones. Before you dismiss the PG278Q for its hefty price and TN panel tech, let me say this: this thing is probably the finest gaming monitor on the planet. So there's that. If you don't want to find yourself contemplating parting with 800 bucks for a TN panel, do not—I repeat, do not—seat yourself in front of one and play video games.
A few words about G-Sync
I've already said that G-Sync lets the GPU inform the display when it's time to draw a new frame. That's a fundamental change from the operation of conventional displays, which typically update themselves 60 times per second.
Synchronizing the display with the GPU has a number of benefits over the coping methods we've been using to date. The usual default method, vertical refresh sync or vsync, involves delaying each new frame created by the graphics processor until the next available display refresh cycle.
Delays in the graphics-to-display pipeline aren't great. They're not too big a deal if the GPU is able to produce new frames consistently every 16.7 milliseconds (or 60 times per second). Too often, though, that doesn't happen. If the GPU takes just a smidgen longer to render the next frame, say 16.9 milliseconds, then the system must wait for two full refresh intervals to pass before putting new information on the screen. Suddenly, the frame rate has dropped from 60 FPS to 30 FPS.
Things go even further sideways if a frame takes more than two intervals to produce. You'll end up waiting 50 milliseconds for the next frame to hit the display, and at that point, you're likely to notice that the sense of fluid animation is compromised. Your character may also end up being scattered in a shower of giblets across the floor.
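The quantization effect described above is easy to model: with vsync enabled, a frame's on-screen time is effectively rounded up to a whole number of refresh intervals. A minimal sketch:

```python
import math

REFRESH_MS = 1000.0 / 60  # one 60Hz refresh interval, ~16.7 ms

def displayed_interval_ms(render_ms):
    # With vsync on, a finished frame must wait for the next refresh
    # boundary, so its effective display time is rounded up to a whole
    # number of refresh intervals.
    intervals = math.ceil(render_ms / REFRESH_MS)
    return intervals * REFRESH_MS

print(round(displayed_interval_ms(16.9), 1))  # 33.3 ms -> effectively 30 FPS
print(round(displayed_interval_ms(35.0), 1))  # 50.0 ms -> three intervals
```

A variable refresh scheme like G-Sync removes the rounding: a 16.9 ms frame is displayed after 16.9 ms, not 33.3.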
Frame production times tend to vary pretty widely, as we've often noted and as illustrated in the plot above, so vsync-induced slowdowns can be a real problem. Many gamers sidestep this issue by disabling vsync and letting the GPU flip to a new frame even as the display is being drawn. Going commando on vsync has the advantage of getting new information to the screen sooner, but it has the obvious downside of chafing. Err, I mean, tearing, which looks like so:
Seeing these seams between successive frames can be distracting and downright annoying, especially because it sometimes happens multiple times per refresh.
<commercial guy voice> There has to be a better way. </commercial guy voice>
Nvidia's answer is putting the display refresh timing under the control of the GPU. In theory, a variable refresh technology like G-Sync should banish tearing, eliminate vsync-induced slowdowns, reduce input lag, and allow for more fluid animation for 3D games.
The downside? G-Sync is Nvidia's own proprietary technology. If you want to use the PG278Q's variable refresh feature, then you have to connect it to a recent GeForce graphics card. If at some point down the road you decide to switch to a Radeon, you can still use the PG278Q, but you'll lose out on variable refresh rates.
I can see Nvidia's reasons for keeping G-Sync to itself. Real money went into developing this technology, and the company would like to reap the benefits of its innovation. Still, I think everybody involved here probably realizes that an open standard for variable refresh rates, likely based on VESA's Adaptive-Sync addition to the DisplayPort spec, is the best outcome when all is said and done.
Thing is, the Adaptive-Sync monitors being shepherded to market by AMD's Project FreeSync won't be shipping until sometime in 2015. G-Sync is here now.