Asus’ ROG Swift PG278Q G-Sync monitor reviewed

We’ve been excited about Nvidia’s G-Sync technology for nearly a year, since the firm first unveiled the concept to the world at a press event last fall. The basic idea—letting the graphics card tell the display to update itself when the next frame of animation is ready—is simple yet revolutionary. Getting rid of the fixed display refresh interval had an immediate, dramatic impact in those initial demos at Nvidia’s announcement.

We liked G-Sync just as well once we got to spend time with it in pre-production form early this year. Heck, I kind of fell down the rabbit hole while testing it and wound up spending way too much time just, you know, playing games.

That said, the first G-Sync monitor we tested was by no means ready for prime time. The variable refresh rates worked, sure, but other basic display functions, like on-screen menus and color dithering to prevent banding, weren’t implemented yet. Getting that stuff together—and refining the operation of G-Sync to be as widely compatible as possible—has taken the better part of 2014.

Happily, the first production G-Sync monitor has finally arrived in Damage Labs, and it was easily worth the wait. That monitor is Asus’ ROG Swift PG278Q. Actually, I believe the full and official name is ASUS ROG SWIFT PG278Q, if you want to get technical, but WHY ARE WE SHOUTING?

The basic specs and stuff

I dunno, maybe a little shouting is warranted, because this is a heck of a nice place to start with G-Sync displays. The PG278Q’s display area measures 27″ corner to corner and has a resolution of 2560×1440 pixels. Here are the rest of its vitals.

Panel size: 27″ diagonal
Native resolution: 2560×1440
Aspect ratio: 16:9
Panel type/backlight: TN/WLED
Peak refresh rate: 144Hz; variable via G-Sync
Display colors: 16.7 million
Max brightness: 350 cd/m²
Peak contrast ratio: 1000:1
Optimal viewing angles: 170° horizontal, 160° vertical
Response time (gray to gray): 1 ms
Display surface: Matte anti-glare
HDCP support: Yes
Inputs: 1 x DisplayPort 1.2, 1 x USB 3.0
Outputs: 2 x USB 3.0
Peak power draw: 90W
Wall mount support: VESA 100 x 100 mm
Weight: 15.4 lbs (7 kg)

The LCD panel in this monitor is of the twisted nematic (TN) variety. I’m sure that choice will prove controversial in some circles, since TN panels are not known for stellar color fidelity at broad viewing angles. As we’ve noted, though, not all TN panels are created equal. The PG278Q is more capable than most. It can display eight bits per color channel, which means it can produce up to 16.7 million colors, all told.

Interestingly, the color story here goes a little deeper than that. Most monitors incorporate a display logic chip from some quasi-anonymous third party. That chip provides things like scaling to non-native resolutions, brightness and contrast control, and support for various input types. In the PG278Q, that work is done by Nvidia’s G-Sync module. Nvidia tells us the G-Sync module does its internal processing at 10 bits per color channel. The module then uses a form of temporal dithering called FRC (frame rate control) to approximate higher-precision images on the PG278Q’s eight-bit panel. FRC is pretty widely used, including in the affordable 4K Asus PB287Q that we reviewed a while back, but it seems like a high-zoot feature for the first G-Sync monitor.

Anyhow, TN panels do have a clear upside: they’re fast. The PG278Q can update itself at a peak rate of 144Hz, which works out to a gap of less than seven milliseconds between successive frames. The gray-to-gray response times for switching individual pixels are even shorter, rated at just one millisecond. If you’re after smooth gaming, that kind of quickness is easy to appreciate. No doubt that’s why Asus chose this panel for its first G-Sync display.

The PG278Q’s pixel density is a decidedly non-weird 109 pixels per inch, the same as those 27″ Korean IPS monitors that everybody went gaga over a couple of years ago. That means you won’t run into any of the weird image or font sizing issues that you might with one of those super-dense new 4K monitors. It also means you won’t get the razor-sharp outlines of a high-PPI display, either.
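
If you care to check that figure, pixel density falls straight out of the resolution and the diagonal size. A quick sketch:

```python
# Pixel density: diagonal resolution divided by diagonal screen size.
width_px, height_px, diagonal_in = 2560, 1440, 27
ppi = (width_px**2 + height_px**2) ** 0.5 / diagonal_in
print(f"{ppi:.0f} PPI")  # ~109 PPI
```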

That fact wouldn’t really bug me too much if it weren’t for this next bit. You see, the ROG Swift PG278Q will set you back quite a few bones. The list price is a very healthy $799.99. So you’ll be paying more than the $649 that Asus is asking for its 28″ 4K 60Hz monitor without G-Sync. That may be a hard pill to swallow.

Then again, we’re going through a time of tremendous innovation in display technologies. There aren’t many easy choices right now—but there sure are a lot of good ones. Before you dismiss the PG278Q for its hefty price and TN panel tech, let me say this: this thing is probably the finest gaming monitor on the planet. So there’s that. If you don’t want to find yourself contemplating parting with 800 bucks for a TN panel, do not—I repeat, do not—seat yourself in front of one and play video games.

A few words about G-Sync

I’ve already said that G-Sync lets the GPU inform the display when it’s time to draw a new frame. That’s a fundamental change from the operation of conventional displays, which typically update themselves 60 times per second.

Synchronizing the display with the GPU has a number of benefits over the coping methods we’ve been using to date. The usual default method, vertical refresh sync or vsync, involves delaying each new frame created by the graphics processor until the next available display refresh cycle.

Delays in the graphics-to-display pipeline aren’t great. They’re not too big a deal if the GPU is able to produce new frames consistently every 16.7 milliseconds (or 60 times per second). Too often, though, that doesn’t happen. If the GPU takes just a smidgen longer to render the next frame, say 16.9 milliseconds, then the system must wait for two full refresh intervals to pass before putting new information on the screen. Suddenly, the frame rate has dropped from 60 FPS to 30 FPS.

Things go even further sideways if a frame takes more than two intervals to produce. You’ll end up waiting 50 milliseconds for the next frame to hit the display, and at that point, you’re likely to notice that the sense of fluid animation is compromised. Your character may also end up being scattered in a shower of giblets across the floor.
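
To make the timing math concrete, here’s a toy model of when finished frames actually reach the screen under 60Hz vsync versus a variable refresh scheme. This is only an illustration of the arithmetic above, not Nvidia’s actual scheduling logic:

```python
import math

REFRESH_MS = 1000 / 60  # one 60Hz refresh interval: ~16.7 ms

def vsync_display_times(frame_times_ms):
    """Each finished frame waits for the next fixed refresh tick."""
    done, shown = 0.0, []
    for ft in frame_times_ms:
        done += ft
        shown.append(math.ceil(done / REFRESH_MS) * REFRESH_MS)
    return [round(t, 1) for t in shown]

def variable_display_times(frame_times_ms):
    """The display refreshes the moment each frame is ready."""
    done, shown = 0.0, []
    for ft in frame_times_ms:
        done += ft
        shown.append(round(done, 1))
    return shown

frames = [16.9, 16.9, 35.0]  # two slightly late frames, then one slow frame
print(vsync_display_times(frames))     # [33.3, 50.0, 83.3] -> stalls and stutter
print(variable_display_times(frames))  # [16.9, 33.8, 68.8] -> gaps match the work done
```

Notice how each 16.9-ms frame burns a whole extra refresh interval under vsync—exactly the 60-to-30 FPS cliff described above.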

Frame production times tend to vary pretty widely, as we’ve often noted and as illustrated in the plot above, so vsync-induced slowdowns can be a real problem. Many gamers sidestep this issue by disabling vsync and letting the GPU flip to a new frame even as the display is being drawn. Going commando on vsync has the advantage of getting new information to the screen sooner, but it has the obvious downside of chafing. Err, I mean, tearing, which looks like so:

Seeing these seams between successive frames can be distracting and downright annoying, especially because it sometimes happens multiple times per refresh.

<commercial guy voice> There has to be a better way. </commercial guy voice>

Nvidia’s answer is putting the display refresh timing under the control of the GPU. In theory, a variable refresh technology like G-Sync should banish tearing, eliminate vsync-induced slowdowns, reduce input lag, and allow for more fluid animation for 3D games.

The downside? G-Sync is Nvidia’s own proprietary technology. If you want to use the PG278Q’s variable refresh feature, then you have to connect it to a recent GeForce graphics card. If at some point down the road you decide to switch to a Radeon, you can still use the PG278Q, but you’ll lose out on variable refresh rates.

I can see Nvidia’s reasons for keeping G-Sync to itself. Real money went into developing this technology, and the company would like to reap the benefits of its innovation. Still, I think everybody involved here probably realizes that an open standard for variable refresh rates, likely based on VESA’s Adaptive-Sync addition to the DisplayPort spec, is the best outcome when all is said and done.

Thing is, the Adaptive-Sync monitors being shepherded to market by AMD’s Project FreeSync won’t be shipping until some time in 2015. G-Sync is here now.

 

All the trimmings

Of all the Stealth-fighter-inspired PC hardware designs of the past 10 years, the ROG Swift PG278Q is one of the finest. It’s just so very… Nighthawk-like, with a little bit of VW GTI-style red trim thrown in for good measure. If you plan on raiding Baghdad at night and then doing some autocross, there’s no better monitor for it than this one.

Seriously, lots of folks try for an interesting design, but few execute as well as Asus has here. The bezels surrounding the screen are way less than a centimeter wide and make the monitor look much sexier than it has any right to.

The monitor’s enclosure and stand combine to give you just about everything you’d want in terms of flexible positioning. The screen can tilt from 5° downward to 20° upward and swivel left or right up to 60°. The display can also pivot 90° into a portrait orientation, as shown above. The height is adjustable through 120 mm in the default landscape orientation, as well.

If none of that will suffice, the stand attaches to the back of the monitor via a standard VESA mounting interface. Detach the stand, and you can attach the monitor to the custom mount of your choice.

If you do switch to a custom mount, you’ll be missing out on the glowing red ring around the included base. Its brightness throbs and decays according to, uh, something. It doesn’t seem to track the display’s content, exactly, but hey, pretty lights.

The PG278Q’s inputs are spartan, to say the least. There’s an input for the power connection (from the external brick) and a DisplayPort connector. This thing doesn’t support DVI, HDMI, VGA, or picture-in-picture, and there aren’t even any cheesy speakers included for basic audio. The only extra perk is a USB 3.0 hub with one upstream connection and two downstream ports. As far as extras go, that’s not a bad one, I suppose.

Menus and such

Quick confession: I pretty much hate monitor menus and controls. They’re all clumsy, and they’re all different. Somehow, the button placements have managed to get even more awkward over time, too. That’s why the control scheme on the PG278Q comes as a truly pleasant surprise.

The whole setup is anchored by that single eraser-nubbin type control stick at the top of the button stack. One may push down on it to invoke the on-screen menus and to select menu items, and the directional control allows one to navigate through the choices. This single control button does everything that an array of five or more buttons might do on the average monitor—and it’s ridiculously easy to use by comparison.

The menu system in the ROG Swift PG278Q is fairly simple, in part because of a smart and logical layout—and in part because it’s not packed with a rich feature set. Thing is, pretty much everything you’d want to adjust is represented. The only big omission I’ve noticed involves color temperatures; there’s no sRGB mode, just “normal,” “warm,” “cool,” and user mode. Some of the simplicity comes from the fact that this monitor doesn’t have multiple input ports to manage. However you slice it, though, the thing is a joy to use compared to your average monitor.

Below the ThinkPad-style control nubbin is the exit button, which you’ll need for getting out of the menus. Beneath that are a couple of gimmicky controls that aren’t part of the main setup. The upper one invokes a “GamePlus” feature that will place a transparent crosshair overlay in the center of the screen, I guess in case your game decides not to do that for you. It can also place a countdown timer in the top-left corner of the screen, with times ranging between 30 and 90 minutes. Both could be useful features, I suppose, but I dunno. Somehow, I’ve gotten this far without them, and my PC can run lots of software to do similar things when needed.

The button below that is the Turbo control, which lets the user toggle between 60, 120, and 144Hz refresh rates. It kinda-sorta works, but I’m never sure I can trust it to set the G-Sync mode properly and such. I expect Asus put this button there for folks who just don’t understand much about their PCs and want to be sure they’re getting the fastest possible refresh rate. At least it can easily be ignored.

 

The G-Sync experience

Welp, they’ve done it. That’s my summation of the G-Sync experience. Asus and Nvidia have managed to bring variable refresh technology to the market in a working and seemingly well-refined form—and it works like gangbusters. (Although, really, who are these gangbusters, and why do they get so much credit?) We’ve already talked about the theory behind G-Sync. I could wear my keycaps thin trying to describe the subjective experience of using it, but you really do have to try it for yourself in order to appreciate it fully.

In games, everything that happens onscreen with G-Sync is more immediate, more fluid. As with a lot of game-changing technologies, adjusting to it isn’t hard. My son Nathan and I both had the same experience: you sit down in front of the screen, you use it, and periodically throughout the gaming session, you say to yourself, “Wow, this really is smoother.” Then any thoughts of monitor technology mostly just disappear, and you’re better able to concentrate on the game.

The hard part is going back to gaming on a 60Hz monitor afterwards. My immediate reaction was, “This is broken somehow.” I was briefly perplexed, but then I realized: I was not wrong. G-Sync has just fixed an incredibly long-standing problem.

There’s no good way to transmit the G-Sync experience over the intarwebs for display on conventional monitors. What I can do is record what happens onscreen with a high-speed camera and replay the results in slow-motion. I’ve already posted comparisons from a whole range of sync modes and refresh rates using early G-Sync hardware right here, so go look at those if you want an extensive set of examples. For today, let’s focus on the PG278Q running G-Sync at a 144Hz peak refresh rate versus a conventional 60Hz vsync setup, since that’s probably the comparison most folks will find relevant. We’ve taken an example from Guild Wars 2, recorded it at 240 FPS, and turned it into a side-by-side video that should illustrate the differences nicely.

With G-Sync at 144Hz, the on-screen animation advances more often and in smaller increments, as expected. Also, crucially, the content of each of those updates matches the moment the update takes place. Each new frame advances the scene’s rotation the appropriate amount for when it’s painted. Not only are the updates on the 60Hz vsync side less frequent, but some of them seem “off” a little in terms of timing, too. That fact contributes to a kind of lurching, loping sense of advancing motion.

At full speed, these differences are subtler in some ways, since fast updates cure a lot of ills. The overall added goodness of G-Sync seems even more pronounced, though, when your eye is fooled into seeing constant movement rather than a series of individual frames. Paradoxical, maybe, but that’s my sense of it.

There’s also a distinct sense of solidity with G-Sync that’s not present at any refresh rate with vsync disabled. The utter lack of tearing on the display is very welcome.

Making use of G-Sync at a full 144Hz does involve a bit of fuss. You have to enable it via a checkbox in the Nvidia control panel, and then you have to go to the 3D gaming settings section and choose G-Sync as the display refresh mode. Most games make better use of G-Sync if you set “preferred refresh rate” to “highest available” in that same menu. This option circumvents some of the FPS caps built into a lot of PC titles. In other cases, you may have to dig into config files in order to remove the FPS limit.

Getting rid of the FPS caps can cause problems in some games, too. The physics in Skyrim go hilariously sideways at high frame rates, for instance. If you’ve been playing with uncapped frame rates and vsync disabled like I have for ages, though, all of this fuss will be familiar territory. Fortunately, for the majority of games, you can just set vsync to “off” in their video settings menu and you’re good to go.

Alternative goodness: low-persistence mode

The PG278Q has another interesting display mode, in addition to G-Sync, known as Ultra Low Motion Blur, or ULMB. This bit of dark magic exists separately from G-Sync and, unfortunately, can’t be used in conjunction with it. You have to disable G-Sync in the Nvidia control panel and set the display refresh rate to 120Hz or less in order to enable ULMB mode. Also, as far as I can tell, ULMB only works with GeForce graphics cards.

Once it’s working, ULMB mode attempts to reduce motion blur by modifying the backlight behavior. Specifically, the backlight cycles off while the display is being updated and then pulses on once each new frame is completely painted. This strobing effect reduces the overall brightness of the backlight somewhat, but it’s otherwise imperceptible to human eyes. I didn’t notice any flicker with the display strobing at 120Hz.
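
As a rough sketch of the timing involved, suppose the backlight pulses once per refresh at 120Hz. The 2-ms pulse width below is an assumption for illustration, not a measured value:

```python
# Back-of-the-envelope ULMB strobe timing at 120Hz.
refresh_hz = 120
period_ms = 1000 / refresh_hz  # ~8.33 ms between refreshes
pulse_ms = 2.0                 # assumed backlight "on" time per refresh
duty_cycle = pulse_ms / period_ms
print(f"{period_ms:.2f} ms period, backlight lit {duty_cycle:.0%} of the time")
# Perceived brightness drops roughly in proportion to the duty cycle
# unless the backlight is driven harder during each pulse.
```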

This low-persistence display method is a well-known trick for reducing motion blur that’s even been deployed in the latest prototypes of the Oculus Rift. And it works. Everything from scrolling text to in-game action is affected by the change, with added clarity and sharper object edges in each case.

I tried to capture the impact of the ULMB mode using my high-speed camera at 240 FPS, but what I got instead was a recording of the strobing in action. Check it out. The video starts in the regular backlight mode, and part-way through, I enable ULMB instead. You’ll notice when it happens.

Fortunately, it doesn’t look like such a train wreck to human eyes.

In fact, the strobing isn’t at all obvious in this next video recorded at 120 FPS, and I think maybe it kind of captures some of the additional clarity—although honestly, the focus isn’t great, and the benefits are much easier to perceive in person. Again, we start in regular mode and then switch to ULMB.

Yeah, so make what you will of that, I guess. You’ve really gotta see it with your own eyes. Some of the Blur Busters demos with scrolling text and animated objects reveal dramatic improvements.

I probably need to spend more time gaming with ULMB mode enabled in order to appreciate it fully. My sense is that it’s an improvement, but it doesn’t have the same visceral impact as G-Sync’s variable refresh rates do. If I had to pick—and the PG278Q essentially forces you to—I’d choose G-Sync for gaming with no regrets. Still, ULMB mode adds something different to the mix, and it hints at a possible future where low-persistence modes might be combined with variable refresh rates, if such a thing becomes feasible.

 

Brightness and contrast

Beyond the, uh, synchronicity, how good a display is the PG278Q? To find out, we’ve compared it to a couple of obvious rivals, both from Asus’ stable. The PB278 is kind of the PG278Q’s older brother, also a 27″ monitor with a 2560×1440 resolution. It’s based on in-plane switching (IPS) technology and is limited to 60Hz refresh rates. In theory, the PB278 should have superior color fidelity and wider optimal viewing angles at the expense of speed and fluidity. The PB278 is currently selling for $478 at Amazon, so the PG278Q commands a premium of more than $300 over it.

Our other comparative reference is the Asus PB287Q, a slightly larger 28″ monitor with a 4K TN panel. In addition to its much higher resolution and pixel density, the PB287Q is probably the best TN panel-based monitor we’ve seen. And it can accept a DisplayPort SST connection to drive its 4K resolution as a single tile. Asus only asks $649 for the PB287Q, so it’s also cheaper than the ROG Swift PG278Q. Again, though, this rival is limited to a fixed 60Hz refresh rate.

The ROG Swift PG278Q has its work cut out for it. Let’s see how it stacks up.

Our “typical” readings were taken with the monitors normalized to 200 cd/m² at the center of the screen—or as close as we could get.

The PG278Q’s spec sheet says its peak brightness is 350 cd/m², but apparently it’s being modest. Our measurement says the peak brightness is even higher. You’d have to be working in, I dunno, direct sunlight perhaps in order to need that kind of brightness in a desktop display. The PG278Q has plenty of extra lumens when needed, I guess.

Add in the black level measurements, and you can figure out the contrast ratios. The PG278Q’s TN panel doesn’t quite gate off as much light as the IPS-based PB278 when asked to make the screen black, but its overall contrast ratio is comparable—and exceeds that of the 4K TN panel in the PB287Q.
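
For reference, contrast ratio is just the white luminance divided by the black luminance at the same settings. The readings below are hypothetical stand-ins rather than our measured numbers:

```python
# Contrast ratio: peak white luminance over black luminance (cd/m²).
white_cd_m2 = 200.0  # hypothetical reading, screen showing full white
black_cd_m2 = 0.2    # hypothetical reading, screen showing full black
print(f"contrast: {white_cd_m2 / black_cd_m2:.0f}:1")  # 1000:1
```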

Color reproduction

The diagrams below show the color gamuts for the displays, both before and after calibration. Color gamut has to do with the range of colors the display can produce. These things tend to vary pretty widely from one monitor to the next. The gray triangle on each diagram represents the standard sRGB color space.


The PG278Q performs surprisingly well in our color gamut measurements, nearly encompassing the entirety of the sRGB color space. The IPS-based PB278 can produce a few more deep reds and purples, but the PG278Q easily has more range than its TN-based sibling, the PB287Q. Note that, at the time we tested the PB287Q, we were impressed by its range.


Out of the box, the PG278Q’s color temperature is about 6100-6200K, not far from our 6500K target. Our calibration introduces a little skew into things at the lowest gray levels but otherwise brings the PG278Q closer to the goal.

Delta-E is a measure of color difference—or error—compared to a reference. Smaller delta-E values generally mean more accurate colors. In this case, we measured delta-E in the sRGB color space with a D65 white point, both before and after calibration.
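
For the curious, the simplest delta-E formula (CIE76) is just the straight-line distance between two colors in Lab space. Calibration suites often use fancier variants like CIEDE2000, and the Lab values below are hypothetical:

```python
# CIE76 delta-E: Euclidean distance between two colors in Lab space.
def delta_e_76(lab1, lab2):
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

measured = (53.1, 1.5, -2.0)   # hypothetical reading for a gray patch
reference = (53.4, 0.0, 0.0)   # the same patch in the sRGB/D65 reference
print(f"delta-E = {delta_e_76(measured, reference):.2f}")  # ~2.52
```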

The PG278Q has the lowest delta-E prior to calibration, which should be no surprise since our copy of the monitor came out of the box at around 6200K. After calibration, the PG278Q looks even better, with less overall error than the IPS-based PB278.

We can go into more detail and see what the sources of error were for each display. After calibration, most of the PG278Q’s deviance from the D65 reference point comes at those lower gray levels. Otherwise, it acquits itself quite nicely.

 

Display uniformity

Displays typically don’t produce the exact same image across their entire surface. We’ve quantified the uniformity of the PG278Q by taking a series of luminance readings in different regions of the panel. We set the brightness level at ~200 cd/m² at the center of the screen before starting.

177 cd/m² (91%)    168 cd/m² (87%)    169 cd/m² (87%)
186 cd/m² (96%)    194 cd/m² (100%)   182 cd/m² (94%)
171 cd/m² (88%)    170 cd/m² (87%)    169 cd/m² (87%)

This monitor’s 13% variance from the center of the screen to the edges isn’t anything you’re likely to notice, even while staring directly at the screen and looking for problems. Both of the other Asus displays we’ve tested have similar light distribution. The PB287Q has a 15% max variance from the center to the edge of the display.
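
The percentages in the grid are simply each reading divided by the center measurement. Here’s that computation using the numbers above:

```python
# Luminance uniformity: each reading as a fraction of the center (cd/m²).
readings = [[177, 168, 169],
            [186, 194, 182],
            [171, 170, 169]]
center = readings[1][1]
worst = min(min(row) for row in readings)
print(f"max falloff from center: {1 - worst / center:.0%}")  # ~13%
```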

I’ve chosen to convey backlight bleed using a picture rather than a series of measurements. Trouble is, I never know exactly how this image will end up looking on the reader’s own screen. What I see here is a bit of light bleed around the bottom corners of the monitor, with more bleed extending up the left edge. In regular use, it’s not a deal-breaker or even terribly noticeable, but this is a little more light bleed than we saw with the PB287Q.

Viewing angles


I’ve taken these pictures in order to demonstrate how the display’s color and contrast shift when the viewing angle changes. As you can see, at the angles we’ve chosen, the PG278Q comes out looking pretty good. The bottom image, looking up at the panel from below, shows a little too much contrast, especially in the sky, but the colors don’t shift or invert at this angle like you might expect from a lower-quality TN display.

Input lag

TN panels tend to be quick, and this one is no exception. The PG278Q’s gray-to-gray transition time is one millisecond, substantially quicker than the five-millisecond rating for the IPS-based PB278.

Input lag comes from many sources, including the scaler chip inside the monitor. Nvidia’s G-Sync module is brand-new and comes from a new player in the display-logic market, so we’ll want to see how it performs. To find out, we compared the PG278Q against my old Dell 3007WFP-HC. The 3007WFP-HC’s IPS panel isn’t particularly fast, with an 8-ms gray-to-gray spec, but this monitor has no internal scaler chip, so there’s zero input lag from that source.

Dell 3007WFP-HC (left) vs. Asus PG278Q

Well, I didn’t expect to see the PG278Q running ahead of our ASIC-free reference unit. I suppose the victory could be the result of the PG278Q’s faster pixel-switching times, but I suspect the real culprit is the GPU lag inherent to the screen mirroring mode we used. We’ve seen this same kind of thing before, with a frame or so of lag for one display or the other, depending on which video card output the display’s using.

Anyhow, I’m not terribly concerned about that. What this result demonstrates nicely is that the G-Sync module in the PG278Q offers very low latency. It doesn’t appear to introduce even a single frame’s worth of additional input lag. That’s exactly the outcome one would hope to see.
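
For reference, turning high-speed footage into a lag estimate is simple arithmetic: count the camera frames that separate the same on-screen event on the two displays. The frame count below is hypothetical:

```python
# Relative input lag from 240 FPS footage: frames between the same
# event appearing on the two displays, times the camera frame period.
camera_fps = 240
frames_apart = 1  # hypothetical count read off the footage
lag_ms = frames_apart * 1000 / camera_fps
print(f"~{lag_ms:.1f} ms difference")  # one 240 FPS frame = ~4.2 ms
```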

Power consumption

The G-Sync module driving this display is based on an FPGA rather than a custom chip, so I expected it to draw a little more power than the typical monitor’s electronics. I’m not sure that it does, though. The three monitors are within a watt of one another at our typical brightness level of 200 cd/m². The PG278Q draws more power at peak and minimum, but we’ve already established that it’s brighter overall at those levels. I’d chalk up the extra power use to the WLED backlight’s additional candlepower.

 

My subjective impressions of display quality

You’ve seen enough measurements on the preceding pages to know that the PG278Q is a pretty darned solid monitor, apart from its G-Sync superpowers. The contrast ratio, color gamut, and color accuracy measurements we presented speak to that fact. This display is firmly in the “good TN panel” end of the spectrum. I’ll bet some folks would have trouble discerning whether it’s IPS or TN upon casual inspection. The real giveaway is the loss of contrast if you move your eyes to, uh, knee level. That’s really not very good posture, though.

I sat the PG278Q next to the IPS-based PB278 for some side-by-side image viewing. Flipping through artificially oversaturated nature scenes at InterfaceLift, I found very little discernible difference between the two monitors’ images in most cases. Where there were differences, they came in two places.

First, the PB278 seems to spread its contrast out across a wider range of dark to light colors. The PG278Q’s color contrast is compressed into a relatively smaller, brighter portion of its overall range. Functionally, that means the PB278’s images are more striking, but it’s easier to lose some of the detail in the shadows on that display. The PG278Q shows you more detail in those areas, but contrasts are sometimes reduced as a result.

Second, the PB278 shows you a lot more deep red and orange hues than the PG278Q. That’s consistent with the IPS monitor’s slight advantage in our color gamut readings. Obviously, there are some deep reds the PG278Q just can’t quite replicate. It’s not something you’d miss while using the monitor without a side-by-side reference, but still.

Trouble is, I also think the PB278 exaggerates some of those colors where its own range excels. Look at the amount of error for red in our delta-E measurements, and you’ll see the source of my worry. In the end, I’m unsure how much of the color difference “belongs” to each display.

One thing I have learned to notice since our review of Asus’ 4K TN monitor is the “screen door” effect caused by FRC, or temporal dithering. So that’s great, because now I can’t un-see it, ever. I have noticed some screen-door artifacts while gaming on the PG278Q, usually in hues of medium brightness. They are by nature ephemeral and hard to spot, but it is what it is. I don’t think I’ve ever noticed such patterns on an IPS display.

Then again, I’m not entirely sure FRC really is to blame. Like I said, the effect only shows up in motion, and I’ve not noticed it on the Windows desktop or while browsing the web, where FRC dithering artifacts are generally most apparent. We’re on the outer edge of the nitpicking bell curve here, at any rate, so I wouldn’t make too much of it.

To me, the bottom line on the PG278Q is that it’s a plenty good enough display to keep itself in the conversation. Go see one in person before ruling it out because it’s a TN panel.

Conclusions

I keep telling you that you need to see this thing in action for yourself—in order to appreciate G-Sync, in order to grasp how decent a “good TN” panel can be. That’s why, earlier, I pointedly directed you not to try out the PG278Q in person if you didn’t want to end up buying one.

I suppose what I’m trying to tell you is that the PG278Q is pretty darned fantastic. For gaming, it’s the best display I’ve ever used, thanks to the goodness of G-Sync. Outside of gaming, it’s very much competent to serve the movie-watching, web-surfing, and photo-editing needs of the above-average PC enthusiast. (Aren’t all PC enthusiasts above average?) On the strength of its gaming prowess, I’d prefer the PG278Q over either the 4K/60Hz PB287Q or the IPS-infused PB278. I’m not sure it’s worth the $799 price of entry, but that’s a personal question to be discussed between you and your credit-card company.

This is one of the few explicitly “gamer-focused” products we’ve seen over the years that really is better for gaming. Pro gamers endorse all sorts of mouse mats and motherboards and such, where one is probably not that different from another. Those folks should run out and buy a PG278Q because it truly is better and faster than any other gaming monitor.

In fact, the PG278Q’s most formidable competition doesn’t exist yet. Asus plans to release a 4K monitor with G-Sync support later this year, and I could see the logic of waiting for it. If that monitor combines the goodness of the PB287Q and the PG278Q, well, you’ll see the true power of minding your P’s and Q’s, I suppose.

Sorry, been typing these names too much.

There’s also the possibility that somebody releases an IPS monitor with G-Sync support, although we don’t know for sure when that might happen—or whether it will. (If it happens, the IPS panel won’t likely support 144Hz refresh rates, anyhow.) Finally, some folks may wish to wait for variable-refresh monitors that don’t lock the owner into buying GeForce GPUs. These are all sensible options.

But if you want variable refresh now, G-Sync is the only game in town, and the PG278Q does a darn fine job of offering a credible choice at this very moment—or at least very soon. (Asus says the PG278Q is slated to go on sale in North America on August 26.) It’s a minor revolution in display technology and one of the most exciting things to happen to PC gaming hardware in years.

I provide updates at variable intervals on Twitter.

Comments closed
    • TardOnPC
    • 5 years ago

I am holding out for the XB280HK. Is it possible to hook up two computers to a monitor with only one DP input? I’d like to be able to hook up my game and DAW computers without plugging/unplugging things. 🙁

    • itachi
    • 5 years ago

Does that mean the Dell IPS panel is as fast (nearly, and should be not really noticeable??) as a TN?? The price point is surely high.. but all in all this is 4K, and G-Sync undoubtedly is overpriced, so it’s a bit much, but I bought my Asus 1920×1200 25.5″ for 500 like 5 years ago, so it’s not all that much after all considering the improvements..

Of course this quality comes at a huge premium; running this screen, you might want some beefy graphics card.. I’d wait for the new Nvidia if I were to buy this screen for sure.. and that surely would end up being a 1500 screen/graphics card combo. At the time I bought a 1000$ combo, with my HD 5870.

    • Terra_Nocuus
    • 5 years ago

Now listed at the ‘egg: http://www.newegg.com/Product/Product.aspx?Item=N82E16824236405

    • ThatStupidCat
    • 5 years ago

    I don’t like the idea that soon I might have to decide if my computer will be exclusively AMD or Nvidia video card depending on the monitors I already have. I’ll wait to see if someone makes something compatible with both.

      • Airmantharp
      • 5 years ago

      No one likes the idea of vendor lock-in, and that’s certainly the way it works right now.

      However, just reading through the rest of the comments I think you’ll find enough arguments to make the case that the situation is still rather fluid.

    • kamikaziechameleon
    • 5 years ago

Played some console games this past week. I got sad that games like Halo 4 are stuck with the technical limitations of the hardware they’re on. It’s such a great game, but it’s rendered at 720p and has draw distance and texture resolution limitations.

    • Sabresiberian
    • 5 years ago

    This thing looks great, but it is far too expensive.

    I mean, consider. We can buy 2560×1440 IPS monitors for less than $400. We can buy (subject to availability because right now Overlord sells all they get very quickly) overclock-able 2560×1440 IPS monitors that can usually reach 120Hz for $450, sold by an American company with a 1-year warranty.

    Overlord is working with Nvidia to make a G-Sync monitor. Whether or not that actually happens is anybody’s guess, but even if it added $150 to their existing panels we would still be talking $200 less than this thing with a TN panel.

    Personally though I think G-Sync should bump the price by no more than $50 and if it does then it’s a pricing fail, too. It’s far too expensive at the rumored $125 increase to a monitor’s price, especially since it limits you to Nvidia’s cards if you want to use G-Sync. I mean, I tend to prefer Nvidia but I’m not into cutting my nose off to spite my face by closing the option to buy an AMD graphics card if it turns out to be the better solution at the time I’m buying. And I’m sure not going to pay $125 for a feature it turns out I don’t even use at some point. $25, no problem, $125, absolutely not.

    Frankly I think the extra $200+ is for the Asus ROG name. Really $300, in my opinion, this thing should go for about $500. I like the ROG brand, but I’m not going to pay that much extra money to get ROG on my piece of gear.

      • Airmantharp
      • 5 years ago

      Early adopters always pay; the rest of us wait for economies of scale and market effects to bring pricing into line :).

      The price is certainly still accessible, and it’s not a sub-par product.

      And I look forward to what Overlord is able to produce!

    • mcnabney
    • 5 years ago

    Nobody has been able to answer this question. What happens if you plug something that ISN’T a recent Nvidia GPU into this monitor?

    Radeon? BluRay player? AV Receiver? Tablet/laptops?

    This device doesn’t have a scaler? Will it fail to connect or just show the native resolution of the input floating in the center?

      • Klimax
      • 5 years ago

      Monitor will still work, you just won’t get G-Sync itself.

      • Damage
      • 5 years ago

      The monitor works fine with a Radeon. Haven’t tried image scaling, but that’s not a big concern since the Radeon GPU will scale for you if the monitor can’t do it. (I’ll betcha it can, though.)

    • Convert
    • 5 years ago

    I found it somewhat humorous that ULMB is useful, not because I don’t believe it to be, just that it reminds me of CRTs and how happy we all were to get rid of the blinking. Now it’s a feature haha.

      • Voldenuit
      • 5 years ago

      Some of us aren’t happy with the motion blur of LCDs.

Plus, CRT refresh strobing caused eyestrain because it was always on, even when reading text.

      By contrast, ULMB is only activated when playing games – moving images vs static displays are two different kettles of fish.

      Sounds like we’re potentially getting the best of both worlds here.

        • rahulahl
        • 5 years ago

        Pretty sure ULMB works even when browsing or reading webpages. And pretty much anything.

          • Voldenuit
          • 5 years ago

ULMB can be turned on and off. And it should be turned off, unless the situation calls for it (e.g. you are playing an old game that’s always getting 100+ fps).

            • rahulahl
            • 5 years ago

            Yes it can, but the way I saw it the best use was actually outside games.
I have always found scrolling pages makes things very blurry. I know it might seem a bit hard to believe, but I believe that if that blur were not there, I could do my stuff much faster. Skimming through pages is annoying with the blur.

            When I get mine, I intend to try ULMB on all the time and switch to G-Sync when in games.
            This is of course only based on my understanding of ULMB. I have not actually seen it in action, so I might be way off the mark.

      • Luminair
      • 5 years ago

      Both LCD and CRT have refresh rates, ULMB or not. Low refresh rate looks “blinky”, ULMB or not. And high refresh rates don’t look blinky, ULMB or not. ULMB isn’t about the blink, it’s about what they insert in between the blink: the blackness of night.

        • GrimDanfango
        • 5 years ago

        Low refresh rates on LCD don’t look blinky. They may look stuttery, or blurry, but they don’t strobe by default, and it’s the low-frequency strobe that gave headaches with CRTs.

        ULMB is absolutely about the blink… all the “blackness” is doing is hiding the worst part of the transition cycle, so that all your eyes see is a pulse of solid image each refresh, and not the awkward mess inbetween. The fact that it’s black has no intrinsic effect on image quality besides making it uniformly darker, unless compensated by cranking the backlight even brighter during the pulse.

        • Convert
        • 5 years ago

        I understand what you are saying, and as Grim points out LCDs don’t have the traditional “blinky” refresh rates.

        I’m sure ULMB looks and works great and I’m sure I’d be happy to buy a monitor with the feature some day.

        Just found it funny is all.

      • rahulahl
      • 5 years ago

      I think the issue with CRT was that they didn’t blink enough. If you had a CRT on high refresh rate, it was awesome on the eyes. The ones with lower refresh rate like 60Hz were the ones that caused headaches and eye strains.

        • Meadows
        • 5 years ago

        I actually got into PCs and gaming at a really very young age and never had a proper monitor until something like 10 years ago, so 60 Hz is all the refresh frequency I had for about a decade while growing up. Never bothered me. Once, one of my secondary school classmates would complain about flicker on school computers and I just stared at him completely puzzled.

        I don’t know if it’s acclimatisation or if I’m just genetically lucky, though.

          • Terra_Nocuus
          • 5 years ago

          I was one of “those guys” that would raise the refresh rate while helping someone on their computer. 60hz was/is awful to me, even 75hz would give me a headache eventually. 85hz was the best I ever saw, back in the day.

    • nia
    • 5 years ago

    I’m really excited about this monitor, for that price it seems like a steal… If you’re loyal to Nvidia. Dropping $800 on anything that’s not vendor agnostic? No thanks.

What if AMD makes a good graphics card again? (*shakes head sadly and bursts into tears*) What if Intel or ARM or PowerVR make a play for desktop performance in two years? Monitors have the longest lifespan of any of our components; I’m not willing to take that dive for future-proofing on 1/5th of the vendors.

      • rahulahl
      • 5 years ago

You can either play it safe and save your money, while missing out on the smoothness that G-Sync provides, or you can spend the money, actually enjoy it, and save yourself a headache (literally, in my case) dealing with the older fixed-refresh model.

You can look at past history: Nvidia has always been very competitive, if not better than the rest, in the graphics department. And looking at the roadmap, it’s not likely to change in the next 5 years.

If you don’t have the funds to afford it, that’s fine. Just appreciate it for what it is even if you won’t be buying it. Putting it down like this is not really doing any justice to this monitor.

      • Voldenuit
      • 5 years ago

“I’m really excited about this monitor, for that price it seems like a steal... If you’re loyal to Nvidia. Dropping $800 on anything that’s not vendor agnostic? No thanks.”

I definitely hear you on that score. My past 2 GPUs have been nvidia (and the previous 3 before that AMD), but even though I have a G-Sync compatible system right now, I won’t be buying a monitor with that degree of lock-in. I actually have a Shield tablet, which is locked in to nvidia GPUs for Gamestream, but that’s more of an ancillary feature that I don’t even use, so I haven’t been as concerned - there’s a big difference for me between a $299 tablet with some nice bonus features if you have an nvidia GPU and a $799 monitor whose raison d’etre hinges on me sticking with one GPU vendor for the life of the monitor.

      • Airmantharp
      • 5 years ago

      If you’re willing to pay for the premium that G-Sync requires, the price difference between an appropriate AMD card and Nvidia’s equivalent really shouldn’t bother you, and could easily be in your favor.

      Nvidia’s not going to be able to really cash in with higher GPU prices outside of the unlikely instance that FreeSync is somehow definitively inferior, after all.

      There’s no loyalty to it.

    • Bensam123
    • 5 years ago

    Sucks I purchased a VG248QE before G-Sync was even talked about. I expected to hang onto it for quite awhile, but I suspect it’ll get replaced next year when Freesync comes around. Something like this really does warrant a monitor upgrade.

    Apparently Asus is selling the upgrade kit for my monitor for the low-low price of $199.99. That doesn’t seem like a worthwhile endeavor.

https://store.nvidia.com/store?Action=DisplayPage&Env=BASE&Locale=en_US&SiteID=nvidia&id=QuickBuyCartPage

      • Airmantharp
      • 5 years ago

      $199 seems worth it to me… I mean, yeah, that’s a lot, but variable V-Sync is worth it!

        • Bensam123
        • 5 years ago

That’s more than the price of the monitor if you bought it new right now.

          • SS4
          • 5 years ago

It’s actually about the same or maybe more than what I paid for mine brand new 2 years ago lol.
Anyways, the VG248QE is still an awesome monitor, and you can still use the lightboost mode instead of ULMB (colors suffer a little though), but not G-Sync.
If you wait longer, of course better technology will come out. Same will happen if you buy the ROG Swift; by next year there might be some monitor outclassing it or equaling it at a better price and whatnot lol.
So yeah, if you want to hold on for something better you will have to do so your whole life and stay on your 20+ year old hardware because of the non-stop improvements 😛

            • Bensam123
            • 5 years ago

            Yeah, I bought mine when they first came out for like $370 something. 144hz was definitely worth it back then. G/Free-sync definitely looks like it’s worth it again. But I don’t own a Nvidia card so it doesn’t appeal nearly as much.

            I really wish monitors were more upgradeable. It makes you wonder what they could do by offering upgraded modules and such for panels instead of constantly trying to replace existing products like the home router industry.

    • drfish
    • 5 years ago

Not that I expect it to work with G-Sync enabled, but being a 120Hz-capable screen, this should also work well with Nvidia’s 3D Vision, right? Edit: Mostly, I guess: http://rog.asus.com/forum/showthread.php?42516-ROG-SWIFT-PG278Q-Nvidia-3D-Vision&s=ae4ec401bf0c0f2f66658baa7953f978&p=361989&viewfull=1#post361989

Care to comment any further on the impact of G-Sync on low frame rates? Part of the way to justify the expense of such a beast is if it can make 25-45 fps more tolerable and extend the life of your GPU as a result (or, in the case of something like DayZ, help mitigate a failing of the engine)...

      • Spunjji
      • 5 years ago

      DayZ is a bane of my gaming life for frame-rate variability!

        • drfish
        • 5 years ago

I would spend $800 just for smooth frames in cities and towns… It doesn’t matter what hardware you have, you’re not getting more FPS, so you need to make what you do have better.

    • Laykun
    • 5 years ago

The only problem I have with this monitor is that it has the potential to be better. And while you say it’s a good TN panel, it’s obvious that there are better panels out there. While the IPS might be producing the deep reds in error, most people don’t care at all about the actual accuracy of the display; most people use that term “accurate” to extend their e-peen. Subjectively I find them very appealing, even if they are highly inaccurate, simply because it tickles my eyes, and I want to browse my media and play my games with colours that please me, not colours that stroke my techno-ego.

    In the article it sounds like you’re apologising for the panel being TN, trying to reassure us that it’s good, but you know if this was IPS or even maybe VA nobody would be able to shut up about this monitor and it’d be lauded as the saviour of PC gaming.

      • Airmantharp
      • 5 years ago

      They know their audience.

      • Thresher
      • 5 years ago

      I’m not sure, and maybe someone can confirm this, but I don’t think that IPS panels are capable of refresh rates up to 144Hz.

        • Laykun
        • 5 years ago

        I don’t think IPS panels are currently capable of it but I believe VA panels are?

          • Voldenuit
          • 5 years ago

          Overlord has a 27″ 120 Hz IPS monitor:

http://www.tomshardware.com/reviews/overlord-tempest-x270oc-monitor,3879.html

EDIT: Also worth noting that the early crop of overclockable no-name Korean monitors were IPS.

        • Flapdrol
        • 5 years ago

        You can refresh any panel at any frequency, but it takes time for the pixel to get to the right orientation and form a picture.

        If you’re refreshing a slow panel at high frequency everything that moves will be a blurry mess, but at least it’ll be smooth, probably still preferable to the same panel at lower frequency.

        • Chrispy_
        • 5 years ago

        IPS panels typically have a 5-8ms response.
        1000/8 = 125Hz
        1000/5 = 200Hz

        This is why I’m stunned there are almost no 120/144Hz official IPS panels since the overclocked ones manage 144Hz just fine and the pixel response is fast enough that it’s not a smeary mess.

TFT Central has some pretty in-depth latency measurements in their reviews, and they show that IPS pixel response isn’t really that much slower than TN pixel response; the difference is in the aggressiveness of the overdrive. I posted links and figures in an article last week, but IIRC an 8ms IPS pixel response was measured as 8.9ms and a 1ms TN panel was measured as 7.5ms. Even with the overdrive cranked up to insane, artefacting and ghost-outline levels, the TN panel never even got close to the 1ms “marketing value”; it was 3.5ms at best.

          • GrimDanfango
          • 5 years ago

          I don’t know for certain, but I’d speculate that maybe 8ms isn’t enough to create an unblurred image at 120hz… Sure, it’ll keep up, but a panel doesn’t just need to transition to a new colour every refresh… it has to get there and hold that colour for a while before the next refresh. If it spends the entirety of 8ms transitioning, it’ll mean the screen update will be one constant transition, and you’ll never get a settled image on the screen.

          So I guess for 120hz, you’d need at most around 4ms, and ideally the less the better. 8ms is just the absolute technical limit, so I presume that’s why so far, IPS hasn’t generally been pushed that far.

            • Firestarter
            • 5 years ago

            Do we need a settled image when we’re looking, driving or flying around anyway? Seems to me that most content that would benefit from high refresh rates is content where we’d want smooth animation, and blurred transitions don’t really detract from that. It will of course make it harder to see detail when stuff is moving fast, but my guess is that for all but the most hardcore players, this isn’t much of a problem.

            • GrimDanfango
            • 5 years ago

I would agree with that, except I suspect that liquid crystal transitions tend to be inherently ugly, hence all the attempts to minimise them using overdrive and hide them using ULMB strobe effects.
I suppose if the pixels transitioned in a smooth, even way, at the same rate regardless of start/end values, and looked good in a half-transitioned state, then what you’re suggesting would be true - we’d get natural-looking motion blur. As it is, I think it’s inherently an effect that needs to be hidden, and so a monitor that spent its entire refresh cycle displaying it would likely exhibit some nasty artefacts.

            • cobalt
            • 5 years ago

            From what I’ve seen from people overclocking their IPS displays, you’re probably right — the color distortions and artifacts start to become more noticeable the closer you get to 120Hz. But as Chrispy_ suggests, maybe it’s just because no one tries to fix the speed of IPS displays, and the implication might be that some modest overdrive would improve the situation.

            • Bensam123
            • 5 years ago

            Yup, it looks like finger paints. IPS panels usually use overdrive to get the ‘low’ response times they already have.

            • Chrispy_
            • 5 years ago

Indeed – the 6ms of the typical Korean 1440p LG-Philips AH-IPS is with mild overdrive, but the overdrive is so mild that there’s almost never any unwanted overshoot.

Gaming screens have upwards of 25% overshoot for some transitions, which is insanely inaccurate. If people actually applied such powerfully inaccurate overdrive to IPS, I’m sure the response times would come down more. Perhaps not as much as 1ms, but perhaps from 6-8ms it could be dropped to 3-4ms, which would make it appropriate for strobing backlights and 120+Hz.

            • Chrispy_
            • 5 years ago

            What you’re talking about is sample-and-hold blur, which actually gets worse the faster pixel response is. I won’t explain it in detail because it’s not a simple issue, but blurbusters.com has some great articles on it.

            The only solution to this is backlight strobing, and yes – for that you need a completed pixel response during the backlight’s “off” period, and for a 50% backlight period at 144Hz, that means that the colour change has to start and end within 3.5ms.

Without backlight strobing, the slower 6-8ms response of an IPS panel doesn’t really make any visual difference over a TN panel because of the way your retina/persistence of vision is affected by sample-and-hold blur.

            • Bensam123
            • 5 years ago

The world exists in a state; the monitor does not. You only know what the picture looks like when it finishes morphing. When it’s in transition, it’s just a blurry mess, because there is no interpretation or transition between the states that makes sense to the human brain. I mentioned actually controlling the transition in a way that could make sense to a human brain at one point, but there is nothing like that yet available.

            That’s why lightboost works so well. We can’t tell what the blurry mess is anyway, so if you completely remove it, it makes things look better.

          • Rectal Prolapse
          • 5 years ago

          I just noticed this when I got my Benq XL2411Z – it has the backlight strobing.

          I was surprised that the pixel response of the IPS Dell UP2414Q (4K) was better than the Benq’s! When I move a black mouse pointer over a white background, on the benq I can see an afterimage of the pointer as I move it across the screen, but there is almost none at all on the Dell! The Benq has overdrive enabled (set to High, which is the default), with motion blur reduction, at 1920×1080@120 hz, while the Dell was at 3840×2160@60hz.

          I wasn’t expecting an afterimage at all, so was kind of shocked. But then again, the Dell’s overdrive must’ve been fairly mild.

          This makes me think that we really should have IPS panels running at 120 hz with ULMB at the very least….

      • Luminair
      • 5 years ago

      This is the best gaming monitor ever built (or announced) and you’re saying you know how it can be done better. If only those monitor companies would just listen to you. I doubt that!

        • Airmantharp
        • 5 years ago

        It CAN be done better, and the monitor companies ARE listening to us.

        This is just round one, and in most respects has been resoundingly successful.

          • Luminair
          • 5 years ago

          Present tense it cannot be done better. This is the biggest, fastest, and best-looking gaming monitor ever. Every review concurs. The OP is wrong. Unequivocally.

            • Airmantharp
            • 5 years ago

            You’re treating your opinions as fact; this monitor isn’t better in every objective way than what’s already on the market, let alone what can be made with present technology.

      • PixelArmy
      • 5 years ago

      I’m confused… You’re complaining about the panel being TN, while also saying most people don’t care about color accuracy, which would support the decision to go TN…

        • rahulahl
        • 5 years ago

The worst thing about TN isn’t the color accuracy.
Rather, the colors look washed out, and the viewing angles especially are too narrow.
With IPS this is not an issue.

With this monitor at least, the colors are not washed out, and it supports 8-bit native color.
It’s not important for most people to have accurate colors, as long as they don’t look washed out.

    • f0d
    • 5 years ago

    great… i have to sell my 144hz asus and buy one of these now

    • laggygamer
    • 5 years ago

I’m surprised that you prefer the variable refresh to ULMB mode. This is blurbusters’ description of the differences:

    LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
    G-SYNC: Better for games that have lower/fluctuating variable framerates.

    So since the majority of the games you played had an fps below your monitor’s high refresh rate that could be why gsync’s default mode was better.

    Basically what I want to know is for the game that you were actually getting 144 fps, borderlands 2, was ULMB actually superior for that game.

    I’m just trying to see which mode offers the greatest peak performance and it seems like the variable refresh mode was counteracting the deficiency of your single card config stuttering trying to produce for a high resolution and high refresh rate. But maybe in sli with 2 top cards where you can max the framerate then ULMB is better?

    Or is gsync just better regardless. And with the way games are coded stutters tend to happen regardless of your config so I get that.

      • rahulahl
      • 5 years ago

I think even if you get 120 FPS, there is still microstuttering.
Not every one of those 120 frames in that second is generated at exactly the same interval, which is when you will get the stuttering. Some people notice it a bit more than others, but it’s there.
With G-Sync, since each and every frame is synced, that does not happen. This is why everyone who uses G-Sync makes a big deal about how smooth it is.

        • laggygamer
        • 5 years ago

        Yeah, I think LightBoost actually makes stutters more noticeable too, since they can’t be hidden by blur anymore. I’m trying in vain to tell myself that I don’t need G-Sync, because I can’t pay an extra $300 just for that, let alone $800 for a TN panel. I don’t have LightBoost either; it’s probably not worth it anyway if G-Sync is this good. I was just surprised because on my regular 120Hz screen, the ULMB video showed a more noticeable difference than the G-Sync one, for me anyway.

          • rahulahl
          • 5 years ago

          I am curious, though.
          In a game like Counter-Strike: GO, where it’s quite easy to get 200+ FPS, is it better to use ULMB?
          I mean, because there is less motion blur, you will notice enemies quicker and are less likely to fail to spot them.

          Also, is there an easy way to switch between G-Sync and ULMB?
          Maybe just a shortcut on the desktop, with a script that toggles it? If it’s just a tickbox in the settings, it shouldn’t be hard to find the registry changes it makes and make them via a script.
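
          To sketch what that script idea might look like (purely hypothetical: this assumes the toggle really does boil down to a single registry value, and the key path and value name below are invented for illustration; you’d have to diff the registry yourself to find what the driver actually touches):

          [code<]
          # Hypothetical toggle script. KEY_PATH and VALUE_NAME are made up;
          # find the real ones by diffing the registry before/after flipping
          # the tickbox in the settings UI.
          import winreg

          KEY_PATH = r"SOFTWARE\ExampleVendor\Display"  # invented path
          VALUE_NAME = "UlmbEnabled"                    # invented value name

          def toggle():
              with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                                  winreg.KEY_READ | winreg.KEY_WRITE) as key:
                  current, _ = winreg.QueryValueEx(key, VALUE_NAME)
                  winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD,
                                    0 if current else 1)

          if __name__ == "__main__":
              toggle()
          [/code<]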

            • cobalt
            • 5 years ago

            From various other places I’ve been reading, use G-Sync if you’re at 110 FPS or lower. Above that, ULMB will be a bigger help. (Rough numbers, of course.)

            And yeah, if you’re getting 200 FPS, how many frames will be delivered below a rate of 144 FPS? I’m guessing you could just enable vsync at 144Hz and use ULMB.

            • npore
            • 5 years ago

            Yup, although in an ideal world you’d want both at the same time; in this case, ULMB with vsync.

        • Flapdrol
        • 5 years ago

        An in-game fps limiter + vsync + LightBoost should get you a perfect result.

        I use something similar in Assetto Corsa: a CRT at 100Hz, a 100 fps limit, and vsync on. Make sure to turn off scaling and set pre-rendered frames to the minimum; otherwise it might be laggy.

      • mutantmagnet
      • 5 years ago

      LightBoost is best used at 120 fps, but it does work at 100 fps, though it’s less than ideal.

      ULMB’s minimum is 85Hz, but anyone who has used it says you should be at 90 or above at minimum.

      So ULMB is a pretty big improvement over LightBoost, and it would be nice to see a pro review at various refresh rates.

      • Damage
      • 5 years ago

      [quote<]stutters tend to happen regardless of your config[/quote<]

      That’s the key, right there. Averaging 144 FPS, or even something well above it, won’t guarantee that you get a new frame every 6.94 milliseconds when it’s time to refresh the display. In fact, as I show with the frame time plot in the article, that’s generally far from the case. Frame times vary pretty widely depending on a bunch of different bottlenecks, from CPU and GPU performance limits to software-level overhead and slowdowns. G-Sync better addresses that reality, and even in Borderlands 2 with a GTX 780 Ti, where FPS averages are pretty high, it feels like a clear win vs. ULMB mode to me.
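
      To put toy numbers on that (a quick sketch, not the article’s methodology: it draws jittery frame times that average roughly 144 FPS and counts how many overshoot the 6.94-ms refresh window; the 2-ms jitter figure is an assumption for illustration):

      [code<]
      # Toy model: frame times averaging ~6.94 ms (144 FPS) with random
      # jitter. Any frame longer than the refresh interval misses the
      # deadline, so a fixed-refresh display would repeat or tear there.
      import random

      REFRESH_MS = 1000 / 144          # ~6.94 ms between fixed refreshes
      random.seed(42)

      frames = [random.gauss(REFRESH_MS, 2.0) for _ in range(10_000)]

      late = sum(1 for t in frames if t > REFRESH_MS)
      avg_fps = 1000 / (sum(frames) / len(frames))
      print(f"average: {avg_fps:.0f} FPS, "
            f"late frames: {late / len(frames):.0%}")
      [/code<]

      Even with the average pinned at 144 FPS, roughly half the frames land past the deadline, which is exactly the situation variable refresh absorbs.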

        • rahulahl
        • 5 years ago

        I really feel that people only think of it as a way to cover up low fps. They don’t consider that, even with high FPS, this will give you a fluidity that buying the best GPU can’t.

          • Firestarter
          • 5 years ago

          In my experience, the stutters can be even worse at high fps, especially when the average is very close to the refresh rate. With a low frame-rate, all the animation is a jittery, stuttery mess anyway (so to speak) so your brain does its best to filter that input into a reasonable picture of the game world despite everything that is wrong with it. With high frame-rates, the animation is going to be smoother all in all, so when it does stutter and tear because of a missed Vsync interval, it’s going to show up a lot worse because your brain wasn’t already filtering other stutters out.

        • npore
        • 5 years ago

        ULMB with G/adaptive-sync [i<]needs[/i<] to happen.

        My monitor, the Eizo Foris FG2421, has ULMB (essentially). It’s amazing, but for the magic to happen you need to keep frame times under 8.3ms as much as possible, and you really want vsync on. The resulting [i<]clarity[/i<] of motion is really something; I’m too young to remember CRTs 😉 Scrolling in MOBAs/RTSs is clear. Textures and edges stay clear as you move the mouse around in an FPS. Objects moving past/across you are clear. You can strafe without the world blurring.

        Very early on, when I could still be bothered switching strobing on and off between desktop use and gaming, I forgot to turn it on for a session of BF4. I was flying around in a chopper (a very good example of where ULMB shines, especially when strafing) and actually thought my eyes must be tired or something; I told my gunner I was going to call it a night. Then I realised the strobing was off, flicked it back on, and the blur was gone. I’ve also noticed since having this monitor that when we’re running around on foot, I’m spotting more enemies, and faster, than my mate, when we used to be even in that regard. That’s my experience anyway, and I guess some people are more sensitive to motion blur than others. I really despise the blur in movies…

        But staying under 8.3ms 99% of the time is a tough job. It’s pretty much impossible not to spike above it, and when you do go over, the magic is gone until you get under again. I have been lucky (& crazy) enough to build a ridiculous Ivy-E + 2x GTX 780 Ti overclocked system under water. It holds up pretty well with my monitor, but I imagine 1440p would be a harder ask without dialling things down. As you say, there are many different bottlenecks in hardware and software, and you just can’t stay under a tight target like that. G-Sync eliminates the need for a hard target.

        Most gamers, even if they could afford it, wouldn’t be silly enough to build a system like mine. Adaptive-sync is far more important. You need smooth motion before clearer motion (I guess G-Sync not only makes motion smoother but also helps with clarity by eliminating distracting tearing). Even if my rig is running perfectly at 120Hz, vsync’d, with all frames under 8.3ms, there is still variation in the time it takes to draw each frame, and even though the displayed frame rate is constant, that variation can be perceptible in the content of the frames, as you demoed in the review with 60Hz vsync. Probably not as noticeable at higher frame rates, but still.

        [b<]TL(yes);DR[/b<] - It’s awesome that we are moving to better motion in gaming with G-Sync. Adaptive-sync [i<]with[/i<] low persistence really needs to happen, though! Low persistence is a big deal… just not as big a deal as adaptive-sync.

          • GrimDanfango
          • 5 years ago

          I assume the problem that prohibits ULMB from being combined with G-Sync/Adaptive-Sync is that the principles behind them pretty much directly conflict.

          G-Sync works on the assumption that the display waits an unknown length of time before a refresh is required. ULMB needs to pulse the backlight for an exact fraction of a refresh, every refresh. If you tried to combine them, ULMB wouldn’t have any way to know how long to pulse, as it doesn’t know how long the next refresh interval will be.

          It could pulse for a predefined length of time each refresh and then wait in darkness for the next, but that would cause the display’s overall brightness to fluctuate constantly, getting darker the lower the framerate.

          It could turn off for a predefined length of time and then pulse on until the next refresh is called, but that would create the same problem, except higher framerates would be darker instead.

          I can’t think of any way around those problems… but let’s hope someone clever can eventually 😛
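
          A back-of-the-envelope sketch of that first case (assuming a fixed 2-ms pulse per refresh, and taking perceived brightness as simply pulse time over frame time):

          [code<]
          # With a fixed backlight pulse per refresh, the duty cycle -- and
          # hence average brightness -- swings with the frame rate.
          PULSE_MS = 2.0  # assumed fixed pulse length

          for fps in (144, 100, 60, 40, 30):
              frame_ms = 1000 / fps
              duty = PULSE_MS / frame_ms   # fraction of time backlight is on
              print(f"{fps:>3} fps: frame {frame_ms:5.1f} ms, duty {duty:5.1%}")
          [/code<]

          Going from 144 fps down to 30 fps cuts the duty cycle from about 29% to 6%, so the image would visibly dim as the framerate drops.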

            • Airmantharp
            • 5 years ago

            G-Sync (or its alternatives) > ULMB, but if they could figure out how to make the two play nice at the same time, I’m all in 🙂

            • npore
            • 5 years ago

            Yeah, I can’t think of a clean, easy way.

            Fluctuating brightness is a problem, but maybe it could be offset somewhat by adjusting the pulse brightness based on some sort of running-average frame rate. That would fall down with large changes/spikes, though. You would still have an upper frame-time threshold at which ULMB switches off.

            Or maybe hold off displaying a frame until the next frame is ready, so the frame time is known and the pulse length can be adjusted accordingly. But that obviously adds delay, and variable delay at that, which sounds nasty. At high frame rates it might not be such a big deal?

            At the very least, they could offer an automatic switch from G-Sync to ULMB when the frame rate stays consistently above a certain threshold… although suddenly losing G-Sync would probably be too noticeable.

            Yup, fingers crossed someone clever finds a way 🙂 I’d jump on it in a heartbeat.
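
            For what it’s worth, the running-average idea is easy to sketch (hypothetical, and it shows exactly where it falls down: the pulse energy is set from smoothed history, so a sudden spike lands at the wrong brightness):

            [code<]
            # Sketch: scale pulse intensity by an exponential moving average
            # (EMA) of recent frame times to hold average brightness steady.
            # A spike is mis-predicted, so that frame flashes at the wrong
            # brightness -- the weakness noted above.
            PULSE_MS = 2.0
            TARGET_DUTY = 0.20   # brightness level to hold (assumed)
            ALPHA = 0.1          # EMA smoothing factor

            def pulse_intensities(frame_times_ms):
                ema = frame_times_ms[0]
                for t in frame_times_ms:
                    ema = ALPHA * t + (1 - ALPHA) * ema
                    # intensity so a PULSE_MS flash over the *predicted*
                    # frame time delivers TARGET_DUTY worth of light
                    yield TARGET_DUTY * ema / PULSE_MS

            times = [8.3] * 20 + [25.0] + [8.3] * 5  # steady, one spike
            for t, i in zip(times, pulse_intensities(times)):
                print(f"frame {t:5.1f} ms -> pulse intensity x{i:.2f}")
            [/code<]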

            • thor84no
            • 5 years ago

            [quote<]G-Sync works on the assumption that it waits an unknown length of time before a refresh is required. ULMB needs to pulse for an exact fraction of a refresh, each refresh. If you tried to combine them, ULMB wouldn't have any means to know how long to pulse for, as it doesn't know how long the next refresh will take.[/quote<]

            I don’t see how this makes sense. G-Sync knows as much about the target refresh rate as any other system would, and you never really know more than that, since any frame could be delayed. Unless there’s some reason G-Sync would be less likely to deliver a frame on time, I don’t see how it would be fundamentally at odds with ULMB. It’s not as if you could ever “pulse for an exact fraction of a refresh, each refresh” without buffering and doing ULMB post-buffer.

            • GrimDanfango
            • 5 years ago

            Well, before G-Sync, all screen refreshes happened at synchronised, regular intervals, whether there was a fully rendered frame ready to display or not. That’s the whole problem G-Sync was created to address. A normal 60Hz monitor refreshes 60 times a second, at precise 16.66-ms intervals, and if a frame is only partially rendered, the display either shows the part that is done (vsync off) or keeps repeating the last complete frame each refresh until a new one is ready (vsync on).
            So, ordinarily, you know exactly how long each refresh will be, because it’s completely detached from how long the GPU takes to render a frame. You can pulse the backlight 8.33ms on, 8.33ms off, and it will synchronise with the refresh rate.

            The innovation of G-Sync is specifically to tie the screen refresh to the moment the GPU finishes rendering a complete frame. At that point, you can no longer depend on each refresh being a fixed length regardless of what the GPU is sending, because it has intentionally been made variable.

      • Bensam123
      • 5 years ago

      Blur Busters isn’t the final word on subjective material.

      From using LightBoost myself, I found that it strained my eyes, and there would often be skips that ruined the immersion. At lower FPS it actually makes things worse.

        • Meadows
        • 5 years ago

        FPS is independent of LightBoost.

        • Prestige Worldwide
        • 5 years ago

        LightBoost’s smoothness was nice, but it made the picture too dim, which became annoying after a while. I just use plain old 120Hz.

    • Voldenuit
    • 5 years ago

    Thanks for the article.

    Does G-Sync work with 24 and 25fps film content? I.e., does it correctly refresh the display at 24, 25, 48, or 50Hz (or multiples thereof)?

    If so, does it work with any EVR-compatible playback app?

      • jihadjoe
      • 5 years ago

      You know, that is one excellent use of G-Sync that I did not think of. Not having to do 3:2 or 2:3 pulldown will be a nice boost to movie quality.

      • riflen
      • 5 years ago

      > Does G-Sync work with 24 and 25fps film content? I.e., does it correctly refresh the display at 24, 25, 48, or 50Hz (or multiples thereof)?

      G-Sync is a solution specifically designed to tackle tearing and stutter in games. In its current implementation, its lowest supported frequency is 30Hz, i.e. a 33.3-ms frame time.

      > If so, does it work with any EVR-compatible playback app?

      I can’t test this personally, because my display is still on back-order. G-Sync is only supported for games running in full-screen mode. According to posters in this thread on Blur Busters, it can work. Sometimes.

      [url<]http://forums.blurbusters.com/viewtopic.php?f=5&t=239[/url<]

      There are problems related to G-Sync’s 30Hz restriction and content with a frame rate close to 30fps. For what it’s worth, AMD has stated that FreeSync will be supported for “video playback and power-saving purposes” and will potentially support the frequency ranges 36-240Hz, 21-144Hz, 17-120Hz, and 9-60Hz.

        • GrimDanfango
        • 5 years ago

        17-120Hz sounds like the ideal middle ground. 30Hz sounds like it could be a bit too restrictive for heavy games, but 17 seems like a reasonable target for an absolute-lowest FPS.

        Is there any word on what actually happens when games dip under 30fps while using G-Sync? Does it handle it in an elegant way?

        • jihadjoe
        • 5 years ago

        [quote<]G-Sync is a solution specifically designed to tackle tearing and stutter in games. In its current implementation, its lowest supported frequency is 30Hz, i.e. a 33.3-ms frame time.[/quote<]

        Won’t it work if you run a frame-doubling filter (like some deinterlacers do), so the GPU refreshes at 48/50Hz instead of 24/25Hz?
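
        The multiplier version of that idea is simple (a hypothetical sketch, assuming the 30-144Hz window discussed above: pick the smallest integer repeat count that lifts the content rate over the floor):

        [code<]
        # Frame-repeating sketch: show each film frame k times so the
        # effective refresh rate lands inside an assumed 30-144Hz window.
        def refresh_multiplier(content_fps, lo=30.0, hi=144.0):
            k = 1
            while content_fps * k < lo:
                k += 1
            return k if content_fps * k <= hi else None

        for fps in (23.976, 24, 25, 30):
            k = refresh_multiplier(fps)
            print(f"{fps} fps content -> each frame shown {k}x = "
                  f"{fps * k:.3f} Hz")
        [/code<]

        24 and 25fps content would come out at 48 and 50Hz, exactly the rates Voldenuit asked about.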

    • Airmantharp
    • 5 years ago

    No input lag penalty! The most important unanswered question has been answered.

    • Neutronbeam
    • 5 years ago

    I think the only really important question in this discussion is for Scott. When are you offering the test monitor in a giveaway contest? 😉

      • jessterman21
      • 5 years ago

      Haha – half past never. You’ll have to pry it out of Scott’s cold dead hands.

        • Captain Ned
        • 5 years ago

        It’ll be a TR BBQ prize within 3 years.

    • rahulahl
    • 5 years ago

    Pre-ordered mine a couple of weeks ago.
    Selling for $999 in Australia. Expensive, but I think it’s gonna be well worth it. Getting the GTX 880 when it releases. Can’t wait for August 29, when this comes into stock and we get more info about Haswell-E as well.

      • rahulahl
      • 5 years ago

      Actually, pccasegear.com.au already has it in stock a week early.
      So if anyone in Australia wants to buy this, it’s in stock now.
      Mine’s in transit. 🙂

        • rahulahl
        • 5 years ago

        Okay, I have it now, but I’m having an issue.
        The overdrive was making text unreadable when scrolling, so I turned it off. But even now, text isn’t very clear while scrolling: you can see dark shadows/imprints left behind for a fraction of a second.

        It’s not a big issue for me, but I certainly didn’t expect it to be there. And for some people it is certainly gonna be a deal-breaker, so I am just putting it here so other potential buyers are aware of it. Still waiting for the 880 to release so I can try G-Sync on it. Hopefully it will be worth it.

    • merrymarian
    • 5 years ago

    Acer has announced a 4K 28-inch G-Sync monitor.
    [url<]http://us.acer.com/ac/en/US/press/2014/77934[/url<]

    It seems to be available in September. Price: probably $800.

      • Milo Burke
      • 5 years ago

      And probably 60 Hz.

      • merrymarian
      • 5 years ago

      Yeah, if they don’t specify 144hz, it probably won’t be.

    • jjj
    • 5 years ago

    Somewhat off-topic but mildly related: Newegg has the 39-inch 4K Seiki as low as $280.

    [url<]http://slickdeals.net/f/7147924-39-seiki-4k-120hz-led-hdtv-se39uy04-279-99-after-rebate-free-shipping-w-visa-checkout[/url<]

    Just had to share it, since I haven’t seen it below $400 before.

      • Bauxite
      • 5 years ago

      No G-Sync, but it has the option to run 1080p@120Hz for games instead of 4K@30Hz for media, so maybe it’s slightly less off-topic 😉

    • Ninjitsu
    • 5 years ago

    [quote<] One thing I have learned to notice since our review of Asus' 4K TN monitor is the "screen door" effect caused by FRC, or temporal dithering. [/quote<]

    That happens to whites (like an Explorer window) on my IPS display (most likely 6-bit + FRC) at some contrast settings. It’s really annoying, but thankfully I found a good setting that makes it mostly invisible.

    • FroBozz_Inc
    • 5 years ago

    Posting in support of the usage of the word “nubbin”. Love that word.

    • Pville_Piper
    • 5 years ago

    I really want one of these too… but I can’t afford a new monitor and a video card, and I know my GTX 660 won’t cut it at that resolution. Any word on when they will release a 1080p monitor with G-Sync?

    • smunter6
    • 5 years ago

    What frame rates were you getting when gaming on this monitor? Was it mostly a full 144 FPS or something less? And at what point did you start to see diminishing returns?

      • Damage
      • 5 years ago

      Performance varied in my use, depending on the game. Borderlands 2 was part of my testing, and it ran at something near 144 FPS most of the time on the GTX 780 Ti I was using. Other games, not so much.

      I talk more about the benefits of G-Sync at higher and lower frame times in my look at the prototype:

      [url<]https://techreport.com/review/25788/a-first-look-at-nvidia-g-sync-display-tech[/url<]

      Most of what I said there still applies. Sometimes, the biggest impact of variable refresh comes when the GPU is stressed, and delivering the next frame right as it’s ready can really help.

    • Meadows
    • 5 years ago

    On a different note: why the f- have they not fixed the goddamn Skyrim engine yet? <insert Picard image>

    • TwoEars
    • 5 years ago

    “For gaming, it’s the best display I’ve ever used”

    I have one and I’ll second that comment! +100

    • anotherengineer
    • 5 years ago

    Not a bad review for someone new to reviewing monitors.
    [url<]http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg278q.htm[/url<]

    TFT Central’s review shows the viewing angles to have the typical TN color shift, but it looks like they tested at a much steeper angle.

      • Meadows
      • 5 years ago

      Who the hell looks at a monitor from those angles?

        • anotherengineer
        • 5 years ago

        Exactly.

        But it does show that the color shift exists.

          • Ninjitsu
          • 5 years ago

          It will, obviously; it’s TN.

        • bandannaman
        • 5 years ago

        This one pivots into portrait orientation, and in that mode the viewing angles become an important factor. Not relevant for gaming, but useful to know for office work.

          • Meadows
          • 5 years ago

          I assure you, for office work, managing Word/PDF documents and spreadsheets, you will be pretty much physically unable to care about colour accuracy.

          Edit: also, not a single company in its right mind will buy this for any employee when it can provide the same functionality at one third of the cost.

            • JustAnEngineer
            • 5 years ago

            Even in an office environment, there are times when the inferior nature of TN LCD panel technology creates problems.

            Yesterday, I was looking at a chart on our intranet at work. For a few minutes, I had trouble interpreting it because the pink trend line in the middle of the monitor showed as blue-ish gray in the legend at the bottom of the TN panel.

            True, you may be able to get by if you position yourself perfectly in front of your monitor for ideal viewing angles, but if you’re sharing something with a co-worker or glancing at the screen while you’re working at something else on the desk, you may find TN playing tricks on you.

            • bandannaman
            • 5 years ago

            I was more thinking about my own productivity in my home office. I don’t care about color accuracy per se (Dammit Jim, I’m a developer not a designer!). But vertically oriented TN displays can be very distracting to work with if you can’t get them facing you dead-on due to space constraints, because of the desaturation relative to your other monitors.

            Like I said, useful to know. Not a major factor.

        • MadManOriginal
        • 5 years ago

        To be fair, I think they test at around the manufacturer’s quoted specs? Or maybe they test at 170/170 for all monitors. They do test at two different horizontal angles. Plus, TN has this issue: “As you move your head from side to side in a horizontal plane, there is a contrast shift and the image becomes darker and introduces a slight green hue.” With a monitor this large, viewed from a normal distance, the portions nearer the edges fall outside the ideal viewing angle, and the issues that are exaggerated at large angles are still present, to a lesser extent, at shallower ones.

          • Meadows
          • 5 years ago

          Judging by TR’s review (they tested those “shallower angles”), it’s absolutely fine. I have a cheap TN LCD at home that does worse than that, and I think [i<]even that[/i<] is passable. My point is you simply get used to it if it's subtle enough.

        • Ninjitsu
        • 5 years ago

        Our knees!

      • MadManOriginal
      • 5 years ago

      Frig. At least link straight to the part you’re talking about; too much scrolling makes MMO go somethingsomething.

      [url<]http://www.tftcentral.co.uk/reviews/content/asus_rog_swift_pg278q.htm#viewing[/url<]

    • Meadows
    • 5 years ago

    I’ve seen passable TN displays myself, but I was still skeptical upon starting to read this.

    Then the “viewing angles” photos convinced me. This really is a very good monitor. I might stop pining for an eventual “120Hz [b<]IPS[/b<]” upgrade after all.

    • Prestige Worldwide
    • 5 years ago

    I really want this monitor, but I won’t shell out $800 for it.

    What I’m more looking forward to is a 24″ 1080p equivalent. 144 fps is hard to reach at 1440p in the latest, most demanding games on a single-GPU setup.

      • cobalt
      • 5 years ago

      Speaking of which, do we know how well this monitor handles scaling? E.g., could you set the resolution to 1280×720 and still get all the benefits, trading resolution for framerate? I can’t find any reviews trying out non-native resolutions, even perfect 2:1 ratios.

      (I’m going to ask this question again for any 4K reviews; there, you could game at 1080p, which is even more reasonable, and still have all those nice pixels for “real” work.)
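
      The appeal of those exact ratios is easy to check with a quick sketch (my own illustration, not from the review: a mode maps cleanly onto the native grid only when both axes share the same integer factor, so each source pixel becomes an NxN block and no interpolation blur is needed):

      [code<]
      # Which modes map onto the 2560x1440 grid as exact integer blocks?
      NATIVE = (2560, 1440)

      def integer_scale(res, native=NATIVE):
          (w, h), (nw, nh) = res, native
          if nw % w == 0 and nh % h == 0 and nw // w == nh // h:
              return nw // w
          return None

      for res in [(1280, 720), (1920, 1080), (854, 480)]:
          n = integer_scale(res)
          label = f"{n}x{n} pixel blocks" if n else "needs interpolation"
          print(f"{res[0]}x{res[1]} -> {label}")
      [/code<]

      Whether the G-Sync module’s scaler actually does block replication rather than filtering is exactly the kind of thing a review would need to test.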

        • Airmantharp
        • 5 years ago

        I’m betting that it will depend on whether the GPU or the monitor does the scaling, though if the GPU does it (the likely scenario), it shouldn’t make any difference to the monitor, which would still be receiving its native resolution.

          • Liron
          • 5 years ago

          The article indicated that the G-Sync board does the scaling, which is why I would have loved to see the scaling quality reviewed.

          • cobalt
          • 5 years ago

          I’ve essentially never used scaling, so I’d very, very much like to hear more. What’s the quality of GPU versus monitor scaling (as Liron mentions, the article says G-Sync has a scaler)? What’s the latency penalty of each? If GPU scaling is better, which cards support it? (I don’t see any options in the driver or the Windows control panel to let me choose GPU scaling, so I’m assuming it’s not universally supported.)

          On a barely related note, I really want to know how multi-monitor setups interact. Can I get a 144Hz ROG for my primary monitor, keep a second IPS display hooked up, and still use all the juicy G-Sync/144Hz/ULMB goodness on the main ROG monitor without having to turn off the second one?

            • Rectal Prolapse
            • 5 years ago

            I have a BenQ XL2411Z set as primary and a Dell UP2414Q (4K IPS) running as secondary. It works.

            The only issue is that if you switch configurations around, the BenQ sometimes loses the 120Hz I have it set at and drops down to 100Hz. I can usually tell because my BenQ is tweaked to be brighter at 120Hz. Now and then I have to go into the NV control panel to fix it, whenever I switch off a monitor or enable my projector (on the same video card).

            I usually have to set the BenQ as primary because games will often only work on the primary monitor.

      • juzz86
      • 5 years ago

      The AOC G2460PG is probably what you’re chasing, then. It looks a bit tackier, but it fits the bill. Mine is on pre-order; I couldn’t justify the AU$1000 for the ROG one.

    • superjawes
    • 5 years ago

    G-Sync might be on the market now, but with an alternative on the way, Nvidia should get the hardware finalized quickly. The current vendor lock could be worked around if the design were finished; heck, Nvidia could publish a “how to implement G-Sync” paper for anyone to use and make money by selling the ASIC to monitor manufacturers.

    As it stands, FreeSync only needs to be “close enough” to be the more appealing alternative, even if G-Sync can edge out some extra benefits here and there.

      • JustAnEngineer
      • 5 years ago

      The VESA standard is always a more appealing alternative.
      1) It’s royalty-free.
      2) It’s IN THE STANDARD – therefore it will work with a wide range of hardware.
      3) Widespread adoption of standard components means that the parts get cheaper through economies of scale.

      I remain amazed that any hardware ODMs have bothered with proprietary G-Sync when it has only a few months’ head start on the VESA standard. With low-volume, high-cost proprietary electronics, how much money can they possibly make on these in less than six months, before high-volume, low-cost STANDARD parts become the norm?

        • mutantmagnet
        • 5 years ago

        The fact that they are doing it this way should tell you that you don’t know what additional costs have to be sunk to make this type of technology work.

        Currently, with G-Sync, display manufacturers are in a queue where Nvidia builds a G-Sync module that fits the specs of each display.

        The R&D costs are high enough that they would rather rely on Nvidia fronting the bill than do it themselves.

        Adaptive-Sync becoming a standard doesn’t null and void the costs of creating a display to fit a certain performance range.

          • superjawes
          • 5 years ago

          [quote<]Adaptive-Sync becoming a standard doesn't null and void the costs of creating a display to fit a certain performance range.[/quote<]

          This bit is important, and it’s what I alluded to. Monitor manufacturers are still going to have to develop the chips that implement adaptive sync, but if Nvidia already has that work finished, it can simply sell the chips, and everyone can use G-Sync (including AMD implementations).

          However, Nvidia still needs to get the hardware done. As long as it is in development, many monitor manufacturers could move forward with the alternative and forget about G-Sync.

    • Ninjitsu
    • 5 years ago

    [quote<] The physics in Skyrim go hilariously sideways at high frame rates, for instance. If you've been playing with uncapped frame rates and vsync disabled like I have for ages, though, all of this fuss will be familiar territory. [/quote<]

    Why not:

    A) Set adaptive vsync on and the monitor refresh to 60Hz?
    B) Set the G-Sync refresh to 60Hz?

      • Damage
      • 5 years ago

      That should work, but you don’t get the full benefit of higher refresh rates. Still not a bad option!

        • Ninjitsu
        • 5 years ago

        50 is usually the point at which I become more sensitive to tearing than to smoothness, but I guess it’s like SSDs… you don’t need higher refresh rates till you’ve been exposed to them! 😀

          • jihadjoe
          • 5 years ago

          I think a lot of the oldies here remember what zero-ghosting, high-framerate gaming was like in the CRT days. Voodoo2 SLI was super good because it would do Quake 2 at 90-100 fps, even at 1024×768.

          I sort of stepped away from FPS games when CRTs went away. I never could get the same feel out of LCDs, even though IPS did at least bring the nice colors back.

            • Ninjitsu
            • 5 years ago

            I stepped away from a CRT last year, to a 7-ms IPS display. Sad times indeed, as far as ghosting is concerned.

            A friend of mine who’s been using LCDs for at least 8-9 years bought the same model as mine, a bit before me. When I pointed out the ghosting, he was like, “Oh, I thought it was a feature.”

            😐

      • GrimDanfango
      • 5 years ago

      Why not:

      A) Give up on shoddily written Bethesda games?

      I did this a long time ago, and it’s caused me absolutely no technical difficulties 😛

    • torquer
    • 5 years ago

    Pre-ordered one on the 16th from PC Nation. Great to see the positive review.

    • Terra_Nocuus
    • 5 years ago

    One of these & a GTX 880, and my mITX build will be complete 🙂

      • libradude
      • 5 years ago

      Just out of curiosity, what mITX case are you gonna fit a GTX 880 in?

      I’ve got my eye on a Corsair Obsidian 250D (and a 780 Ti) for my first foray into water-cooling, but I’m open to suggestions. 🙂

        • jessterman21
        • 5 years ago

        Hadron Air 🙂

        • Voldenuit
        • 5 years ago

        [quote<]Just out of curiosity, what mITX case are you gonna fit a GTX 880 in?[/quote<]

        Maybe a [url=http://www.silverstonetek.com/product.php?pid=503<]Silverstone ML07[/url<]? It accommodates a double-wide, 13"-long GPU.

        Corsair’s “small” cases are all way too SUV-sized for me to even consider.

          • Terra_Nocuus
          • 5 years ago

          I have a Node 304. I figure if a Titan/780/780 Ti can fit in there, an 880 should be fine.
